Data Processing, AI, and Cloud Servers in Telecommunications

The introduction of 5G is not without challenges. It is expensive, and the network is distributed in ways that earlier generations were not, so building this type of network carries an extensive cost. Before going further, we should review what data processing is. According to Carl French (1996), in his book “Data Processing and Information Technology”, data processing is “the collection and manipulation of items of data to produce meaningful information”. The author also explained the difference between data and information: data is the term used to describe basic facts, a raw material that, after passing through some process (“data processing”), becomes meaningful information that telecommunications companies can take advantage of.
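French’s distinction can be sketched in a few lines of code: basic facts (raw call records) pass through a processing step and come out as meaningful information. The record structure and field names below are hypothetical, chosen only to illustrate the idea.

```python
# Raw data: basic facts, one record per call (hypothetical structure).
raw_data = [
    {"subscriber": "A", "duration_s": 120},
    {"subscriber": "A", "duration_s": 300},
    {"subscriber": "B", "duration_s": 60},
]

def process(records):
    """Data processing: aggregate raw call facts into per-subscriber totals."""
    totals = {}
    for r in records:
        totals[r["subscriber"]] = totals.get(r["subscriber"], 0) + r["duration_s"]
    return totals

# Information: a meaningful summary an operator can act on.
information = process(raw_data)
print(information)  # {'A': 420, 'B': 60}
```

The same pattern — collect, manipulate, summarize — underlies every analytics use case mentioned later in this article, just at a vastly larger scale.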


Today we are in an age where all our daily activities produce a lot of data, and this data travels from our devices through the ISP and finally onto the Internet. To give you a quick idea, VR with a resolution equivalent to 4K TV would require a bandwidth of 1 Gbps for smooth playback, or 2.5 Gbps for interactive use, both requiring a round-trip latency of no more than 10 ms. Soon these applications will target the smartphone, putting additional strain on networks. As AR/VR services grow in popularity, the proposed 5G networks will deliver the speed and performance needed.
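A back-of-the-envelope calculation makes those figures concrete. The bandwidth-delay product (bandwidth × round-trip time) tells us how much data is “in flight” on the link at any moment — a rough measure of the buffering the network must handle. The numbers plugged in below are the ones quoted above; everything else is just arithmetic.

```python
def bandwidth_delay_product(bandwidth_bps, rtt_s):
    """Bits in flight on a link: bandwidth times round-trip time."""
    return bandwidth_bps * rtt_s

# Figures from the text: 1 Gbps (smooth) and 2.5 Gbps (interactive), 10 ms RTT.
smooth = bandwidth_delay_product(1e9, 0.010)
interactive = bandwidth_delay_product(2.5e9, 0.010)

print(f"smooth VR: {smooth / 8 / 1e6:.2f} MB in flight")       # 1.25 MB
print(f"interactive VR: {interactive / 8 / 1e6:.2f} MB in flight")  # 3.12 MB
```

Even at a modest 10 ms round trip, megabytes of data are continuously in transit per stream — one reason a single 4K/VR session is demanding for today’s access networks.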

Also, an average telecom operator generates billions of records per day, and this data should be analyzed in real or near-real time to gain maximum benefit both for subscribers (personalized call plans based on usage patterns, data from social networks used to optimize marketing campaigns) and for the company itself. Among the advantages for the company, operations and maintenance could be improved by predicting network loops, switching errors, and server errors, enabling preventive diagnostics on network elements. Sources of data that could feed such analysis include transactions, network data (TCP and UDP messages), log data, geospatial information, and data from sensors.
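As a minimal sketch of the preventive-diagnostics idea, the snippet below streams log events through a counter and flags any network element whose error count crosses a threshold. The event structure, element names, and threshold are all invented for illustration; a production system would run this continuously over billions of records.

```python
from collections import Counter

ERROR_THRESHOLD = 3  # hypothetical alerting threshold

def flag_elements(events, threshold=ERROR_THRESHOLD):
    """Return the network elements whose error events reach the threshold."""
    errors = Counter(e["element"] for e in events if e["kind"] == "error")
    return {elem for elem, n in errors.items() if n >= threshold}

# Toy event stream: three errors on switch-1, none elsewhere.
events = [
    {"element": "switch-1", "kind": "error"},
    {"element": "switch-1", "kind": "error"},
    {"element": "switch-2", "kind": "ok"},
    {"element": "switch-1", "kind": "error"},
]
print(flag_elements(events))  # {'switch-1'}
```

In practice the counting would happen in a streaming framework over sliding time windows, but the core logic — aggregate, threshold, alert — is the same.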

With the enablement of 5G networks we will have a vast amount of data, the majority of it driven by video-based applications. A vendor-centric equipment deployment model cannot sustain this exponential growth in traffic, so processing — compute, storage, and network functionality — needs to move to the edge. Eventually, this will create a real-time network at the edge.

A real-time network at the edge resembles the term “edge computing”, which we have heard about since the nineties. Every new IoT device, computer, and mobile device will generate more and more data over time, driven by technologies like virtual reality, 4K video, Massively Multiplayer Online Games (MMOGs), VR-MMOGs, high-quality real-time video, and augmented reality applications. Edge computing is here to improve response times and reduce bandwidth usage: the algorithms sitting at the edge of the network help reduce the traffic sent to the backbone.
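The bandwidth-saving mechanism is easy to picture in code: instead of forwarding every raw reading to the backbone, an edge node aggregates locally and sends one summary per interval. The readings and summary format below are illustrative.

```python
def edge_summarize(readings):
    """Aggregate raw readings at the edge; only this summary crosses the backbone."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

raw = [21.0, 21.5, 22.0, 21.8, 21.7]  # five messages arriving at the edge node
summary = edge_summarize(raw)          # one message sent upstream
print(summary)
```

Here five upstream messages collapse into one — and for high-rate sources like video analytics or sensor fleets, the reduction factor is far larger.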

Applying AI techniques to networking is one way the industry is addressing these complexities; AI is a key component that needs to be adopted in the network to help manage and control this change. Another important use case for AI is network planning and operations. With 5G there will be approximately hundreds of thousands of small cells, each connected to a fiber line, and that number is estimated to grow to millions of cells and devices globally. Planning and designing all of these cells is beyond human capability, and this is where AI can do site evaluations and tell you what throughput you would get with a given design.
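A toy version of such a site evaluation can be sketched with the Shannon capacity formula, C = B · log2(1 + SNR), which upper-bounds the throughput of a link for a given bandwidth and signal-to-noise ratio. The candidate sites and SNR values below are invented for illustration; a real planning tool would model propagation, interference, and traffic rather than a single formula.

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_linear):
    """Shannon upper bound on throughput for a bandwidth and linear SNR."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Hypothetical candidate small-cell placements.
candidates = {
    "rooftop":   {"bandwidth_hz": 100e6, "snr_linear": 31},  # ~15 dB SNR
    "lamp_post": {"bandwidth_hz": 100e6, "snr_linear": 7},   # ~8.5 dB SNR
}

best = max(candidates, key=lambda c: shannon_capacity_bps(**candidates[c]))
for name, d in candidates.items():
    print(name, f"{shannon_capacity_bps(**d) / 1e6:.0f} Mbps")
print("best site:", best)  # rooftop
```

An AI-driven planner does essentially this comparison, but over thousands of candidate sites and with learned models in place of the closed-form capacity bound.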

The edge computing software stack is a set of new tools, open source or not, aimed at managing data, monitoring, and improving the design of our networks. In this article, we are going to mention some of them:

  • Airship is a new Open Infrastructure project for OpenStack that lets you declaratively define your OpenStack and Kubernetes infrastructure.

 

  • Akraino Edge Stack aims to create an open-source software stack that supports high-availability cloud services optimized for edge computing systems and applications. Akraino follows the standards defined by the LF Edge Foundation and has the full support of the Linux Foundation. The Akraino R1 release includes 10 “ready and proven” blueprints and delivers a fully functional edge stack for edge use cases ranging from industrial IoT, telco 5G core and vRAN, uCPE, and SD-WAN to edge media processing and carrier edge media processing.

 

  • Intel® ONP for Servers is designed to make it easier to test and deploy SDN (software-defined networking) and NFV solutions; Intel has partnered with the Linux Foundation, OpenShift, and Red Hat around it.

 

  • Amazon AWS for telecom, with its many products, is a complete solution; today Amazon is in partnership with Cisco and Nvidia on data processing that enhances the customer experience through machine learning and AI. Some of its products are Amazon S3, Amazon Redshift, Snowflake, Amazon EC2, Kx, RDS, AWS IoT Greengrass, AWS Snowball Edge, Amazon SageMaker, and AWS Lambda.


 

 

 
