Although wireless carriers across the globe are pushing to deliver 5G service to their customers as soon as possible, the new radio access networks are expected to be far more sophisticated than their predecessors.
These networks will depend on emerging technologies such as millimeter waves and huge antenna arrays known as massive MIMO.
At a recent 5G summit held during the Computex trade show in Taipei, Taiwan, Rajeev Agrawal said Nokia is now applying machine learning to tame some of that complexity, in the hope that artificial intelligence (AI) can cut costs and improve network performance.
Agrawal, who heads Nokia’s radio access network offerings, described three scenarios combining 5G and machine learning that Nokia has studied internally, though none has yet been documented in academic research papers.
Scheduling Beamforming in Massive MIMO Networks
In a massive MIMO network, cellular base stations send and receive radio-frequency signals through many more antennas than a conventional base station uses. This lets a base station transmit and receive more data in parallel, although the signals can interfere with one another.
Beamforming is a signal-processing technique that lets base stations transmit targeted beams of data to users, minimizing interference and using the radio-frequency spectrum efficiently.
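The idea behind beamforming can be sketched with a textbook model: phase-shifting each antenna in a uniform linear array so the signals add coherently in one chosen direction. The antenna count, spacing, and angles below are illustrative assumptions, not anything Nokia disclosed.

```python
import cmath
import math

def array_factor(theta_deg, steer_deg, n_antennas=8):
    """Gain of a uniform linear array (half-wavelength spacing)
    steered toward steer_deg, evaluated at theta_deg."""
    psi = math.pi * math.sin(math.radians(theta_deg))
    psi0 = math.pi * math.sin(math.radians(steer_deg))
    # Each element is phase-shifted by -n*psi0 so that, in the
    # steered direction, all n_antennas contributions add in phase.
    total = sum(cmath.exp(1j * n * (psi - psi0)) for n in range(n_antennas))
    return abs(total)

print(array_factor(30, 30))   # ~8.0: coherent sum of 8 antennas
print(array_factor(-10, 30))  # much smaller: energy is focused away from here
```

The peak in the steered direction and the weak response elsewhere are what let a base station serve one user's beam while limiting interference to others.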
One of the challenges in building such systems is deciding how the beams should be scheduled.
There are more than 30,000 ways to schedule four beams out of 32, and a base station lacks the processing power to quickly determine the ideal schedule among so many combinations.
Nokia says it has trained neural networks offline to recognize what a good schedule looks like, so that base stations can rapidly predict near-ideal schedules on demand.
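The size of that search space, and what exhaustive scheduling would entail, can be illustrated in a few lines. The per-beam utilities and interference penalty below are invented placeholders for whatever metric a real scheduler would optimize; the combinatorial count is the one cited in the talk.

```python
import math
from itertools import combinations

# Choosing 4 of 32 beams: the search space Agrawal describes.
n_schedules = math.comb(32, 4)
print(n_schedules)  # 35960 combinations, i.e. "over 30,000"

# An exhaustive scheduler must score every combination. Here the
# score is a toy model: hypothetical per-beam utilities minus a
# penalty when nearby beams would interfere with each other.
utility = [((7 * b) % 13) / 13 for b in range(32)]

def score(beams):
    penalty = 0.05 * sum(1 for a in beams for b in beams
                         if a < b and abs(a - b) < 3)
    return sum(utility[b] for b in beams) - penalty

best = max(combinations(range(32), 4), key=score)
```

The point of Nokia's approach is to replace this brute-force `max` over 35,960 candidates, which is too slow for a base station to run every scheduling interval, with a neural network trained offline to predict good schedules directly.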
Locating Users in Small-Cell Networks
Another way to use spectrum more efficiently, particularly in 5G networks, is to deploy miniature base stations, or small cells, that bring wireless service closer to customers’ physical locations.
According to Agrawal, radio-frequency data from a small-cell network can be used to train a machine learning algorithm to identify the positions of users’ equipment on the network.
Nokia’s approach first measures the received signal strength from each cell at many points, building signal-strength maps. The company then uses those maps to train neural networks to predict a device’s position from the signal strengths it receives from nearby cells.
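The two-phase idea, map signal strengths offline, then match a live measurement against the map, is the classic fingerprinting approach to positioning. The sketch below uses a nearest-neighbor match rather than the neural network Nokia describes, and the cell positions, path-loss model, and grid are all invented for illustration.

```python
import math

# Hypothetical small-cell positions (meters). In practice the
# fingerprint map is measured; here a toy path-loss model stands in.
cells = [(0.0, 0.0), (100.0, 0.0), (50.0, 80.0)]

def rssi(pos):
    # Simple log-distance path loss: -40 dBm at 1 m, exponent 3.
    return tuple(-40 - 30 * math.log10(max(math.dist(pos, c), 1.0))
                 for c in cells)

# Offline phase: fingerprint a grid of known points.
grid = [(x, y) for x in range(0, 101, 10) for y in range(0, 81, 10)]
fingerprints = {p: rssi(p) for p in grid}

# Online phase: predict a device's position as the grid point whose
# fingerprint is closest to the observed RSSI vector.
def locate(observed):
    return min(fingerprints,
               key=lambda p: math.dist(fingerprints[p], observed))

print(locate(rssi((42.0, 37.0))))  # → nearest grid point, (40, 40)
```

A neural network trained on the same maps plays the role of `locate` here, generalizing between grid points and coping with measurement noise that a bare nearest-neighbor lookup handles poorly.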
Configuring Downlink and Uplink Channels
For a smartphone to operate properly on a cellular network, engineers must configure the size of the device’s uplink control channel, which carries feedback about network quality. The more spectrum the uplink control channel uses, the better the quality of data transmission from a customer’s smartphone; however, that leaves less spectrum for the data itself, so the configuration is a tradeoff.
According to Agrawal, a machine learning system first predicts the characteristics of a piece of user equipment, including its mobility. It then predicts what the downlink and uplink throughput would be under various settings and selects the best one.
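The predict-then-select loop can be sketched with a toy model of the tradeoff: control-channel spectrum improves feedback quality (with diminishing returns, and faster-moving users needing more) but shrinks what is left for data. The resource-block budget and the `predicted_throughput` formula are assumptions standing in for Nokia's trained predictor.

```python
TOTAL_RBS = 100  # resource blocks shared by control and data (assumed)

def predicted_throughput(control_rbs, mobility):
    # Hypothetical stand-in for the ML predictor: feedback quality
    # saturates as the control channel grows, and high-mobility users
    # need more control spectrum to reach the same quality.
    feedback_quality = control_rbs / (control_rbs + 5 * mobility)
    data_rbs = TOTAL_RBS - control_rbs
    return data_rbs * feedback_quality

def best_setting(mobility, candidates=range(1, 41)):
    # Predict throughput under each candidate setting, pick the best.
    return max(candidates, key=lambda c: predicted_throughput(c, mobility))

print(best_setting(mobility=1.0))  # slow user: small control channel suffices
print(best_setting(mobility=5.0))  # fast user: worth spending more on control
```

Even in this toy model the optimizer reaches the qualitative conclusion Agrawal describes: the ideal setting depends on the user's predicted mobility, which is why the system estimates user characteristics before choosing a configuration.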