Authors
Lemlem Kassa1, Jianhua Deng1, Mark Davis2 and Jingye Cai1, 1University of Electronic Science and Technology of China (UESTC), China, 2Communication Network Research Institute (CNRI), Technological University, Ireland
Abstract
Machine Learning (ML) is an innovative solution that can autonomously extract patterns and predict trends from environmental measurements and performance indicators, enabling self-driven intelligent network systems that configure and optimize themselves. Under the effects of heterogeneous traffic demand among users and varying channel conditions in WLAN downlink MU-MIMO channels, achieving the maximum system throughput is challenging. Existing studies have proposed different approaches to address these issues; however, most of them do not consider a machine-learning-based optimization solution. The main contribution of this paper is a machine-learning-based adaptive approach that optimizes the system frame size so as to maximize the system throughput of a WLAN in the downlink MU-MIMO channel. In this approach, the Access Point (AP) measures the maximum system throughput and collects "frame size-system throughput patterns" that capture the effects of the traffic conditions, channel conditions, and number of stations (STAs). Based on these patterns, our approach uses neural networks to model the system throughput as a function of the system frame size. After training the neural network, we obtain the gradient information used to adjust the system frame size. The performance of the proposed ML approach is evaluated against the FIFO aggregation algorithm under the effects of heterogeneous traffic patterns for VoIP and video traffic applications, channel conditions, and number of STAs.
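The sketch below illustrates the general idea described in the abstract, not the authors' implementation: a small neural network is fitted to collected (frame size, context) to throughput samples, and the gradient of the predicted throughput with respect to the frame size is then followed to adjust it. The use of PyTorch, the layer sizes, the feature names (n_stas, snr), and the synthetic data are all illustrative assumptions.

```python
# Minimal sketch, assuming PyTorch and synthetic data; not the paper's actual algorithm.
import torch
import torch.nn as nn

# Synthetic "frame size-system throughput" samples collected at the AP:
# features = [frame_size, n_stas, snr] (all normalized), target = throughput.
X = torch.rand(256, 3)
y = (-(X[:, 0] - 0.6) ** 2 + X[:, 2]).unsqueeze(1)  # toy concave throughput curve

# Small feed-forward network modelling throughput as a function of frame size and context.
model = nn.Sequential(nn.Linear(3, 32), nn.ReLU(), nn.Linear(32, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

# 1) Train the throughput model on the collected patterns.
for _ in range(500):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

# 2) Use the gradient of predicted throughput w.r.t. frame size to adjust it
#    (simple gradient ascent, keeping the normalized frame size in [0, 1]).
frame_size = torch.tensor([0.3], requires_grad=True)   # current normalized frame size
context = torch.tensor([0.5, 0.8])                     # fixed n_stas, snr for this step
for _ in range(100):
    throughput = model(torch.cat([frame_size, context]).unsqueeze(0))
    grad, = torch.autograd.grad(throughput.sum(), frame_size)
    frame_size = (frame_size + 0.05 * grad).clamp(0.0, 1.0).detach().requires_grad_(True)

print(f"suggested normalized frame size: {frame_size.item():.3f}")
```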
Keywords
Frame Size Optimization, Downlink MU-MIMO, WLAN, Network Traffic, Machine Learning, Neural Network, Throughput Optimization.