Selected article for: "high fidelity and neural network"

Author: Demirtas, Ali Murat; Seyfioglu, Mehmet Saygin; Bor-Yaliniz, Irem; Tavli, Bulent; Yanikomeroglu, Halim
Title: Autonomous UAV Base Stations for Next Generation Wireless Networks: A Deep Learning Approach
  • Cord-id: aoaqcm0c
  • Document date: 2021_7_29
    Snippet: To address the ever-growing connectivity demands of wireless communications, the adoption of ingenious solutions, such as Unmanned Aerial Vehicles (UAVs) as mobile Base Stations (BSs), is imperative. In general, the location of a UAV Base Station (UAV-BS) is determined by optimization algorithms, which have high computational complexity and place heavy demands on UAV resources. In this paper, we show that a Convolutional Neural Network (CNN) model can be trained to infer the location of a UA
    Document: To address the ever-growing connectivity demands of wireless communications, the adoption of ingenious solutions, such as Unmanned Aerial Vehicles (UAVs) as mobile Base Stations (BSs), is imperative. In general, the location of a UAV Base Station (UAV-BS) is determined by optimization algorithms, which have high computational complexity and place heavy demands on UAV resources. In this paper, we show that a Convolutional Neural Network (CNN) model can be trained to infer the location of a UAV-BS in real time. To do so, we create a framework that considers the deployment of Mobile Users (MUs) and generates training labels from the solutions of an optimization algorithm. Performance evaluations reveal that, once trained on these labels and MU locations, the CNN approximates the results of the adopted optimization algorithm with high fidelity and outperforms Reinforcement Learning (RL)-based approaches. We also explore future research challenges and highlight key issues.
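
    A minimal sketch of the approach described in the abstract, assuming the Mobile User (MU) deployment is rasterized into a density grid and a small CNN regresses the 2D UAV-BS coordinates against labels produced offline by an optimization algorithm. The grid size, network depth, and L1 (mean absolute error) loss are illustrative assumptions, not the authors' exact architecture.

        # Illustrative sketch only (not the paper's exact model): a small CNN that maps a
        # rasterized Mobile User (MU) density grid to a 2D UAV-BS placement, supervised by
        # labels produced offline by an optimization algorithm.
        import torch
        import torch.nn as nn

        class UAVPlacementCNN(nn.Module):
            def __init__(self, grid_size: int = 64):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                    nn.MaxPool2d(2),
                    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                    nn.MaxPool2d(2),
                )
                flat = 32 * (grid_size // 4) ** 2
                self.head = nn.Sequential(
                    nn.Flatten(),
                    nn.Linear(flat, 128), nn.ReLU(),
                    nn.Linear(128, 2),  # predicted (x, y) UAV-BS location, normalized to [0, 1]
                )

            def forward(self, mu_density_grid: torch.Tensor) -> torch.Tensor:
                return self.head(self.features(mu_density_grid))

        # One training step against labels from the (assumed) offline optimizer.
        model = UAVPlacementCNN()
        train_optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
        loss_fn = nn.L1Loss()  # mean absolute error between predicted and label locations

        mu_grids = torch.rand(8, 1, 64, 64)  # batch of rasterized MU deployments (placeholder data)
        optimal_xy = torch.rand(8, 2)        # labels from the optimization algorithm (placeholder data)

        train_optimizer.zero_grad()
        loss = loss_fn(model(mu_grids), optimal_xy)
        loss.backward()
        train_optimizer.step()

    At inference time the trained model replaces the optimization solver: a single forward pass over the current MU density grid yields the UAV-BS location, which is what makes real-time placement feasible on constrained UAV hardware.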

    Search related documents:
    Co-phrase search for related documents
    • absolute error and loss function: 1, 2
    • activation function and loss function: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12