Aviation Surveillance Information Fusion Technology Based on Recurrent Neural Network
Abstract— Aviation surveillance information fusion aims to merge detection data from multiple sources tracking the same target aircraft in order to obtain more accurate monitoring information, including aircraft position, heading, and acceleration. Traditional Kalman filter-based fusion performs poorly when the target is maneuvering, and its repeated parameter tuning consumes considerable manpower and material resources. This paper therefore applies a recurrent neural network (RNN) to aviation surveillance information fusion. First, an RNN identifies the maneuvering state of the aircraft, and the weighted least squares method predicts the aircraft's position according to that state, aligning the monitoring information of each radar to a common time. An RNN model then fuses the monitoring information from the multiple radars. Experimental results show that the RNN-based maneuvering-state discrimination model identifies the maneuvering state effectively, that the maneuvering-state-aware least squares method predicts the aircraft's position accurately, and that the RNN-based fusion model yields more accurate fused results. The whole process comprises four parts (preprocessing, maneuvering-state discrimination, position prediction, and information fusion) and takes about 500 ms in total.
Index Terms— Maneuvering state, Position prediction, Aviation surveillance information fusion, RNN.
Zhanchun Gao, Anyu Song
School of Computer Science, Beijing University of Posts and Telecommunications, CHINA
Cite: Zhanchun Gao, Anyu Song, "Aviation Surveillance Information Fusion Technology Based on Recurrent Neural Network," Proceedings of the 2019 9th International Workshop on Computer Science and Engineering, pp. 329-336, Hong Kong, 15-17 June, 2019.
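The abstract's second step, predicting each radar track's position at a common timestamp via weighted least squares, can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes a constant-velocity model x(t) = a + b*t for the non-maneuvering case, and all function names and weights here are illustrative.

```python
# Hypothetical sketch: weighted least squares (WLS) extrapolation of one
# position coordinate to a common timestamp, used to time-align radar
# tracks before fusion. Assumes a constant-velocity model x(t) = a + b*t
# (the non-maneuvering case); a maneuvering target would need a
# higher-order or turn model instead.

def wls_fit(ts, xs, ws):
    """Fit x(t) = a + b*t by weighted least squares; return (a, b)."""
    sw   = sum(ws)
    swt  = sum(w * t for w, t in zip(ws, ts))
    swx  = sum(w * x for w, x in zip(ws, xs))
    swtt = sum(w * t * t for w, t in zip(ws, ts))
    swtx = sum(w * t * x for w, t, x in zip(ws, ts, xs))
    det = sw * swtt - swt * swt          # determinant of the 2x2 normal matrix
    a = (swtt * swx - swt * swtx) / det  # intercept
    b = (sw * swtx - swt * swx) / det    # velocity
    return a, b

def predict(ts, xs, ws, t_common):
    """Extrapolate the fitted trajectory to the common timestamp."""
    a, b = wls_fit(ts, xs, ws)
    return a + b * t_common

# Example: equally weighted samples advancing 2 units per time step.
ts = [0.0, 1.0, 2.0, 3.0]
xs = [10.0, 12.0, 14.0, 16.0]
ws = [1.0] * 4
print(predict(ts, xs, ws, 4.0))  # exact linear data -> 18.0
```

In practice the weights would come from each radar's measurement accuracy, so that a higher-precision sensor pulls the fit more strongly.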