[1] MIN Huan, LU Hu, SHI Haodong. Visual Collaborative Control for Unmanned Aerial Vehicle Swarm Based on Deep Neural Network [J]. Journal of Xi'an Jiaotong University, 2020, 54(09): 173-179+196. [doi:10.7652/xjtuxb202009020]

Visual Collaborative Control for Unmanned Aerial Vehicle Swarm Based on Deep Neural Network

Journal of Xi'an Jiaotong University [ISSN:0253-987X/CN:61-1069/T]

Volume:
54
Issue:
2020, No. 09
Pages:
173-179+196
Publication Date:
2020-09-10

Article Information / Info

Title:
Visual Collaborative Control for Unmanned Aerial Vehicle Swarm Based on Deep Neural Network
Article Number:
0253-987X(2020)09-0173-07
Author(s):
MIN Huan, LU Hu, SHI Haodong
College of Information and Navigation, Air Force Engineering University, Xi'an 710077, China
Keywords:
unmanned aerial vehicle formation; end-to-end control; target detection and recognition; neural network slimming; visual following control
CLC Number:
TP29
DOI:
10.7652/xjtuxb202009020
Document Code:
A
Abstract:
Existing unmanned aerial vehicle (UAV) swarm cooperative control schemes rely on wireless communication and therefore fail in electromagnetically denied environments. To address this, an end-to-end control algorithm for UAV formation that uses only onboard vision sensors is proposed. The classic YOLOv3 object-detection network is pruned via network slimming to make it suitable for embedded systems, and a UAV visual following control algorithm is designed around the slimmed network. The network extracts the bounding box of the leader UAV, and the coordinate error and size error between this bounding box and the expected bounding box are computed; the coordinate error serves as the feedback input for yaw-angle control, and the size error serves as the feedback input for speed control, driving the follower UAV to track the leader. Simulation comparisons with a radio-based leader-follower algorithm show that vision-based following achieves better control performance in certain environments, such as under communication jamming and GPS denial. Field experiments with two Bebop 2 drones show that the proposed algorithm tracks the leader's state within 2 s, with a steady-state speed-control error within 5% and a steady-state yaw-angle-control error within 3%, indicating good application prospects.
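The channel-pruning step the abstract mentions follows the network-slimming idea of reference [11] (Liu et al., ICCV 2017): each convolutional channel carries a batch-normalization scaling factor gamma, and channels whose |gamma| falls below a global percentile threshold are removed. The paper does not publish its pruning code, so the sketch below is only an illustration of the channel-selection rule; the function name, gains, and values are hypothetical.

```python
import numpy as np

def select_channels(gammas, prune_ratio=0.5):
    """Return indices of channels to KEEP, given per-channel BN scaling
    factors. Channels with |gamma| at or below the prune_ratio percentile
    are discarded (illustrative sketch of network slimming, after [11])."""
    g = np.abs(np.asarray(gammas, dtype=float))
    # Global threshold: the prune_ratio quantile of all |gamma| values.
    thresh = np.percentile(g, 100 * prune_ratio)
    return np.nonzero(g > thresh)[0]

# Toy layer with six channels; small gammas mark low-importance channels.
kept = select_channels([0.9, 0.01, 0.5, 0.02, 0.7, 0.03], prune_ratio=0.5)
```

In the actual method, the slimmed network is typically fine-tuned after pruning to recover detection accuracy before deployment on the embedded platform.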
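The following-control loop described in the abstract can be sketched as a proportional mapping from bounding-box errors to commands: the horizontal offset of the detected box center from the expected center drives yaw, and the deviation of the box size from the expected size drives forward speed (a leader that appears smaller than expected means the follower should speed up). This is a minimal sketch under assumed conventions; the function name, gain values, and pixel coordinates are illustrative, not from the paper.

```python
def following_commands(box, desired_box, k_yaw=1.0, k_speed=1.0, img_width=640):
    """Map bounding-box errors to (yaw-rate, speed) commands.

    box, desired_box: (cx, cy, w, h) in pixels for the detected and
    expected bounding boxes of the leader UAV. Proportional gains are
    placeholders; the paper's controller parameters are not published.
    """
    cx, _, w, h = box
    dcx, _, dw, dh = desired_box
    # Coordinate error: horizontal offset of the box center, normalized
    # by image width -> feedback input for yaw-angle control.
    coord_err = (dcx - cx) / img_width
    # Size error: relative area deficit; positive when the leader looks
    # smaller than desired -> feedback input for speed control.
    size_err = (dw * dh - w * h) / (dw * dh)
    return k_yaw * coord_err, k_speed * size_err

# Leader detected right of center and smaller than desired:
# turn right-to-left correction (negative yaw) and accelerate.
yaw, speed = following_commands((400, 240, 80, 60), (320, 240, 100, 75))
```

In practice both error signals would feed full PID loops rather than pure proportional terms, and the detector output would be filtered across frames before being used as feedback.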

References:

[1] LIN W. Distributed UAV formation control using differential game approach [J]. Aerospace Science and Technology, 2014, 35: 54-62.
[2] HENGSTER-MOVRIC K, BOGDAN S, DRAGANJAC I. Multi-agent formation control based on bell-shaped potential functions [J]. Journal of Intelligent and Robotic Systems, 2010, 58(2): 165-189.
[3] ZHANG Jialong, YAN Jianguo, ZHANG Pu, et al. Collision avoidance of unmanned aerial vehicle formation based on consensus control algorithm [J]. Journal of Xi'an Jiaotong University, 2018, 52(9): 168-174.
[4] CHEN T, GAO Q, GUO M Y. An improved multiple UAVs cooperative flight algorithm based on leader-follower strategy [C]∥2018 Chinese Control and Decision Conference(CCDC). Piscataway, NJ, USA: IEEE, 2018: 165-169.
[5] BERTINETTO L, VALMADRE J, HENRIQUES J F, et al. Fully-convolutional siamese networks for object tracking [C]∥European Conference on Computer Vision. Cham, Switzerland: Springer, 2016: 850-865.
[6] PESTANA J, SANCHEZ-LOPEZ J L, SARIPALLI S, et al. Computer vision based general object following for GPS-denied multirotor unmanned vehicles [C]∥2014 American Control Conference. Piscataway, NJ, USA: IEEE, 2014: 1886-1891.
[7] KALAL Z, MIKOLAJCZYK K, MATAS J. Tracking-learning-detection [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2012, 34(7): 1409-1422.
[8] SASKA M, BACA T, THOMAS J, et al. System for deployment of groups of unmanned micro aerial vehicles in GPS-denied environments using onboard visual relative localization [J]. Autonomous Robots, 2017, 41(4): 919-944.
[9] LIN F, PENG K M, DONG X X, et al. Vision-based formation for UAVs [C]∥11th IEEE International Conference on Control & Automation(ICCA). Piscataway, NJ, USA: IEEE, 2014: 1375-1380.
[10] ZHAO Taifei, XU Shan, QU Yao, et al. Cluster-based algorithm of reconnaissance UAV swarm based on wireless ultraviolet secret communication [J]. Journal of Electronics & Information Technology, 2019, 41(4): 967-972.
[11] LIU Z, LI J G, SHEN Z Q, et al. Learning efficient convolutional networks through network slimming [C]∥2017 IEEE International Conference on Computer Vision(ICCV). Piscataway, NJ, USA: IEEE, 2017: 2755-2763.
[12] ZHANG Z. A flexible new technique for camera calibration [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2000, 22(11): 1330-1334.
[13] REN S Q, HE K M, GIRSHICK R, et al. Faster R-CNN: towards real-time object detection with region proposal networks [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2017, 39(6): 1137-1149.
[14] DAI Jifeng, LI Yi, HE Kaiming, et al. R-FCN: object detection via region-based fully convolutional networks [EB/OL]. [2020-01-20]. https://arxiv.org/abs/1605.06409.
[15] LIU W, ANGUELOV D, ERHAN D, et al. SSD: single shot multibox detector [C]∥European Conference on Computer Vision. Cham, Switzerland: Springer, 2016: 21-37.
[16] HUANG J, RATHOD V, SUN C, et al. Speed/accuracy trade-offs for modern convolutional object detectors [C]∥2017 IEEE Conference on Computer Vision and Pattern Recognition(CVPR). Piscataway, NJ, USA: IEEE, 2017: 3296-3297.
[17] REDMON J, FARHADI A. YOLOv3: an incremental improvement [EB/OL]. [2020-01-20]. https://arxiv.org/abs/1804.02767.
[18] TURPIN M, MICHAEL N, KUMAR V. Trajectory design and control for aggressive formation flight with quadrotors [J]. Autonomous Robots, 2012, 33(1): 143-156.

Similar Articles / References:

[1] ZHANG Jialong, YAN Jianguo, XIAO Bing, et al. Design of Tracking Control Laws and Information Architecture for Cooperative Formation of Unmanned Aerial Vehicles [J]. Journal of Xi'an Jiaotong University, 2019, 53(06): 134. [doi:10.7652/xjtuxb201906018]

Memo:
Received: 2020-03-25. Biographies: MIN Huan (1996-), male, master's student; LU Hu (corresponding author), male, professor. Foundation item: National Defense Science and Technology Innovation Special Zone Project (1816300TS0050010X).
Last Update: 2020-09-10