    Please use this identifier to cite or link to this item: https://nccur.lib.nccu.edu.tw/handle/140.119/133894


    Title: 無人機於建築物周圍指定區域之視覺導航降落方法
    Visual Navigation for UAV Landing on Accessory Building Floor
    Authors: 劉効哲
    Liu, Hsiao-Che
    Contributors: 劉吉軒
    Liu, Jyi-Shane
    劉効哲
    Liu, Hsiao-Che
    Keywords: UAV
    decision control
    behavior tree
    image/target feature-point recognition
    visual navigation
    Date: 2020
    Issue Date: 2021-02-01 14:10:34 (UTC+8)
    Abstract: In recent years, UAVs have moved beyond military applications and become increasingly common in everyday life, and many fields have begun integrating UAV technology to develop autonomous capabilities. For example, Wing, the drone subsidiary of Google's parent company Alphabet, became the first drone delivery company in the United States, applying destination-detection and landing-site-search techniques to actual cargo delivery; Amazon equips its drones with sensing devices as well as standard and infrared cameras for analyzing the surrounding environment, in order to develop delivery drones capable of long-distance flight.
    In most real-world UAV missions, landing is a critical step, especially in cargo transport and delivery: a delivery succeeds only once the UAV has landed on, or is hovering at low altitude above, the target landing point. For precise landing requirements, vision-based navigation offers high reliability and accuracy. In this thesis, we present research on precise visual navigation for autonomous landing on accessory platforms around buildings. We combine several state-of-the-art vision-based methods, develop additional functional components, control the decision logic with a behavior tree, and integrate the vision modules with the UAV's flight navigation control, providing a practical autonomous navigation system for precise landing near buildings. Initial real-world experiments show that vision-based navigation achieves a high success rate in precise landing.
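    The abstract names two concrete mechanisms: feature-point recognition to find the landing target in the camera image, and a behavior tree that controls the decision logic gating the flight commands. The following Python sketch (not the thesis's actual code) illustrates one way these pieces could fit together: OpenCV SIFT matching with a RANSAC homography locates a known landing-pad image, and a minimal behavior-tree Sequence descends only once the pad has been found. The reference image `target.png`, the drone interface (`move_xy`, `descend`), and all thresholds are assumptions made for illustration.

```python
import cv2
import numpy as np

MIN_GOOD_MATCHES = 10   # assumed minimum inlier support for a detection
RATIO = 0.75            # Lowe's ratio-test threshold

sift = cv2.SIFT_create()
matcher = cv2.BFMatcher()

# Reference image of the landing pad (hypothetical file name).
target = cv2.imread("target.png", cv2.IMREAD_GRAYSCALE)
kp_t, des_t = sift.detectAndCompute(target, None)

def locate_pad(frame_gray):
    """Return the pad centre in frame pixels, or None if it is not visible."""
    kp_f, des_f = sift.detectAndCompute(frame_gray, None)
    if des_f is None or len(kp_f) < 2:
        return None
    # Lowe's ratio test keeps only distinctive correspondences.
    good = [m for m, n in matcher.knnMatch(des_t, des_f, k=2)
            if m.distance < RATIO * n.distance]
    if len(good) < MIN_GOOD_MATCHES:
        return None
    src = np.float32([kp_t[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_f[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)  # RANSAC rejects outliers
    if H is None:
        return None
    h, w = target.shape
    corners = np.float32([[0, 0], [w, 0], [w, h], [0, h]]).reshape(-1, 1, 2)
    return cv2.perspectiveTransform(corners, H).reshape(-1, 2).mean(axis=0)

# Minimal behavior-tree skeleton.
SUCCESS, FAILURE, RUNNING = "SUCCESS", "FAILURE", "RUNNING"

class Sequence:
    """Tick children left to right; stop at the first non-SUCCESS result."""
    def __init__(self, *children):
        self.children = children
    def tick(self, bb):
        for child in self.children:
            status = child.tick(bb)
            if status != SUCCESS:
                return status
        return SUCCESS

class FindPad:
    def tick(self, bb):
        centre = locate_pad(bb["frame"])
        if centre is None:
            return FAILURE
        bb["pad_centre"] = centre
        return SUCCESS

class AlignAndDescend:
    def tick(self, bb):
        # Hypothetical drone interface: nudge toward the pad centre, then
        # descend once the horizontal pixel error is within tolerance.
        err = bb["pad_centre"] - bb["frame_centre"]
        if np.linalg.norm(err) > 20:      # assumed pixel tolerance
            bb["drone"].move_xy(err)      # assumed lateral velocity command
            return RUNNING
        bb["drone"].descend()             # assumed descent command
        return SUCCESS

land_behavior = Sequence(FindPad(), AlignAndDescend())
# Per control cycle:
# status = land_behavior.tick({"frame": gray_frame,
#                              "frame_centre": np.array([w / 2, h / 2]),
#                              "drone": uav})
```

    Because a Sequence fails as soon as any child fails, a lost detection on any control cycle immediately stops the descent branch, and a supervising node higher in the tree can fall back to hovering or re-searching; this fail-fast ticking is what makes behavior trees attractive for reactive UAV control.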
    Description: Master's thesis
    National Chengchi University
    Department of Computer Science
    107753028
    Source URI: http://thesis.lib.nccu.edu.tw/record/#G0107753028
    Data Type: thesis
    DOI: 10.6814/NCCU202100033
    Appears in Collections: [Department of Computer Science] Theses

    Files in This Item:

    File         Size     Format
    302801.pdf   3206Kb   Adobe PDF


    All items in the NCCU Institutional Repository are protected by copyright, with all rights reserved.



    Copyright Announcement
    1. The digital content of this website is part of the National Chengchi University Institutional Repository. It provides free access for academic research and public education on a non-commercial basis. Please use it in a proper and reasonable manner and respect the rights of copyright owners. For commercial use, please obtain authorization from the copyright owner in advance.

    2. This website has been made with care to protect the interests of copyright owners. If you believe that any material on the website infringes copyright, please contact our staff (nccur@nccu.edu.tw). We will remove the work from the repository and investigate your claim.