National Chengchi University Institutional Repository (NCCUR): Item 140.119/116499


    Please use this persistent URL to cite or link to this item: https://nccur.lib.nccu.edu.tw/handle/140.119/116499


    Title: ACCURACY ASSESSMENT OF PRECISE INDIRECT GEOREFERENCING FOR IMAGES OF UNMANNED AIRCRAFT SYSTEMS
    UAS影像精確間接地理定位精度研究
    Author: Chio, Shih-Hong (邱式鴻)
    Contributor: Department of Land Economics (地政系)
    Keywords: VBS RTK GPS (虛擬基站即時動態定位); bundle adjustment (光束法平差); aerial triangulation (空中三角測量); UAS (無人飛行載具系統)
    Date: 2016-08
    Upload time: 2018-03-22 17:17:02 (UTC+8)
    Abstract: Unmanned Aircraft Systems (UASs) can collect high-resolution, high-quality images for local mapping. Before UAS images can be used for accurate mapping tasks in local areas, their precise position and orientation must first be determined. Direct georeferencing by a POS (Position and Orientation System), a combination of GPS (Global Positioning System) and an IMU (Inertial Measurement Unit), is the best choice; however, most commercial UASs cannot carry highly accurate IMUs because of their limited payload. This study therefore examines the accuracy of indirect georeferencing for UAS images. One approach to indirect georeferencing is general aerial triangulation (AT) using well-distributed ground control points (GCPs); the other is GPS-supported AT, which uses GPS observations as airborne controls. In this paper, the camera is calibrated by the field method, and the accuracy of these two indirect georeferencing approaches is presented. Based on 20 horizontal and 29 vertical check points, the stereoscopic viewing accuracy of general AT for UAS images, collected by a Canon EOS 5D Mark II camera with a 24 mm F/1.4L II USM lens at a flying height of 550 m, is about 0.26 m (ca. 1.73 pixels) in planimetry and 0.27 m (ca. 1.80 pixels) in height. GPS-supported AT yielded a stereoscopic viewing accuracy of about 0.44 m (ca. 2.93 pixels) in planimetry and 0.55 m (ca. 3.67 pixels) in height. The test results show that both indirect georeferencing approaches for fixed-wing UAS images meet the accuracy requirements for updating local 1/5,000 topographic maps in Taiwan.
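    The metre-to-pixel conversions in the abstract follow from the ground sample distance (GSD) at the stated flying height. As a rough cross-check only, the sketch below recomputes the GSD from an assumed Canon EOS 5D Mark II pixel pitch of about 6.4 µm (36 mm sensor width / 5616 pixels, a value not stated in this record) and converts the reported accuracies into pixel units.

    # Rough cross-check, not part of the paper: approximate GSD and
    # metre-to-pixel conversion for the reported stereoscopic accuracies.
    PIXEL_PITCH_M = 36e-3 / 5616     # assumed Canon EOS 5D Mark II pixel pitch (~6.41 µm)
    FOCAL_LENGTH_M = 24e-3           # 24 mm lens (from the record)
    FLYING_HEIGHT_M = 550.0          # flying height in metres (from the record)

    # Nadir-image ground sample distance: GSD = pixel_pitch * H / f
    gsd = PIXEL_PITCH_M * FLYING_HEIGHT_M / FOCAL_LENGTH_M
    print(f"approx. GSD: {gsd:.3f} m")   # roughly 0.15 m

    # Reported accuracies in metres, converted to pixel units via the GSD.
    for label, rms_m in [("general AT, planimetry", 0.26),
                         ("general AT, height", 0.27),
                         ("GPS-supported AT, planimetry", 0.44),
                         ("GPS-supported AT, height", 0.55)]:
        print(f"{label}: {rms_m:.2f} m ~ {rms_m / gsd:.2f} px")

    The metre/pixel pairs reported in the abstract (e.g. 0.26 m, ca. 1.73 pixels) imply a GSD of about 0.15 m, which agrees with this estimate up to rounding of the assumed pixel pitch.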
    Relation: Journal of the Chinese Institute of Civil and Hydraulic Engineering, Vol. 28, No. 2, pp. 67-76
    中國土木水利工程學刊, Vol. 28, No. 2 (2016/06/01), pp. 95-104
    Data type: article
    DOI link: http://dx.doi.org/10.6652/JoCICHE%2f2016-02802-03
    DOI: 10.6652/JoCICHE/2016-02802-03
    Appears in collections: [Department of Land Economics] Journal Articles

    Files in this item:

    File: 2016-02802-03.pdf    Size: 807 KB    Format: Adobe PDF    Views: 2334


    All items in NCCUR are protected by copyright, with all rights reserved.



    Copyright Announcement
    1. The digital content of this website is part of the National Chengchi University Institutional Repository, provided free of charge for academic research and public education on a non-commercial basis. Please use it in a proper and reasonable manner and respect the rights of copyright owners. For commercial use, please obtain authorization from the copyright owner in advance.

    2. This website has been made with every effort to avoid infringing the rights of copyright owners. If you believe that any material in the repository infringes copyright, please notify our staff (nccur@nccu.edu.tw); the work will be removed promptly while your claim is investigated.