    Please use this identifier to cite or link to this item: http://nccur.lib.nccu.edu.tw/handle/140.119/55377


    Title: 智慧型3D瀏覽介面中即時運動計畫演算法的設計(II)
    Other Titles: Designing Real-Time Motion Planning Algorithms for Intelligent 3D Navigation Interface (II)
    Authors: 李蔡彥 (Li, Tsai-Yen)
    Contributors: Department of Computer Science, National Chengchi University
    National Science Council, Executive Yuan
    Keywords: intelligent 3D navigation interface; real-time motion planning; algorithms
    Date: 2006
    Issue Date: 2012-11-12 11:01:46 (UTC+8)
    Abstract: 雖然電腦繪圖硬體的處理速度已有長足的進步,但由於虛擬演員的高自由度及操控介面的即時要求,在3D虛擬環境中實現動畫角色的即時控制仍是極具挑戰的課題。在本計畫中,我們以兩年的時間研究以運動計畫演算法,設計即時控制3D虛擬角色的智慧型人機介面,讓使用者能以即時互動的方式有效操控具有高自由度的動畫角色。在第一年的研究裡,我們以第一人稱視點運動控制方式為範疇,成功開發出能適應個人操控特性的線上輔助介面,並提升由電腦所產生輔助路徑的品質。在第二年的研究裡,我們嘗試將此控制方式提昇為更具挑戰的第三人稱控制法。我們根據視點是否依附在動畫角色上,設計了兩類第三人稱的控制輔助方法,即時產生能與環境互動的角色動畫。第一類是針對架設於動畫角色後方的攝影機,以分解計畫法則,產生能適應環境的人體上半身動畫。第二類是針對與動畫角色運動獨立的攝影機,以運動擷取資料庫為輔助,即時合成動畫角色多樣化的全身運動。目前這兩類技術均已能整合在一般個人電腦的操控介面上,產生能與環境互動的角色動畫。此計畫目前的研究成果已達到計畫原定目標,我們並已逐步將成果整理成論文,於各國際研討會或期刊中發表。
    Despite the advances in graphics hardware development, controlling an interactive 3D character is still a great challenge due to the high degrees of freedom involved in controlling a virtual character and the real-time requirement of interactive interface. In this project, we have used two years to investigate how to make use of motion planning algorithms to design an effective intelligent user interface for controlling a digital actor in real time in walkthrough applications. In the first year, we have succeeded in developing an intelligent navigation interface with a first-person view that is adaptive to user navigation behaviors. In the second year, we attempt to extend the goal to cover the more challenging third-person control mode, where the viewpoint is detached from the eyes of the virtual character. According to the fact whether the viewpoint moves with the character or not, we have developed two methods to generate motions in real time. By assuming that the camera is placed behind the character, in the first method we use a decoupled motion planning algorithm to generate compliant motions for the upper body of the character. In the second method, by making use of a library of captured motions, we attempt to synthesize versatile motions in real time for the full body of the character. Both methods have successfully been integrated into 3D user navigation interfaces operated with a mouse or keyboard on a regular desktop computer. We have achieved the goals specified in the proposal, and we are in the process of publishing the results in international conferences and journals.
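The abstract's second method selects motions for the character from a library of captured clips in response to real-time user input. As a greatly simplified illustration of that idea (not the project's actual algorithm), the toy sketch below picks, from a small hypothetical clip library, the clip whose movement direction best matches the heading the user requests; all names and data here are invented for illustration.

```python
import math

# Hypothetical toy motion library: clip name -> unit movement direction (x, y).
# A real system would store full-body motion-capture clips instead.
MOTION_LIBRARY = {
    "walk_forward": (0.0, 1.0),
    "walk_back":    (0.0, -1.0),
    "strafe_left":  (-1.0, 0.0),
    "strafe_right": (1.0, 0.0),
}

def pick_clip(desired_dx, desired_dy):
    """Return the clip whose direction has the largest dot product
    with the user's desired movement vector; 'idle' if no movement."""
    norm = math.hypot(desired_dx, desired_dy)
    if norm == 0:
        return "idle"
    dx, dy = desired_dx / norm, desired_dy / norm
    return max(MOTION_LIBRARY,
               key=lambda name: MOTION_LIBRARY[name][0] * dx
                              + MOTION_LIBRARY[name][1] * dy)
```

For example, `pick_clip(0.0, 1.0)` selects `"walk_forward"`. A real data-driven synthesizer would also blend between clips and enforce continuity constraints, which this sketch omits.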
    Relation: Applied research
    Academic grant
    Research period: 9508 – 9607 (ROC calendar, i.e., Aug. 2006 – Jul. 2007)
    Funding: NT$534,000
    Data Type: report
    Appears in Collections: [Department of Computer Science] NSC Research Projects

    Files in This Item:

    File: report12.pdf    Size: 1857 KB    Format: Adobe PDF    Views: 800


    All items in 政大典藏 (the NCCU Institutional Repository) are protected by copyright, with all rights reserved.



    Copyright Policy Statement
    1. The digital content on this site belongs to the institutional repository of National Chengchi University and is provided free of charge for public-interest purposes such as academic research and public education. Please make moderate and fair use of the content and respect the rights of the copyright holders. For commercial use, please first obtain authorization from the copyright holder.
    2. Every effort has been made in building this site to avoid infringing the rights of copyright holders. If you nevertheless find digital content on this site that infringes a copyright holder's rights, please notify the site maintainers (nccur@nccu.edu.tw), who will immediately take remedial measures such as removing the work in question.