    Please use this identifier to cite or link to this item: https://nccur.lib.nccu.edu.tw/handle/140.119/54864


    Title: 多層式動作圖
    Multi-Layered Motion Graph
    Authors: 林志忠
    Lin, Chih Chung
    Contributors: 李蔡彥
    Li, Tsai Yen
    林志忠
    Lin, Chih Chung
    Keywords: 動作圖
    動作擷取
    電腦動畫
    角色動畫
    Motion Graph
    Motion Capture
    Computer Animation
    Character Animation
    Date: 2011
    Issue Date: 2012-10-30 14:01:28 (UTC+8)
    Abstract: 動作擷取法是現今相當受到歡迎的角色動作產生方法,而一般多是使用已擷取好的動作,以人工的方式將數個不同的動作混合以產生出所需的動作。但想要大量產生符合需求的混合動作仍相當不容易,因此有人提出了「動作圖」這個方法。動作圖是一種根據使用者所給定的動作擷取資料集合,經過自動化的計算找出各個動作資料之間可以連接的動作片段。藉由這個自動化的程序,各個動作擷取資料可以相互連接起來,達到在不同的動作間平順轉換,且同時保有原動作擷取資料擬真特性的目的。但縱使有上述的好處,目前動作圖的技術僅能就所擷取的全身動作進行串接,品質與彈性往往決定於一開始動作擷取資料的準備,因此如何讓既有的全身動作資料得以分解再利用,以發揮最大的價值,是一個重要的問題。在本研究中,我們提出了一個階層式的動作圖結構名為多層式動作圖,在這個多層式動作圖的結構中,我們將身體的動作區分成數個部位,分別計算各自的動作圖後再合併成一個多層式的架構,而合併的過程中我們提出「整體動作相似度」的計算方式,以做為兩個動作是否容易轉接的比較依據。我們也提出了在不同階層間動作圖運作的規則,以使計算的複雜度及系統的可用性取得合理的平衡。此外,我們更進一步提出名為Motion Script的簡易語意描述語言,來輔助控制這個具有高複雜度的動作圖結構。實驗的結果顯示,我們的方法可以即時根據使用者的指令,搜尋並產生出原動作資料所沒有的動作組合。與傳統的動作圖相比,我們的方法能更進一步的發揮原動作擷取資料的價值,以有系統的方式讓動作組合自動產生更具豐富性及彈性。
    Motion capture is a popular method for generating realistic character animation. In most applications, a desired motion is usually prepared by manually blending existing captured motion clips. However, manually finding good transition points between two motion clips is time-consuming and does not scale up easily. Motion Graph is a technique that has been proposed to automate this process by finding suitable connection points, and the corresponding transition motions, between motion data. With this automatic procedure, motions captured separately can be smoothly connected while keeping the realism of the captured motions. However, most motion graph techniques only consider transitions between the full-body motions of two motion clips, so the resulting motion depends on the variety of motions available in the motion database. Being able to compose as many new motion clips as possible from a given motion capture database is therefore an important issue. In this research, we propose a hierarchical motion graph structure called Multi-Layered Motion Graph. In this structure, we divide motion data into layers of body parts according to the articulated structure of the human body, compute a motion graph for each part, and then combine these motion graphs into an interconnected hierarchical structure. To facilitate composing motions for different parts taken from different motion clips, we propose a new metric called Overall Motion Similarity for finding reasonable compositions of motions at run time. We also propose several rules governing how the motion graphs in different layers are traversed to generate feasible motions. Furthermore, we have designed a scripting language called Motion Script to facilitate the specification and search of the desired animation to be generated. Our experimental results show that our method can compose, in real time, animations that the original motion graph cannot generate. Compared to the traditional motion graph method, our method makes better use of an existing motion capture library to compose new motions in a systematic way.
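    The following Python sketch is a minimal illustration of the structure described in the abstract: one motion graph per body part, combined under a weighted per-part similarity check. All names here (PartMotionGraph, MultiLayeredMotionGraph, part_distance, overall_motion_similarity), the Euclidean pose distance, the weighted-sum form of the metric, and the threshold test are assumptions made for this example; the thesis defines its own Overall Motion Similarity metric and traversal rules, which may differ.

# Illustrative sketch only; class and function names are hypothetical, not
# taken from the thesis. Assumes per-part poses are stored as joint-angle
# vectors and that "similarity" is a weighted sum of Euclidean distances.
import numpy as np

class PartMotionGraph:
    """Motion graph for a single body part (e.g. upper body, lower body).

    Nodes are frame indices into this part's motion data; an edge marks a
    pair of frames between which a smooth transition is considered possible.
    """
    def __init__(self, frames):
        self.frames = frames            # array of shape (n_frames, n_dof)
        self.edges = {}                 # frame index -> list of reachable frame indices

    def add_transition(self, src, dst):
        self.edges.setdefault(src, []).append(dst)

def part_distance(pose_a, pose_b):
    """Per-part pose distance: Euclidean distance over the part's joint angles."""
    return float(np.linalg.norm(pose_a - pose_b))

def overall_motion_similarity(poses_a, poses_b, weights):
    """Stand-in for the thesis's 'Overall Motion Similarity': a weighted sum
    of per-part pose distances (lower means more similar)."""
    return sum(w * part_distance(a, b) for a, b, w in zip(poses_a, poses_b, weights))

class MultiLayeredMotionGraph:
    """One motion graph per body part, plus a threshold used to decide
    whether per-part motions from different clips can be composed."""
    def __init__(self, part_graphs, part_weights, threshold):
        self.part_graphs = part_graphs    # dict: part name -> PartMotionGraph
        self.part_weights = part_weights  # dict: part name -> weight in the metric
        self.threshold = threshold

    def can_compose(self, frame_ids_a, frame_ids_b):
        """Check whether two candidate per-part frame assignments (dicts of
        part name -> frame index) are similar enough to transition between."""
        parts = sorted(self.part_graphs)
        poses_a = [self.part_graphs[p].frames[frame_ids_a[p]] for p in parts]
        poses_b = [self.part_graphs[p].frames[frame_ids_b[p]] for p in parts]
        weights = [self.part_weights[p] for p in parts]
        return overall_motion_similarity(poses_a, poses_b, weights) <= self.threshold

    In a full implementation, a check like can_compose would be evaluated during graph traversal to decide when, for instance, the lower-body motion of one clip may be paired with the upper-body motion of another; the Motion Script language described in the thesis would then be used to specify which compositions to search for.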
    Description: 碩士 (Master's)
    國立政治大學
    資訊科學學系
    98753006
    100
    Source URI: http://thesis.lib.nccu.edu.tw/record/#G0098753006
    Data Type: thesis
    Appears in Collections:[資訊科學系] 學位論文

    Files in This Item:

    File          Size      Format
    300601.pdf    2604 KB   Adobe PDF

