    Please use this identifier to cite or link to this item: https://nccur.lib.nccu.edu.tw/handle/140.119/32693


    Title: 能表達音樂特徵的人體動畫自動產生機制
    Automatic Generation of Human Animation for Expressing Music Features
    Authors: 雷嘉駿
    Loi, Ka Chon
    Contributors: 李蔡彥
    Li, Tsai Yen
    雷嘉駿
    Loi, Ka Chon
    Keywords: 人體動畫
    虛擬環境
    音樂特徵
    human animation
    virtual environment
    music features
    Date: 2007
    Issue Date: 2009-09-17 14:04:22 (UTC+8)
    Abstract: 近年來電腦計算能力的進步使得3D虛擬環境得到廣泛的應用。本研究希望能在虛擬環境中結合人體動畫和音樂的特色,以人體動畫來詮釋音樂。我們希望能設計一個智慧型的人體動作產生器,賦予虛擬人物表達音樂特徵的能力,讓動作會因為“聽到”不同的音樂而有所不同。基於人類聽覺的短暫性,系統會自動抓取音樂特徵後將音樂切割成多個片段、對每一片段獨立規劃動作並產生動畫。過去動畫與音樂相關的研究中,許多生成的動作都經由修改或重組運動資料庫中的動作。本研究分析音樂和動作之間的關係,使用程序式動畫產生法自動產生多變且適當的詮釋動作。實驗顯示本系統能通用於LOA1人體模型和MIDI音樂;此外,透過調整系統中的參數,我們能產生不同風格的動畫,以符合不同使用者偏好和不同音樂曲風的特色。
    In recent years, advances in computing power have led to the widespread use of 3D virtual environments. In this thesis, we propose to combine character animation with music so that music can be interpreted through motion in a 3D virtual environment. The proposed system is an intelligent avatar motion generator that produces expressive motions according to music features, so that the avatar moves differently depending on the music it "hears." The system extracts music features from the input music, segments the music into several segments, and then plans the avatar's animation for each segment independently. In the literature, much music-related animation research composes new animations by modifying or recombining motions from existing motion databases. In this work, we analyze the relationship between music and motion and use procedural animation to automatically generate varied and appropriate motions that interpret the music. Our experiments show that the system works in general with LOA1 humanoid models and MIDI music as inputs, and that, by adjusting system parameters, it can generate animations of different styles to match users' preferences and different musical genres.
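    The abstract above outlines a concrete processing pipeline: extract music features, segment the music, plan motion for each segment, and expose style parameters. As a purely illustrative companion, the Python sketch below shows one possible shape of such a pipeline. Every name in it (Note, segment_by_rests, extract_features, plan_motion, energy_scale) and every feature-to-motion mapping is a hypothetical stand-in, not the thesis's actual feature set, animation engine, or parameterization.

```python
# Minimal sketch (not the thesis implementation): segment a parsed note stream
# at long rests, compute simple per-segment music features, and map them to
# procedural motion parameters for an avatar.
from dataclasses import dataclass
from typing import List


@dataclass
class Note:            # simplified stand-in for parsed MIDI events
    onset: float       # start time in seconds
    duration: float    # length in seconds
    pitch: int         # MIDI note number (0-127)


def segment_by_rests(notes: List[Note], max_gap: float = 0.8) -> List[List[Note]]:
    """Split the note stream into segments wherever a long silence occurs."""
    if not notes:
        return []
    segments, current = [], [notes[0]]
    for prev, nxt in zip(notes, notes[1:]):
        if nxt.onset - (prev.onset + prev.duration) > max_gap:
            segments.append(current)
            current = []
        current.append(nxt)
    segments.append(current)
    return segments


def extract_features(segment: List[Note]) -> dict:
    """Per-segment features: note density (tempo feel) and pitch statistics."""
    span = segment[-1].onset + segment[-1].duration - segment[0].onset
    pitches = [n.pitch for n in segment]
    return {
        "density": len(segment) / max(span, 1e-6),    # notes per second
        "mean_pitch": sum(pitches) / len(pitches),
        "pitch_range": max(pitches) - min(pitches),
    }


def plan_motion(features: dict, energy_scale: float = 1.0) -> dict:
    """Map features to procedural animation parameters (purely illustrative)."""
    return {
        "step_frequency": features["density"] * energy_scale,  # faster music -> faster motion
        "arm_height": (features["mean_pitch"] - 60) / 67,      # higher pitch -> raised arms
        "gesture_amplitude": features["pitch_range"] / 127,    # wide melodic range -> big gestures
    }


if __name__ == "__main__":
    melody = [Note(0.0, 0.4, 60), Note(0.5, 0.4, 64), Note(1.0, 0.4, 67),
              Note(3.0, 0.8, 72), Note(4.0, 0.8, 76)]
    for seg in segment_by_rests(melody):
        print(plan_motion(extract_features(seg)))
```

    In practice the note list would come from a MIDI parser (for example, a library such as mido or pretty_midi), and the resulting parameters would drive an LOA1 avatar through a procedural animation layer; the energy_scale argument stands in for the user-adjustable style parameters mentioned in the abstract.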
    Reference: [1] M. Cardle, L. Barthe, S. Brooks, and P. Robinson, “Music Driven Motion Editing: Local Motion Transformations Guided By Music Analysis,” in Proc. of the Eurographics UK Conference, 2002.
    [2] P.F. Chen, and T.Y. Li, “Generating Humanoid Lower-Body Motions with Real-time Planning,” in Proc. of 2002 Computer Graphics Workshop, 2002.
    [3] G. Cooper, and L.B. Meyer, The Rhythmic Structure of Music, Chicago: University of Chicago Press, 1960.
    [4] R. DeLone, “Aspects of Twentieth-Century Music,” Englewood Cliffs, New Jersey: Prentice-Hall, Chap. 4, pages 270-301, 1975.
    [5] W.J. Dowling, “Scale and Contour: Two components of a theory of memory for melodies,” Psychological Review, 1978.
    [6] R.O. Gjerdingen, “Apparent Motion in Music?,” Music Perception, Volume 11, pages 335-370, 1994.
    [7] R.I. Godøy, E. Haga, and A.R. Jensenius, “Playing ‘Air Instruments’: Mimicry of Sound-producing Gestures by Novices and Experts,” in Gesture in Human-Computer Interaction and Simulation: 6th International Gesture Workshop, 2005.
    [8] Humanoid Animation Working Group (H-Anim).
    http://www.h-anim.org
    [9] L. Kovar, M. Gleicher, and F. Pighin, “Motion Graphs,” in Proc. of ACM SIGGRAPH, 2002.
    [10] C.L. Krumhansl, “Cognitive Foundations of Musical Pitch,” Psychology of Music, Volume 20, pages 180-185, 1992.
    [11] R. Laban, and L. Ullmann, Mastery of Movement, Princeton Book Company Publishers, 1960.
    [12] E.W. Large, and J.F. Kolen, “Resonance and the perception of musical meter,” Connection Science, Volume 6, pages 177-208, 1994.
    [13] H.C. Lee, and I.K. Lee, “Automatic Synchronization of Background Music and Motion in Computer Animation,” Computer Graphics Forum, Volume 24, pages 353-362, 2005.
    [14] F. Lerdahl, and R. Jackendoff, A Generative Theory of Tonal Music, Cambridge: MIT Press, 1983.
    [15] M.Y. Liao, J.F. Liao, and T.Y. Li, “An Extensible Scripting Language for Interactive Animation in a Speech-Enabled Virtual Environment,” in Proc. of the IEEE International Conference on Multimedia and Expo, 2004.
    [16] M. Mancini, and G. Castellano, “Real-time analysis and synthesis of emotional gesture expressivity,” in Proc. of the Doctoral Consortium of 2nd International Conference on Affective Computing and Intelligent Interaction, 2007.
    [17] S. Mishra, and J.K. Hahn, “Mapping motion to sound and music in computer animation and VE,” in Proc. of Pacific Graphics ’95, 1995.
    [18] F. Multon, L. France, M.P. Cani-Gascuel, and G. Debunne, “Computer Animation of Human Walking: a Survey,” Journal of Visualization and Computer Animation, 1999.
    [19] J. Nakamura, T. Kaku, T. Noma, and S. Yoshida, “Automatic Background Music Generation Based on Actors’ Emotion and Motions,” in Proc. of Pacific Graphics, 1993.
    [20] S. Oore, and Y. Akiyama, “Learning to Synthesize Arm Motion to Music By Example,” in Proc. of the 14th International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision, 2006.
    [21] Rick Parent, Computer Animation: Algorithms and Techniques, Morgan Kaufmann Publishers, 2005.
    [22] Robert Rowe, Interactive Music Systems, Cambridge: MIT Press, 1993.
    [23] T. Shiratori, A. Nakazawa, and K. Ikeuchi, “Detecting dance motion structure through music analysis,” in Proc. of IEEE Int’l Conf. on Automatic Face and Gesture Recognition, 2004.
    [24] T. Shiratori, A. Nakazawa, and K. Ikeuchi, “Dancing-to-Music Character Animation,” in Computer Graphics Forum, Volume 25, pages 449-458, 2006.
    [25] I. Shmulevich, O. Yli-Harja, E. Coyle, D.J. Povel, and K. Lemström, “Perceptual Issues in Music Pattern Recognition: Complexity of Rhythm and Key Finding,” in Proc. of AISB Symposium on Musical Creativity, 2001.
    [26] M. Sung, L. Kovar, and M. Gleicher, “Fast and accurate goal-directed motion synthesis for crowds,” in Proc. of the ACM SIGGRAPH / Eurographics Symposium on Computer Animation, 2005.
    [27] IKAN (Inverse Kinematics using Analytical Methods).
    http://cg.cis.upenn.edu/hms/software/ikan/ikan.html
    [28] L. Torresani, P. Hackney, and C. Bregler, “Learning Motion Style Synthesis from Perceptual Observations,” in Proc. of the Neural Information Processing Systems Foundation, 2006.
    [29] A.L. Uitdenbogerd, and J. Zobel, “Manipulation of music for melody matching,” in Proc. of ACM International Multimedia Conference, 1998.
    [30] B. Vines, M.M. Wanderley, R. Nuzzo, D. Levitin, and C. Krumhansl, “Performance Gestures of Musicians: What Structural and Emotional Information do they Convey?,” Gesture-Based Communication in Human-Computer Interaction, Volume 2915/2004, pages 468-478, 2004.
    [31] D.J. Wiley, and J.K. Hahn, “Interpolation Synthesis of Articulated Figure Motion,” IEEE Computer Graphics and Applications, Volume 17, pages 39-45, 1997.
    Description: 碩士 (Master's thesis)
    國立政治大學 (National Chengchi University)
    資訊科學學系 (Department of Computer Science)
    95753006
    96
    Source URI: http://thesis.lib.nccu.edu.tw/record/#G0095753006
    Data Type: thesis
    Appears in Collections: [資訊科學系] 學位論文 (Theses)

    Files in This Item:

    File          Size     Format
    300601.pdf    141Kb    Adobe PDF
    300602.pdf    139Kb    Adobe PDF
    300603.pdf    169Kb    Adobe PDF
    300604.pdf    197Kb    Adobe PDF
    300605.pdf    347Kb    Adobe PDF
    300606.pdf    350Kb    Adobe PDF
    300607.pdf    295Kb    Adobe PDF
    300608.pdf    1815Kb   Adobe PDF
    300609.pdf    533Kb    Adobe PDF
    300610.pdf    411Kb    Adobe PDF
    300611.pdf    3489Kb   Adobe PDF
    300612.pdf    167Kb    Adobe PDF
    300613.pdf    192Kb    Adobe PDF
    300614.pdf    271Kb    Adobe PDF

