    Please use this identifier to cite or link to this item: https://nccur.lib.nccu.edu.tw/handle/140.119/51318


    Title: 以智慧型3D動畫角色為介面之互動數位電視系統
    Incorporating intelligent 3D character into the interface for interactive digital TV system
    Authors: 陳映似
    Chen, Ying Szu
    Contributors: 李蔡彥
    Li, Tsai Yen
    陳映似
    Chen, Ying Szu
    Keywords: Interactive TV
    Artificial Intelligence
    Human-Computer Interface
    Emotional Animation
    Date: 2010
    Issue Date: 2011-10-05 14:43:50 (UTC+8)
    Abstract: In recent years, intelligent interactive TV has been a focus of much research on digital living spaces. We believe that a good interactive digital TV system must have a lively, flexible user interface for interacting with users. In this study, we propose an interactive digital TV system that uses an intelligent 3D animated character as its interface, in the hope that presenting the character on the interface will enhance the user experience of the system. On SITV, the interactive digital TV system we developed previously, many different interaction scenarios can be taken into account in the design of the intelligent 3D animated character. We propose giving the character mobility and expressiveness in its body motions, so that, depending on the scenario and its own state, the character can choose appropriate motions to move about the screen and display appropriate emotions, making the services of the interactive digital TV system friendlier. We developed the animation system in Java and designed experiments to verify the effect of different interfaces on users; the results show that users found the intelligent 3D animated character interface the friendliest.
    In recent years, intelligent interactive digital TV has become one of the most important applications in research on digital living spaces. We believe a good interactive TV system must have a vivid user interface to interact with users. In this research, we propose incorporating an intelligent 3D character into the interface design of an interactive digital TV system to enhance the user experience. In the smart interactive digital TV system we developed previously, called SITV, many interactive scenarios can be considered in the design of the intelligent 3D character. We propose to develop our intelligent 3D character with the concepts of mobility and expressiveness in body motion, such that appropriate emotions can be presented through motions depending on the scenario and the character's configuration. For example, the intelligent 3D character can act like a housekeeper living in the TV monitor, taking different actions in different scenarios to make the service friendlier. We have developed our animation system in Java and designed experiments to evaluate different types of user interface design across different scenarios. The experimental results show that an interface with an intelligent 3D character is friendlier than the others.
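    The abstract describes the core mechanism only in prose: the animated character selects a motion to move about the screen ("mobility") and an emotional colouring for that motion ("expressiveness") from the current scenario and its own state. The thesis code is not part of this record, so the following is only a minimal sketch in Java (the language named in the abstract) of that selection idea; all names here (Scenario, CharacterState, Motion, chooseMotion) are hypothetical and not taken from the SITV system.

```java
// Minimal, hypothetical sketch of the idea described in the abstract: an on-screen
// character picks where to move and which emotional motion to play, based on the
// current TV scenario and its own state. Not the SITV implementation.
import java.util.EnumMap;
import java.util.Map;

public class CharacterAgentSketch {

    enum Scenario { INCOMING_CALL, PROGRAM_RECOMMENDATION, SYSTEM_ERROR }
    enum Emotion  { HAPPY, NEUTRAL, APOLOGETIC }

    /** The character's own state, e.g. its current screen position and mood. */
    static class CharacterState {
        double x, y;                     // normalized screen position
        Emotion mood = Emotion.NEUTRAL;  // fallback when no scenario rule applies
    }

    /** A chosen behaviour: a target position and an emotionally coloured motion clip. */
    record Motion(double targetX, double targetY, Emotion expression, String clipName) {}

    // Scenario -> emotional colouring ("expressiveness").
    static final Map<Scenario, Emotion> SCENARIO_EMOTION = new EnumMap<>(Map.of(
            Scenario.INCOMING_CALL,          Emotion.HAPPY,
            Scenario.PROGRAM_RECOMMENDATION, Emotion.NEUTRAL,
            Scenario.SYSTEM_ERROR,           Emotion.APOLOGETIC));

    /** Choose a motion from the scenario and the character's state ("mobility"). */
    static Motion chooseMotion(Scenario scenario, CharacterState state) {
        Emotion expression = SCENARIO_EMOTION.getOrDefault(scenario, state.mood);
        // For illustration, walk to a fixed spot per scenario; a real system would
        // pick a position near the relevant UI element instead.
        double targetX = (scenario == Scenario.SYSTEM_ERROR) ? 0.5 : 0.9;
        double targetY = 0.1;
        return new Motion(targetX, targetY, expression,
                          "walk_" + expression.name().toLowerCase());
    }

    public static void main(String[] args) {
        Motion m = chooseMotion(Scenario.INCOMING_CALL, new CharacterState());
        System.out.println("Play " + m.clipName() + " with expression " + m.expression());
    }
}
```

    An actual system would drive the chosen clip through the animation engine and blend motions; the sketch only illustrates the scenario-plus-state selection step the abstract refers to.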
    Description: Master's thesis
    National Chengchi University
    Department of Computer Science
    97753010
    99
    Source URI: http://thesis.lib.nccu.edu.tw/record/#G0097753010
    Data Type: thesis
    Appears in Collections: [Department of Computer Science] Theses

    Files in This Item:

    File          Size      Format
    301001.pdf    1560 KB   Adobe PDF


    All items in 政大典藏 are protected by copyright, with all rights reserved.


