Please use this persistent URL to cite or link to this item: https://nccur.lib.nccu.edu.tw/handle/140.119/32732


Title: 智慧型仿鏡互動顯示裝置 (An Intelligent Mirror-Like Interactive Display Device)
    Magic Mirror: A Research on Smart Display Devices
Author: Yeh, Chih-Wei (葉致偉)
Contributors: Liao, Wen-Hung (廖文宏)
Yeh, Chih-Wei (葉致偉)
Keywords: Human Computer Interface
Smart Display Device
Smart Furniture
Gesture User Interface
Date: 2005
Upload time: 2009-09-17 14:09:34 (UTC+8)
Abstract: Gesture-based user interfaces have long been associated with the image of future technology. However, due to the lack of proper environments and recognition technologies, practical applications of intelligent user interfaces are still rare in modern life. In this research, we propose an interactive mirror-like display device that can be controlled by gesture commands, along with several recognition techniques for it, with the goal of an ergonomic interactive display suited to the smart-home environment. Practical applications are developed on this smart mirror, and a user test is conducted to evaluate this novel user interface.
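The thesis itself details the actual recognition pipeline; purely as an illustrative sketch of one building block such a camera-based mirror interface might rely on (locating skin-colored regions such as a hand or face, so that their motion can be mapped to commands), here is a toy pure-Python example. The BT.601 color conversion is standard; the fixed Cb/Cr skin thresholds and the tiny synthetic "frame" are illustrative assumptions, not the authors' method.

```python
# Toy illustration (not the thesis's actual method): classify skin-colored
# pixels in YCbCr space and return their centroid as a crude hand/face cue.

def rgb_to_ycbcr(r, g, b):
    """ITU-R BT.601 full-range RGB -> YCbCr conversion."""
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def is_skin(r, g, b):
    """Coarse skin test: a fixed Cb/Cr box, a common baseline heuristic."""
    _, cb, cr = rgb_to_ycbcr(r, g, b)
    return 77 <= cb <= 127 and 133 <= cr <= 173

def skin_centroid(frame):
    """Return the (row, col) centroid of skin pixels, or None if none.

    `frame` is a 2-D list of (r, g, b) tuples standing in for a camera image.
    """
    row_sum = col_sum = count = 0
    for i, row in enumerate(frame):
        for j, (r, g, b) in enumerate(row):
            if is_skin(r, g, b):
                row_sum += i
                col_sum += j
                count += 1
    if count == 0:
        return None
    return row_sum / count, col_sum / count

# A 3x3 "frame": one skin-toned pixel at (1, 1) amid a blue background.
frame = [[(0, 0, 255)] * 3 for _ in range(3)]
frame[1][1] = (200, 140, 120)   # a typical skin tone
print(skin_centroid(frame))     # -> (1.0, 1.0)
```

A real system would of course run this kind of test on live camera frames and feed the tracked position into a gesture classifier; the sketch only shows the per-frame localization step.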
Description: Master's thesis
National Chengchi University
Department of Computer Science
93753019
94
Source: http://thesis.lib.nccu.edu.tw/record/#G0937530191
Data type: thesis
Appears in Collections: [Department of Computer Science] Theses

Files in this item:

File        Size    Format     Views
019101.pdf  71 KB   Adobe PDF  2753
019102.pdf  71 KB   Adobe PDF  2829
019103.pdf  70 KB   Adobe PDF  2751
019104.pdf  82 KB   Adobe PDF  2719
019105.pdf  252 KB  Adobe PDF  2815
019106.pdf  424 KB  Adobe PDF  21305
019107.pdf  204 KB  Adobe PDF  2897
019108.pdf  212 KB  Adobe PDF  2748
019109.pdf  210 KB  Adobe PDF  2664
019110.pdf  79 KB   Adobe PDF  2667
019111.pdf  73 KB   Adobe PDF  2755
019112.pdf  107 KB  Adobe PDF  2782


All items in the NCCU Institutional Repository are protected by the original copyright.



Copyright Announcement
1. The digital content of this website is part of the National Chengchi University Institutional Repository. It provides free access for academic research and public education on a non-commercial basis. Please use it in a proper and reasonable manner and respect the rights of copyright owners. For commercial use, please obtain authorization from the copyright owner in advance.
2. The NCCU Institutional Repository strives to protect the interests of copyright owners. If you believe that any material on this website infringes copyright, please contact our staff (nccur@nccu.edu.tw). We will remove the work from the repository and investigate your claim.
DSpace Software Copyright © 2002-2004 MIT & Hewlett-Packard / Enhanced by NTU Library IR team