政大機構典藏-National Chengchi University Institutional Repository(NCCUR):Item 140.119/84750
    Please use this identifier to cite or link to this item: http://nccur.lib.nccu.edu.tw/handle/140.119/84750

    Title: 結合多點觸控與3D手勢之手機創新介面研究
    Other Titles: Designing a Novel Interface for Mobile Devices with Multi-touch and 3D Gestures
    Authors: 余能豪
    Contributors: Department of Computer Science (資訊科學系)
    Keywords: HCI; natural user interface; gestural user interface; multi-touch; mobile platform; image recognition; gesture recognition; gesture vocabulary; bimanual input
    Date: 2012
    Issue Date: 2016-04-15 11:33:03 (UTC+8)
    Abstract: Today's smartphones generally adopt intuitive multi-touch user interfaces, using capacitive sensing to detect finger movements on a 2D surface as input commands. Interface designers therefore continually search for natural gesture vocabularies that make operation more convenient. However, gestures confined to a 2D surface rule out many natural manipulations, such as grasping an object and moving it through space. As technology advances rapidly, smartphone processors are becoming capable of running 3D gesture recognition on the front and rear cameras. Several studies, domestic and international, have addressed finger tracking and simple gesture recognition, but they assume touchless operation; research on 3D gesture vocabularies combined with multi-touch screens on handheld devices is still lacking. This project will study how smartphone and tablet users employ gestures and explore possible applications of combined 2D and 3D gesture vocabularies on handheld devices, including: (1) extending the contextual menu common in GUIs to design a 3D-gesture dynamic menu (Lift-menu) that assists flat-surface touch operation, together with design guidelines for its visual interface; (2) designing a new drag-and-drop interaction around a grasp-and-drop metaphor, combined with a bimanual 2D + 3D gesture mode. The results can be applied to smartphones and tablets to provide a more natural and convenient mobile user interface.
    Multi-touch technology has been widely adopted in mobile devices, and researchers have therefore investigated more natural gesture vocabularies to provide simple and intuitive user interfaces. However, touch surfaces can only detect 2D gestures, which rules out natural gestures such as grasping an object and moving it through the air. At the same time, today's smartphones have faster CPUs/GPUs and better front and rear built-in cameras, and some researchers have used these features to design touchless interaction based on finger tracking or gesture recognition. To the best of our knowledge, however, no existing research explores combined 2D + 3D gestures on a mobile platform, and filling this gap is another driver for the work presented here. In this project, we plan to investigate the everyday gestures performed on smartphones and tablets and to explore natural interactions that combine 2D and 3D gestures. The research topics include: (1) redesigning the contextual menu as a "Lift-menu", which the user triggers simply by lifting a finger off the screen, together with UI design guidelines for this interaction; (2) redesigning drag-and-drop around a new metaphoric gesture, "Grasp-and-drop", which supports bimanual interaction and overcomes the problems of the original design. We will refine these new gestures into a novel mobile user interface.
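    The Lift-menu idea above (a press on the touch surface followed by lifting the finger, detected by the camera, opens a menu) can be sketched as a small state machine. This is an illustrative sketch only, not the project's implementation: the event types, the camera-based fingertip-height input, and all thresholds (hold time, lift height, time window) are assumptions made for the example.

    ```python
    # Hypothetical sketch of a Lift-menu trigger: a long press, then an
    # upward finger lift within a short window, opens the contextual menu.
    # Thresholds and event names are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class TouchEvent:
        kind: str      # "down" or "up"
        t: float       # timestamp in seconds

    @dataclass
    class DepthSample:
        height_mm: float  # estimated fingertip height above the screen
        t: float

    class LiftMenuDetector:
        """Fires when a press held >= hold_s is followed, within window_s
        of finger-up, by the fingertip rising past lift_mm."""

        def __init__(self, hold_s=0.4, window_s=0.5, lift_mm=20.0):
            self.hold_s = hold_s
            self.window_s = window_s
            self.lift_mm = lift_mm
            self._down_t = None   # time of the current press, if any
            self._up_t = None     # time of a qualifying finger-up ("armed")

        def on_touch(self, ev: TouchEvent) -> None:
            if ev.kind == "down":
                self._down_t = ev.t
                self._up_t = None
            elif ev.kind == "up" and self._down_t is not None:
                if ev.t - self._down_t >= self.hold_s:
                    self._up_t = ev.t   # armed: now wait for the upward lift
                self._down_t = None

        def on_depth(self, s: DepthSample) -> bool:
            """Returns True exactly when the Lift-menu should open."""
            if self._up_t is None:
                return False
            if s.t - self._up_t > self.window_s:
                self._up_t = None       # lift came too late; disarm
                return False
            if s.height_mm >= self.lift_mm:
                self._up_t = None
                return True
            return False

    # Example: long press (0.5 s), then the fingertip rises past 20 mm.
    d = LiftMenuDetector()
    d.on_touch(TouchEvent("down", 0.0))
    d.on_touch(TouchEvent("up", 0.5))          # held long enough; armed
    print(d.on_depth(DepthSample(5.0, 0.6)))   # still low -> False
    print(d.on_depth(DepthSample(25.0, 0.8)))  # lifted past 20 mm -> True
    ```

    Keeping the touch stream and the camera-derived depth stream as separate inputs mirrors the 2D + 3D split in the abstract: the capacitive screen supplies the press, the camera supplies the lift.
    
    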
    Relation: Project No. NSC101-2221-E004-013
    Data Type: report
    Appears in Collections: [Department of Computer Science] NSC Projects

    Files in This Item:

    File: 101-2221-E004-013.pdf (2,212 KB, Adobe PDF)

    All items in 政大典藏 are protected by copyright, with all rights reserved.

