政大機構典藏-National Chengchi University Institutional Repository(NCCUR):Item 140.119/32694


Please use this persistent URL to cite or link to this item: https://nccur.lib.nccu.edu.tw/handle/140.119/32694


Title: 對於閱讀的感興趣程度與眼動特徵關係之研究
    The Research on the Relationship between Interesting Degree of Reading and Eye Movement Features
Author: 王加元
    Wang, Jia Yuan
Contributors: 陳良弼
    蔡介立

    Chen, Arbee L.P.
    Tsai, Jie Li

    王加元
    Wang, Jia Yuan
Keywords: eye movement
reading
degree of interest
data mining
sequence mining
repeating pattern
classification
Date: 2008
Uploaded: 2009-09-17 14:04:29 (UTC+8)
Abstract: Much research has examined the relationship between eye movements and human cognition, including comprehension and degree of interest; among these topics, eye movements during reading are the most frequently studied. The purpose of this research is to determine whether a relationship exists between readers' eye movements during reading and their degree of interest in the text.

The distinguishing feature of this research is that, instead of analyzing eye-movement data on each area of interest (AOI) as is commonly done, it transforms the eye movements into sequence data and applies data-mining methods to find the segments of the eye-movement sequences whose patterns discriminate between degrees of interest.

Through this analysis of eye-movement traces, the results may in the future serve as an effective form of implicit feedback in information retrieval, improving the effectiveness of existing search engines.
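The pipeline the abstract outlines (encode eye movements as a symbol sequence, mine short repeating patterns, and use their frequencies as features for an interest classifier) can be sketched roughly as follows. This is a minimal illustration under assumed conventions, not the thesis's actual method: the symbol alphabet, the toy sequences, and the fixed pattern length are all hypothetical.

```python
from collections import Counter
from itertools import chain

def ngrams(seq, n):
    """All length-n contiguous patterns of a symbol sequence."""
    return [tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)]

def pattern_features(seq, vocabulary, n=2):
    """Frequency vector of each vocabulary pattern in one eye-movement sequence."""
    counts = Counter(ngrams(seq, n))
    return [counts[p] for p in vocabulary]

# Toy symbol sequences (hypothetical encoding):
# F = forward saccade, L = long fixation, R = regression.
interested     = ["F", "L", "R", "F", "L", "R", "F"]   # regressions and long fixations
not_interested = ["F", "F", "F", "F", "F", "F", "F"]   # steady forward reading

# Build a shared pattern vocabulary from all observed bigrams.
vocab = sorted(set(chain(ngrams(interested, 2), ngrams(not_interested, 2))))

x1 = pattern_features(interested, vocab)      # → [0, 2, 2, 2]
x2 = pattern_features(not_interested, vocab)  # → [6, 0, 0, 0]
# x1 and x2 could now be fed to any standard classifier
# to discriminate interested from uninterested reading.
```

The two feature vectors differ exactly on the regression-related patterns, which is the kind of discriminative segment the mining step is meant to surface.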
References: [1] Dario D. Salvucci and Joseph H. Goldberg, “Identifying Fixations and Saccades in Eye-Tracking Protocols,” Proceedings of the 2000 Symposium on Eye Tracking Research & Applications.
    [2] 蔡介立(Jie-Li Tsai), 顏妙璇(Miao-Hsuan Yen), and 汪勁安(Chin-An Wang), “眼球移動測量及在中文閱讀研究之應用,” 應用心理研究, 28期, 91-104, 2005.
[3] Jarkko Salojarvi, Kai Puolamaki, Jaana Simola, Lauri Kovanen, Ilpo Kojo, and Samuel Kaski, “Inferring Relevance from Eye Movements: Feature Extraction,” Helsinki University of Technology, Publications in Computer and Information Science.
[4] Rayner, K., Chace, K. H., Slattery, T. J., and Ashby, J., “Eye Movements as Reflections of Comprehension Processes in Reading,” Scientific Studies of Reading, 10(3): 241-255.
    [5] Aulikki Hyrskykari, Paivi Majaranta, Antti Aaltonen, and Kari-Jouko Raiha, “Design Issues of iDict: A Gaze-Assisted Translation Aid,” Proceedings of the 2000 symposium on Eye tracking research & applications.
    [6] Bing Pan, Helene A. Hembrooke, Geri K. Gay, Laura A. Granka, Matthew K. Feusner, and Jill K. Newman, “The Determinants of Web Page Viewing Behavior: An Eye-Tracking Study,” Proceedings of the 2004 symposium on Eye tracking research & applications.
[7] Julia M. West, Anne R. Haake, Evelyn P. Rozanski, and Keith S. Karn, “eyePatterns: Software for Identifying Patterns and Similarities Across Fixation Sequences,” Proceedings of the 2006 symposium on Eye tracking research & applications.
    [8] Hidetake Uwano, Masahide Nakamura, Akito Monden and Ken-ichi Matsumoto, “Analyzing Individual Performance of Source Code Review Using Reviewer’s Eye Movement,” Proceedings of the 2006 symposium on Eye tracking research & applications.
    [9] Georg Buscher, “Attention-Based Information Retrieval,” ACM SIGIR Conference on Research and Development of Information Retrieval, 2007.
    [10] David Hardoon, John Shawe-Taylor, Antti Ajanki, Kai Puolamäki, and Samuel Kaski, “Information Retrieval by Inferring Implicit Queries from eye Movements,” Artificial Intelligence and Statistics, 2007.
[11] Academia Sinica Balanced Corpus of Modern Chinese (中央研究院漢語平衡語料庫), http://www.aclclp.org.tw/use_asbc_c.php
[12] Shyamala Doraisamy and Stefan Rüger, “Robust Polyphonic Music Retrieval with N-grams,” Journal of Intelligent Information Systems, 21:1, 53-70, 2003.
    [13] Jia-Lien Hsu, Arbee L.P. Chen, Hung-Chen Chen, “Finding Approximate Repeating Patterns from Sequence Data,” Proc. International Symposium on Music Information Retrieval, 2004.
Description: Master's thesis
國立政治大學 (National Chengchi University)
    資訊科學學系
    95753009
    97
Source: http://thesis.lib.nccu.edu.tw/record/#G0095753009
Data Type: thesis
Appears in Collections: [資訊科學系] 學位論文

Files in This Item:

File        Size   Format     Views
300901.pdf  98Kb   Adobe PDF  2919
300902.pdf  124Kb  Adobe PDF  2930
300903.pdf  143Kb  Adobe PDF  2922
300904.pdf  145Kb  Adobe PDF  2934
300905.pdf  144Kb  Adobe PDF  2947
300906.pdf  174Kb  Adobe PDF  21308
300907.pdf  588Kb  Adobe PDF  21320
300908.pdf  209Kb  Adobe PDF  2939
300909.pdf  128Kb  Adobe PDF  2849
300910.pdf  117Kb  Adobe PDF  2900


All items in NCCUR are protected by copyright, with all rights reserved.



Copyright Announcement
1. The digital content of this website is part of the National Chengchi University Institutional Repository. It provides free access for academic research and public education on a non-commercial basis. Please use the content in a proper and reasonable manner and respect the rights of copyright owners. For commercial use, please obtain authorization from the copyright owner in advance.

2. NCCU Institutional Repository is made to protect the interests of copyright owners. If you believe that any material on the website infringes copyright, please contact our staff (nccur@nccu.edu.tw). We will remove the work from the repository and investigate your claim.
DSpace Software Copyright © 2002-2004 MIT & Hewlett-Packard / Enhanced by NTU Library IR team