    Please use this identifier to cite or link to this item: http://nccur.lib.nccu.edu.tw/handle/140.119/111598

    Title: LASSO與廣義LASSO選取變數比較
    A comparative study of lasso and a general version of lasso for variable selection
    Authors: 朱晉楠
    Chu, Chin Nan
    Contributors: 黃子銘
    Huang, Tzee Ming
    Chu, Chin Nan
    Keywords: 變數選取
    Variable selection
    Least absolute shrinkage and selection operator
    Bayesian information criterion
    Date: 2017
    Issue Date: 2017-07-31 12:53:54 (UTC+8)
    Abstract: Variable selection is very important in model construction; forward selection, backward elimination, and stepwise regression are commonly used to choose variables. In 1996, Tibshirani [4] proposed the least absolute shrinkage and selection operator (LASSO), a method that combines coefficient shrinkage with variable selection. This study modifies the constraint of LASSO and also improves the method for searching the shrinkage parameter t, using the Bayesian information criterion to assess model quality. The improved search method identifies more precisely the explanatory variables that influence the response variable, achieving effective variable selection.
    In model construction, variable selection is a very important issue. Typical variable selection tools include forward selection, backward selection, and stepwise selection. In 1996, Tibshirani proposed a method called LASSO (least absolute shrinkage and selection operator), which can be used for variable selection via coefficient shrinkage.
    In this thesis, a general version of LASSO is proposed to improve the variable selection ability of LASSO. The proposed method is obtained by modifying the constraints of LASSO. For both LASSO and the proposed method, the constraint depends on a shrinkage parameter that needs to be specified. In this thesis, the shrinkage parameter is selected using the Bayesian information criterion. When the optimal parameter is found, the proposed method outperforms LASSO in variable selection. However, the search for the optimal parameter can be computationally intensive.
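    The parameter-selection procedure the abstract describes (fit LASSO over a grid of shrinkage values and keep the fit with the lowest BIC) can be sketched as follows. This is an illustrative sketch only, not the thesis's implementation: the synthetic data, the grid, and the use of scikit-learn's `Lasso` are all assumptions, and scikit-learn uses the penalized (Lagrangian) form of LASSO, which corresponds to the constrained form with parameter t up to a reparameterization.

    ```python
    import numpy as np
    from sklearn.linear_model import Lasso

    # Synthetic data (hypothetical example): only the first 3 of 10
    # predictors truly influence the response.
    rng = np.random.default_rng(0)
    n, p = 200, 10
    X = rng.normal(size=(n, p))
    true_beta = np.zeros(p)
    true_beta[:3] = [3.0, -2.0, 1.5]
    y = X @ true_beta + rng.normal(size=n)

    def lasso_bic(alpha):
        """Fit LASSO at one shrinkage level and score the fit with BIC."""
        model = Lasso(alpha=alpha, max_iter=10_000).fit(X, y)
        rss = float(np.sum((y - model.predict(X)) ** 2))
        k = int(np.count_nonzero(model.coef_))  # number of selected variables
        bic = n * np.log(rss / n) + k * np.log(n)
        return bic, model

    # Grid search over the shrinkage parameter, keeping the lowest-BIC fit.
    best_bic, best_model = None, None
    for alpha in np.logspace(-3, 0, 50):
        bic, model = lasso_bic(alpha)
        if best_bic is None or bic < best_bic:
            best_bic, best_model = bic, model

    selected = np.flatnonzero(best_model.coef_)
    print("variables selected by BIC:", selected.tolist())
    ```

    Because BIC charges log(n) per nonzero coefficient, the lowest-BIC fit tends to be sparse, which is why the abstract notes that the grid search, while effective, can be computationally intensive for fine grids.
    
    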
    Reference: Yoav Benjamini and Yosef Hochberg. Controlling the false discovery rate:
    a practical and powerful approach to multiple testing. Journal of the
    Royal Statistical Society, Series B, 57(1), 1995.

    Hirotsugu Akaike. A new look at the statistical model identification.
    IEEE Transactions on Automatic Control, 19(6), 1974.

    M. R. Osborne, B. Presnell, and B. A. Turlach. A note on the least
    absolute shrinkage and selection operator. Unpublished manuscript, 1998.

    Robert Tibshirani. Regression shrinkage and selection via the lasso.
    Journal of the Royal Statistical Society, Series B, 58(1), 1996.

    Gideon E. Schwarz. Estimating the dimension of a model. Annals of
    Statistics, 6(2), 1978.

    Ming Yuan and Yi Lin. Model selection and estimation in regression
    with grouped variables. Journal of the Royal Statistical Society, Series
    B, 68(1), 2006.

    Hui Zou and Trevor Hastie. Regularization and variable selection via the
    elastic net. Journal of the Royal Statistical Society, Series B, 67(2), 2005.
    Description: Master's thesis
    Source URI: http://thesis.lib.nccu.edu.tw/record/#G0104354030
    Data Type: thesis
    Appears in Collections: [Department of Statistics] Theses

    Files in This Item:

    403001.pdf (914 KB, Adobe PDF)

    All items in 政大典藏 are protected by copyright, with all rights reserved.
