    Please use this identifier to cite or link to this item: https://nccur.lib.nccu.edu.tw/handle/140.119/149018


Title: 無母數可加性迴歸模型下的變數選取方法之比較 (A Comparative Study of Variable Selection Methods in Nonparametric Additive Regression Models)
Authors: 楊侑 (Yang, You)
Contributors: 黃子銘 (Huang, Tzee-Ming); 劉惠美 (Liu, Hui-Mei); 楊侑 (Yang, You)
Keywords: Nonparametric additive regression model; B-spline basis; Spline function; Variable selection
    Date: 2023
    Issue Date: 2024-01-02 15:17:32 (UTC+8)
Abstract: This thesis investigates variable selection methods for nonparametric additive regression models. We use spline functions built from a B-spline basis to construct the model and compare three variable selection methods: Group Lasso, Adaptive Group Lasso, and Group Lasso combined with forward selection. The effectiveness of these methods is evaluated with four metrics, and their impact on spline function estimation is examined in detail through graphical summaries.
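For reference, the additive model and Group Lasso criterion described above are usually written as follows (a sketch following the standard formulation in Huang, Horowitz, and Wei (2010); the notation, such as the number of basis functions m_j per variable, is ours rather than the thesis's):

\[
Y_i = \mu + \sum_{j=1}^{p} f_j(X_{ij}) + \varepsilon_i,
\qquad
f_j(x) \approx \sum_{k=1}^{m_j} \beta_{jk} B_{jk}(x),
\]

where the B_{jk} are B-spline basis functions. The Group Lasso estimate minimizes

\[
\frac{1}{2} \sum_{i=1}^{n} \Big( Y_i - \mu - \sum_{j=1}^{p} \sum_{k=1}^{m_j} \beta_{jk} B_{jk}(X_{ij}) \Big)^2
+ \lambda \sum_{j=1}^{p} \lVert \beta_j \rVert_2 ,
\]

with \beta_j = (\beta_{j1}, \dots, \beta_{jm_j})^\top, so each component function f_j is either retained or set to zero as a whole group.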

Simulation results show that Group Lasso achieves lower variable selection accuracy but excels at retaining the non-zero component functions. Adaptive Group Lasso, in comparison, is more accurate than Group Lasso but somewhat worse at retaining non-zero functions. Based on the simulations, we attribute the suboptimal selection performance of Adaptive Group Lasso to its sensitivity to the penalty parameter. Future work could focus on improving the penalty parameter selection algorithm to enhance the performance of Adaptive Group Lasso.
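The penalty-parameter sensitivity mentioned above concerns the weighted penalty of the Adaptive Group Lasso. In the standard formulation (the weight construction below is the usual choice in the literature, not a detail confirmed by the abstract), the penalty term becomes

\[
\lambda \sum_{j=1}^{p} w_j \lVert \beta_j \rVert_2 ,
\qquad
w_j = \lVert \tilde{\beta}_j \rVert_2^{-1},
\]

where \tilde{\beta}_j is an initial Group Lasso estimate and groups with \tilde{\beta}_j = 0 are excluded (w_j = \infty). Because the tuning of \lambda affects both the initial fit and the weighted fit, the final selection can vary considerably with how \lambda is chosen.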

Finally, Group Lasso combined with forward selection performs strongly on all the evaluation metrics, and we therefore recommend it as the preferred variable selection method for nonparametric additive regression models. The simulations also show that spline function estimation is best when the model selects exactly the right variables, underscoring the importance of variable selection for estimating the component functions.
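The abstract does not spell out how Group Lasso and forward selection are combined; a common reading is that Group Lasso first screens the candidate variables and a greedy forward search over the surviving groups then picks the final model by a criterion such as BIC (Schwarz, 1978). The Python sketch below illustrates that second stage under this reading; the function names and the BIC-based stopping rule are our illustrative assumptions, not the thesis's code.

import numpy as np

def bic(y, X):
    # BIC of an ordinary least-squares fit of y on the columns of X
    n = len(y)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return n * np.log(rss / n) + np.log(n) * X.shape[1]

def forward_group_selection(y, blocks, candidates):
    # blocks[j] is the (n x m_j) B-spline design matrix for variable j;
    # candidates are the variables that survived the Group Lasso screening.
    n = len(y)
    intercept = np.ones((n, 1))
    selected, remaining = [], list(candidates)
    best = bic(y, intercept)                    # intercept-only model
    while remaining:
        scores = [(bic(y, np.hstack([intercept] + [blocks[k] for k in selected + [j]])), j)
                  for j in remaining]
        new_best, j_star = min(scores)
        if new_best >= best:                    # no remaining group improves BIC
            break
        best = new_best
        selected.append(j_star)
        remaining.remove(j_star)
    return selected                             # variables in the order added

With a fixed number of basis functions per variable, each step adds one whole block of B-spline columns, mirroring the group structure of the penalized fits.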
Reference: De Boor, C. (1972). On calculating with B-splines. Journal of Approximation Theory, 6(1):50–62.
Efroymson, M. A. (1960). Multiple regression analysis. Mathematical Methods for Digital Computers, 191–203.
Fox, J. (2002). Nonparametric regression. Appendix to: An R and S-PLUS Companion to Applied Regression, 1–7.
Huang, J., Horowitz, J. L., and Wei, F. (2010). Variable selection in nonparametric additive models. The Annals of Statistics, 38(4):2282–2313.
Schoenberg, I. J. (1946). Contributions to the problem of approximation of equidistant data by analytic functions. Part B. On the problem of osculatory interpolation. A second class of analytic approximation formulae. Quarterly of Applied Mathematics, 4(2):112–141.
Schwarz, G. (1978). Estimating the dimension of a model. The Annals of Statistics, 6(2):461–464.
Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society: Series B (Methodological), 58(1):267–288.
Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 68(1):49–67.
Zou, H. (2006). The adaptive lasso and its oracle properties. Journal of the American Statistical Association, 101(476):1418–1429.
Description: 碩士 (Master's thesis), 國立政治大學 (National Chengchi University), 統計學系 (Department of Statistics), 110354021
Source URI: http://thesis.lib.nccu.edu.tw/record/#G0110354021
Data Type: thesis
Appears in Collections: [統計學系] 學位論文 (Department of Statistics, Theses)

    Files in This Item:

File: 402101.pdf | Size: 908 KB | Format: Adobe PDF


    All items in 政大典藏 are protected by copyright, with all rights reserved.

