Thesis (Selection of subject) (version: 368)
Thesis details
Least Absolute Shrinkage and Selection Operator Method
Thesis title in Czech: Regresní metoda lasso
Thesis title in English: Least Absolute Shrinkage and Selection Operator Method
Academic year of topic announcement: 2016/2017
Thesis type: Bachelor's thesis
Thesis language: English
Department: Institute of Economic Studies (23-IES)
Supervisor: RNDr. Michal Červinka, Ph.D.
Author: hidden - assigned by the advisor
Date of registration: 09.11.2016
Date of assignment: 09.11.2016
Date and time of defence: 14.06.2017 09:00
Venue of defence: Opletalova 26, room no. 105 (O105)
Date of electronic submission: 17.05.2017
Date of proceeded defence: 14.06.2017
Opponents: PhDr. Marek Rusnák, Ph.D.
Guidelines
The lasso (Least Absolute Shrinkage and Selection Operator) [1] is a method used to identify the important variables in models that work with high-dimensional data. The lasso is a penalized regression technique that uses l1-norm (absolute value) penalization: it minimizes the least-squares objective function augmented with an l1-penalty term. This technique performs both regularization and variable selection. We also introduce related penalization techniques, namely the best subset selection method, ridge regression and the elastic net.
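For concreteness, the objective functions of the lasso and of the related penalized estimators mentioned above can be sketched as follows (standard notation is assumed here: response vector y, n x p design matrix X, coefficient vector beta, non-negative tuning parameters; the 1/(2n) scaling is one common convention among several):

% Lasso: least squares plus an l1 penalty (shrinkage and variable selection)
\hat{\beta}^{\mathrm{lasso}} = \arg\min_{\beta \in \mathbb{R}^p} \frac{1}{2n}\lVert y - X\beta \rVert_2^2 + \lambda \lVert \beta \rVert_1

% Ridge regression: squared l2 penalty (shrinkage only, no exact zeros)
\hat{\beta}^{\mathrm{ridge}} = \arg\min_{\beta \in \mathbb{R}^p} \frac{1}{2n}\lVert y - X\beta \rVert_2^2 + \lambda \lVert \beta \rVert_2^2

% Elastic net: a combination of the l1 and squared l2 penalties
\hat{\beta}^{\mathrm{enet}} = \arg\min_{\beta \in \mathbb{R}^p} \frac{1}{2n}\lVert y - X\beta \rVert_2^2 + \lambda_1 \lVert \beta \rVert_1 + \lambda_2 \lVert \beta \rVert_2^2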

The main goal of this bachelor thesis is to illustrate the application of the lasso method to real economic data. We will employ the R software for the numerical experiments and compare the lasso estimator with several other estimators in terms of mean squared error.
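As an illustrative sketch of the planned numerical comparison, the following R fragment uses the glmnet package, which fits the lasso (alpha = 1), ridge regression (alpha = 0) and the elastic net (0 < alpha < 1); the simulated data and the choice of package are assumptions made here for illustration only, not part of the assignment.

# Sketch: compare lasso, ridge and elastic net by cross-validated MSE.
library(glmnet)

set.seed(1)
n <- 100; p <- 50                          # more predictors than truly relevant ones
X <- matrix(rnorm(n * p), n, p)
beta_true <- c(rep(2, 5), rep(0, p - 5))   # only the first 5 variables matter
y <- drop(X %*% beta_true) + rnorm(n)

fits <- list(
  lasso = cv.glmnet(X, y, alpha = 1),
  enet  = cv.glmnet(X, y, alpha = 0.5),
  ridge = cv.glmnet(X, y, alpha = 0)
)

# Minimum cross-validated mean squared error attained by each method
sapply(fits, function(f) min(f$cvm))

# Variables selected by the lasso at the CV-optimal lambda
coef(fits$lasso, s = "lambda.min")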
References
[1] Tibshirani, R. Regression Shrinkage and Selection via the Lasso. Journal of the Royal Statistical Society, Series B, Vol. 58, No. 1, 267-288, 1996.

[2] Jacob, L., Obozinski, G., Vert, J.-P. Group Lasso with Overlap and Graph Lasso. In: Proceedings of the 26th Annual International Conference on Machine Learning (ICML '09), 2009.

[3] Bühlmann, P., van de Geer, S. Statistics for High-Dimensional Data: Methods, Theory and Applications. Springer Berlin Heidelberg, 2011.

[4] Hastie, T., Tibshirani, R., Wainwright, M. Statistical Learning with Sparsity: The Lasso and Generalizations. Chapman & Hall/CRC Monographs on Statistics & Applied Probability, 2015.

[5] Belloni, A., Chernozhukov, V., Hansen, C. High-Dimensional Methods and Inference on Structural and Treatment Effects. Journal of Economic Perspectives, Vol. 28, No. 2, 2014.

[6] Zou, H. The Adaptive Lasso and Its Oracle Properties. Journal of the American Statistical Association, Vol. 101, No. 476, 2006.
Preliminary scope of work
Outline:

1. Introduction

2. Lasso regression method

3. Modifications of lasso and theoretical comparison

4. Data analysis and numerical comparison

5. Conclusion