Course, academic year 2023/2024
Introduction to Machine Learning with R - NPFL054
Title: Úvod do strojového učení v systému R (Introduction to Machine Learning with R)
Guaranteed by: Institute of Formal and Applied Linguistics (32-UFAL)
Faculty: Faculty of Mathematics and Physics
Actual: from 2023
Semester: summer
E-Credits: 5
Hours per week, examination: summer s.:2/2, C+Ex [HT]
Capacity: unlimited
Min. number of students: unlimited
4EU+: no
Virtual mobility / capacity: no
State of the course: not taught
Language: Czech, English
Teaching methods: full-time
Guarantor: doc. Mgr. Barbora Vidová Hladká, Ph.D.
RNDr. Martin Holub, Ph.D.
Class: DS, mathematical linguistics
Computer Science (Bc.)
Computer Science (Mgr.) - Mathematical Linguistics
Classification: Informatics > Informatics, Software Applications, Computer Graphics and Geometry, Database Systems, Didactics of Informatics, Discrete Mathematics, External Subjects, General Subjects, Computer and Formal Linguistics, Optimization, Programming, Software Engineering, Theoretical Computer Science
Incompatibility: NPFL129
Interchangeability: NPFL129
Is incompatible with: NPGR035, NPFL129
Is interchangeable with: NPFL129
Annotation -
Last update: doc. Mgr. Barbora Vidová Hladká, Ph.D. (15.05.2020)
Lectures cover both theoretical background and practical algorithms of Machine Learning (ML). The emphasis is placed on a comprehensive understanding of the ML process, which includes data analysis, choice of ML algorithm, tuning of learning parameters, statistical evaluation, and model assessment. Lab sessions aim at practical experience with ML tasks using existing R libraries. Homework assignments are practical exercises using R. The last assignment is the most extensive and includes comprehensive processing of a typical, not very demanding problem, and writing a report on solution variants and their evaluation.
Aim of the course -
Last update: doc. Mgr. Barbora Vidová Hladká, Ph.D. (15.05.2020)

The aim of the course is to present the Machine Learning process from both theoretical and practical points of view. Students get familiar with the theoretical foundations of selected algorithms and learn to practically solve Machine Learning problems using libraries of the statistical system R. Students must be able to comprehensively solve an example machine learning problem and to analyze and describe solution variants and their evaluation.

Course completion requirements -
Last update: doc. Mgr. Barbora Vidová Hladká, Ph.D. (29.04.2021)

During the term, students have to 1) present easy homework assignments, 2) submit two homework assignments so that their total score exceeds the required limit, and 3) pass two written tests so that their total score exceeds the required limit.

Obtaining the course credit is a prerequisite for taking the exam.

More details about homework assignments and tests are available on the course web site.

Literature -
Last update: doc. Mgr. Barbora Vidová Hladká, Ph.D. (15.05.2020)

James, Gareth, Daniela Witten, Trevor Hastie, and Robert Tibshirani: An Introduction to Statistical Learning. Springer, 2013.

Lantz, Brett: Machine Learning with R. Packt Publishing, 2013.

Requirements for the exam -
Last update: doc. Mgr. Barbora Vidová Hladká, Ph.D. (15.05.2020)

The exam is oral. However, the results of written tests and homework assignments are taken into account. Obtaining the course credit is a prerequisite for taking the exam.

The examination requirements correspond to the course syllabus. More details are available on the course web site.

Syllabus -
Last update: doc. Mgr. Barbora Vidová Hladká, Ph.D. (15.05.2020)

Machine learning - basic concepts, examples of practical applications, theoretical foundations. Supervised and unsupervised learning. Classification and regression tasks. Classification into two or more classes. Training and test examples. Feature vectors. Target variable and prediction function. The machine learning development process. Curse of dimensionality. Clustering.

Decision tree learning. Learning algorithm, splitting criteria and pruning. Random forests.
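To make the splitting criterion concrete: decision-tree learners typically choose the split that maximizes information gain (entropy reduction). The course itself works in R; the following minimal Python sketch (function names are illustrative, not taken from any course material) shows the computation:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label multiset, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(parent, left, right):
    """Entropy reduction achieved by splitting `parent` into `left` and `right`."""
    n = len(parent)
    after = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(parent) - after

# A maximally impure node (entropy = 1 bit) split perfectly by some feature:
parent = ["yes"] * 5 + ["no"] * 5
print(information_gain(parent, ["yes"] * 5, ["no"] * 5))  # → 1.0
```

A tree learner evaluates this gain for every candidate split and recurses on the best one; pruning later removes splits whose gain does not generalize.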

Linear and logistic regression. Least squares methods. Discriminative classifiers.
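For a single feature, the least-squares fit has a simple closed form. A minimal Python sketch, purely for illustration (in the course's R setting one would use e.g. `lm()`):

```python
def ols_fit(xs, ys):
    """Closed-form least-squares fit of y = a + b*x for one feature."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Points lying exactly on the line y = 1 + 2x:
a, b = ols_fit([0, 1, 2, 3], [1, 3, 5, 7])
print(a, b)  # → 1.0 2.0
```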

Instance-based learning. The k-NN algorithm.
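The k-NN classifier above stores the training set and predicts by majority vote among the k nearest neighbours. A minimal Python sketch (the course uses R; names like `knn_predict` are illustrative):

```python
import math
from collections import Counter

def knn_predict(train_X, train_y, query, k=3):
    """Predict the label of `query` by majority vote among the k nearest
    training points under Euclidean distance."""
    dists = sorted((math.dist(x, query), y) for x, y in zip(train_X, train_y))
    votes = Counter(y for _, y in dists[:k])
    return votes.most_common(1)[0][0]

# Toy data: two well-separated clusters in 2D.
X = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6), (6, 5)]
y = ["a", "a", "a", "b", "b", "b"]
print(knn_predict(X, y, (0.5, 0.5)))  # → a
print(knn_predict(X, y, (5.5, 5.5)))  # → b
```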

Naive Bayes classifier. Bayesian belief networks.
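The naive Bayes classifier combines class priors with per-feature conditional probabilities under a conditional-independence assumption. A minimal Python sketch with categorical features and add-one (Laplace) smoothing; the toy data and all names are illustrative, not from the course:

```python
import math
from collections import Counter, defaultdict

def nb_train(X, y):
    """Count class frequencies and per-(feature, class) value frequencies."""
    class_counts = Counter(y)
    feat_counts = defaultdict(Counter)  # (feature index, class) -> value counts
    for xs, c in zip(X, y):
        for i, v in enumerate(xs):
            feat_counts[(i, c)][v] += 1
    return class_counts, feat_counts

def nb_predict(model, xs, n_values):
    """Pick the class maximizing log P(c) + sum_i log P(x_i | c), using
    add-one smoothing; n_values[i] = number of possible values of feature i."""
    class_counts, feat_counts = model
    total = sum(class_counts.values())
    best, best_lp = None, -math.inf
    for c, cc in class_counts.items():
        lp = math.log(cc / total)
        for i, v in enumerate(xs):
            lp += math.log((feat_counts[(i, c)][v] + 1) / (cc + n_values[i]))
        if lp > best_lp:
            best, best_lp = c, lp
    return best

# Toy data: (outlook, temperature) -> play?
X = [("sunny", "hot"), ("sunny", "mild"), ("rain", "mild"), ("rain", "cool")]
y = ["no", "no", "yes", "yes"]
model = nb_train(X, y)
print(nb_predict(model, ("rain", "mild"), n_values=[2, 3]))  # → yes
```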

Support Vector Machines. Large-margin and soft-margin classifiers. Kernel functions.

Ensemble methods. Unstable learning algorithms. Bagging and boosting. AdaBoost algorithm.

Parameters in machine learning. Hyperparameter tuning. Searching the parameter space. Gradient descent algorithm. Maximum likelihood estimation.
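The gradient descent algorithm listed above is just the repeated update x ← x − η·∇f(x). A minimal one-dimensional Python sketch (illustrative names, not course code):

```python
def gradient_descent(grad, x0, lr=0.1, steps=200):
    """Generic gradient descent: repeatedly step against the gradient."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2*(x - 3); the minimum is x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_min, 4))  # → 3.0
```

In practice the learning rate η is itself a hyperparameter: too large and the iteration diverges, too small and convergence is slow.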

Experiment evaluation. Working with development and test data. Sample error, generalization error. Cross-validation, leave-one-out method. Bootstrap method. Performance measures. Evaluation of binary classifiers. ROC curve.
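K-fold cross-validation partitions the data so that every example is held out exactly once. A minimal Python sketch that only generates the index partition (model fitting omitted; names are illustrative):

```python
def kfold_indices(n, k):
    """Partition indices 0..n-1 into k contiguous folds; each index
    appears in exactly one test fold."""
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        test = list(range(start, start + size))
        train = [i for i in range(n) if i < start or i >= start + size]
        folds.append((train, test))
        start += size
    return folds

for train, test in kfold_indices(10, 5):
    print(test)  # each fold holds out 2 of the 10 examples
```

Setting k = n gives the leave-one-out method mentioned above; the bootstrap instead resamples the n examples with replacement.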

Statistical tests. Statistical hypotheses, one-sample and two-sample t-tests, chi-square tests. Significance level, p-value. Using statistical tests for classifier evaluation. Confidence intervals.
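One common use of a two-sample t-test in classifier evaluation is to compare per-fold accuracies of two models. A minimal Python sketch of Welch's t statistic (the fold accuracies below are made-up numbers; in practice one would also obtain the p-value, e.g. with R's `t.test`):

```python
import math
import statistics as st

def welch_t(a, b):
    """Welch's two-sample t statistic (does not assume equal variances)."""
    va, vb = st.variance(a), st.variance(b)
    return (st.mean(a) - st.mean(b)) / math.sqrt(va / len(a) + vb / len(b))

# Hypothetical per-fold accuracies of two classifiers:
acc_model1 = [0.80, 0.82, 0.78]
acc_model2 = [0.70, 0.72, 0.68]
print(round(welch_t(acc_model1, acc_model2), 2))  # → 6.12
```

When the two accuracy lists come from the same folds, a paired t-test on the per-fold differences is usually the more appropriate choice.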

Overfitting. How to recognize and avoid. Regularization. Bias-variance decomposition.

General principles of feature selection. Feature selection using information gain, greedy algorithms.

Dimensionality reduction, Principal Component Analysis.

Foundations of Neural Networks. Single Perceptron, Single Layer Perceptron. The architecture of multi-layer feed-forward models and the idea of back-propagation training. Remarks on deep learning.
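The single perceptron mentioned above can be trained with the classic error-driven update rule. A minimal Python sketch learning the (linearly separable) AND function; all names are illustrative, not from the course:

```python
def train_perceptron(data, lr=0.1, epochs=20):
    """Train a single threshold unit with the perceptron update rule:
    on each error, nudge the weights toward the correct output."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in data:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - pred
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(AND)
predict = lambda x: 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
print([predict(x) for x, _ in AND])  # → [0, 0, 0, 1]
```

A single unit can only separate classes with a hyperplane (famously, it cannot learn XOR); multi-layer feed-forward networks trained by back-propagation remove that limitation.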
