Course, academic year 2024/2025
Neural nets in particle physics - NJSF138
Title: Neural Networks in Particle Physics (Neuronové sítě v částicové fyzice)
Guaranteed by: Institute of Particle and Nuclear Physics (32-UCJF)
Faculty: Faculty of Mathematics and Physics
Actual: from 2015
Semester: winter
E-Credits: 4
Hours per week, examination: winter s.:2/1, Ex [HT]
Capacity: unlimited
Min. number of students: unlimited
4EU+: no
Virtual mobility / capacity: no
State of the course: taught
Language: Czech, English
Teaching methods: full-time
Guarantor: Mgr. Tomáš Sýkora, Ph.D.
Teacher(s): Mgr. Tomáš Sýkora, Ph.D.
Annotation -
The use of neural networks in particle physics. Intended for the 1st year of Master's study and higher.
Last update: Krtička Milan, prof. Mgr., Ph.D. (29.04.2019)
Course completion requirements -

Oral exam, which includes a presentation of a computer solution (prepared by the student) of a pre-selected problem.

Last update: Sýkora Tomáš, Mgr., Ph.D. (12.06.2019)
Literature -

Christopher M. Bishop, Neural Networks for Pattern Recognition, ISBN-13: 978-0198538646

Christopher M. Bishop, Pattern Recognition and Machine Learning, ISBN-13: 978-0387310732

Last update: T_UCJF (19.03.2015)
Requirements to the exam -

For the exam, the student prepares a computer program on a topic agreed with the lecturer: either the student proposes a topic he/she is already working on, or chooses one of the topics suggested by the lecturer.

In addition to the discussion of the program and its solution, the student receives three questions within the scope of the lecture syllabus.

Last update: Sýkora Tomáš, Mgr., Ph.D. (12.06.2019)
Syllabus -
  • elements of probability theory - probability density, covariance, Bayesian definition of probability, the Gaussian distribution, data approximation
  • probability distributions - the beta distribution, the Dirichlet distribution, maximum likelihood of the Gaussian distribution, the Student t-distribution (see the first sketch after this list)
  • linear models for regression - linear basis function models, maximum likelihood and least squares, sequential learning, regularized least squares, bias-variance decomposition, Bayesian linear regression, limitations of linear models with a fixed basis (see the second sketch after this list)
  • linear classification models - discriminant functions, probabilistic discriminative models, the Laplace approximation, Bayesian logistic regression
  • neural nets - training, error backpropagation, the Hessian matrix, regularization in neural networks, Bayesian neural networks (see the third sketch after this list)
  • dual use of neural networks - approximations and decisions
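
A minimal Python sketch of maximum likelihood for the Gaussian distribution (second syllabus item). It is an illustration only, not official course material; it assumes NumPy, and the synthetic data and names are made up.

import numpy as np

def gaussian_mle(samples):
    # ML estimates for a 1-D Gaussian: mu = mean(x), sigma^2 = mean((x - mu)^2).
    mu = samples.mean()
    sigma2 = ((samples - mu) ** 2).mean()  # the biased (1/N) ML variance estimator
    return mu, sigma2

rng = np.random.default_rng(42)
x = rng.normal(loc=2.0, scale=0.5, size=1000)  # synthetic sample, true mu=2.0, sigma=0.5
mu_hat, sigma2_hat = gaussian_mle(x)
print(f"mu_hat = {mu_hat:.3f}, sigma_hat = {np.sqrt(sigma2_hat):.3f}")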
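
A minimal Python sketch of regularized least squares with a fixed polynomial basis (third syllabus item), fitted to noisy samples of sin(2*pi*x) as in Bishop's curve-fitting example. The basis degree and the regularization constant lam are arbitrary illustrative choices.

import numpy as np

def polynomial_design_matrix(x, degree):
    # Design matrix Phi with columns x**0, x**1, ..., x**degree.
    return np.vander(x, N=degree + 1, increasing=True)

def fit_regularized_least_squares(x, t, degree=5, lam=1e-3):
    # Solve the regularized normal equations (lam*I + Phi^T Phi) w = Phi^T t.
    Phi = polynomial_design_matrix(x, degree)
    A = lam * np.eye(Phi.shape[1]) + Phi.T @ Phi
    return np.linalg.solve(A, Phi.T @ t)

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=20)
t = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=x.size)
w = fit_regularized_least_squares(x, t)
print("fitted weights:", w)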
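
A minimal Python sketch of training a one-hidden-layer network by error backpropagation (fifth syllabus item) on a toy two-class problem, a stand-in for a signal/background separation task. The 2-3-1 architecture, learning rate and data are arbitrary illustrative choices; only NumPy is assumed.

import numpy as np

rng = np.random.default_rng(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# Toy data: two Gaussian blobs labelled 0 ("background") and 1 ("signal").
X = np.vstack([rng.normal(-1.0, 0.5, size=(100, 2)),
               rng.normal(+1.0, 0.5, size=(100, 2))])
y = np.concatenate([np.zeros(100), np.ones(100)])

# Parameters of a 2-3-1 network with tanh hidden units and a sigmoid output.
W1 = rng.normal(0.0, 0.5, size=(2, 3)); b1 = np.zeros(3)
W2 = rng.normal(0.0, 0.5, size=(3, 1)); b2 = np.zeros(1)
lr = 0.1

for epoch in range(500):
    h = np.tanh(X @ W1 + b1)           # forward pass: hidden activations
    p = sigmoid(h @ W2 + b2).ravel()   # forward pass: output probabilities
    delta2 = (p - y)[:, None] / len(y)         # output error of the cross-entropy loss
    grad_W2 = h.T @ delta2; grad_b2 = delta2.sum(axis=0)
    delta1 = (delta2 @ W2.T) * (1.0 - h ** 2)  # backpropagate through the tanh layer
    grad_W1 = X.T @ delta1; grad_b1 = delta1.sum(axis=0)
    W2 -= lr * grad_W2; b2 -= lr * grad_b2     # gradient-descent update
    W1 -= lr * grad_W1; b1 -= lr * grad_b1

print("training accuracy:", ((p > 0.5) == y).mean())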

Last update: T_UCJF (18.04.2012)
 