Course, academic year 2023/2024
Neural nets in particle physics - NJSF138
Title: Neuronové sítě v částicové fyzice
Guaranteed by: Institute of Particle and Nuclear Physics (32-UCJF)
Faculty: Faculty of Mathematics and Physics
Actual: from 2015
Semester: winter
E-Credits: 4
Hours per week, examination: winter s.:2/1, Ex [HT]
Capacity: unlimited
Min. number of students: unlimited
4EU+: no
Virtual mobility / capacity: no
State of the course: taught
Language: Czech, English
Teaching methods: full-time
Guarantor: Mgr. Tomáš Sýkora, Ph.D.
Annotation -
Last update: doc. Mgr. Milan Krtička, Ph.D. (29.04.2019)
The use of neural networks in particle physics. Intended for students in the 1st year of the Master's programme and higher.
Course completion requirements -
Last update: Mgr. Tomáš Sýkora, Ph.D. (12.06.2019)

Oral exam, which includes a presentation of a computer solution (prepared by the student) of a pre-selected problem.

Literature -
Last update: T_UCJF (19.03.2015)

Christopher M. Bishop, Neural Networks for Pattern Recognition, ISBN-13: 978-0198538646

Christopher M. Bishop, Pattern Recognition and Machine Learning, ISBN-13: 978-0387310732

Requirements to the exam -
Last update: Mgr. Tomáš Sýkora, Ph.D. (12.06.2019)

For the exam, the student prepares a computer program on a topic agreed with the lecturer: either the student proposes a topic he or she is working on, or the student chooses a topic suggested by the lecturer.

In addition to a discussion of the program's solution, the student answers three questions drawn from the lecture syllabus.

Syllabus -
Last update: T_UCJF (18.04.2012)
  • elements of probability theory - probability density, covariance, Bayesian definition of probability, the Gaussian distribution, data approximation
  • probability distribution - the beta distribution, the Dirichlet distribution, maximum likelihood of the Gaussian distribution, the Student t-distribution
  • linear models for regression - linear basis models, maximum likelihood and least squares, sequential learning, regularized minimal squares, bias-variance decomposition, Bayesian linear regression, limitations of linear models with fixed basis
  • linear classification models - discriminant functions, probabilistic discriminative models, the Laplace approximation, Bayesian logistic regression
  • neural nets - training, error backpropagation, the Hessian matrix, regularization in neural network, Bayesian neural networks
  • dual use of neural networks - approximations and decisions
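The syllabus items on neural-net training and error backpropagation can be illustrated with a minimal sketch (not part of the official course materials; the network size, learning rate, and toy target are illustrative assumptions): a one-hidden-layer network fitted to a toy regression problem by full-batch gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: learn y = sin(x) on [-pi, pi]
X = rng.uniform(-np.pi, np.pi, size=(200, 1))
y = np.sin(X)

# Network parameters: 1 -> 16 -> 1, tanh hidden units (sizes are arbitrary)
W1 = rng.normal(0, 0.5, size=(1, 16))
b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, size=(16, 1))
b2 = np.zeros(1)

lr = 0.05
losses = []
for epoch in range(500):
    # Forward pass
    h = np.tanh(X @ W1 + b1)              # hidden activations
    y_hat = h @ W2 + b2                   # linear output
    err = y_hat - y
    losses.append(float(np.mean(err ** 2)))

    # Backward pass (error backpropagation for the MSE loss)
    n = len(X)
    g_out = 2 * err / n                   # dL/dy_hat
    gW2 = h.T @ g_out
    gb2 = g_out.sum(axis=0)
    g_h = (g_out @ W2.T) * (1 - h ** 2)   # tanh'(a) = 1 - tanh(a)^2
    gW1 = X.T @ g_h
    gb1 = g_h.sum(axis=0)

    # Gradient-descent update
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

print(f"MSE: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

Running the loop, the mean-squared error decreases over the epochs; regularization and Bayesian treatments of the weights (also listed in the syllabus) would modify the loss and the update rule.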

 
Charles University | Information system of Charles University | http://www.cuni.cz/UKEN-329.html