Course, academic year 2016/2017
Neural Networks - NAIL002
Czech title: Neuronové sítě
Guaranteed by: Department of Theoretical Computer Science and Mathematical Logic (32-KTIML)
Faculty: Faculty of Mathematics and Physics
Actual: from 2014
Semester: winter
E-Credits: 9
Hours per week, examination: winter s.:4/2 C+Ex [hours/week]
Capacity: unlimited
Min. number of students: unlimited
State of the course: taught
Language: Czech, English
Teaching methods: full-time
Guarantor: doc. RNDr. Iveta Mrázová, CSc.
RNDr. František Mráz, CSc.
Class: Informatika Mgr. - Teoretická informatika
Classification: Informatics > Theoretical Computer Science
Annotation -
Last update: RNDr. Filip Zavoral, Ph.D. (03.04.2001)

The theory of neural networks is motivated by results achieved in research on the central nervous system. These findings often form the basis of derived mathematical models which, despite significant simplification of the real neurophysiological processes, exhibit some features of natural intelligence. Such models can be used to design non-traditional computational means for solving many practical problems.
Aim of the course -
Last update: T_KTI (26.05.2008)

To teach the theory, algorithms, and commonly used methods for various architectures of neural networks.

Literature - Czech
Last update: doc. RNDr. Iveta Mrázová, CSc. (30.04.2015)

Abu-Mostafa Y. S., Magdon-Ismail M., Lin H.-T.: Learning From Data: A Short Course, 2012

Goldberg D. E.: Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, Reading, Mass. 1989

Haykin S.: Neural Networks and Learning Machines, 3rd Edition, Pearson, 2009

Kohonen T.: Self-Organizing Maps, Springer-Verlag, 1995

Rojas R.: Neural Networks: A Systematic Introduction, Springer-Verlag, Berlin, 1996

Šíma J. and Neruda R.: Teoretické otázky neuronových sítí, Matfyz Press, Praha, 1997

Syllabus -
Last update: doc. RNDr. Iveta Mrázová, CSc. (06.05.2015)

1. Introduction to the area of artificial neural networks

  • Biological neuron and neural networks, transmission of signals in axons and synapses, information processing in neurons, main parts of the brain.
  • History and fundamental principles of artificial neural networks.
  • Adaptation and learning, a formal description of patterns.
  • Selection and ordering of pattern features, selection and ordering of training patterns. Principal Component Analysis.
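The Principal Component Analysis mentioned in the last bullet can be sketched in a few lines of NumPy; this is an illustrative toy example (data and dimensions are made up, not part of the course materials):

```python
import numpy as np

# Toy data: 200 points stretched along the first axis (illustrative only).
rng = np.random.default_rng(0)
data = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])

# Center the data and compute the covariance matrix.
centered = data - data.mean(axis=0)
cov = centered.T @ centered / (len(centered) - 1)

# Eigen-decomposition: eigenvectors are the principal components,
# eigenvalues give the variance captured along each component.
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]          # sort by decreasing variance
components = eigvecs[:, order]

# Project onto the first principal component (dimensionality reduction 2 -> 1).
projected = centered @ components[:, :1]
print(projected.shape)
```

Keeping only the leading components orders the pattern features by how much variance they explain, which is one way to select and rank features of training patterns.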

2. Early models of artificial neural networks

  • The model of a formal neuron, weights, potential, transfer function.
  • Main types of artificial neural networks.
  • Connectionism, training and recall, supervised learning and self-organization, knowledge extraction, generalization and robustness.
  • Perceptron and linear separability, separating hyperplane. Perceptron training algorithm and its convergence, the pocket algorithm.
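The perceptron training algorithm above can be sketched as follows; the AND function used as training data is a hypothetical example chosen because it is linearly separable, so the algorithm is guaranteed to converge:

```python
import numpy as np

def perceptron_train(X, y, epochs=100):
    """Rosenblatt perceptron training for labels y in {-1, +1}.

    Converges whenever the data are linearly separable."""
    X = np.hstack([X, np.ones((len(X), 1))])   # absorb the bias as an extra weight
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(X, y):
            if yi * (w @ xi) <= 0:             # misclassified (or on the hyperplane)
                w += yi * xi                   # move the hyperplane toward the pattern
                errors += 1
        if errors == 0:                        # separating hyperplane found
            break
    return w

# Logical AND is linearly separable, so training must converge.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])
w = perceptron_train(X, y)
preds = np.sign(np.hstack([X, np.ones((4, 1))]) @ w)
print(preds)  # [-1. -1. -1.  1.]
```

The pocket algorithm extends this loop by additionally remembering the weight vector with the fewest errors seen so far, which is useful when the data are not separable.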

3. Feed-forward neural networks and error back-propagation

  • The back-propagation training algorithm and the derivation of weight adjustment rules. Training, test and validation sets, various training strategies.
  • Internal knowledge representation, generalization, over-fitting and over-sizing, Vapnik-Chervonenkis dimension.
  • Kolmogorov's theorem, function approximation, complexity of the learning problem.
  • Main areas and principles for applications of feed-forward neural networks.
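The back-propagation weight-adjustment rules can be sketched on a tiny sigmoid network; this is a minimal illustration (architecture, learning rate, and the XOR task are assumptions for the example, not prescribed by the course):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(1)
# XOR is not linearly separable, so a hidden layer is needed.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([[0.0], [1.0], [1.0], [0.0]])

W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)   # hidden layer: 4 units
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)   # output layer: 1 unit
lr = 0.5

err_initial = ((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) - t) ** 2).mean()

for _ in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    y = sigmoid(h @ W2 + b2)
    # Backward pass: the chain rule gives the local error terms (deltas).
    d_out = (y - t) * y * (1 - y)          # squared-error derivative * sigmoid'
    d_hid = (d_out @ W2.T) * h * (1 - h)   # error propagated back through W2
    # Gradient-descent weight adjustments.
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_hid; b1 -= lr * d_hid.sum(axis=0)

err_final = ((y - t) ** 2).mean()
print(err_initial, err_final)
```

In practice the error is monitored on a separate validation set to detect over-fitting, per the training strategies discussed above.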

4. Associative networks

  • Recurrent neural networks, Hebbian learning, memory capacity, attractors, energy function and convergence to stable states.
  • Associative memories, bidirectional associative memories (BAM), the Hopfield model, continuous Hopfield model, simulated annealing, the Boltzmann machine.
  • Hopfield networks in the search for suboptimal solutions of NP-complete problems.
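Hebbian learning and convergence to stable states can be illustrated with a small Hopfield network; the stored pattern below is an arbitrary example:

```python
import numpy as np

def hopfield_train(patterns):
    """Hebbian learning: W is the averaged outer product of the stored
    bipolar (+1/-1) patterns, with a zero diagonal."""
    P = np.array(patterns, dtype=float)
    W = P.T @ P / len(P)
    np.fill_diagonal(W, 0.0)
    return W

def hopfield_recall(W, x, sweeps=10):
    """Asynchronous threshold updates; each flip can only lower the
    energy E(x) = -1/2 x^T W x, so the net settles in a stable state."""
    x = x.copy()
    for _ in range(sweeps):
        for i in range(len(x)):
            x[i] = 1 if W[i] @ x >= 0 else -1
    return x

stored = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = hopfield_train([stored])
noisy = stored.copy()
noisy[0] = -noisy[0]                   # corrupt one bit
recalled = hopfield_recall(W, noisy)
print(recalled)
```

The stored pattern acts as an attractor: the corrupted input is pulled back to it, which is the associative-memory behavior described above.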

5. Self-organization and hybrid models

  • Unsupervised learning - Oja's algorithm for PCA.
  • Kohonen self-organizing feature maps and algorithms for their training, lateral inhibition, topological neighborhood.
  • Counter-propagation neural networks, RBF-networks, Adaptive Resonance Theory (ART).
  • Cascade correlation and modular neural networks - mixtures of local experts.
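Oja's algorithm from the first bullet fits in a few lines: a linear neuron trained with a Hebbian term plus a decay term converges to the first principal component of the input. The toy data set below is an assumption made for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
# Toy data whose dominant variance lies along the direction (1, 1)/sqrt(2).
data = rng.normal(size=(3000, 1)) @ np.array([[1.0, 1.0]]) / np.sqrt(2)
data += 0.1 * rng.normal(size=data.shape)

w = rng.normal(size=2)                 # random initial weight vector
eta = 0.01                             # small fixed learning rate
for x in data:
    y = w @ x                          # linear neuron output
    w += eta * y * (x - y * w)         # Hebbian term y*x with decay y^2 * w

print(w / np.linalg.norm(w))
```

The decay term keeps the weight vector at unit length, so no explicit normalization step is needed; the weights align (up to sign) with the leading eigenvector of the input covariance matrix.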

6. Genetic algorithms

  • Coding of the optimization problem, population of strings, fundamental genetic operators - selection, cross-over, mutation.
  • Fitness function. Convergence analysis - the schema theorem.
  • Applications of genetic algorithms in the field of artificial neural networks.
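The three fundamental operators can be sketched on the classic "one-max" toy problem (maximize the number of 1-bits in a string); the population size, string length, and mutation rate below are illustrative choices:

```python
import random

random.seed(0)
LENGTH, POP_SIZE, GENERATIONS = 20, 40, 60

def fitness(s):
    """Fitness function: the count of 1-bits (one-max)."""
    return sum(s)

def select(pop):
    """Tournament selection: the fitter of two random individuals wins."""
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(a, b):
    """One-point crossover of two bit strings."""
    p = random.randrange(1, LENGTH)
    return a[:p] + b[p:]

def mutate(s, rate=0.01):
    """Bit-flip mutation with a small per-bit probability."""
    return [bit ^ 1 if random.random() < rate else bit for bit in s]

pop = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    pop = [mutate(crossover(select(pop), select(pop))) for _ in range(POP_SIZE)]

best = max(pop, key=fitness)
print(fitness(best))
```

In neural-network applications the bit string instead encodes, e.g., connection weights or an architecture, and the fitness function measures the resulting network's performance.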
