Thesis (Selection of subject) (version: 368)
Thesis details
Thesis title in Czech: Evoluce učitelných systémů
Thesis title in English: Evolution of Learnable Systems
Key words: evoluce, neuronové sítě, strojové učení
English key words: evolution, neural networks, machine learning
Academic year of topic announcement: 2015/2016
Thesis type: dissertation
Thesis language:
Department: Department of Software and Computer Science Education (32-KSVI)
Supervisor: RNDr. František Mráz, CSc.
Author: hidden - assigned and confirmed by the Study Dept.
Date of registration: 26.10.2016
Date of assignment: 26.10.2016
Confirmed by Study dept. on: 26.10.2016
Guidelines
Recurrent neural networks (RNNs) are a powerful time-aware model that plays a key role in machine learning. However, recurrent networks are notoriously difficult to train with standard gradient-based techniques [1]. To overcome this difficulty, various alternative network designs and training algorithms have been proposed, such as Long Short-Term Memory (LSTM, [2]), the Gated Recurrent Unit (GRU, [3]), and Echo State Networks [4].
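To illustrate the Echo State Network idea mentioned above, the following is a minimal NumPy sketch, not part of the assignment itself: the recurrent reservoir weights are random and fixed, and only a linear readout is trained (here by ridge regression) on a toy next-step prediction task. All sizes and constants (reservoir size, spectral radius 0.9, ridge coefficient) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the assignment)
n_in, n_res = 1, 100
washout, T = 50, 500

# Fixed random input and reservoir weights; rescale the reservoir
# to spectral radius 0.9, a common heuristic for the echo state property
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

# Toy task: predict the next value of a sine wave
u = np.sin(np.arange(T + 1) * 0.2).reshape(-1, 1)
inputs, targets = u[:-1], u[1:]

# Drive the reservoir and collect states; the recurrent part is never trained
x = np.zeros(n_res)
states = []
for t in range(T):
    x = np.tanh(W_in @ inputs[t] + W @ x)
    states.append(x)
X = np.array(states)[washout:]   # discard initial transient
Y = targets[washout:]

# Train only the linear readout by ridge regression
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)

pred = X @ W_out
mse = np.mean((pred - Y) ** 2)
```

Because only the readout is optimized, training reduces to a single linear solve, which is what makes ESNs attractive targets for evolving the remaining (otherwise untrained) reservoir parameters.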

In his master's thesis Maximizing Computational Power by Neuroevolution, Filip Matzner demonstrated that the performance of Echo State Networks can be further improved by neuroevolutionary methods. Well-known examples of such methods are NeuroEvolution of Augmenting Topologies (NEAT, [5]) and Hypercube-based NEAT (HyperNEAT, [6]).

The goal of this thesis is to explore the promising combination of evolution and trainable recurrent networks to a greater extent. A strong focus will be kept on producing large networks capable of solving complex tasks, e.g., in image processing.
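As a schematic of the evolutionary side of this combination, the sketch below shows a minimal (mu, lambda) evolution strategy. The fitness function here is a hypothetical stand-in (a simple sphere function) for "evaluate a network built from this genome"; population sizes, mutation strength, and the decay schedule are illustrative assumptions, not the methods of [5] or [6].

```python
import numpy as np

rng = np.random.default_rng(1)

def fitness(genome):
    # Hypothetical stand-in for evaluating a network encoded by `genome`;
    # a sphere function keeps the sketch self-contained (maximum at 0).
    return -np.sum(genome ** 2)

# Minimal (mu, lambda) evolution strategy with decaying mutation strength
mu, lam, dim = 5, 20, 10
sigma = 0.3
pop = rng.normal(0.0, 1.0, (mu, dim))

for gen in range(100):
    # Each offspring mutates a randomly chosen parent
    parents = pop[rng.integers(0, mu, lam)]
    offspring = parents + sigma * rng.normal(0.0, 1.0, (lam, dim))
    scores = np.array([fitness(g) for g in offspring])
    # Keep the mu best offspring (sorted ascending, best last)
    pop = offspring[np.argsort(scores)[-mu:]]
    sigma *= 0.97

best = pop[-1]
```

In a neuroevolutionary setting the genome would instead encode network parameters or topology, and each fitness evaluation would train and test the corresponding network, which is why such methods are computationally demanding for large networks.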
References
[1] Razvan Pascanu, Tomas Mikolov, and Yoshua Bengio. On the difficulty of training recurrent neural networks. ICML (3), 28:1310–1318, 2013.
[2] Sepp Hochreiter and Jürgen Schmidhuber. Long short-term memory. Neural computation, 9(8):1735–1780, 1997.
[3] Kyunghyun Cho, Bart van Merrienboer, Caglar Gulcehre, Dzmitry Bahdanau, Fethi Bougares, Holger Schwenk, and Yoshua Bengio. Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation. arXiv preprint arXiv:1406.1078, 2014.
[4] Herbert Jaeger. The echo state approach to analysing and training recurrent neural networks-with an erratum note. German National Research Center for Information Technology GMD Technical Report, 148:34, 2001.
[5] Kenneth O. Stanley and Risto Miikkulainen. Evolving neural networks through augmenting topologies. Evolutionary Computation, 10(2):99–127, 2002.
[6] Kenneth O. Stanley, David B. D’Ambrosio, and Jason Gauci. A hypercube-based encoding for evolving large-scale neural networks. Artificial Life, 15(2):185–212, 2009.
Charles University | Information system of Charles University | http://www.cuni.cz/UKEN-329.html