Thesis (Selection of subject) (version: 368)
Thesis details
Thesis title in Czech: Aktivní hluboké učení
Thesis title in English: Active deep learning
Academic year of topic announcement: 2018/2019
Thesis type: dissertation
Thesis language:
Department: Department of Theoretical Computer Science and Mathematical Logic (32-KTIML)
Supervisor: prof. RNDr. Ing. Martin Holeňa, CSc.
Author: Mgr. Jiří Tumpach - assigned and confirmed by the Study Dept.
Date of registration: 30.09.2019
Date of assignment: 30.09.2019
Confirmed by Study dept. on: 04.10.2019
Guidelines
One of the most rapidly developing areas of machine learning is active learning (AL) -- learning of models in such a way that training data are actively selected to be most informative, or in some other way most useful, for the learned model. It is very advantageous in situations in which unlabeled inputs are abundant but labeling them is costly or time consuming, typically due to the involvement of human experts.
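As an illustration of such pool-based selection, the following minimal sketch assumes a scikit-learn classifier as a stand-in for a deep model and uncertainty sampling as the query criterion; the dataset, model, batch size, and number of rounds are all illustrative choices, not part of the assignment.

# Hypothetical sketch of pool-based active learning with uncertainty sampling.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

labeled = list(rng.choice(len(X), size=20, replace=False))   # small initial labeled set
pool = [i for i in range(len(X)) if i not in set(labeled)]   # large unlabeled pool
BATCH, ROUNDS = 10, 5

for _ in range(ROUNDS):
    model = MLPClassifier(hidden_layer_sizes=(32,), max_iter=300)
    model.fit(X[labeled], y[labeled])

    # Query the pool points the model is least certain about
    # (smallest maximum class probability).
    proba = model.predict_proba(X[pool])
    uncertainty = 1.0 - proba.max(axis=1)
    query = [pool[i] for i in np.argsort(-uncertainty)[:BATCH]]

    # In practice an oracle (e.g. a human expert) would label the queried points;
    # here the known labels y[query] simply stand in for that step.
    labeled.extend(query)
    pool = [i for i in pool if i not in set(query)]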
When applying active learning to neural networks, especially to deep neural networks, several specific problems need to be addressed:
1. The small amount of labeled training data assumed by AL algorithms can cause serious difficulties for the training of deep neural networks.
2. That problem is further exacerbated by the extensive experimentation typically performed when working with deep neural networks.
3. Active learning typically aims at decreasing uncertainty, but deep neural networks are rarely able to represent their uncertainty.
4. In the case of deep neural networks, active learning is typically performed in batch mode rather than by querying individual examples. Diversity within the batches then needs to be ensured to guarantee sufficient exploration of the feature space (see the sketch after this list).
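Points 3 and 4 can be addressed together in a single selection step. The following sketch assumes a PyTorch model, Monte Carlo dropout as the uncertainty proxy, and a simple distance-based diversity term; these are illustrative choices under stated assumptions, not the method prescribed by the assignment.

# Hypothetical sketch of points 3 and 4: Monte Carlo dropout as an uncertainty
# proxy for a deep network, combined with diversity-aware batch selection.
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Dropout(0.5), nn.Linear(64, 3))

def mc_dropout_entropy(x, T=20):
    # Predictive entropy estimated from T stochastic forward passes with dropout kept on.
    net.train()  # keeps dropout active during inference
    with torch.no_grad():
        probs = torch.stack([torch.softmax(net(x), dim=1) for _ in range(T)]).mean(0)
    return -(probs * probs.clamp_min(1e-12).log()).sum(dim=1)

def diverse_batch(x, batch_size=10):
    # Greedily pick uncertain points that are also mutually distant in input space.
    entropy = mc_dropout_entropy(x)
    chosen = [int(entropy.argmax())]
    while len(chosen) < batch_size:
        dist = torch.cdist(x, x[chosen]).min(dim=1).values  # distance to nearest chosen point
        score = entropy + dist                               # trade off uncertainty and diversity
        score[chosen] = -float("inf")
        chosen.append(int(score.argmax()))
    return chosen

pool = torch.randn(500, 20)          # stand-in for the unlabeled pool
print(diverse_batch(pool))

Greedy farthest-point selection is only one possible diversity criterion; the Yin et al. reference below treats the exploration-exploitation trade-off in batch-mode selection more systematically.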
The dissertation should deal with some of the above problems, both from a general methodological point of view and in connection with active learning in particular domains, which could include black-box optimization or malware detection.
References
* Y. Gal, R. Islam, and Z. Ghahramani. Deep Bayesian active learning with image data. In 34th International Conference on Machine Learning, pages 1183-1192, 2017.
* S.J. Huang, J.W. Zhao, and Z.Y. Liu. Cost-effective training of deep CNNs with active model adaptation. In ACM ICKDDM, pages 1580-1588, 2018.
* M. Kandemir. Variational closed-form deep neural net inference. Pattern Recognition Letters, 112:145-151, 2018.
* C. Yin, B. Qian, S. Cao, X. Li, J. Wei, Q. Zheng, and I. Davidson. Deep similarity-based batch mode active learning with exploration-exploitation. In IEEE ICDM, pages 575-584, 2017.