Thesis (Selection of subject) (version: 368)
Thesis details
Leveraging lower fidelity proxies for neural network based NAS predictors
Thesis title in Czech: Využití výpočetně nenáročných proxy pro NAS prediktory založené na neuronových sítích
Thesis title in English: Leveraging lower fidelity proxies for neural network based NAS predictors
Key words: prohledávání architektur neuronových sítí|prediktory výkonnosti|proxy|automatické strojové učení|neuronové sítě
English key words: neural architecture search|performance predictors|proxy|AutoML|neural networks
Academic year of topic announcement: 2023/2024
Thesis type: diploma thesis
Thesis language: English
Department: Department of Theoretical Computer Science and Mathematical Logic (32-KTIML)
Supervisor: Mgr. Roman Neruda, CSc.
Author: hidden - assigned and confirmed by the Study Dept.
Date of registration: 07.12.2023
Date of assignment: 07.12.2023
Confirmed by Study dept. on: 07.12.2023
Date and time of defence: 10.06.2024 09:30
Date of electronic submission: 27.04.2024
Opponents: Mgr. Martin Pilát, Ph.D.
 
 
 
Guidelines
Neural architecture search (NAS) is an important research area of automated machine learning, aimed at designing an optimal, data-dependent neural model. The main drawback of NAS is the high time and computational cost of the search procedure. Performance predictors and proxies are therefore used to speed up its most expensive phase, the evaluation of candidate models.
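
To make the role of such proxies concrete, the following is a minimal sketch of a gradient-norm style zero-cost proxy in the spirit of [5]: a single forward/backward pass on an untrained network yields a cheap score that can stand in for trained accuracy when ranking candidates. The toy model, the random batch, and this particular proxy formula are illustrative assumptions, not part of the assignment.

import torch
import torch.nn as nn

def grad_norm_proxy(model: nn.Module, inputs: torch.Tensor, targets: torch.Tensor) -> float:
    """Score an untrained architecture by the L2 norm of its initial gradients."""
    model.zero_grad()
    # One forward/backward pass on randomly initialized weights; no training happens.
    loss = nn.functional.cross_entropy(model(inputs), targets)
    loss.backward()
    total = 0.0
    for p in model.parameters():
        if p.grad is not None:
            total += float(p.grad.norm() ** 2)
    return total ** 0.5

# Toy architecture and random CIFAR-sized batch (both hypothetical placeholders).
toy_net = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 128), nn.ReLU(), nn.Linear(128, 10))
x = torch.randn(16, 3, 32, 32)
y = torch.randint(0, 10, (16,))
print(grad_norm_proxy(toy_net, x, y))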

The goal of this work is to propose a new, efficient prediction procedure for neural architecture search. The student will explore combinations of learning-curve extrapolation predictors and zero-cost proxies, as well as strategies for applying them during the architecture search. The efficiency of the resulting algorithms will be evaluated on the standard NAS-Bench-201 benchmark.
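
As an illustration of the kind of combination the assignment asks for, the sketch below fits a saturating power law to a short prefix of each architecture's validation-accuracy curve, blends the extrapolated ranks with zero-cost proxy ranks, and scores the result by Kendall's tau, the usual ranking metric in predictor studies such as [4]. The rank-averaging scheme, the synthetic data, and all names are assumptions made for illustration; in practice the curves and proxy scores would come from NAS-Bench-201 [2] records.

import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import kendalltau, rankdata

def power_law(t, a, b, c):
    # Saturating power-law model often used for learning-curve extrapolation.
    return a - b * np.power(t, -c)

def extrapolate_final_accuracy(partial_curve, final_epoch):
    """Fit the power law to the observed epochs and predict accuracy at final_epoch."""
    t = np.arange(1, len(partial_curve) + 1, dtype=float)
    p0 = [partial_curve[-1], 1.0, 1.0]
    params, _ = curve_fit(power_law, t, partial_curve, p0=p0, maxfev=10000)
    return power_law(final_epoch, *params)

def combined_ranking(partial_curves, proxy_scores, final_epoch=200, alpha=0.5):
    """Blend ranks from curve extrapolation and a zero-cost proxy (alpha weights the curve part)."""
    lc_pred = np.array([extrapolate_final_accuracy(c, final_epoch) for c in partial_curves])
    return alpha * rankdata(lc_pred) + (1 - alpha) * rankdata(proxy_scores)

# Toy evaluation on synthetic data: Kendall tau between the combined ranking
# and the "true" final accuracies (both invented here for the example).
rng = np.random.default_rng(0)
true_acc = rng.uniform(0.6, 0.95, size=20)
curves = [a * (1 - np.exp(-0.3 * np.arange(1, 13))) for a in true_acc]   # 12-epoch prefixes
proxies = true_acc + rng.normal(0, 0.05, size=20)                        # noisy proxy scores
tau, _ = kendalltau(combined_ranking(curves, proxies), true_acc)
print(tau)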
References
[1] Frank Hutter, Lars Kotthoff, and Joaquin Vanschoren, Automated Machine Learning: Methods, Systems, Challenges, Springer, 2019

[2] Xuanyi Dong et al., NAS-Bench-201: Extending the Scope of Reproducible Neural Architecture Search, International Conference on Learning Representations (ICLR), 2020, https://arxiv.org/abs/2001.00326

[3] Esteban Real et al., Regularized Evolution for Image Classifier Architecture Search, AAAI, 2019, https://ojs.aaai.org/index.php/AAAI/article/view/4405

[4] Colin White et al., How Powerful are Performance Predictors in Neural Architecture Search?, NeurIPS, 2021, https://papers.nips.cc/paper_files/paper/2021/hash/ef575e8837d065a1683c022d2077d342-Abstract.html

[5] Mohamed S. Abdelfattah et al., Zero-Cost Proxies for Lightweight NAS, International Conference on Learning Representations (ICLR), 2021, https://arxiv.org/abs/2101.08134
 