The following paper has been accepted to the International Conference on Machine Learning (ICML 2019). In this paper, we generalize and improve the dynamic neural network structure optimization method proposed in [Shirakawa et al. 2018], realizing a fast, robust, and widely applicable neural architecture search method.
[Reference] Youhei Akimoto, Shinichi Shirakawa, Nozomu Yoshinari, Kento Uchida, Shota Saito, and Kouhei Nishida: Adaptive Stochastic Natural Gradient Method for One-Shot Neural Architecture Search, Proceedings of the 36th International Conference on Machine Learning (ICML), Vol. 97 of PMLR, pp. 171-180 (2019) [Link] [arXiv] [Code]