Accepted to CGI 2020

The following paper has been accepted to Computer Graphics International (CGI) 2020 as a proceedings paper. This paper proposes an improved redirected walking technique using reinforcement learning, where redirected walking enables users to experience virtual reality in limited physical spaces.

Wataru Shibayama and Shinichi Shirakawa: Reinforcement Learning-Based Redirection Controller for Efficient Redirected Walking in Virtual Maze Environment, Computer Graphics International (CGI) 2020 (Accepted)

Accepted to PPSN 2020

The following paper has been accepted to the Sixteenth International Conference on Parallel Problem Solving from Nature (PPSN 2020). This paper improves the adaptive stochastic natural gradient method (ASNG) so that it works well on objective functions with low effective dimensionality.

Teppei Yamaguchi, Kento Uchida, and Shinichi Shirakawa: Adaptive Stochastic Natural Gradient Method for Optimizing Functions with Low Effective Dimensionality, The Sixteenth International Conference on Parallel Problem Solving from Nature (PPSN 2020) (Accepted)

New members!

The members page has been updated. Now, our laboratory has 15 graduate and 7 undergraduate students.

Members page

Accepted to ICANN 2019

The following paper has been accepted to the 28th International Conference on Artificial Neural Networks (ICANN 2019) as an oral presentation. In this paper, we propose a method to control architecture complexity by adding a penalty term to the dynamic optimization method of neural network structures [Shirakawa et al. 2018].

Shota Saito and Shinichi Shirakawa: Controlling Model Complexity in Probabilistic Model-Based Dynamic Optimization of Neural Network Structures, 28th International Conference on Artificial Neural Networks (ICANN 2019) (Accepted as oral presentation) [arXiv]

Accepted to ICML 2019

The following paper has been accepted to the International Conference on Machine Learning (ICML 2019). In this paper, we generalize and improve the dynamic optimization method of neural network structures proposed in [Shirakawa et al. 2018], realizing a fast, robust, and widely applicable neural architecture search method. [Reference] Youhei Akimoto, Shinichi Shirakawa, Nozomu Yoshinari, Kento Uchida, Shota Saito, and Kouhei Nishida: Adaptive Stochastic Natural Gradient Method for One-Shot Neural Architecture Search, Proceedings of the 36th International Conference on Machine Learning (ICML), Vol. [Read More]

Accepted to IEEE Transactions on Evolutionary Computation

The following paper has been accepted to IEEE Transactions on Evolutionary Computation. This paper is an extension of our GECCO 2018 paper, which theoretically analyzes the information geometric optimization (IGO) algorithm with an isotropic Gaussian distribution on convex quadratic functions.

Kento Uchida, Shinichi Shirakawa, Youhei Akimoto, “Finite-Sample Analysis of Information Geometric Optimization with Isotropic Gaussian Distribution on Convex Quadratic Functions,” IEEE Transactions on Evolutionary Computation (2019) (Accepted) [DOI]

New members!

The members page has been updated. Now, our laboratory has 12 graduate and 5 undergraduate students.

Members page

Accepted to Evolutionary Computation Journal

The following paper has been accepted to the Evolutionary Computation journal. This paper is an extension of our GECCO 2017 paper, which proposed an architecture search method for CNNs using genetic programming.

Masanori Suganuma, Masayuki Kobayashi, Shinichi Shirakawa, and Tomoharu Nagao, “Evolution of Deep Convolutional Neural Networks Using Cartesian Genetic Programming,” Evolutionary Computation, MIT Press (Accepted)

GECCO 2018

We are going to present the following papers at the Genetic and Evolutionary Computation Conference (GECCO) 2018 in Kyoto.

Kento Uchida, Youhei Akimoto, Shinichi Shirakawa, “Analysis of Information Geometric Optimization with Isotropic Gaussian Distribution Under Finite Samples” (accepted as a full paper).

Shota Saito, Shinichi Shirakawa, Youhei Akimoto, “Embedded Feature Selection Using Probabilistic Model-Based Optimization” (to be presented at the student workshop).

New members!

The members page has been updated. Now, our laboratory has eight graduate and six undergraduate students.

Members page