Choosing an adequate neural network for a new application is an intricate and time-consuming process. The task is often split into two phases: the first determines the architecture of the network, while the second selects the optimization algorithm responsible for its training. Taken together, this hyperparameter optimization problem can be viewed as an expensive blackbox problem, and it can therefore be handled by derivative-free optimization methods. This talk presents an adaptation of the MADS algorithm to this particular setting and the positive results this approach obtained on computer vision and anomaly detection problems.
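To illustrate the blackbox framing, the sketch below treats hyperparameter tuning as minimizing an opaque function with a toy coordinate poll search. This is only a simplified illustration of the poll-and-refine idea behind mesh-based methods such as MADS, not the actual algorithm presented in the talk; the `blackbox` objective is a hypothetical stand-in for "train the network with these hyperparameters and return the validation loss".

```python
def blackbox(x):
    # Hypothetical stand-in for an expensive evaluation: train a network
    # with hyperparameters x = (learning rate, weight decay) and return
    # its validation loss. Here a simple quadratic plays that role.
    lr, wd = x
    return (lr - 0.1) ** 2 + (wd - 0.01) ** 2

def pattern_search(f, x0, step=0.5, tol=1e-6, max_evals=10_000):
    """Toy derivative-free search: poll +/- step along each coordinate,
    and shrink the mesh when no poll point improves the incumbent.
    A much-simplified sketch of the poll/refine loop, not real MADS."""
    x, fx = list(x0), f(x0)
    evals = 1
    while step > tol and evals < max_evals:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                cand = list(x)
                cand[i] += d
                fc = f(cand)
                evals += 1
                if fc < fx:            # successful poll: move the incumbent
                    x, fx, improved = cand, fc, True
        if not improved:
            step *= 0.5                # unsuccessful poll: refine the mesh
    return x, fx

best, loss = pattern_search(blackbox, [1.0, 1.0])
```

No gradients of `blackbox` are ever requested, which is exactly why such methods apply when each evaluation is a full network training run.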
Please use this link to attend the virtual seminar: