Optimizing Optimizers
Michael Scherbela, 22. Mar 2023
A short overview of three papers, each of which tries to find better first-order optimizers by optimizing the optimizers themselves:
- Gradient Descent: The Ultimate Optimizer, NeurIPS 2022: Optimization of SGD hyperparameters using SGD
- Symbolic Discovery of Optimization Algorithms, 2023: Evolutionary search for new optimizers, which discovers a simple, robust optimizer that obtains a new SOTA on ImageNet
- Learning to learn by gradient descent by gradient descent, NIPS 2016: Replacing the hand-designed optimizer with a neural network