<Deep Learning> A simple way to distinguish the different optimizers in deep learning
The optimizers
Let’s start by listing the optimizers you are most likely to encounter in a project.
- SGDOptimizer
- MomentumOptimizer
- NesterovOptimizer
- AdagradOptimizer
- AdadeltaOptimizer
- RMSPropOptimizer
- AdamOptimizer
- NadamOptimizer
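A quick way to tell these apart is to look at their update rules side by side. Below is a minimal pure-Python sketch (not any framework's API; the function names are illustrative) of three representative rules from the list — plain SGD, Momentum, and Adam — applied to minimizing the toy function f(x) = x², whose gradient is 2x.

```python
import math

def sgd_step(x, grad, lr=0.1):
    # Plain SGD: step directly against the gradient.
    return x - lr * grad

def momentum_step(x, v, grad, lr=0.1, beta=0.9):
    # Momentum: accumulate an exponentially decaying velocity,
    # then step along it. Smooths out noisy gradients.
    v = beta * v + grad
    return x - lr * v, v

def adam_step(x, m, s, t, grad, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    # Adam: keep first-moment (m) and second-moment (s) estimates
    # of the gradient, correct their initialization bias, and scale
    # the step by the estimated gradient magnitude.
    m = b1 * m + (1 - b1) * grad
    s = b2 * s + (1 - b2) * grad * grad
    m_hat = m / (1 - b1 ** t)
    s_hat = s / (1 - b2 ** t)
    return x - lr * m_hat / (math.sqrt(s_hat) + eps), m, s

# Minimize f(x) = x^2 (gradient 2x) starting from x = 5.0.
x_sgd = x_mom = x_adam = 5.0
v = m = s = 0.0
for t in range(1, 101):
    x_sgd = sgd_step(x_sgd, 2 * x_sgd)
    x_mom, v = momentum_step(x_mom, v, 2 * x_mom)
    x_adam, m, s = adam_step(x_adam, m, s, t, 2 * x_adam)

# All three iterates move toward the minimum at x = 0.
print(x_sgd, x_mom, x_adam)
```

The other optimizers in the list are variations on these ingredients: Nesterov evaluates the gradient at the look-ahead point of Momentum, Adagrad/Adadelta/RMSProp introduce the per-parameter second-moment scaling that Adam also uses, and Nadam combines Adam with the Nesterov look-ahead.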