Adaptive Moment Estimator (Adam) Optimizer in ITK v3

Bhavya Ajani*, Aditya Bharadwaj
Abstract

This document describes an ITK class implementing the Adaptive Moment Estimator (Adam) optimizer algorithm within the Insight Toolkit (ITK, www.itk.org). Adam is an adaptive gradient descent optimizer that estimates the gradient descent step for each parameter independently, at each iteration, from stored past gradients. The optimizer maintains exponentially decaying averages of past gradients and past squared gradients to estimate the first moment (the mean) and the second moment (the uncentered variance) of the gradients, and uses these estimates to formulate the update rule for the current iteration. Adam compares favorably to other adaptive learning-rate methods, converges faster, and is robust to saddle points. This paper is accompanied by the source code, input data, parameters, and output data that the authors used to validate the algorithm described here.
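
For illustration, below is a minimal, self-contained C++ sketch of one Adam update step as summarized above. This is not the itkAdamOptimizer interface from the accompanying source code; the function name AdamStep and the hyper-parameters (learning rate, beta1, beta2, epsilon, with the customary defaults from the Adam literature) are assumptions made for this sketch.

```cpp
// Minimal sketch of one Adam update step over a parameter vector.
// Illustrative only; it does not reflect the actual itkAdamOptimizer API.
// Hyper-parameter defaults follow common Adam conventions.
#include <cmath>
#include <vector>

void AdamStep(std::vector<double> &params,
              const std::vector<double> &gradient,
              std::vector<double> &m,  // first-moment (mean) estimate
              std::vector<double> &v,  // second-moment (variance) estimate
              std::size_t t,           // iteration count, starting at 1
              double lr = 0.001,
              double beta1 = 0.9,
              double beta2 = 0.999,
              double epsilon = 1e-8)
{
  for (std::size_t i = 0; i < params.size(); ++i)
  {
    // Exponentially decaying average of past gradients (first moment) ...
    m[i] = beta1 * m[i] + (1.0 - beta1) * gradient[i];
    // ... and of past squared gradients (second moment).
    v[i] = beta2 * v[i] + (1.0 - beta2) * gradient[i] * gradient[i];

    // Bias-corrected estimates: the raw averages are biased toward zero
    // during the first iterations.
    const double mHat = m[i] / (1.0 - std::pow(beta1, static_cast<double>(t)));
    const double vHat = v[i] / (1.0 - std::pow(beta2, static_cast<double>(t)));

    // Per-parameter adaptive gradient descent step.
    params[i] -= lr * mHat / (std::sqrt(vHat) + epsilon);
  }
}
```

In use, the moment vectors m and v are initialized to zero and carried across iterations, so each parameter's step size adapts to the history of its own gradients.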

Keywords

Adam, discrete optimization
Source Code and Data

Source/
  AdamOptimizerV3Test.cxx (7.9 KB)
  CMakeLists.txt (2.2 KB)
  IJMacros.txt (3.4 KB)
include/
  itkAdamOptimizer.cxx (5.2 KB)
  itkAdamOptimizer.h (3 KB)
