Adaptive Moment Estimator (Adam) Optimizer in ITK v3
Please use this identifier to cite or link to this publication: http://hdl.handle.net/10380/3600
New: Prefer the following DOI: https://doi.org/10.54294/gsd7s9
Published in The Insight Journal - 2019 January-December.
This document describes an ITK class implementing the Adaptive Moment Estimator (Adam) optimizer within the Insight Toolkit (ITK, www.itk.org). Adam is an adaptive gradient descent optimizer that independently estimates a step size for each parameter, at each iteration, from stored past gradients. The optimizer keeps exponentially decaying averages of past gradients to estimate the first moment (the mean) and the second moment (the uncentered variance) of the gradients, and uses these estimates to formulate the update rule for the current iteration. The Adam optimizer compares favorably to other adaptive learning-rate algorithms, converges faster, and is robust to saddle points. This paper is accompanied by the source code, input data, parameters, and output data that the authors used to validate the algorithm described in this paper.
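For reference, the description above matches the standard Adam formulation of Kingma and Ba (2015). A sketch of that update rule, with gradient g_t at iteration t, decay rates beta_1 and beta_2, step size alpha, and a small stabilizing constant epsilon (all symbols from the original Adam paper, not from this publication), is:

```latex
% Standard Adam update (Kingma & Ba, 2015)
m_t = \beta_1 m_{t-1} + (1 - \beta_1)\, g_t        % first-moment (mean) estimate
v_t = \beta_2 v_{t-1} + (1 - \beta_2)\, g_t^2      % second-moment (variance) estimate
\hat{m}_t = \frac{m_t}{1 - \beta_1^t}, \qquad
\hat{v}_t = \frac{v_t}{1 - \beta_2^t}              % bias correction of the running averages
\theta_t = \theta_{t-1} - \alpha\, \frac{\hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}  % per-parameter adaptive step
```

The following is a minimal, generic C++ sketch of one such step over a parameter vector. It is offered only to illustrate the update rule; the names, signatures, and defaults below are illustrative assumptions and are not the API of the ITK class described in this paper (the defaults alpha = 1e-3, beta_1 = 0.9, beta_2 = 0.999, epsilon = 1e-8 are those suggested by Kingma and Ba):

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical state holder for the running moment estimates; not the
// ITK class from the paper.
struct AdamState
{
  std::vector<double> m;     // first-moment (mean) estimates
  std::vector<double> v;     // second-moment (variance) estimates
  unsigned long       t = 0; // iteration counter, used for bias correction
};

// Apply one Adam update to `params` given the current gradient `grad`.
void AdamStep(std::vector<double> &       params,
              const std::vector<double> & grad,
              AdamState &                 s,
              double alpha = 1e-3,  // step size
              double beta1 = 0.9,   // first-moment decay rate
              double beta2 = 0.999, // second-moment decay rate
              double eps = 1e-8)    // guards against division by zero
{
  if (s.m.empty())
  {
    s.m.assign(params.size(), 0.0);
    s.v.assign(params.size(), 0.0);
  }
  ++s.t;
  for (std::size_t i = 0; i < params.size(); ++i)
  {
    // Exponentially decaying averages of the gradient and squared gradient.
    s.m[i] = beta1 * s.m[i] + (1.0 - beta1) * grad[i];
    s.v[i] = beta2 * s.v[i] + (1.0 - beta2) * grad[i] * grad[i];
    // Correct the initialization bias of the running averages.
    const double mHat = s.m[i] / (1.0 - std::pow(beta1, s.t));
    const double vHat = s.v[i] / (1.0 - std::pow(beta2, s.t));
    // Each parameter gets its own adaptively scaled step.
    params[i] -= alpha * mHat / (std::sqrt(vHat) + eps);
  }
}
```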