
Add ADAM optimization in STM for Gaussian Process training

somu15 opened this issue on Jul 31, 2022 · 0 comments

Reason

Currently, Gaussian Process (GP) training mostly uses conjugate gradient (CG) optimization through TAO. This issue proposes adding ADAM optimization, since it is popular for GP training and permits a stochastic selection of the training data at each iteration, which reduces the training cost. The training cost for a GP scales as O(N^3), where N is the size of the training data set.
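
For reference, an ADAM option would apply the standard Adam update to each GP hyperparameter $\theta$ at iteration $t$, with the stochastic gradient $g_t$ of the negative log marginal likelihood evaluated on a mini-batch of training points:

$$
\begin{aligned}
m_t &= \beta_1 m_{t-1} + (1-\beta_1)\, g_t, \\
v_t &= \beta_2 v_{t-1} + (1-\beta_2)\, g_t^2, \\
\hat{m}_t &= \frac{m_t}{1-\beta_1^t}, \qquad
\hat{v}_t = \frac{v_t}{1-\beta_2^t}, \\
\theta_t &= \theta_{t-1} - \frac{\eta\, \hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}.
\end{aligned}
$$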

Design

An ADAM optimization option will be added to the GaussianProcessHandler class in STM.
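
A minimal sketch of how such an Adam loop could look is below. It is written against a hypothetical free-standing interface: the `GradientFn` callback, the `adamOptimize` name, and the default hyperparameters are illustrative assumptions and do not reflect the actual `GaussianProcessHandler` API in STM.

```cpp
// Sketch of an Adam update loop for GP hyperparameters (illustrative only;
// the callback interface and function name below are assumptions, not STM API).
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <functional>
#include <numeric>
#include <random>
#include <vector>

// Hypothetical callback: gradient of the negative log marginal likelihood
// w.r.t. the hyperparameters, evaluated on a mini-batch of training indices
// (the stochastic selection is what reduces the per-iteration cost).
using GradientFn =
    std::function<std::vector<double>(const std::vector<double> & hyperparams,
                                      const std::vector<std::size_t> & batch)>;

std::vector<double>
adamOptimize(std::vector<double> hyperparams,
             const GradientFn & grad_fn,
             std::size_t num_samples,
             std::size_t batch_size,
             unsigned int num_iters,
             double lr = 1e-3,
             double beta1 = 0.9,
             double beta2 = 0.999,
             double eps = 1e-8)
{
  std::vector<double> m(hyperparams.size(), 0.0); // first-moment estimate
  std::vector<double> v(hyperparams.size(), 0.0); // second-moment estimate

  std::vector<std::size_t> indices(num_samples);
  std::iota(indices.begin(), indices.end(), 0);
  std::mt19937 rng(0);

  for (unsigned int t = 1; t <= num_iters; ++t)
  {
    // Stochastic selection of a mini-batch of training points.
    std::shuffle(indices.begin(), indices.end(), rng);
    std::vector<std::size_t> batch(
        indices.begin(), indices.begin() + std::min(batch_size, num_samples));

    const std::vector<double> g = grad_fn(hyperparams, batch);

    for (std::size_t i = 0; i < hyperparams.size(); ++i)
    {
      m[i] = beta1 * m[i] + (1.0 - beta1) * g[i];
      v[i] = beta2 * v[i] + (1.0 - beta2) * g[i] * g[i];
      const double m_hat = m[i] / (1.0 - std::pow(beta1, t)); // bias correction
      const double v_hat = v[i] / (1.0 - std::pow(beta2, t));
      hyperparams[i] -= lr * m_hat / (std::sqrt(v_hat) + eps);
    }
  }
  return hyperparams;
}
```

Presumably the batch size, learning rate, and iteration count would be exposed as new input parameters on the trainer; with a mini-batch of size B, each iteration then works with a B-by-B covariance matrix, so the per-iteration cost scales as O(B^3) rather than O(N^3).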

Impact

No anticipated impacts to existing objects.
