
This class implements gradient descent with Nesterov momentum, also known as Nesterov accelerated gradient (NAG).
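
As a rough sketch of the idea behind NAG (the exact internals of this class may differ in detail), one step evaluates the gradient at a look-ahead point and combines it with the accumulated velocity:

# Hedged sketch of a single NAG step; `grad` is a hypothetical gradient
# function and is not part of this class.
nag_step <- function(x, v, grad, lr = 0.01, momentum = 0.9) {
  v_new <- momentum * v - lr * grad(x + momentum * v)  # look-ahead gradient
  list(x = x + v_new, v = v_new)                       # new value and velocity
}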

Super class

vistool::Optimizer -> OptimizerNAG

Active bindings

momentum

(numeric(1)) Momentum of the algorithm.

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.

Usage

OptimizerNAG$new(
  objective,
  x_start,
  lr = 0.01,
  momentum = 0.9,
  id = "NAG",
  print_trace = TRUE
)

Arguments

objective

(Objective)
The objective to optimize.

x_start

(numeric())
Start value of the optimization. Note that after the first call of $optimize(), the last value is used to continue the optimization. Access this value with $x.

lr

(numeric(1))
Step size with which the update is multiplied.

momentum

(numeric(1))
Momentum value.

id

(character(1))
Id of the object.

print_trace

(logical(1))
Indicator whether to print the status of $optimize().
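
A minimal usage sketch, assuming an existing vistool Objective instance obj (its construction is not shown here):

opt <- OptimizerNAG$new(
  objective = obj,     # assumed to be an already constructed Objective
  x_start = c(-1, 1),  # starting point; $x holds the current value after optimizing
  lr = 0.01,
  momentum = 0.9
)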


Method optimize()

Run the optimization for steps iterations.

Usage

OptimizerNAG$optimize(
  steps = 1L,
  step_size_control = function(x, u, obj, opt) {
    return(1)
  },
  minimize = NULL
)

Arguments

steps

(integer(1))
Number of steps/iterations.

step_size_control

(function()) A function with arguments x (the old input value), u (the update generated by $update()), obj (the objective object), and opt (the optimizer object).

minimize

(logical(1)) Indicator whether to minimize or maximize the objective. The default (NULL) uses the option defined in objective$minimize.
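
A hedged usage sketch of $optimize(), continuing the constructor example above; the custom step_size_control is purely illustrative and assumes the returned scalar scales the update:

opt$optimize(steps = 100L)  # run 100 NAG iterations with the defaults

# Illustrative only: scale every update by a constant factor of 0.5.
opt$optimize(
  steps = 10L,
  step_size_control = function(x, u, obj, opt) 0.5
)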


Method update()

Calculate the update for x.

Usage

OptimizerNAG$update(lr, mom, minimize)

Arguments

lr

(numeric(1)) The learning rate.

mom

(numeric(1)) The momentum.

minimize

(logical(1)) Indicator whether to minimize or maximize the objective (default is TRUE).
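
A hedged sketch of a direct call; $update() is normally invoked internally by $optimize(), and it is assumed here to return the raw update for the current value of $x:

u <- opt$update(lr = 0.01, mom = 0.9, minimize = TRUE)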


Method clone()

The objects of this class are cloneable with this method.

Usage

OptimizerNAG$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.