
All functions

hypothesis()
Hypothesis wrapper for functional models
LossFunction
Loss function
Objective
Objective function
Optimizer
Optimizer class
OptimizerGD
Gradient descent optimizer
OptimizerMomentum
Momentum optimizer
OptimizerNAG
Nesterov's momentum optimizer
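
A minimal sketch of combining an objective retrieved via obj() with one of the optimizer classes above; the dictionary key "TF_branin" and the constructor/optimize() arguments (x_start, lr, steps) are assumptions about the interface, not documented signatures:

library(vistool)

# Retrieve a test objective from the dictionary (key assumed).
objective = obj("TF_branin")

# Gradient descent on that objective; argument names are assumptions.
optimizer = OptimizerGD$new(objective, x_start = c(0, 0), lr = 0.01)
optimizer$optimize(steps = 100)
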
Visualizer
Base visualizer class
VisualizerLossFuns
Visualizer for loss functions
VisualizerModel
Visualize model (unified 1D/2D)
VisualizerObj
Visualize objective (unified 1D/2D)
VisualizerSurface
Visualize 2D functions as interactive surfaces
VisualizerSurfaceModel
Visualize model as interactive surface
VisualizerSurfaceObj
Visualize objective as interactive surface
as.data.table(<DictionaryLoss>)
Convert dictionary to data table
as_visualizer()
Convert to visualizer
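
A sketch of turning an objective into a plot via as_visualizer(); the type argument and the $plot() method are assumptions about the visualizer interface:

library(vistool)

# Convert an objective to a visualizer (interactive surface assumed) and plot it.
viz = as_visualizer(obj("TF_branin"), type = "surface")
viz$plot()
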
assert_step_size_control()
Assertion for the main signature of a step_size_control_* function.
dict_loss
Dictionary of loss functions
dict_objective
Dictionary of objective functions
get_continuous_colorscale()
Get continuous colorscale for surface plots
get_vistool_color()
Get consistent color from discrete palette
lss()
Retrieve loss function
merge_optim_archives()
Merge optimization archives
mlr_learners_regr.lm_formula / LearnerRegrLMFormula
Linear model regression learner with formula
obj()
Retrieve objective functions
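
A sketch of browsing the dictionaries and retrieving single entries; the keys "l2" and "TF_branin" are assumptions:

library(vistool)
library(data.table)

# List all registered loss functions as a table.
as.data.table(dict_loss)

# Retrieve a loss function and an objective by key (keys assumed).
loss = lss("l2")
objective = obj("TF_branin")
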
step_size_control_decay_exp()
Apply exponential decay to adjust the update. See https://neptune.ai/blog/how-to-choose-a-learning-rate-scheduler
step_size_control_decay_linear()
Apply linear decay to adjust the update. See https://neptune.ai/blog/how-to-choose-a-learning-rate-scheduler
step_size_control_decay_steps()
Apply a step-wise decay to adjust the update. See https://neptune.ai/blog/how-to-choose-a-learning-rate-scheduler
step_size_control_decay_time()
Apply time-based decay to adjust the update. See https://neptune.ai/blog/how-to-choose-a-learning-rate-scheduler
step_size_control_line_search()
Perform a line search in each iteration to adjust the update.
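
A sketch of plugging a step size control into an optimizer run; passing the control via a step_size_control argument of $optimize() is an assumption about the optimizer interface:

library(vistool)

optimizer = OptimizerGD$new(obj("TF_branin"), x_start = c(0, 0), lr = 0.1)

# Decay the step size exponentially over the iterations (argument name assumed).
optimizer$optimize(steps = 50, step_size_control = step_size_control_decay_exp)
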
vistool_theme()
vistool theming utilities