Convex Optimization in R
Last Updated on August 22, 2019
Optimization is a big part of machine learning. It sits at the core of most popular methods, from least squares regression to artificial neural networks.
In this post you will discover recipes for 5 optimization algorithms in R.
These methods can be useful as the core of your own implementation of a machine learning algorithm. You may also want to implement your own algorithm-tuning scheme to optimize the parameters of a model against some cost function.
A good example is optimizing the hyper-parameters of a blend of predictions from an ensemble of multiple child models.
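As a hedged sketch of that idea (the data and model predictions below are made up for illustration, not from the post): the blend weight of two child models can be tuned against a mean-squared-error cost using base R's `optim()`.

```r
# Hypothetical ensemble-blending example: find the weight w that best
# combines two child models' predictions as w*A + (1-w)*B.
set.seed(1)
actual <- sin(seq(0, 2 * pi, length.out = 50))
pred_a <- actual + rnorm(50, sd = 0.10)  # child model A (more accurate)
pred_b <- actual + rnorm(50, sd = 0.30)  # child model B (noisier)

# cost function: mean squared error of the weighted blend
cost <- function(w) mean((w * pred_a + (1 - w) * pred_b - actual)^2)

# one-dimensional bounded minimization of the cost over w in [0, 1]
result <- optim(par = 0.5, fn = cost, method = "Brent", lower = 0, upper = 1)
result$par  # best blend weight; should favor the more accurate model A
```

Because model A is the more accurate child model here, the optimized weight lands well above 0.5.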
Kick-start your project with my new book Machine Learning Mastery With R, including step-by-step tutorials and the R source code files for all examples.
Let’s get started.
Golden Section Search
Golden Section Search is a Line Search method for optimization in one dimension. It assumes the function is unimodal on the search interval, and it is a Direct Search (Pattern Search) method in that it only samples and compares function values rather than computing derivatives. At each step it narrows the bracketing interval using the golden ratio, which allows one of the interior function evaluations to be reused in the next iteration.
The Golden Section Search is related to pattern searches of discrete ordered lists such as the Binary Search and the Fibonacci Search.
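The interval-narrowing procedure described above can be sketched in a few lines of R. This is a minimal illustration (not the post's exact code); base R's `optimize()` performs a comparable search and is shown alongside for comparison.

```r
# Minimal Golden Section Search sketch: shrink a bracketing interval [a, b]
# around the minimum of a unimodal function using the golden ratio.
golden_section <- function(f, a, b, tol = 1e-6) {
  phi <- (sqrt(5) - 1) / 2            # golden ratio conjugate, ~0.618
  x1 <- b - phi * (b - a)             # lower interior sample point
  x2 <- a + phi * (b - a)             # upper interior sample point
  while (abs(b - a) > tol) {
    if (f(x1) < f(x2)) {              # minimum must lie in [a, x2]
      b <- x2; x2 <- x1; x1 <- b - phi * (b - a)
    } else {                          # minimum must lie in [x1, b]
      a <- x1; x1 <- x2; x2 <- a + phi * (b - a)
    }
  }
  (a + b) / 2                         # midpoint of the final interval
}

f <- function(x) (x - 2)^2            # simple unimodal test function
golden_section(f, 0, 5)               # converges near 2
optimize(f, c(0, 5))$minimum          # base R equivalent, for comparison
```

Note that `optimize()` combines golden section search with successive parabolic interpolation, so it typically converges in fewer function evaluations than the plain sketch above.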