First order methods
Now we have only first-order information from the oracle, i.e. the function value and its (sub)gradient at the queried point.
Gradient descent
Original Cauchy paper
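To make the first-order setting concrete, here is a minimal gradient descent sketch: the oracle returns the gradient and the iterate moves against it. The quadratic objective, the constant step size and the stopping tolerance below are illustrative assumptions, not part of the original note.

```python
import numpy as np

def gradient_descent(grad, x0, alpha=0.1, tol=1e-8, max_iter=1000):
    """Plain gradient descent: x_{k+1} = x_k - alpha * grad(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - alpha * g
    return x

# Assumed example: minimize f(x) = 1/2 x^T A x - b^T x, whose gradient is A x - b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = gradient_descent(lambda x: A @ x - b, x0=np.zeros(2))
print(x_star, np.linalg.solve(A, b))  # the two should be close
```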
Subgradient descent
We consider the classical convex optimization problem:
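A minimal sketch of the subgradient method under my own assumptions: a diminishing step size alpha_0 / sqrt(k + 1), an l1-type objective as the nonsmooth example, and tracking of the best iterate, since the objective value need not decrease monotonically.

```python
import numpy as np

def subgradient_descent(subgrad, f, x0, alpha0=1.0, max_iter=500):
    """Subgradient method with diminishing steps alpha_k = alpha0 / sqrt(k + 1).

    Keeps the best point seen so far, since f(x_k) is not guaranteed to decrease.
    """
    x = np.asarray(x0, dtype=float)
    x_best, f_best = x.copy(), f(x)
    for k in range(max_iter):
        g = subgrad(x)
        x = x - alpha0 / np.sqrt(k + 1) * g
        if f(x) < f_best:
            x_best, f_best = x.copy(), f(x)
    return x_best

# Assumed example: minimize the nonsmooth convex function f(x) = ||x - c||_1.
c = np.array([1.0, -2.0, 0.5])
f = lambda x: np.sum(np.abs(x - c))
subgrad = lambda x: np.sign(x - c)                    # one valid subgradient
print(subgradient_descent(subgrad, f, np.zeros(3)))   # approaches c
```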
Projected subgradient descent
Suppose we are to solve the following problem:
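A hedged sketch of projected subgradient descent: each subgradient step is followed by a Euclidean projection back onto the feasible set. The unit-ball constraint and the diminishing step size are assumptions chosen for the example, not taken from this page.

```python
import numpy as np

def projected_subgradient(subgrad, proj, x0, alpha0=1.0, max_iter=500):
    """x_{k+1} = proj(x_k - alpha_k * g_k) with alpha_k = alpha0 / sqrt(k + 1)."""
    x = proj(np.asarray(x0, dtype=float))
    for k in range(max_iter):
        x = proj(x - alpha0 / np.sqrt(k + 1) * subgrad(x))
    return x

# Assumed example: minimize ||x - c||_1 over the Euclidean unit ball.
c = np.array([2.0, -2.0])
subgrad = lambda x: np.sign(x - c)
proj_ball = lambda x: x / max(1.0, np.linalg.norm(x))   # projection onto {||x|| <= 1}
print(projected_subgradient(subgrad, proj_ball, np.zeros(2)))
```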
Mirror descent
The mirror descent method is a natural generalization of projected subgradient descent, in which the $\ell_2$ norm is replaced by a more general distance function.
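As an illustration of that idea, here is a sketch of mirror descent on the probability simplex with the entropy mirror map (the exponentiated-gradient update); the simplex setting, the step size and the linear cost in the example are my own choices.

```python
import numpy as np

def mirror_descent_simplex(grad, x0, alpha=0.1, max_iter=500):
    """Mirror descent on the probability simplex with the entropy mirror map.

    The multiplicative update x_{k+1, i} ∝ x_{k, i} * exp(-alpha * g_i) replaces
    the Euclidean projection step of projected subgradient descent.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        w = x * np.exp(-alpha * grad(x))
        x = w / w.sum()
    return x

# Assumed example: minimize f(x) = <c, x> over the simplex; the optimum puts all
# mass on the coordinate with the smallest cost.
c = np.array([3.0, 1.0, 2.0])
x0 = np.ones(3) / 3
print(mirror_descent_simplex(lambda x: c, x0, alpha=0.5))  # close to (0, 1, 0)
```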
Stochastic gradient descent
Suppose our target function is a sum of functions.
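A minimal SGD sketch for such a finite sum: at every step a single summand is sampled and its gradient is used in place of the full gradient. The least-squares example, the constant step size and the per-epoch shuffling are assumptions made for illustration only.

```python
import numpy as np

def sgd(grad_i, n, x0, alpha=0.01, epochs=50, seed=0):
    """SGD for f(x) = (1/n) * sum_i f_i(x): step along a single sampled grad f_i."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(epochs):
        for i in rng.permutation(n):   # one pass over reshuffled indices
            x = x - alpha * grad_i(x, i)
    return x

# Assumed example: least squares with f_i(x) = 0.5 * (a_i^T x - b_i)^2.
rng = np.random.default_rng(1)
A = rng.normal(size=(100, 3))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true
grad_i = lambda x, i: (A[i] @ x - b[i]) * A[i]
print(sgd(grad_i, n=100, x0=np.zeros(3)))   # close to x_true
```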
Stochastic average gradient
We consider the classical problem of minimizing a finite sum of smooth convex functions.
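A sketch of the stochastic average gradient idea, under stated assumptions: a table keeps the most recently evaluated gradient of every summand, and each step moves along the average of the stored gradients, refreshing only one entry per iteration. The conservative step size 1 / (16 L_max) used in the example is a common recommendation for SAG, not a value taken from this page.

```python
import numpy as np

def sag(grad_i, n, x0, alpha=0.01, epochs=50, seed=0):
    """Stochastic Average Gradient: store the last gradient seen for every f_i
    and step along the average of the stored gradients."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    table = np.zeros((n, x.size))      # stored gradients, one row per f_i
    g_sum = np.zeros_like(x)           # running sum of the table rows
    for _ in range(epochs * n):
        i = rng.integers(n)
        g_new = grad_i(x, i)
        g_sum += g_new - table[i]      # keep the sum consistent with the table
        table[i] = g_new
        x = x - alpha * g_sum / n
    return x

# Same assumed finite-sum least-squares example as for SGD.
rng = np.random.default_rng(1)
A = rng.normal(size=(100, 3))
b = A @ np.array([1.0, -2.0, 0.5])
grad_i = lambda x, i: (A[i] @ x - b[i]) * A[i]
L_max = np.max(np.sum(A**2, axis=1))   # Lipschitz constant of the worst grad f_i
print(sag(grad_i, n=100, x0=np.zeros(3), alpha=1.0 / (16 * L_max), epochs=100))
```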
ADAM: A Method for Stochastic Optimization
Adam is a stochastic first-order optimization algorithm that accumulates historical information about stochastic gradients and uses it in an attempt to estimate second…
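A compact sketch of the Adam update, assuming the standard defaults beta1 = 0.9 and beta2 = 0.999: exponential moving averages of the gradient and of its elementwise square are bias-corrected and combined into a per-coordinate step. The badly scaled quadratic used as the test function is my own example.

```python
import numpy as np

def adam(grad, x0, lr=0.05, beta1=0.9, beta2=0.999, eps=1e-8, max_iter=1000):
    """Adam: moving averages of the gradient (m) and its elementwise square (v),
    with bias correction, give a per-coordinate scaled step."""
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)
    v = np.zeros_like(x)
    for t in range(1, max_iter + 1):
        g = grad(x)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g**2
        m_hat = m / (1 - beta1**t)        # bias-corrected first moment
        v_hat = v / (1 - beta2**t)        # bias-corrected second moment
        x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
    return x

# Assumed example: a badly scaled quadratic f(x) = 100 x1^2 + x2^2, where the
# per-coordinate scaling of Adam helps.
grad = lambda x: np.array([200.0 * x[0], 2.0 * x[1]])
print(adam(grad, x0=np.array([1.0, 1.0])))   # both coordinates driven towards 0
```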
Lookahead Optimizer: k steps forward, 1 step back
The lookahead method provides an interesting way to accelerate and stabilize algorithms from the stochastic gradient descent family. The main idea is quite simple:
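A sketch of that idea under my own assumptions: the "fast" weights take k steps of some inner optimizer starting from the "slow" weights, and the slow weights then move a fraction of the way towards the result. Plain gradient descent on a quadratic stands in for the inner optimizer here purely for illustration.

```python
import numpy as np

def lookahead(inner_step, x0, k=5, alpha_slow=0.5, outer_iters=200):
    """Lookahead: run k steps of an inner ('fast') optimizer from the slow
    weights, then move the slow weights a fraction alpha_slow towards the result."""
    slow = np.asarray(x0, dtype=float)
    for _ in range(outer_iters):
        fast = slow.copy()
        for _ in range(k):                           # k steps forward
            fast = inner_step(fast)
        slow = slow + alpha_slow * (fast - slow)     # one (partial) step back
    return slow

# Assumed example: the inner optimizer is plain gradient descent on a quadratic;
# any stochastic first-order method could be plugged in instead.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
inner_step = lambda x: x - 0.1 * (A @ x - b)
print(lookahead(inner_step, x0=np.zeros(2)), np.linalg.solve(A, b))
```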
Materials
Visualization of optimization algorithms.