Convex Optimization



Convex optimization deals with finding the minimum of a function while ensuring that the search space remains well-behaved. The key requirement is that the objective function is convex and that the feasible region Ω (the set of allowed values for x) is also convex. This guarantees that any local minimum is also a global minimum, making convex optimization highly efficient and predictable.

A convex optimization problem is formally written as:

$$\min f(x), \quad x \in \Omega \subseteq \mathbb{R}^n$$

where:

- $f: \Omega \to \mathbb{R}$ is the convex objective function,
- $\Omega \subseteq \mathbb{R}^n$ is the convex feasible set (the set of allowed values for $x$).
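As a concrete instance of this template (a minimal sketch; the function, feasible set, step size, and iteration count are illustrative choices, not from the text), one can minimize the convex function $f(x) = (x-3)^2$ over the convex set $\Omega = [0, 10]$ with projected gradient descent:

```python
# Illustrative sketch: minimize the convex f(x) = (x - 3)^2 over the
# convex feasible set Omega = [0, 10] by projected gradient descent.
# (Example problem and hyperparameters are assumptions for demonstration.)

def f(x):
    return (x - 3.0) ** 2

def grad_f(x):
    return 2.0 * (x - 3.0)

def project(x, lo=0.0, hi=10.0):
    # Projection back onto the interval keeps every iterate inside Omega.
    return max(lo, min(hi, x))

x = 9.0                                # arbitrary starting point in Omega
for _ in range(200):
    x = project(x - 0.1 * grad_f(x))   # gradient step, then project

print(round(x, 4))  # → 3.0
```

Because both $f$ and $\Omega$ are convex, the iterates settle at the unique minimizer $x = 3$ regardless of the starting point.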


Local Minimum in Convex Optimization

A local minimum of a function $f: D(f) \to \mathbb{R}$ is a point $x$ where the function value is smaller than or equal to all nearby values within a small distance $\delta$:

$$\exists\, \delta > 0: \quad f(x) \le f(y) \quad \forall y \in D(f) \text{ satisfying } \|x - y\| < \delta.$$

This means that in a small neighborhood around $x$, no other point has a lower function value.
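The definition can be checked numerically by sampling the $\delta$-neighborhood of a candidate point (a sketch with an assumed example, $f(x) = x^2$ and candidate $x^* = 0$, not taken from the text):

```python
import numpy as np

# Illustrative check of the local-minimum condition: f(x*) <= f(y) for all
# sampled y with |x* - y| < delta. The function f(x) = x^2 and the
# candidate point x* = 0 are assumed examples.

f = lambda x: x ** 2
x_star, delta = 0.0, 0.5

rng = np.random.default_rng(0)
ys = x_star + rng.uniform(-delta, delta, size=1000)  # random points near x*
assert all(f(x_star) <= f(y) for y in ys)            # condition holds at x*
```

Sampling cannot prove the condition for every $y$, but it is a quick sanity check that a candidate point is not obviously beaten nearby.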

Local Minima 2D.png|500

Local Minima 3D.png|500

Local Minima are Global Minima

One of the most powerful properties of convex functions is that any local minimum is automatically a global minimum.

Lemma 1

If x is a local minimum of a convex function, then it is also a global minimum, meaning:

$$f(x) \le f(y) \quad \forall y \in D(f).$$

This property makes convex optimization particularly efficient because it eliminates the risk of getting trapped in local minima, unlike in non-convex problems.

Non-Convex Minima.png|500
In non-convex functions, local minima can be traps that prevent finding the global minimum.
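The absence of traps can be demonstrated empirically (a sketch; the convex function $f(x) = x^2 + e^x$, the step size, and the iteration count are assumptions chosen for illustration): gradient descent started from very different points converges to the same minimizer.

```python
import math

# Sketch: on the convex function f(x) = x^2 + e^x, gradient descent
# reaches the same global minimum from any starting point, since every
# local minimum of a convex function is global.
# (Step size and iteration count are hand-picked assumptions.)

def grad(x):
    return 2.0 * x + math.exp(x)   # derivative of x^2 + e^x

def descend(x, lr=0.05, steps=2000):
    for _ in range(steps):
        x -= lr * grad(x)
    return x

results = [descend(x0) for x0 in (-10.0, 0.0, 5.0)]
print([round(r, 4) for r in results])   # all three starts agree
```

On a non-convex objective the same experiment would generally return different answers depending on the starting point.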


Critical Points are Global Minima

A critical point of a function is where the gradient (first derivative) is zero. In the case of convex functions, a critical point is always a global minimum.

Lemma 2

If $f$ is convex and differentiable over an open domain $D(f)$, then any point $x \in D(f)$ where the gradient is zero is a global minimum:

$$\nabla f(x) = 0 \quad \Longrightarrow \quad x \text{ is a global minimum}.$$

This means that solving the equation $\nabla f(x) = 0$ is sufficient to find the global minimum in convex optimization problems.
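For a convex quadratic this reduces to a linear system (a sketch; the matrix $A$ and vector $b$ below are hypothetical values chosen for illustration): with $f(x) = \tfrac{1}{2}x^\top A x - b^\top x$ and $A$ positive definite, the gradient is $\nabla f(x) = Ax - b$, so the global minimum is found by solving $Ax = b$.

```python
import numpy as np

# Sketch: for the convex quadratic f(x) = 1/2 x^T A x - b^T x with A
# positive definite, grad f(x) = A x - b, so solving grad f(x) = 0 is a
# linear system. A and b are assumed example values.

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])    # symmetric positive definite => f is convex
b = np.array([1.0, 1.0])

x_star = np.linalg.solve(A, b)          # the unique critical point
assert np.allclose(A @ x_star - b, 0)   # gradient vanishes there
print(x_star)                           # global minimizer of f
```

This is exactly why convex quadratic problems (such as least squares) can be solved in closed form rather than by iterative search.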

Geometric intuition:
The tangent hyperplane (a generalization of a tangent line in higher dimensions) is horizontal at x, meaning the function stops decreasing and reaches its minimum.

gradient is zero.png|500
$f(x_1, x_2) = x_1^2 + x_2^2 - 9$ has a global minimum at $(0,0)$, where the gradient is zero.

$$\nabla f(x) = \begin{bmatrix} 2x_1 \\ 2x_2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}$$


Strictly Convex Functions

A strictly convex function is a convex function for which the defining inequality is strict: for all $x \ne y$ in the domain and all $\lambda \in (0, 1)$,

$$f(\lambda x + (1-\lambda)y) < \lambda f(x) + (1-\lambda) f(y).$$

The function curves strictly upward everywhere, which rules out flat regions and ensures it has at most one minimizer.

Recall from the image:

Strict convex.png

Lemma 3

If $f: D(f) \to \mathbb{R}$ is strictly convex, then it has at most one global minimum.

This property is useful because it guarantees uniqueness, meaning that optimization algorithms will always find the same solution.
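The contrast with mere convexity can be made concrete (a sketch; both example functions are assumptions chosen for illustration): the strictly convex $f_1(x) = x^2$ has exactly one minimizer, while the convex but not strictly convex $f_2(x) = \max(|x| - 1,\, 0)$ attains its minimum on an entire interval.

```python
import numpy as np

# Illustrative contrast: strictly convex f1 has a unique minimizer; the
# merely convex f2 is flat on [-1, 1], so its minimum is attained at many
# points. Both functions are assumed examples.

xs = np.linspace(-3, 3, 6001)             # dense grid over [-3, 3]
f1 = xs ** 2                              # strictly convex
f2 = np.maximum(np.abs(xs) - 1.0, 0.0)    # convex, flat around 0

argmins_f1 = xs[f1 == f1.min()]           # grid points attaining the minimum
argmins_f2 = xs[f2 == f2.min()]
print(len(argmins_f1), len(argmins_f2))   # one minimizer vs. many
```

For $f_2$, any point of the flat region is a valid answer, so different algorithms (or different runs) may legitimately return different minimizers; strict convexity removes that ambiguity.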


Summary

| Concept | Meaning |
| --- | --- |
| Convex optimization | Minimizing a convex function over a convex set ensures efficient and predictable results. |
| Local minimum | A point where $f(x) \le f(y)$ for all nearby $y$. |
| Local = Global | In convex functions, every local minimum is a global minimum. |
| Critical points | If $\nabla f(x) = 0$ in a convex function, then $x$ is a global minimum. |
| Strictly convex functions | Have at most one global minimum, ensuring uniqueness. |

Convex optimization provides a powerful and reliable framework for finding optimal solutions in various fields, from machine learning to economics. 🚀