
Convex optimization


Accelerated first-order methods for regularization – FISTA

Subgradient – The subgradient generalizes the notion of gradient/derivative and quantifies the rate of change of a non-differentiable/non-smooth function ([1], section 8.1 in [2]). For a real-valued function, the …
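For reference, the standard definition the excerpt is leading into (stated in LaTeX; the notation is mine and may differ from the post's): a vector g is a subgradient of f at x if

    f(y) \ge f(x) + \langle g, \, y - x \rangle \quad \text{for all } y,

and the set of all such g is the subdifferential \partial f(x). For example, for f(x) = |x| one has \partial f(0) = [-1, 1].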

α-strong convexity, β-strong smoothness – Strong convexity often allows for optimization algorithms that converge very quickly to an ϵ-optimum (cf. FISTA and NESTA). This post will cover some fundamentals of …
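As a quick reminder (standard definitions in LaTeX; the post's exact statement may differ), a differentiable f is α-strongly convex and β-smooth if, for all x and y,

    f(y) \ge f(x) + \langle \nabla f(x), y - x \rangle + \tfrac{\alpha}{2}\|y - x\|^2
    \quad\text{and}\quad
    f(y) \le f(x) + \langle \nabla f(x), y - x \rangle + \tfrac{\beta}{2}\|y - x\|^2.

Under both conditions gradient descent converges linearly, reaching an ϵ-optimum in O((β/α) log(1/ϵ)) iterations, which is the fast convergence the excerpt alludes to.
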
Obituaries: Harold Kuhn (1925–2014) –

Accelerating first-order methods – The lower bound on the oracle complexity of continuously differentiable, β-smooth convex functions is O(1/√ϵ) [Theorem 2.1.6, Nesterov04; Theorem 3.8, Bubeck14; Nesterov08]. General first-order gradient …
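A minimal Python sketch of the acceleration the excerpt refers to, Nesterov's accelerated gradient method for a β-smooth convex objective; its O(1/k²) error decay matches the O(1/√ϵ) oracle bound above. The quadratic test problem and all names below are illustrative assumptions, not taken from the post.

    import numpy as np

    def nesterov_accelerated_gradient(grad_f, x0, beta, n_iters=100):
        """Nesterov's accelerated gradient method for a beta-smooth convex f.

        grad_f : callable returning the gradient of f at a point
        x0     : starting point (numpy array)
        beta   : smoothness constant (Lipschitz constant of grad_f)

        Achieves f(x_k) - f* = O(1/k^2), i.e. an eps-optimum in O(1/sqrt(eps)) steps.
        """
        x = x0.copy()
        y = x0.copy()
        t = 1.0
        for _ in range(n_iters):
            x_next = y - grad_f(y) / beta                      # gradient step from the extrapolated point
            t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0  # momentum parameter update
            y = x_next + ((t - 1.0) / t_next) * (x_next - x)   # extrapolation (momentum) step
            x, t = x_next, t_next
        return x

    # Illustrative use on a least-squares objective f(x) = 0.5 * ||A x - b||^2 (synthetic data):
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 5))
    b = rng.standard_normal(20)
    beta = np.linalg.norm(A, 2) ** 2                           # = largest eigenvalue of A^T A
    x_hat = nesterov_accelerated_gradient(lambda x: A.T @ (A @ x - b), np.zeros(5), beta, n_iters=200)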
