# Inexact Tensor Methods with Dynamic Accuracies

@inproceedings{Doikov2020InexactTM,
  title     = {Inexact Tensor Methods with Dynamic Accuracies},
  author    = {Nikita Doikov and Yurii Nesterov},
  booktitle = {ICML},
  year      = {2020}
}

In this paper, we study inexact high-order Tensor Methods for solving convex optimization problems with composite objective. At every step of such methods, we use an approximate solution of the auxiliary problem, defined by a bound on the residual in function value. We propose two dynamic strategies for choosing the inner accuracy: the first decreases as $1/k^{p + 1}$, where $p \geq 1$ is the order of the method and $k$ is the iteration counter, and the second approach is using for the…
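The first dynamic strategy can be illustrated with a toy sketch. Everything here is illustrative, not the paper's algorithm: the function name, the scalar setting, and the crude grid-refinement inner solver are assumptions. The one idea taken from the abstract is that at iteration $k$ the auxiliary model (here a $p = 2$ cubic-regularized Newton model) is minimized only until the residual in model value drops below $\delta_k = c/k^{p+1}$:

```python
def inexact_tensor_method(f, df, d2f, x0, M, p=2, iters=30, c=0.1):
    """Toy inexact tensor method (p = 2) with dynamic inner accuracy.

    At iteration k the auxiliary model is minimized only up to a residual
    in model value of delta_k = c / k**(p + 1), mimicking the first
    dynamic strategy described in the abstract. The inner solver is a
    deliberately crude grid refinement, a stand-in for a real subsolver.
    """
    x = x0
    for k in range(1, iters + 1):
        delta_k = c / k ** (p + 1)  # dynamic inner accuracy, 1/k^(p+1)

        def model(h):
            # Cubic-regularized second-order Taylor model around x.
            return (f(x) + df(x) * h + 0.5 * d2f(x) * h * h
                    + M * abs(h) ** 3 / 6.0)

        lo, hi = -1.0, 1.0
        h_best = 0.0
        while True:
            grid = [lo + i * (hi - lo) / 200 for i in range(201)]
            h_new = min(grid, key=model)
            if model(h_best) - model(h_new) <= delta_k:
                break  # improvement below delta_k: accept inexact step
            h_best = h_new
            # Shrink the search interval around the current best point.
            lo, hi = h_best - (hi - lo) / 10, h_best + (hi - lo) / 10
        x += h_best
    return x
```

For example, on the convex function $f(x) = x^4$ with `M=24.0` (a bound on the third derivative near the start), the iterates approach the minimizer at zero, with the stalling level controlled by how fast $\delta_k$ shrinks. The point of the sketch is only the accuracy schedule; any practical method would replace the grid search with a proper solver for the auxiliary problem.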

#### Supplemental Presentations

Presentation Slides

#### 4 Citations

Affine-invariant contracting-point methods for Convex Optimization

- Computer Science, Mathematics
- 2020

A general framework of Contracting-Point methods is proposed, which solve at each iteration an auxiliary subproblem restricting the smooth part of the objective function onto a contraction of the initial domain; this provides a systematic way of developing optimization methods of different orders, endowed with global complexity bounds.

Hyperfast Second-Order Local Solvers for Efficient Statistically Preconditioned Distributed Optimization

- Mathematics
- 2021

Statistical preconditioning can be used to design fast methods for distributed large-scale empirical risk minimization problems with strongly convex and smooth loss functions, allowing fewer…

Tensor methods for finding approximate stationary points of convex functions

- Mathematics, Physics
- 2019

In this paper we consider the problem of finding $\epsilon$-approximate stationary points of convex functions that are $p$-times differentiable with $\nu$-Hölder continuous $p$th derivatives. We…

Optimization Methods for Fully Composite Problems

- Mathematics
- 2021

In this paper, we propose a new Fully Composite Formulation of convex optimization problems. It includes, as a particular case, the problems with functional constraints, max-type minimization…

#### References

Showing 1–10 of 53 references.

Implementable tensor methods in unconstrained convex optimization

- Computer Science, Mathematics
- Math. Program.
- 2021

New tensor methods for unconstrained convex optimization are developed, which solve at each iteration an auxiliary problem of minimizing a convex multivariate polynomial, together with an efficient technique for solving this auxiliary problem based on the recently developed relative smoothness condition.

A Unified Adaptive Tensor Approximation Scheme to Accelerate Composite Convex Optimization

- Mathematics, Computer Science
- SIAM J. Optim.
- 2020

The results partially address the problem of incorporating adaptive strategies into high-order {\it accelerated} methods raised by Nesterov in (Nesterov-2018), although the strategies cannot assure convexity of the auxiliary problem; such adaptive strategies are already popular in high-order nonconvex optimization.

A Stochastic Tensor Method for Non-convex Optimization

- Mathematics, Computer Science
- ArXiv
- 2019

A stochastic optimization method is proposed that uses a fourth-order regularized model to find local minima of smooth and potentially non-convex objective functions; its implementation relies on tensor-vector products only.

On inexact solution of auxiliary problems in tensor methods for convex optimization

- Computer Science, Mathematics
- Optim. Methods Softw.
- 2021

This paper studies the auxiliary problems that appear in p-order tensor methods for unconstrained minimization of convex functions with ν-Hölder continuous pth derivatives, and proves that the referred methods take at most a bounded number of iterations to find either a suitable approximate stationary point of the tensor model or an ε-approximate stationary point of the original objective function.

Gradient methods for minimizing composite functions

- Mathematics, Computer Science
- Math. Program.
- 2013

In this paper we analyze several new methods for solving optimization problems with the objective function formed as a sum of two terms: one is smooth and given by a black-box oracle, and another is…

Inexact basic tensor methods

- Mathematics
- 2019

In this paper, we study local convergence of high-order Tensor Methods for solving convex optimization problems with composite objective. We justify local superlinear convergence under the assumption…

Tensor Methods for Minimizing Functions with Hölder Continuous Higher-Order Derivatives

- Physics
- 2019

In this paper we study $p$-order methods for unconstrained minimization of convex functions that are $p$-times differentiable with $\nu$-Hölder continuous $p$th derivatives. We propose tensor schemes…

Accelerating the cubic regularization of Newton’s method on convex problems

- Mathematics, Computer Science
- Math. Program.
- 2008

An accelerated version of the cubic regularization of Newton's method is proposed that converges for the same problem class at an improved rate, keeping the complexity of each iteration unchanged; it is argued that for second-order schemes the class of non-degenerate problems differs from the standard class.

Convergence Rates of Inexact Proximal-Gradient Methods for Convex Optimization

- Mathematics, Computer Science
- NIPS
- 2011

This work shows that both the basic proximal-gradient method and the accelerated proximal-gradient method achieve the same convergence rate as in the error-free case, provided that the errors decrease at appropriate rates.

Sub-sampled Cubic Regularization for Non-convex Optimization

- Computer Science, Mathematics
- ICML
- 2017

This work provides a sampling scheme that gives sufficiently accurate gradient and Hessian approximations to retain the strong global and local convergence guarantees of cubically regularized methods, and is the first work that gives global convergence guarantees for a sub-sampled variant of cubic regularization on non-convex functions.