DOI: 10.36922/IJOCTA025390163
RESEARCH ARTICLE

Beyond Nesterov: Dynamical systems perspective with time-dependent inertia and conformable Bregman flows

Osama F. Abdel Aal1†, Necdet Sinan Ozbek1,2†, Jairo Viola1†, YangQuan Chen1†*
1 Department of Mechanical and Aerospace Engineering, School of Engineering, University of California, Merced, California, United States of America
2 Department of Electrical and Electronic Engineering, Faculty of Engineering, Adana Alparslan Turkes Science and Technology University, Adana, Turkiye
†These authors contributed equally to this work.
Received: 25 September 2025 | Revised: 6 November 2025 | Accepted: 10 November 2025 | Published online: 16 December 2025
© 2025 by the Author(s). This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0) (https://creativecommons.org/licenses/by-nc/4.0/).
Abstract

We present a Lyapunov-based framework for analyzing continuous-time accelerated optimization dynamics with time-dependent inertia and damping. By explicitly designing Lyapunov functions that account for varying inertia, we rigorously characterize convergence rates of the objective function, achieving exponential or polynomial acceleration beyond the classical O(1/t²), even in the absence of strong convexity. Building on this foundation, we introduce a variational extension using conformable (fractional) derivatives in the Lagrangian formulation, replacing the classical velocity term with a time-weighted fractional velocity. This approach systematically modulates the system’s effective inertia and damping, providing a principled mechanism to balance acceleration and stability, reduce oscillations, and interpolate smoothly between strongly damped gradient flows and momentum-driven dynamics. The resulting framework unifies Lyapunov analysis and fractional variational modeling, offering flexible, theoretically grounded design principles for fast and stable accelerated optimization.
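To make the objects in the abstract concrete, the display below sketches (i) the classical damped inertial ODE and Lyapunov function of Su, Boyd, and Candès for Nesterov's method, (ii) a generic time-dependent-inertia variant, where m(t) and γ(t) are placeholder coefficients rather than the specific choices analyzed in the paper, and (iii) the conformable derivative of Khalil et al. that underlies the variational extension:

\[
\ddot{X}(t) + \frac{3}{t}\,\dot{X}(t) + \nabla f\bigl(X(t)\bigr) = 0,
\qquad
\mathcal{E}(t) = t^{2}\bigl(f(X(t)) - f(x^{*})\bigr)
  + 2\,\Bigl\| X(t) + \tfrac{t}{2}\,\dot{X}(t) - x^{*} \Bigr\|^{2},
\]
\[
m(t)\,\ddot{X}(t) + \gamma(t)\,\dot{X}(t) + \nabla f\bigl(X(t)\bigr) = 0,
\qquad
T_{\alpha}x(t) = \lim_{\varepsilon \to 0}
  \frac{x\bigl(t + \varepsilon\, t^{1-\alpha}\bigr) - x(t)}{\varepsilon}
  = t^{1-\alpha}\,\dot{x}(t),
\quad \alpha \in (0,1].
\]

For convex f, the energy E(t) is nonincreasing along trajectories of the first ODE, which yields f(X(t)) − f(x*) = O(1/t²); the last equality for T_α holds whenever x is differentiable at t. Since T_α X(t) = t^{1−α} Ẋ(t), substituting T_α X for Ẋ in the kinetic term of a (Bregman) Lagrangian rescales the effective inertia by t^{2(1−α)}, one concrete reading of the "time-weighted fractional velocity" mentioned above.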

Keywords
Accelerated gradient methods
Continuous optimization
Bregman Lagrangian
Calculus of variations
Gradient algorithms
Conformable derivatives
Funding
None.
Conflict of interest
YangQuan Chen is an Editorial Board Member of this journal but was not involved in any way, directly or indirectly, in the editorial and peer-review process conducted for this paper. The other authors declare that they have no known competing financial interests or personal relationships that could have influenced the work reported in this paper.
References
1. Su W, Boyd S, Candès EJ. A differential equation for modeling Nesterov's accelerated gradient method: Theory and insights. J Mach Learn Res. 2016;17(153):1-43.
2. Park C, Cho Y, Yang I. Generalized continuous-time models for Nesterov's accelerated gradient methods. 2024.
3. Wibisono A, Wilson AC, Jordan MI. A variational perspective on accelerated methods in optimization. Proceedings of the National Academy of Sciences. 2016;113(47):E7351-E7358.
4. Shi B, Du SS, Jordan MI, Su WJ. Understanding the acceleration phenomenon via high-resolution differential equations. Mathematical Programming. 2022:1-70.
5. Chen S, Shi B, Yuan YX. Revisiting the acceleration phenomenon via high-resolution differential equations. arXiv preprint arXiv:2212.05700. 2022.
6. Muehlebach M, Jordan MI. Optimization with momentum: Dynamical, control-theoretic, and symplectic perspectives. J Mach Learn Res. 2021;22.
7. Chen S, Liu J, Wang P, Xu C, Cai S, Chu J. Accelerated optimization in deep learning with a proportional-integral-derivative controller. Nature Communications. 2024;15(1):10263.
8. Zhu TT, Hu R, Fang YP. Fast convergence rates and trajectory convergence of a Tikhonov regularized inertial primal-dual dynamical system with time scaling and vanishing damping. Journal of Computational and Applied Mathematics. 2025;460:116394.
9. Weinan E. A proposal on machine learning via dynamical systems. Communications in Mathematics and Statistics. 2017;5:1-11.
10. Le DM, Patil OS, Nino CF, Dixon WE. Accelerated gradient approach for deep neural network-based adaptive control of unknown nonlinear systems. IEEE Transactions on Neural Networks and Learning Systems. 2024;36(4):6299-6313.
11. Da Silva AB, Gazeau M. A general system of differential equations to model first-order adaptive algorithms. J Mach Learn Res. 2020;21.
12. Mao Z, Suzuki S, Nabae H, Miyagawa S, Suzumori K, Maeda S. Machine learning-enhanced soft robotic system inspired by rectal functions to investigate fecal incontinence. Bio-Design and Manufacturing. 2025;8(3):482-.
13. Muehlebach M, Jordan MI. A dynamical systems perspective on Nesterov acceleration. In: International Conference on Machine Learning. 2019:4656-4662.
14. Shi B, Du SS, Su W, Jordan MI. Acceleration via symplectic discretization of high-resolution differential equations. Advances in Neural Information Processing Systems. 2019;32.
15. Moucer C, Taylor A, Bach F. A systematic approach to Lyapunov analyses of continuous-time models in convex optimization. SIAM Journal on Optimization. 2023;33(3):1558-1586.
16. Sanz-Serna JM, Zygalakis KC. The connections between Lyapunov functions for some optimization algorithms and differential equations. SIAM Journal on Numerical Analysis. 2021;59(3):1542-1565.
17. Kolarijani AS, Esfahani PM, Keviczky T. Continuous-time accelerated methods via a hybrid control lens. IEEE Transactions on Automatic Control. 2020;65(8):3425-3440.
18. Lin T, Jordan MI. A control-theoretic perspective on optimal high-order optimization. Mathematical Programming. 2022;195:929-975.
19. Attouch H, Boț RI, Csetnek ER. Fast optimization via inertial dynamics with closed-loop damping. Journal of the European Mathematical Society. 2022;25(5):1985-2056.
20. Adly S, Attouch H. Accelerated optimization through time-scale analysis of inertial dynamics with asymptotic vanishing and Hessian-driven dampings. Optimization. 2024:1-38.
21. Castera C, Attouch H, Fadili J, Ochs P. Continuous Newton-like methods featuring inertia and variable mass. SIAM Journal on Optimization. 2024;34(1):251-277.
22. Attouch H, Chbani Z, Riahi H. Fast proximal methods via time scaling of damped inertial dynamics. SIAM Journal on Optimization. 2019;29(3):2227-2256.
23. Cheng X, Liu J, Shang Z. A class of generalized Nesterov's accelerated gradient methods from a dynamical perspective. arXiv preprint arXiv:2508.12816. 2025.
24. Laborde M, Oberman A. A Lyapunov analysis for accelerated gradient methods: From deterministic to stochastic case. In: Chiappa S, Calandra R, eds. Proceedings of the Twenty-Third International Conference on Artificial Intelligence and Statistics. Proceedings of Machine Learning Research, vol. 108. 2020.
25. Ross I. An optimal control theory for nonlinear optimization. Journal of Computational and Applied Mathematics. 2019;354:39-51.
26. Wilson AC, Recht B, Jordan MI. A Lyapunov analysis of accelerated methods in optimization. Journal of Machine Learning Research. 2021;22(113):1-34.
27. Verma VR, Nishad DK, Sharma V, Singh VK, Verma A, Shah DR. Quantum machine learning for Lyapunov-stabilized computation offloading in next-generation MEC networks. Scientific Reports. 2025;15:405.
28. Dobson P, Sanz-Serna JM, Zygalakis KC. On the connections between optimization algorithms, Lyapunov functions, and differential equations: Theory and insights. SIAM Journal on Optimization. 2025;35(1):537-566.
29. Kravitz H, Durón C, Nieves B, Brio M. Data-driven optimization and parameter estimation for a metric graph epidemic model with applications to COVID-19 spread in Poland: A real-world example of optimization for a challenging Rosenbrock-type objective function. An International Journal of Optimization and Control: Theories & Applications (IJOCTA). 2025;16:750-778.
30. Hanachi SB, Sellami B, Belloufi M. Global convergence property with inexact line search for a new conjugate gradient method. An International Journal of Optimization and Control: Theories & Applications (IJOCTA). 2025;15:25-34.
31. Tran BK, Leok M. Variational principles for Hamiltonian systems. arXiv preprint arXiv:2410.02960. 2024.
32. Cresson J. Fractional embedding of differential operators and Lagrangian systems. Journal of Mathematical Physics. 2007;48(3).
33. Abdel Aal OF, Viola J, Chen Y. Fractional order Euler-Lagrange model for accelerated gradient methods. IFAC-PapersOnLine. 2024;58(12):466-471.
34. Pham T, Wagner H. Fast kd-trees for the Kullback-Leibler divergence and other decomposable Bregman divergences. arXiv preprint arXiv:2502.13425. 2025.
35. Banerjee A, Merugu S, Dhillon IS, Ghosh J. Clustering with Bregman divergences. Journal of Machine Learning Research. 2005;6:1705-1749.
36. Cha J, Kim Y, Shin J, Cho J, Kim S, Ryu J. Optimization of Bregman variational learning dynamics. arXiv preprint arXiv:2510.20227. 2025.
37. Yu D, Jiang W, Wan Y, Zhang L. Mirror descent under generalized smoothness. arXiv preprint arXiv:2502.00753. 2025.
38. Hayashi M. Bregman-divergence-based Arimoto-Blahut algorithm. IEEE Transactions on Information Theory. 2025.
39. Gao X, Jiang Y, Cai X, Wang. An accelerated mirror descent algorithm for constrained nonconvex problems. Optimization. 2025:1-26.
40. Boyd SP, Vandenberghe L. Convex Optimization. Cambridge University Press; 2004.
41. Khalil R, Al Horani M, Yousef A, Sababheh M. A new definition of fractional derivative. Journal of Computational and Applied Mathematics. 2014;264:65-70.
42. Chung WS. Fractional Newton mechanics with conformable fractional derivative. Computers & Mathematics with Applications. 2015;69(4):322-330.
43. Nesterov Y. A method for solving the convex programming problem with convergence rate O(1/k²). Dokl Akad Nauk SSSR. 1983;269:543-547.
44. Nesterov Y. Introductory Lectures on Convex Optimization: A Basic Course. Vol. 87. Springer Science & Business Media; 2004.
45. Nesterov Y. Nonsmooth convex optimization. In: Lectures on Convex Optimization. 2018:139-240.
46. Gaud N, Murthy PK, Hassan MMM, Ganguly A, Mali V, Randive MLB, Singh A. Enhanced Nirmal optimizer with damped Nesterov acceleration: A comparative analysis. arXiv preprint arXiv:2508.16550. 2025.
47. Haraux A. Systèmes Dynamiques Dissipatifs et Applications. 1991.
48. Almeida R. Fractional calculus of variations for composed functionals with generalized derivatives.