Publications
2024
Proceedings of the International Conference on Machine Learning (ICML)
We present a novel adaptive learning rate scheduling algorithm that significantly improves convergence speed for large-scale neural networks. Our method dynamically adjusts the learning rate based on the local geometry...
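The paper's scheduling algorithm is not reproduced here; as a minimal illustration of a step size that adapts to local geometry, the sketch below uses the classical Barzilai-Borwein rule, which estimates local curvature from successive iterates and gradients (a standard technique, not the method described in the abstract).

```python
import numpy as np

def bb_gradient_descent(grad, x0, eta0=0.1, iters=50):
    """Gradient descent with Barzilai-Borwein adaptive step sizes.

    The step eta_k = |<dx, dg>| / <dg, dg> is a curvature estimate
    built from successive iterates (dx) and gradients (dg), so the
    learning rate adjusts to the local geometry of the objective.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    eta = eta0
    for _ in range(iters):
        x_new = x - eta * g
        g_new = grad(x_new)
        dx, dg = x_new - x, g_new - g
        denom = dg @ dg
        if denom > 1e-12:
            eta = abs(dx @ dg) / denom  # BB1 step size
        x, g = x_new, g_new
    return x

# Minimize f(x) = 0.5 * x^T A x with an ill-conditioned diagonal A;
# the minimizer is the origin.
A = np.diag([1.0, 10.0, 100.0])
x_star = bb_gradient_descent(lambda x: A @ x, x0=[1.0, 1.0, 1.0])
```

On ill-conditioned quadratics like this, a curvature-aware step typically converges far faster than any fixed learning rate.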
Proceedings of the International Conference on Machine Learning (ICML)
We present novel optimization techniques for training large-scale neural networks that significantly reduce computational overhead while maintaining model performance. Our approach introduces adaptive learning rate scheduling combined with gradient compression...
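The abstract mentions gradient compression to reduce communication overhead; a common instance of this idea (used here purely as an illustration, not as the paper's specific scheme) is top-k sparsification, which transmits only the k largest-magnitude gradient entries:

```python
import numpy as np

def topk_compress(grad, k):
    """Keep only the k largest-magnitude entries of a gradient vector.

    Returns (indices, values); all other entries are treated as zero,
    cutting communication from len(grad) floats to k index/value pairs.
    """
    idx = np.argpartition(np.abs(grad), -k)[-k:]
    return idx, grad[idx]

def topk_decompress(idx, vals, n):
    """Rebuild a dense gradient from the sparse (indices, values) pair."""
    out = np.zeros(n)
    out[idx] = vals
    return out

g = np.array([0.1, -3.0, 0.02, 2.5, -0.4])
idx, vals = topk_compress(g, k=2)
g_hat = topk_decompress(idx, vals, len(g))
# g_hat keeps only the two largest-magnitude entries (-3.0 and 2.5)
```

In practice such schemes are usually paired with error feedback, where the discarded residual is accumulated and added back on later steps.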
Journal of Mathematical Optimization
We present a rigorous mathematical analysis of gradient descent convergence rates under various smoothness conditions. Our main theorem establishes that for $L$-smooth functions, the convergence rate is $\mathcal{O}(1/k)$ where $k$...
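For context, the $\mathcal{O}(1/k)$ rate quoted in the abstract matches the textbook guarantee for gradient descent on convex, $L$-smooth functions (the paper's own theorem and its exact conditions are not reproduced here). The classical statement, for step size $\eta = 1/L$, is:

$$
f(x_k) - f(x^*) \;\le\; \frac{L \,\|x_0 - x^*\|^2}{2k},
$$

where $x^*$ is a minimizer and $x_k$ the iterate after $k$ gradient steps.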
Publication Metrics
- Total Publications: 3
- Publications with DOI: 3
- Publications with Code: 2
For a complete and up-to-date list, see my ORCID profile.
For citation metrics, see my Google Scholar profile.