Efficient Stochastic Optimization Algorithms for Convex, Non-Convex Problems
Author: Aysegul Beyza Bumin
Publisher:
Total Pages: 0
Release: 2023
ISBN-10: OCLC:1401939966
ISBN-13:
Rating: 4/5 (66 Downloads)
Book excerpt: The main research interest presented is large-scale stochastic optimization, which is at the core of machine learning and data science. Most existing algorithms are based on stochastic gradient descent (SGD), a conceptually simple method that works reasonably well in practice. However, SGD suffers from a slow convergence rate and requires manual tuning for best performance. This research focuses on developing stochastic proximal algorithms and further improving them for specific applications in bioinformatics. The goals are to 1) make the per-iteration complexity as efficient as that of SGD and 2) provide much more robust convergence guarantees that rely on fewer assumptions, such as smoothness, Lipschitz continuity, or even convexity. If successful, this research will greatly benefit the fields of optimization and machine learning.
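The contrast drawn above between SGD and stochastic proximal methods can be illustrated on a toy problem. The sketch below (an illustration assumed for this excerpt, not code from the thesis) compares a plain SGD step with a stochastic proximal point step on a consistent least-squares system; for a single sampled loss f_i(x) = 0.5 * (a_i @ x - b_i)^2 the proximal subproblem has a closed form, so the per-iteration cost matches SGD while the update stays stable even for large step sizes, echoing the robustness claim.

```python
import numpy as np

def sgd_step(x, a, b, eta):
    # Plain SGD step on the sampled loss f_i(x) = 0.5 * (a @ x - b)**2.
    return x - eta * (a @ x - b) * a

def prox_step(x, a, b, eta):
    # Stochastic proximal point step: the exact minimizer of
    # f_i(z) + ||z - x||^2 / (2 * eta), which is available in closed
    # form for a scalar least-squares term.
    return x - eta * (a @ x - b) / (1.0 + eta * np.dot(a, a)) * a

def run(A, b, step, eta, iters=3000, seed=0):
    # Sample one row per iteration and apply the chosen update rule.
    rng = np.random.default_rng(seed)
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        i = rng.integers(A.shape[0])
        x = step(x, A[i], b[i], eta)
    return x

# Consistent synthetic system, so the exact solution x_true is reachable.
rng = np.random.default_rng(1)
x_true = np.array([1.0, -2.0, 0.5])
A = rng.normal(size=(50, 3))
b = A @ x_true

# A deliberately large step size: the proximal update remains stable,
# while SGD typically diverges once eta * ||a_i||^2 exceeds 2.
np.seterr(over="ignore", invalid="ignore")
x_prox = run(A, b, prox_step, eta=10.0)
x_sgd = run(A, b, sgd_step, eta=10.0)
print("prox error:", np.linalg.norm(x_prox - x_true))
print("sgd error:", np.linalg.norm(x_sgd - x_true))
```

The per-iteration arithmetic of both updates is a handful of inner products, matching the excerpt's first goal; the denominator 1 + eta * ||a_i||^2 in the proximal step is what damps large steps and delivers the more forgiving convergence behavior described in the second goal.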