His current research interests include large-scale convex and non-convex optimization, and the design and analysis of algorithms for machine learning. More specifically, he is interested in designing efficient first-order and second-order algorithms in both deterministic and stochastic settings.
Selected Recent News
- April 2021: Presented the talk “SONIA: A Symmetric Blockwise Truncated Optimization Algorithm” at AISTATS 2021
- February 2021: Our paper “Fast and Safe: Accelerated gradient methods with optimality certificates and underestimate sequences” (joint work with Naga Venkata C. Gudapati, Chenxin Ma, Rachael Tappenden and Martin Takáč) has been accepted for publication in Computational Optimization and Applications (COAP)
- January 2021: Our paper “SONIA: A Symmetric Blockwise Truncated Optimization Algorithm” (joint work with Mohammadreza Nazari, Rachael Tappenden, Albert S. Berahas and Martin Takáč) has been accepted for publication at AISTATS 2021
- November 2020: Presented in a session organized by Lam M. Nguyen (Recent Advances in Stochastic Gradient Algorithms for Machine Learning Applications) at the INFORMS Annual Meeting
- November 2020: Our paper “DynNet: Physics-based neural architecture design for linear and nonlinear structural response modeling and prediction” (joint work with Soheil Sadeghi Eshkevari, Martin Takáč, and Shamim N. Pakzad) has been accepted for publication in Engineering Structures
- September 2020: Our paper “Alternating Maximization: Unifying Framework for 8 Sparse PCA Formulations and Efficient Parallel Codes” (joint work with Peter Richtárik, Selin Damla Ahipasaoglu and Martin Takáč) has been accepted for publication in Optimization and Engineering (OPTE)
- August 2020: Presented the talk “Efficient Distributed Hessian Free Algorithm for Large-scale Empirical Risk Minimization via Accumulating Sample Strategy” at AISTATS 2020
- July 2020: Presented the talk “Scaling Up Quasi-Newton Algorithms: Communication Efficient Distributed SR1” at LOD 2020
- June 2020: New paper out: “SONIA: A Symmetric Blockwise Truncated Optimization Algorithm” (joint work with Mohammadreza Nazari, Rachael Tappenden, Albert S. Berahas and Martin Takáč)
- May 2020: Our paper “Scaling Up Quasi-Newton Algorithms: Communication Efficient Distributed SR1” (joint work with Mohammadreza Nazari, Sergey Rusakov, Albert S. Berahas and Martin Takáč) has been accepted at the Sixth International Conference on Machine Learning, Optimization, and Data Science (LOD 2020)
- April 2020: New paper out: “Alternating Maximization: Unifying Framework for 8 Sparse PCA Formulations and Efficient Parallel Codes” (joint work with Peter Richtárik, Selin Damla Ahipasaoglu and Martin Takáč)
- January 2020: Our paper “Efficient Distributed Hessian Free Algorithm for Large-scale Empirical Risk Minimization via Accumulating Sample Strategy” (joint work with Xi He, Chenxin Ma, Aryan Mokhtari, Dheevatsa Mudigere, Alejandro Ribeiro and Martin Takáč) has been accepted for publication at AISTATS 2020
- December 2019: Finished my amazing internship at SAS, Raleigh, NC.
- October 2019: Presented the talk “Scaling Up Quasi-Newton Algorithms: Communication Efficient Distributed SR1” at the INFORMS Annual Meeting in Seattle, WA.
- June 2019: New paper out: “Scaling Up Quasi-Newton Algorithms: Communication Efficient Distributed SR1” (joint work with Mohammadreza Nazari, Sergey Rusakov, Albert S. Berahas and Martin Takáč)
- May 2019: Started my summer internship at SAS in Raleigh, NC, as an Operations Research R&D graduate intern under the supervision of Yan Xu, Joshua Griffin and Scott Pope.
- January 2019: New paper out: “Quasi-Newton Methods for Deep Learning: Forget the Past, Just Sample” (joint work with Albert S. Berahas and Martin Takáč)