I work in the field of nonlinear optimization, with a focus on derivative-free optimization (DFO).
Publications
- Berahas, A. S., Cao, L., & Scheinberg, K. Analysis of a Trust Region Method with Errors (in preparation)
- Cao, L., Menickelly, M., & Wild, S. M. A Model-based Approach to Derivative-free Multiobjective Optimization (in preparation)
- Wang, F., & Cao, L. (2020). A New Algorithm for Quadratic Integer Programming Problems with Cardinality Constraint. Japan Journal of Industrial and Applied Mathematics, 1-12.
- Berahas, A. S., Cao, L., Choromanski, K., & Scheinberg, K. (2019). A Theoretical and Empirical Comparison of Gradient Approximations in Derivative-Free Optimization. Under Revision: Foundations of Computational Mathematics
- Berahas, A. S., Cao, L., & Scheinberg, K. (2019). Global Convergence Rate Analysis of a Generic Line Search Algorithm with Noise. Under 3rd Round of Review: SIAM Journal on Optimization
- Berahas, A. S., Cao, L., Choromanski, K., & Scheinberg, K. (2019). Linear Interpolation Gives Better Gradients Than Gaussian Smoothing in Derivative-free Optimization. arXiv preprint arXiv:1905.13043.
Presentations
- Lagrange Polynomial, OptML, Spring 2020, Lehigh University
- Some Gradient Approximation Methods for Derivative-Free Optimization, INFORMS 2019, Seattle, WA
- Comparing Derivative-Free Methods, INFORMS 2018, Phoenix, AZ
Notes
- Some Useful Expected Values with Multivariate Normal Distribution and Uniform Distribution on Sphere
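For illustration, two standard identities of the kind this note collects (stated here as well-known facts, not excerpts from the note), which appear throughout analyses of Gaussian-smoothing and sphere-sampling gradient estimators in DFO: for $g \sim \mathcal{N}(0, I_n)$ and $u$ uniform on the unit sphere in $\mathbb{R}^n$,

\[
\mathbb{E}\!\left[g g^\top\right] = I_n, \qquad
\mathbb{E}\!\left[u u^\top\right] = \frac{1}{n} I_n,
\]

so for any fixed vector $v \in \mathbb{R}^n$, $\mathbb{E}\big[(g^\top v)\, g\big] = v$ and $\mathbb{E}\big[(u^\top v)\, u\big] = v/n$.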