Research

Publications

Presentations

  • “Linear Convergence of Randomized Feasible Descent Methods Under the Weak Strong Convexity Assumption,” SIAM Conference on Optimization, Vancouver, Canada, May 2017
  • “Acceleration of a Communication-Efficient Distributed Dual Block Descent Algorithm,” poster presentation at the 2016 INFORMS Annual Meeting, Nashville, TN, Nov. 14th, 2016
  • “Distributed Inexact Damped Newton Method: Data Partitioning and Load-Balancing,” poster presentation at the 10th Annual Machine Learning Symposium, Manhattan, NY, Mar. 4th, 2016
  • “Acceleration of a Communication-Efficient Distributed Dual Block Descent Algorithm,” 21st Modeling and Optimization: Theory and Applications (MOPTA), Bethlehem, PA, Aug. 18th, 2016
  • “Fast Coordinate Descent Methods on Linear Support Vector Machines,” SAS Interns Expo, Aug. 3rd, 2016
  • “Communication-Efficient Distributed Dual Coordinate Ascent and Its Applications in Machine Learning,” 2015 INFORMS Annual Meeting, Philadelphia, PA, Nov. 2nd, 2015
  • “Partitioning Data on Features or Samples in Distributed Optimization?” poster presentation at a Conference on Neural Information Processing Systems (NIPS) workshop, Montreal, Canada, Dec. 11th, 2015
  • “Linear Convergence of Randomized Feasible Descent Methods Under the Weak Strong Convexity Assumption,” 20th Modeling and Optimization: Theory and Applications (MOPTA), Bethlehem, PA, Jul. 23rd, 2015