Rethinking Convex Programming for Statistical Estimation
Date: Fri, March 23, 2018
Time: 10:30am
Location: Holmes Hall 389
Speaker: Dr. Sohail Bahmani
Abstract:
A critical aspect of statistical procedures in many engineering and scientific applications is their computational cost. Convex programming provides a versatile framework for designing computationally tractable algorithms for many statistical tasks such as regression, classification, and variable selection. A prominent example of such a convex program is the lasso estimator, designed for sparse regression. Estimation procedures based on convex programming are traditionally designed in a deterministic fashion, independent of the statistical model. In this talk, I will show that leveraging the statistical model can lead to new estimators with convex formulations that are applicable to a broader set of problems and can be computationally less demanding.
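As a concrete illustration of the lasso mentioned above, the sketch below solves the sparse-regression program min_x 0.5*||Ax - b||^2 + lam*||x||_1 with a simple proximal-gradient (ISTA) loop. The specific dimensions, regularization weight, and iteration count are illustrative choices, not from the talk.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1 (elementwise soft thresholding)
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(A, b, lam, n_iters=3000):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 via proximal gradient (ISTA)."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)             # gradient of the least-squares term
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Demo: recover a 3-sparse signal from an underdetermined linear system.
rng = np.random.default_rng(0)
A = rng.standard_normal((60, 100))
x_true = np.zeros(100)
x_true[[5, 40, 77]] = [2.0, -3.0, 1.5]
b = A @ x_true
x_hat = lasso_ista(A, b, lam=0.1)
```

With noiseless measurements and a small regularization weight, the recovered support should match the true one, with a small shrinkage bias on the nonzero coefficients.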
In particular, I will talk about the phase retrieval problem and a provably accurate estimator, formulated as a convex program in the natural domain of the signal. This formulation enables the proposed estimator to operate at a much lower computational cost than existing estimators based on semidefinite programming, which operate in a "lifted" domain. Furthermore, I will present a more general estimator based on the same principles that applies to non-linear regression problems with convex non-linearities. The statistical accuracy and sample complexity guarantees for these estimators are obtained using techniques from statistical learning theory and empirical process theory.
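To illustrate the idea of a convex phase-retrieval program in the natural domain, the sketch below follows the PhaseMax-style formulation: given an anchor vector correlated with the true signal, maximize the correlation with the anchor subject to the magnitude constraints, which for real-valued signals is just a linear program in n variables (no lifting to an n-by-n matrix). The anchor construction and all parameters here are illustrative assumptions; in practice the anchor would come from, e.g., a spectral initialization, and this is not necessarily the exact formulation presented in the talk.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
n, m = 20, 200
x_true = rng.standard_normal(n)
B = rng.standard_normal((m, n))
y = np.abs(B @ x_true)  # phaseless (magnitude-only) measurements

# Anchor: a rough guess correlated with x_true. (Here we simply perturb
# x_true for the demo; in practice the anchor comes from an initialization
# step such as a spectral method.)
a = x_true + 0.3 * rng.standard_normal(n)
a /= np.linalg.norm(a)

# Convex program in the natural n-dimensional domain:
#   maximize <a, x>   subject to   -y <= Bx <= y
# which is a linear program for real-valued signals.
res = linprog(
    c=-a,                                  # linprog minimizes, so negate
    A_ub=np.vstack([B, -B]),               # encodes |Bx| <= y
    b_ub=np.concatenate([y, y]),
    bounds=[(None, None)] * n,             # x is unconstrained in sign
)
x_hat = res.x
```

Contrast this with semidefinite (lifted) approaches, which replace the n-dimensional unknown with an n-by-n positive semidefinite matrix; the natural-domain program scales far better in n.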
Bio:
Sohail Bahmani is a postdoctoral fellow in the School of Electrical and Computer Engineering at Georgia Tech. He earned his PhD from Carnegie Mellon University in 2013, his Master's degree from Simon Fraser University, Canada, in 2008, and his Bachelor's degree from Sharif University of Technology, Iran, in 2006. He is a recipient of the AISTATS 2017 best paper award. Motivated by applications in signal processing, machine learning, and network analysis, his research interests are broadly in algorithmic and theoretical aspects of statistical inference, relating to topics such as optimization, high-dimensional statistics, applied probability theory, and information theory.