Seminar on 'Online bootstrap for averaged implicit SGD estimators' by Professor Yixin FANG
Posted by the Department of Statistics and Actuarial Science, HKU
Event Type: Seminar
Event Nature: Science & Technology
DEPARTMENT OF STATISTICS AND ACTUARIAL SCIENCE
THE UNIVERSITY OF HONG KONG
Professor Yixin FANG
Department of Mathematical Sciences
New Jersey Institute of Technology
will give a talk
ONLINE BOOTSTRAP FOR AVERAGED IMPLICIT SGD ESTIMATORS
In many applications involving large-scale or streaming data, stochastic gradient descent (SGD) provides a scalable way to compute parameter estimates. As an alternative, averaged implicit SGD (AI-SGD) has been shown to be more stable and more efficient than SGD. Although the asymptotic properties of the AI-SGD estimator are well established, statistical inference based on it, such as interval estimation, remains unexplored. The bootstrap is a popular resampling method for statistical inference, but it is not computationally feasible here because it requires repeatedly drawing independent samples from the entire dataset. The plug-in method is also computationally inefficient, and it is not applicable when there is no explicit formula for the estimator's covariance matrix. In this paper, we propose an online version of the bootstrap that can be used for conducting scalable inference based on the AI-SGD estimator. Upon the arrival of each observation, the online bootstrap updates the AI-SGD estimate along with many randomly perturbed AI-SGD estimates. We derive large-sample theoretical properties of the proposed online bootstrap and examine its performance via simulation studies.
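The abstract's update scheme — one AI-SGD estimate plus many randomly perturbed copies, all refreshed as each observation arrives — can be sketched for least-squares regression as below. This is an illustrative sketch, not the paper's algorithm: the function name, the learning-rate schedule `alpha * n**-0.6`, and the choice of mean-one exponential perturbation weights are all assumptions made for the example.

```python
import numpy as np

def ai_sgd_online_bootstrap(stream, d, B=200, alpha=0.5, seed=0):
    """One pass over a stream of (x, y) pairs: update the AI-SGD estimate
    and B randomly perturbed replicates upon each observation."""
    rng = np.random.default_rng(seed)
    theta = np.zeros(d)            # implicit SGD iterate
    theta_bar = np.zeros(d)        # averaged (AI-SGD) estimate
    Theta = np.zeros((B, d))       # B perturbed iterates
    Theta_bar = np.zeros((B, d))   # B perturbed averaged estimates
    for n, (x, y) in enumerate(stream, start=1):
        gamma = alpha * n ** -0.6  # illustrative learning-rate schedule
        xx = x @ x
        # For least squares the implicit update has a closed form:
        # theta_n = theta_{n-1} + gamma/(1 + gamma*||x||^2) * (y - x.theta_{n-1}) * x
        theta = theta + gamma / (1.0 + gamma * xx) * (y - x @ theta) * x
        theta_bar = theta_bar + (theta - theta_bar) / n   # running average
        # Perturbed replicates: multiply each step size by an i.i.d.
        # mean-one random weight (Exp(1) here, an illustrative choice).
        gW = gamma * rng.exponential(1.0, size=B)
        scale = gW / (1.0 + gW * xx)
        Theta = Theta + (scale * (y - Theta @ x))[:, None] * x[None, :]
        Theta_bar = Theta_bar + (Theta - Theta_bar) / n
    return theta_bar, Theta_bar
```

Because only running averages are stored, memory is O(B·d) regardless of stream length; a percentile interval for each coordinate can then be read off the replicates, e.g. `np.percentile(Theta_bar, [2.5, 97.5], axis=0)`.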
Venue: Room 301, Run Run Shaw Building, HKU
Registration is not required.