First-order Newton-type Estimator for Distributed Estimation and Inference

Xi Chen, Weidong Liu & Yichen Zhang
This paper studies distributed estimation and inference for a general statistical problem with a convex loss that could be non-differentiable. For the purpose of efficient computation, we restrict ourselves to stochastic first-order optimization, which enjoys a low per-iteration complexity. To motivate the proposed method, we first investigate the theoretical properties of a straightforward Divide-and-Conquer Stochastic Gradient Descent (DC-SGD) approach. Our theory shows that there is a restriction on the number of machines, and this restriction becomes...
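To make the DC-SGD baseline concrete, the following is a minimal illustrative sketch: each machine runs plain SGD on its own data shard and a central node averages the local estimates. It assumes a synthetic least-squares loss and made-up names (local_sgd, theta_dc, N, d, m), whereas the paper treats general, possibly non-differentiable convex losses, so this is not the authors' implementation.

```python
# Illustrative DC-SGD sketch (assumed least-squares loss, synthetic data).
import numpy as np

rng = np.random.default_rng(0)

def local_sgd(X, y, n_epochs=5, lr0=0.1):
    """Run plain SGD on one machine's local least-squares loss."""
    n, d = X.shape
    theta = np.zeros(d)
    t = 0
    for _ in range(n_epochs):
        for i in rng.permutation(n):
            t += 1
            grad = (X[i] @ theta - y[i]) * X[i]   # gradient of 0.5*(x'theta - y)^2
            theta -= (lr0 / np.sqrt(t)) * grad    # decaying step size
    return theta

# Synthetic data: N samples split evenly across m machines.
N, d, m = 10_000, 5, 20
theta_star = rng.normal(size=d)
X = rng.normal(size=(N, d))
y = X @ theta_star + 0.1 * rng.normal(size=N)

# DC-SGD: each machine runs SGD on its own shard, then the local
# estimates are averaged on a central node.
local_estimates = [local_sgd(Xk, yk) for Xk, yk in
                   zip(np.array_split(X, m), np.array_split(y, m))]
theta_dc = np.mean(local_estimates, axis=0)

print("averaging error:", np.linalg.norm(theta_dc - theta_star))
```

As the abstract notes, such one-shot averaging is only valid under a restriction on the number of machines m, which motivates the proposed estimator.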