Yuanzhi Li, Yingyu Liang, Learning Mixtures of Linear Regressions with Nearly Optimal Complexity, https://arxiv.org/abs/1802.07895, Submitted to Conference on Learning Theory, 2018.
Hongyi Wang*, Scott Sievert*, Zachary Charles, Dimitris Papailiopoulos, Stephen Wright (* denotes equal contribution), ATOMO: Communication-efficient Learning via Atomic Sparsification, NIPS 2018, Submitted.
Jared D. Martin, Adrienne Wood, William T. L. Cox, Scott Sievert, Robert Nowak, Eva Gilboa-Schechtman, & Paula M. Niedenthal, Validating a social-functional smile typology: Insights from machine learning and computer-assisted facial expression analysis, Cognitive Science, Submitted, May 2018.
Gautam Dasarathy, Elchanan Mossel, Robert Nowak, Sebastien Roch, Coalescent-based species tree estimation: a stochastic Farris transform, https://arxiv.org/abs/1707.04300, Submitted, 2017.
Sebastien Roch, Karl Rohe, Generalized least squares can overcome the critical threshold in respondent-driven sampling, https://arxiv.org/abs/1708.04999, Submitted, 2017.
Sebastien Roch, Kun-Chieh Wang, Circular Networks from Distorted Metrics, https://arxiv.org/abs/1707.05722, Proceedings of the 22nd Annual International Conference on Research in Computational Molecular Biology (RECOMB) 2018, 167–176.
Zachary Charles, Dimitris Papailiopoulos and Jordan Ellenberg, Approximate Gradient Coding via Sparse Random Graphs, https://arxiv.org/pdf/1711.06771.pdf, November 2017.
Zachary Charles and Dimitris Papailiopoulos, Stability and Generalization of Learning Algorithms that Converge to Global Optima, https://arxiv.org/pdf/1710.08402.pdf, ICML, Accepted, October 2017.
Lingjiao Chen, Hongyi Wang, Zachary Charles, Dimitris Papailiopoulos, DRACO: Robust Distributed Training via Redundant Gradients, https://arxiv.org/pdf/1803.09877.pdf, ICML, Accepted, April 2018.
Zachary Charles and Dimitris Papailiopoulos, Gradient coding using the stochastic block model, ISIT, Accepted, 2018.
Ke Chen, Qin Li, Jianfeng Lu, and Stephen J. Wright, Randomized sampling for basis functions construction in generalized finite element methods, https://arxiv.org/abs/1801.06938, SIAM-MMS, Submitted, 2018.
Ke Chen, Qin Li, and Jian-Guo Liu, Online learning in optical tomography: a stochastic approach, http://iopscience.iop.org/article/10.1088/1361-6420/aac220/pdf, Inverse Problems, Accepted, 2018.
Sebastien Roch, Michael Nute, Tandy Warnow, Long-branch attraction in species tree estimation: inconsistency of partitioned likelihood and topology-based summary methods, https://arxiv.org/abs/1803.02800, Submitted, 2018.
Zvi Rosen, Anand Bhaskar, Sebastien Roch, Yun S. Song, Geometry of the sample frequency spectrum and the perils of demographic inference, https://arxiv.org/abs/1712.05035, Submitted, 2017.
Wai-Tong Fan, Sebastien Roch, Necessary and sufficient conditions for consistent root reconstruction in Markov models on trees, https://arxiv.org/abs/1707.05702, To appear in Electronic Journal of Probability, 2018.
Sumeet Katariya, Lalit Jain, Nandana Sengupta, James Evans, Robert Nowak, Adaptive Sampling for Coarse Ranking, https://arxiv.org/abs/1802.07176, AISTATS, February 2018.
Ching-pei Lee, Cong Han Lim, and Stephen J. Wright, A distributed quasi-Newton algorithm for empirical risk minimization with nonsmooth regularization, https://arxiv.org/abs/1803.01370, KDD, 2018.
Bin Hu, Laurent Lessard, and Stephen J. Wright, Dissipativity theory for accelerating stochastic variance reduction: a unified analysis of SVRG and Katyusha using semidefinite programs, ICML, 2018.
Gábor Braun, Sebastian Pokutta, Dan Tu, and Stephen Wright, Blended Conditional Gradients: the unconditioning of conditional gradients, https://arxiv.org/abs/1805.07311, Submitted, May 2018.
Greg Ongie, Laura Balzano, Daniel Pimentel-Alarcón, Rebecca Willett, Robert D. Nowak, Tensor Methods for Nonlinear Matrix Completion, https://arxiv.org/abs/1804.10266, Submitted, April 2018.
Yuan Li, Garvesh Raskutti, Rebecca Willett, Graph-based regularization for regression problems with highly-correlated designs, https://arxiv.org/abs/1803.07658, Submitted, March 2018.
Clément W. Royer, Michael O’Neill, Stephen J. Wright, A Newton-CG Algorithm with Complexity Guarantees for Smooth Unconstrained Optimization, https://arxiv.org/abs/1803.02924, Submitted, March 2018.
Benjamin Mark, Garvesh Raskutti, Rebecca Willett, Network Estimation from Point Process Data, https://arxiv.org/abs/1802.04838, Submitted to IEEE Transactions on Information Theory, February 2018.
Xiaomin Zhang, Xuezhou Zhang, Xiaojin Zhu, Po-Ling Loh, Theoretical support of machine learning debugging via weighted M-estimation, Submitted, January 2018.
Zachary Charles, Amin Jalali, Rebecca Willett, Subspace Clustering with Missing and Corrupted Data, https://arxiv.org/abs/1707.02461, January 2018.
Xuezhou Zhang, Xiaojin Zhu, Stephen J. Wright, Training set debugging using trusted items, https://arxiv.org/abs/1801.08019, AAAI 2018, January 2018.
Kwang-Sung Jun, Francesco Orabona, Stephen Wright, Rebecca Willett, Online Learning for Changing Environments using Coin Betting, https://arxiv.org/abs/1711.02545, November 2017. To appear in Online Journal of Statistics.
Clément W. Royer and Stephen J. Wright, Complexity analysis of second-order line-search algorithms for smooth nonconvex optimization, https://arxiv.org/abs/1706.03131, SIAM Journal on Optimization, November 2017.
Michael O’Neill and Stephen J. Wright, Behavior of accelerated gradient methods near critical points of nonconvex functions, https://arxiv.org/abs/1706.07993, Submitted, June 2017.