Stability and Generalization of Convergent Learning Algorithms
In the era of Big Data, researchers like electrical and computer engineer Rebecca Willett and statistics professor Garvesh Raskutti, both part of the Institute for Foundations of Data Science at the University of Wisconsin-Madison, are developing new methods and tools to make sense of how discrete events can influence the occurrence of other events over time.
The program includes four week-long workshops on topics ranging from the algorithmic underpinnings of Big Data analysis to the heterogeneity and geometry of large-scale datasets.
The gathering concentrated on a vital emerging issue in data science: the formulation of data science problems as nonconvex optimization problems, and algorithms for solving them.
Algorithmic tools from optimization have become increasingly important in machine learning and data analysis over the past 15 years. For much of this time, the focus has been on tools from convex optimization. The best-known problems in data analysis (such as kernel learning, linear regression and …
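To make the convex-optimization viewpoint concrete, here is a minimal sketch (with synthetic data chosen only for illustration) of linear regression posed as a convex least-squares problem and solved by gradient descent; because the objective is convex, gradient descent reaches the global minimizer.

```python
import numpy as np

# Synthetic regression data: y = X @ true_w + small noise.
rng = np.random.default_rng(0)
n, d = 200, 3
X = rng.normal(size=(n, d))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=n)

# Gradient descent on the convex objective f(w) = (1/2n) * ||X w - y||^2.
w = np.zeros(d)
step = 0.1
for _ in range(500):
    grad = X.T @ (X @ w - y) / n  # gradient of the least-squares loss
    w -= step * grad

# Convexity guarantees convergence to the global minimizer,
# which here lies close to true_w.
print(w)
```

The same pattern — write down a convex loss, then run a first-order method — underlies many of the classical data-analysis problems the passage mentions; the nonconvex setting discussed above is precisely where these global-optimality guarantees break down.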
The IFDS will play a key role in the future of data science, developing fundamental techniques for handling increasingly massive data sets in shorter times.
Teacher Improves Learning by Selecting a Training Subset
DRACO: Robust Distributed Training via Redundant Gradients
Detecting Solution Space of PDEs via Local Random Sampling
Optimal Calibration for Computer Model Prediction with Finite Samples