IFDS Supporting Five Research Assistants in Spring 2019

Using funds from its National Science Foundation grant (under NSF’s TRIPODS and Convergence programs), IFDS is funding five Research Assistants this semester to collaborate across departments on IFDS research. Each is advised by one primary and one secondary adviser, all of them members of IFDS and affiliated with the Wisconsin Institute for Discovery.

Ke Chen (Mathematics), advised by Qin Li (Mathematics) and Stephen Wright (Computer Science), is working on randomized algorithms for solving multiscale partial differential equations (PDEs) and inverse problems. He focuses on exploring and understanding the low-dimensional features of the solution space of multiscale PDEs, and on experimental design for inverse problems in optical tomography. This understanding leads to fast solvers for PDEs with many degrees of freedom, and helps reduce the number of measurements needed in optical tomography while maintaining the accuracy of the reconstructed images.
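
To give a flavor of the low-dimensional structure this work exploits, here is a minimal sketch (not the project's code) that solves an assumed toy 1D elliptic problem for many random coefficients and uses a randomized range finder to capture the whole solution ensemble with a small basis:

```python
# Minimal sketch: the solution set of a parameterized elliptic PDE is
# numerically low-dimensional, and a randomized range finder (in the style
# of Halko-Martinsson-Tropp) captures it cheaply. The model problem, grid
# size, and coefficient family are all illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 200                       # interior grid points
h = 1.0 / (n + 1)

def solve_pde(coeff):
    """Finite-difference solve of -(a(x) u')' = 1 on (0,1), u(0)=u(1)=0."""
    a = coeff(np.linspace(h / 2, 1 - h / 2, n + 1))   # a(x) at cell interfaces
    main = (a[:-1] + a[1:]) / h**2
    off = -a[1:-1] / h**2
    A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    return np.linalg.solve(A, np.ones(n))

# Ensemble of solutions for random smooth positive coefficients a(x).
num_samples = 100
U = np.column_stack([
    solve_pde(lambda x, c=rng.uniform(0, 1, 3):
              1.5 + c[0] * np.sin(np.pi * x) + c[1] * np.cos(2 * np.pi * x) + c[2] * x)
    for _ in range(num_samples)
])

# Randomized range finder: multiply by a thin Gaussian matrix, orthogonalize,
# and check how well a 10-dimensional basis explains all 100 solutions.
k = 10
Q, _ = np.linalg.qr(U @ rng.standard_normal((num_samples, k)))
err = np.linalg.norm(U - Q @ (Q.T @ U)) / np.linalg.norm(U)
print(f"relative error of the rank-{k} randomized approximation: {err:.2e}")
```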

Yang Guo (Computer Science), advised by Dimitris Papailiopoulos (Electrical and Computer Engineering) and Stephen Wright (Computer Science), investigates problems in adversarial and distributed machine learning from the perspective of optimization. In particular, he focuses on understanding the tradeoff between the accuracy and adversarial robustness of machine learning algorithms, and on speeding up the adversarial training process. He is also interested in the information-theoretic guarantees behind distributed optimization methods.
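
As a rough illustration of adversarial training and the accuracy-robustness tradeoff, the following sketch trains a toy logistic-regression model on FGSM-perturbed synthetic data; the model, data, and perturbation budget are illustrative assumptions, not Yang's actual setup:

```python
# Minimal sketch of FGSM-style adversarial training for logistic regression
# on synthetic data; eps, the learning rate, and the data are assumptions.
import numpy as np

rng = np.random.default_rng(1)
n, d, eps, lr = 1000, 2, 0.5, 0.1
X = rng.standard_normal((n, d))
y = np.sign(X @ np.array([2.0, -1.0]) + 0.3 * rng.standard_normal(n))

def fgsm(inputs, w):
    # Fast gradient sign method: shift each input in the direction that
    # increases its logistic loss, i.e. x - eps * y * sign(w) per example.
    return inputs - eps * np.outer(y, np.sign(w))

def train(adversarial, steps=500):
    w = np.zeros(d)
    for _ in range(steps):
        Xt = fgsm(X, w) if adversarial else X
        margin = y * (Xt @ w)
        grad = -(y / (1 + np.exp(margin))) @ Xt / n   # mean logistic-loss gradient
        w -= lr * grad
    return w

def accuracy(w, inputs):
    return np.mean(np.sign(inputs @ w) == y)

for name in ["standard", "adversarial"]:
    w = train(name == "adversarial")
    print(f"{name:>11}: clean acc {accuracy(w, X):.3f}, "
          f"robust acc {accuracy(w, fgsm(X, w)):.3f}")
```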

Ryan Julian (Mathematics), advised by Nigel Boston (Mathematics) and collaborating with Dimitris Papailiopoulos (Electrical and Computer Engineering), is doing research on nonlinear algebraic methods for coding theory and data representation. In particular, he is working on improving the known bounds on the size and minimum distance of rank modulation codes for flash memory, and on finding families of codes that approach these bounds as closely as possible. He is also looking for ways to apply some of these nonlinear methods to machine learning, where more traditional linear coding theory has already found useful applications, such as distributing the workload of gradient algorithms in a manner that is resistant to node failures and attacks.
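
One concrete instance of that coding-theoretic idea is gradient coding; the sketch below works through a known three-worker example (Tandon et al., 2017) with illustrative numbers, not Ryan's own constructions:

```python
# Minimal sketch of gradient coding: three workers each send a fixed linear
# combination of two partial gradients, and the master recovers the full
# gradient from any two replies, tolerating one straggler or failed node.
import numpy as np

rng = np.random.default_rng(2)
g = rng.standard_normal((3, 5))     # three partial gradients in R^5
full = g.sum(axis=0)                # what the master wants

# Encoding matrix B: worker i sends B[i] @ g (a single length-5 vector).
B = np.array([[0.5, 1.0,  0.0],
              [0.0, 1.0, -1.0],
              [0.5, 0.0,  1.0]])
sends = B @ g

# For each surviving pair of workers, a combination of their messages whose
# coefficients on (g1, g2, g3) come out to exactly (1, 1, 1).
decode = {(0, 1): (2.0, -1.0), (0, 2): (1.0, 1.0), (1, 2): (1.0, 2.0)}
for (i, j), (a, b) in decode.items():
    assert np.allclose(a * sends[i] + b * sends[j], full)
print("full gradient recovered from every 2-of-3 subset of workers")
```
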
Tun Lee Ng (Statistics), advised by Michael Newton (Statistics and BMI) and collaborating with Stephen Wright (Computer Science), is working on a general-purpose approximation approach to Bayesian analysis, in which repeated optimization of a randomized objective function provides approximate samples from the joint posterior distribution. The aim of this project is to extend the existing theoretical support to a wider class of models. The weighting approach could become a useful alternative to other approximations, such as Markov chain Monte Carlo and variational Bayes. Potential data-analysis applications include improving hypothesis testing in brain imaging and genomics.
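
Here is a minimal sketch of the optimization-as-sampling idea, in an assumed toy normal-mean setting where the exact posterior is known; it illustrates the mechanism only, not the project's actual models:

```python
# Minimal sketch: normal data with known unit variance and a flat prior on
# the mean, so the exact posterior N(mean(x), 1/n) is available to compare
# against. All numbers are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(loc=2.0, scale=1.0, size=50)     # observed data

draws = []
for _ in range(4000):
    w = rng.exponential(size=x.size)            # random weights, mean 1
    # The argmin of the randomized objective sum_i w_i * (x_i - mu)^2 / 2
    # is the weighted mean, so each "posterior draw" is one optimization.
    draws.append(np.average(x, weights=w))
draws = np.array(draws)

print(f"randomized-objective draws: mean {draws.mean():.3f}, sd {draws.std():.3f}")
print(f"exact posterior:            mean {x.mean():.3f}, sd {1 / np.sqrt(x.size):.3f}")
```
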
Alisha Zachariah (Mathematics) is working with Justin Hsu (Computer Science) and Nigel Boston (Mathematics) on applications of differential privacy to the multiparty optimal power flow problem. She is investigating ways to further secure existing algorithms for this problem using techniques from joint differential privacy, aiming to achieve this additional level of security while keeping efficiency in mind.
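
As a minimal sketch of the basic primitive behind such differential-privacy techniques, the following applies the Laplace mechanism to an aggregate load; the household loads, sensitivity cap, and privacy budget are all hypothetical:

```python
# Minimal sketch of the Laplace mechanism, the basic differential-privacy
# primitive that joint-DP approaches build on. All numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(4)
demands = np.array([3.2, 1.7, 4.4, 2.9])   # per-household loads in kW (assumed)
sensitivity = 5.0                           # assumed cap on any one household's load
epsilon = 0.5                               # privacy budget

# Adding Laplace(sensitivity / epsilon) noise to the total makes the release
# epsilon-differentially private: no single household's load is identifiable.
noisy_total = demands.sum() + rng.laplace(scale=sensitivity / epsilon)
print(f"true total: {demands.sum():.2f} kW, private release: {noisy_total:.2f} kW")
```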