CSE Faculty Candidate Seminar - Nisha Chandramoorthy

Time: 
Thursday, June 9, 2022 - 11:00am to 12:00pm
Location: 
Zoom

Event Details

Name: Nisha Chandramoorthy, Postdoctoral Researcher at Massachusetts Institute of Technology

Date: Thursday, June 9, 2022 at 11:00 am

Link: https://gatech.zoom.us/j/91934503092?pwd=MGV5NGR6VnBPeWhidUh3c2x5clhjUT09 

Webinar ID: 919 3450 3092
Passcode: 304463

Title: Dynamics Meets Data and Computation for Engineering Complex Systems

 

Abstract: How does the long-term behavior of a chaotic system respond to small parameter changes? This is a fundamental scientific question that arises in every discipline, from astrophysics to aerodynamics and climate science. Moreover, the derivative of long-time averages (ensemble averages) with respect to parameters, known as the linear response, is useful for engineering problems such as design optimization and uncertainty quantification. But efficient computation of the linear response has been a longstanding open problem.
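As a minimal illustration of the quantity in question (not the method of the talk), the linear response is the derivative of a long-time average with respect to a system parameter. A naive finite-difference estimate for the one-dimensional logistic map, chosen here purely for concreteness, might look like:

```python
import numpy as np

def long_time_average(s, x0=0.3, n_spinup=1000, n_steps=100_000):
    """Long-time average of the state of the logistic map x -> s*x*(1-x)."""
    x = x0
    for _ in range(n_spinup):  # discard the transient
        x = s * x * (1.0 - x)
    total = 0.0
    for _ in range(n_steps):
        x = s * x * (1.0 - x)
        total += x
    return total / n_steps

# Linear response = derivative of the long-time average w.r.t. the parameter.
# A naive central finite difference gives one (often noisy) estimate:
ds = 1e-3
s0 = 3.8  # a chaotic regime of the logistic map
sensitivity = (long_time_average(s0 + ds) - long_time_average(s0 - ds)) / (2 * ds)
print(sensitivity)
```

For chaotic systems such finite-difference estimates are noisy and can fail to converge as the averaging time grows, which is part of why computing the linear response efficiently and rigorously is hard.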

In idealized chaotic models, Ruelle proved a rigorous formula for the linear response in terms of an exponentially converging series of ensemble averages. These ensemble averages, however, typically converge very slowly in the ensemble size, and hence the original formula is not practically useful in high-dimensional chaotic systems. In this talk, we present an alternative computation, known as the space-split sensitivity or S3 algorithm, that provably converges to Ruelle's formula and yet shows Monte Carlo-like convergence. Along the way, we derive novel iterative numerical methods to calculate "unstable derivatives": derivatives along unstable directions, which are useful beyond sensitivity analysis.
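"Monte Carlo-like convergence" here means the usual O(1/sqrt(N)) decay of the error of an N-sample ensemble average. A generic sketch with independent samples (not the chaotic setting of the talk) shows the rate:

```python
import numpy as np

rng = np.random.default_rng(0)
errors = {}
for n in (10**2, 10**4, 10**6):
    # Average absolute error of an n-sample ensemble mean (true mean is 0),
    # estimated over 20 independent repetitions.
    errors[n] = np.mean([abs(rng.standard_normal(n).mean()) for _ in range(20)])
    print(n, errors[n])
```

Each 100-fold increase in the ensemble size shrinks the error by roughly a factor of 10, consistent with the 1/sqrt(N) rate.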

Having computed linear response, we ask, when can small perturbations cause a drastic change in the statistics of a chaotic system? In this regard, we demonstrate surprising lessons that can be learned from one-dimensional chaotic systems. 

Staying with the theme of chaotic systems, we discuss the problem of Bayesian filtering, in which the probability distribution of the state given past observations is sought. Current workhorse algorithms are variants of the Kalman filter, which assume that the filtering distributions are Gaussian, an assumption that is violated in the presence of nonlinearities. Another class of existing algorithms, based on particle filters, scales exponentially with dimension. To address this gap, we discuss new strategies for an ensemble filtering algorithm based on measure transport that exploits information about the unstable subspace of the underlying dynamics.
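As a point of reference, the Gaussian baseline that the talk moves beyond (not the measure-transport filter itself) is the ensemble Kalman filter; a minimal sketch of one stochastic analysis step, with all names and parameters chosen here for illustration, is:

```python
import numpy as np

def enkf_update(ensemble, y_obs, H, obs_std, rng):
    """One stochastic EnKF analysis step: a Gaussian update built from
    sample covariances of the forecast ensemble.
    ensemble: (n_members, state_dim) array of forecast states."""
    n = ensemble.shape[0]
    Hx = ensemble @ H.T                   # predicted observations
    X = ensemble - ensemble.mean(axis=0)  # state anomalies
    Y = Hx - Hx.mean(axis=0)              # observation anomalies
    Pxy = X.T @ Y / (n - 1)
    Pyy = Y.T @ Y / (n - 1) + obs_std**2 * np.eye(H.shape[0])
    K = Pxy @ np.linalg.inv(Pyy)          # Kalman gain from sample covariances
    perturbed = y_obs + obs_std * rng.standard_normal((n, H.shape[0]))
    return ensemble + (perturbed - Hx) @ K.T

rng = np.random.default_rng(1)
prior = 5.0 + rng.standard_normal((200, 2))  # forecast ensemble
H = np.array([[1.0, 0.0]])                   # observe the first coordinate only
posterior = enkf_update(prior, np.array([4.0]), H, 0.1, rng)
```

With a tight observation (obs_std = 0.1), the posterior ensemble mean in the observed coordinate moves close to the observation. When the true filtering distribution is non-Gaussian, this update can be badly biased, which is the gap the transport-based approach targets.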

Then, we migrate from learning dynamics to the dynamics of learning. We study the generalization performance of local descent learning algorithms. We show that a statistically algorithmically stable algorithm -- one whose output statistics are robust to perturbations of the training inputs -- generalizes well. Our approach illustrates that a dynamical systems perspective can provide new insights into theoretical machine learning beyond the traditional optimization and learning theory perspectives.

 

Bio: Nisha Chandramoorthy is a postdoctoral researcher in the Institute for Data, Systems, and Society at MIT. She is interested in taking a dynamical systems approach to building rigorous algorithms for, and generating mathematical insight into, complex systems in the physical sciences, particularly the geosciences. She received her PhD in computational science and engineering from MIT in 2021.

For More Information Contact

Bryant Wine, Communications Officer
bryant.wine@cc.gatech.edu