Nonlinear Science & Mathematical Physics
May 2, 2018, 11:00am to 12:00pm
Howey - School of Physics
Georgia Tech - School of Electrical and Computer Engineering
Dynamical systems producing time series data have been examined in a number of ways, including with "black box" models described by recurrent neural networks and with structured approaches such as delay embeddings. While these approaches can work well in practice and have theoretical justification, they have lacked guarantees characterizing the quality of the information representation they produce. Meanwhile, recent results in randomized dimensionality reduction (including the field of compressed sensing and related results) have shown the value of geometry preservation in characterizing the stability of an embedding in a variety of interesting problems in signal and image acquisition.
In this talk, I will give an overview of our recent results establishing stable embedding guarantees as a quality measure for several existing approaches to understanding dynamical systems. For recurrent networks, we show new guarantees on the short-term memory capacity of dynamic networks that are exponential improvements over the previous state of the art, establishing rigorously for the first time that networks can have memory capacity that scales superlinearly with the size of the network. For delay embeddings, we extend the classic Takens' embedding theorem to establish conditions under which the image reconstructed from the time-series data is a stable embedding of a system's attractor. Beyond preserving only the attractor topology, a stable embedding preserves the attractor geometry by ensuring that distances between points in the state space are approximately preserved. These results also provide some guidance for choosing system parameters (e.g., number of delays, sampling rate), echoing the tradeoff between irrelevancy and redundancy that has been investigated heuristically in the literature.
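As a rough illustration of the "stable embedding" idea discussed above (this sketch is not from the talk, and the signal, delay count, and sampling stride are illustrative choices only), one can build a delay-coordinate map for a simple periodic system and empirically check that pairwise distances between true states are roughly preserved, up to bounded distortion, in the embedded coordinates:

```python
import numpy as np

def delay_embed(x, num_delays, tau):
    """Map a scalar time series x into R^num_delays using delay coordinates
    [x(t), x(t + tau), ..., x(t + (num_delays - 1) * tau)] (tau in samples)."""
    n = len(x) - (num_delays - 1) * tau
    return np.stack([x[i * tau : i * tau + n] for i in range(num_delays)], axis=1)

# Illustrative system: a sinusoid, whose true state space is a circle in R^2.
t = np.linspace(0, 20 * np.pi, 4000)
x = np.sin(t)

# Embed with illustrative parameters (3 delays, stride of 100 samples).
Y = delay_embed(x, num_delays=3, tau=100)

# True states (cos t, sin t) at the same time indices as the embedded points.
states = np.stack([np.cos(t), np.sin(t)], axis=1)[: len(Y)]

# Empirical check of geometry preservation: compare pairwise distances in the
# embedding against distances between the corresponding true states. A narrow
# spread of ratios indicates a stable (near-isometric, up to scale) embedding.
idx = np.random.default_rng(0).integers(0, len(Y), size=(200, 2))
d_emb = np.linalg.norm(Y[idx[:, 0]] - Y[idx[:, 1]], axis=1)
d_true = np.linalg.norm(states[idx[:, 0]] - states[idx[:, 1]], axis=1)
mask = d_true > 1e-6  # skip near-coincident pairs to avoid division by zero
ratios = d_emb[mask] / d_true[mask]
print("distance-ratio spread:", ratios.min(), "to", ratios.max())
```

For this linear-in-phase example the ratios stay within a narrow band, which is exactly the kind of bi-Lipschitz distance bound a stable embedding guarantee formalizes; poor choices of the number of delays or the stride widen the band (redundancy or irrelevancy).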