Downloads and Materials

Code for our methods and models can be downloaded from our GitHub site:
https://github.com/DurstewitzLab


Recent Talks

On the difficulty of learning chaotic dynamics with RNNs at NeurIPS 2022, by Jonas Mikhaeil

Population models for large-scale neural recordings at Informatics Forum, University of Edinburgh, by Daniel Durstewitz

Tractable Dendritic RNNs for Reconstructing Nonlinear Dynamical Systems at ICML 2022, by Manuel Brenner

Reconstructing Nonlinear Dynamical Systems from Multi-Modal Time Series at ICML 2022, by Daniel Kramer

Interpretable Recurrent Neural Networks for reconstructing nonlinear dynamical systems from time series observations at CERN, by Daniel Durstewitz

Identifying nonlinear dynamical systems with multiple time scales and long-range dependencies at ICLR 2021, by Dominik Schmidt

Transformation of ReLU-based recurrent neural networks from discrete-time to continuous-time at ICML 2020, by Zahra Monfared

Deep Learning of dynamical systems for mechanistic insight and prediction in psychiatry at IPAM, by Daniel Durstewitz

Lectures

Time Series Analysis and Recurrent Neural Networks

The summer term 2021 lecture series on “Time Series Analysis and Recurrent Neural Networks” can be found online by following this link and using “TSArnn##2021” as the password.
Here are the topics of the individual lectures:
Lecture 1: Introduction and basic terms
Lecture 2: Autocorrelations and ARMA Models
Lecture 3: Statistical Inference in ARMA Models and Granger Causality
Lecture 4: AR count and point process models
Lecture 5: State Space Models and Expectation Maximization
Lecture 6: Kalman filter and smoother
Lecture 7: Poisson State Space Models and Nonlinear Dynamics
Lecture 8: Nonlinear Dynamics and Recurrent Neural Networks
Lecture 9: Universal Approximation Theorems for RNNs and Gradient Descent
Lecture 10: Long Term Dependencies and Exploding and Vanishing Gradients
Lecture 11: Generative RNNs: Extended and Unscented Kalman Filters
Lecture 12: Variational Inference, Reparameterization Trick and GANs
Lecture 13: Attention, Self-Attention and Transformers
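As a small illustration of one of the lecture topics above (Lecture 6: Kalman filter and smoother), here is a minimal scalar linear-Gaussian Kalman filter. This is a generic textbook sketch, not code from the lectures; all function and parameter names are our own, and the example data are synthetic.

```python
import numpy as np

def kalman_filter(y, a=1.0, c=1.0, q=0.1, r=1.0, x0=0.0, p0=1.0):
    """Scalar linear-Gaussian state space model:
        x_t = a * x_{t-1} + w_t,  w_t ~ N(0, q)   (latent process)
        y_t = c * x_t     + v_t,  v_t ~ N(0, r)   (observations)
    Returns the filtered means E[x_t | y_1..y_t] and variances."""
    x, p = x0, p0
    means, variances = [], []
    for yt in y:
        # prediction step: propagate mean and variance through the dynamics
        x_pred = a * x
        p_pred = a * p * a + q
        # update step: correct the prediction with the new observation
        k = p_pred * c / (c * p_pred * c + r)   # Kalman gain
        x = x_pred + k * (yt - c * x_pred)
        p = (1.0 - k * c) * p_pred
        means.append(x)
        variances.append(p)
    return np.array(means), np.array(variances)

# usage: denoise a noisy random walk
rng = np.random.default_rng(0)
true_x = np.cumsum(rng.normal(0.0, 0.3, 200))   # latent random walk
obs = true_x + rng.normal(0.0, 1.0, 200)        # noisy observations
m, v = kalman_filter(obs, q=0.09, r=1.0)
```

With roughly correct noise parameters, the filtered trajectory `m` tracks the latent state substantially more closely than the raw observations do.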

Computational Statistics and Data Analysis

The summer term 2020 lecture series on “Computational Statistics and Data Analysis” can be found at this link.
Here are the individual lectures:
Lecture 1: Overview, basics of probability theory
Lecture 2: Discrete and continuous distributions, the exponential family
Lecture 3: Moment generating functions, statistical models
Lecture 4: Statistical Inference and Parameter Estimation
Lecture 5: Numerical methods and statistical hypothesis tests
Lecture 6: Asymptotic Tests, Central Limit Theorem and Bootstrap Based Tests
Lecture 7: Linear and Nonlinear Regression Models
Lecture 8: Neural Networks and Nonlinear Regression
Lecture 9: Classification Models I
Lecture 10: Classification Models II
Lecture 11: Regularization and Dimensionality Reduction Techniques
Lecture 12: Latent Variable Models
Lecture 13: Variational Inference and Generative Adversarial Networks

Books

Advanced Data Analysis in Neuroscience: Integrating Statistical and Computational Models

The book covers linear and nonlinear methods in statistics and machine learning for graduate students and researchers in neuroscience and related fields. Its focus is on time series analysis from a dynamical systems perspective, aiming to convey an understanding of the dynamical mechanisms that could have generated an observed time series. It integrates computational modeling of behavioral and neural dynamics with statistical estimation and hypothesis testing. The book comes with MATLAB code.

For further information, see https://www.springer.com/de/book/9783319599748

Other materials

Method for stationarity-segmentation of spike train data
Analytical Approximation of the Firing Rate of an aEIF Neuron in the Presence of Synaptic Noise
Fitting the simplified AdEx model to physiological data
Neural Trajectories Reconstruction MATLAB Toolbox 1.5 beta
Dynamical Basis of Irregular Spiking in NMDA-Driven PFC Neurons
Self-Organizing Neural Integrator Predicts Interval Times through Climbing Activity
A detailed data-driven network model of prefrontal cortex reproduces key features of in vivo activity
Method for cell assembly detection in parallel spike trains
GitHub repository for PLRNN MATLAB code (see Durstewitz, D. (2017), PLoS Comput Biol 13)