- Il Memming Park (Group Leader, Champalimaud Centre for the Unknown)
- Matthew Dowling (senior PhD student, Electrical and Computer Engineering, Stony Brook University)
- Ayesha Vermani (PhD student, Champalimaud Centre for the Unknown)
- Ábel Ságodi (PhD student, Champalimaud Centre for the Unknown)
- November 14–18, 2022
- Lectures: 9:30 - 12:00, in classroom (in person)
- Exercise: 14:00 - 16:00, in classroom (in person)
- Motivation: intuition on high-dimensional spaces can be learned
- Motivation: most things are somewhat linear
- Linear neural network model
- Linear algebra: Matrix-vector products
- Linear algebra: Fun special matrix forms
- Linear algebra: Eigendecomposition
- Solution for arbitrary time
- Singular value decomposition (SVD)
- Books for linear algebra 1, 2
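A minimal NumPy sketch of the two decompositions covered above, assuming the exercises use Python/NumPy (the matrix `A` is a made-up 2×2 example, not from the course):

```python
import numpy as np

# Hypothetical example matrix (upper triangular, so its eigenvalues
# are simply the diagonal entries 0.9 and 0.5).
A = np.array([[0.9, 0.2],
              [0.0, 0.5]])

# Eigendecomposition: A @ v = lam * v for each eigenpair (lam, v).
eigvals, eigvecs = np.linalg.eig(A)

# Reconstruct A = V diag(lam) V^{-1} (explicit inverse is fine for a
# tiny illustration, even though it is discouraged numerically).
A_rec = eigvecs @ np.diag(eigvals) @ np.linalg.inv(eigvecs)

# Singular value decomposition: A = U diag(s) Vt, defined for ANY matrix,
# square or not, unlike the eigendecomposition.
U, s, Vt = np.linalg.svd(A)
```

Both factorizations reproduce `A` exactly (up to floating point), which is a quick sanity check worth running in the exercises.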
- 12:00-13:00 CISS seminars
- Simulating discrete-time linear dynamics in 1D and 2D, line attractor
- Complex eigenvalues, spectrum plot, and stability
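The simulation and stability exercise above can be sketched as follows (a hypothetical 2D example with complex eigenvalues, assuming NumPy; the specific matrix is illustrative, not from the course materials):

```python
import numpy as np

# Scaled rotation matrix: eigenvalues 0.8 +/- 0.5i, a decaying spiral.
A = np.array([[0.8, -0.5],
              [0.5,  0.8]])

# Discrete-time stability: all eigenvalues must lie inside the unit circle.
eigvals = np.linalg.eigvals(A)
spectral_radius = np.max(np.abs(eigvals))
stable = spectral_radius < 1.0

# Simulate x[t+1] = A x[t] from an initial condition.
x = np.array([1.0, 0.0])
trajectory = [x]
for _ in range(50):
    x = A @ x
    trajectory.append(x)
trajectory = np.array(trajectory)
```

Plotting `trajectory[:, 0]` against `trajectory[:, 1]` shows the spiral, and plotting `eigvals` in the complex plane against the unit circle gives the spectrum plot.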
- Reading assignment 3
- Student presentation and discussion
- Linear algebra: Normal vs non-normal matrix
- Linear algebra: Schur decomposition
- Higher-order linear difference equations and delay embedding
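The delay-embedding idea above can be illustrated in a few lines (a hypothetical second-order example, assuming NumPy): a higher-order difference equation becomes a first-order linear system on the stacked state, via the companion matrix.

```python
import numpy as np

# Second-order difference equation y[t] = a1*y[t-1] + a2*y[t-2],
# rewritten as x[t] = (y[t], y[t-1]) with x[t+1] = A x[t].
a1, a2 = 1.2, -0.4
A = np.array([[a1,  a2],
              [1.0, 0.0]])   # companion matrix

# Initial conditions y[0] = 0.5, y[1] = 1.0.
x = np.array([1.0, 0.5])
ys = [0.5, 1.0]
for _ in range(10):
    x = A @ x
    ys.append(x[0])

# Direct recursion gives the same sequence.
zs = [0.5, 1.0]
for _ in range(10):
    zs.append(a1 * zs[-1] + a2 * zs[-2])
```

The payoff is that everything from the eigendecomposition lectures (spectrum, stability, modes) now applies to the higher-order equation through `A`.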
- Discrete dynamics zoo
- Neuroscience: optimal memory structures 4
- Linear time invariant systems, convolution
- Filtering: tap-delay-line, finite impulse response, infinite impulse response, frequency response
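A small sketch of the FIR/IIR distinction above (hypothetical filter taps, assuming NumPy): the output of an LTI filter is the convolution of the input with its impulse response, so feeding in a unit impulse recovers the impulse response itself.

```python
import numpy as np

# Unit impulse input.
signal = np.zeros(20)
signal[0] = 1.0

# FIR filter: a tap-delay line with finitely many taps.
fir_kernel = np.array([0.5, 0.3, 0.2])
out = np.convolve(signal, fir_kernel)[:len(signal)]
# For an impulse input, the output IS the (finite) impulse response.

# IIR filter: leaky integrator y[t] = a*y[t-1] + x[t]; the recursion
# gives an impulse response a**t that never becomes exactly zero.
a = 0.9
y = np.zeros_like(signal)
for t in range(len(signal)):
    y[t] = a * (y[t - 1] if t > 0 else 0.0) + signal[t]
```

The frequency response of either filter can then be read off with `np.fft.rfft` of the impulse response, connecting to the Fourier topic later in the week.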
- Adjoint matrix: zero-padding, finite difference operator, binning
- Neuroscience: non-normal / transient amplification 5, 6
- Books 7, 8
- Existence and uniqueness
- Linear algebra: Jordan form
- 1D system, stability, flow field
- 2D system, visualization
- Invariant subspaces
- Matrix fundamental solution
- Complete categorization of linear dynamics
- Neuroscience: line attractor
- Neuroscience: first-order approximation of hair cell activity 9
- Linearization: Hartman-Grobman theorem, Rectification theorem, Volterra series
- Kernel methods
- Koopman operator theory
- Neuroscience: Fundamental limits of linear systems as models of neural computation
- Neuroscience: Balanced state networks
- Derivatives of linear systems
- Learning: variational system and sensitivity propagation
- Neuroscience: backpropagation of deep linear system
- Neuroscience: echo state network and liquid state machines
- Solving linear systems
- Condition number of a matrix
- Never numerically invert a matrix unless absolutely necessary!
- Least squares is convex optimization
- Approximating linear ODE as a discrete time system
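The numerical points above can be demonstrated in one short script (hypothetical random matrices, assuming NumPy): use a factorization-based solver instead of an explicit inverse, check the condition number, and solve least squares directly since it is convex.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
b = rng.standard_normal(5)

# Preferred: LU-based solve. Avoid forming the inverse explicitly;
# it is shown here only to confirm both agree on a well-conditioned A.
x_solve = np.linalg.solve(A, b)
x_inv = np.linalg.inv(A) @ b

# Condition number: sensitivity of x to perturbations in A and b
# (ratio of largest to smallest singular value).
kappa = np.linalg.cond(A)

# Overdetermined least squares argmin_x ||Ax - b||^2 is a convex
# problem with a direct solution.
A_tall = rng.standard_normal((10, 3))
b_tall = rng.standard_normal(10)
x_ls, *_ = np.linalg.lstsq(A_tall, b_tall, rcond=None)
```

At the optimum, the residual `A_tall @ x_ls - b_tall` is orthogonal to the column space of `A_tall`, which is exactly the normal-equations condition.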
- PCA is linear
- CCA is linear
- Fourier transform is linear
- Essence of Linear Algebra series by 3Blue1Brown (watch episodes 1–5, 13, and 14, 85 minutes total; the other videos are optional)
- Neuromatch Academy: Computational Neuroscience, Linear Algebra
- Petersen and Pedersen, The Matrix Cookbook
- NYU CNS Mathematical Tools for Neural and Cognitive Science
Footnotes

1. Strang, G. (2006). Linear Algebra and Its Applications.
2. Horn, R. A., & Johnson, C. R. (2012). Matrix Analysis. Cambridge University Press.
3. Goldman, M. S. (2009). Memory without feedback in a neural network. Neuron, 61(4), 621–634.
4. Ganguli, S., Huh, D., & Sompolinsky, H. (2008). Memory traces in dynamical systems. Proceedings of the National Academy of Sciences, 105(48), 18970–18975.
5. Murphy, B. K., & Miller, K. D. (2009). Balanced amplification: a new mechanism of selective amplification of neural activity patterns. Neuron, 61(4), 635–648.
6. Mante, V., Sussillo, D., Shenoy, K. V., & Newsome, W. T. (2013). Context-dependent computation by recurrent dynamics in prefrontal cortex. Nature, 503(7474), 78–84.
7. Brockett, R. W. (1970). Finite Dimensional Linear Systems. Wiley.
8. Chicone, C. (2006). Ordinary Differential Equations with Applications. Springer Science & Business Media.
9. Meddis, R. (1986). Simulation of mechanical to neural transduction in the auditory receptor. The Journal of the Acoustical Society of America, 79(3), 702–711.