Chapter 1: Neural circuit models based on firing rates and Hopfield networks, covering their dynamics, interconnections, and local Hebbian adaptation rules (an illustrative sketch follows this chapter list).
Chapter 2: Stability in dynamic neural networks using Lyapunov methods, multistability, and energy functions.
Chapter 3: Optimization in neural networks through biologically inspired gradient dynamics and sparse representations.
Chapter 4: Unsupervised learning via neural dynamics, linking Hebbian rules to tasks like PCA, clustering, and similarity-based representation learning.
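As a small illustration of the themes named in Chapters 1 and 2, the sketch below shows a standard discrete-time Hopfield network with Hebbian outer-product weights and its energy function. It is not taken from the book: the network size, the stored patterns, and the update schedule are arbitrary choices made only for this example.

    import numpy as np

    # Illustrative sketch only (not from the book): a discrete-time Hopfield
    # network whose weights are set by the Hebbian outer-product rule and whose
    # energy E(x) = -1/2 x^T W x does not increase along asynchronous updates.

    rng = np.random.default_rng(0)
    n = 64                                            # number of +/-1 neurons
    patterns = rng.choice([-1.0, 1.0], size=(3, n))   # memories to store

    # Hebbian storage: W = (1/n) * sum_k xi_k xi_k^T, with zero diagonal
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)

    def energy(x):
        return -0.5 * x @ W @ x

    def recall(x, sweeps=20):
        x = x.copy()
        for _ in range(sweeps):
            for i in rng.permutation(n):              # asynchronous sign updates
                x[i] = 1.0 if W[i] @ x >= 0 else -1.0
        return x

    # Start from a corrupted copy of the first memory and let the dynamics settle.
    probe = patterns[0] * np.where(rng.random(n) < 0.15, -1.0, 1.0)
    fixed_point = recall(probe)
    print("overlap with stored pattern:", fixed_point @ patterns[0] / n)
    print("energy before/after:", energy(probe), energy(fixed_point))

With symmetric weights and zero diagonal, each asynchronous update can only decrease or preserve the energy, which is analogous in spirit to the Lyapunov and energy-function arguments treated in Chapter 2.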
Version dated Jun 15, 2025
Complete book in portrait format [PDF]
Complete book in slide/landscape format [PDF]
@Book{lectures-neural-dynamics,
  author = {F. Bullo},
  title  = {Lectures on Neural Dynamics},
  year   = 2025,
  url    = {https://fbullo.github.io/lnd}
}
These lecture notes are intended for personal non-commercial use only: you may not use this material for commercial purposes and you may not copy and redistribute this material in any medium or format.
Spring 2022: first time I taught the course
Winter 2025: second time I taught the course and wrote the first book draft
May 8, 2025: first edition online
May 27, 2025: small update with improvements to similarity matching, downloadable Python code, and more
Jun 15, 2025: small update with improvements to softmax treatment, preface, exercises, and more
Jul 2, 2025: small update with improvements to the stability proof of Hopfield models and the removal of Python code
Jul 16, 2025: substantial revision of stability statements and proofs for the Hopfield and firing-rate models, plus miscellaneous typo fixes
I am thankful for any feedback, including suggestions, evaluations, error reports, or comments about teaching or research uses. Please email bullo at ucsb.edu