Stanford University
Introduction to Computational Neuroscience
Spring 2025

Lectures

  • Introduction and perceptrons (4/21/2025)
    Course goals and logistics. Introduction to perceptrons, a basic model of synaptic learning.
    [slides] [notes]

    Pre-lesson reading:

    • Vectors (10 minute video)
    • Dot product (the relevant part is the first 2 minutes and 10 seconds, but feel free to watch the whole thing)

    Optional Material:

    • Hertz, Krogh, and Palmer, Introduction to the Theory of Neural Computation, chapters 5 and 6
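To make the perceptron concrete before lecture, here is a minimal NumPy sketch of the classic learning rule on the AND task; the task, learning rate, and initialization are arbitrary illustrative choices, not course-provided code.

```python
import numpy as np

# Minimal perceptron sketch (illustrative, not course code): learn AND with
# the classic rule  w <- w + eta * (target - prediction) * x.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])  # inputs
y = np.array([0, 0, 0, 1])                      # AND targets

rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=2)  # small random initial weights
b = 0.0
eta = 0.1                          # learning rate

for _ in range(100):               # passes over the data
    for xi, ti in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0
        w = w + eta * (ti - pred) * xi
        b = b + eta * (ti - pred)

predictions = [1 if xi @ w + b > 0 else 0 for xi in X]
```

Because AND is linearly separable, the perceptron convergence theorem guarantees the loop settles on a correct separator.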
  • Oscillations (4/23/2025)
    Network oscillations via coupled inhibitory cells (Wang and Buzsaki, 1996). Oscillations between up and down states in the striatum (Wilson, 2005) and their dependence on KIR channels. Thalamic relay neurons as intrinsic oscillators that depend on the interaction between the H current and the T current, and how inhibitory synaptic responses engage this intrinsic oscillatory activity (McCormick and Pape, 1990).
    [slides]

    Suggested Reading:

    • Wang and Buzsaki 1996
    • Wilson 2005
    • McCormick and Pape 1990
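As a cartoon of one idea from this lecture, two phase oscillators with repulsive ("inhibitory") coupling settle into anti-phase firing. All parameters below are invented for illustration; this is only an intuition pump, not the conductance-based Wang and Buzsaki network.

```python
import numpy as np

# Two phase oscillators with repulsive coupling: a toy stand-in for mutual
# inhibition. Anti-phase locking (phase difference pi) is the stable state.
omega = 2 * np.pi * 40 / 1000.0    # intrinsic frequency, rad/ms (40 Hz)
K = -0.05                          # negative coupling = repulsion
dt, t_max = 0.1, 2000.0            # Euler step and duration, ms

theta = np.array([0.0, 0.5])       # start nearly in phase
for _ in range(int(t_max / dt)):
    d0 = omega + K * np.sin(theta[1] - theta[0])
    d1 = omega + K * np.sin(theta[0] - theta[1])
    theta = theta + dt * np.array([d0, d1])

phase_diff = (theta[1] - theta[0]) % (2 * np.pi)   # approaches pi
```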
  • Neural Encoding (4/28/2025)
    Review of theory for describing neural responses.
    [slides]

    Suggested reading:

    • Schwartz et al. 2006 (a review of spike-triggered analysis)
    • Fairhall 2006 (applies spike-triggered covariance analysis to retinal data)
    • Mease et al. 2013 (investigates adaptation in single-neuron models)

    Advanced reading:

    • Aljadeff, Lansdell, Fairhall, and Kleinfeld 2016, Analysis of Neuronal Spike Trains, Deconstructed
    • Yamins and DiCarlo 2016
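A minimal sketch of the spike-triggered average, the first step in the spike-triggered analyses reviewed in the reading. The simulated linear-nonlinear neuron (white-noise stimulus, made-up exponential filter, threshold nonlinearity) is an assumption chosen purely for illustration.

```python
import numpy as np

# Spike-triggered average (STA): average the stimulus segments that
# precede spikes. For white-noise input this recovers the neuron's
# (time-reversed) linear filter.
rng = np.random.default_rng(1)
T, L = 20000, 15                           # time bins, filter length
stimulus = rng.normal(size=T)              # white-noise stimulus
true_filter = np.exp(-np.arange(L) / 4.0)  # toy temporal filter
true_filter /= np.linalg.norm(true_filter)

# Simulate a linear-nonlinear neuron: filter the stimulus, then threshold.
drive = np.convolve(stimulus, true_filter)[:T]
spike_times = np.nonzero(drive > 1.0)[0]
spike_times = spike_times[spike_times >= L]

sta = np.mean([stimulus[t - L + 1:t + 1] for t in spike_times], axis=0)
sta /= np.linalg.norm(sta)

# Compare against the time-reversed filter (forward-time convention).
similarity = float(sta @ true_filter[::-1])
```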
  • Neural population analysis (4/30/2025)
    Introduction to decoding. Linear Discriminant Analysis. Factor Analysis.
    [slides]

    Advanced reading:

    • Duda, Hart, and Stork, Pattern Classification, chapters 2-5
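For a concrete starting point on decoding, here is a bare-bones two-class linear discriminant analysis on synthetic data. The class means, covariance, and sample sizes are invented; the projection w = Σ⁻¹(μ₁ − μ₀) is the standard LDA direction.

```python
import numpy as np

# Two-class LDA sketch on synthetic Gaussian "population responses".
rng = np.random.default_rng(2)
n = 500
cov = [[1.0, 0.3], [0.3, 1.0]]
class0 = rng.multivariate_normal([0, 0], cov, size=n)
class1 = rng.multivariate_normal([2, 2], cov, size=n)

mu0, mu1 = class0.mean(axis=0), class1.mean(axis=0)
sigma = 0.5 * (np.cov(class0.T) + np.cov(class1.T))  # pooled covariance
w = np.linalg.solve(sigma, mu1 - mu0)                # discriminant direction
threshold = w @ (mu0 + mu1) / 2                      # midpoint boundary

X = np.vstack([class0, class1])
labels = np.concatenate([np.zeros(n), np.ones(n)])
accuracy = float(((X @ w > threshold) == labels).mean())
```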
  • Adaptation and plasticity (5/5/2025)
    Neural adaptation. Maximizing information in a noisy neural system. Discussion of biophysical constraints and mechanisms of neural adaptation. Review of the Hodgkin-Huxley model.
    [slides]

    Suggested reading:

    • Laughlin 1981 (Maximizing a neuron’s information capacity)
    • Van Hateren 1992 (Real and optimal neural images in early vision)
    • Hennig 2013 (Theoretical models of synaptic short term plasticity)
    • Ozuysal and Baccus 2012 (Linking the computational structure of variance adaptation to biophysical mechanisms)

    Advanced reading:

    • Atick and Redlich 1992 (What does the retina know about natural scenes?)
    • Abbott and Nelson 2000 (Synaptic Plasticity: Taming the Beast)
    • Mongillo, Barak, and Tsodyks 2008 (Synaptic theory of working memory)
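Since this lecture reviews the Hodgkin-Huxley model, a compact forward-Euler integration of the standard squid-axon equations (1952 parameters, voltages in mV, time in ms) may serve as a refresher. The injected current and step size here are arbitrary illustrative choices.

```python
import numpy as np

# Standard Hodgkin-Huxley model under constant current injection.
C_m = 1.0                                  # membrane capacitance, uF/cm^2
g_Na, g_K, g_L = 120.0, 36.0, 0.3          # max conductances, mS/cm^2
E_Na, E_K, E_L = 50.0, -77.0, -54.387      # reversal potentials, mV

def vtrap(x, y):
    """x / (1 - exp(-x/y)), numerically stable near x = 0."""
    return np.where(np.abs(x) < 1e-9, y, x / (1.0 - np.exp(-x / y)))

def rates(V):
    a_m = 0.1 * vtrap(V + 40.0, 10.0)
    b_m = 4.0 * np.exp(-(V + 65.0) / 18.0)
    a_h = 0.07 * np.exp(-(V + 65.0) / 20.0)
    b_h = 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
    a_n = 0.01 * vtrap(V + 55.0, 10.0)
    b_n = 0.125 * np.exp(-(V + 65.0) / 80.0)
    return a_m, b_m, a_h, b_h, a_n, b_n

dt, t_max, I_ext = 0.01, 100.0, 10.0       # ms, ms, uA/cm^2
V, m, h, n = -65.0, 0.05, 0.6, 0.32        # near-rest initial conditions
spike_count, above = 0, False

for _ in range(int(t_max / dt)):
    a_m, b_m, a_h, b_h, a_n, b_n = rates(V)
    I_ion = (g_Na * m**3 * h * (V - E_Na)
             + g_K * n**4 * (V - E_K)
             + g_L * (V - E_L))
    V += dt * (I_ext - I_ion) / C_m        # forward-Euler voltage update
    m += dt * (a_m * (1 - m) - b_m * m)
    h += dt * (a_h * (1 - h) - b_h * h)
    n += dt * (a_n * (1 - n) - b_n * n)
    if V > 0 and not above:                # count upward crossings of 0 mV
        spike_count += 1
    above = V > 0
```

At this current the model fires repetitively, so `spike_count` reflects sustained spiking over the 100 ms window.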
  • Hopfield Networks (5/7/2025)
    Hopfield networks as models of content-addressable memory.
    [slides]

    Suggested reading:

    • Hopfield 1982 (A classic paper)
    • Review article by Chaudhuri and Fiete 2016 (Computational principles of memory)

    Advanced reading:

    • Hertz, Krogh, and Palmer, Introduction to the Theory of Neural Computation, chapters 1-3
    • Amit, Gutfreund, and Sompolinsky, Storing Infinite Numbers of Patterns in a Spin-Glass Model of Neural Networks, PRL 1985

    Additional resources:

    • Detailed lecture notes on Hopfield networks
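The lecture notes cover the theory in detail; as a quick illustrative sketch (network size, pattern count, and corruption level are arbitrary), storing patterns with the Hebbian outer-product rule and recalling one from a corrupted cue looks like this:

```python
import numpy as np

# Tiny Hopfield network: content-addressable memory via attractor dynamics.
rng = np.random.default_rng(3)
N, P = 64, 3                             # units, stored patterns
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian weights: W = (1/N) * sum_p x_p x_p^T, with zero diagonal.
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0)

# Corrupt the first pattern by flipping a few units, then let the net settle.
state = patterns[0].copy()
flip = rng.choice(N, size=8, replace=False)
state[flip] *= -1

for _ in range(5):                       # sweeps of asynchronous updates
    for i in rng.permutation(N):
        state[i] = 1 if W[i] @ state >= 0 else -1

overlap = float(state @ patterns[0]) / N  # 1.0 means perfect recall
```

With only 3 patterns in 64 units the network is far below capacity, so the corrupted cue falls back into the stored attractor.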
