VS265: Syllabus

== Syllabus ==

==== Aug. 28: Introduction ====

# Theory and modeling in neuroscience
# Goals of AI/machine learning vs. theoretical neuroscience
# Turing vs. neural computation


==== Sept. 2,4: Neuron models ====

# Membrane equation, compartmental model of a neuron (see the sketch below)
# Linear systems: vectors, matrices, linear neuron models
# Perceptron model and linear separability
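
Not part of the original outline, but a minimal runnable sketch of the membrane-equation topic, assuming Python/NumPy; the parameter values are made up for illustration:

<pre>
import numpy as np

# Passive membrane equation: tau * dV/dt = -(V - V_rest) + R * I(t),
# integrated with forward Euler. All parameter values are illustrative.
tau, R, V_rest = 20.0, 1.0, -65.0            # ms, nominal resistance, mV
dt, T = 0.1, 100.0                           # ms
t = np.arange(0.0, T, dt)
I = np.where((t > 20) & (t < 80), 10.0, 0.0) # injected current step

V = np.empty_like(t)
V[0] = V_rest
for k in range(1, len(t)):
    dV = (-(V[k-1] - V_rest) + R * I[k-1]) / tau
    V[k] = V[k-1] + dt * dV

# V relaxes exponentially toward V_rest + R*I with time constant tau
print(f"V at t=70 ms: {V[int(70/dt)]:.2f} mV (asymptote {V_rest + R*10.0:.2f} mV)")
</pre>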


==== Sept. 9,11: Guest lectures ====

# TBD
# Paul Rhodes, Evolved Machines: Multi-compartment models; dendritic integration

==== Sept. 16,18: Supervised learning ====


# Perceptron learning rule (see the sketch below)
# Adaptation in linear neurons, Widrow-Hoff rule
# Objective functions and gradient descent
# Multilayer networks and backpropagation
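
A minimal sketch of the perceptron learning rule on made-up linearly separable data (Python/NumPy assumed; nothing here is from the lectures themselves):

<pre>
import numpy as np

rng = np.random.default_rng(0)

# Made-up linearly separable data: labels from a random hyperplane
X = rng.normal(size=(200, 2))
w_true = np.array([1.5, -1.0])
t = (X @ w_true + 0.5 > 0).astype(float)

# Append a constant 1 to each input so the bias is learned as a weight
Xa = np.hstack([X, np.ones((len(X), 1))])
w = np.zeros(3)
eta = 0.1

for epoch in range(100):
    errors = 0
    for x, target in zip(Xa, t):
        y = float(x @ w > 0)                 # threshold unit
        if y != target:
            w += eta * (target - y) * x      # update only on mistakes
            errors += 1
    if errors == 0:                          # all points correctly classified
        print(f"converged after {epoch + 1} epochs, w = {w}")
        break
</pre>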


==== Sept. 23,25: Unsupervised learning ====

# Linear Hebbian learning and PCA, decorrelation (see the sketch below)
# Winner-take-all networks and clustering
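
A minimal sketch of linear Hebbian learning with Oja's normalization, whose weight vector converges (up to sign) to the first principal component; the data and step size are illustrative, not from the course:

<pre>
import numpy as np

rng = np.random.default_rng(1)

# Zero-mean Gaussian inputs with one elongated axis, so PC1 is well defined
C = np.array([[3.0, 1.0],
              [1.0, 1.0]])
X = rng.multivariate_normal([0.0, 0.0], C, size=5000)

w = rng.normal(size=2)
eta = 0.01
for x in X:
    y = w @ x
    w += eta * y * (x - y * w)   # Hebbian growth plus Oja's decay term

# The weight vector converges (up to sign) to the leading eigenvector of C
pc1 = np.linalg.eigh(C)[1][:, -1]
print("learned w:", w / np.linalg.norm(w))
print("true PC1 :", pc1)
</pre>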


==== Sept. 30: Guest lecture ====

# TBD

==== Oct. 2: Sparse, distributed coding ====

# Autoencoders (see the sketch below)
# Natural image statistics
# Projection pursuit
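
A minimal sketch of a linear autoencoder with tied weights, trained by gradient descent on reconstruction error; the toy data, sizes, and learning rate are all invented for illustration:

<pre>
import numpy as np

rng = np.random.default_rng(2)

# Toy data: 5-D points lying near a 2-D subspace, plus a little noise
A = rng.normal(size=(5, 2))
X = rng.normal(size=(500, 2)) @ A.T + 0.05 * rng.normal(size=(500, 5))

# Linear autoencoder with tied weights: encode to 2-D, decode with W.T
W = 0.1 * rng.normal(size=(2, 5))
eta = 0.01
for _ in range(2000):
    H = X @ W.T                              # codes
    err = H @ W - X                          # reconstruction error
    grad = (H.T @ err + (err @ W.T).T @ X) / len(X)
    W -= eta * grad

print("mean squared reconstruction error:", np.mean(err ** 2))
</pre>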
 
==== Oct. 7: Plasticity and cortical maps ====


# Cortical maps
# Self-organizing maps, Kohonen nets (see the sketch below)
# Models of experience-dependent learning and cortical reorganization
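
A minimal sketch of a Kohonen-style self-organizing map, with an invented shrinking-neighborhood schedule (Python/NumPy; all values illustrative):

<pre>
import numpy as np

rng = np.random.default_rng(3)

# A 1-D chain of units learns to cover a 2-D stimulus space (a toy map)
n_units, n_steps = 20, 5000
W = rng.uniform(size=(n_units, 2))           # unit weights in stimulus space
eta = 0.2

for step in range(n_steps):
    x = rng.uniform(size=2)                  # random stimulus
    winner = np.argmin(np.sum((W - x) ** 2, axis=1))
    # Gaussian neighborhood on the chain, shrinking over time
    sigma = 3.0 * (1 - step / n_steps) + 0.5
    d = np.arange(n_units) - winner
    h = np.exp(-d ** 2 / (2 * sigma ** 2))
    W += eta * h[:, None] * (x - W)          # pull winner and its neighbors

# Topography: units adjacent on the chain end up with similar weights
print(np.round(W[:5], 2))
</pre>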


==== Oct. 9: Guest lecture ====

# TBD

==== Oct. 14: Manifold learning ====

# Locally linear embedding and Isomap

==== Oct. 16: Guest lecture ====

# Tom Dean, Google: Connectomics

==== Oct. 21,23,28,30: Recurrent networks ====

# Hopfield networks (see the sketch below)
# Models of associative memory, pattern completion
# Line attractors and ‘bump circuits’
# Dynamical models
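
A minimal sketch of a Hopfield network storing random patterns with the Hebbian outer-product rule and completing a corrupted one; sizes and noise level are illustrative:

<pre>
import numpy as np

rng = np.random.default_rng(4)

# Store random +/-1 patterns with the Hebbian outer-product rule
N, P = 100, 5
patterns = rng.choice([-1.0, 1.0], size=(P, N))
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)                     # no self-connections

# Pattern completion: start from a corrupted copy of pattern 0
s = patterns[0].copy()
flip = rng.choice(N, size=20, replace=False)
s[flip] *= -1

for _ in range(10):                          # asynchronous threshold updates
    for i in rng.permutation(N):
        s[i] = 1.0 if W[i] @ s >= 0 else -1.0

print("overlap with stored pattern:", (s @ patterns[0]) / N)  # typically ~1.0
</pre>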


==== Nov. 4,6,13,18,20,25: Probabilistic models and inference ====


# Probability theory and Bayes’ rule
# Learning and inference in generative models
# The mixture of Gaussians model (see the sketch below)
# Boltzmann machines
# Sparse coding and ‘ICA’
# Kalman filter model
# Energy-based models
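
A minimal sketch of EM for a 1-D mixture of Gaussians (E-step responsibilities, M-step re-estimation); the toy data and initial guesses are made up:

<pre>
import numpy as np

rng = np.random.default_rng(5)

# Toy 1-D data drawn from two Gaussians
x = np.concatenate([rng.normal(-2.0, 0.7, 300), rng.normal(3.0, 1.2, 200)])

# Initial guesses for means, variances, and mixing proportions
mu = np.array([-1.0, 1.0])
var = np.array([1.0, 1.0])
pi = np.array([0.5, 0.5])

for _ in range(50):
    # E-step: responsibility of each component for each data point
    lik = pi * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    r = lik / lik.sum(axis=1, keepdims=True)
    # M-step: responsibility-weighted parameter re-estimates
    Nk = r.sum(axis=0)
    mu = (r * x[:, None]).sum(axis=0) / Nk
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / Nk
    pi = Nk / len(x)

print("means:", np.round(mu, 2), "vars:", np.round(var, 2), "weights:", np.round(pi, 2))
</pre>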


==== Dec. 2,4: Neural implementations ====

# Integrate-and-fire model (see the sketch below)
# Neural encoding and decoding
# Limits of precision in neurons
# Neural synchrony and phase-based coding
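
A minimal sketch of the leaky integrate-and-fire model with threshold and reset under constant drive; as with the earlier sketches, the parameter values are illustrative only:

<pre>
import numpy as np

# Leaky integrate-and-fire: the passive membrane equation plus a spike
# and reset whenever V crosses threshold. Parameter values are illustrative.
tau, R, V_rest = 20.0, 1.0, -65.0            # ms, nominal resistance, mV
V_th, V_reset, I = -50.0, -70.0, 20.0        # threshold, reset, constant drive
dt, T = 0.1, 200.0                           # ms

t = np.arange(0.0, T, dt)
V = np.full_like(t, V_rest)
spikes = []
for k in range(1, len(t)):
    V[k] = V[k-1] + dt * (-(V[k-1] - V_rest) + R * I) / tau
    if V[k] >= V_th:
        spikes.append(t[k])
        V[k] = V_reset

print(f"{len(spikes)} spikes, mean rate = {1000 * len(spikes) / T:.1f} Hz")
</pre>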

==== Dec. 9,11: Guest lectures ====

# TBD
# TBD
