VS298 (Fall 06): Neural Computation

People

Professor: Bruno Olshausen

  • Email: baolshausen AT berkeley DOT edu
  • Office: 10 Giannini
  • Office hours: TBD

GSI: Amir Khosrowshahi

  • Email: amirk AT berkeley DOT edu
  • Office: 523 Minor, 3-5996
  • Office hours: TBD

Course description

This is a 3-unit course that provides an introduction to the theory of neural computation. The goal is to familiarize students with the major theoretical frameworks and models used in neuroscience and psychology, and to provide "hands-on" experience in using these models.

This course complements MCB 262, Advanced Topics in Systems Neuroscience. This year will be a "trial version" of the course; the plan is to offer it every other year, interleaving with MCB 262.

Lectures

  • Location: TBD
  • Times: Two 1.5-hour lectures per week.

We will have an organizational meeting during the first week of class to determine a good time to meet.

Email list and forum

  • Please email the GSI to be added to the class email list.
  • A bulletin board is provided for discussion of lecture material, readings, and problem sets.

Grading

Based on weekly homework assignments (60%) and a final project (40%).

Required background

Prerequisites are calculus, ordinary differential equations, basic probability and statistics, and linear algebra. Familiarity with programming in a high-level language, ideally Matlab, is also required.

Textbooks

  • [HKP] Hertz, J., Krogh, A., and Palmer, R.G. Introduction to the Theory of Neural Computation.
  • [DJCM] MacKay, D.J.C. Information Theory, Inference, and Learning Algorithms. (available online)
  • [DA] Dayan, P. and Abbott, L.F. Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems.

Additional reading, such as primary source material, will be suggested on a lecture-by-lecture basis.

Syllabus

Introduction

  1. Theory and modeling in neuroscience
  2. Descriptive vs. functional models
  3. Turing vs. neural computation
  • Reading: HKP chapter 1

Linear neuron models

  1. Linear systems: vectors, matrices, linear neuron models (see the sketch after this list)
  2. Perceptron model and linear separability
  • Reading: HKP chapter 5, DJCM chapters 38-40
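
A minimal Matlab sketch of the linear neuron model from the list above, computing a weighted sum of its inputs; the weight and input values are illustrative, not from the course:

  % Linear neuron: the response is a weighted sum (dot product) of the inputs.
  w = [0.5; -0.2; 0.1];   % synaptic weights (illustrative values)
  x = [1.0; 0.3; -0.7];   % input vector (illustrative values)
  y = w' * x;             % scalar response of the linear unit
  disp(y)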

Supervised learning

  1. Perceptron learning rule (see the sketch after this list)
  2. Adaptation in linear neurons, Widrow-Hoff rule
  3. Objective functions and gradient descent
  4. Multilayer networks and backpropagation
  • Reading: HKP chapters 6 and 7, DJCM chapters 38-40
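
A minimal Matlab sketch of the perceptron learning rule on a toy linearly separable problem (logical AND); the learning rate, initialization, and epoch count are illustrative:

  % Perceptron learning rule: w <- w + eta*(t - y)*x for a threshold unit
  % y = sign(w'*x), applied to the linearly separable AND problem.
  X = [0 0; 0 1; 1 0; 1 1]';    % inputs, one column per example
  X = [X; ones(1, 4)];          % append a constant bias input
  t = [-1 -1 -1 1];             % targets in {-1,+1} (logical AND)
  w = zeros(3, 1); eta = 0.1;
  for epoch = 1:20
    for i = 1:4
      y = sign(w' * X(:, i));
      if y == 0, y = -1; end    % break ties so the update is well defined
      w = w + eta * (t(i) - y) * X(:, i);
    end
  end
  yhat = sign(w' * X); yhat(yhat == 0) = -1;
  disp(yhat)                    % should match t once the rule has converged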

Reinforcement learning

  1. Theory of associative reward-penalty (see the sketch after this list)
  2. Models and critics
  • Reading: HKP chapter 8, DJCM chapter 36, DA chapter 9
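
A sketch, in Matlab, of an associative reward-penalty unit in the spirit of this section: a stochastic binary neuron acts, receives reward or penalty, and nudges its weights toward (reward) or away from (penalty) the action it took. The toy environment, logistic squashing, and all constants are illustrative assumptions:

  % Associative reward-penalty (A_R-P) sketch for one stochastic binary unit.
  logistic = @(u) 1 ./ (1 + exp(-u));
  w = zeros(2, 1); rho = 0.1; lambda = 0.05;  % rates (illustrative)
  x = [1; 0.5];                     % a fixed input pattern for this toy example
  for trial = 1:500
    p = logistic(w' * x);           % probability of firing y = +1
    y = 2 * (rand < p) - 1;         % stochastic action in {-1,+1}
    r = (y == 1);                   % toy environment: rewards y = +1 here
    if r                            % note (2*p - 1) is the mean output E[y]
      w = w + rho * (y - (2*p - 1)) * x;           % move toward the action taken
    else
      w = w + rho * lambda * (-y - (2*p - 1)) * x; % move toward the opposite action
    end
  end
  disp(logistic(w' * x))            % near 1: the unit now prefers the rewarded action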

Unsupervised learning

  1. Linear Hebbian learning and PCA, decorrelation (see the sketch after this list)
  2. Winner-take-all networks and clustering
  3. Sparse, distributed coding
  • Reading: HKP chapter 8, DJCM chapter 36, DA chapter 8
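
A minimal Matlab sketch of linear Hebbian learning with Oja's weight normalization, which converges (up to sign) to the first principal component of the inputs; the covariance matrix and learning rate are illustrative:

  % Oja's rule: a Hebbian term plus a decay that keeps |w| bounded, so w
  % converges (up to sign) to the top eigenvector of the input covariance.
  C = [3 1; 1 1];                    % input covariance (illustrative)
  X = chol(C)' * randn(2, 5000);     % zero-mean inputs with covariance C
  w = randn(2, 1); eta = 0.01;
  for i = 1:size(X, 2)
    x = X(:, i);
    y = w' * x;                      % linear unit response
    w = w + eta * y * (x - y * w);   % Oja's rule
  end
  [V, D] = eig(C); [mx, k] = max(diag(D));
  disp([w / norm(w), V(:, k)])       % learned direction vs. top eigenvector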

Plasticity and cortical maps

  1. Self-organizing maps, Kohonen nets (see the sketch after this list)
  2. Models of experience dependent learning and cortical reorganization
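
A minimal Matlab sketch of a one-dimensional Kohonen self-organizing map: the best-matching unit and its neighbors move toward each input, so the map settles into a topographic ordering. Map size, learning rate, and neighborhood width are illustrative:

  % 1-D Kohonen map: N units on a line learn to tile the input space [0,1].
  N = 20; W = rand(1, N);            % each unit's scalar preferred input
  eta = 0.1; sigma = 2;              % learning rate and neighborhood width
  for trial = 1:5000
    x = rand;                        % input drawn uniformly from [0,1]
    [mn, k] = min(abs(W - x));       % best-matching (winning) unit
    h = exp(-((1:N) - k).^2 / (2*sigma^2));  % neighborhood function
    W = W + eta * h .* (x - W);      % pull winner and neighbors toward x
  end
  disp(W)                            % should be roughly monotonic across the map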

Recurrent networks

  1. Hopfield networks (see the sketch after this list)
  2. Pattern completion
  3. Line attractors and ‘bump circuits’
  4. Models of associative memory
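
A minimal Matlab sketch of pattern completion in a Hopfield network: one pattern is stored with the outer-product (Hebbian) rule, a corrupted copy is presented, and asynchronous threshold updates restore it. Network size and noise level are illustrative:

  % Hopfield network: store one {-1,+1} pattern, then recover it from a
  % corrupted copy by repeated asynchronous updates.
  n = 50;
  p = sign(randn(n, 1)); p(p == 0) = 1;   % random stored pattern
  W = (p * p') / n; W(1:n+1:end) = 0;     % Hebbian weights, no self-connections
  s = p; flip = randperm(n); flip = flip(1:10);
  s(flip) = -s(flip);                     % corrupt 10 of the 50 bits
  for sweep = 1:5
    for i = randperm(n)
      s(i) = sign(W(i, :) * s);           % asynchronous threshold update
      if s(i) == 0, s(i) = 1; end
    end
  end
  disp(sum(s == p))                       % should be 50: pattern recovered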

Probabilistic models and inference

  1. Probability theory and Bayes’ rule (see the sketch after this list)
  2. Learning and inference in generative models
  3. The mixture of Gaussians model
  4. Boltzmann machines
  5. Sparse coding and ‘ICA’
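
A minimal Matlab sketch of Bayes’ rule for a toy inference problem: given an observation generated by one of two Gaussian sources, compute the posterior probability of each source. The means, variance, and priors are illustrative:

  % Bayes' rule: posterior = likelihood * prior, normalized over hypotheses.
  mu = [-1, 2]; sigma = 1;           % class-conditional means, shared std
  prior = [0.7, 0.3];                % P(class)
  x = 0.5;                           % observed data point
  lik = exp(-(x - mu).^2 / (2*sigma^2)) / (sqrt(2*pi) * sigma);  % P(x | class)
  post = lik .* prior / sum(lik .* prior);                       % P(class | x)
  disp(post)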

Neural implementations

  1. Integrate-and-fire model (see the sketch after this list)
  2. Neural encoding and decoding
  3. Limits of precision in neurons
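
A minimal Matlab sketch of a leaky integrate-and-fire neuron driven by a constant injected current, integrated with the forward Euler method; all parameter values are illustrative:

  % Leaky integrate-and-fire: tau*dV/dt = -(V - Vrest) + R*I, with a spike
  % and reset whenever V crosses threshold.
  dt = 1e-4; T = 0.2; nsteps = round(T / dt);    % 0.1 ms steps, 200 ms total
  tau = 20e-3; R = 10e6;             % membrane time constant (s), resistance (ohm)
  Vrest = -70e-3; Vth = -54e-3; Vreset = -70e-3; % potentials (V)
  I = 2e-9;                          % injected current (A)
  V = zeros(1, nsteps); V(1) = Vrest; spikes = [];
  for t = 1:nsteps-1
    V(t+1) = V(t) + dt * (-(V(t) - Vrest) + R * I) / tau;  % Euler step
    if V(t+1) >= Vth                 % threshold crossing: spike, then reset
      spikes(end+1) = t * dt;
      V(t+1) = Vreset;
    end
  end
  fprintf('%d spikes in %g s\n', numel(spikes), T)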