# VS298 (Fall 06): Neural Computation

*Revision as of 06:02, 17 August 2006*

## People

**Professor:** Bruno Olshausen

- Email: baolshausen AT berkeley DOT edu
- Office: 10 Giannini
- Office hours: TBD

**GSI:** Amir Khosrowshahi

- Email: amirk AT berkeley DOT edu
- Office: 523 Minor, 3-5996
- Office hours: TBD

## Course description

This is a 3-unit course that provides an introduction to the theory of neural computation. The goal is to familiarize students with the major theoretical frameworks and models used in neuroscience and psychology, and to provide hands-on experience in using these models.

Relation to MCB 262: This course differs from MCB 262, *Advanced Topics in Systems Neuroscience*, in that it emphasizes the theoretical underpinnings of models, i.e., their mathematical and computational properties, as opposed to how they are used to analyze neuroscientific data. It will be offered in alternate years, interleaving with MCB 262. Students interested in computational neuroscience are encouraged to take both courses, as they complement each other.

#### Lectures

- Location: TBD
- Times: Two 1.5-hour lectures per week.

We will have an organizational meeting during the first week of class to determine a good time to meet.

#### Email list and forum

- Please email the GSI to be added to the class email list.
- A bulletin board is provided here for discussion regarding lecture material, readings, and problem sets.

#### Grading

Based on weekly homework assignments (60%) and a final project (40%).

#### Required background

Prerequisites are calculus, ordinary differential equations, basic probability and statistics, and linear algebra. Familiarity with programming in a high level language, ideally Matlab, is also required.

#### Textbooks

- [**HKP**] Hertz, J., Krogh, A., and Palmer, R.G. *Introduction to the Theory of Neural Computation.* Amazon
- [**DJCM**] MacKay, D.J.C. *Information Theory, Inference and Learning Algorithms.* Available online or Amazon
- [**DA**] Dayan, P. and Abbott, L.F. *Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems.* Amazon

Additional reading, such as primary source material, will be suggested on a lecture-by-lecture basis.

## Syllabus

#### Introduction

- Theory and modeling in neuroscience
- Descriptive vs. functional models
- Turing vs. neural computation

**Reading:** **HKP** chapter 1

#### Linear neuron models

- Linear systems: vectors, matrices, linear neuron models
- Perceptron model and linear separability

**Reading:** **HKP** chapter 5, **DJCM** chapters 38-40
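As a minimal illustration of the perceptron model (a NumPy sketch for convenience; the course itself assumes Matlab), a single linear threshold unit with hand-picked weights computes logical AND. No such unit can compute XOR, which is not linearly separable.

```python
import numpy as np

# Perceptron model: y = 1 if w.x + b > 0, else 0 -- a linear threshold unit.
def perceptron(x, w, b):
    return 1 if np.dot(w, x) + b > 0 else 0

w, b = np.array([1.0, 1.0]), -1.5   # decision line x1 + x2 = 1.5
inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
outputs = [perceptron(np.array(x), w, b) for x in inputs]
print(outputs)  # [0, 0, 0, 1], the truth table of AND
```

The same exhaustive check shows why XOR fails: no single line separates {(0,1), (1,0)} from {(0,0), (1,1)}.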

#### Supervised learning

- Perceptron learning rule
- Adaptation in linear neurons, Widrow-Hoff rule
- Objective functions and gradient descent
- Multilayer networks and backpropagation

**Reading:** **HKP** chapters 6 and 7, **DJCM** chapters 38-40
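As one illustration of these ideas, here is a minimal Widrow-Hoff (delta-rule) sketch in NumPy: stochastic gradient descent on the squared error of a single linear neuron. The teacher weights `w_true` and the learning rate are made-up demo values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear teacher: targets t = w_true . x (noiseless for the demo)
w_true = np.array([2.0, -1.0, 0.5])
X = rng.normal(size=(200, 3))
t = X @ w_true

# Widrow-Hoff (LMS / delta) rule: stochastic gradient descent on
# the per-sample squared error E = 0.5 * (t - w.x)^2
w = np.zeros(3)
eta = 0.05
for epoch in range(50):
    for xi, ti in zip(X, t):
        err = ti - w @ xi
        w += eta * err * xi   # delta-rule update: eta * error * input
print(w)  # approaches w_true
```

With a noiseless linear teacher the rule recovers the teacher's weights exactly; with noise it converges to the least-squares solution in expectation.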

#### Reinforcement learning

- Theory of associative reward-penalty
- Models and critics

**Reading:** **HKP** chapter 8, **DJCM** chapter 36, **DA** chapter 9
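A toy sketch in the spirit of reward-penalty learning (a simplified illustration, not the A_R-P algorithm from the readings; the bandit payoffs, learning rate, and penalty weighting are all invented for the demo): a stochastic binary unit adjusts a single weight from binary reward on a two-armed bandit, with penalized trials weighted less than rewarded ones.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy two-armed bandit: arm 1 pays reward with prob 0.8, arm 0 with prob 0.2
p_reward = [0.2, 0.8]

# Stochastic unit: picks arm 1 with probability sigmoid(w)
w = 0.0
eta = 0.1
for trial in range(5000):
    p1 = 1.0 / (1.0 + np.exp(-w))          # probability of choosing arm 1
    a = int(rng.random() < p1)             # stochastic action
    r = float(rng.random() < p_reward[a])  # binary reward
    grad = a - p1                          # d log p(action) / dw
    # reinforce the action taken when rewarded; weakly punish it when not
    w += eta * (r * grad - 0.1 * (1 - r) * grad)

p_final = 1.0 / (1.0 + np.exp(-w))
print(p_final)  # probability of choosing the better arm, near 1
```

The asymmetry between reward and penalty updates (the 0.1 factor) is the hallmark of reward-penalty schemes; setting it to zero gives a pure reward-inaction rule.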

#### Unsupervised learning

- Linear Hebbian learning and PCA, decorrelation
- Winner-take-all networks and clustering
- Sparse, distributed coding

**Reading:** **HKP** chapter 8, **DJCM** chapter 36, **DA** chapter 8
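A sketch of linear Hebbian learning with Oja-style normalization, which extracts the first principal component of its input. The 2-D covariance matrix and learning rate here are arbitrary demo choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Zero-mean data whose leading principal component is (1, 1)/sqrt(2)
C = np.array([[3.0, 2.0], [2.0, 3.0]])   # covariance; eigenvalues 5 and 1
X = rng.multivariate_normal([0, 0], C, size=5000)

# Oja's rule: Hebbian growth eta*y*x plus a decay term -eta*y^2*w that
# keeps |w| bounded; w converges to the top eigenvector of the covariance.
w = rng.normal(size=2)
eta = 0.01
for x in X:
    y = w @ x
    w += eta * y * (x - y * w)

pc1 = np.linalg.eigh(C)[1][:, -1]        # true leading eigenvector
print(abs(w @ pc1))                      # |cosine| with PC1, near 1
```

Plain Hebbian learning (`w += eta * y * x`) grows without bound; the subtracted `y*w` term is what stabilizes the norm and turns the rule into a PCA algorithm.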

#### Plasticity and cortical maps

- Self-organizing maps, Kohonen nets
- Models of experience dependent learning and cortical reorganization
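For illustration, a minimal 1-D Kohonen net in NumPy (map size, schedules, and neighborhood width are arbitrary demo choices): a chain of units learns a topographic map of uniform input on [0, 1], with the winner and its neighbors pulled toward each input.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1-D Kohonen self-organizing map over scalar inputs in [0, 1]
n_units = 10
w = rng.random(n_units)          # each unit's preferred input
for t in range(4000):
    x = rng.random()             # input drawn uniformly from [0, 1]
    win = np.argmin(np.abs(w - x))           # best-matching unit
    eta = 0.5 * (1 - t / 4000)               # decaying learning rate
    sigma = max(1e-3, 2.0 * (1 - t / 4000))  # shrinking neighborhood
    h = np.exp(-((np.arange(n_units) - win) ** 2) / (2 * sigma ** 2))
    w += eta * h * (x - w)       # move winner and neighbors toward x

print(np.round(np.sort(w), 2))   # weights spread across [0, 1]
```

The neighborhood function `h` is what distinguishes this from plain competitive learning: nearby units are dragged together, so neighboring units end up with neighboring preferred inputs, i.e., a topographic map.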

#### Recurrent networks

- Hopfield networks
- Pattern completion
- Line attractors and 'bump circuits'
- Models of associative memory
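A minimal Hopfield-network sketch in NumPy: random binary patterns stored with the Hebbian outer-product rule, then a corrupted cue completed by asynchronous threshold updates. Network size, number of patterns, and noise level are arbitrary demo choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Store P random +/-1 patterns in an N-unit Hopfield net
N, P = 100, 5
patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns) / N          # Hebbian outer-product rule
np.fill_diagonal(W, 0)                   # no self-connections

# Corrupt 10 bits of pattern 0, then let the dynamics clean it up
s = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
s[flip] *= -1

for sweep in range(5):
    for i in rng.permutation(N):         # asynchronous updates
        s[i] = 1 if W[i] @ s >= 0 else -1

print(np.mean(s == patterns[0]))         # fraction of bits recovered
```

At this load (P/N = 0.05, well below the ~0.14 capacity limit) the corrupted cue falls into the stored pattern's basin of attraction and is recovered.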

#### Probabilistic models and inference

- Probability theory and Bayes’ rule
- Learning and inference in generative models
- The mixture of Gaussians model
- Boltzmann machines
- Sparse coding and ‘ICA’
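The mixture-of-Gaussians model can be sketched with a few lines of EM in NumPy (simplified here to two 1-D components with fixed unit variances and equal mixing proportions; the data and initial means are made up for the demo):

```python
import numpy as np

rng = np.random.default_rng(0)

# Data from a 1-D mixture of two unit-variance Gaussians at -2 and 3
x = np.concatenate([rng.normal(-2, 1, 500), rng.normal(3, 1, 500)])

mu = np.array([-1.0, 1.0])       # initial guesses for the component means
for it in range(50):
    # E-step: responsibility of each component for each data point
    d = np.exp(-0.5 * (x[:, None] - mu[None, :]) ** 2)
    r = d / d.sum(axis=1, keepdims=True)
    # M-step: re-estimate each mean as a responsibility-weighted average
    mu = (r * x[:, None]).sum(axis=0) / r.sum(axis=0)

print(np.sort(mu))  # close to the true means (-2, 3)
```

The E-step is Bayesian inference (posterior over which component generated each point); the M-step is learning (maximum-likelihood re-estimation given that posterior), which is the general pattern for generative models in this section.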

#### Neural implementations

- Integrate-and-fire model
- Neural encoding and decoding
- Limits of precision in neurons
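A leaky integrate-and-fire sketch in NumPy (the constants are typical textbook-scale values chosen for illustration): a constant suprathreshold input drives the membrane potential to threshold, producing a regular spike train.

```python
import numpy as np

# Leaky integrate-and-fire: tau dV/dt = -(V - V_rest) + R*I,
# with a spike and reset whenever V crosses threshold.
tau, R = 0.02, 1.0               # membrane time constant (s), resistance
V_rest, V_th, V_reset = -0.065, -0.050, -0.065   # volts
dt, T = 1e-4, 0.5                # Euler time step and duration (s)
I = 0.020                        # constant input; R*I = 20 mV of drive

V = V_rest
spikes = []
for step in range(int(T / dt)):
    V += dt / tau * (-(V - V_rest) + R * I)
    if V >= V_th:
        spikes.append(step * dt)
        V = V_reset

print(len(spikes))               # number of spikes in 0.5 s
```

Since R*I = 20 mV exceeds the 15 mV gap between rest and threshold, the neuron fires periodically with interspike interval tau * ln(RI / (RI - (V_th - V_reset))), about 28 ms here; reduce I below 15 mV of drive and it never spikes.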