VS298: Neural Computation

== Course description ==

This is a 3-unit course that provides an introduction to the theory of neural computation. The goal is to familiarize students with the major theoretical frameworks and models used in neuroscience and psychology, and to provide hands-on experience in using these models.

This course differs from MCB 262, ''Advanced Topics in Systems Neuroscience,'' in that it emphasizes the theoretical underpinnings of models - i.e., their mathematical and computational properties - rather than their application to the analysis of neuroscientific data. It will be offered in alternate years, interleaving with MCB 262. Students interested in computational neuroscience are encouraged to take both of these courses as they complement each other.


== People ==
=== Instructors ===


[http://redwood.berkeley.edu/bruno Bruno Olshausen]
* Email: baolshausen AT berkeley DOT edu
* Office: 570 Evans
* Office hours: TBD


Amir Khosrowshahi, GSI
* Email: amirk AT berkeley DOT edu
* Office: 567 Evans
* Office hours: TBD


=== Lectures ===
*'''Location''': Evans 508-20
*'''Times''': 3 hours per week, to be determined at an organizational meeting held on Wednesday, August 27 at 5:00
*'''Telebears''': CCN '''TBD''', Section 02 LEC, 3 units, letter grade


=== Email list and forum ===
* Please subscribe to the class email list [http://lists.berkeley.edu here]. The list name is vs298-students.
* A bulletin board is provided [http://redwood.berkeley.edu/forum/index.php here] for discussion regarding lecture material, readings, and problem sets. Signup required for posting.


=== Grading ===
Grades are based on weekly homework assignments (60%) and a final project (40%).


=== Required background ===
Prerequisites are calculus, ordinary differential equations, basic probability and statistics, and linear algebra. Familiarity with programming in a high-level language, ideally Matlab, is also required.


=== Textbooks ===
* ['''HKP'''] Hertz, J., Krogh, A., and Palmer, R.G. ''Introduction to the theory of neural computation.'' [http://www.amazon.com/gp/product/0201515601/sr=8-1/qid=1155770696/ref=sr_1_1/103-4924722-3267011?ie=UTF8 Amazon]
* ['''DJCM'''] MacKay, D.J.C. ''Information Theory, Inference and Learning Algorithms.'' [http://www.inference.phy.cam.ac.uk/mackay/itila/book.html Available online] or [http://www.amazon.com/exec/obidos/redirect?tag=davidmackayca-20&path=tg/detail/-/0521642981/qid%3D1057850920/sr%3D1-4 Amazon]
* ['''DA'''] Dayan, P. and Abbott, L.F. ''Theoretical neuroscience: computational and mathematical modeling of neural systems.''

HKP and DA are available in paperback. Some copies of HKP and DJCM are available at the Berkeley bookstore. Additional reading, such as primary source material, will be suggested on a lecture-by-lecture basis.
