VS265: Reading

====Aug 28: Introduction ====
Optional:

====Sept 2: Neuron models ====

Background reading on dynamics, linear time-invariant systems and convolution, and differential equations:
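To give a concrete feel for the single-neuron dynamics this session covers, here is a minimal leaky integrate-and-fire simulation in Python/NumPy (an illustrative toy, not taken from the readings; all parameter values are arbitrary round numbers):
<pre>
import numpy as np

# Leaky integrate-and-fire neuron, forward-Euler integration.
# tau dV/dt = -(V - V_rest) + R*I ; spike and reset when V crosses threshold.
dt, T = 0.1e-3, 0.5           # time step (s), total duration (s)
tau, R = 20e-3, 1e7           # membrane time constant (s), input resistance (ohm)
V_rest, V_thresh, V_reset = -70e-3, -54e-3, -70e-3   # volts
I = 2e-9                      # constant input current (A)

t = np.arange(0.0, T, dt)
V = np.full(t.shape, V_rest)
spikes = []
for k in range(1, len(t)):
    dV = (-(V[k-1] - V_rest) + R * I) * (dt / tau)
    V[k] = V[k-1] + dV
    if V[k] >= V_thresh:      # threshold crossing: record spike, reset
        spikes.append(t[k])
        V[k] = V_reset

print(f"{len(spikes)} spikes in {T} s -> rate ~ {len(spikes)/T:.1f} Hz")
</pre>
With these (arbitrary) settings the steady-state voltage sits above threshold, so the model fires tonically; lowering I below about 1.6 nA keeps it subthreshold.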

====Sept 4: Linear neuron, Perceptron ====

Background on linear algebra:
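As a concrete companion to the perceptron discussion, a small self-contained sketch of the classic perceptron learning rule on synthetic, linearly separable data (illustrative only; the data, learning rate, and "teacher" hyperplane are made up for the example):
<pre>
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linearly separable data: label by a fixed "teacher" hyperplane.
X = rng.normal(size=(200, 2))
w_true = np.array([1.5, -2.0])
y = np.where(X @ w_true + 0.5 > 0, 1, -1)

# Perceptron rule: w <- w + eta * y_i * x_i whenever example i is misclassified.
X_aug = np.hstack([X, np.ones((X.shape[0], 1))])   # append a bias input
w = np.zeros(3)
eta = 1.0
for epoch in range(100):
    errors = 0
    for x_i, y_i in zip(X_aug, y):
        if y_i * (w @ x_i) <= 0:       # misclassified (or on the boundary)
            w += eta * y_i * x_i
            errors += 1
    if errors == 0:                     # converged: all examples correct
        break

print("learned weights:", w, "errors in last epoch:", errors)
</pre>
Because the data are linearly separable by construction, the loop terminates with zero training errors (perceptron convergence theorem).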

====Sept 11: Multicompartment models, dendritic integration ====

====Sept. 16, 18: Supervised learning ====

* '''HKP''' chapters 5, 6
* Handout on supervised learning in single-stage feedforward networks
* Handout on supervised learning in multi-layer feedforward networks - "back propagation" (see the sketch below)
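A minimal illustration of the back-propagation procedure named in the handout above: a one-hidden-layer sigmoid network trained by batch gradient descent on the XOR toy problem (a sketch only, not the handout's notation or code; network size, learning rate, and iteration count are arbitrary):
<pre>
import numpy as np

rng = np.random.default_rng(0)

# XOR toy problem: not linearly separable, so it needs a hidden layer.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

# One hidden layer of 4 sigmoid units, one sigmoid output unit.
W1 = rng.normal(scale=1.0, size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=1.0, size=(4, 1)); b2 = np.zeros(1)
eta = 0.5

for step in range(20000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    y = sigmoid(h @ W2 + b2)
    # backward pass: squared-error loss, error signals layer by layer
    delta2 = (y - T) * y * (1 - y)            # output-layer error signal
    delta1 = (delta2 @ W2.T) * h * (1 - h)    # back-propagated hidden error
    W2 -= eta * h.T @ delta2; b2 -= eta * delta2.sum(axis=0)
    W1 -= eta * X.T @ delta1; b1 -= eta * delta1.sum(axis=0)

print("outputs after training:", y.ravel().round(2))
</pre>
The two lines computing delta2 and delta1 are the whole of back propagation here: the output error is passed backwards through W2 and rescaled by the hidden units' sigmoid derivatives. The run typically ends near the 0/1 targets; if it lands in a poor local minimum, a different random seed helps.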

Further reading:

====Sept. 23, 24: Unsupervised learning ====

* '''HKP''' chapters 8 and 9, '''DJCM''' chapter 36, '''DA''' chapters 8, 10
* Handout: Hebbian learning and PCA (see the sketch after this list)
* '''PDP''' chapter 9 (full text of Michael Jordan's tutorial on linear algebra, including section on eigenvectors)
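To make the link between Hebbian learning and PCA concrete, here is a sketch using Oja's rule, a normalized variant of plain Hebbian learning (illustrative only; the covariance matrix, learning rate, and sample count are arbitrary choices for the example):
<pre>
import numpy as np

rng = np.random.default_rng(0)

# Zero-mean 2-D data with a dominant direction of variance.
C = np.array([[3.0, 1.0],
              [1.0, 1.0]])                  # covariance matrix
X = rng.multivariate_normal(mean=[0, 0], cov=C, size=5000)

# Oja's rule: dw = eta * y * (x - y * w), with y = w . x.
# Plain Hebbian learning (dw = eta * y * x) grows without bound;
# Oja's subtractive term keeps ||w|| near 1 and drives w to the top eigenvector.
w = rng.normal(size=2)
eta = 0.01
for x in X:
    y = w @ x
    w += eta * y * (x - y * w)

top_eig = np.linalg.eigh(C)[1][:, -1]        # eigenvector of the largest eigenvalue
print("learned w:", w / np.linalg.norm(w))
print("top eigenvector of C:", top_eig)
</pre>
Up to an overall sign flip, the learned weight vector should line up with the leading eigenvector of C, i.e., the first principal component of the data.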

Optional:
* Atick, Redlich. [http://redwood.berkeley.edu/vs265/Atick-Redlich-NC92.pdf What does the retina know about natural scenes?], Neural Computation, 1992.
* Dan, Atick, Reid. [http://www.jneurosci.org/cgi/reprint/16/10/3351.pdf Efficient Coding of Natural Scenes in the Lateral Geniculate Nucleus: Experimental Test of a Computational Theory], J Neuroscience, 1996.

====Sept 30, Oct 2: Attractor Networks and Associative Memories ====
* '''HKP''' chapters 2, 3 (sec. 3.3-3.5), 7 (sec. 7.2-7.3), '''DJCM''' chapter 42, '''DA''' chapter 7
* [http://redwood.berkeley.edu/vs265/attractor-networks.pdf Handout] on attractor networks - their learning, dynamics, and how they differ from feed-forward networks (see the sketch after this list)
* [http://redwood.berkeley.edu/vs265/hopfield82.pdf Hopfield82]
* [http://redwood.berkeley.edu/vs265/hopfield84.pdf Hopfield84]
* Willshaw69
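As a toy companion to the attractor-network handout and the Hopfield papers above, a sketch of a binary Hopfield network with Hebbian (outer-product) storage and asynchronous threshold updates (illustrative only; the network size, number of patterns, and noise level are arbitrary):
<pre>
import numpy as np

rng = np.random.default_rng(0)

# Store a few random +/-1 patterns with the Hebbian outer-product rule,
# then recall one of them from a corrupted cue by asynchronous updates.
N, P = 100, 5
patterns = rng.choice([-1, 1], size=(P, N))

W = (patterns.T @ patterns) / N             # Hebbian weight matrix
np.fill_diagonal(W, 0)                       # no self-connections

def recall(cue, n_sweeps=10):
    s = cue.copy()
    for _ in range(n_sweeps):
        for i in rng.permutation(N):         # asynchronous, random-order updates
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Corrupt 15% of the bits of the first stored pattern and try to recover it.
cue = patterns[0].copy()
flip = rng.choice(N, size=15, replace=False)
cue[flip] *= -1

out = recall(cue)
print("overlap with stored pattern:", (out @ patterns[0]) / N)
</pre>
With 5 patterns in 100 units the network is well below the roughly 0.14N capacity limit, so the corrupted cue should settle back into the stored pattern (overlap close to 1).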

====Oct 7: Ecological utility and the mythical neural code ====
* [ftp://ftp.icsi.berkeley.edu/pub/feldman/eeu.pdf Feldman10] Ecological utility and the mythical neural code

<!-- plasticity and cortical maps
* '''Reading''': '''HKP''' chapter 9, '''DA''' chapter 8 -->
<!-- probabilistic models and inference
* '''Reading''': '''HKP''' chapter 7 (sec. 7.1), '''DJCM''' chapters 1-3, 20-24, 41, 43, '''DA''' chapter 10 -->
<!-- neural implementations
* '''Reading''': '''DA''' chapters 1-4, 5.4 -->