VS265: Reading

==== Aug 28: Introduction ====

Optional:

==== Sept 2: Neuron models ====

Background reading on dynamics, linear time-invariant systems and convolution, and differential equations:
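These are background readings only, but the LTI point can be made concrete in a few lines: a leaky integrator, tau dx/dt = -x + u(t), is a linear time-invariant system, so integrating the differential equation directly and convolving the input with the exponential impulse response give the same trace. A minimal Python/NumPy sketch (all constants are arbitrary choices, not from the readings):

<pre>
import numpy as np

dt, tau = 0.001, 0.02                         # time step and time constant (s), arbitrary
t = np.arange(0, 1.0, dt)
u = ((t > 0.1) & (t < 0.3)).astype(float)     # a made-up pulse of input

# Direct forward-Euler integration of tau dx/dt = -x + u.
x = np.zeros_like(t)
for k in range(1, len(t)):
    x[k] = x[k-1] + dt * (-x[k-1] + u[k-1]) / tau

# LTI view: convolve u with the impulse response h(t) = exp(-t/tau)/tau.
h = np.exp(-t / tau) / tau
x_conv = np.convolve(u, h)[:len(t)] * dt

print(np.max(np.abs(x - x_conv)))             # nearly zero, up to discretization error
</pre>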

==== Sept 4: Linear neuron, Perceptron ====

Background on linear algebra:
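For concreteness, a minimal sketch of the perceptron learning rule covered in this lecture, run on made-up linearly separable data (the data, initialization, and epoch budget are illustrative assumptions, not from the readings):

<pre>
import numpy as np

rng = np.random.default_rng(0)

# Made-up linearly separable data: 100 points in 2-D, labels in {-1, +1}.
X = rng.normal(size=(100, 2))
y = np.where(X @ np.array([1.5, -0.7]) + 0.3 > 0, 1, -1)

# Absorb the bias by appending a constant-1 input.
Xa = np.hstack([X, np.ones((100, 1))])
w = np.zeros(3)

# Perceptron rule: nudge w toward y*x for every misclassified example.
for epoch in range(50):
    errors = 0
    for xi, ti in zip(Xa, y):
        if ti * (w @ xi) <= 0:      # wrong side of (or on) the boundary
            w += ti * xi
            errors += 1
    if errors == 0:                 # converged, as guaranteed on separable data
        break

print(epoch, w)
</pre>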

==== Sept 11: Multicompartment models, dendritic integration (Rhodes guest lecture) ====

==== Sept. 16, 18: Supervised learning ====

* HKP Chapters 5, 6
* Handout on supervised learning in single-stage feedforward networks
* Handout on supervised learning in multi-layer feedforward networks - "back propagation" (see the sketch after this list)
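As a companion to the back-propagation handout, a minimal two-layer sigmoid network trained on XOR, the standard example of a problem a single-stage network cannot solve. The architecture, learning rate, and step count are illustrative choices, not the handout's:

<pre>
import numpy as np

rng = np.random.default_rng(1)

# XOR: requires a hidden layer.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)   # hidden layer
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)   # output layer
sig = lambda a: 1.0 / (1.0 + np.exp(-a))
eta = 1.0

for step in range(10000):
    h = sig(X @ W1 + b1)               # forward pass
    y = sig(h @ W2 + b2)
    d2 = (y - t) * y * (1 - y)         # output delta (squared-error loss)
    d1 = (d2 @ W2.T) * h * (1 - h)     # hidden delta, via the chain rule
    W2 -= eta * h.T @ d2; b2 -= eta * d2.sum(axis=0)
    W1 -= eta * X.T @ d1; b1 -= eta * d1.sum(axis=0)

print(y.ravel().round(2))              # should approach [0, 1, 1, 0]
</pre>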

Further reading:

==== Sept. 23, 24: Unsupervised learning ====

* HKP Chapters 8 and 9, DJCM chapter 36, DA chapters 8 and 10
* Handout: Hebbian learning and PCA (see the sketch after this list)
* PDP Chapter 9 (full text of Michael Jordan's tutorial on linear algebra, including section on eigenvectors)
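A quick illustration of the Hebbian-learning-and-PCA connection covered in the handout: Oja's rule, a Hebbian update with a built-in decay, drives a single linear neuron's weight vector toward the first principal component of its input. The covariance, learning rate, and sample count below are made up for illustration:

<pre>
import numpy as np

rng = np.random.default_rng(2)

# Zero-mean data with an elongated covariance, so the first PC is well defined.
C = np.array([[3.0, 1.0], [1.0, 1.0]])
X = rng.multivariate_normal([0.0, 0.0], C, size=20000)

w = rng.normal(size=2)
eta = 0.005
for x in X:
    v = w @ x
    w += eta * v * (x - v * w)     # Oja's rule: Hebbian term plus decay

# The weight vector should align (up to sign) with the top eigenvector of C.
print(w / np.linalg.norm(w), np.linalg.eigh(C)[1][:, -1])
</pre>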

Optional:

==== Sept 30, Oct 2: Attractor Networks and Associative Memories (Sommer guest lectures) ====

  • "HKP" Chapter 2 and 3 (sec. 3.3-3.5), 7 (sec. 7.2-7.3), DJCM chapter 42, DA chapter 7
  • Handout on attractor networks - their learning, dynamics and how they differ from feed-forward networks
  • Hopfield82
  • Hopfield84
  • Willshaw69
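To make the attractor-network material concrete, a minimal Hopfield network in the style of Hopfield82: Hebbian (outer-product) storage, asynchronous threshold updates, and recall from a corrupted cue. Pattern size, memory load, and noise level are arbitrary illustrative choices:

<pre>
import numpy as np

rng = np.random.default_rng(3)

N, P = 100, 5                            # neurons and stored patterns (5% load)
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian outer-product storage; zero the self-connections.
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0)

# Start from a corrupted copy of pattern 0 and update asynchronously.
s = patterns[0].copy()
flip = rng.choice(N, size=15, replace=False)
s[flip] *= -1

for _ in range(5):                       # a few sweeps usually reach a fixed point
    for i in rng.permutation(N):
        s[i] = 1 if W[i] @ s >= 0 else -1

print(np.mean(s == patterns[0]))         # overlap with the stored pattern
</pre>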

==== Oct 7: Ecological utility and the mythical neural code (Feldman guest lecture) ====

* Feldman10, Ecological utility and the mythical neural code

==== Oct 9: Hyperdimensional computing (Kanerva guest lecture) ====

==== Oct 16: Structural and Functional Connectomics (Tom Dean guest lecture) ====

==== 21, 23, 28 Oct ====

Additional readings:

==== 30 Oct, 4 Nov ====

Optional:

Re-organization in response to cortical lesions:

==== 6 Nov ====

Additional reading:

==== 6, 13 Nov ====

==== 13, 18 Nov ====

==== 20, 25 Nov ====

==== 2 Dec ====

** Kevin Murphy's [http://redwood.berkeley.edu/vs265/murphy-hmm.pdf HMM tutorial]

==== 4 Dec ====

* DA Chapter 10
* [http://redwood.berkeley.edu/vs265/info-theory.pdf Information theory primer]
* [http://redwood.berkeley.edu/vs265/handout-sparse-08.pdf Sparse coding and ICA handout]
* Jascha Sohl-Dickstein, [http://arxiv.org/abs/1205.1828 The Natural Gradient by Analogy to Signal Whitening, and Recipes and Tricks for its Use]
* Jascha Sohl-Dickstein, [http://redwood.berkeley.edu/vs265/jascha-cookbook.pdf Natural gradient cookbook]
* Bell & Sejnowski, [http://redwood.berkeley.edu/vs265/tony-ica.pdf An Information-Maximization Approach to Blind Separation and Blind Deconvolution], Neural Comp, 1995. (see the sketch after this list)
* Shan & Cottrell paper on [http://redwood.berkeley.edu/vs265/hshan-nips06.pdf Recursive ICA], NIPS 2006.
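To connect the natural-gradient readings with the Bell & Sejnowski paper, a minimal sketch of infomax ICA using the natural-gradient form of the update, dW = eta (I - E[tanh(y) y^T]) W. The Laplacian sources, mixing matrix, learning rate, and iteration count are illustrative assumptions, not taken from the papers:

<pre>
import numpy as np

rng = np.random.default_rng(4)

# Two super-Gaussian (Laplacian) sources, mixed by an "unknown" matrix A.
S = rng.laplace(size=(2, 10000))
A = np.array([[1.0, 0.6], [0.4, 1.0]])
X = A @ S

W = np.eye(2)
eta = 0.05
for _ in range(500):
    Y = W @ X
    # Natural-gradient infomax update: (I - E[tanh(y) y^T]) W.
    # This is the ordinary infomax gradient right-multiplied by W^T W,
    # which removes the matrix inverse and speeds convergence.
    W += eta * (np.eye(2) - np.tanh(Y) @ Y.T / Y.shape[1]) @ W

print((W @ A).round(2))   # should approach a scaled permutation matrix
</pre>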

==== 9 Dec ====

Kalman filter:
* Robbie Jacobs' [http://www.bcs.rochester.edu/people/robbie/jacobslab/cheat_sheet/sensoryIntegration.pdf notes on Kalman filter]
* [http://redwood.berkeley.edu/vs265/kalman.m kalman.m] demo script (see the sketch after this list)
* Greg Welch's [http://www.cs.unc.edu/~welch/kalman/kalmanIntro.html tutorial on Kalman filter]
* [http://vision.ucla.edu/~doretto/projects/dynamic-textures.html Dynamic texture models]
* Kevin Murphy's [http://redwood.berkeley.edu/vs265/murphy-hmm.pdf HMM tutorial]
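A minimal Python analogue of a Kalman-filter demo (a generic sketch, not a transcription of kalman.m): a scalar random walk tracked through noisy observations, showing the predict/update cycle and that the filtered estimate beats the raw measurements. The state-space model and noise levels are assumptions:

<pre>
import numpy as np

rng = np.random.default_rng(5)

# Scalar random walk observed in Gaussian noise:
#   x_t = x_{t-1} + w_t,  w ~ N(0, Q);    z_t = x_t + v_t,  v ~ N(0, R)
Q, R, T = 0.01, 1.0, 500
x = np.cumsum(rng.normal(scale=np.sqrt(Q), size=T))   # true hidden state
z = x + rng.normal(scale=np.sqrt(R), size=T)          # noisy observations

xhat, P = 0.0, 1.0                 # initial state estimate and its variance
est = np.empty(T)
for k in range(T):
    P += Q                         # predict: the walk inflates uncertainty
    K = P / (P + R)                # Kalman gain: how much to trust the data
    xhat += K * (z[k] - xhat)      # update: blend prediction and measurement
    P *= (1 - K)
    est[k] = xhat

print(np.mean((est - x) ** 2), np.mean((z - x) ** 2))  # filter beats raw data
</pre>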

Spiking neurons:
<!-- neural implementations
* DA chapters 1-4, 5.4
* Karklin & Simoncelli, [http://redwood.berkeley.edu/vs265/karklin-simoncelli.pdf Efficient coding of natural images with a population of noisy Linear-Nonlinear neurons], NIPS 2011.
-->
* Chris Eliasmith, Charlie Anderson, [http://books.google.com/books?id=J6jz9s4kbfIC Neural Engineering: Computation, Representation, and Dynamics in Neurobiological Systems], MIT Press, 2004. (Chapter 4 will be emailed to the class.)
* Softky and Koch, [http://redwood.berkeley.edu/vs265/softky-koch-jn93.pdf The Highly Irregular Firing of Cortical Cells Is Inconsistent with Temporal Integration of Random EPSPs], J Neuroscience, January 1993, 13(1):334-350.
* Mainen and Sejnowski, [http://redwood.berkeley.edu/vs265/mainen-sejnowski.pdf Reliability of Spike Timing in Neocortical Neurons], Science, Vol 268, 6 June 1995.
* Shadlen and Newsome, [http://redwood.berkeley.edu/vs265/shadlen-newsome1.pdf Noise, neural codes and cortical organization], Curr Opin Neurobiol, 1994, 4:569-579.
* Shadlen and Newsome, [http://redwood.berkeley.edu/vs265/shadlen-newsome1.pdf Is there a signal in the noise?], Curr Opin Neurobiol, 1995, 5:248-250.
* Softky, [http://redwood.berkeley.edu/vs265/softky-commentary.pdf Simple codes versus efficient codes], Curr Opin Neurobiol, 1995, 5:239-247.
* Izhikevich, [http://redwood.berkeley.edu/vs265/izhikevich-nn03.pdf Simple model of spiking neurons], IEEE Trans Neural Networks, 14(6):2003. (see the sketch after this list)
* Izhikevich, [http://redwood.berkeley.edu/vs265/izhikevich-which-nn04.pdf Which Model to Use for Cortical Spiking Neurons?], IEEE Trans Neural Networks, 15(5):2004.
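Since the Izhikevich papers are listed above, a minimal simulation of his simple spiking model, using the regular-spiking parameters (a, b, c, d) = (0.02, 0.2, -65, 8) from the 2003 paper; the input current, step size, and duration are arbitrary choices:

<pre>
import numpy as np

# Izhikevich (2003) simple model:
#   dv/dt = 0.04 v^2 + 5 v + 140 - u + I
#   du/dt = a (b v - u);  when v >= 30 mV: v <- c, u <- u + d
a, b, c, d = 0.02, 0.2, -65.0, 8.0   # regular-spiking parameters (2003 paper)
dt, T, I = 0.5, 1000.0, 10.0         # Euler step (ms), duration (ms), input

v, u = c, b * c
spike_times = []
for k in range(int(T / dt)):
    v += dt * (0.04 * v * v + 5 * v + 140 - u + I)   # crude forward Euler
    u += dt * a * (b * v - u)
    if v >= 30.0:                    # spike: record it, then reset
        spike_times.append(k * dt)
        v = c
        u += d

print(len(spike_times), "spikes in", T, "ms")
</pre>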