VS265: Reading

From RedwoodCenter

==== 13,18 Nov ====


* HKP Chapter 7, sections 7.2, 7.3
* [http://redwood.berkeley.edu/vs265/jaeger04-ESN.pdf Jaeger, echo state networks]
* LSTMs
* [http://www.cs.toronto.edu/~graves/phd.pdf Alex Graves' thesis], see Chapter 4
* [http://redwood.berkeley.edu/vs265/sussillo-dynamical-systems-curropin.pdf Sussillo, dynamical systems in neuroscience]

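The Jaeger paper's key idea is to train only a linear readout on top of a fixed random recurrent "reservoir." A minimal NumPy sketch of that idea (the reservoir size, spectral radius, and sine-prediction task are illustrative choices, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random reservoir, scaled so the spectral radius is below 1
# (Jaeger's "echo state property" heuristic). Sizes are illustrative.
N = 200
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius -> 0.9
W_in = rng.uniform(-0.5, 0.5, size=N)

# Toy task: next-step prediction of a sine wave
T = 1000
u = np.sin(0.1 * np.arange(T + 1))

# Drive the reservoir and record its states
x = np.zeros(N)
X = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])
    X[t] = x

# Only the linear readout is trained (ridge regression); the recurrent
# weights stay fixed -- that is the whole point of the architecture.
washout = 100                                     # discard the transient
A, y = X[washout:], u[washout + 1:T + 1]
w_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(N), A.T @ y)

pred = X @ w_out
err = np.sqrt(np.mean((pred[washout:] - u[washout + 1:T + 1]) ** 2))
print(f"next-step RMSE: {err:.4f}")
```

Because the recurrent weights are never trained, learning reduces to a single least-squares solve, which is what makes the approach attractive compared with backpropagation through time.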

==== 20,25 Nov ====


<!-- probabilistic models and inference
* HKP chapter 7 (sec. 7.1),DJCM chapter 1-3, 20-24,41,43, DA chapter 10
* '''Reading''': '''HKP''' chapter 7 (sec. 7.1),'''DJCM''' chapter 1-3, 20-24,41,43, '''DA''' chapter 10 -->
* [https://www.dropbox.com/s/naezga6niejfw1n/chapter_preprint.pdf?dl=0 Olshausen (2014) Perception as an Inference Problem]
<!-- ICA
* [http://redwood.berkeley.edu/vs265/probability.pdf A probability primer]
* [http://redwood.berkeley.edu/vs265/bayes-prob.pdf Bayesian probability theory and generative models]
* [http://redwood.berkeley.edu/vs265/mog.pdf Mixture of Gaussians model]
* T.J. Loredo, [http://redwood.berkeley.edu/vs265/loredo-laplace-supernova.pdf From Laplace to supernova SN1987A: Bayesian inference in astrophysics] -->

==== 2 Dec ====

* Crick and Mitchison's theory of 'unlearning' during sleep - [http://redwood.berkeley.edu/vs265/crick-mitchison-sleep.pdf paper]
* Application of Boltzmann machines to neural data analysis:<br>
** E. Schneidman, M.J. Berry, R. Segev and W. Bialek, [http://www.nature.com/nature/journal/v440/n7087/full/nature04701.html Weak pairwise correlations imply strongly correlated network states in a neural population], Nature 440 (7087) (2006), pp. 1007-1012.<br>
** J. Shlens, G.D. Field, J.L. Gauthier, M.I. Grivich, D. Petrusca, A. Sher, A.M. Litke and E.J. Chichilnisky, [http://www.jneurosci.org/cgi/content/abstract/26/32/8254 The structure of multi-neuron firing patterns in primate retina], J Neurosci 26 (32) (2006), pp. 8254-8266.<br>
** U. Koster, J. Sohl-Dickstein, C.M. Gray, B.A. Olshausen, [http://www.ploscompbiol.org/article/info%3Adoi%2F10.1371%2Fjournal.pcbi.1003684 Modeling Higher-Order Correlations within Cortical Microcolumns], PLOS Computational Biology, July 2014.
* Kalman filter
** Robbie Jacobs' [http://www.bcs.rochester.edu/people/robbie/jacobslab/cheat_sheet/sensoryIntegration.pdf notes on Kalman filter]
** [http://redwood.berkeley.edu/vs265/kalman.m kalman.m] demo script
** Greg Welch's [http://www.cs.unc.edu/~welch/kalman/kalmanIntro.html tutorial on Kalman filter]
** [http://vision.ucla.edu/~doretto/projects/dynamic-textures.html Dynamic texture models]
** Kevin Murphy's [http://redwood.berkeley.edu/vs265/murphy-hmm.pdf HMM tutorial]

==== 4 Dec ====

* DA Chapter 10
* [http://redwood.berkeley.edu/vs265/info-theory.pdf Information theory primer]
* [http://redwood.berkeley.edu/vs265/handout-sparse-08.pdf Sparse coding and ICA handout]
* Jascha Sohl-Dickstein, [http://arxiv.org/abs/1205.1828 The Natural Gradient by Analogy to Signal Whitening, and Recipes and Tricks for its Use]
* Jascha Sohl-Dickstein, [http://redwood.berkeley.edu/vs265/jascha-cookbook.pdf Natural gradient cookbook]
* Bell & Sejnowski, [http://redwood.berkeley.edu/vs265/tony-ica.pdf An Information-Maximization Approach to Blind Separation and Blind Deconvolution], Neural Comp, 1995.
* Simoncelli & Olshausen, [http://redwood.berkeley.edu/vs265/simoncelli01-reprint.pdf Natural Image Statistics and Neural Representation], Annu. Rev. Neurosci. 2001. 24:1193–216.
* van Hateren & Ruderman, [http://redwood.berkeley.edu/vs265/vanhateren-ruderman98.pdf Independent component analysis of natural image sequences], Proc. R. Soc. Lond. B (1998) 265. (blocked sparse coding/ICA of video)
<!-- neural implementations
* Hyvarinen, Hoyer, Inki, [http://redwood.berkeley.edu/vs265/TICA.pdf Topographic Independent Component Analysis], Neural Comp, 2001.
* '''Reading''': '''DA''' chapter 1-4, 5.4 -->
* Karklin & Lewicki paper on  [http://redwood.berkeley.edu/vs265/karklin-lewicki2003.pdf Learning Higher-Order Structure in Natural Images], Network 2003.
* Shan & Cottrell paper on [http://redwood.berkeley.edu/vs265/hshan-nips06.pdf Recursive ICA], NIPS 2006.
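The Bell & Sejnowski infomax rule, combined with the natural gradient covered in Sohl-Dickstein's notes, reduces to the update ΔW ∝ (I − g(y)yᵀ)W with g(y) = tanh(y) for super-Gaussian sources. A toy sketch of that update (the mixing matrix, learning rate, and iteration count are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent super-Gaussian (Laplacian) sources, linearly mixed
T = 20000
S = rng.laplace(size=(2, T))
A = np.array([[1.0, 0.6], [0.4, 1.0]])            # made-up mixing matrix
X = A @ S

# Infomax ICA with the natural gradient (Amari):
#   dW = lr * (I - g(y) y^T) W,  g(y) = tanh(y) for super-Gaussian sources
W = np.eye(2)
lr = 0.05
for _ in range(1000):
    Y = W @ X
    W += lr * (np.eye(2) - np.tanh(Y) @ Y.T / T) @ W

# Each recovered component should match one source up to sign and scale
Y = W @ X
corr = np.abs(np.corrcoef(np.vstack([Y, S]))[:2, 2:])
print(np.round(corr, 2))
```

The natural-gradient form avoids a matrix inverse per step (the plain gradient needs W⁻ᵀ), which is the "recipe" advantage emphasized in the Sohl-Dickstein references above.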
 
==== 9 Dec ====
 
Kalman filter:
* Robbie Jacobs' [http://www.bcs.rochester.edu/people/robbie/jacobslab/cheat_sheet/sensoryIntegration.pdf notes on Kalman filter]
* [http://redwood.berkeley.edu/vs265/kalman.m kalman.m] demo script
* Greg Welch's [http://www.cs.unc.edu/~welch/kalman/kalmanIntro.html  tutorial on Kalman filter]
* [http://vision.ucla.edu/~doretto/projects/dynamic-textures.html Dynamic texture models]
* Kevin Murphy's [http://redwood.berkeley.edu/vs265/murphy-hmm.pdf  HMM tutorial]
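The predict/update cycle covered in the Welch tutorial and the kalman.m demo can be sketched in Python as follows. This is a generic constant-velocity tracking example, not a translation of kalman.m; the model matrices and noise levels are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Linear-Gaussian state-space model (1-D constant-velocity tracking):
#   x_t = F x_{t-1} + w,  w ~ N(0, Q);   z_t = H x_t + v,  v ~ N(0, R)
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])    # state: [position, velocity]
H = np.array([[1.0, 0.0]])               # we observe noisy position only
Q = 0.01 * np.eye(2)
R = np.array([[1.0]])

# Simulate a trajectory and its noisy measurements
T = 200
x_true = np.zeros((T, 2))
z = np.zeros(T)
x = np.array([0.0, 0.5])
for t in range(T):
    x = F @ x + rng.multivariate_normal(np.zeros(2), Q)
    x_true[t] = x
    z[t] = x[0] + rng.normal(0.0, np.sqrt(R[0, 0]))

# Kalman filter: alternate prediction and measurement-update steps
x_hat = np.zeros(2)
P = np.eye(2)
est = np.zeros((T, 2))
for t in range(T):
    # predict
    x_hat = F @ x_hat
    P = F @ P @ F.T + Q
    # update
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman gain
    innov = z[t] - (H @ x_hat)[0]                 # measurement residual
    x_hat = x_hat + K.ravel() * innov
    P = (np.eye(2) - K @ H) @ P
    est[t] = x_hat

raw_mse = np.mean((z - x_true[:, 0]) ** 2)
kf_mse = np.mean((est[:, 0] - x_true[:, 0]) ** 2)
print(f"raw MSE {raw_mse:.3f} vs filtered MSE {kf_mse:.3f}")
```

The filtered position error should come out well below the raw measurement error, which is the point of the gain K trading off prediction against observation.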
 
Spiking neurons:
* DA chapters 1-4, 5.4  
* Karklin & Simoncelli, [http://redwood.berkeley.edu/vs265/karklin-simoncelli.pdf Efficient coding of natural images with a population of noisy Linear-Nonlinear neurons], NIPS 2011.
* Chris Eliasmith, Charlie Anderson, [http://books.google.com/books?id=J6jz9s4kbfIC Neural Engineering:  Computation, Representation, and Dynamics in Neurobiological Systems], MIT Press, 2004.  (Chapter 4 will be emailed to the class.)
* Softky and Koch, [http://redwood.berkeley.edu/vs265/softky-koch-jn93.pdf The Highly Irregular Firing of Cortical Cells Is Inconsistent with Temporal Integration of Random EPSPs], J Neuroscience, January 1993, 13(1):334-350.
* Mainen and Sejnowski, [http://redwood.berkeley.edu/vs265/mainen-sejnowski.pdf Reliability of Spike Timing in Neocortical Neurons], Science, Vol 268, 6 June 1995.
* Shadlen and Newsome, [http://redwood.berkeley.edu/vs265/shadlen-newsome1.pdf Noise, neural codes and cortical organization], Current Opinion in Neurobiology, 1994, 4:569-579.
* Shadlen and Newsome, [http://redwood.berkeley.edu/vs265/shadlen-newsome1.pdf Is there a signal in the noise?], Current Opinion in Neurobiology, 1995, 5:248-250.
* Softky, [http://redwood.berkeley.edu/vs265/softky-commentary.pdf Simple codes versus efficient codes], Current Opinion in Neurobiology, 1995, 5:239-247.
* Izhikevich, [http://redwood.berkeley.edu/vs265/izhikevich-nn03.pdf Simple model of spiking neurons], IEEE Trans Neur Networks, 14(6):2003.
* Izhikevich, [http://redwood.berkeley.edu/vs265/izhikevich-which-nn04.pdf Which Model to Use for Cortical Spiking Neurons?], IEEE Trans Neur Networks, 15(5):2004.
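The Izhikevich "simple model" above is a two-equation system with a reset rule, small enough to port directly. A sketch using the regular-spiking parameters from the 2003 paper (the injected current is an arbitrary choice for the demo):

```python
# Izhikevich (2003) "simple model", Euler-integrated:
#   v' = 0.04 v^2 + 5 v + 140 - u + I
#   u' = a (b v - u);   if v >= 30 mV: v <- c, u <- u + d
a, b, c, d = 0.02, 0.2, -65.0, 8.0   # regular-spiking parameters (paper)

dt = 0.5                             # ms, integration step
steps = int(1000 / dt)               # simulate 1 s
v, u = -65.0, b * (-65.0)
I = 10.0                             # constant input current, arbitrary

spike_times = []
for t in range(steps):
    if v >= 30.0:                    # spike: record it, then reset
        spike_times.append(t * dt)
        v, u = c, u + d
    v += dt * (0.04 * v**2 + 5 * v + 140 - u + I)
    u += dt * a * (b * v - u)

print(f"{len(spike_times)} spikes in 1 s of simulated time")
```

Swapping in the other (a, b, c, d) quadruples from the paper (fast-spiking, chattering, bursting, etc.) reproduces the firing-pattern zoo discussed in the "Which Model to Use" survey.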

Latest revision as of 02:09, 11 December 2014

==== Aug 28: Introduction ====

Optional:

==== Sept 2: Neuron models ====

Background reading on dynamics, linear time-invariant systems and convolution, and differential equations:

==== Sept 4: Linear neuron, Perceptron ====

Background on linear algebra:
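For the perceptron half of this lecture, the classic error-driven learning rule fits in a few lines. A sketch on synthetic, linearly separable data (the data, margin threshold, and epoch budget are all made up):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic, linearly separable 2-D data: labels come from a hidden
# linear rule; points too close to the boundary are dropped so the
# perceptron convergence theorem applies with a comfortable margin.
X = rng.normal(size=(200, 2))
w_true, b_true = np.array([1.5, -1.0]), 0.3
s = X @ w_true + b_true
X, y = X[np.abs(s) > 0.2], np.sign(s[np.abs(s) > 0.2])

# Perceptron learning rule: on each mistake, w <- w + y x, b <- b + y
w, b = np.zeros(2), 0.0
for epoch in range(500):
    mistakes = 0
    for xi, yi in zip(X, y):
        if yi * (w @ xi + b) <= 0:   # misclassified (or on the boundary)
            w += yi * xi
            b += yi
            mistakes += 1
    if mistakes == 0:                # converged: all points correct
        break

acc = np.mean(np.sign(w @ X.T + b) == y)
print(f"training accuracy {acc:.2f}")
```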

==== Sept 11: Multicompartment models, dendritic integration (Rhodes guest lecture) ====

==== Sept 16, 18: Supervised learning ====

* HKP Chapters 5, 6
* Handout on supervised learning in single-stage feedforward networks
* Handout on supervised learning in multi-layer feedforward networks - "back propagation"

Further reading:

==== Sept 23, 24: Unsupervised learning ====

* HKP Chapters 8 and 9, DJCM chapter 36, DA chapters 8 and 10
* Handout: Hebbian learning and PCA
* PDP Chapter 9 (full text of Michael Jordan's tutorial on linear algebra, including a section on eigenvectors)
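The central result in the Hebbian learning and PCA handout — that a Hebbian neuron with Oja-style weight decay extracts the first principal component — can be checked numerically. A sketch with a made-up covariance matrix and learning rate:

```python
import numpy as np

rng = np.random.default_rng(0)

# Oja's rule: dw = lr * y * (x - y * w), a Hebbian term y*x plus a decay
# that keeps ||w|| bounded. The weight vector converges to the leading
# principal component of the input covariance.
C = np.array([[3.0, 1.0], [1.0, 1.0]])      # made-up covariance matrix
X = rng.multivariate_normal(np.zeros(2), C, size=5000)

w = rng.normal(size=2)
lr = 0.01
for x in X:
    y = w @ x                               # linear neuron output
    w += lr * y * (x - y * w)               # Hebb + Oja's decay

# Compare against the top eigenvector of C
eigvals, eigvecs = np.linalg.eigh(C)
pc1 = eigvecs[:, -1]
cos = abs((w / np.linalg.norm(w)) @ pc1)
print(f"|cos(angle to PC1)| = {cos:.3f}")
```

Without the decay term, plain Hebbian learning would still align with PC1 but the weight norm would grow without bound; Oja's subtraction is what stabilizes it.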

Optional:

==== Sept 30, Oct 2: Attractor Networks and Associative Memories (Sommer guest lectures) ====

* '''HKP''' Chapters 2 and 3 (sec. 3.3-3.5), 7 (sec. 7.2-7.3), '''DJCM''' chapter 42, '''DA''' chapter 7
* Handout on attractor networks - their learning, dynamics, and how they differ from feed-forward networks
* Hopfield82
* Hopfield84
* Willshaw69
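The Hopfield (1982) associative memory is compact enough to demo: Hebbian outer-product storage plus asynchronous threshold updates that descend an energy function. A sketch (the network size, pattern count, and corruption level are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hopfield (1982): store binary patterns with a Hebbian outer-product
# rule; asynchronous threshold updates pull noisy states back toward
# the nearest stored pattern (an attractor of the dynamics).
N, P = 100, 5                          # units / patterns (arbitrary)
patterns = rng.choice([-1, 1], size=(P, N))

W = patterns.T @ patterns / N          # Hebbian weight matrix
np.fill_diagonal(W, 0)                 # no self-connections

def recall(state, sweeps=5):
    s = state.copy()
    for _ in range(sweeps):            # asynchronous updates, random order
        for i in rng.permutation(N):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Corrupt a stored pattern, then let the dynamics clean it up
probe = patterns[0].copy()
flip = rng.choice(N, size=15, replace=False)
probe[flip] *= -1                      # flip 15 of the 100 bits

overlap = recall(probe) @ patterns[0] / N
print(f"overlap with the stored pattern: {overlap:.2f}")
```

With 5 patterns in 100 units the load is far below the classic ~0.14N capacity limit, so recall from 15% corruption should be essentially perfect.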

==== Oct 7: Ecological utility and the mythical neural code (Feldman guest lecture) ====

* Feldman10, Ecological utility and the mythical neural code

==== Oct 9: Hyperdimensional computing (Kanerva guest lecture) ====

==== Oct 16: Structural and Functional Connectomics (Tom Dean guest lecture) ====

==== 21,23,28 Oct ====

Additional readings:

==== 30 Oct, 4 Nov ====

Optional:

Re-organization in response to cortical lesions:

==== 6 Nov ====

Additional reading:

==== 6, 13 Nov ====