VS265: Reading

==== Aug 28:  Introduction ====
Optional:
* Zhang K, Sejnowski TJ (2000)  [http://redwood.berkeley.edu/vs265/zhang-sejnowski.pdf A universal scaling law between gray matter and white matter of cerebral cortex.]  PNAS, 97: 5621–5626.
* O'Rourke NA, et al. (2012) [http://redwood.berkeley.edu/vs265/smith-synaptic-diversity.pdf Deep molecular diversity of mammalian synapses: why it matters and how to measure it.] Nature Reviews Neurosci. 13.
* Stephen Smith [http://smithlab.stanford.edu/Smithlab/AT_Movies.html Array Tomography movies]
* Solari & Stoner, [http://redwood.berkeley.edu/vs265/solari-stoner-cognitive-consilience.pdf Cognitive Consilience]


==== Sept 2:  Neuron models ====
Background reading on dynamics, linear time-invariant systems and convolution, and differential equations:


==== Sept 4:  Linear neuron, Perceptron ====
* '''HKP''' chapter 5, '''DJCM''' chapters 38-40, 44, '''DA''' chapter 8 (sec. 4-6)
* [http://redwood.berkeley.edu/vs265/linear-neuron/linear-neuron-models.html Linear neuron models]
* [http://redwood.berkeley.edu/vs265/superlearn_handout1.pdf Handout] on supervised learning in single-stage feedforward networks
Background on linear algebra:
* [http://redwood.berkeley.edu/vs265/linear-algebra/linear-algebra.html Linear algebra primer]
* Jordan, M.I. [http://redwood.berkeley.edu/vs265/PDP.pdf An Introduction to Linear Algebra in Parallel Distributed Processing] in McClelland and Rumelhart, ''Parallel Distributed Processing'', MIT Press, 1985.
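For those who want to experiment with the material above, here is a minimal NumPy sketch of a single linear threshold unit trained with the perceptron rule on a toy two-class problem (the data, learning rate, and variable names are illustrative choices, not taken from the handouts):
<pre>
import numpy as np

# Toy data: two Gaussian clusters, labels in {-1, +1}
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(+1.0, 0.5, size=(50, 2)),
               rng.normal(-1.0, 0.5, size=(50, 2))])
y = np.hstack([np.ones(50), -np.ones(50)])

w = np.zeros(2)          # weights
b = 0.0                  # bias
eta = 0.1                # learning rate

for epoch in range(20):
    for xi, yi in zip(X, y):
        if yi * (w @ xi + b) <= 0:      # misclassified (or on the boundary)
            w += eta * yi * xi          # perceptron update: move hyperplane toward xi
            b += eta * yi

errors = np.sum(np.sign(X @ w + b) != y)
print("training errors:", errors)
</pre>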


==== Sept 11:  Multicompartment models, dendritic integration (Rhodes guest lecture) ====
* Koch, Single Neuron Computation, Chapter 19 [https://www.dropbox.com/s/rb24w33fqar7gjp/koch_ch19_small.pdf?dl=0 pdf]
* Rhodes P (1999) [http://redwood.berkeley.edu/vs265/Rhodes-review.pdf Functional Implications of Active Currents in the Dendrites of Pyramidal Neurons]
* Schiller J (2003) [http://redwood.berkeley.edu/vs265/Schiller-spikes-dendrites.pdf  Submillisecond Precision of the Input–Output Transformation Function Mediated by Fast Sodium Dendritic Spikes in Basal Dendrites of CA1 Pyramidal Neurons]
 
==== Sept. 16, 18: Supervised learning ====
* '''HKP''' Chapters 5, 6
* [http://redwood.berkeley.edu/vs265/superlearn_handout1.pdf Handout] on supervised learning in single-stage feedforward networks
* [http://redwood.berkeley.edu/vs265/superlearn_handout2.pdf Handout] on supervised learning in multi-layer feedforward networks - "back propagation"
Further reading:
* Y. LeCun, L. Bottou, G. Orr, and K. Muller (1998) [http://redwood.berkeley.edu/vs265/lecun-98b.pdf  "Efficient BackProp,"] in Neural Networks: Tricks of the Trade (G. Orr and K. Muller, eds.).
* [http://cnl.salk.edu/Research/ParallelNetsPronounce/ NetTalk demo]
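As a toy companion to the backpropagation handout above, here is a minimal NumPy sketch of gradient descent in a one-hidden-layer network on XOR; the tanh hidden layer, squared-error loss, learning rate, and network size are our own illustrative choices, not taken from the handout:
<pre>
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[0.], [1.], [1.], [0.]])        # XOR targets

W1 = rng.normal(0, 0.5, (2, 4)); b1 = np.zeros(4)   # input -> hidden
W2 = rng.normal(0, 0.5, (4, 1)); b2 = np.zeros(1)   # hidden -> output
eta = 0.5

for step in range(10000):
    # forward pass
    H = np.tanh(X @ W1 + b1)                  # hidden activations
    Y = 1.0 / (1.0 + np.exp(-(H @ W2 + b2)))  # sigmoid output
    # backward pass for squared-error loss E = 0.5 * sum (Y - T)^2
    dY = (Y - T) * Y * (1 - Y)                # delta at output layer
    dH = (dY @ W2.T) * (1 - H**2)             # delta backpropagated to hidden layer
    # gradient descent updates
    W2 -= eta * H.T @ dY;  b2 -= eta * dY.sum(0)
    W1 -= eta * X.T @ dH;  b1 -= eta * dH.sum(0)

print(np.round(Y.ravel(), 2))   # should approach [0, 1, 1, 0]
</pre>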
 
==== Sept. 23, 24: Unsupervised learning ====
* '''HKP''' Chapters 8 and 9, '''DJCM''' chapter 36, '''DA''' chapter 8, 10
* Handout: [http://redwood.berkeley.edu/vs265/hebb-pca-handout.pdf Hebbian learning and PCA]
* '''PDP''' [http://redwood.berkeley.edu/vs265/chap9.pdf Chapter 9] (full text of Michael Jordan's tutorial on linear algebra, including section on eigenvectors)
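To see the Hebbian-learning/PCA connection numerically, here is a small sketch (our own toy example, not from the handout) of Oja's rule extracting the first principal component of correlated 2-D data; the learned weight vector should match the leading eigenvector of the data covariance up to sign:
<pre>
import numpy as np

rng = np.random.default_rng(0)
# Correlated, zero-mean 2-D data
C = np.array([[3.0, 1.5], [1.5, 1.0]])
X = rng.multivariate_normal([0, 0], C, size=5000)

w = rng.normal(size=2)
eta = 0.01
for x in X:
    y = w @ x                      # linear neuron output
    w += eta * y * (x - y * w)     # Oja's rule: Hebbian term plus decay keeps |w| near 1

# Compare with the leading eigenvector of the covariance matrix (agreement up to sign)
evals, evecs = np.linalg.eigh(np.cov(X.T))
print("Oja  w :", w / np.linalg.norm(w))
print("PCA pc1:", evecs[:, -1])
</pre>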
 
Optional:
* Atick, Redlich. [http://redwood.berkeley.edu/vs265/Atick-Redlich-NC92.pdf What does the retina know about natural scenes?], Neural Computation, 1992.
* Dan, Atick, Reid. [http://www.jneurosci.org/cgi/reprint/16/10/3351.pdf Efficient Coding of Natural Scenes in the Lateral Geniculate Nucleus: Experimental Test of a Computational Theory], J Neuroscience, 1996.
 
==== Sept 30, Oct 2: Attractor Networks and Associative Memories (Sommer guest lectures) ====
* "HKP" Chapter 2 and 3 (sec. 3.3-3.5), 7 (sec. 7.2-7.3), '''DJCM''' chapter 42, '''DA''' chapter 7
* [http://redwood.berkeley.edu/vs265/attractor-networks.pdf Handout] on attractor networks - their learning, dynamics and how they differ from feed-forward networks
* [http://redwood.berkeley.edu/vs265/hopfield82.pdf Hopfield82]
* [http://redwood.berkeley.edu/vs265/hopfield84.pdf Hopfield84]
* [https://www.dropbox.com/s/mh40nnj5d53o183/222960a0.pdf?dl=0 Willshaw69]
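A minimal sketch of the binary Hopfield (1982) model described in the readings above: Hebbian outer-product storage of random patterns and asynchronous updates that recover a stored pattern from a corrupted cue. The pattern size and count below are arbitrary choices:
<pre>
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 5
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian (outer-product) storage rule, no self-connections
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0)

def recall(s, steps=10):
    """Asynchronous updates from initial state s."""
    s = s.copy()
    for _ in range(steps):
        for i in rng.permutation(N):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Corrupt a stored pattern by flipping 10% of its bits, then let the network clean it up
probe = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
probe[flip] *= -1
print("overlap after recall:", (recall(probe) @ patterns[0]) / N)   # should be close to 1
</pre>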
 
==== Oct 7: Ecological utility  and the mythical neural code (Feldman guest lecture) ====
* [ftp://ftp.icsi.berkeley.edu/pub/feldman/eeu.pdf Feldman10] Ecological utility and the mythical neural code
 
==== Oct 9: Hyperdimensional computing (Kanerva guest lecture) ====
* Kanerva, [http://redwood.berkeley.edu/vs265/Kanerva-allerton2014.pdf Computing with 10,000-bit words]
* Kanerva, [http://redwood.berkeley.edu/vs265/kanerva09-hyperdimensional.pdf Hyperdimensional Computing]
 
==== Oct 16: Structural and Functional Connectomics (Tom Dean guest lecture) ====
 
* [http://cs.brown.edu/people/tld/note/blog/14/10/16/index.html Lecture notes and slides]
 
==== 21,23,28 Oct ====
* Barlow, HB. [http://redwood.berkeley.edu/vs265/barlow1972.pdf Single units and sensation: A neuron doctrine for perceptual psychology?]  Perception, volume 1, pp. 371-394 (1972)
* Foldiak, P. [http://redwood.berkeley.edu/vs265/foldiak90.pdf Forming sparse representations by local anti-Hebbian learning]. Biol. Cybern. 64, 165-170 (1990).
* Olshausen BA, Field DJ. [http://redwood.berkeley.edu/vs265/bruno-nature.pdf Emergence of simple-cell receptive field properties by learning a sparse code for natural images], Nature, 381: 607-609. (1996)
 
Additional readings:
* Rozell, Johnson, Baraniuk, Olshausen. [http://redwood.berkeley.edu/vs265/rozell-sparse-coding-nc08.pdf Sparse Coding via Thresholding and Local Competition in Neural Circuits], Neural Computation 20, 2526–2563 (2008).
* Zylberberg, Murphy, DeWeese, [http://redwood.berkeley.edu/vs265/zylberberg-sparse-coding.pdf A sparse coding model with synaptically local plasticity and spiking neurons can account for the diverse shapes of V1 simple cell receptive fields], PLOS Computational Biology, 7 (2011).  (sparse coding with spiking neurons)
* Olshausen [http://redwood.berkeley.edu/bruno/papers/icip03.pdf Sparse coding of time-varying natural images], ICIP 2003. (convolution sparse coding of video)
* Smith E, Lewicki MS. [http://redwood.berkeley.edu/vs265/smith-lewicki-nature06.pdf Efficient auditory coding], Nature Vol 439 (2006).  (convolution sparse coding of sound)
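The papers above infer sparse codes with learned dictionaries and dynamics such as the LCA circuit of Rozell et al.; as a simplified stand-in, here is a sketch of sparse inference by iterative soft thresholding (ISTA) with a fixed random dictionary. The dimensions and sparsity penalty are arbitrary choices:
<pre>
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 64, 128, 5                 # signal dim, dictionary size, true sparsity
Phi = rng.normal(size=(n, m))
Phi /= np.linalg.norm(Phi, axis=0)   # unit-norm dictionary elements ("basis functions")

# Synthesize a signal from k active coefficients
a_true = np.zeros(m)
a_true[rng.choice(m, k, replace=False)] = rng.normal(0, 1, k)
x = Phi @ a_true

# ISTA: minimize 0.5*||x - Phi a||^2 + lam*||a||_1
lam = 0.05
L = np.linalg.norm(Phi, 2) ** 2      # Lipschitz constant of the quadratic term
a = np.zeros(m)
for _ in range(500):
    grad = Phi.T @ (Phi @ a - x)
    z = a - grad / L
    a = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0)   # soft threshold

print("nonzeros recovered:", np.sum(np.abs(a) > 1e-3))
print("reconstruction error:", np.linalg.norm(x - Phi @ a))
</pre>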
 
==== 30 Oct, 4 Nov ====
* '''HKP''' chapter 9, '''DA''' chapter 8
* [http://redwood.berkeley.edu/vs265/miller89.pdf Ocular dominance column development: Analysis and simulation] by Miller, Keller and Stryker.
* [http://redwood.berkeley.edu/vs265/durbin-mitchison.pdf A dimension reduction framework for understanding cortical maps] by R. Durbin and G. Mitchison.
* [http://redwood.berkeley.edu/vs265/horton05.pdf The cortical column: a structure without a function] by Jonathan C. Horton and Daniel L. Adams
 
Optional:
* Gary Blasdel, [http://redwood.berkeley.edu/vs265/blasdel1992.pdf Orientation selectivity, preference, and continuity in monkey striate cortex.], J Neurosci, 1992.  Many more nice images can be found in the galleries on Amiram Grinvald's site: [http://www.weizmann.ac.il/brain/grinvald/]
* From Clay Reid's lab, [http://www.nature.com/nature/journal/v433/n7026/abs/nature03274.html Functional imaging with cellular resolution reveals precise micro-architecture in visual cortex]. Make sure you look at the supplementary material and videos on their web site (seems partly broken) [http://reid.med.harvard.edu/movies.html].
 
Re-organization in response to cortical lesions:
* Gilbert & Wiesel (1992), [http://www.nature.com/nature/journal/v356/n6365/pdf/356150a0.pdf Receptive Field Dynamics in Adult Primary Visual Cortex]
* Pettet & Gilbert (1992), [http://www.pnas.org/content/89/17/8366.full.pdf+html Dynamic changes in receptive-field size in cat primary visual cortex]
 
==== 6 Nov ====
 
* [http://redwood.berkeley.edu/vs265/tenenbaum-manifold.pdf A Global Geometric Framework for Nonlinear Dimensionality Reduction ], Tenenbaum et al., Science 2000.
* [http://redwood.berkeley.edu/vs265/roweis-saul-manifold.pdf Nonlinear Dimensionality Reduction by Locally Linear Embedding], Roweis and Saul, Science 2000.
* [http://redwood.berkeley.edu/vs265/carlsson-ijcv08.pdf On the Local Behavior of Spaces of Natural Images], Carlsson et al., Int J Comput Vis (2008) 76: 1–12.
 
Additional reading:
 
* [http://redwood.berkeley.edu/vs265/webster-face-adaptation.pdf Adaptation to natural facial categories], Michael A. Webster, Daniel Kaping, Yoko Mizokami & Paul Duhamel, Nature, 2004.
* [http://redwood.berkeley.edu/vs265/leopold.pdf Prototype-referenced shape encoding revealed by high-level aftereffects], David A. Leopold, Alice J. O’Toole, Thomas Vetter and Volker Blanz, Nature, 2001.
* [http://redwood.berkeley.edu/vs265/Blanz-siggraph-99.pdf A Morphable Model For The Synthesis Of 3D Faces], Blanz & Vetter 1999.
* [http://mbthompson.com/research/ Matthew B. Thompson's web page on the flashed face distortion effect]
 
==== 6, 13 Nov ====
 
* [http://redwood.berkeley.edu/vs265/marr-poggio-science76.pdf Marr-Poggio stereo algorithm paper]
* [http://redwood.berkeley.edu/vs265/zhang96.pdf Kechen Zhang paper on bump circuits]
* [http://redwood.berkeley.edu/vs265/olshausen-etal93.pdf Olshausen, Anderson & Van Essen, dynamic routing circuit model]
 
==== 13,18 Nov ====
 
* '''HKP''' chapter 7, sections 7.2, 7.3
* [http://redwood.berkeley.edu/vs265/jaeger04-ESN.pdf Jaeger, echo state networks]
* [http://www.cs.toronto.edu/~graves/phd.pdf Alex Graves' thesis], see Chapter 4
* [http://redwood.berkeley.edu/vs265/sussillo-dynamical-systems-curropin.pdf Sussillo, dynamical systems in neuroscience]
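The echo state network recipe from Jaeger's paper above, in sketch form: a fixed random reservoir scaled to have spectral radius below one, and a linear readout fit by least squares. The one-step sine-wave prediction task, reservoir size, and scaling constants below are our own toy choices:
<pre>
import numpy as np

rng = np.random.default_rng(0)
N = 200                              # reservoir size
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius below 1 ("echo state" condition)
W_in = rng.uniform(-0.5, 0.5, size=N)

# Task: predict u(t+1) from u(t) for a sine wave
T = 1000
u = np.sin(0.2 * np.arange(T + 1))

X = np.zeros((T, N))                 # reservoir states
x = np.zeros(N)
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])
    X[t] = x

washout = 100                        # discard the initial transient
A, y = X[washout:], u[washout + 1: T + 1]
w_out = np.linalg.lstsq(A, y, rcond=None)[0]     # train the linear readout only

pred = A @ w_out
print("readout MSE:", np.mean((pred - y) ** 2))
</pre>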
 
==== 20,25 Nov ====
 
* '''HKP''' chapter 7 (sec. 7.1), '''DJCM''' chapters 1-3, 20-24, 41, 43, '''DA''' chapter 10
* [https://www.dropbox.com/s/naezga6niejfw1n/chapter_preprint.pdf?dl=0 Olshausen (2014) Perception as an Inference Problem]
* [http://redwood.berkeley.edu/vs265/probability.pdf A probability primer]
* [http://redwood.berkeley.edu/vs265/bayes-prob.pdf Bayesian probability theory and generative models]
* [http://redwood.berkeley.edu/vs265/mog.pdf Mixture of Gaussians model ]
* T.J. Loredo, [http://redwood.berkeley.edu/vs265/loredo-laplace-supernova.pdf From Laplace to supernova SN1987A:  Bayesian inference in astrophysics]
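As a numerical companion to the mixture-of-Gaussians handout above, here is a minimal EM sketch for a two-component 1-D mixture; the toy data, initialization, and iteration count are illustrative choices:
<pre>
import numpy as np

rng = np.random.default_rng(0)
# Toy data drawn from two Gaussians
x = np.hstack([rng.normal(-2.0, 0.5, 300), rng.normal(1.5, 1.0, 700)])

# Initialize the parameters of a 2-component mixture
pi = np.array([0.5, 0.5])            # mixing proportions
mu = np.array([-1.0, 1.0])
var = np.array([1.0, 1.0])

for _ in range(100):
    # E-step: posterior responsibility of each component for each data point
    lik = np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    r = pi * lik
    r /= r.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from the responsibilities
    Nk = r.sum(axis=0)
    pi = Nk / len(x)
    mu = (r * x[:, None]).sum(axis=0) / Nk
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / Nk

print("pi:", np.round(pi, 2), "mu:", np.round(mu, 2), "var:", np.round(var, 2))
</pre>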
 
==== 2 Dec ====
 
* Crick and Mitchison theory on 'unlearning' during sleep - [http://redwood.berkeley.edu/vs265/crick-mitchison-sleep.pdf paper]
* Application of Boltzmann machines to neural data analysis:
** E. Schneidman, M.J. Berry, R. Segev and W. Bialek, [http://www.nature.com/nature/journal/v440/n7087/full/nature04701.html Weak pairwise correlations imply strongly correlated network states in a neural population], Nature 440 (7087) (2006), pp. 1007-1012.
** J. Shlens, G.D. Field, J.L. Gauthier, M.I. Grivich, D. Petrusca, A. Sher, A.M. Litke and E.J. Chichilnisky, [http://www.jneurosci.org/cgi/content/abstract/26/32/8254 The structure of multi-neuron firing patterns in primate retina], J Neurosci 26 (32) (2006), pp. 8254-8266.
** U. Koster, J. Sohl-Dickstein, C.M. Gray, B.A. Olshausen, [http://www.ploscompbiol.org/article/info%3Adoi%2F10.1371%2Fjournal.pcbi.1003684 Modeling higher-order correlations within Cortical Microcolumns], PLOS Computational Biology, July 2014.
 
==== 4 Dec ====


* [http://redwood.berkeley.edu/vs265/info-theory.pdf Information theory primer]
* [http://redwood.berkeley.edu/vs265/handout-sparse-08.pdf Sparse coding and ICA handout]
* Jascha Sohl-Dickstein, [http://arxiv.org/abs/1205.1828 The Natural Gradient by Analogy to Signal Whitening, and Recipes and Tricks for its Use]
* Jascha Sohl-Dickstein, [http://redwood.berkeley.edu/vs265/jascha-cookbook.pdf Natural gradient cookbook]
* Bell & Sejnowski, [http://redwood.berkeley.edu/vs265/tony-ica.pdf An Information-Maximization Approach to Blind Separation and Blind Deconvolution], Neural Comp, 1995.
* Simoncelli, Olshausen. [http://redwood.berkeley.edu/vs265/simoncelli01-reprint.pdf Natural Image Statistics and Neural Representation], Annu. Rev. Neurosci. 2001. 24:1193–216.
* van Hateren & Ruderman [http://redwood.berkeley.edu/vs265/vanhateren-ruderman98.pdf Independent component analysis of natural image sequences], Proc. R. Soc. Lond. B (1998) 265. (blocked sparse coding/ICA of video)
* Hyvarinen, Hoyer, Inki, [http://redwood.berkeley.edu/vs265/TICA.pdf Topographic Independent Component Analysis], Neural Comp, 2001.
* Karklin & Lewicki paper on  [http://redwood.berkeley.edu/vs265/karklin-lewicki2003.pdf Learning Higher-Order Structure in Natural Images], Network 2003.
* Shao & Cottrell paper on [http://redwood.berkeley.edu/vs265/hshan-nips06.pdf Recursive ICA], NIPS 2006.


==== 9 Dec ====


Kalman filter:
* Robbie Jacobs' [http://www.bcs.rochester.edu/people/robbie/jacobslab/cheat_sheet/sensoryIntegration.pdf notes on Kalman filter]
* [http://redwood.berkeley.edu/vs265/kalman.m kalman.m] demo script
* Greg Welch's [http://www.cs.unc.edu/~welch/kalman/kalmanIntro.html  tutorial on Kalman filter]
* [http://vision.ucla.edu/~doretto/projects/dynamic-textures.html Dynamic texture models]
* Kevin Murphy's [http://redwood.berkeley.edu/vs265/murphy-hmm.pdf  HMM tutorial]
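For quick experimentation alongside the tutorials above (this is not the kalman.m demo script), here is a minimal 1-D Kalman filter sketch for a random-walk state observed in noise; the noise variances are arbitrary choices:
<pre>
import numpy as np

rng = np.random.default_rng(0)
T = 100
q, r = 0.01, 0.5                 # process and measurement noise variances

# Simulate a random-walk state and noisy observations of it
x_true = np.cumsum(rng.normal(0, np.sqrt(q), T))
z = x_true + rng.normal(0, np.sqrt(r), T)

x_hat, P = 0.0, 1.0              # initial state estimate and its variance
estimates = []
for zt in z:
    # predict (random-walk dynamics: x_t = x_{t-1} + process noise)
    P = P + q
    # update with measurement zt
    K = P / (P + r)              # Kalman gain
    x_hat = x_hat + K * (zt - x_hat)
    P = (1 - K) * P
    estimates.append(x_hat)

print("mean squared error:", np.mean((np.array(estimates) - x_true) ** 2))
</pre>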


Spiking neurons:
* '''DA''' chapters 1-4, 5.4
* Karklin & Simoncelli, [http://redwood.berkeley.edu/vs265/karklin-simoncelli.pdf Efficient coding of natural images with a population of noisy Linear-Nonlinear neurons], NIPS 2011.
* Chris Eliasmith, Charlie Anderson, [http://books.google.com/books?id=J6jz9s4kbfIC Neural Engineering:  Computation, Representation, and Dynamics in Neurobiological Systems], MIT Press, 2004.  (Chapter 4 will be emailed to the class.)
* Softky and Koch, [http://redwood.berkeley.edu/vs265/softky-koch-jn93.pdf The Highly Irregular Firing of Cortical Cells Is Inconsistent with Temporal Integration of Random EPSPs], J Neuroscience, January 1993, 13(1):334-350.
* Mainen and Sejnowski, [http://redwood.berkeley.edu/vs265/mainen-sejnowski.pdf Reliability of Spike Timing in Neocortical Neurons], Science, Vol 268, 6 June 1995.
* Shadlen and Newsome, [http://redwood.berkeley.edu/vs265/shadlen-newsome1.pdf Noise, neural codes and cortical organization], Curr Opin in Neur, 1994, 4:569-579.
* Shadlen and Newsome, [http://redwood.berkeley.edu/vs265/shadlen-newsome1.pdf Is there a signal in the noise?], Current Opin in Neur, 1995, 5:248-250.
* Softky, [http://redwood.berkeley.edu/vs265/softky-commentary.pdf Simple codes versus efficient codes], Current Opin in Neuro, 1995, 5:239-247.
* Izhikevich, [http://redwood.berkeley.edu/vs265/izhikevich-nn03.pdf Simple model of spiking neurons], IEEE Trans Neur Networks, 14(6):2003.
* Izhikevich, [http://redwood.berkeley.edu/vs265/izhikevich-which-nn04.pdf Which Model to Use for Cortical Spiking Neurons?], IEEE Trans Neur Networks, 15(5):2004.
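The Izhikevich (2003) model from the readings above can be simulated in a few lines; the sketch below uses the regular-spiking parameter set from that paper, while the time step and injected current are arbitrary choices:
<pre>
import numpy as np

# Izhikevich (2003) model, regular-spiking cortical cell parameters
a, b, c, d = 0.02, 0.2, -65.0, 8.0
dt = 0.25                        # time step in ms (our choice)
T = int(1000 / dt)               # simulate 1 s
I = 10.0                         # constant injected current (arbitrary choice)

v, u = c, b * c
spikes = []
for t in range(T):
    # membrane potential and recovery variable dynamics (forward Euler)
    v += dt * (0.04 * v**2 + 5 * v + 140 - u + I)
    u += dt * a * (b * v - u)
    if v >= 30.0:                # spike: reset v and bump the recovery variable
        spikes.append(t * dt)
        v, u = c, u + d

print("number of spikes in 1 s:", len(spikes))
</pre>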
