VS298: Reading
Revision as of 07:12, 11 December 2008
- Bell, A.J. Levels and loops: the future of artificial intelligence and neuroscience. Phil Trans: Bio Sci. 354:2013--2020 (1999).
- Dreyfus, H.L. and Dreyfus, S.E. Making a Mind vs. Modeling the Brain: Artificial Intelligence Back at a Branchpoint. Daedalus, Winter 1988.
- Mead, C. Chapter 1: Introduction and Chapter 4: Neurons from Analog VLSI and Neural Systems, Addison-Wesley, 1989.
- Jordan, M.I. An Introduction to Linear Algebra in Parallel Distributed Processing in McClelland and Rumelhart, Parallel Distributed Processing, MIT Press, 1985.
- Zhang K, Sejnowski TJ (2000) A universal scaling law between gray matter and white matter of cerebral cortex. PNAS, 97: 5621–5626.
- Land, MF and Fernald, RD. The Evolution of Eyes, Ann Revs Neuro, 1992.
- Douglas, R and Martin, K. Recurrent neuronal circuits in the neocortex, Current Biology, 2007.
- Linear neuron models
- Linear time-invariant systems and convolution
- Simulating differential equations
- Carandini M, Heeger D (1994) Summation and division by neurons in primate visual cortex. Science, 264: 1333-1336.
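The linear-neuron and LTI-systems material above can be sketched in a few lines. Below is a minimal forward-Euler simulation of a leaky integrator, tau dV/dt = -V + I(t), whose impulse response is an exponential; the time constant, step size, and input are illustrative assumptions, not values from the handouts.

```python
# Leaky-integrator ("linear neuron") membrane equation:
#   tau * dV/dt = -V + I(t)
# integrated with forward Euler. Because the system is linear and
# time-invariant, its response to any input is the convolution of the
# input with an exponential impulse response.

def simulate(I, tau=20.0, dt=0.1, V0=0.0):
    """Forward-Euler integration; I is a list of input samples (one per dt)."""
    V, trace = V0, []
    for It in I:
        V += dt / tau * (-V + It)   # Euler step
        trace.append(V)
    return trace

# Step input: V relaxes monotonically toward the steady state V = I.
n = 5000                            # 500 ms of simulated time
step = simulate([1.0] * n)
```

With these parameters the trace climbs from 0 toward 1 with time constant tau, matching the analytic solution 1 - exp(-t/tau).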
Optional reading for more background:
- Handout on supervised learning in single-stage feedforward networks
- Handout on supervised learning in multi-layer feedforward networks - "backpropagation"
- Y. LeCun, L. Bottou, G. Orr, and K. Muller (1998) "Efficient BackProp," in Neural Networks: Tricks of the trade, (G. Orr and K. Muller, eds.).
- NetTalk demo
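As a companion to the backpropagation handouts, here is a sketch of the standard sanity test for a backprop implementation: comparing the analytic gradient to a finite-difference estimate. The tiny 2-3-1 sigmoid network, its inputs, and its target are arbitrary illustrative choices.

```python
import math, random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(w1, w2, x):
    """w1: hidden weights (one row per hidden unit), w2: output weights."""
    h = [sigmoid(sum(w1[j][i] * x[i] for i in range(len(x))))
         for j in range(len(w1))]
    y = sigmoid(sum(w2[j] * h[j] for j in range(len(h))))
    return h, y

def loss(w1, w2, x, t):
    _, y = forward(w1, w2, x)
    return 0.5 * (y - t) ** 2

def backprop(w1, w2, x, t):
    """Analytic gradients of the squared error w.r.t. both weight layers."""
    h, y = forward(w1, w2, x)
    delta_y = (y - t) * y * (1 - y)                 # output error signal
    g2 = [delta_y * h[j] for j in range(len(h))]
    g1 = [[delta_y * w2[j] * h[j] * (1 - h[j]) * x[i]
           for i in range(len(x))] for j in range(len(w1))]
    return g1, g2

x, t = [0.5, -0.3], 1.0
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(3)]
w2 = [random.uniform(-1, 1) for _ in range(3)]
g1, g2 = backprop(w1, w2, x, t)

# Central-difference estimates of two of the same derivatives.
eps = 1e-6
w2p = list(w2); w2p[0] += eps
w2m = list(w2); w2m[0] -= eps
numeric2 = (loss(w1, w2p, x, t) - loss(w1, w2m, x, t)) / (2 * eps)

w1p = [r[:] for r in w1]; w1p[1][0] += eps
w1m = [r[:] for r in w1]; w1m[1][0] -= eps
numeric1 = (loss(w1p, w2, x, t) - loss(w1m, w2, x, t)) / (2 * eps)
```

The analytic and numeric derivatives should agree to several decimal places; a mismatch usually means a sign or indexing bug in the backward pass.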
- Handout: Hebbian learning and PCA
- HKP Chapter 8
- PDP Chapter 9 (full text of Michael Jordan's tutorial on linear algebra, including section on eigenvectors)
- HKP Chapter 9
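The central result of the Hebbian-learning/PCA handout, that a Hebbian rule with an implicit normalization term extracts the principal component of the inputs, can be sketched with Oja's rule on synthetic 2-D data. The data direction, noise level, and learning rate below are illustrative assumptions.

```python
import math, random

random.seed(1)

# Oja's rule:  dw = eta * y * (x - y * w),  with  y = w . x.
# The weight vector converges to the unit-norm principal eigenvector of
# the input covariance. Data are drawn mostly along a chosen direction
# u, so u is (approximately) that eigenvector.

theta = 0.6
u = (math.cos(theta), math.sin(theta))

def sample():
    # large variance along u, small isotropic noise
    a = random.gauss(0, 2.0)
    return (a * u[0] + random.gauss(0, 0.2),
            a * u[1] + random.gauss(0, 0.2))

w, eta = [0.3, -0.1], 0.01
for _ in range(20000):
    x = sample()
    y = w[0] * x[0] + w[1] * x[1]
    w[0] += eta * y * (x[0] - y * w[0])
    w[1] += eta * y * (x[1] - y * w[1])

norm = math.hypot(w[0], w[1])
align = abs(w[0] * u[0] + w[1] * u[1]) / norm   # cosine with true direction
```

After training, w has (approximately) unit length and points along u, up to sign, which is exactly the first principal component here.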
- Atick, Redlich. What does the retina know about natural scenes?, Neural Computation, 1992.
- Dan, Atick, Reid. Efficient Coding of Natural Scenes in the Lateral Geniculate Nucleus: Experimental Test of a Computational Theory, J Neuroscience, 1996.
- Foldiak, P. Forming sparse representations by local anti-Hebbian learning. Biol. Cybern. 64, 165-170 (1990).
- Olshausen BA, Field DJ. Emergence of simple-cell receptive field properties by learning a sparse code for natural images, Nature, 381: 607-609. (1996)
Optional readings that cover the lecture material in greater depth:
- Rozell, Johnson, Baraniuk, Olshausen. Sparse Coding via Thresholding and Local Competition in Neural Circuits, Neural Computation 20, 2526–2563 (2008).
- Simoncelli, Olshausen. Natural Image Statistics and Neural Representation, Annu. Rev. Neurosci. 2001. 24:1193–216.
- Smith, Lewicki. Efficient auditory coding, Nature Vol 439 (2006).
Dayan and Abbott have a nice section on sparse coding in Chapter 10. This is on the syllabus for unsupervised learning already, but you may want to focus on sections 10.3 and 10.4.
Here is a link to Compressive Sensing Resources at Rice. It has an enormous number of recent papers related to compressed sensing and sparse coding.
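Sparse coding inference of the kind in these readings can be sketched with iterative soft-thresholding (ISTA), which minimizes 0.5*||x - Da||^2 + lam*||a||_1 for a fixed dictionary D; its thresholding dynamics are closely related to the LCA circuit of Rozell et al. The dictionary, signal, and parameters below are illustrative assumptions, not anything from the course materials.

```python
import math

def soft(z, t):
    """Soft-thresholding (shrinkage) operator."""
    return max(z - t, 0.0) if z > 0 else min(z + t, 0.0)

def ista(D, x, lam=0.1, eta=0.2, steps=500):
    """ISTA: gradient step on the reconstruction error, then shrinkage."""
    m = len(D[0])                      # number of dictionary atoms
    a = [0.0] * m
    for _ in range(steps):
        # residual r = D a - x
        r = [sum(D[i][j] * a[j] for j in range(m)) - x[i]
             for i in range(len(x))]
        for j in range(m):
            grad = sum(D[i][j] * r[i] for i in range(len(x)))
            a[j] = soft(a[j] - eta * grad, eta * lam)
    return a

# Overcomplete 2-D dictionary: three unit-norm atoms (illustrative).
D = [[1.0, 0.0, math.cos(math.pi / 4)],
     [0.0, 1.0, math.sin(math.pi / 4)]]
x = [1.0, 1.0]            # exactly a multiple of the diagonal atom
a = ista(D, x)
```

The solver explains x almost entirely with the single diagonal atom and drives the two redundant coefficients to exactly zero, the hallmark of a sparse code.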
- Ocular dominance column development: Analysis and simulation by Miller, Keller and Stryker.
- A dimension reduction framework for understanding cortical maps by R. Durbin and G. Mitchison.
- The cortical column: a structure without a function by Jonathan C. Horton and Daniel L. Adams
Here are some additional links to papers mentioned in lecture. Optional reading:
- Gary Blasdel, Differential Imaging of Ocular Dominance and Orientation Selectivity in Monkey Striate Cortex, J Neurosci, 1992. Another source of many nice images is the galleries on Amiram Grinvald's site.
- From Clay Reid's lab, Functional imaging with cellular resolution reveals precise micro-architecture in visual cortex. Make sure you look at the supplementary material and videos on their web site (the site seems partly broken).
- A Global Geometric Framework for Nonlinear Dimensionality Reduction, Tenenbaum et al., Science 2000.
- Nonlinear Dimensionality Reduction by Locally Linear Embedding, Roweis and Saul, Science 2000.
- On the Local Behavior of Spaces of Natural Images, Carlsson et al., Int J Comput Vis (2008) 76: 1–12.
- Adaptation to natural facial categories, Michael A. Webster, Daniel Kaping, Yoko Mizokami & Paul Duhamel, Nature, 2004.
- Prototype-referenced shape encoding revealed by high-level aftereffects, David A. Leopold, Alice J. O’Toole, Thomas Vetter and Volker Blanz, Nature, 2001.
- Hopfield (1984) paper
- Kechen Zhang paper on bump circuits
- Olshausen, Anderson & Van Essen, dynamic routing circuit model
- Mixture of Gaussians model
- HKP Chapter 7, section 7.1
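The attractor dynamics behind the Hopfield reading can be sketched with the discrete binary variant of the model (Hopfield 1982; the 1984 paper is the continuous version, but the pattern-completion behavior is the same). Patterns and network size below are illustrative.

```python
# Minimal binary Hopfield associative memory: Hebbian outer-product
# weights, asynchronous threshold updates, pattern completion from a
# corrupted cue.

N = 8
patterns = [[1, 1, 1, 1, -1, -1, -1, -1],
            [1, -1, 1, -1, 1, -1, 1, -1]]   # an orthogonal pair

# Hebbian storage: W[i][j] = (1/N) * sum_p xi_i * xi_j, zero diagonal.
W = [[0.0] * N for _ in range(N)]
for p in patterns:
    for i in range(N):
        for j in range(N):
            if i != j:
                W[i][j] += p[i] * p[j] / N

def recall(state, sweeps=5):
    """Asynchronous updates; each sweep visits every unit once."""
    s = list(state)
    for _ in range(sweeps):
        for i in range(N):
            h = sum(W[i][j] * s[j] for j in range(N))
            s[i] = 1 if h >= 0 else -1
    return s

cue = list(patterns[0])
cue[0] = -cue[0]                # flip one bit of a stored pattern
out = recall(cue)
```

The corrupted cue falls back into the attractor of the stored pattern, and each stored pattern is itself a fixed point of the dynamics.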
Some suggested readings for Jon Shlens' talk.
- S.H. Nirenberg and J.D. Victor, Analyzing the activity of large populations of neurons: how tractable is the problem?, Curr Opin Neurobiol 17 (4) (2007), pp. 397--400.
- Shlens J, Rieke F, Chichilnisky E. Synchronized firing in the retina. Curr Opin Neurobiol. 2008 Oct 27.
- S. Amari (2001) Information geometry on hierarchy of probability distributions. IEEE Trans Inform Theory 47:1701-1711
- E. Schneidman, S. Still, M.J. Berry and W. Bialek, Network information and connected correlations, Phys Rev Lett 91 (2003) 238701.
- E. Schneidman, M.J. Berry, R. Segev and W. Bialek, Weak pairwise correlations imply strongly correlated network states in a neural population, Nature 440 (7087) (2006), pp. 1007-1012.
- J. Shlens, G.D. Field, J.L. Gauthier, M.I. Grivich, D. Petrusca, A. Sher, A.M. Litke and E.J. Chichilnisky, The structure of multi-neuron firing patterns in primate retina, J Neurosci 26 (32) (2006), pp. 8254-8266.
- Tang A, Jackson D, Hobbs J, Chen W, Smith JL, Patel H, Prieto A, Petrusca D, Grivich MI, Sher A, Hottowy P, Dabrowski W, Litke AM, Beggs JM. A maximum entropy model applied to spatial and temporal correlations from cortical networks in vitro. J Neurosci. 2008 Jan 9;28(2):505-18.
- Information theory primer
- Sparse coding and ICA handout
- Bell and Sejnowski, An Information-Maximization Approach to Blind Separation and Blind Deconvolution, Neural Comp, 1995.
- Hyvarinen, Hoyer, Inki, Topographic Independent Component Analysis, Neural Comp, 2001.
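The quantities the information theory primer introduces, and which the information-maximization view of ICA is built on, can be computed directly for discrete distributions. The joint probability table below is an arbitrary illustrative example.

```python
import math

def H(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# An illustrative joint distribution p(x, y) over two binary variables.
pxy = [[0.25, 0.25],
       [0.00, 0.50]]

px = [sum(row) for row in pxy]                        # marginal p(x)
py = [sum(pxy[i][j] for i in range(2)) for j in range(2)]  # marginal p(y)

Hx, Hy = H(px), H(py)
Hxy = H([p for row in pxy for p in row])              # joint entropy
I = Hx + Hy - Hxy                                     # mutual information
```

Here H(X) = 1 bit exactly, and the mutual information comes out positive because x and y are statistically dependent; for independent variables the joint entropy would equal H(X) + H(Y) and I would be zero.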
- Robbie Jacobs' notes on Kalman filter
- Greg Welch's tutorial on Kalman filter
- Dynamic texture models
- Kevin Murphy's HMM tutorial
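The simplest case treated in the Kalman filter tutorials, estimating a static scalar from noisy measurements, fits in a few lines. The true value, noise variances, and initial uncertainty below are illustrative assumptions.

```python
import random

random.seed(2)

# 1-D Kalman filter for a static state:
#   process model:      x_k = x_{k-1}           (plus tiny process noise Q)
#   measurement model:  z_k = x_k + v,  v ~ N(0, R)

x_true = -0.37
Q, R = 1e-5, 0.09          # process / measurement noise variances
xhat, P = 0.0, 1.0         # initial estimate and its variance

for _ in range(200):
    z = x_true + random.gauss(0, R ** 0.5)
    # predict: static state, so the estimate is unchanged and
    # uncertainty grows only by the process noise
    P = P + Q
    # update: blend prediction and measurement by the Kalman gain
    K = P / (P + R)
    xhat = xhat + K * (z - xhat)
    P = (1 - K) * P
```

As measurements accumulate, the gain K shrinks, the estimate converges on the true value, and the posterior variance P collapses toward the floor set by Q.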
- Chris Eliasmith, Charlie Anderson, Neural Engineering: Computation, Representation, and Dynamics in Neurobiological Systems, MIT Press, 2004.
Chapter 4 will be emailed to the class.
- Softky and Koch, The Highly Irregular Firing of Cortical Cells Is Inconsistent with Temporal Integration of Random EPSPs, J Neuroscience, January 1993, 13(1):334-350.
- Mainen and Sejnowski, Reliability of Spike Timing in Neocortical Neurons, Science, Vol 268, 6 June 1995.
- Shadlen and Newsome, Noise, neural codes and cortical organization, Curr Opin in Neur, 1994, 4:569-579.
- Shadlen and Newsome, Is there a signal in the noise?, Current Opin in Neur, 1995, 5:248-250.
- Softky, Simple codes versus efficient codes, Current Opin in Neuro, 1995, 5:239-247.
- Izhikevich, Simple model of spiking neurons, IEEE Trans Neur Networks, 14(6):2003.
- Izhikevich, Which Model to Use for Cortical Spiking Neurons?, IEEE Trans Neur Networks, 15(5):2004.
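The model in Izhikevich's 2003 paper is simple enough to simulate directly: v' = 0.04v^2 + 5v + 140 - u + I, u' = a(bv - u), with the reset v <- c, u <- u + d whenever v reaches 30 mV. The parameters below are the paper's regular-spiking values; the input current, duration, and single-step Euler scheme are illustrative choices (the paper's own code splits the v update into half-steps).

```python
# Izhikevich simple spiking-neuron model, regular-spiking parameters.
a, b, c, d = 0.02, 0.2, -65.0, 8.0
v, u = -65.0, b * -65.0        # start at rest
I, dt = 10.0, 0.5              # constant input current, 0.5 ms steps
spike_times = []

for step in range(2000):       # 1000 ms of simulated time
    v += dt * (0.04 * v * v + 5 * v + 140 - u + I)
    u += dt * a * (b * v - u)
    if v >= 30.0:              # spike: record and reset
        spike_times.append(step * dt)
        v, u = c, u + d
```

With this input the neuron fires repetitively, with the spike-frequency adaptation characteristic of the regular-spiking parameter set (the recovery variable u jumps by d at each spike and decays slowly).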
==== 4 Dec ====
- A.J. Bell, [http://redwood.berkeley.edu/amir/vs298/bell-cross-level.pdf Towards a Cross-Level Theory of Neural Learning].