VS265: Reading
Aug 28: Introduction
- HKP chapter 1
- Dreyfus, H.L. and Dreyfus, S.E. Making a Mind vs. Modeling the Brain: Artificial Intelligence Back at a Branchpoint. Daedalus, Winter 1988.
- Bell, A.J. Levels and loops: the future of artificial intelligence and neuroscience. Phil Trans: Bio Sci. 354:2013-2020 (1999)
- The 1973 Lighthill debate on the future of AI
Optional:
- Land, MF and Fernald, RD. The Evolution of Eyes, Ann Revs Neuro, 1992.
- Zhang K, Sejnowski TJ (2000) A universal scaling law between gray matter and white matter of cerebral cortex. PNAS, 97: 5621–5626.
- O'Rourke, N.A. et al. "Deep molecular diversity of mammalian synapses: why it matters and how to measure it." Nature Reviews Neuroscience 13 (2012)
- Stephen Smith Array Tomography movies
- Solari & Stoner, Cognitive Consilience
Sept 2: Neuron models
- Mead, C. Chapter 1: Introduction and Chapter 4: Neurons from Analog VLSI and Neural Systems, Addison-Wesley, 1989.
- Carandini M, Heeger D (1994) Summation and division by neurons in primate visual cortex. Science, 264: 1333-1336.
Background reading on dynamics, linear time-invariant systems and convolution, and differential equations:
Sept 4: Linear neuron, Perceptron
- HKP chapter 5, DJCM chapters 38-40, 44, DA chapter 8 (sec. 4-6)
- Linear neuron models (see the illustrative sketch at the end of this section)
Background on linear algebra:
- Linear algebra primer
- Jordan, M.I. An Introduction to Linear Algebra in Parallel Distributed Processing in McClelland and Rumelhart, Parallel Distributed Processing, MIT Press, 1985.
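For a concrete picture of this week's topic, the sketch below trains a linear neuron with the LMS (delta) rule and a perceptron with the perceptron learning rule on a toy linearly separable problem. It is an illustrative sketch only, not code from the handouts; the data, learning rate, and number of passes are arbitrary choices.

```python
# Minimal sketch (not from the course handouts): a linear neuron trained with
# the LMS/delta rule and a perceptron trained with the perceptron rule on a
# toy linearly separable problem. All data and parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two Gaussian clusters, labels in {-1, +1}.
X = np.vstack([rng.normal(+1.0, 0.5, (50, 2)),
               rng.normal(-1.0, 0.5, (50, 2))])
y = np.hstack([np.ones(50), -np.ones(50)])
X = np.hstack([X, np.ones((100, 1))])   # append a constant bias input of 1

# Linear neuron: minimize squared error with the LMS (delta) rule.
w_lms = np.zeros(3)
eta = 0.01
for _ in range(100):
    for x_i, y_i in zip(X, y):
        w_lms += eta * (y_i - w_lms @ x_i) * x_i

# Perceptron: update only on misclassified examples.
w_perc = np.zeros(3)
for _ in range(100):
    for x_i, y_i in zip(X, y):
        if y_i * (w_perc @ x_i) <= 0:
            w_perc += y_i * x_i

print("LMS weights:        ", w_lms)
print("Perceptron weights: ", w_perc)
print("Perceptron accuracy:", np.mean(np.sign(X @ w_perc) == y))
```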
Sept 11: Multicompartment models, dendritic integration (Rhodes guest lecture)
- Koch, Single Neuron Computation, Chapter 19
- Rhodes P (1999) Functional Implications of Active Currents in the Dendrites of Pyramidal Neurons
- Schiller J (2003) Submillisecond Precision of the Input–Output Transformation Function Mediated by Fast Sodium Dendritic Spikes in Basal Dendrites of CA1 Pyramidal Neurons
Sept 16, 18: Supervised learning
- HKP Chapters 5, 6
- Handout on supervised learning in single-stage feedforward networks
- Handout on supervised learning in multi-layer feedforward networks - "back propagation" (see the illustrative sketch at the end of this section)
Further reading:
- Y. LeCun, L. Bottou, G. Orr, and K.-R. Müller (1998) "Efficient BackProp," in Neural Networks: Tricks of the Trade (G. Orr and K.-R. Müller, eds.).
- NetTalk demo
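As a companion to the backpropagation handout, here is a minimal illustrative implementation of backpropagation in a one-hidden-layer sigmoid network trained on XOR with squared error and plain gradient descent. It is not the handout's code; the layer sizes, learning rate, and iteration count are arbitrary choices.

```python
# Minimal sketch (not the course handout's code): backpropagation for a
# one-hidden-layer sigmoid network trained on XOR with squared error.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)   # XOR targets

W1 = rng.normal(0, 1, (2, 4))   # input -> hidden weights
b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1))   # hidden -> output weights
b2 = np.zeros(1)
sigmoid = lambda u: 1.0 / (1.0 + np.exp(-u))
eta = 0.5

for step in range(20000):
    # Forward pass
    H = sigmoid(X @ W1 + b1)        # hidden activations
    Y = sigmoid(H @ W2 + b2)        # network outputs

    # Backward pass: propagate error derivatives through each layer
    dY = (Y - T) * Y * (1 - Y)      # delta at the output layer
    dH = (dY @ W2.T) * H * (1 - H)  # delta at the hidden layer

    # Gradient-descent updates
    W2 -= eta * H.T @ dY
    b2 -= eta * dY.sum(axis=0)
    W1 -= eta * X.T @ dH
    b1 -= eta * dH.sum(axis=0)

print(np.round(Y.ravel(), 2))   # should approach [0, 1, 1, 0]
```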
Sept 23, 24: Unsupervised learning
- HKP Chapters 8 and 9, DJCM chapter 36, DA chapter 8, 10
- Handout: Hebbian learning and PCA (see the illustrative sketch at the end of this section)
- PDP Chapter 9 (full text of Michael Jordan's tutorial on linear algebra, including a section on eigenvectors)
Optional:
- Atick, Redlich. What does the retina know about natural scenes?, Neural Computation, 1992.
- Dan, Atick, Reid. Efficient Coding of Natural Scenes in the Lateral Geniculate Nucleus: Experimental Test of a Computational Theory, J Neuroscience, 1996.
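The connection between Hebbian learning and PCA can be illustrated with Oja's rule: a Hebbian update with a weight-decay term whose weight vector converges (up to sign) to the first principal component of zero-mean data. The sketch below is illustrative only, not from the handout; the toy covariance matrix and learning rate are arbitrary choices.

```python
# Minimal sketch (not from the handout): a single linear neuron trained with
# Oja's rule, whose weight vector converges to the first principal component.
import numpy as np

rng = np.random.default_rng(0)

# Zero-mean 2-D data with most variance along the (1, 1) direction.
C = np.array([[3.0, 2.0], [2.0, 3.0]])               # covariance matrix
X = rng.multivariate_normal([0.0, 0.0], C, size=5000)

w = rng.normal(0, 0.1, 2)
eta = 0.005
for x in X:
    y = w @ x
    w += eta * y * (x - y * w)    # Oja's rule: Hebbian term with decay

# Compare to the leading eigenvector of the sample covariance.
evals, evecs = np.linalg.eigh(np.cov(X.T))
pc1 = evecs[:, np.argmax(evals)]
# The two vectors should agree up to sign.
print("Oja weight vector (normalized):", w / np.linalg.norm(w))
print("First principal component:     ", pc1)
```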
Sept 30, Oct 2: Attractor Networks and Associative Memories (Sommer guest lectures)
- "HKP" Chapter 2 and 3 (sec. 3.3-3.5), 7 (sec. 7.2-7.3), DJCM chapter 42, DA chapter 7
- Handout on attractor networks: their learning, their dynamics, and how they differ from feedforward networks (see the illustrative sketch at the end of this section)
- Hopfield82
- Hopfield84
- Willshaw69
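To make the attractor-network dynamics concrete, the sketch below builds a small Hopfield network with Hebbian (outer-product) storage and asynchronous threshold updates, then recovers a stored pattern from a corrupted cue. It is illustrative only, not code from the papers above; the network size, number of patterns, and noise level are arbitrary choices.

```python
# Minimal sketch: binary Hopfield network with Hebbian (outer-product) storage
# and asynchronous updates, recalling a stored pattern from a corrupted cue.
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 5                                  # neurons, stored patterns

patterns = rng.choice([-1, 1], size=(P, N))    # random +/-1 memories
W = (patterns.T @ patterns) / N                # Hebbian outer-product weights
np.fill_diagonal(W, 0)                         # no self-connections

# Cue: the first stored pattern with 20% of its bits flipped.
state = patterns[0].copy()
flip = rng.choice(N, size=20, replace=False)
state[flip] *= -1

# Asynchronous updates: visit units in random order until no unit changes.
for _ in range(20):
    changed = False
    for i in rng.permutation(N):
        s_new = 1 if W[i] @ state >= 0 else -1
        if s_new != state[i]:
            state[i] = s_new
            changed = True
    if not changed:
        break

print("Overlap with stored pattern:", (state @ patterns[0]) / N)  # ~1.0 if recalled
```

With 5 patterns in 100 units the network is loaded well below the roughly 0.14N capacity discussed in Hopfield (1982), so the corrupted cue settles back to the stored memory.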
Oct 7: Ecological utility and the mythical neural code (Feldman guest lecture)
- Feldman10 Ecological utility and the mythical neural code
Oct 9: Hyperdimensional computing (Kanerva guest lecture)
- Kanerva, Computing with 10,000-bit words (http://redwood.berkeley.edu/vs265/Kanerva-allerton2014.pdf)
- Kanerva, Hyperdimensional Computing (http://redwood.berkeley.edu/vs265/kanerva09-hyperdimensional.pdf)
Oct 16: Structural and Functional Connectomics (Tom Dean guest lecture)
- Lecture notes and slides (http://cs.brown.edu/people/tld/note/blog/14/10/16/index.html)