VS265: Syllabus
Syllabus
Aug. 28: Introduction
- Theory and modeling in neuroscience
- Goals of AI/machine learning vs. theoretical neuroscience
- Turing vs. neural computation
Sept. 2,4: Neuron models
- Membrane equation, compartmental model of a neuron (sketch below)
- Linear systems: vectors, matrices, linear neuron models
- Perceptron model and linear separability
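A minimal sketch (Python/NumPy; the constants are illustrative, not values from the course) of forward-Euler integration of the passive membrane equation, tau dV/dt = -(V - E_L) + R I(t):

    import numpy as np

    # Passive membrane equation integrated with forward Euler:
    #   tau * dV/dt = -(V - E_L) + R * I(t)
    tau, E_L, R = 20.0, -65.0, 10.0   # ms, mV, MOhm (illustrative values)
    dt, T = 0.1, 100.0                # ms
    t = np.arange(0.0, T, dt)
    I = np.where((t > 20) & (t < 80), 1.5, 0.0)   # 1.5 nA current step

    V = np.empty_like(t)
    V[0] = E_L
    for k in range(1, len(t)):
        dV = (-(V[k-1] - E_L) + R * I[k-1]) / tau
        V[k] = V[k-1] + dt * dV       # voltage relaxes toward E_L + R*I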
Sept. 9,11: Guest lectures
- Matlab/Python tutorial
- Paul Rhodes, Evolved Machines: Multi-compartment models; dendritic integration
Sept. 16,18: Supervised learning
- Perceptron learning rule (sketch below)
- Adaptation in linear neurons, Widrow-Hoff rule
- Objective functions and gradient descent
- Multilayer networks and backpropagation
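A minimal sketch of the perceptron learning rule on a toy linearly separable problem (NumPy; the data and teacher weights are made up for illustration):

    import numpy as np

    rng = np.random.default_rng(0)
    # Toy linearly separable data: labels y in {-1, +1} from a random teacher
    X = rng.normal(size=(100, 2))
    y = np.sign(X @ np.array([1.0, -2.0]) + 0.5)

    w, b = np.zeros(2), 0.0
    for epoch in range(100):
        errors = 0
        for x_i, y_i in zip(X, y):
            if y_i * (w @ x_i + b) <= 0:   # misclassified (or on the boundary)
                w += y_i * x_i             # perceptron update
                b += y_i
                errors += 1
        if errors == 0:                    # converged: every point correct
            break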
Sept. 23,25: Unsupervised learning
- Linear Hebbian learning and PCA, decorrelation (sketch below)
- Winner-take-all networks and clustering
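A sketch of linear Hebbian learning with Oja's normalization, whose weight vector converges to the first principal component of the input (the data here is synthetic, for illustration only):

    import numpy as np

    rng = np.random.default_rng(0)
    # Zero-mean data with one dominant direction of variance
    X = rng.normal(size=(5000, 2)) * np.array([3.0, 0.5])

    w = rng.normal(size=2)
    eta = 0.01
    for x in X:
        y = w @ x                      # linear neuron output
        w += eta * y * (x - y * w)     # Oja's rule: Hebbian term plus decay
    # w now approximates the top principal component (up to sign)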
Sept. 30, Oct. 2: Guest lecture
- Fritz Sommer: Associative memories and attractor neural networks
Oct. 7,9: Guest lectures
- Jerry Feldman:
- Pentti Kanerva: Computing with 10,000 bits
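A rough sketch of the style of high-dimensional computing Kanerva describes: random 10,000-bit (here bipolar) vectors, binding by elementwise multiplication, bundling by majority vote, and comparison by similarity. The roles and fillers below are hypothetical examples:

    import numpy as np

    rng = np.random.default_rng(0)
    D = 10_000

    def rand_hv():
        return rng.choice([-1, 1], size=D)   # random bipolar hypervector

    def sim(a, b):
        return (a @ b) / D                   # cosine similarity for +/-1 vectors

    # Bind role-filler pairs by multiplication; bundle by majority (sign of sum)
    role_color, role_shape, role_size = rand_hv(), rand_hv(), rand_hv()
    red, circle, small = rand_hv(), rand_hv(), rand_hv()
    record = np.sign(role_color * red + role_shape * circle + role_size * small)

    # Unbinding with a role recovers a noisy version of its filler
    probe = record * role_color
    print(sim(probe, red), sim(probe, circle))   # ~0.5 vs. ~0.0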
Oct. 14: Unsupervised learning (continued)
- Linear Hebbian learning and PCA, decorrelation
- Winner-take-all networks and clustering
Oct. 16: Guest lecture
- Tom Dean, Google: Connectomics
Oct. 21: Sparse, distributed coding
- Autoencoders (sketch below)
- Natural image statistics
- Projection pursuit
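A minimal two-layer autoencoder trained by gradient descent on its reconstruction error (NumPy; the random "patches", layer sizes, and learning rate are illustrative):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(256, 16))        # stand-in for image patches
    n_in, n_hid = 16, 4
    W1 = rng.normal(scale=0.1, size=(n_in, n_hid))
    W2 = rng.normal(scale=0.1, size=(n_hid, n_in))
    eta = 0.01

    for step in range(2000):
        H = np.tanh(X @ W1)               # encode
        X_hat = H @ W2                    # decode (linear readout)
        err = X_hat - X                   # reconstruction error
        dW2 = H.T @ err / len(X)          # backprop through both layers
        dH = (err @ W2.T) * (1 - H**2)    # tanh derivative
        dW1 = X.T @ dH / len(X)
        W1 -= eta * dW1
        W2 -= eta * dW2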
Oct. 23: Plasticity and cortical maps
- Cortical maps
- Self-organizing maps, Kohonen nets (sketch below)
- Models of experience-dependent learning and cortical reorganization
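A sketch of a one-dimensional Kohonen map: each input pulls the best-matching unit and its topological neighbors toward it, with learning rate and neighborhood width shrinking over time (all schedules here are illustrative):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.uniform(size=(2000, 2))                    # inputs in the unit square
    n_units = 20
    W = rng.uniform(size=(n_units, 2))                 # weights of a 1-D chain of units
    idx = np.arange(n_units)

    for t, x in enumerate(X):
        frac = 1 - t / len(X)
        eta = 0.5 * frac                               # decaying learning rate
        sigma = 3.0 * frac + 0.5                       # shrinking neighborhood width
        bmu = np.argmin(np.sum((W - x) ** 2, axis=1))  # best-matching unit
        h = np.exp(-((idx - bmu) ** 2) / (2 * sigma ** 2))
        W += eta * h[:, None] * (x - W)                # pull neighborhood toward x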
Oct. 28: Manifold learning
- Locally linear embedding (LLE), Isomap
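Assuming scikit-learn is available, both methods are one call each on a toy "Swiss roll" (the neighbor counts below are illustrative, not course-prescribed):

    from sklearn.datasets import make_swiss_roll
    from sklearn.manifold import Isomap, LocallyLinearEmbedding

    X, _ = make_swiss_roll(n_samples=1000, random_state=0)

    # Unroll the 3-D manifold into 2 intrinsic dimensions
    Y_iso = Isomap(n_neighbors=10, n_components=2).fit_transform(X)
    Y_lle = LocallyLinearEmbedding(n_neighbors=10, n_components=2,
                                   random_state=0).fit_transform(X)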
Oct. 30, Nov. 4,6: Recurrent networks
- Hopfield networks, memories as 'basins of attraction' (sketch below)
- Line attractors and 'bump circuits'
- Dynamical models
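A minimal Hopfield network sketch: store random patterns with the outer-product (Hebbian) rule, then recall one from a corrupted cue by asynchronous updates that descend into its basin of attraction (sizes are illustrative):

    import numpy as np

    rng = np.random.default_rng(0)
    N, P = 100, 5
    patterns = rng.choice([-1, 1], size=(P, N))

    W = patterns.T @ patterns / N         # Hebbian outer-product storage
    np.fill_diagonal(W, 0.0)              # no self-connections

    s = patterns[0].copy()                # start from a corrupted pattern
    flip = rng.choice(N, size=15, replace=False)
    s[flip] *= -1
    for _ in range(10):                   # asynchronous updates
        for i in rng.permutation(N):
            s[i] = 1 if W[i] @ s >= 0 else -1
    print(np.mean(s == patterns[0]))      # typically 1.0: recall succeeded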
Nov. 13,18,20,25: Probabilistic models and inference
- Probability theory and Bayes’ rule
- Learning and inference in generative models
- The mixture of Gaussians model
- Boltzmann machines
- Sparse coding and 'ICA'
- Kalman filter model (sketch below)
- Energy-based models
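As one concrete instance of inference in a generative model, a scalar Kalman filter alternating predict and update steps on a noisy random walk (all noise variances are illustrative):

    import numpy as np

    rng = np.random.default_rng(0)
    # Latent random walk observed with noise: x_t = x_{t-1} + w_t, y_t = x_t + v_t
    Q, R = 0.01, 1.0                  # process and observation noise variances
    x_true = np.cumsum(rng.normal(scale=np.sqrt(Q), size=200))
    y = x_true + rng.normal(scale=np.sqrt(R), size=200)

    mu, P = 0.0, 1.0                  # posterior mean and variance of x
    estimates = []
    for y_t in y:
        P = P + Q                     # predict: uncertainty grows
        K = P / (P + R)               # Kalman gain
        mu = mu + K * (y_t - mu)      # update: move toward the observation
        P = (1 - K) * P               # update: uncertainty shrinks
        estimates.append(mu)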
Dec. 2,4: Neural implementations
- Integrate-and-fire model (sketch below)
- Neural encoding and decoding
- Limits of precision in neurons
- Neural synchrony and phase-based coding
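A sketch of a leaky integrate-and-fire neuron: the passive membrane equation from earlier in the syllabus, plus a spike threshold and reset (constants are illustrative):

    import numpy as np

    tau, E_L, R = 20.0, -65.0, 10.0   # ms, mV, MOhm (illustrative)
    V_th, V_reset = -50.0, -70.0      # spike threshold and reset, mV
    dt, T = 0.1, 200.0                # ms
    t = np.arange(0.0, T, dt)
    I = 2.0                           # constant input current, nA

    V = E_L
    spikes = []
    for k in range(len(t)):
        V += dt * (-(V - E_L) + R * I) / tau   # integrate the membrane equation
        if V >= V_th:                 # threshold crossing: emit a spike
            spikes.append(t[k])
            V = V_reset               # reset after the spike
    print(len(spikes), "spikes in", T, "ms")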
Dec. 9,11: Special topics
- TBD
- TBD