RW001

This is the Spring '13 Matrix Analysis class wiki.

Course description

This is the first-ever course of its kind on Matrix Analysis. Matrices are generally awesome, and since most of us use Matlab, it would help to know more about them.

The intention is to provide people with some matrix theory and tools that are relevant to machine learning, artificial intelligence, and neuroscience.

Topics we will attempt to cover, in no particular order and with no guarantee that all of them happen (a small Matlab sketch of a few of these tools appears after the list):

  • basic notions of matrix theory - vector spaces, rank, dimension, determinants, etc.
  • eigenvectors and eigenvalues, norms and inequalities, matrix functions
  • special matrices - stochastic, orthogonal, unitary, symmetric, positive definite, etc.
  • matrix decompositions - SVD, polar, diagonalizability, Jordan form, etc.
  • applications to probability & data analysis - Max/Min Entropy, Markov chains, PCA, ICA, etc.
  • applications to optimization - Newton and Hessian methods, conjugate gradient, natural gradient, etc.
  • applications to machine learning - spectral relaxations for clustering, gradients of objective functions in matrices, etc.
  • applications to dynamical systems - linear systems theory, associative networks, etc.
  • applications to neuroscience - ACS, MPF, spike-timing max entropy neural codes, etc.
  • applications to noneq stat mech - Schnakenberg theory, Crooks & Jarzynski theory, etc.
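To give a flavor of a few items on this list, here is a minimal Matlab sketch (illustrative only, not course material; the matrices and data are made up): an eigendecomposition of a symmetric positive definite matrix, its SVD, and PCA computed via the SVD of a centered data matrix.

 % Eigendecomposition of a symmetric positive definite matrix.
 A = randn(5);
 S = A'*A + 5*eye(5);               % symmetric positive definite by construction
 [V, D] = eig(S);                   % columns of V: orthonormal eigenvectors
 disp(diag(D)')                     % eigenvalues are real and positive
 
 % Singular value decomposition: S = U*Sigma*W'.
 [U, Sigma, W] = svd(S);
 
 % PCA of synthetic data via the SVD of the centered data matrix.
 X = randn(200, 5) * A;             % 200 samples with correlated columns
 Xc = X - repmat(mean(X), 200, 1);  % subtract each column's mean
 [~, Sv, Pc] = svd(Xc, 'econ');     % columns of Pc: principal directions
 disp((diag(Sv).^2 / 199)')         % variance explained per component

For the symmetric positive definite S above, eig and svd agree up to signs of the vectors; making observations like that precise is the kind of thing this course is about.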

Instructors

Chris Hillar

  • Website
  • Office: 573 Evans
  • Office hours: immediately following lecture, or by e-mail

Sarah Marzen, Scribe and GSI

  • Email:
  • Office: 567 Evans

Mayur Mudigonda, Web stuff

Lectures

  • Location: 560 Evans (Redwood Center Conference Hall)
  • Times: First and Third Thursdays - 3:30 PM to 5 PM
  • Lecture 1 (intro lecture): [1]
  • Lecture 2: [2]

Enrollment information

Open to all interested members of the Berkeley community. The lectures will not be recorded, so attending the fortnightly talks and following the e-mail list are the only ways to participate.

Email list and forum

Grading

Self-grading with solutions in class. Since this is the first time this class is being offered, the details are still being worked out.

Required background

Prerequisites are calculus, ordinary differential equations, basic probability and statistics, and linear algebra. Familiarity with programming in a high-level language such as Matlab is also required.

Textbooks

  • Matrix Analysis, Horn and Johnson
  • Basic Linear Algebra Notes