Jascha

I am a graduate student in the [http://redwood.berkeley.edu Redwood Center for Theoretical Neuroscience] at the University of California, Berkeley.  I am a member of [https://redwood.berkeley.edu/bruno/ Bruno Olshausen's] lab and the [http://biophysics.berkeley.edu/ Biophysics Graduate Group].  My email address is [mailto:jascha@berkeley.edu jascha@berkeley.edu].
 
  
I am interested in how we learn to perceive the world.  There is evidence that much of our representation of the world is learned during development rather than being pre-programmed - everything from the way light intensity is correlated across adjacent patches of the retina, all the way up to the behavior (and existence!) of objects.  We seem to infer most of human-scale physics from examples of sensory input.
 
 
How this unsupervised learning problem is solved - how we learn the structure inherent in the world just by experiencing examples of it - is not well understood.  This is the problem I am interested in tackling.
 
 
Practically, I develop techniques for training highly flexible but difficult-to-work-with (non-normalizable) probabilistic models, using ideas from statistical mechanics and dynamical systems.
 
 
== Current projects ==
 
 
I am working with Peter Battaglino and Michael DeWeese on a technique for parameter estimation in probabilistic models with intractable partition functions, involving minimization of probability flows.  See the [http://arxiv.org/abs/0906.4779 arXiv preprint].  Matlab code implementing Minimum Probability Flow learning for the Ising model and RBM cases, and for comparing performance against other techniques in the RBM case, is available in my public github [https://github.com/Sohl-Dickstein/Jascha-Sohl-Dickstein-research-code repository].
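
To illustrate the objective being minimized, here is a minimal NumPy sketch (illustrative only - not the Matlab code from the repository) of the MPF objective for an Ising-style model over binary vectors, assuming single-bit-flip connectivity between states:

```python
import numpy as np

def ising_energy(X, J):
    # E(x) = -0.5 * x^T J x for each row x of X (binary 0/1 vectors).
    return -0.5 * np.einsum('ni,ij,nj->n', X, J, X)

def mpf_objective(X, J, eps=1.0):
    # MPF objective with single-bit-flip connectivity:
    #   K = (eps/N) * sum_{x in data} sum_{x' one bit flip from x}
    #                 exp((E(x) - E(x')) / 2)
    # Minimizing K pulls probability flow away from the data states.
    N, D = X.shape
    E_data = ising_energy(X, J)
    K = 0.0
    for d in range(D):
        X_flip = X.copy()
        X_flip[:, d] = 1.0 - X_flip[:, d]   # flip bit d in every data vector
        E_flip = ising_energy(X_flip, J)
        K += np.exp(0.5 * (E_data - E_flip)).sum()
    return eps * K / N
```

Note that no partition function appears anywhere: only energy differences between data states and their single-bit-flip neighbors are needed, which is what makes the objective tractable for non-normalizable models.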
 
 
I am working with Jimmy Wang and Bruno Olshausen to build a Lie algebraic model of the transformations that occur in natural video.  See the [http://arxiv.org/abs/1001.1027 arXiv preprint], a [http://marswatch.astro.cornell.edu/jascha/pdfs/jimmy_jascha_bavrd_09.pdf poster pdf], or [https://redwood.berkeley.edu/jwang/research.html Jimmy's web page].
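
As a toy illustration of the Lie group idea (not the model from the paper): a continuous transformation is parameterized as the matrix exponential of a generator, x(theta) = exp(theta * A) x(0), and the model's job is to learn A from video.  With the in-plane rotation generator, a quarter turn falls out directly; a sketch using a truncated Taylor series for the matrix exponential:

```python
import numpy as np

def expm(A, order=30):
    # Truncated Taylor series for the matrix exponential exp(A).
    out = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, order + 1):
        term = term @ A / k
        out = out + term
    return out

# The in-plane rotation generator: exp(theta * A) rotates by theta.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])
theta = np.pi / 2
R = expm(theta * A)          # a 90-degree rotation matrix
x = np.array([1.0, 0.0])
y = R @ x                    # rotates (1, 0) onto (0, 1)
```

In the actual model the generator acts on image pixels rather than 2-d points, and is learned rather than fixed, but the exponential-of-a-generator structure is the same.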
 
 
I am working with Jack Culpepper and Bruno Olshausen on novel uses of sampling algorithms in learning.  Specifically, efficient ways to maintain the full posterior during EM, and ways to exactly calculate the log likelihood and partition function for distributions by treating the sampling chain as an alternative analytic form for the distribution.
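
The simplest instance of using sampling to get at a partition function is plain importance sampling, where Z = E_q[p*(x)/q(x)] for an unnormalized target p* and tractable proposal q; the chain-based estimators mentioned above generalize this idea.  A toy sketch for a 1-d unnormalized Gaussian (all names here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

def log_p_unnorm(x):
    # Unnormalized target exp(-x^2/2); its true partition function
    # is Z = sqrt(2*pi).
    return -0.5 * x**2

n = 200000
sigma_q = 2.0                                    # a broad Gaussian proposal q
x = rng.normal(0.0, sigma_q, size=n)             # samples from q
log_q = -0.5 * (x / sigma_q)**2 - np.log(sigma_q * np.sqrt(2.0 * np.pi))
w = np.exp(log_p_unnorm(x) - log_q)              # importance weights p*/q
Z_hat = w.mean()                                 # unbiased estimate of Z
```

The estimator is unbiased for any q that covers the support of p*, but its variance explodes when q is a poor match - which is exactly the problem that annealed and chain-based estimators are designed to fix.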
 
 
I am experimenting with techniques for online Hessian-aware learning.  More on this soon...
 
 
I am working to develop deep architectures for unsupervised learning based on deterministic, recurrent, networks.
 
 
I am working with Nicol Harper and Chris Rodgers to build a device enabling human echolocation.
 
 
== Notes ==
 
 
* [http://marswatch.astro.cornell.edu/jascha/pdfs/MPF_sampling.pdf Sampling the Connectivity Pattern in Minimum Probability Flow Learning] - Describes how the connectivity pattern between states in MPF can be defined by a proposal distribution rather than a deterministic rule.
 
 
* [http://marswatch.astro.cornell.edu/jascha/pdfs/generic_entropy_091121.pdf Entropy of Generic Distributions] - Calculates the entropy that can be expected for a distribution drawn at random from the simplex of all possible distributions ([http://dittler.us/ John Schulman] points out that ET Jaynes deals with similar questions in chapter 11 of "Probability Theory: The Logic Of Science")
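
The quantity in that note is easy to sanity-check numerically: drawing a distribution uniformly from the simplex is a Dirichlet(1,...,1) draw, and the expected entropy of such a draw has the known closed form psi(n+1) - psi(2) = H_n - 1 nats, where H_n is the n-th harmonic number.  A quick Monte Carlo check:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3

# Draw distributions uniformly from the n-simplex: Dirichlet with all
# concentration parameters equal to one.
P = rng.dirichlet(np.ones(n), size=200000)
H = -(P * np.log(P)).sum(axis=1)       # entropy (nats) of each draw
mc_mean = H.mean()

# Known closed form for the expected entropy: H_n - 1.
closed_form = sum(1.0 / k for k in range(1, n + 1)) - 1.0
```

For n = 3 the closed form is 11/6 - 1 ≈ 0.833 nats, noticeably below the maximum log(3) ≈ 1.099 - generic distributions are measurably non-uniform.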
 
 
The following are titles for informal notes I intend to write, but haven't gotten to/finished yet.  If any of the following sound interesting to you, pester me and they will appear more quickly.
 
 
* Natural gradients explained via an analogy to signal whitening
 
* A log bound on the growth of intelligence with system size
 
* The [http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=1467533 field of experts model] learns Gabor-like receptive fields when trained via minimum probability flow or score matching
 
* For small time bins, [http://www.jneurosci.org/cgi/content/abstract/25/47/11003 generalized linear models] and causal Boltzmann machines become equivalent
 
* How to construct phase space volume preserving recurrent networks
 
* Maximum likelihood learning as constraint satisfaction
 
* A spatial derivation of score matching
 
 
== Code ==
 
 
Hamiltonian Annealed Importance Sampling code is available at [https://github.com/Sohl-Dickstein/Hamiltonian-Annealed-Importance-Sampling https://github.com/Sohl-Dickstein/Hamiltonian-Annealed-Importance-Sampling]. This repository contains Matlab code to perform partition function estimation, log likelihood estimation, and importance weight estimation in models with intractable partition functions and continuous state spaces, using Hamiltonian Annealed Importance Sampling.  It can also be used for standard Hamiltonian Monte Carlo sampling (single step, with partial momentum refreshment).
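
As a sketch of the sampler's core move (hypothetical Python, not the repository's Matlab): partial momentum refreshment mixes the old momentum with fresh Gaussian noise, followed by a single leapfrog step of the Hamiltonian dynamics.  A complete sampler would add a Metropolis accept/reject step, negating the momentum on rejection:

```python
import numpy as np

def hmc_step(x, p, grad_E, eps=0.1, beta=0.25, rng=np.random.default_rng(0)):
    """One single-step HMC proposal with partial momentum refreshment.

    x, p   : position and momentum
    grad_E : gradient of the energy function E(x)
    beta   : fraction of momentum variance replaced by fresh noise
             (beta = 1 recovers full refreshment, beta = 0 none)
    """
    # Partial momentum refreshment: blend old momentum with Gaussian noise
    # so that a unit-Gaussian momentum distribution is preserved.
    p = np.sqrt(1.0 - beta) * p + np.sqrt(beta) * rng.standard_normal(np.shape(p))
    # A single leapfrog step of the Hamiltonian dynamics.
    p_half = p - 0.5 * eps * grad_E(x)
    x_new = x + eps * p_half
    p_new = p_half - 0.5 * eps * grad_E(x_new)
    return x_new, p_new

# For a quadratic energy E(x) = x^2/2 and beta = 0 (no refreshment), the
# leapfrog step nearly conserves H = E(x) + p^2/2: the error is O(eps^3).
x1, p1 = hmc_step(1.0, 0.5, grad_E=lambda x: x, beta=0.0)
```

Partial refreshment lets the chain keep much of its momentum between steps, so successive single-step moves still travel coherently through state space the way a long HMC trajectory would.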
 
 
Miscellaneous additional code related to my research is available on my public github [https://github.com/Sohl-Dickstein/Jascha-Sohl-Dickstein-research-code repository].  The code there includes:
 
 
* '''MPF_ising/ -''' Matlab code for parameter estimation in the Ising model via Minimum Probability Flow learning.
 
* '''MPF_RBM_compare_log_likelihood/ -''' Matlab code for parameter estimation in Restricted Boltzmann Machines via Minimum Probability Flow learning, pseudolikelihood, and Contrastive Divergence, along with code comparing the log likelihood of data under the RBMs estimated by all three techniques.
 
 
== Publications ==
 
 
J Sohl-Dickstein, BJ Culpepper. Hamiltonian annealed importance sampling for partition function estimation.  Under review. http://marswatch.astro.cornell.edu/jascha/pdfs/HAIS.pdf
 
 
J Sohl-Dickstein, P Battaglino, M DeWeese. Minimum Probability Flow Learning.  Under review. http://marswatch.astro.cornell.edu/jascha/pdfs/MPF.pdf
 
 
A Hayes, J Grotzinger, L Edgar, SW Squyres, W Watters, J Sohl-Dickstein. Reconstruction of Eolian Bed Forms and Paleocurrents from Cross-Bedded Strata at Victoria Crater, Meridiani Planum, Mars. Journal of Geophysical Research (2011)
 
 
CM Wang, J Sohl-Dickstein, I Tosik. Lie Group Transformation Models for Predictive Video Coding. Proceedings of the Data Compression Conference (2011) http://marswatch.astro.cornell.edu/jascha/pdfs/PID1615931.pdf
 
 
J Sohl-Dickstein, CM Wang, BA Olshausen. An Unsupervised Algorithm For Learning Lie Group Transformations. (2009) http://arxiv.org/abs/1001.1027
 
 
J Sohl-Dickstein, P Battaglino, M DeWeese. Minimum probability flow learning. (2009) http://arxiv.org/abs/0906.4779
 
 
C Abbey, J Sohl-Dickstein, BA Olshausen. Higher-order scene statistics of breast images. Proceedings of SPIE (2009) http://link.aip.org/link/?PSISDG/7263/726317/1
 
 
K Kinch, J Sohl-Dickstein, J Bell III, JR Johnson, W Goetz, GA Landis. Dust deposition on the Mars Exploration Rover Panoramic Camera (Pancam) calibration targets. Journal of Geophysical Research-Planets (2007) http://www.agu.org/pubs/crossref/2007/2006JE002807.shtml
 
 
POSTER - J Sohl-Dickstein, BA Olshausen. Learning in energy based models via score matching. Cosyne (2007) - this (dense!) poster introduces a spatial derivation of score matching, applies it to learning in a Field of Experts model, and then extends Field of Experts to work with heterogeneous experts (to form a "tapestry of experts").  I'm including it as it hasn't been written up elsewhere. [http://marswatch.astro.cornell.edu/jascha/pdfs/jascha_cosyne_07.pdf download poster]
 
<!-- POSTER - J Wang, J Sohl-Dickstein, BA Olshausen. Unsupervised learning of Lie group operators from natural movies.  Bay Area Vision Research Day (2009). [http://marswatch.astro.cornell.edu/jascha/pdfs/jimmy_jascha_bavrd_09.pdf download poster] -->
 
<!-- POSTER - J Sohl-Dickstein, J Wang, B Olshausen. A Global Energy Function for Deep Belief Networks. Cosyne (2008) - some extensions to DBNs - fixes a problem which probably didn't need to be fixed.  The most interesting part may be noting the full joint distribution for a Deep Belief Network in the left column. [[Media:jascha_cosyne_08_poster.pdf|download poster]] -->
 
 
JR Johnson, J Sohl-Dickstein, WM Grundy, RE Arvidson, J Bell III, P Christensen, T Graff, EA Guinness, K Kinch, R Morris, MK Shepard. Radiative transfer modeling of dust-coated Pancam calibration target materials: Laboratory visible/near-infrared spectrogoniometry. Journal of Geophysical Research (2006) http://www.agu.org/pubs/crossref/2006/2005JE002658.shtml
 
 
J Bell III, J Joseph, J Sohl-Dickstein, H Arneson, M Johnson, M Lemmon, D Savransky. In-flight calibration and performance of the Mars Exploration Rover Panoramic Camera (Pancam) instruments. Journal of Geophysical Research (2006) http://www.agu.org/pubs/crossref/2006/2005JE002444.shtml
 
 
Parker et al. Stratigraphy and sedimentology of a dry to wet eolian depositional system, Burns formation, Meridiani Planum, Mars. Earth and Planetary Science Letters (2005)
 
 
Soderblom et al. Pancam multispectral imaging results from the Opportunity rover at Meridiani Planum. Science (2004) http://www.sciencemag.org/content/306/5702/1703
 
 
Soderblom et al. Pancam multispectral imaging results from the Spirit rover at Gusev crater. Science (2004) http://www.sciencemag.org/content/305/5685/800
 
 
Smith et al. Athena microscopic imager investigation. Journal of Geophysical Research-Planets (2003)
 
 
Bell et al. Hubble Space Telescope Imaging and Spectroscopy of Mars During 2001. American Geophysical Union (2001)
 

Latest revision as of 23:31, 18 February 2011