Probabilistic Inference Group (PIGS)
The Probabilistic Inference Group (PIGS) is a paper discussion group focusing on probabilistic and information-theoretic approaches to machine learning problems. Meetings are generally held fortnightly on Mondays at 10:30am in room 2.33 of the Informatics Forum. Announcements are made through the PIGS mailing list.
Instructions for presenters:
1) Choose a mainstream ML paper (or two).
2) Provide the paper(s) at least one week in advance of the meeting.
3) Lead a discussion of the paper(s) in the meeting.
Mainstream means a paper that does not depend heavily on domain-specific background to be comprehensible. A possible test is whether the techniques could fairly readily be transferred to another application area. Papers from conferences such as NIPS, ICML, UAI and AISTATS, and from journals such as JMLR, as well as ML papers from IEEE PAMI, are likely to be in scope; papers from other sources could well fit too.
Students should discuss their paper selections with their supervisor to make sure they are reasonable choices. It is acceptable to relate the selected papers to the presenter's research, but not at the expense of discussion of the selected paper.
If people want to make thematic groupings of readings, it should be possible to arrange swaps in the rota.
Academic Year 2014-2015
Jun 29: Pol
Markov Chain Monte Carlo and Variational Inference: Bridging the Gap
Tim Salimans, Diederik P. Kingma and Max Welling
Jun 15: George
Just-In-Time Learning for Fast and Flexible Inference
S. M. Ali Eslami, D. Tarlow, P. Kohli and J. Winn
Jun 8: Rich Caruana (invited speaker)
High-Accuracy Intelligible Models for HealthCare
Jun 1: Cancelled (NIPS deadline)
May 18: Cancelled
May 4: Sohan
Scalable methods for nonnegative matrix factorizations of near-separable tall-and-skinny matrices
A.R. Benson, J.D. Lee, B. Rajwa and D.F. Gleich
Apr 20: Matt
A* Sampling
Chris J. Maddison, Daniel Tarlow and Tom Minka (NIPS 2014)
Mar 23: Jinli
Identifying and attacking the saddle point problem in high-dimensional non-convex optimization
Yann N. Dauphin et al.
Mar 9: Zhanxing
Bayesian Sampling Using Stochastic Gradient Thermostats
Nan Ding et al.
Feb 23: Konstantinos
Modeling Deep Temporal Dependencies with Recurrent Grammar Cells
Vincent Michalski, Roland Memisevic, Kishore Konda
Feb 9: Iain
Stochastic Variational Inference
Matthew D. Hoffman, David M. Blei, Chong Wang, John Paisley
Jan 26: NIPS 2014 review
Krzysztof: Do Deep Nets Really Need to be Deep?
Jimmy Ba, Rich Caruana
Amos: Generative Adversarial Nets
Ian J. Goodfellow et al.
Gavin: Spectral Methods Meet EM: A Provably Optimal Algorithm for Crowdsourcing
Yuchen Zhang et al.
Pol: Learning Generative Models with Visual Attention
Yichuan Tang et al.
Jinli: Factoring Variations in Natural Images with Deep Gaussian Mixture Models
Aaron van den Oord, Benjamin Schrauwen
Chris: Sequence to sequence learning with neural networks
Sutskever, Vinyals, Le
Yichuan: A Multiplicative Model for Learning Distributed Text-Based Attribute Representations
Ryan Kiros et al.
Jari: Unsupervised Transcription of Piano Music
Berg-Kirkpatrick et al.
Dr Mike Smith (visiting speaker, Makerere University): Informatics in a Developing Country (Pollution, Traffic and Malaria)
Meetings in 2014
Meetings in 2013
Meetings in 2012
Meetings in 2011
Meetings in 2010
Meetings in 2009
Meetings in 2008
Meetings in 2007
Earlier meetings (2002-2006) on old website