Statistics and Machine Learning Reading Group 2014-16
IMPORTANT NOTICE: the current webpage for the reading group can be found .
Suggested topics and papers:
- More about the Automatic Statistician can be found .
- Particle filtering. Resources include: Murphy (2012) Machine Learning: A Probabilistic Perspective chapter 23.5 (available from the university library, or see Jim Skinner for other forms of access).
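As a taster for the particle-filtering session, the bootstrap particle filter covered in Murphy chapter 23.5 can be sketched in a few lines of NumPy. The 1-D random-walk model, noise levels, and particle count below are illustrative assumptions, not taken from the chapter:

```python
import numpy as np

def bootstrap_particle_filter(ys, n_particles=500, q=0.1, r=0.5, seed=0):
    """Bootstrap particle filter for a 1-D random-walk state-space model.

    State:       x_t = x_{t-1} + N(0, q^2)
    Observation: y_t = x_t + N(0, r^2)
    Returns the filtered posterior mean E[x_t | y_{1:t}] at each step.
    """
    rng = np.random.default_rng(seed)
    particles = rng.normal(0.0, 1.0, n_particles)  # prior draws for x_0
    means = []
    for y in ys:
        # Propagate: sample from the transition density p(x_t | x_{t-1}).
        particles = particles + rng.normal(0.0, q, n_particles)
        # Weight: Gaussian observation likelihood p(y_t | x_t),
        # computed in log space for numerical stability.
        logw = -0.5 * ((y - particles) / r) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # Resample (multinomial) to fight weight degeneracy.
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx]
        means.append(particles.mean())
    return np.array(means)

# Track a slowly drifting latent state observed in noise.
true_x = np.linspace(0.0, 2.0, 50)
rng = np.random.default_rng(1)
ys = true_x + rng.normal(0.0, 0.5, 50)
est = bootstrap_particle_filter(ys)
```

Resampling at every step is the simplest scheme; Murphy also discusses resampling only when the effective sample size drops, which reduces Monte Carlo variance.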
Previously discussed topics and papers:
- VC dimension.
- Dropout for neural networks.
- Speech recognition.
- Latent Dirichlet allocation. Resources include: Murphy (2012) Machine Learning: A Probabilistic Perspective chapter 27.3 (available from the university library, or see Jim Skinner for other forms of access).
- Gaussian process regression derivations.
- GP-LVM.
- Recommendation engines. Resources include: Leskovec et al. (2014) Mining of Massive Datasets chapter 9.
- Naive Bayes. Resources include: Murphy (2012) Machine Learning: A Probabilistic Perspective chapters 3.5 and 10.2.1 (available from the university library, or see Jim Skinner for other forms of access).
- Latent linear models. Resources include: Murphy (2012) Machine Learning: A Probabilistic Perspective chapters 12.1 and 12.3 (available from the university library, or see Jim Skinner for other forms of access).
- Kernels and SVMs. Resources include: Murphy (2012) Machine Learning: A Probabilistic Perspective chapter 14 (available from the university library, or see Jim Skinner for other forms of access).
- Guest speaker: . NB: different location - room B3.02.
Gaussian processes in Applied Neuroscience: a case study in Deep Brain Stimulation.
Deep brain stimulation (DBS) is a treatment for movement disorders such as Parkinson's disease, dystonia, and essential tremor. It usually consists of implanting a stimulator in the infraclavicular region, connected to an electrode lead placed in a target structure in the basal ganglia, particularly the subthalamic nucleus or the thalamus. The stimulator delivers electric pulses of a specific frequency, amplitude, and pulse width to the target via the electrode, which results in symptom improvement. In this talk, I will review how we have been using recent developments in Gaussian processes to tackle inference problems that arise in the analysis of DBS. In particular, I will talk about generalized Wishart processes for diffusion tensor estimation, and latent force models for simulating electrical potential fields.
- Reinforcement learning. Resources include: Sutton and Barto (2012) Reinforcement Learning: An Introduction chapter 1.
- Dirichlet processes. Resources include: Murphy (2012) Machine Learning: A Probabilistic Perspective chapter 25.2 (available from the university library, or see Jim Skinner for other forms of access).
- Recurrent neural networks. Resources include: Goodfellow et al., Deep Learning (unfinished), chapter 10, and Karpathy's blogpost on the topic.
- Gaussian mixture models. Resources include: Bishop (2006) Pattern Recognition and Machine Learning chapter 9.2 (available in both the Complexity Centre and university libraries).
- Multiple testing.
- Variational inference. Resources include: Bishop (2006) Pattern Recognition and Machine Learning chapter 10.1 (available in both the Complexity Centre and university libraries), and Murphy (2012) Machine Learning: A Probabilistic Perspective chapters 21.1, 21.2, and 21.5 (available from the university library, or see Jim Skinner for other forms of access).
- Previous topics and papers read during 2014/15.
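Many of the topics above reduce to a few lines of linear algebra. As one example, the Gaussian process regression derivations discussed in the list lead to the standard posterior mean and variance equations, sketched here with an assumed squared-exponential kernel and illustrative hyperparameters and data:

```python
import numpy as np

def rbf(a, b, ell=1.0, sf=1.0):
    """Squared-exponential kernel k(a, b) = sf^2 exp(-(a-b)^2 / (2 ell^2))."""
    d = a[:, None] - b[None, :]
    return sf**2 * np.exp(-0.5 * (d / ell) ** 2)

def gp_posterior(x_train, y_train, x_test, noise=0.1):
    """Posterior mean and variance of GP regression with Gaussian noise."""
    K = rbf(x_train, x_train) + noise**2 * np.eye(len(x_train))
    Ks = rbf(x_train, x_test)
    Kss = rbf(x_test, x_test)
    # Cholesky factorisation: more stable than inverting K directly.
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(Kss) - np.sum(v**2, axis=0)
    return mean, var

# Fit noisy samples of sin(x) and predict at a held-out input.
x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y = np.sin(x)
xs = np.array([0.5])
mu, var = gp_posterior(x, y, xs)
```

With a lengthscale of 1 and training points one unit apart, the posterior mean at x = 0.5 should land close to sin(0.5), with a small but nonzero posterior variance.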
Dropout neural network gif by Michael Pearce.
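The dropout mechanism animated in the gif can be sketched as a single forward pass. This uses the common "inverted dropout" formulation (scaling at training time rather than test time); the array shapes and drop rate are illustrative:

```python
import numpy as np

def dropout_forward(x, p_drop=0.5, train=True, rng=None):
    """Inverted dropout: zero each unit with probability p_drop during
    training, scaling survivors by 1/(1-p_drop) so that the expected
    activation matches test time (when the layer is the identity)."""
    if not train or p_drop == 0.0:
        return x
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p_drop   # True = unit survives
    return x * mask / (1.0 - p_drop)

rng = np.random.default_rng(0)
h = np.ones((1000, 100))                   # a batch of hidden activations
out = dropout_forward(h, p_drop=0.5, rng=rng)
```

At p_drop = 0.5 each surviving unit is doubled, so the mean activation over a large batch stays close to the input mean, while roughly half the units are zeroed on any given pass.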
