Overview

This course presents an in-depth study of statistical machine learning approaches. It aims to provide the student with a solid understanding of methods for learning and inference in structured probabilistic models, with a healthy balance of theory and practice.

In this course, we will study a class of statistical inference models known as Probabilistic Graphical Models (PGMs). PGMs are a great example of how Computer Science and Statistics can work together: they use graph data structures to represent domains with large numbers of variables, together with specialised algorithms for efficient inference over these graphical models. PGMs have thereby pushed probability theory to the scale and speed necessary to provide automated reasoning in modern AI systems.

During this course, we will cover several graphical models, including Bayesian networks, Markov networks, Conditional Random Fields, Markov chains, Hidden Markov Models, Kalman filters and Markov decision processes, and will develop a clear understanding of how these models work as well as their main algorithms for inference and learning. We will also cover several algorithms used to learn parameters and make inferences, such as Markov Chain Monte Carlo (MCMC), Gibbs sampling, the Viterbi algorithm and the Baum-Welch algorithm, among others.
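To give a flavour of the kind of algorithm covered, the sketch below implements the Viterbi algorithm on a toy Hidden Markov Model. The states, observations and probability values are invented for illustration only and are not drawn from the course material.

```python
# Illustrative sketch only: Viterbi decoding for a toy two-state HMM.
# All states, observations and probabilities are made-up example values.

states = ["Rainy", "Sunny"]
start_p = {"Rainy": 0.6, "Sunny": 0.4}
trans_p = {
    "Rainy": {"Rainy": 0.7, "Sunny": 0.3},
    "Sunny": {"Rainy": 0.4, "Sunny": 0.6},
}
emit_p = {
    "Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
    "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1},
}

def viterbi(obs):
    """Return the most likely hidden-state path for the observation sequence."""
    # V[t][s] = probability of the best path ending in state s at time t
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]  # back-pointers used to recover the best path
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            prob, prev = max(
                (V[t - 1][p] * trans_p[p][s] * emit_p[s][obs[t]], p)
                for p in states
            )
            V[t][s] = prob
            back[t][s] = prev
    # Trace back from the most probable final state
    last = max(V[-1], key=V[-1].get)
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))

print(viterbi(["walk", "shop", "clean"]))  # -> ['Sunny', 'Rainy', 'Rainy']
```

The dynamic-programming table `V` keeps only the best path probability per state at each step, which is what makes exact decoding tractable for chain-structured models.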

Conditions for Enrolment

Prerequisite: MATH5836 or COMP9417

Delivery

Multimodal - Standard (usually weekly or fortnightly)
In-person

Course Outline

To access the course outline, please visit the link below (please note that access to UNSW Canberra course outlines requires a VPN):
