COMS30035 - Machine Learning

Unit Information

This unit seeks to acquaint students with machine learning algorithms that are important in many modern data science and computer science applications. We cover topics such as kernel machines, probabilistic inference, neural networks, PCA/ICA, HMMs and ensemble models.


Rui Ponte Costa (RPC), Unit Director
James Cussens (JC)
Edwin Simpson (ES)
Note: For any questions please ask in the appropriate Teams channels or directly to us using Teams chat.

Teaching Assistants

Tashi Namgyal, Amarpal Sahota, Saptarshi Sinha, Benjamin Arana-Sanchez, Dabal Pedamonti, Stefan Radic Webster, Will Greedy

Unit Materials

| Weeks | First lecture | Second lecture | Labs [Thu 9am-12pm] | Live lecture [Q&A class; Tues 1pm-2pm] |
| --- | --- | --- | --- | --- |
| 1 | Introduction | Machine learning concepts | L1: Revision of Jupyter Notebook and ML libraries [answers] | General questions about the unit |
| 2 | Revisiting regression [stream], Bayesian regression [stream], Classification and neural networks [stream] | Kernel machines | L2: Regression, nnets and SVMs [answers] | ML concepts, regression, classification and nnets |
| 3 | Introduction to graphical models | Bayesian ML using graphical models | L3: Probabilistic graphical models [answers] | Kernel machines and probabilistic graphical models |
| 4 | k-means and mixtures of Gaussians | The EM algorithm [stream] | L4: k-means and EM [answers] | k-means and the EM algorithm |
| 5 | PCA | Kernel PCA and ICA | L5: PCA and ICA [answers] | PCA and ICA |
| 6 | Reading week | | | |
| 7 | Sequential data | Sequential data | L6: Hidden Markov Models [answers] | Modelling sequential data |
| 8 | Selection and Combination | Trees, Mixtures and Crowds | L7: Trees and Ensemble methods [answers] | Combining models using ensembles and probabilistic methods |
| 9-11 | Coursework weeks | | | |
| 12 | Review week | | | |

Assessment Details

The assessment for this unit is one of the following options:

  1. 100% Coursework assessment [W9-W11].
  2. 100% Exam assessment

Lab Work

The labs are formative assessments which we strongly encourage you to complete; they are designed to support your understanding of the ML methods covered in lectures.

Installation Instructions:

Jupyter Notebook - For all COMS30035 needs you are encouraged to install Anaconda (Python 3.7), as it bundles all of the course's requirements. Alternatively, for a manual installation you will require Python 3.7.x with 'Jupyter' and 'iPython', both in version 4.x.x. All the packages needed will be listed at the beginning of each lab sheet.
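If you want to confirm your environment is ready before a lab, a quick check like the following can help. This is only a sketch: the package names below (numpy, matplotlib, sklearn) are assumptions about commonly used lab packages, and the definitive list is always the one at the top of each lab sheet.

```python
# Sketch: verify that the packages a lab sheet lists can be imported.
# The default package names here are assumptions; substitute the list
# given at the top of your lab sheet.
import importlib
import sys


def check_packages(names):
    """Map each package name to its version string ('unknown' if the
    module has no __version__), or None if it cannot be imported."""
    found = {}
    for name in names:
        try:
            mod = importlib.import_module(name)
            found[name] = getattr(mod, "__version__", "unknown")
        except ImportError:
            found[name] = None
    return found


if __name__ == "__main__":
    print("Python", sys.version.split()[0])
    for name, version in check_packages(["numpy", "matplotlib", "sklearn"]).items():
        print(f"{name}: {version or 'MISSING - install before the lab'}")
```

Any package reported as missing can be installed with `conda install <name>` (or `pip install <name>` in a non-Anaconda setup).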

If you log in remotely to the university Linux machines, you should be able to run Jupyter Notebook with this command:
$ /opt/anaconda/3-2022/bin/jupyter notebook

For help on logging in remotely to Bristol machines see here.

Programming in a Browser
If you feel as awkward as I do about coding in a web browser, you can use Emacs to render Jupyter notebooks. This way you get the best of a full-featured editor while keeping the benefits of Jupyter. Have a look at this repository on how to make it work URL. Here is a video showing how it can be done URL.

Text books

  1. Bishop, C. M., Pattern Recognition and Machine Learning (2006). This is one of the best ML textbooks and will be our main textbook. The book is freely available here.
  2. Murphy, K., Machine Learning: A Probabilistic Perspective (2012). We will also use this book for parts of the topics covered. This is a more recent textbook and provides a particularly good coverage of probabilistic methods. The book is freely available here.


All technical resources will be posted on the COMS30035 Github organisation. If you find any issues, please kindly raise an issue in the respective repository.