The course



This MSc programme is for students with a degree in mathematics, engineering, physics or equivalent who want to learn about the statistical foundations and computational techniques in data science. The programme will provide students with a unique set of skills to address data science challenges, for applications ranging from imaging and vision, to ecology and climate, or stochastic modelling. This is a joint MSc programme between the Schools of Mathematical and Computer Sciences (MACS), and Engineering and Physical Sciences (EPS). The two Schools have a multidisciplinary staff team with strong expertise in the field.

Why study the MSc Computational Data Science?

  • Data scientists, data engineers and business analysts are among the most sought-after careers, and Edinburgh in particular aims to become the ‘Data Capital of Europe’
  • Students will acquire unique interdisciplinary knowledge at the interface between the statistical foundations of data science and associated computational techniques
  • Students will be exposed to specialised applications, ranging from imaging and vision, to ecology and climate, or stochastic modelling

September 2022 entry

The deadline for applications from Scottish and RUK students is 31 August 2022. For EU and overseas applicants, we guarantee to consider applications submitted by 8 August 2022.

Course content

Core courses

All students are required to take three mandatory courses in each of Semesters 1 and 2, designed to equip students with the foundational tools of data science, a first clear opening to applications, and the fundamentals of critical analysis and research preparation.

Semester 1

Statistical Models
Course aims

In this module, students will:

  • Develop an understanding of the different methodologies of statistical inference
  • Develop skills in practical, computer-based estimation and inference
  • Develop report writing and presentation skills
  • Develop independent research skills

Itemised list of subjects
  • Inference and decision making
  • Parameter estimation
  • Likelihood
  • Bayesian estimation and credibility theory
  • Hypothesis testing
  • Project preparation
  • Applied statistical project
Statistical Machine Learning 
Course Aims
In this course, students will develop:
  • An understanding of the fundamental concepts and techniques used in data mining and machine learning.
  • An understanding of the mathematics underpinning data mining and machine learning.
  • A critical awareness of the appropriateness of different data mining and machine learning techniques and the relationships between them.
  • Familiarity with common applications of data mining and machine learning techniques.


  • Basic Concepts: classification, clustering, supervised and unsupervised learning.
  • Generative Models: probabilistic graphical models; cluster analysis (including k-means clustering, expectation maximisation and mixture models); regression analysis.
  • Discriminative Learning: Instance-based learning and decision tree learning; artificial neural networks (perceptron, multilayer perceptron, back-propagation, deep learning architectures); maximum entropy models; support vector machines; ensemble methods (such as bagging and boosting).
Optimisation and Deep Learning for Imaging and Vision 1
Course Aims
What is the most efficient way to sense, or sample signals that we want to observe? Once data have been acquired, how do we retrieve the sought signal from these acquired data? Such "inference" or "inverse" problems are core to Data Science, particularly when the size of the signal is large and the "inference algorithms" need to be "scalable". This module approaches these questions both from a theoretical perspective (underpinned by the theories of compressive sensing, convex optimisation, deep learning) and in the context of "computational imaging" applications in a variety of domains ranging from astronomy to medicine.
The first part of the course will be taught. We will review the basic notion of Nyquist sampling of signals and rapidly dive into "computational imaging". In this context, mathematical algorithms need to be designed to solve an "inference" or "inverse" problem for image recovery from incomplete data. The size of the variables of interest in modern imaging applications (e.g. in astronomy or medicine) can be very large. In our journey, and concentrating on the data processing (rather than hardware or application) aspects, we will learn the basics of the theories of compressive sensing (which tells us how to design intelligent data acquisition schemes for sub-Nyquist sampling) and convex optimisation (which provides a wealth of algorithms capable of solving inverse problems, and scalable to high-dimensional problems).
Subjects covered (1)
  • Compressive sensing: motivation for compressive sensing; concepts of sparsity, incoherence, and randomness; theorems for l1 recovery.
  • Convex optimisation: convex problem and optimality conditions; proximal operators; Forward backward algorithm; Alternating Direction Method of Multipliers algorithm.
  • Applications to computational imaging
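As a flavour of the algorithms covered in this part, the forward-backward iteration named above can be sketched on a toy sparse-recovery problem with an l1 regulariser (whose proximal operator is soft-thresholding). This is a minimal illustration only; the function names, problem sizes and regularisation weight are assumptions, not course material.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t*||.||_1 (component-wise soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def forward_backward(A, y, lam, n_iter=500):
    """Minimise 0.5*||Ax - y||^2 + lam*||x||_1 by the forward-backward algorithm."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2           # 1 / Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)                     # forward (gradient) step
        x = soft_threshold(x - step * grad, step * lam)  # backward (proximal) step
    return x

# Toy compressive sensing: recover a 3-sparse signal from 50 random measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100)) / np.sqrt(50)
x_true = np.zeros(100)
x_true[[5, 37, 80]] = [2.0, -1.5, 3.0]
y = A @ x_true
x_hat = forward_backward(A, y, lam=0.01)
```

Note that the sub-Nyquist regime (50 measurements for a 100-dimensional signal) is exactly where the sparsity and incoherence conditions of compressive sensing come into play.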

The second part of the course will take the form of a project which will enable us to explore how machine learning algorithms (more specifically deep neural networks) can provide an alternative framework to solve inverse problems (in particular for imaging applications), or otherwise integrate and enhance optimisation algorithms.


Subjects covered (2)
  • Deep learning: deep neural networks (in particular convolutional neural networks), network architecture, training & testing, networks for end-to-end inference, networks in optimisation algorithms.
  • Applications to computational imaging.

Semester 2

Optimisation and Deep Learning for Imaging and Vision 2
Course aims
This course builds on the module “Optimisation and Deep Learning for Imaging and Vision 1” (B31XO), which it takes as a pre-requisite, to further study "inference" or "inverse" problems core to Data Science, particularly when the dimension of the variable of interest is large and the "inference algorithms" need to be "scalable". This module will introduce advanced computational methods at the interface of optimisation theory, deep learning, and Bayesian inference. These methods will be studied in theory, and compared and illustrated in a variety of Data Science applications such as imaging, computer vision, and machine learning.
The taught part of the course is divided into four chapters:
Chapter 1 - Introduction: This chapter is a summary of B31XO. We will then emphasise the challenges of data science applications to motivate the need for more advanced (proximal) optimisation methods.
Chapter 2 - Advanced optimisation methods: The basic algorithms introduced in B31XO can be used as a foundation to develop more advanced proximal methods for high-dimensional problems. In this chapter, we will discuss two main techniques: acceleration via inertia, and scalability using duality.
Chapter 3 - Stochastic optimisation methods: This chapter introduces different stochastic algorithms, necessary to handle huge datasets.
Chapter 4 - Bayesian inference: In this chapter we first provide basics in Bayesian theory (defining Bayesian estimators and uncertainty). We then introduce an optimisation method to quantify uncertainty in the Maximum A Posteriori estimate. We finally discuss sampling techniques leveraging proximal operators for scalable Bayesian inference.
There will be three practical projects. This will give us the opportunity to implement some of the methods introduced in the taught part, with application to computational imaging and computer vision.
Project 1: Accelerated inertial forward-backward algorithm for image reconstruction
Project 2: Primal-dual forward-backward algorithm for computer vision
Project 3: Stochastic forward-backward algorithm for machine learning
Critical Analysis and Research Preparation
Course Aims
This course aims to prepare students to carry out an extended SCQF Level 11 research or development project in a science or engineering programme by developing their skills in critical thinking, research planning and management, academic writing, experimental design and data handling.
  • Planning a research / development project: Defining measurable and realistic aims; planning the structure of your project; time-management; defining a reporting / review schedule; milestone setting; writing a Gantt chart
  • Getting the most from your supervisor: Preparing for review meetings; communication skills; reviewing and evaluating your progress and results; keeping a record of your progress
  • Background research: Carrying out a literature review; electronic bibliographic databases; online journals; using the library; how to reference other work
  • Academic writing: Writing techniques; plagiarism and how to avoid it; structuring a dissertation; reviewing your own writing
  • Data analysis and presentation: Statistical techniques including regression and error analysis
Bayesian Inference and Computational Methods
Course Aims
To provide students with knowledge of modern Bayesian Statistical inference, an understanding of the theory and application of computational methods in statistics and stochastic simulation methods including MCMC, and experience of implementing the Bayesian approach in practical situations.
  • Statistical programming. This will include an introduction to the use of R (and/or, potentially, other languages and packages) for probabilistic and statistical calculations, including the use of built-in simulation capabilities, iterative procedures, solution of equations and maximisation of functions.
  • Philosophy of Bayesian inference. This will include treatment of subjective and frequentist probability; the role of likelihood as a basis for inference; comparative treatment of Bayesian and frequentist approaches.
  • Implementing the Bayesian approach. This will include: the formulation of likelihood for a range of statistical models and sampling designs; the incorporation of prior knowledge through prior density selection; conjugacy; the use of non-informative and non-subjective priors (including the Jeffreys prior); the interpretation of the posterior distribution as the totality of knowledge; predictive distributions.
  • Theory of stochastic processes. Markov chains, classification of states, irreducibility, aperiodicity etc., stationary distributions, generalised and detailed balance, convergence.
  • Markov-chain and other stochastic methods for investigating target distributions. Ideas covered will include: simple simulation methods using transformations, distribution function inversion and acceptance-rejection sampling; construction of MCMC methods using standard recipes - Metropolis (and Metropolis-Hastings) algorithm, Gibbs sampler, sequential Monte Carlo, approximate Bayesian computation, implementation of most methods using the R computing package; investigation of properties through simulation.
  • Application of MCMC methods in Bayesian inference. Ideas covered will include: formulation of samplers for inferential problems in e.g. pattern recognition, signal classification, population dynamics; implementation of methods using R; application to problems involving missing data; informative methods of summarising posterior densities.
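The construction of an MCMC sampler from a "standard recipe" can be sketched in a few lines. Below is a minimal random-walk Metropolis sampler for a one-dimensional target; the course implements such methods in R, so this Python sketch, its target density and its tuning are illustrative assumptions only.

```python
import numpy as np

def metropolis_hastings(log_target, x0, n_samples, proposal_sd=1.0, seed=0):
    """Random-walk Metropolis sampler for a 1-D target given by its log-density
    (known only up to an additive constant, as is typical for posteriors)."""
    rng = np.random.default_rng(seed)
    samples = np.empty(n_samples)
    x = x0
    log_p = log_target(x)
    for i in range(n_samples):
        x_prop = x + proposal_sd * rng.standard_normal()   # symmetric proposal
        log_p_prop = log_target(x_prop)
        if np.log(rng.uniform()) < log_p_prop - log_p:     # accept/reject step
            x, log_p = x_prop, log_p_prop
        samples[i] = x                                     # rejected moves repeat x
    return samples

# Illustrative target: a N(2, 1) "posterior", log-density up to a constant.
samples = metropolis_hastings(lambda x: -0.5 * (x - 2.0) ** 2, x0=0.0, n_samples=20000)
burned = samples[5000:]    # discard burn-in before summarising the posterior
```

The same skeleton extends to Metropolis-Hastings by adding the proposal-density ratio to the acceptance probability.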

Optional courses

Students will be requested to choose one additional course per semester. These are grouped into three pairs, providing three clear application pathways: imaging and vision, ecology and climate, and stochastic modelling. However, students are allowed to choose courses from different pathways, as long as pre-requisites are met.

Pathway: Imaging and Vision 

Semester 1 - Foundations of Learning and Computer Vision
Course aims
This course introduces solid foundations in learning theory, which is essential to understand how modern and classical machine learning algorithms work, and explores applications in computer vision, one of the fields that has been most impacted by recent advances in machine learning.
Students will learn different frameworks for measuring the complexity and effectiveness of learning models. They will also become familiar with the basic elements of neural networks and acquire practical knowledge of training and deploying them in various computer vision problems. Students are expected to develop critical knowledge about the benefits and limitations of deep learning, focusing mostly on computer vision problems.
Part I: Learning
  1. Foundations of learning: learning frameworks (PAC, Rademacher) and illustration in classification problems
  2. Optimisation in machine learning: gradient descent for empirical risk minimisation, regularisation, margin, and stability
  3. Deep learning: basic layers, backpropagation, vanishing gradients, generalisation results
  4. Datasets and best practices
Part II: Computer Vision
  1. Physical models in vision: geometric transformations, camera models, optics
  2. Deep learning for vision: review of landmark networks for several problems in computer vision (detection, segmentation, depth estimation, among others)
Semester 2 - Graph Methods for Imaging, Vision and Computing
Course aims
This course aims to introduce models, algorithms and applications of graph-based processing. In particular, students will learn how to represent data as graphs and perform estimation and classification tasks using graph theory and probabilistic models.
The course will consist of three taught parts. The first part will introduce graphs, spectral graph theory and applications to image restoration and segmentation. The second part will introduce Bayesian graphical models and inference algorithms on probabilistic graphs. The third part will cover Bayesian networks and applications to imaging.
Itemised list of subjects
  • Graphs: definitions, properties and an introduction to spectral graph theory
  • Graph-based imaging methods: restoration and segmentation
  • Probability theory: random variables, joint and conditional distributions, structured probabilistic models
  • Exact inference and message passing algorithms on graphs: applications to image restoration and segmentation
  • Bayesian (neural) networks: principles
  • Variational auto-encoder for imaging applications
  • Bayesian deep neural networks

Pathway: Ecology and Climate

Semester 1 - Mathematical Ecology
Course Aims
The module aims to provide postgraduate students with advanced knowledge and understanding of the mathematical modelling methods that describe population dynamics, epidemiological processes and evolutionary processes in ecological systems. It will provide training in a wide variety of mathematical techniques used to describe ecological systems, and instruction in the biological interpretation of mathematical results.
  • Single species population models: Continuous and discrete time model formulations and analysis; exponential growth, self limited growth, period-doubling bifurcations, chaos; graphical stability analysis and cobweb diagrams; harvesting problems, insect population dynamics, insect outbreak models.
  • Multi species population models: Continuous and discrete time model formulations and analysis; nondimensionalisation, linear stability analysis, phase plane methods; Models for interacting species, symbiotic, competitive, predator-prey and host-parasite ecological interactions; Age-structured models.
  • Mathematical models of ecological systems: Develop mathematical models from descriptive information of ecological systems; model analysis and biological interpretation of results.
  • Epidemiological models: Models of infectious disease; threshold conditions for epidemic outbreaks, the basic reproductive rate of a disease; vaccination strategies to control infection, pulse vaccination strategies.
  • Evolution and evolutionary game theory: Modelling the evolution of life history parameters; the evolution of reproduction and carrying capacity, the evolution of infection, trade-offs between parameters; Game theoretical approaches to evolution; 2-strategy games (Hawk-Dove).
  • Additional course material: Additional topics on Mathematical Ecology
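The discrete-time single-species behaviour listed above (self-limited growth giving way to period-doubling) can be seen by iterating the discrete logistic model x_{n+1} = r x_n (1 - x_n). The parameter values below are illustrative choices, not course material.

```python
def logistic_map_orbit(r, x0=0.2, n_transient=500, n_keep=8):
    """Iterate x_{n+1} = r*x_n*(1-x_n), discard transients, return the attractor
    (rounded so that a converged orbit collapses to its distinct values)."""
    x = x0
    for _ in range(n_transient):
        x = r * x * (1 - x)
    orbit = []
    for _ in range(n_keep):
        x = r * x * (1 - x)
        orbit.append(round(x, 6))
    return orbit

# r = 2.8: self-limited growth settles at the fixed point 1 - 1/r.
fixed = logistic_map_orbit(2.8)
# r = 3.2: past the first period-doubling bifurcation, a period-2 cycle.
cycle = logistic_map_orbit(3.2)
```

Pushing r towards 4 produces the period-doubling cascade into chaos that the cobweb diagrams in the course visualise.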
Semester 2 - Data Assimilation 
Course Aims

The aims of this course are to develop techniques of data assimilation in numerical weather prediction and climate change modelling. This will be achieved by a mixture of lectures on basic methodology, tutorial exercises, computer labs, case studies and a large group-based modelling, implementation and simulation project. We will introduce a number of data assimilation approaches that are widely used in applications to numerical weather prediction and climate change modelling, including basic regression analysis, variational approaches, Kalman filtering, extended and ensemble Kalman filtering and the Bayesian inference approach. The course will teach practical implementation of these data assimilation techniques in the context of computer simulations, which will be illustrated by prototype applications. These methodologies will form the basis for a series of modelling case studies as well as the group-based project component of the course.

  • Background/Review: Data sets, statistics, re-analysis; polynomial interpolation (Legendre, Chebyshev, etc.) and errors; Fourier Transform and Fast Fourier Transform (theory and practical implementation).
  • Variational Approach: Cost functions; regression analysis review (least squares, linear and nonlinear models); Optimal Interpolation, 3DVar and 4DVar approaches, applications to Numerical Weather Prediction.
  • Kalman Filtering: Basic ideas (filtering/smoothing) of Kalman Filter; Extended Kalman Filter (nonlinear); Ensemble Kalman Filter (sequential MC methods), numerical implementation, Case study on the ensemble Kalman filter applied to the Lorenz system.
  • Bayesian Inference Approach: Basic ideas, Bayes' Theorem, prior and posterior distributions (selection and interpretation); implementation: acceptance-rejection sampling, MCMC approach to target distributions (Metropolis-Hastings); examples from diffusion problems, the wave equation, fluid mechanics, geophysics and molecular dynamics. Discussion of general Inverse Problems, Uncertainty Quantification and Extreme Events.
  • Modelling, Data Assimilation and Simulation Project: Group-based work on an extended project related to biology, climate change or finance, including the modelling and subsequent direct numerical implementation of one or more of the data assimilation approaches above. The project includes a background literature search, development of the underlying model, assessment of the data and appropriate data assimilation techniques that can be applied, hands-on simulation of one or more of these techniques, a group-based presentation and a written report.
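The predict-update cycle at the heart of Kalman filtering can be sketched in its simplest scalar form: a nearly constant state observed with noise. The noise variances and synthetic data below are illustrative assumptions, a long way from the ensemble methods applied to the Lorenz system in the course.

```python
import numpy as np

def kalman_1d(observations, x0=0.0, p0=10.0, process_var=1e-4, obs_var=0.25):
    """Scalar Kalman filter: predict-then-update for a nearly constant state."""
    x, p = x0, p0
    estimates = []
    for y in observations:
        p = p + process_var            # predict: state unchanged, variance grows
        k = p / (p + obs_var)          # Kalman gain weighs prior vs observation
        x = x + k * (y - x)            # update with the innovation y - x
        p = (1 - k) * p
        estimates.append(x)
    return np.array(estimates)

# Synthetic data: a constant truth observed with Gaussian noise of sd 0.5.
rng = np.random.default_rng(1)
truth = 5.0
obs = truth + 0.5 * rng.standard_normal(200)
est = kalman_1d(obs)
```

The extended and ensemble variants covered in the course replace the exact linear-Gaussian update with linearisation or a Monte Carlo ensemble, but keep this same predict-update structure.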

Pathway: Stochastic Modelling

Semester 1 - Probabilistic Methods
Course aims 

To introduce fundamental stochastic processes which are useful in stochastic modelling and data science.

  • Random walks and Large Deviations

    Definition of a random walk
    Introduction to large deviations theory
    Introduction to rare event simulation

  • Conditional Expectation
  • Markov chains

    Sequences of random variables and the Markov property
    Using the Markov property
    Absorbing Markov chains with finite state space
    First step (backwards) equations

    Basic examples

    Stationarity problem for finite state space chains
    Convergence to stationarity
    Markov chains with infinite but countable state space

  • Simple point processes, Poisson and compound Poisson processes
  • Continuous-time Markov processes
  • Renewal theory

    Elementary renewal theory
    Properties of the renewal function
    Discrete renewal theory

  • Martingales
Semester 2 - Stochastic Networks 
Course Aims

To introduce stochastic processes used in stochastic and statistical modelling, and to provide an introduction to modern mathematical tools for studying such processes.

  • Branching process

    Survival vs extinction
    Moments for the number of individuals in a generation
    Limiting results

  • Branching random walks
  • Approximations for sums of random variables
  • Random Graph models

    Introduction and basic definitions of graphs and associated theory
    Definition of random graph models
    Basic properties of random graph models, including the Erdős–Rényi random graph, preferential attachment, and the configuration model

  • Percolation; epidemic and data spread over a graph
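The percolation behaviour underlying epidemic and data spread can be observed with a short simulation of the Erdős–Rényi model: below mean degree 1 components stay tiny, while above it a giant component emerges. The graph sizes and union-find implementation below are illustrative, not course code.

```python
import numpy as np
from collections import Counter

def largest_component_fraction(n, p, seed=0):
    """Sample an Erdős–Rényi graph G(n, p) and return the size of its largest
    connected component as a fraction of n, using union-find."""
    rng = np.random.default_rng(seed)
    parent = list(range(n))

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]   # path halving
            a = parent[a]
        return a

    iu, ju = np.triu_indices(n, k=1)        # all candidate edges i < j
    mask = rng.uniform(size=iu.size) < p    # each edge present independently
    for i, j in zip(iu[mask], ju[mask]):
        ri, rj = find(int(i)), find(int(j))
        if ri != rj:
            parent[ri] = rj                 # merge the two components
    sizes = Counter(find(v) for v in range(n))
    return max(sizes.values()) / n

# Threshold at mean degree 1, i.e. p = 1/n: subcritical vs supercritical.
sub = largest_component_fraction(2000, 0.5 / 2000)   # mean degree 0.5
sup = largest_component_fraction(2000, 2.0 / 2000)   # mean degree 2
```

The sharp transition between the two regimes is the same phenomenon exploited when modelling whether an epidemic dies out or spreads through a network.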

Dissertation (Semester 3)

Dissertation in Computational Data Science


Course aims

The dissertation is based on individual student projects in Semester 3. The projects will be university-based. Students will choose a project with a primary supervisor either in the School of MACS or the School of EPS. The secondary supervisor will either fully engage in the project, or act as a second reader of the dissertation. Collaboration with industry is possible and even encouraged, and is the responsibility of the primary supervisor. Industry is not formally involved in the supervision.


The project should be closely linked with research areas underpinning the statistical and computational aspects of data science, using optimisation, machine learning, and/or statistical inference theories. The project should be linked to the domains of imaging and vision, ecology and climate, and/or stochastic modelling. The project should have a significant computational component and make use of the computing facilities provided by the university.

Entry requirements

You will need a first or upper second-class honours degree (or its overseas equivalent) that has provided a sound grounding in programming and mathematics. Suitable candidates are likely to have studied a first degree in mathematics, statistics, physics, engineering or computer science. Lesser qualifications combined with relevant work experience may also be suitable.

English language requirements

Important: If your first language is not English, or your first degree was not taught in English, we’ll need to see evidence of your English language ability.

The minimum English language requirement is IELTS 6.5; we also accept TOEFL (scores of 79 and higher).

We also offer a range of English language courses to help you meet the English language requirement prior to starting your master’s programme:

  • 20 weeks English (for IELTS of 5.0 with no skill lower than 4.5)
  • 14 weeks English (for IELTS of 5.0 with minimum of 5.0 in writing and no skill lower than 4.5)
  • 10 weeks English (for IELTS of 5.5 with no skill lower than 5.0)
  • 6 weeks English (for IELTS 5.5 with no skill lower than 5.5)


Tuition fees for entry

Status     Full-time   Part-time
UK         £9,000      £4,500
Overseas   £20,808     £10,404


  1. Your residency 'status' is usually defined as the country where you have been ordinarily resident for the three years before the start of your course. Find out more about tuition fees.
  2. Overseas includes applications from European Union countries who do not hold Pre-Settled or Settled status in the UK. Read more about the application process for EU nationals.

Additional fees information

Tuition fees for Sept 2022 entry
Status     Full-time   Part-time
UK         tbc         tbc
Overseas   tbc         tbc

Scholarships and bursaries

We aim to encourage well-qualified, ambitious students to study with us and we offer a wide variety of scholarships and bursaries to achieve this. Over £6 million worth of opportunities are available in fee and stipend scholarships, and more than 400 students benefit from this support.

View our full range of postgraduate scholarships.