Algebraic Graph Theory (S1, Dr Andriantiana):
This course introduces graph theory by studying basic properties of graphs, covering the notions of homomorphism, transitivity, and spectra of graphs. The main part of the course consists of studying adjacency matrices of graphs. This involves looking at the characteristic polynomial and spectrum of the adjacency matrix and using them for combinatorial purposes, to extract information about the graph. Laplacian matrices of graphs will be considered as well; as for adjacency matrices, relations between their characteristic polynomials and spectra and properties of the graph will be analysed. The last part of the course is on applications of graph theory to chemistry, physics and computer science.
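As an illustrative sketch of the kind of combinatorial information the spectrum carries (a small NumPy example; the graph and variable names are chosen purely for illustration), the number of closed walks of length k in a graph equals the trace of the k-th power of its adjacency matrix, which is also the sum of the k-th powers of its eigenvalues:

```python
import numpy as np

# Adjacency matrix of the 4-cycle C4 (vertices 0-1-2-3-0).
A = np.array([
    [0, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 0, 1, 0],
])

# The spectrum of C4: eigenvalues 2, 0, 0, -2.
eigenvalues = np.sort(np.linalg.eigvalsh(A))[::-1]

# Combinatorial use: trace(A^k) counts closed walks of length k.
# For k = 2 this is the sum of the vertex degrees, i.e. 2|E| = 8.
closed_walks_len_2 = np.trace(np.linalg.matrix_power(A, 2))
```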
Measure theory (S1, Dr Pinchuck):
We begin by studying Riemann integration and some of its shortcomings. In order to overcome many of these difficulties, we introduce the concept of a measure space. In particular, we focus on the Lebesgue measure on Euclidean space, which assigns to suitable subsets the conventional length, area and volume of Euclidean geometry. This leads to an improved theory of integration of real-valued functions. We will cover the important theorems of measure theory, and emphasis will be put on solving various problems involving measure and integration.
Pre-requisite: 3rd year real analysis or equivalent.
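A standard example of the shortcomings mentioned above is the Dirichlet function, the indicator of the rationals on [0,1], which is not Riemann integrable but has a well-defined Lebesgue integral:

\[
  \mathbf{1}_{\mathbb{Q}}(x) =
  \begin{cases}
    1, & x \in \mathbb{Q},\\
    0, & x \notin \mathbb{Q},
  \end{cases}
  \qquad
  \int_{[0,1]} \mathbf{1}_{\mathbb{Q}} \, d\lambda = 0,
\]

since \(\mathbb{Q} \cap [0,1]\) is countable and therefore has Lebesgue measure zero, whereas every Riemann sum for \(\mathbf{1}_{\mathbb{Q}}\) can be made equal to either 0 or 1 by the choice of sample points.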
Numerical Modelling (S1, Prof. Pollney):
Partial differential equations arise in a number of applications modelling
real-world phenomena. However, nonlinear equations do not lend themselves
easily to analytic solution so that computer methods are necessary to model
their behaviour. This course develops advanced numerical methods for
solving evolution problems, in particular hyperbolic and parabolic PDEs. We
introduce discrete approximations to continuous equations, study potential
sources of error, and develop advanced techniques from the method of
characteristics in order to approximate generic phenomena such as shocks.
Pre-requisite: MAP311 Numerical Analysis, MAP314 Partial Differential Equations.
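A minimal sketch of the kind of discrete approximation the course develops (the scheme, grid parameters and variable names here are illustrative choices, not the course's prescribed method): the first-order upwind method for the linear advection equation u_t + a u_x = 0, a simple hyperbolic PDE, on a periodic domain.

```python
import numpy as np

# Linear advection u_t + a u_x = 0 on [0,1) with periodic boundaries,
# discretised with the first-order upwind scheme (valid for a > 0).
a, nx, cfl = 1.0, 100, 0.5
dx = 1.0 / nx
dt = cfl * dx / a
x = np.arange(nx) * dx

u0 = np.exp(-100 * (x - 0.5) ** 2)   # initial Gaussian pulse
u = u0.copy()

# After nx / cfl steps the exact solution has travelled once around
# the domain and returned to its initial position.
for _ in range(int(round(nx / cfl))):
    u = u - a * dt / dx * (u - np.roll(u, 1))

# The scheme is stable for cfl <= 1 but numerically diffusive: the
# pulse comes back in the right place, with reduced amplitude.
print(f"peak after one period: {u.max():.3f}")
```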
Neural Networks (S1, Prof. Burton):
An artificial neural network (ANN) is a parallel computational system which consists of layers of neurons (which are just functions, called transfer functions) and connections between the layers (the connections are achieved by means of matrices of adjustable parameters). The parameters are adjusted in a way which roughly resembles the way in which neurotransmitters in the brain are adjusted during learning.
The study of ANNs is heavily dependent on linear algebra and multivariate calculus and some basic concepts from statistics. The course will develop as follows.
• Regression and beyond
• Perceptrons: These simple ANNs were introduced in the 1950s and were the first real attempt to get a computer to classify patterns. However, they were unable to classify certain patterns (famously, patterns that are not linearly separable) and the initial excitement died down. Although they are no longer used, they are worth studying as an introduction to neural computing.
• Multi–layer, feed–forward networks with differentiable transfer functions: The machine learning pioneer Paul Werbos used a crucial technique, called backpropagation, to adjust the parameters in a neural network. This put ANNs back on the map and the study of artificial intelligence took off. We will study this in depth and construct ANNs from scratch using the MATLAB computational environment. With this experience behind us we will explore the MATLAB Deep Learning toolbox and use this to solve complex pattern recognition and prediction problems.
• Radial basis function networks (RBF): These ANNs have a very different architecture from feed–forward networks but have been shown to be equivalent to them in approximation power. We will construct these RBF networks to solve complex problems.
• Dynamic Networks: These networks are deployed for time–series problems. We will develop our own methods, as well as the MATLAB toolbox, to construct and deploy them to solve time–series problems.
• Competitive Learning: The neurons in a competitive layer compete, in a sense which will be made precise, for the input data points. An updating procedure clusters the input data points such that input points in a cluster are similar to each other and dissimilar from data points in other clusters. In this way, the set of input data points are classified. This is an example of unsupervised learning. We will do this from first principles and also with the MATLAB Deep Learning toolbox.
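To give a flavour of the first-principles constructions described above, here is a sketch of the classical perceptron learning rule on the (linearly separable) AND function. The course itself works in MATLAB; this NumPy version, with illustrative parameter choices, shows the same idea of a hard-threshold transfer function whose weights and bias are nudged after each mistake.

```python
import numpy as np

# Perceptron learning rule on the linearly separable AND function.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

w = np.zeros(2)   # adjustable weights
b = 0.0           # adjustable bias
lr = 0.1          # learning rate

for _ in range(20):                         # training epochs
    for xi, target in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0   # hard-threshold transfer function
        w += lr * (target - pred) * xi      # adjust the parameters
        b += lr * (target - pred)           # only on mistakes

predictions = [1 if xi @ w + b > 0 else 0 for xi in X]
print(predictions)   # [0, 0, 0, 1]
```

For separable data the perceptron convergence theorem guarantees that this loop stops making mistakes after finitely many updates; for non-separable patterns such as XOR it never settles, which is exactly the limitation noted under Perceptrons above.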
Machine Learning (S2, Dr Atemkeng):
This course will cover dimensionality reduction (SVD, PCA, autoencoders), clustering algorithms (K-means, hierarchical clustering, Gaussian mixture models), ensemble methods (supervised decision trees and random forests, unsupervised random forests), outlier detection, cross-validation, and an introduction to artificial neural networks. Each aspect of the course will be linked to practical and theory assessments.
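As a small taste of the dimensionality-reduction topic (a NumPy sketch with synthetic data; the data set and variable names are invented for illustration), PCA can be computed from the SVD of the centred data matrix: the right singular vectors give the principal axes and the singular values measure the variance each axis captures.

```python
import numpy as np

# PCA via the SVD: 200 points in 3-D that lie near a 2-D plane.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 2))
mixing = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
X = latent @ mixing.T + 0.01 * rng.normal(size=(200, 3))

# Centre the data, then take the SVD.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

explained = s**2 / np.sum(s**2)   # fraction of variance per component
X2 = Xc @ Vt[:2].T                # projection onto the top 2 components
```

Because the data were built to be nearly planar, the first two components capture almost all of the variance, and the 3-D cloud reduces to 2-D coordinates with little loss.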
General Relativity (S2, Dr John):
Special relativity, Maxwell's electromagnetism in special relativity, basic differential geometry and tensor formalism, Riemannian geometry, geodesics and curvature, the Einstein field equations, the Schwarzschild black hole, light and particle dynamics on curved spacetimes, and (time permitting!) a selection from introductory cosmology or gravitational waves.
Pre-requisite: It is strongly recommended that students have a good background in Physics 3 or Applied Maths 3 before enrolling in this course.
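For orientation, the Schwarzschild black hole mentioned above is described (in Schwarzschild coordinates, outside the horizon) by the line element

\[
  ds^2 = -\left(1 - \frac{2GM}{c^2 r}\right) c^2\, dt^2
         + \left(1 - \frac{2GM}{c^2 r}\right)^{-1} dr^2
         + r^2 \left( d\theta^2 + \sin^2\theta \, d\varphi^2 \right),
\]

the unique spherically symmetric vacuum solution of the Einstein field equations, from which the course derives light bending and particle orbits.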
Matrix Groups (S2, Dr Remsing):
Groups of transformations; actions of groups on sets;
Euclidean spaces; matrix groups; the matrix exponential; Lie algebras;
OPTIONAL: (matrix) groups and geometry.
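A small numerical illustration of the matrix exponential listed above (a NumPy sketch; the truncated power series and all names are illustrative): exponentiating a skew-symmetric 2×2 matrix, an element of the Lie algebra so(2), produces a rotation matrix in the matrix group SO(2).

```python
import numpy as np

def matrix_exp(A, terms=30):
    """Truncated power series exp(A) = sum_k A^k / k!."""
    result = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k
        result = result + term
    return result

theta = np.pi / 3
A = np.array([[0.0, -theta],
              [theta, 0.0]])   # skew-symmetric: an element of so(2)

R = matrix_exp(A)              # exp(A) is the rotation by angle theta

expected = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
```

The result R is orthogonal with determinant 1, i.e. it lies in SO(2), illustrating how the exponential map carries the Lie algebra into the matrix group.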
Lie Groups, Analytical Mechanics and Field Theory (S2, Dr Stevens):
Tensor algebra, Tensor calculus, One-parameter groups and Lie derivatives, Lie groups, The groups SO(3) and SU(2), The groups SO(3,1) and SL(2,C), Spinors, Lagrangian and Hamiltonian Methods, Outline of relativistic physics, Lagrangian field theory.
Topology (S2, Dr Mclean):
Topology is the study of geometrical properties and spatial relations that are preserved by the
continuous change of shape or size of objects. This course will cover general topology (the
foundation for most other branches of topology). The topics covered will be set theory
(briefly), topological spaces, connectedness, compactness, and the countability and
separation axioms.
Enumerative Combinatorics (S2, Prof. Burton):
Dealing a hand of cards, throwing a pair of dice, selecting a handful of marbles from a bag of marbles of various colours, painting the vertices of a regular solid with a selection of colours are all processes, which we shall call experiments, with observable outcomes.
The objective of this course is to develop techniques for counting the number of ways in which outcomes with certain properties can occur. An outline of the course is as follows.
The course will develop problem-solving skills and the results that we prove will have a wide range of applications. Some typical problems:
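The experiments described above can already be counted with elementary tools; as a hedged sketch (the particular numbers are invented for illustration), Python's standard library does the bookkeeping:

```python
from math import comb

# Counting outcomes of the "experiments" described above.

# Number of distinct 5-card hands from a standard 52-card deck.
hands = comb(52, 5)            # 2 598 960

# Ordered outcomes when throwing a pair of dice.
dice_outcomes = 6 * 6          # 36

# Ways to choose 3 marbles from a bag of 4 red and 5 blue marbles,
# counted by colour composition (r red, 3 - r blue).
marble_selections = sum(comb(4, r) * comb(5, 3 - r) for r in range(4))
# equals comb(9, 3) = 84, an instance of the Vandermonde identity
```

The course develops systematic techniques (generating functions, inclusion-exclusion, and the like) that handle such counts when direct enumeration becomes impractical.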
Last Modified: Fri, 14 Feb 2020 12:34:00 SAST