### Honours courses

-------------------------------------------------------------------------------------

Applied Mathematics Courses: Machine Learning; Neural Networks; Numerical Modelling; Spectral Collocation Method.

Mathematics Courses: Algebraic Graph Theory; Enumerative Combinatorics; Introduction to Category Theory; Matrix Groups; Measure Theory; Topology.

-------------------------------------------------------------------------------------

### Descriptions

Algebraic Graph Theory (S1, Dr Andriantiana):

This course introduces graph theory by studying basic properties of graphs, covering the notions of homomorphisms, transitivity, and graph spectra. The main part of the course consists of studying the adjacency matrices of graphs. This involves looking at the characteristic polynomial and spectrum of the adjacency matrix and using them for combinatorial purposes, to extract information about the graph. Laplacian matrices of graphs will be considered as well; as with adjacency matrices, the relations between their characteristic polynomials and spectra and the properties of the graph will be analysed. The last part of the course is on applications of graph theory to chemistry, physics and computer science.
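The use of the spectrum described above can be illustrated in a few lines (a minimal sketch using numpy; the example graph C4 and the variable names are chosen purely for illustration and are not from the course notes):

```python
import numpy as np

# Adjacency matrix of the 4-cycle C4 (an illustrative example graph).
A = np.array([
    [0, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 0, 1, 0],
])

# The spectrum of the graph: the eigenvalues of its adjacency matrix.
eigenvalues = np.linalg.eigvalsh(A)  # A is symmetric, so eigvalsh applies
print(np.round(eigenvalues, 6))      # C4 has spectrum {-2, 0, 0, 2}

# One combinatorial use of the spectrum: the number of closed walks of
# length k equals trace(A^k), which is the sum of the k-th powers of
# the eigenvalues.
k = 2
walks_from_trace = np.trace(np.linalg.matrix_power(A, k))
walks_from_spectrum = np.sum(eigenvalues ** k)
print(walks_from_trace, round(walks_from_spectrum))  # both equal 8 for C4
```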

Enumerative Combinatorics (S2, Prof Burton):

Dealing a hand of cards, throwing a pair of dice, selecting a handful of marbles from a bag of marbles of various colours, and painting the vertices of a regular solid with a selection of colours are all processes with observable outcomes; we shall call such processes experiments.

The objective of this course is to develop techniques for counting the number of ways in which outcomes with certain properties can occur. An outline of the course is as follows.

• Set Calculus
• Basic Counting and Probability
• Problem Solving Techniques
• Advanced Counting: Inclusion and Exclusion and applications
• Generating Functions
• Counting Patterns: Group Actions and applications
• Recurrence Relations

The course will develop problem-solving skills and the results that we prove will have a wide range of applications. Some typical problems:

• A man has two children. One is a boy born on January 11. What is the probability that the other child is a boy?
• Ten men throw their keys into a hat and then each man withdraws a key at random. What is the probability that each man selects another man's key?
• An atom has the shape of a tetrahedron and radicals attach to the four corners. How many different molecules can be formed by attaching radicals of ten different types to the corners?
• A hand of five cards is dealt from a standard deck of cards. What is the probability that the hand is a full house?
• Find the number of ways of seating 20 people in a row if four of them are special and must have at least two other people between them.
• Find the number of ways of painting twenty different objects with four different colours if each colour must be used at least once.
• Three dice are thrown and the scores are added to obtain a sum between 3 and 18. What is the probability that the sum will be 10?
• Find the number of binary sequences of length k which do not contain the pattern 00.
• A disease spreads among a population in such a way that the change in the number infected between two successive weeks is twice the change in the number infected in the two previous weeks. Obtain an expression for the number infected in week n.
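As a taste of the counting style, the full-house problem above can be settled with a short computation (an illustrative sketch, not part of the course materials):

```python
from math import comb

# A full house: choose the rank of the triple and 3 of its 4 suits,
# then the rank of the pair and 2 of its 4 suits.
favourable = 13 * comb(4, 3) * 12 * comb(4, 2)
total = comb(52, 5)            # all five-card hands
p = favourable / total
print(favourable, total)       # 3744 2598960
print(round(p, 6))             # 0.001441
```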

Introduction to Category Theory (S2, Mr Nxumalo):

Category theory provides an abstract framework for comparing different branches of mathematics, and develops general constructions that can be used to describe and formalise abstractions of other mathematical concepts such as sets, groups and topological spaces.
The aim of this course is to introduce basic notions of category theory, their primary properties and examples (predominantly in sets, algebra and topology).

Pre-requisite: MAM202 or equivalent course.

Co-requisite: Topology.

Machine Learning (S2, Dr Atemkeng):

This course will cover dimensionality reduction (SVD, PCA, autoencoders), clustering algorithms (K-means, hierarchical clustering, Gaussian mixture models), ensemble methods (supervised decision trees and random forests, unsupervised random forests), outlier detection, cross-validation, and an introduction to artificial neural networks. Each aspect of the course will be linked to a practical and a theory assessment.
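The link between SVD and PCA mentioned above can be sketched in a few lines of numpy (an illustrative toy example; the data and variable names are assumptions, not course materials):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 points in 3-D that lie near a 2-D plane.
latent = rng.normal(size=(200, 2))
mixing = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, -0.5]])
X = latent @ mixing.T + 0.01 * rng.normal(size=(200, 3))

# PCA via SVD: centre the data, then the rows of Vt are the
# principal directions, and S**2 gives the component variances.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = S**2 / np.sum(S**2)

# Reduce dimensionality by projecting onto the first two components.
X2 = Xc @ Vt[:2].T
print(X2.shape)                    # (200, 2)
print(explained[:2].sum() > 0.99)  # almost all variance is captured
```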

Matrix Groups (S2, Dr Remsing):

Groups of transformations; actions of groups on sets; Euclidean spaces; matrix groups; the matrix exponential; Lie algebras.
OPTIONAL: (matrix) groups and geometry.
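The matrix exponential in the syllabus above can be illustrated with a naive power-series sketch (for illustration only, not a production algorithm; the generator J below is an assumed example, not from the course):

```python
import numpy as np

def expm_series(A, terms=30):
    """Matrix exponential via the truncated power series
    exp(A) = sum_{k>=0} A^k / k!  (a naive illustrative implementation)."""
    result = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k
        result = result + term
    return result

# J generates plane rotations: exp(t*J) is rotation by angle t, so the
# one-parameter subgroup t -> exp(t*J) lies in the matrix group SO(2).
J = np.array([[0.0, -1.0], [1.0, 0.0]])
t = np.pi / 3
R = expm_series(t * J)
expected = np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])
print(np.allclose(R, expected))  # True
```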

Measure Theory (S1, Dr Pinchuck):

We begin by studying Riemann integration and some of its shortcomings. To overcome many of these difficulties, we introduce the concept of a measure space. In particular, we focus on the Lebesgue measure on Euclidean space, which assigns the conventional length, area and volume of Euclidean geometry to suitable subsets. This leads to an improved theory of integration of real-valued functions. We will cover the important theorems of measure theory, and emphasis will be placed on solving various problems involving measure and integration.
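The improved theory of integration rests on one standard construction, recorded here only as an illustration: the integral of a non-negative simple function, and its extension to general non-negative measurable functions by taking a supremum.

```latex
% For a non-negative simple function s = \sum_{i=1}^{n} a_i \chi_{A_i}
% on a measure space (X, \mathcal{A}, \mu):
\int_X s \, d\mu = \sum_{i=1}^{n} a_i \, \mu(A_i),
% and for a general non-negative measurable f:
\int_X f \, d\mu = \sup \left\{ \int_X s \, d\mu \;:\; 0 \le s \le f,\ s \text{ simple} \right\}.
```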

Pre-requisite: 3rd year real analysis or equivalent.

Neural Networks (S1, Prof. Burton):

An artificial neural network (ANN) is a parallel computational system consisting of layers of neurons (which are just functions, called transfer functions) and connections between the layers (achieved by means of matrices of adjustable parameters). The parameters are adjusted in a way which roughly resembles the way in which neurotransmitters in the brain are adjusted during learning.

The study of ANNs is heavily dependent on linear algebra and multivariate calculus and some basic concepts from statistics. The course will develop as follows.

• Regression and beyond

• Perceptrons: These simple ANNs were introduced in the 1950s and were the first real attempt to get a computer to classify patterns. However, they were not able to classify certain patterns and the initial excitement died down. Although they are no longer used in practice, they are worth studying as an introduction to neural computing.

• Multi–layer, feed–forward networks with differentiable transfer functions: The machine learning pioneer, Paul Werbos, used a crucial notion, called backpropagation, to adjust the parameters in a neural network. This put ANNs back on the map and the study of artificial intelligence took off. We will study this in depth and construct ANNs from scratch using the MATLAB computational environment. With this experience behind us we will explore the MATLAB Deep Learning toolbox and use this to solve complex pattern recognition and prediction problems.

• Radial basis function networks (RBF): These ANNs are very different from feed–forward networks but have been shown to be equivalent to them. We will construct these RBF networks to solve complex problems.

• Dynamic Networks: These networks are deployed for time–series problems. We will develop our own methods, as well as the MATLAB toolbox, to construct and deploy them to solve time–series problems.

• Competitive Learning: The neurons in a competitive layer compete, in a sense which will be made precise, for the input data points. An updating procedure clusters the input data points such that input points in a cluster are similar to each other and dissimilar from data points in other clusters. In this way, the set of input data points are classified. This is an example of unsupervised learning. We will do this from first principles and also with the MATLAB Deep Learning toolbox.
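The perceptron topic above can be previewed with a minimal sketch of the classical perceptron learning rule (the course itself works in MATLAB; this Python version, and the choice of the AND problem, are illustrative assumptions):

```python
import numpy as np

# The perceptron learning rule on the linearly separable AND problem.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

w = np.zeros(2)   # adjustable weights (the "connections")
b = 0.0           # bias
eta = 0.1         # learning rate

for epoch in range(50):
    for xi, target in zip(X, y):
        out = 1 if xi @ w + b > 0 else 0   # hard-threshold transfer function
        # Update weights and bias only when the example is misclassified.
        w = w + eta * (target - out) * xi
        b = b + eta * (target - out)

preds = [1 if xi @ w + b > 0 else 0 for xi in X]
print(preds)  # [0, 0, 0, 1]
```

On separable data such as AND, the perceptron convergence theorem guarantees the rule stops making mistakes after finitely many updates.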

Numerical Modelling (S1, Prof. Pollney):

Partial differential equations arise in a number of applications modelling real-world phenomena. However, nonlinear equations do not lend themselves easily to analytic solution so that computer methods are necessary to model their behaviour. This course develops advanced numerical methods for solving evolution problems, in particular hyperbolic and parabolic PDEs. We introduce discrete approximations to continuous equations, study potential sources of error, and develop advanced techniques from the method of characteristics in order to approximate generic phenomena such as shocks.
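The discrete-approximation idea can be previewed with the simplest stable scheme for advection (a first-order upwind sketch; the grid sizes and the Gaussian initial data are illustrative assumptions, and the course develops far more advanced methods):

```python
import numpy as np

# First-order upwind scheme for the linear advection equation
#   u_t + a u_x = 0,  a > 0,  on a periodic domain [0, 1).
a = 1.0
N = 200
dx = 1.0 / N
dt = 0.4 * dx / a                    # CFL number 0.4 < 1 for stability
x = np.arange(N) * dx
u = np.exp(-100 * (x - 0.5) ** 2)    # smooth initial pulse centred at 0.5

steps = int(round(0.25 / dt))        # advect up to time t = 0.25
for _ in range(steps):
    # Backward (upwind) difference in space, forward Euler in time.
    u = u - a * dt / dx * (u - np.roll(u, 1))

# The exact solution translates the pulse right by a*t = 0.25,
# so the numerical peak should sit near x = 0.75.
print(round(x[np.argmax(u)], 2))
```

The upwind scheme is diffusive (the pulse spreads), which is exactly the kind of discretisation error the course analyses.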

Pre-requisite: MAP311 Numerical Analysis, MAP314 Partial Differential Equations.

Spectral Collocation Method (S2, Dr Oloniiju):

This course will cover the fundamentals of spectral and pseudo-spectral collocation methods, emphasizing implementation in Python or MATLAB. Spectral methods are powerful computational methods for solving differential equations that arise in the applied sciences. Compared to traditional numerical methods, spectral-based methods have been reported to be more accurate, particularly for problems with smooth solutions. This course gives students a thorough foundation and offers a hands-on approach to applying and implementing spectral methods. The course will cover: monomial and Lagrange interpolation on equal and unequal grids; collocation with monomial, Chebyshev and Lagrange polynomials as basis functions; a matrix-based approach to collocation and interpolation; domain decomposition for time-dependent differential equations; linearization techniques for nonlinear problems; and computing eigenvalues of linear boundary value problems.
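The matrix-based approach can be previewed with a Chebyshev differentiation matrix (a Trefethen-style construction; the test function and grid size are illustrative assumptions, not course materials):

```python
import numpy as np

def cheb(N):
    """Chebyshev differentiation matrix D and points x on [-1, 1]
    (an illustrative sketch of the standard construction)."""
    if N == 0:
        return np.zeros((1, 1)), np.array([1.0])
    x = np.cos(np.pi * np.arange(N + 1) / N)          # Chebyshev points
    c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
    X = np.tile(x, (N + 1, 1)).T
    dX = X - X.T
    D = np.outer(c, 1.0 / c) / (dX + np.eye(N + 1))   # off-diagonal entries
    D -= np.diag(D.sum(axis=1))                       # diagonal: negative row sums
    return D, x

# Spectral differentiation of a smooth function: D @ f(x) approximates f'(x).
D, x = cheb(20)
f = np.exp(x) * np.sin(5 * x)
fprime_exact = np.exp(x) * (np.sin(5 * x) + 5 * np.cos(5 * x))
err = np.max(np.abs(D @ f - fprime_exact))
print(err < 1e-6)  # spectral accuracy with only 21 points
```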

Pre-requisites: Numerical Analysis, Numerical Programming with MATLAB/Python, Differential Equations.

Topology (S2, Dr Mclean):

Topology is the study of geometrical properties and spatial relations that are preserved by the continuous change of shape or size of objects. This course will cover general topology (the foundation for most other branches of topology). The topics covered will be set theory (briefly), topological spaces, connectedness, compactness, and the countability and separation axioms.