The course focuses on methods for learning underlying structures from data and for training models that can make predictions when presented with new data. Such predictions typically involve discriminating between different categories of data, i.e. pattern classification, which is the main focus of this course.

At the end of this course, the student should be able to recognize problems that can be handled by machine learning methods. Furthermore, the student should be able to use the terminology acquired throughout the course to state such problems precisely. To solve a problem, the student must be able to implement a classifier, train it on representative data, and verify that it is capable of handling new data. The student should be able to work with different types of classifiers and know the underlying theory, so that purpose-built solutions can be designed.

Content

The course starts with an introduction to the fundamental theory, Bayes decision theory. This statistically based theory lets us define optimal decision thresholds for distinguishing between data elements, represented by so-called feature vectors. These decision thresholds are optimal in the sense that they minimize the expected error rate. The introductory theory assumes that the statistical functions describing the data are known. This is not true in practice, where these functions must be estimated using parametric and non-parametric methods.

As an alternative to estimating the statistical functions, we can estimate the coefficients of the polynomials describing the decision boundaries directly. This is introduced with linear discriminant functions, where we seek the polynomial coefficients that minimize the error rate expressed by a criterion function. To do this, we use iterative gradient descent techniques. Curve fitting by regression analysis is also presented in this context. Building on this, neural networks are presented as a method to use when linear discriminant functions fall short. As part of this, deep neural networks, the foundation of deep learning, are also discussed.

In the techniques presented so far, the class to which each data element belongs is assumed to be known. With clustering techniques we no longer make this assumption, and instead seek natural clusters in the data. Finally, methods for evaluating classifier performance are presented. Another important aspect of classification is how to characterize the data as feature vectors. Throughout the course, illustrative examples from ongoing research projects in biomedical data analysis are presented.
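To make the Bayes decision rule concrete, the following minimal sketch (an illustrative assumption, not course material) classifies a one-dimensional feature value between two Gaussian classes with known means, standard deviations, and priors. Maximizing prior times likelihood is equivalent to maximizing the posterior, which minimizes the expected error rate; with equal priors and equal variances, the optimal decision threshold lies midway between the class means.

```python
import math

def gaussian_pdf(x, mean, std):
    """Likelihood p(x | class) under a Gaussian class-conditional density."""
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

def bayes_classify(x, params, priors):
    """Assign x to the class with the largest posterior p(class | x).

    params: list of (mean, std) per class; priors: list of P(class).
    Choosing the maximum of prior * likelihood minimizes the error rate.
    """
    posteriors = [p * gaussian_pdf(x, m, s) for (m, s), p in zip(params, priors)]
    return max(range(len(posteriors)), key=lambda k: posteriors[k])

# Two classes with equal priors and equal variances: the decision
# threshold lies midway between the means, here at x = 2.0.
params = [(0.0, 1.0), (4.0, 1.0)]
priors = [0.5, 0.5]
print(bayes_classify(1.5, params, priors))  # → 0 (below the threshold)
print(bayes_classify(2.5, params, priors))  # → 1 (above the threshold)
```

Unequal priors shift the threshold toward the less probable class, which is easy to verify by varying `priors` in the sketch.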

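The iterative gradient descent idea used for linear discriminant functions can be sketched as follows. This is a hypothetical illustration, assuming a sum-of-squared-errors criterion and sample-by-sample (stochastic) updates; the function and parameter names are the author's own, not from the course.

```python
def train_linear_discriminant(samples, labels, lr=0.1, epochs=200):
    """Fit weights so that sign(w0 + w1 * x) predicts the label (+1/-1).

    Each update steps opposite the gradient of the squared-error
    criterion for one sample, gradually reducing the total error.
    """
    w0, w1 = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            error = (w0 + w1 * x) - y   # residual for this sample
            w0 -= lr * error            # gradient step for the bias term
            w1 -= lr * error * x        # gradient step for the slope term
    return w0, w1

# Two well-separated classes on the real line, labelled -1 and +1.
samples = [-2.0, -1.0, 1.0, 2.0]
labels = [-1, -1, 1, 1]
w0, w1 = train_linear_discriminant(samples, labels)
predictions = [1 if w0 + w1 * x > 0 else -1 for x in samples]
print(predictions)
```

The same gradient-descent loop generalizes to higher-dimensional feature vectors by replacing the scalar weight with a weight vector.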
Required prerequisite knowledge

None

Recommended prerequisites

BID230 Introduction to programming, DAT110 Introduction to Programming, ÅMA100 Mathematical methods 1, ÅMA110 Introduction to Probability and Statistics, ÅMA260 Mathematical Methods 2

In the course, the different methods are communicated by presenting and explaining the mathematical details. Students who wish to follow the course should therefore have solid prior mathematical knowledge, especially in linear algebra and statistics. Considerable emphasis is placed on the laboratory part of the course, which uses Scientific Python. Students should therefore also have good programming skills, and must be prepared to write functions using iterative control structures and to think about code reuse.
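As an indication of the expected programming level, the following assumed example (not course material) shows a function built from an iterative control structure being reused by a second function:

```python
def euclidean_distance(u, v):
    """Distance between two feature vectors, computed with an explicit loop."""
    total = 0.0
    for a, b in zip(u, v):
        total += (a - b) ** 2
    return total ** 0.5

def nearest_class_mean(x, class_means):
    """Reuse euclidean_distance to assign x to the nearest class mean."""
    return min(range(len(class_means)),
               key=lambda k: euclidean_distance(x, class_means[k]))

means = [[0.0, 0.0], [3.0, 4.0]]
print(nearest_class_mean([1.0, 1.0], means))  # → 0
```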

Exam

Form of assessment: Written exam
Weight: 1/1
Duration: 4 hours
Marks: A - F
Aid: No printed or written materials are allowed. An approved basic calculator is allowed.

Coursework requirements

Exercises

Mandatory work requirements (such as theoretical exercises, laboratory assignments, project assignments and the like) must be approved by the course lecturer within the specified deadline. The mandatory assignments must be approved in order to be admitted to the examination. Candidates who fail the compulsory coursework may not be able to complete it until the next time the course has ordinary teaching.