Keywords: Machine Learning, Probabilistic Modelling, Neural Networks, Bayesian Statistics, Learning Theory, Support Vector Machines, Kernel Methods, Reinforcement Learning.
Code: COMP GI01 / COMP 4c55
Year: MSc in Intelligent Systems, PhD course at the Gatsby Unit
Prerequisites: A good background in statistics, calculus, linear algebra, and computer science. You should know a programming language (Matlab/Octave, C, Java, ...), and preferably be competent in Matlab or Octave, or be willing to learn one of them on your own. Any student or researcher at UCL meeting these requirements is welcome to attend the lectures.
Term: 1, 2004
Time: 14.00 to 17.00 Mondays
Location: 305 Pearson Building UCL
Coordinated By: Fernando Pérez-Cruz (fernando-at-gatsby.ucl.ac.uk)
Lecturers: Nathaniel Daw, David J. C. MacKay, Iain Murray, Fernando Pérez-Cruz, Edward Snelson.
Homework Assignments: All assignments (coursework) for this course are to be handed in to the Gatsby Unit, not to the CS department. Please hand in assignments at the beginning of the lecture on the due date, to either Fernando Pérez-Cruz or the lecturer on that date. Late assignments will be penalised. If you are unable to come to class, you can also hand in assignments to Fernando Pérez-Cruz in his office, Room 402, Gatsby Unit (Alexandra House, 17 Queen Square).
Late Assignment Policy: Assignments handed in late will be penalised at 10% per weekday late.
Textbooks:
Christopher M. Bishop (1995) Neural Networks for Pattern Recognition. Clarendon Press.
David J.C. MacKay (2003) Information Theory, Inference, and Learning Algorithms, Cambridge University Press. (also available online)
Bernhard Schölkopf and Alexander J. Smola (2002) Learning with Kernels. MIT Press. (partially available online)
Tom M. Mitchell (1997) Machine Learning. McGraw-Hill.
October 6th: Introduction to Supervised Learning
Readings: Chapter 1 in Bishop's book.
October 11th: Maths and Matlab Review. Basic Optimisation
Readings: Cribsheet of Basic Maths Needed for Machine Learning.
Readings: Chapter 6 in Schölkopf and Smola's book.
October 18th: Multilayered Perceptrons and Bayesian Learning.
Assignment1: Due on the 25th of October (2pm).
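As a flavour of the multilayered perceptrons topic (not part of the course materials), here is a minimal sketch of why hidden layers matter: a two-layer network with hand-set threshold units computes XOR, which no single-layer perceptron can represent. The weights are chosen by hand purely for illustration.

```python
def step(a):
    """Heaviside threshold activation."""
    return 1.0 if a > 0 else 0.0

def mlp_xor(x1, x2):
    """Two-layer perceptron with hand-set weights computing XOR.
    The hidden units detect OR and AND; the output fires for
    'OR but not AND', i.e. exclusive or."""
    h1 = step(x1 + x2 - 0.5)   # fires when at least one input is on (OR)
    h2 = step(x1 + x2 - 1.5)   # fires only when both inputs are on (AND)
    return step(h1 - h2 - 0.5) # OR and not AND
```

Training such a network by backpropagation requires smooth activations (e.g. tanh) in place of the step function, which is what the lecture's gradient-based methods address.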
October 25th: Optimisation. Basic tools in Machine Learning
Readings: Chapter 6 in Schölkopf and Smola's book. Sections 9.2 and 9.8 in Bishop's book. Chapter 6 in Mitchell's book.
Assignment2: Due on Friday the 5th of November (2pm). Assignment2a: Due on Monday the 22nd of November.
November 1st: Approximation and Sampling
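To give a flavour of the sampling topic (this is an illustrative sketch, not a course handout), here is rejection sampling from a hypothetical density p(x) = 6x(1 - x) on [0, 1], using a uniform proposal and envelope constant M = 1.5 (the maximum of p):

```python
import random

def rejection_sample(n, seed=0):
    """Draw n samples from p(x) = 6 x (1 - x) on [0, 1] by rejection
    sampling with a Uniform(0, 1) proposal q and envelope M = 1.5,
    chosen so that p(x) <= M q(x) everywhere."""
    rng = random.Random(seed)
    M = 1.5
    samples = []
    while len(samples) < n:
        x = rng.random()               # proposal draw x ~ q
        u = rng.random()               # uniform acceptance variable
        if u * M <= 6 * x * (1 - x):   # accept with probability p(x) / (M q(x))
            samples.append(x)
    return samples

xs = rejection_sample(10000)
mean = sum(xs) / len(xs)   # the true mean of this density is 0.5
```

The acceptance rate is 1/M, so a tight envelope matters; for densities where no good envelope exists, the lecture's approximation methods (e.g. importance sampling) are the alternative.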
November 15th: Gaussian Processes for Regression
Assignment3: Due on Monday the 29th of November (2pm). Data for the last 2 questions: GPdata.
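The central formula of GP regression is the posterior mean m(x*) = k*ᵀ (K + σ²I)⁻¹ y, where K is the covariance matrix of the training inputs and k* the covariances with the test input. Below is a minimal pure-Python sketch of that formula on made-up 1-D data (not the GPdata set), using a squared-exponential kernel and a small jitter for numerical stability:

```python
import math

def rbf(x, z, ell=1.0):
    """Squared-exponential covariance k(x, z) = exp(-(x - z)^2 / (2 ell^2))."""
    return math.exp(-((x - z) ** 2) / (2 * ell ** 2))

def solve(A, b):
    """Solve A w = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))  # pivot row
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    w = [0.0] * n
    for r in range(n - 1, -1, -1):  # back substitution
        w[r] = (M[r][n] - sum(M[r][k] * w[k] for k in range(r + 1, n))) / M[r][r]
    return w

def gp_mean(X, y, x_star, noise=1e-6):
    """Posterior mean k_*^T (K + noise I)^{-1} y of a zero-mean GP."""
    n = len(X)
    K = [[rbf(X[i], X[j]) + (noise if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    alpha = solve(K, y)
    return sum(rbf(x_star, X[i]) * alpha[i] for i in range(n))

# Toy 1-D data; with near-zero noise the posterior mean interpolates it.
X = [0.0, 1.0, 2.0]
y = [0.0, 1.0, 0.0]
```

With a larger `noise` the mean smooths rather than interpolates, which is the regression setting the lecture develops; in practice one uses a Cholesky factorisation instead of plain elimination.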
November 22nd: Introduction to Learning Theory
Further Readings: Kernel Methods and their Potential Use in Signal Processing.
November 29th: Support Vector Machines for Classification
Code: SVM 2D demo.
Assignment4: Due on Monday the 6th of December (2pm). Data for the last questions: data.
Hint for Q3: Good values for C range from 5 to 20 and for sigma between 1 and 10.
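The role of sigma in that hint is the length scale of the Gaussian kernel k(x, z) = exp(-||x - z||² / (2σ²)): too small and every training point looks unique (the Gram matrix approaches the identity, inviting overfitting); too large and all points look alike. A small sketch on hypothetical 2-D points, not the assignment data, shows the two extremes:

```python
import math

def gaussian_kernel(x, z, sigma):
    """Gaussian (RBF) kernel k(x, z) = exp(-||x - z||^2 / (2 sigma^2))."""
    sq = sum((a - b) ** 2 for a, b in zip(x, z))
    return math.exp(-sq / (2 * sigma ** 2))

def gram(X, sigma):
    """Gram matrix K[i][j] = k(X[i], X[j])."""
    return [[gaussian_kernel(x, z, sigma) for z in X] for x in X]

X = [[0.0, 0.0], [1.0, 1.0], [3.0, 0.0]]
K_small = gram(X, 0.1)    # off-diagonal entries vanish: near-identity matrix
K_large = gram(X, 100.0)  # all entries approach 1: points indistinguishable
```

In practice one picks C and sigma by cross-validation over a grid such as the ranges in the hint, keeping the model that scores best on held-out data.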
December 6th: Extensions to kernel methods
Assignment5: Due on Wednesday the 15th of December (2pm). Data and code: data and kpca.m.
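Kernel PCA, the subject of the kpca.m code, double-centres the Gram matrix and takes its leading eigenvectors. A pure-Python sketch of those two steps (toy data and a linear kernel chosen here for checkability; kpca.m itself may differ in kernel and conventions) looks like this:

```python
import math

def center_gram(K):
    """Double-centre a Gram matrix: Kc = K - 1K/n - K1/n + 1K1/n^2,
    so it corresponds to feature vectors with their mean subtracted."""
    n = len(K)
    row = [sum(K[i]) / n for i in range(n)]
    tot = sum(row) / n
    return [[K[i][j] - row[i] - row[j] + tot for j in range(n)] for i in range(n)]

def leading_eig(A, iters=500):
    """Power iteration for the leading eigenpair of a symmetric PSD matrix.
    The start vector must not be uniform: a centred Gram matrix sends the
    all-ones vector to zero."""
    n = len(A)
    v = [float(i + 1) for i in range(n)]
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    lam = sum(v[i] * sum(A[i][j] * v[j] for j in range(n)) for i in range(n))
    return lam, v

# Hypothetical toy data with a linear kernel k(x, z) = x . z.
X = [[0.0, 0.0], [1.0, 0.0], [2.0, 0.0], [0.0, 3.0]]
K = [[sum(a * b for a, b in zip(x, z)) for z in X] for x in X]
Kc = center_gram(K)
lam, v = leading_eig(Kc)
# Projection of training point i onto the first kernel principal component:
proj = [math.sqrt(lam) * v[i] for i in range(len(X))]
```

Swapping the linear kernel for a nonlinear one (e.g. the Gaussian kernel from the SVM lectures) is what lets kernel PCA find nonlinear structure while the centring and eigen-steps stay unchanged.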
December 10th: Reinforcement Learning (B10 Gatsby Unit)
Slides: Reinforcement Learning.
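As a taste of the reinforcement learning material (an illustrative sketch on a hypothetical 2-state MDP, not taken from the slides), value iteration repeatedly applies the Bellman optimality update V(s) = max_a [R(s, a) + γ V(s')]:

```python
def value_iteration(P, R, gamma=0.9, iters=300):
    """Value iteration on a tiny MDP with deterministic transitions.
    P[s][a] is the next state for action a in state s; R[s][a] the reward."""
    n = len(P)
    V = [0.0] * n
    for _ in range(iters):
        # Bellman optimality backup for every state
        V = [max(R[s][a] + gamma * V[P[s][a]] for a in range(len(P[s])))
             for s in range(n)]
    return V

# Hypothetical 2-state chain: action 0 stays put (reward 1 only in state 1),
# action 1 moves to the other state (reward 0).
P = [[0, 1], [1, 0]]
R = [[0.0, 0.0], [1.0, 0.0]]
V = value_iteration(P, R)
# Optimal policy: stay in state 1 forever, so V[1] -> 1/(1 - 0.9) = 10
# and V[0] -> 0 + 0.9 * V[1] = 9.
```

The update is a contraction with factor γ, so the iterates converge geometrically to the unique optimal value function, which is the starting point for the algorithms covered in the lecture.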