Pattern Recognition [PR]
This lecture gives an introduction to basic and commonly used classification concepts. First, the necessary statistical concepts are revised and the Bayes classifier is introduced. Further topics include generative and discriminative models such as the Gaussian classifier, Naive Bayes, logistic regression, Linear Discriminant Analysis, the Perceptron, and Support Vector Machines (SVMs). More complex methods, such as the Expectation Maximization algorithm used to estimate the parameters of Gaussian Mixture Models (GMMs), are then discussed. In addition to these classifiers, methods necessary for practical application, such as dimensionality reduction, optimization methods, and the use of kernel functions, are explained. Finally, we focus on Independent Component Analysis (ICA), combine weak classifiers into a strong one (AdaBoost), and discuss the performance evaluation of machine classifiers. In the tutorials, the methods and procedures presented in this lecture are illustrated with theoretical and practical exercises.
Dates & Rooms:
The lecture will take place on Friday, 10:15 - 11:45, in Room H15.
The exercises will take place on Monday, 9:00 - 9:45, in Room H2, Egerlandstr. 3.
The lecture will start on Friday, October 16th.
Please register for "Pattern Recognition" on StudOn so that you are on the e-mail list.
I will have to go to Baltimore, USA, on short notice. Therefore, there will be no lecture on Friday, November 13th, or Monday, November 16th.
There will be no class on January 8th. Instead, the classes on January 11th and 18th will run from 8:15 - 9:45.