Pattern Recognition [PR]
This lecture gives an introduction to basic and commonly used classification concepts. First, the necessary statistical concepts are reviewed and the Bayes classifier is introduced. Further topics include generative and discriminative models such as the Gaussian classifier, Naive Bayes, logistic regression, Linear Discriminant Analysis, the Perceptron, and Support Vector Machines (SVMs). More complex methods are then discussed, such as the Expectation Maximization algorithm, which is used to estimate the parameters of Gaussian Mixture Models (GMMs). In addition to these classifiers, methods necessary for practical application are explained, including dimensionality reduction, optimization methods, and the use of kernel functions. Finally, we cover Independent Component Analysis (ICA), the combination of weak classifiers into a strong one (AdaBoost), and the evaluation of classifier performance. In the tutorials, the methods and procedures presented in the lecture are illustrated with theoretical and practical exercises.
Dates & Rooms:
Friday, 8:15 - 9:45; Room: C1 - Chemikum
Tuesday, 16:15 - 17:45; Room: H8
Announcements regarding the first week of classes (October 16th and 19th)
- Professor Nöth will be at a congress during the first week of the lecture period; therefore, the first two lectures (on October 16th and 19th) will not take place.
- The tutors will attend the first class on October 16th to explain some organizational details.
- In order to catch up, classes in the following weeks will be 2 hours long instead of 1.5 hours (4 hours per week instead of 3).
- All announcements and course materials will be posted on the course's StudOn page. The links to the course and to the programming assignments are given below; please register.
https://www.studon.fau.de/studon/goto.php?target=crs_2327822 (only for +2.5 ECTS)