Contact
+49-9131-85-27775
+49-9131-85-27270
Secretary (office hours)
Monday | 8:00 - 12:15
Tuesday | 8:00 - 16:45
Wednesday | 8:00 - 16:45
Thursday | 8:00 - 16:45
Friday | 8:00 - 12:15
Address
Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU)
Lehrstuhl für Informatik 5 (Mustererkennung)
Martensstr. 3
91058 Erlangen
Germany
Introduction to Pattern Recognition [IntroPR]
Summary
The goal of this lecture is to familiarize students with the overall
pipeline of a Pattern Recognition system. The various steps involved, from
data capture to pattern classification, are presented. The lectures start
with a short introduction in which the nomenclature is defined. Analog-to-digital
conversion is briefly discussed, with a focus on how it impacts
further signal analysis. Commonly used preprocessing methods are then
described. A key component of Pattern Recognition is feature extraction.
Thus, several techniques for feature computation will be presented, including the
Walsh Transform, Haar Transform, Linear Predictive Coding, Wavelets,
Moments, Principal Component Analysis, and Linear Discriminant Analysis. The
lectures conclude with a basic introduction to classification. The
principles of statistical, distribution-free, and nonparametric
classification approaches will be presented. Within this context we will
cover Bayesian and Gaussian classifiers, as well as artificial neural
networks. The accompanying exercises will provide further details on the
methods and procedures presented in this lecture, with particular emphasis on
their application.
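To make the pipeline concrete, here is a minimal NumPy sketch that runs all three stages on invented data: synthetic two-class patterns, feature extraction via Principal Component Analysis, and a Gaussian classifier. The data, dimensions, and parameters are made up for illustration and are not taken from the course material.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy "patterns": two classes of 10-dimensional vectors (invented data).
    X0 = rng.normal(0.0, 1.0, size=(100, 10))
    X1 = rng.normal(1.5, 1.0, size=(100, 10))
    X = np.vstack([X0, X1])
    y = np.array([0] * 100 + [1] * 100)

    # Feature extraction: PCA, i.e. projection onto the top-2 eigenvectors
    # of the sample covariance matrix.
    mu = X.mean(axis=0)
    Xc = X - mu
    cov = Xc.T @ Xc / (len(X) - 1)
    eigval, eigvec = np.linalg.eigh(cov)   # eigenvalues in ascending order
    W = eigvec[:, -2:]                     # top-2 principal axes
    F = Xc @ W                             # 2-D feature vectors

    # Classification: Gaussian classifier, one Gaussian per class,
    # decision by maximum log-likelihood under equal priors.
    def fit_gaussian(feats):
        return feats.mean(axis=0), np.cov(feats, rowvar=False)

    def log_likelihood(f, m, S):
        d = f - m
        return -0.5 * (d @ np.linalg.solve(S, d) + np.log(np.linalg.det(S)))

    params = [fit_gaussian(F[y == c]) for c in (0, 1)]
    pred = np.array([np.argmax([log_likelihood(f, m, S) for m, S in params])
                     for f in F])
    print("training accuracy:", (pred == y).mean())

Since np.linalg.eigh returns eigenvalues in ascending order, the last two eigenvector columns span the directions of largest variance.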
Dates & Rooms:
- Tuesday, 10:15 - 11:45; Room: 0.68
- Wednesday, 10:15 - 11:45; Room: 0.68
- Friday's exercises and Wednesday's exercises are again synchronized.
- Tentative course syllabus: It provides a roadmap to the lectures throughout the semester.
- The lecture videos from Winter Semester 2013 are available here. (They are only accessible from inside the university network; if you want to watch the videos from home, consider tunnelling the connection.)
- Prof. Niemann's textbook on pattern recognition, Klassifikation von Mustern, is available online.
A. Exam Dates
Monday 17.02.2014
Tuesday 18.02.2014
Thursday 20.02.2014
Wednesday 19.03.2014
Monday 31.03.2014
Tuesday 01.04.2014
B. Signing up for the Exam
Reserving a slot for the exam is only possible after January 6th, 2014. You must reserve a time slot for the exam, independent of whether you have signed up on meinCampus. You can do so:
- either by personally visiting the secretaries at the Pattern Recognition Lab, room 09.138, Martensstr. 3, 91058 Erlangen,
- or by sending an email to Kristina Müller at mueller(at)cs.fau.de or to Iris Koppe at koppe(at)cs.fau.de. Make sure to state in your email your full name, student ID, program of studies, birth date, number of credits, and type of exam (e.g. benoteter Schein, unbenoteter Schein, Prüfung durch meinCampus, etc.).
NEWS: The results of the IntroPR exam have been added to meinCampus. If you want to take a look at your corrected exam (Prüfungseinsicht), please contact Simone Gaffling. The deadline for Prüfungseinsicht is Nov 7th; after this date the grades are finalized.
The updated slides will be posted on the web soon after the corresponding lecture is completed.
In order to prepare yourself for an upcoming lecture, look at the slides of the previous Winter semester. Illustrative code sketches for several of the topics follow the table.

Introduction | course outline, examples of PR systems
Key PR Concepts | the pipeline of a PR system, terminology, postulates of PR
Sampling | review of Fourier analysis, Nyquist sampling theorem
Quantization | signal-to-noise ratio, pulse code modulation, vector quantization, k-means algorithm
Histogram Equalization and Thresholding | histogram equalization, thresholding, binarization, maximum likelihood estimation, various thresholding algorithms (intersection of Gaussians, Otsu's algorithm, unimodal algorithm, entropy-based)
Noise Suppression | linear shift-invariant transformations, convolution, mean filter, Gaussian filter, median filter
Edge Detection | gradient-based edge detector, Laplacian of Gaussian, sharpening
Non-linear Filtering | recursive filters, homomorphic filters, cepstrum, morphological operators, rank operators
Normalization | size normalization, location normalization, pose normalization, geometric moments, central moments
Introduction to Feature Extraction | curse of dimensionality, heuristic versus analytic feature extraction methods, projection onto orthogonal bases, Fourier transform as a feature
Orthonormal Bases for Feature Extraction | spectrogram, Walsh-Hadamard transform, Haar transform
LPC and Moments | linear predictive coding, moments as features, Hu moments
Multiresolution Analysis | short-time Fourier transform, continuous wavelet transform, discrete wavelet transform, wavelet series
PCA, LDA | principal component analysis, eigenfaces, linear discriminant analysis, Fisherfaces
OFT | optimal feature transform, Mahalanobis distance, feature transform
Optimization Methods | gradient descent, coordinate descent
Feature Selection | objective functions for feature selection, including entropy and KL-divergence; strategies for exploring the space of feature subsets, including branch-and-bound
Bayesian Classifier | introduction to classification, decision function, misclassification cost, misclassification risk, Bayesian classifier
Gaussian Classifier | Gaussian classifier, linear vs. quadratic decision boundaries
Polynomial Classifiers | polynomial classifier, discriminant functions
Non-parametric Classifiers | k-nearest neighbor density estimation, Parzen windows
An Intro to Artificial Neural Networks | introduction to ANNs, ANNs and classification, radial basis function ANNs
Multilayer Perceptrons | ANN layouts, feed-forward networks, perceptron, MLPs, back-propagation
Deep Convolutional Neural Nets | convolutional neural networks, local computation, shared weights, convolutional layer, examples
Review | end-of-lecture review, brief recap of what was covered in class
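Quantization: a minimal sketch of the k-means algorithm as used for vector quantization, in plain NumPy. The toy 2-D data and the choice k=3 are invented for illustration.

    import numpy as np

    def kmeans(X, k, iters=50, seed=0):
        """Plain k-means: alternate nearest-codeword assignment and mean update."""
        rng = np.random.default_rng(seed)
        codebook = X[rng.choice(len(X), size=k, replace=False)]
        for _ in range(iters):
            # Assign each vector to its nearest codeword (Euclidean distance).
            d = np.linalg.norm(X[:, None, :] - codebook[None, :, :], axis=2)
            labels = d.argmin(axis=1)
            # Move each codeword to the mean of its assigned vectors.
            for j in range(k):
                if np.any(labels == j):
                    codebook[j] = X[labels == j].mean(axis=0)
        return codebook, labels

    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(c, 0.3, size=(200, 2)) for c in (0.0, 2.0, 4.0)])
    codebook, labels = kmeans(X, k=3)
    print(np.round(codebook, 2))   # roughly the three cluster centers

Each iteration alternates the two steps of the algorithm: assign every vector to its nearest codeword, then move every codeword to the mean of its assigned vectors.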
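Thresholding: a sketch of Otsu's algorithm, which picks the gray level that maximizes the between-class variance of the histogram. The bimodal toy image is invented for illustration.

    import numpy as np

    def otsu_threshold(image):
        """Return the gray level that maximizes between-class variance."""
        hist = np.bincount(image.ravel(), minlength=256).astype(float)
        p = hist / hist.sum()                  # gray-level probabilities
        omega = np.cumsum(p)                   # class-0 probability per threshold
        mu = np.cumsum(p * np.arange(256))     # cumulative first moment
        mu_t = mu[-1]
        # Between-class variance; thresholds with an empty class give NaN/inf.
        with np.errstate(divide="ignore", invalid="ignore"):
            sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
        sigma_b = np.nan_to_num(sigma_b)
        return int(np.argmax(sigma_b))

    # Bimodal toy image: dark background plus a brighter patch.
    rng = np.random.default_rng(0)
    img = rng.normal(60, 10, size=(64, 64))
    img[16:48, 16:48] = rng.normal(180, 10, size=(32, 32))
    img = np.clip(img, 0, 255).astype(np.uint8)
    t = otsu_threshold(img)
    print("Otsu threshold:", t)    # expect a value between the two modes
    binary = img > t               # binarization with the chosen threshold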
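Noise suppression and edge detection: a sketch that smooths a noisy step edge with a mean filter and then responds to the edge with a horizontal gradient kernel. The direct convolution loop, the Sobel kernel, and the toy image are illustrative choices, not the course's reference implementation.

    import numpy as np

    def convolve2d(img, kernel):
        """Direct 'valid' 2-D convolution (no padding); enough for a demo."""
        kh, kw = kernel.shape
        h, w = img.shape
        out = np.zeros((h - kh + 1, w - kw + 1))
        flipped = kernel[::-1, ::-1]     # convolution flips the kernel
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                out[i, j] = np.sum(img[i:i + kh, j:j + kw] * flipped)
        return out

    rng = np.random.default_rng(0)
    img = np.zeros((32, 32))
    img[:, 16:] = 1.0                                 # vertical step edge
    noisy = img + rng.normal(0, 0.1, img.shape)

    mean3 = np.ones((3, 3)) / 9.0                     # mean (box) filter
    smoothed = convolve2d(noisy, mean3)

    sobel_x = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    gx = convolve2d(smoothed, sobel_x)                # horizontal gradient
    print("strongest edge response at column", np.abs(gx).sum(axis=0).argmax())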
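Orthonormal bases: a sketch of the Walsh-Hadamard transform built from the Sylvester recursion; the length-8 test signal is arbitrary.

    import numpy as np

    def hadamard(n):
        """Hadamard matrix of order 2**n via the Sylvester recursion."""
        H = np.array([[1.0]])
        for _ in range(n):
            H = np.block([[H, H], [H, -H]])
        return H

    # Walsh-Hadamard "features" of a length-8 signal: project onto the
    # orthogonal rows of H (scaled to make the transform orthonormal).
    x = np.array([1.0, 2.0, 3.0, 4.0, 4.0, 3.0, 2.0, 1.0])
    H = hadamard(3) / np.sqrt(8)
    c = H @ x                          # transform coefficients
    print(np.allclose(H @ c, x))       # True: applying H twice recovers x

Because the scaled Hadamard matrix is symmetric and orthonormal, it is its own inverse, which the final check confirms.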
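Linear discriminant analysis: a sketch of Fisher's linear discriminant for two classes, using the closed form w = Sw^-1 (m1 - m0), where Sw is the within-class scatter matrix. The Gaussian toy classes are invented for illustration.

    import numpy as np

    rng = np.random.default_rng(0)
    X0 = rng.normal([0, 0], 1.0, size=(100, 2))
    X1 = rng.normal([3, 1], 1.0, size=(100, 2))

    # Fisher's criterion: maximize between-class over within-class scatter.
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = (np.cov(X0, rowvar=False) * (len(X0) - 1)
          + np.cov(X1, rowvar=False) * (len(X1) - 1))
    w = np.linalg.solve(Sw, m1 - m0)
    w /= np.linalg.norm(w)

    # Project both classes onto w; threshold midway between projected means.
    p0, p1 = X0 @ w, X1 @ w
    thr = 0.5 * (p0.mean() + p1.mean())
    acc = ((p0 < thr).mean() + (p1 > thr).mean()) / 2
    print("LDA direction:", np.round(w, 2), " accuracy:", acc)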
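Non-parametric classifiers: a sketch of the two density estimators from the lecture, a Parzen window with a Gaussian kernel and a k-nearest-neighbor estimate, both in one dimension. The sample data, window width h, and k are illustrative choices.

    import numpy as np

    def parzen_density(x, samples, h):
        """Parzen estimate with a Gaussian window of width h (1-D)."""
        u = (x - samples[:, None]) / h
        k = np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)
        return k.mean(axis=0) / h

    def knn_density(x, samples, k):
        """k-NN estimate: k / (2 N r_k), r_k = distance to the k-th neighbor."""
        d = np.sort(np.abs(x[None, :] - samples[:, None]), axis=0)
        r = d[k - 1]
        return k / (2 * len(samples) * r)

    rng = np.random.default_rng(0)
    samples = rng.normal(0.0, 1.0, size=500)   # draws from N(0, 1)
    xs = np.linspace(-3, 3, 7)
    print(np.round(parzen_density(xs, samples, h=0.3), 3))
    print(np.round(knn_density(xs, samples, k=25), 3))

Both printouts should roughly follow the standard normal density at the evaluation points, with the Parzen estimate smoother for larger h.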
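Multilayer perceptrons: a sketch of a small feed-forward network trained by back-propagation on the XOR problem. The layout (one hidden layer of four sigmoid units), learning rate, and iteration count are arbitrary illustrative choices.

    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    t = np.array([[0.0], [1.0], [1.0], [0.0]])      # XOR targets

    # One hidden layer of 4 sigmoid units, sigmoid output, squared-error loss.
    W1 = rng.normal(0.0, 1.0, (2, 4))
    b1 = np.zeros(4)
    W2 = rng.normal(0.0, 1.0, (4, 1))
    b2 = np.zeros(1)

    def sigmoid(a):
        return 1.0 / (1.0 + np.exp(-a))

    lr = 0.5
    for _ in range(5000):
        # Forward pass.
        h = sigmoid(X @ W1 + b1)
        y = sigmoid(h @ W2 + b2)
        # Backward pass: propagate the error layer by layer (chain rule).
        dy = (y - t) * y * (1.0 - y)        # delta at the output layer
        dh = (dy @ W2.T) * h * (1.0 - h)    # delta at the hidden layer
        W2 -= lr * h.T @ dy
        b2 -= lr * dy.sum(axis=0)
        W1 -= lr * X.T @ dh
        b1 -= lr * dh.sum(axis=0)

    print(np.round(y.ravel(), 2))           # should approach [0, 1, 1, 0]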