A. Exam Dates
Monday 18.02.2013 (morning only)
Wednesday 20.02.2013 (morning only)
B. Signing up for the Exam
Reserving a slot for the exam is only possible after January 6th, 2013.
You must reserve a time-slot for the exam, regardless of whether you have already signed up at meinCampus. You can do so:
either by personally visiting the secretaries of the Pattern Recognition Lab in room 09.138, Martensstr. 3, 91058 Erlangen,
or by emailing Kristina Müller at mueller(at)cs.fau.de or Iris Koppe at koppe(at)cs.fau.de. Make sure your email includes your full name, student ID, program of studies, date of birth, number of credits, and the type of exam (e.g. benoteter Schein, unbenoteter Schein, Prüfung durch meinCampus, etc.).
C. Lecture Slides
The updated slides will be posted on the web shortly after the corresponding lecture is completed.
To prepare for an upcoming lecture, have a look at the slides from the previous winter semester.
Key PR Concepts: the pipeline of a PR system, terminology, postulates of PR
Sampling: review of Fourier analysis, Nyquist sampling theorem
Quantization: signal-to-noise ratio, pulse code modulation, vector quantization, k-means algorithm
Equalization and Thresholding: histogram equalization, thresholding, binarization, maximum likelihood estimation, various thresholding algorithms (intersection of Gaussians, Otsu's algorithm, unimodal algorithm, entropy-based)
Noise Suppression: linear shift-invariant transformations, convolution, mean filter, Gaussian filter, median filter
Edge Detection: gradient-based edge detector, Laplacian of Gaussian, sharpening
Non-linear Filtering: recursive filters, homomorphic filters, cepstrum, morphological operators, rank operators
Pattern Normalization: size normalization, location normalization, pose normalization, geometric moments, central moments
Introduction to Feature Extraction: curse of dimensionality, heuristic versus analytic feature extraction methods, projection on orthogonal bases, Fourier transform as a feature
Orthonormal Bases for Feature Extraction: spectrogram, Walsh-Hadamard transform, Haar transform
LPC and Moments: linear predictive coding, moments as features, Hu moments
Multiresolution Analysis: short-time Fourier transform, continuous wavelet transform, discrete wavelet transform, wavelet series
PCA, LDA: introduction to analytic feature extraction, principal component analysis, eigenfaces, linear discriminant analysis, fisherfaces
OFT: optimal feature transform, Mahalanobis distance, feature transform
Optimization Methods: gradient descent, coordinate descent
Feature Selection: objective functions for feature selection including entropy and KL-divergence, strategies for exploring the space of feature subsets including branch-and-bound
Bayesian Classifier: introduction to classification, decision function, misclassification cost, misclassification risk, Bayesian classifier
Gaussian Classifier: Gaussian classifier, linear vs. quadratic decision boundaries
Polynomial Classifiers: polynomial classifier, discriminant functions
Non-parametric Classifiers: k-nearest neighbor density estimation, Parzen windows
Artificial Neural Networks: introduction to ANNs, ANNs and classification, radial basis function ANNs
Multilayer Perceptrons: ANN layouts, feed-forward networks, perceptron, MLPs, back-propagation
Review: end-of-lecture review, brief recap of what was covered in class
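The k-means algorithm listed under Quantization can be sketched in a few lines. The NumPy-based version below (the function name and parameters are illustrative, not taken from the lecture materials) alternates the two classic steps: assign every sample to its nearest centroid, then move each centroid to the mean of its assigned samples:

```python
import numpy as np

def kmeans(samples, k, iters=20, seed=0):
    """Plain k-means: alternate assignment and centroid-update steps."""
    rng = np.random.default_rng(seed)
    # Initialize centroids by picking k distinct samples.
    centroids = samples[rng.choice(len(samples), size=k, replace=False)].astype(float)
    labels = np.zeros(len(samples), dtype=int)
    for _ in range(iters):
        # Assignment step: each sample goes to its nearest centroid.
        dists = np.linalg.norm(samples[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: each non-empty cluster's centroid moves to its mean.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = samples[labels == j].mean(axis=0)
    return centroids, labels
```

Used for vector quantization, the returned centroids form the codebook and the labels are the code indices of the input vectors.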
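Of the thresholding algorithms listed above, Otsu's is compact enough to sketch as well. The version below (names are illustrative; it assumes 8-bit gray values in [0, 255]) scans all 256 candidate thresholds and keeps the one that maximizes the between-class variance:

```python
import numpy as np

def otsu_threshold(image):
    """Otsu's algorithm: pick the threshold maximizing between-class variance."""
    hist, _ = np.histogram(image.ravel(), bins=256, range=(0, 256))
    p = hist / hist.sum()                  # gray-level probabilities
    omega = np.cumsum(p)                   # class-0 probability up to level t
    mu = np.cumsum(p * np.arange(256))     # first moment up to level t
    mu_total = mu[-1]                      # global mean gray level
    # Between-class variance; the class probabilities can be zero at the
    # extremes, so silence the resulting divide-by-zero warnings.
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_total * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b = np.nan_to_num(sigma_b)
    return int(np.argmax(sigma_b))         # pixels > t are foreground
```

This is the standard discrete formulation: with cumulative histograms the variance for every threshold comes out of two `cumsum` passes instead of a nested loop.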