A tentative syllabus can be found here.
A. Exam Dates
Special Date for people not using Mein Campus: Thursday 08.03.2012
B. Signing up for the Exam
You must reserve a time slot for the exam. You can do so:
either by personally visiting the secretaries at the Pattern Recognition Lab, in room 09.138 at Martenstr. 3, 91058 Erlangen,
or by emailing Kristina Müller at mueller(at)cs.fau.de or Iris Koppe at koppe(at)cs.fau.de. In your email, be sure to include your full name, student ID, program of studies, birth date, type of exam, and number of credits (e.g. benoteter Schein, unbenoteter Schein, Prüfung durch meinCampus, etc.).
Students who are interested in hands-on PR experience and an additional 2.5 ECTS credits can work on one of the following projects:
For more details, contact me directly.
On Wednesday 19.02.2012 there will be a tour of the different on-going projects at the Pattern Recognition Lab. This is a great opportunity to familiarize yourself with the work performed at LME. It is a must if you are considering doing your Bachelor's or Master's thesis at LME.
| Topic | Contents |
| --- | --- |
| Key PR Concepts | the pipeline of a PR system, terminology, postulates of PR |
| | Fourier analysis review, Nyquist sampling theorem |
| Quantization | signal-to-noise ratio, pulse code modulation, vector quantization, k-means algorithm |
| Equalization and Thresholding | histogram equalization, thresholding, binarization, maximum likelihood estimation, various thresholding algorithms (intersection of Gaussians, Otsu's algorithm, unimodal algorithm, entropy-based) |
| LSI Filtering | linear shift invariant transformations, convolution, mean filter, Gaussian filter, gradient-based edge detector, Laplacian of Gaussian |
| Non-Linear Filtering | recursive filters, homomorphic filters, cepstrum, morphological operators, rank operators |
| Pattern Normalization | size normalization, location normalization, pose normalization, geometric moments, central moments |
| Introduction to Feature Extraction | curse of dimensionality, heuristic versus analytic feature extraction methods, projection on orthogonal bases, Fourier transform as a feature |
| Features from Projection to Orthonormal Bases | spectrogram, Walsh-Hadamard transform, Haar transform |
| LPC and Moments | linear predictive coding, moments as features, Hu moments |
| Multiresolution Analysis | continuous wavelet transform, discrete wavelet transform, wavelet series |
| PCA, LDA | introduction to analytic feature extraction, principal component analysis, eigenfaces, linear discriminant analysis, Fisherfaces |
| OFT | optimal feature transform, Mahalanobis distance, decision boundary |
| Optimization Methods | gradient descent, coordinate descent |
| Feature Selection | objective functions for feature selection (including conditional entropy and KL-divergence), search strategies (including branch and bound) |
| Bayes Classifier | introduction to classification, decision function, misclassification cost, misclassification risk, Bayes classifier |
| Gaussian Classifier | Gaussian classifier, linear vs. quadratic decision boundaries |
| Polynomial Classifier | polynomial classifier, discriminant functions |
| Non-parametric Classifiers | k-nearest neighbor density estimation, Parzen windows |
| Artificial Neural Networks | introduction to ANNs, ANNs and classification, radial basis function ANNs, ANN layouts, feed-forward networks, perceptron, MLPs, back-propagation |
| Self-Organizing Map | unsupervised training, training of SOMs, SOMs in multispectral imaging |
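Several of the algorithms listed above are compact enough to sketch in code. As one example, here is a minimal NumPy implementation of the k-means algorithm used in vector quantization (Lloyd's iteration; function and variable names are my own, not from the lecture materials):

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Lloyd's k-means: alternate nearest-centroid assignment and
    centroid update until the centroids stop moving."""
    rng = np.random.default_rng(seed)
    # Initialize centroids by picking k distinct samples.
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        # Assign each sample to its nearest centroid (Euclidean distance).
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute each centroid as the mean of its assigned samples;
        # keep the old centroid if a cluster ends up empty.
        new_centroids = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return centroids, labels
```

The returned centroids form the codebook of a vector quantizer; each input vector is then encoded by the index of its nearest centroid.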
1. Follow this link for additional information on Covariance Matrices.
2. A more detailed tutorial on Principal Component Analysis can be found here.
3. Here is the pseudocode for feature selection using Branch and Bound.
4. You may want to read the following paper, which describes how Branch and Bound can be used in Feature Selection. Due to copyright issues, a copy cannot be placed on this website.
P.M. Narendra and K. Fukunaga, "A Branch and Bound Algorithm for Feature Subset Selection," IEEE Transactions on Computers, Vol. C-26, No. 9, 1977, pp. 917-922.
5. A general overview on the Branch and Bound methodology is presented here.
6. A detailed paper on Branch and Bound can be found on this web page.
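To illustrate the branch-and-bound idea behind the materials linked above, here is a small sketch. It uses a toy additive criterion (a sum of per-feature scores, standing in for a real class-separability measure); the key property the pruning step relies on is monotonicity: removing a feature never increases the criterion. All names are illustrative, not from the linked pseudocode:

```python
def branch_and_bound_select(scores, d):
    """Pick d of len(scores) features maximizing a monotone criterion
    J(S) = sum of per-feature scores. Starting from the full feature set,
    features are removed one at a time; any branch whose current value
    already falls to or below the best complete subset found so far is
    pruned, since monotonicity guarantees no descendant can do better."""
    D = len(scores)
    best = {"value": float("-inf"), "subset": None}

    def J(subset):
        return sum(scores[i] for i in subset)

    def recurse(subset, next_removable):
        if J(subset) <= best["value"]:
            return  # prune: no subset of this node can beat the best so far
        if len(subset) == d:
            best["value"] = J(subset)
            best["subset"] = subset
            return
        # Remove features in increasing index order so each subset
        # is generated exactly once.
        for i in range(next_removable, D):
            if i in subset:
                recurse(subset - {i}, i + 1)

    recurse(frozenset(range(D)), 0)
    return sorted(best["subset"]), best["value"]
```

With an additive criterion the optimum is simply the d highest-scoring features, so this toy version is easy to check; the same search structure applies to real monotone criteria evaluated on feature subsets.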