
Introduction to Pattern Recognition [IntroPR]

Summary
The goal of this lecture is to familiarize the students with the overall pipeline of a Pattern Recognition system. The various steps involved, from data capture to pattern classification, are presented. The lectures start with a short introduction, where the nomenclature is defined. Analog-to-digital conversion is briefly discussed, with a focus on how it impacts further signal analysis. Commonly used preprocessing methods are then described. A key component of Pattern Recognition is feature extraction; thus, several techniques for feature computation will be presented, including the Walsh transform, the Haar transform, Linear Predictive Coding, wavelets, moments, Principal Component Analysis, and Linear Discriminant Analysis. The lectures conclude with a basic introduction to classification. The principles of statistical, distribution-free, and nonparametric classification approaches will be presented. Within this context we will cover Bayesian and Gaussian classifiers, as well as artificial neural networks. The accompanying exercises will provide further details on the methods and procedures presented in this lecture, with particular emphasis on their application.

Dates & Rooms:
Tuesday, 12:15 - 13:45; Room: 0.68
Wednesday, 12:15 - 13:45; Room: 0.68



General Information

A tentative syllabus can be found here.

Exam Information

A. Exam Dates

Tuesday 14.02.2012

Thursday 29.03.2012

Special date for people not using meinCampus: Thursday 08.03.2012

B. Signing up for the Exam

You must reserve a time slot for the exam. You can do so:

either by personally visiting the secretaries of the Pattern Recognition Lab in Room 09.138, Martensstr. 3, 91058 Erlangen,

or by sending an email to Kristina Müller at mueller(at)cs.fau.de or to Iris Koppe at koppe(at)cs.fau.de. Make sure to include in your email your full name, student ID, program of studies, date of birth, type of exam, and number of credits (e.g., benoteter Schein, unbenoteter Schein, Prüfung durch meinCampus, etc.).

 

Project Information

Students who are interested in hands-on PR experience and an additional 2.5 ECTS credits can work on one of the following projects:

1. Vessel segmentation in MRA projection images.

2. Manual detection of the optic disk in OCT images.

For more details, contact me directly.

 

Lab Tours

On Wednesday 19.02.2012 there will be a tour of the different ongoing projects at the Pattern Recognition Lab. This is a great opportunity to familiarize yourself with the work performed at the LME. It is a must if you are considering doing your Bachelor's or Master's thesis at the LME.

Slides

Introduction: course outline
Key PR Concepts: the pipeline of a PR system, terminology, postulates of PR
Sampling: Fourier analysis review, Nyquist sampling theorem
Quantization: signal-to-noise ratio, pulse code modulation, vector quantization, k-means algorithm (see the first sketch below this list)
Equalization and Thresholding: histogram equalization, thresholding, binarization, maximum likelihood estimation, various thresholding algorithms (intersection of Gaussians, Otsu's algorithm, unimodal algorithm, entropy-based; see the Otsu sketch below)
LSI Filtering: Linear Shift Invariant transformations, convolution, mean filter, Gaussian filter, gradient-based edge detector, Laplacian of Gaussian (see the convolution sketch below)
Non-Linear Filtering: recursive filters, homomorphic filters, cepstrum, morphological operators, rank operators
Pattern Normalization: size normalization, location normalization, pose normalization, geometric moments, central moments
Introduction to Feature Extraction: curse of dimensionality, heuristic versus analytic feature extraction methods, projection on orthogonal bases, Fourier transform as a feature
Features from Projection to Orthonormal Bases: spectrogram, Walsh-Hadamard transform, Haar transform
LPC and Moments: linear predictive coding, moments as features, Hu moments
Multiresolution Analysis: continuous wavelet transform, discrete wavelet transform, wavelet series
PCA, LDA: intro to analytic feature extraction, principal component analysis, eigenfaces, linear discriminant analysis, Fisherfaces (see the PCA sketch below)
OFT: optimal feature transform, Mahalanobis distance, decision boundary
Optimization Methods: gradient descent, coordinate descent
Feature Selection: objective functions for feature selection including conditional entropy and KL-divergence, search strategies including branch and bound
Bayes Classifier: introduction to classification, decision function, misclassification cost, misclassification risk, Bayes classifier
Gaussian Classifier: Gaussian classifier, linear vs. quadratic decision boundaries (see the classifier sketch below)
Polynomial Classifier: polynomial classifier, discriminant functions
Non-parametric Classifiers: k-nearest neighbor density estimation, Parzen windows (see the k-NN sketch below)
Artificial Neural Networks: introduction to ANNs, ANNs and classification, radial basis function ANNs
Multilayer Perceptron: ANN layouts, feed-forward networks, perceptron, MLPs, back-propagation
Self-Organizing Map: unsupervised training, training of SOMs, SOMs in multispectral imaging
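
The following short sketches illustrate, in plain Python with NumPy, a few of the algorithms listed above. They are simplified illustrations under stated assumptions, not the reference implementations used in the lecture or exercises. First, the k-means algorithm from the quantization lecture, which alternates between assigning each sample to its nearest codebook vector and re-estimating each vector as the mean of its assigned samples:

    import numpy as np

    def kmeans(x, k, iters=100, seed=0):
        # Plain k-means on feature vectors x (one sample per row).
        rng = np.random.default_rng(seed)
        centers = x[rng.choice(len(x), size=k, replace=False)]  # init with k samples
        for _ in range(iters):
            # assign every sample to its nearest center (squared Euclidean distance)
            dist = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
            labels = dist.argmin(axis=1)
            # move each center to the mean of its assigned samples
            new_centers = np.array([x[labels == j].mean(axis=0)
                                    if np.any(labels == j) else centers[j]
                                    for j in range(k)])
            if np.allclose(new_centers, centers):  # converged
                break
            centers = new_centers
        return centers, labels

For vector quantization, the returned centers serve as the codebook and the labels as the transmitted indices.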
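
Next, Otsu's thresholding algorithm from the equalization and thresholding lecture: it picks the gray level that maximizes the between-class variance of the resulting binarization. A minimal sketch, assuming an 8-bit (0-255) integer image:

    import numpy as np

    def otsu_threshold(img):
        # histogram of an 8-bit image, normalized to probabilities
        p = np.bincount(img.ravel(), minlength=256).astype(float)
        p /= p.sum()
        omega = np.cumsum(p)                    # class-0 probability up to level t
        mu = np.cumsum(p * np.arange(256))      # first moment up to level t
        mu_total = mu[-1]
        # between-class variance for every candidate threshold t
        with np.errstate(divide="ignore", invalid="ignore"):
            sigma_b = (mu_total * omega - mu) ** 2 / (omega * (1.0 - omega))
        return int(np.nanargmax(sigma_b))       # pixels <= t go to class 0

Binarization is then simply img > otsu_threshold(img).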
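
Every LSI filter from the filtering lecture reduces to a convolution; the mean filter is just convolution with a constant kernel. A deliberately simple direct 2-D convolution (valid region only), adequate for small kernels:

    import numpy as np

    def convolve2d(img, kernel):
        # direct 2-D convolution, returning only the fully overlapped (valid) region
        kh, kw = kernel.shape
        flipped = kernel[::-1, ::-1]            # convolution flips the kernel
        out = np.empty((img.shape[0] - kh + 1, img.shape[1] - kw + 1))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                out[i, j] = (img[i:i + kh, j:j + kw] * flipped).sum()
        return out

    mean_filter = np.ones((3, 3)) / 9.0         # 3x3 mean (smoothing) kernel

A Gaussian kernel, gradient masks, or the Laplacian of Gaussian can be substituted for mean_filter without changing the convolution code; that is exactly the point of shift invariance.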
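
For the PCA lecture, a minimal sketch of principal component analysis via the singular value decomposition of the centered data matrix (equivalent to diagonalizing the sample covariance matrix, but numerically more stable):

    import numpy as np

    def pca(x, m):
        # x holds one d-dimensional sample per row; keep the m leading components
        mu = x.mean(axis=0)
        xc = x - mu                             # center the data
        _, _, vt = np.linalg.svd(xc, full_matrices=False)
        w = vt[:m].T                            # d x m matrix of principal directions
        return xc @ w, w, mu                    # features, basis, and mean

New samples are projected as (x_new - mu) @ w; applying this to vectorized face images gives the eigenfaces construction mentioned above.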
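
For the Bayes and Gaussian classifier lectures, a minimal Gaussian classifier: each class gets its own mean, covariance, and prior, and a sample is assigned to the class with the largest log posterior. With class-specific covariances the decision boundaries are quadratic; forcing a shared covariance would make them linear:

    import numpy as np

    class GaussianClassifier:
        def fit(self, x, y):
            self.classes = np.unique(y)
            self.params = {}
            for c in self.classes:
                xc = x[y == c]
                mu = xc.mean(axis=0)
                cov = np.cov(xc, rowvar=False) + 1e-6 * np.eye(x.shape[1])
                prior = len(xc) / len(x)
                # cache what the decision function needs
                self.params[c] = (mu, np.linalg.inv(cov),
                                  np.log(prior) - 0.5 * np.log(np.linalg.det(cov)))
            return self

        def predict(self, x):
            scores = []
            for c in self.classes:
                mu, cov_inv, const = self.params[c]
                d = x - mu
                # log posterior up to a class-independent constant
                scores.append(const - 0.5 * np.einsum("ij,jk,ik->i", d, cov_inv, d))
            return self.classes[np.argmax(scores, axis=0)]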
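
Finally, for the non-parametric classifiers lecture, the k-nearest-neighbor decision rule: label each test sample by majority vote among its k nearest training samples. A minimal sketch assuming non-negative integer class labels and Euclidean distance:

    import numpy as np

    def knn_classify(x_train, y_train, x_test, k=5):
        # squared Euclidean distances between every test and training sample
        dist = ((x_test[:, None, :] - x_train[None, :, :]) ** 2).sum(axis=2)
        nearest = np.argsort(dist, axis=1)[:, :k]   # indices of the k nearest neighbors
        votes = y_train[nearest]
        # majority vote per test sample
        return np.array([np.bincount(v).argmax() for v in votes])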

Supplemental Material

1. Follow this link for additional information on Covariance Matrices. 

2. A more detailed tutorial on Principal Component Analysis can be found here.

3. Here is the pseudocode for feature selection using Branch and Bound; a simplified sketch also appears at the end of this section.

4. You may want to read the following paper, which describes how Branch and Bound can be used in feature selection. Due to copyright issues, a copy cannot be placed on this website.

P. M. Narendra and K. Fukunaga, "A Branch and Bound Algorithm for Feature Subset Selection," IEEE Transactions on Computers, Vol. C-26, No. 9, 1977, pp. 917-922.

5. A general overview of the Branch and Bound methodology is presented here.

6. A detailed paper on Branch and Bound can be found on this webpage.
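
In case the links above go stale, here is a minimal sketch of branch-and-bound feature selection in the spirit of the Narendra-Fukunaga algorithm. The criterion J is assumed to be supplied by the caller and to be monotone, i.e. removing a feature never increases it (Mahalanobis-distance-based separability measures have this property); this is an illustration, not the linked pseudocode:

    def branch_and_bound(features, m, J):
        # Find the m-feature subset maximizing a monotone criterion J.
        # Features are removed one at a time; a branch is pruned as soon as
        # its value drops to or below the best complete subset found so far,
        # which is safe because J cannot recover as more features are removed.
        best = {"value": float("-inf"), "subset": None}

        def recurse(subset, start):
            value = J(subset)
            if value <= best["value"]:
                return                          # prune this whole branch
            if len(subset) == m:
                best["value"], best["subset"] = value, subset
                return
            # removing only at positions start..m visits every subset exactly
            # once and avoids branches that can no longer reach size m
            for i in range(start, min(len(subset), m + 1)):
                recurse(subset[:i] + subset[i + 1:], i)

        recurse(list(features), 0)
        return best["subset"], best["value"]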