
Lecture

The lecture is

  • Wednesday, 08.30 - 10.00 am, H5
  • Friday, 12.15 - 1.45 pm, H5

Start: April 18, 2012

Please send me an email with your name, your field of study, whether you want to take the 5 ECTS or the 7.5 ECTS exam, and when you want to take it.

There will be no class on 5/30, 6/1, and 6/13. The time is compensated, since we have two 90-minute classes per week instead of 90+45 minutes (see the table "Progression of the course" below).

Contents

The lecture will be composed of the following topics:

  • Assessment and comparison of classifiers: ROC analysis (see the short sketch after this list)
  • Classification and regression trees (CARTs)
  • Graphical models: Bayesian networks, Markov random fields
  • Boosting revisited: probabilistic boosting trees and AdaBoost
  • Probability density estimation: Parzen window, k-nearest neighbour
  • Hidden Markov models
  • Graph cuts
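
As a brief illustration of the first topic (ROC analysis), the following Python sketch sweeps a decision threshold over classifier scores, records the resulting true-positive and false-positive rates, and computes the area under the ROC curve. The toy scores and labels are invented for illustration only and are not part of the course material.

import numpy as np

# Toy example (illustrative only): classifier scores and true binary labels.
scores = np.array([0.9, 0.8, 0.7, 0.55, 0.5, 0.4, 0.3, 0.2])
labels = np.array([1, 1, 0, 1, 0, 0, 1, 0])

# Sweep the decision threshold over all observed scores and record
# the (false positive rate, true positive rate) operating points.
thresholds = np.concatenate(([np.inf], np.sort(scores)[::-1]))
points = []
for t in thresholds:
    predicted = scores >= t
    tpr = np.sum(predicted & (labels == 1)) / np.sum(labels == 1)
    fpr = np.sum(predicted & (labels == 0)) / np.sum(labels == 0)
    points.append((fpr, tpr))

# Area under the ROC curve via the trapezoidal rule.
fprs, tprs = zip(*points)
auc = np.trapz(tprs, fprs)
print(points)
print("AUC:", auc)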

Slides

Topic | 1 Slide p.p. | 4 Slides p.p.
Progression of the course (18.4.-29.6.) | PA-TAB | -
Evaluation Measures | PA-01 | PA-01-4
Optimal Classifier | PA-02 | PA-02-4
Logistic Regression | PA-03 | PA-03-4
Mycin, CART | PA-04 | PA-04-4
Bayes Nets | PA-05 | PA-05-4
POMDP | POMDP | POMDP-4
Markov Random Fields | MRF | MRF-4
Hidden Markov Models | HMM-Theorie, HMM-Phonetik | HMM-Theorie-4, HMM-Phonetik-4
Language Models | LanguageModels | LanguageModels-4
Parzen Estimate | Parzen | Parzen-4

Additional Material

Paper | Topic | Sheet
Tom Fawcett: An introduction to ROC analysis. Pattern Recognition Letters, Vol. 27, Issue 8 (ROC Analysis in Pattern Recognition), pp. 861-874, June 2006 | ROC | PDF
John A. Swets, Robyn M. Dawes, and John Monahan: Better Decisions through Science. Scientific American Magazine, pp. 82-87, October 2000 | ROC | PDF
R. Kuhn: Keyword Classification Trees for Speech Understanding Systems. PhD thesis, McGill University, June 1993 | CART | PDF
Nöth, E.; De Mori, R.; Fischer, J.; Gebhard, A.; Harbeck, S.; Kompe, R.; Kuhn, R.; Niemann, H.; Mast, M.: An Integrated Model of Acoustics and Language Using Semantic Classification Trees. Proc. Int. Conf. on Acoustics, Speech and Signal Processing (ICASSP), Atlanta, vol. 1, pp. 419-422, 1996 | CART | PDF
Trevor Hastie, Robert Tibshirani, and Jerome Friedman: The Elements of Statistical Learning (Chapter 9: Additive Models, Trees, and Related Methods; Chapter 10: Boosting and Additive Trees). 2nd Edition, Springer Verlag | CART | PDF
Richard O. Duda, Peter E. Hart, and David G. Stork: Pattern Classification (Chapter 8: Non-metric Methods). 2nd Edition, Wiley Interscience | CART | PDF
Eugene Charniak: Bayesian Networks without Tears. AI Magazine, Vol. 12, No. 4, pp. 50-63, 1991 | Bayesian nets | PDF
Lehrstuhl für Mustererkennung, FAU, 2004 | Bayesian nets | PDF
Ralf Herbrich, Tom Minka, and Thore Graepel: TrueSkill(TM): A Bayesian Skill Rating System. Advances in Neural Information Processing Systems 20, pp. 569-576, 2007 | Bayesian Ranking | PDF
B. Thomson and S. Young: Bayesian update of dialogue state: A POMDP framework for spoken dialogue systems. Computer Speech and Language, Vol. 24, No. 4, pp. 562-588, 2010 | POMDP | PDF
S. Young: Still Talking to Machines (Cognitively Speaking). Interspeech 2010, Chiba, Japan, 2010 | POMDP | PDF
www.superlectures.com/interspeech2010/lecture.php | POMDP | -
C. M. Bishop: Pattern Recognition and Machine Learning (Chapter 8: Graphical Models). Springer, 2006, pp. 359-422 | Markov Random Fields | PDF
L. R. Rabiner and B. H. Juang: An Introduction to Hidden Markov Models. IEEE ASSP Magazine, Vol. 3, No. 1, pp. 4-16, January 1986 | HMM | PDF
Lawrence R. Rabiner: A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition. Proceedings of the IEEE, Vol. 77, No. 2, pp. 257-286, February 1989 | HMM | PDF
Richard O. Duda, Peter E. Hart, and David G. Stork: Pattern Classification (Chapter 4: Nonparametric Techniques). 2nd Edition, Wiley Interscience | Parzen | PDF

Lecture Slides

It is strongly recommended to attend the lectures and take personal notes. In addition, the covered topics should be complemented by personal study of the relevant technical literature and the publications announced in the lecture.