- Welcome to all students who have registered for this course.
Pattern recognition techniques are used to automatically classify physical objects (handwritten characters, tissue samples, faces) or abstract multidimensional patterns (n points in d dimensions) into a known or possibly unknown number of categories. The design of a pattern recognition system consists of the following main modules: (i) sensing, (ii) feature extraction, (iii) decision making, and (iv) performance evaluation. The availability of low-cost, high-resolution sensors (e.g., digital cameras, microphones, and scanners) and data sharing over the Internet have resulted in huge repositories of digitized documents (text, speech, image, and video). The need for efficient archiving and retrieval of this data has fostered the development of pattern recognition algorithms in new application domains (e.g., text, image, and video retrieval; bioinformatics; and face recognition).
A pattern recognition system can be designed based on a number of different approaches: (i) template matching, (ii) geometric (statistical) methods, (iii) structural (syntactic) methods, and (iv) neural (deep) networks. This course will present an introduction to statistical pattern classification. The fundamental background for the course is probability theory. The course will cover techniques for visualizing and analyzing multi-dimensional data along with algorithms for projection, dimensionality reduction, clustering, and classification. The course will present various approaches to classifier design so students can make judicious choices when confronted with real pattern recognition problems. The course is suitable for students in engineering, mathematics, and computer science, who have a basic background in calculus, linear algebra, and probability theory (as typically covered in an undergraduate program in any one of these fields), and who have some interest in exploring the field of pattern recognition.
- Introduction to Pattern Recognition, Feature Detection, Classification
- Review of Probability Theory, Conditional Probability and Bayes Rule
- Random Vectors, Expectation, Correlation, Covariance
- Review of Linear Algebra, Linear Transformations
- Decision Theory, ROC Curves, Likelihood Ratio Test
- Linear and Quadratic Discriminants, Fisher Discriminant
- Sufficient Statistics, Coping with Missing or Noisy Features
- Template-based Recognition, Feature Extraction
- Eigenvector and Multilinear Analysis
- Training Methods, Maximum Likelihood and Bayesian Parameter Estimation
- Linear Discriminant/Perceptron Learning, Optimization by Gradient Descent
- Support Vector Machines
- K-Nearest-Neighbor Classification
- Non-parametric Classification, Density Estimation, Parzen Estimation
- Unsupervised Learning, Clustering, Vector Quantization, K-means
- Mixture Modeling, Expectation-Maximization
- Hidden Markov Models, Viterbi Algorithm, Baum-Welch Algorithm
- Linear Dynamical Systems, Kalman Filtering
- Bayesian Networks
- Decision Trees, Multi-layer Perceptrons
- Reinforcement Learning with Human Interaction
- Genetic Algorithms
- Combination of Multiple Classifiers "Committee Machines"
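As a small taste of the decision-theory topics listed above (Bayes rule, likelihood ratio test), here is a minimal sketch of a two-class Bayes classifier for one-dimensional Gaussian class-conditional densities. The class means, variances, and priors are purely illustrative assumptions, not material taken from the course.

```python
import math

def gaussian_pdf(x, mean, std):
    """Evaluate the normal density N(mean, std^2) at x."""
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

def bayes_classify(x, params, priors):
    """MAP rule: pick the class maximizing prior * likelihood."""
    posteriors = {c: priors[c] * gaussian_pdf(x, m, s)
                  for c, (m, s) in params.items()}
    return max(posteriors, key=posteriors.get)

# Hypothetical classes: class 0 ~ N(0, 1), class 1 ~ N(3, 1), equal priors.
params = {0: (0.0, 1.0), 1: (3.0, 1.0)}
priors = {0: 0.5, 1: 0.5}

print(bayes_classify(0.5, params, priors))  # a point near class 0's mean
print(bayes_classify(2.8, params, priors))  # a point near class 1's mean
```

With equal priors this MAP rule reduces to the likelihood ratio test with threshold 1, which is one of the first decision rules derived in the course.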
All course announcements, lecture slides, assignments, and projects will be made available on Blackboard. Please check the site frequently and contact the instructor if you have any questions.
This class requires a preliminary background (i.e., an undergraduate-level understanding of) probability, statistics, and linear algebra. Basic knowledge of MATLAB and Python programming is essential.
Prof. Zhanpeng Jin
Department of Electrical and Computer Engineering
Email: zjin at binghamton dot edu (preferred)
- Lecture: Tuesday and Thursday, 7:35 PM - 9:00 PM @ ES-2324
- Office Hours: Thursday, 10:00 AM - 12:00 PM @ ES-2306
- (Required) Richard O. Duda, Peter E. Hart, and David G. Stork, Pattern Classification, 2nd edition, Wiley, 2001.
- (Optional) David G. Stork and Elad Yom-Tov, Computer Manual in MATLAB to Accompany Pattern Classification, Wiley, 2004.
- (Optional) Christopher M. Bishop, Pattern Recognition and Machine Learning, Springer, 2006.