Pattern Classification, Part 1

"This unique text/professional reference provides the information you need to choose the most appropriate method for a given class of problems, presenting an in-depth, systematic account of the major topics in pattern recognition today. A new edition of a classic work that helped define the field for over a quarter century, this practical book updates and expands the original work, focusing on pattern classification and the immense progress it has experienced in recent years." --Book jacket
From inside the book
Page 153
... Consider a small perturbation away from the optimal, w + Δw, and derive the solution condition of Eq. 104. 43. Consider multidiscriminant versions of Fisher's method for the case of c Gaussian distributions in d dimensions, each ...
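The second exercise here concerns the multidiscriminant extension of Fisher's method, which projects d-dimensional data onto at most c − 1 directions that maximize between-class scatter relative to within-class scatter. A minimal sketch of that projection, assuming generic NumPy conventions (the function name, synthetic data, and dimensions are illustrative, not taken from the book):

```python
import numpy as np

def fisher_multidiscriminant(X, y, n_components=None):
    """Multiclass Fisher projection: directions maximizing between-class
    scatter S_B relative to within-class scatter S_W (illustrative sketch)."""
    classes = np.unique(y)
    d = X.shape[1]
    mean_total = X.mean(axis=0)
    S_W = np.zeros((d, d))
    S_B = np.zeros((d, d))
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        S_W += (Xc - mc).T @ (Xc - mc)       # within-class scatter
        diff = (mc - mean_total).reshape(-1, 1)
        S_B += len(Xc) * (diff @ diff.T)     # between-class scatter
    # Leading eigenvectors of S_W^{-1} S_B span the discriminant subspace.
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(S_W, S_B))
    order = np.argsort(eigvals.real)[::-1]
    k = n_components if n_components is not None else len(classes) - 1
    return eigvecs.real[:, order[:k]]        # d x k projection matrix

# Toy usage: c = 3 Gaussian classes in d = 4 dimensions.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 1.0, size=(50, 4)) for m in (0.0, 3.0, 6.0)])
y = np.repeat([0, 1, 2], 50)
print(fisher_multidiscriminant(X, y).shape)  # (4, 2)
```

Since S_B has rank at most c − 1, only c − 1 of the eigenvectors carry discriminant information, which is why the multidiscriminant version yields at most c − 1 projection directions.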
Page 204
... Consider the simple nearest-neighbor editing algorithm (Algorithm 3). (a) Show by counterexample that this algorithm does not yield the minimum set of points. (Consider a problem where the points from each of two categories are ...
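The editing algorithm this exercise refers to prunes a training set while trying to preserve the nearest-neighbor decision rule. As a runnable illustration of the general idea, here is a sketch of Hart's condensing rule, a related greedy procedure rather than a transcription of the book's Algorithm 3; its greediness is exactly why part (a) asks for a counterexample to minimality:

```python
import numpy as np

def condense(X, y):
    """Hart-style condensed nearest-neighbor: greedily store only points
    that the current stored set misclassifies. (A sketch of the general
    editing idea, not the book's Algorithm 3 verbatim.)"""
    keep = [0]                                  # seed the store with one point
    changed = True
    while changed:
        changed = False
        for i in range(len(X)):
            if i in keep:
                continue
            # 1-NN label of X[i] using only the stored points
            dists = np.linalg.norm(X[keep] - X[i], axis=1)
            if y[keep][np.argmin(dists)] != y[i]:
                keep.append(i)                  # misclassified: must store it
                changed = True
    return np.array(keep)

# Toy usage: two interleaved 1-D categories. The edited set is consistent
# with the full training set but not guaranteed to be minimal.
X = np.array([[0.0], [1.0], [2.0], [3.0], [4.0], [5.0]])
y = np.array([0, 0, 1, 1, 0, 0])
print(condense(X, y))
```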
Page 271
... Consider two circular Gaussian distributions p(x | ωᵢ) ~ N(μᵢ, σ²I) and P(ωᵢ) for i = 1, 2, where I is the identity matrix and the other parameters can take on arbitrary values. Without performing any explicit calculations ...
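For this configuration, with equal spherical covariances, the minimum-error decision boundary can indeed be read off without explicit calculation: it is a hyperplane orthogonal to the line joining the means, shifted by the log-prior ratio. The standard result for this case, sketched from the usual normal-density discriminant with σ² the common variance:

```latex
% Discriminant functions for p(x | \omega_i) \sim N(\mu_i, \sigma^2 I):
g_i(\mathbf{x}) = -\frac{\lVert \mathbf{x} - \boldsymbol{\mu}_i \rVert^2}{2\sigma^2}
                  + \ln P(\omega_i), \qquad i = 1, 2.
% Setting g_1(x) = g_2(x) yields the hyperplane w^T (x - x_0) = 0, where
\mathbf{w} = \boldsymbol{\mu}_1 - \boldsymbol{\mu}_2, \qquad
\mathbf{x}_0 = \frac{1}{2}(\boldsymbol{\mu}_1 + \boldsymbol{\mu}_2)
  - \frac{\sigma^2}{\lVert \boldsymbol{\mu}_1 - \boldsymbol{\mu}_2 \rVert^2}
    \ln \frac{P(\omega_1)}{P(\omega_2)} \, (\boldsymbol{\mu}_1 - \boldsymbol{\mu}_2).
```

Equal priors place the boundary at the midpoint of the means; unequal priors slide it toward the less probable category.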
Contents
MAXIMUM-LIKELIHOOD AND BAYESIAN PARAMETER ESTIMATION | 84
NONPARAMETRIC TECHNIQUES | 161
LINEAR DISCRIMINANT FUNCTIONS | 215
Copyright
10 other sections not shown
Other editions
Computer Manual in MATLAB to accompany Pattern Classification, by David G. Stork and Elad Yom-Tov, 2004
Common terms and phrases
analysis, approach, assume, backpropagation, Bayes, Bayesian, bias, binary, Boltzmann, calculate, Chapter, cluster centers, component classifiers, Consider, convergence, corresponding, covariance matrix, criterion function, d-dimensional, data set, decision boundary, denote, derivation, discriminant function, distance, distribution, entropy, error rate, feature space, FIGURE, Gaussian, given, gradient descent, Hidden Markov Models, hidden units, independent, input, iteration, jackknife estimate, labeled, large number, learning algorithm, maximum-likelihood estimate, mean, methods, minimize, minimum, minimum description length, mixture density, nearest-neighbor, neural networks, node, nonlinear, normal, number of clusters, number of samples, obtain, optimal, output units, p(x|ω), parameters, pattern recognition, Perceptron, points, prior probabilities, probability density, problem, procedure, random variables, randomly, Section, sequence, shown, shows, simple, solution, split, statistical, statistically independent, string, Suppose, target, training data, training error, training patterns, training set, tree, two-category, unsupervised learning, variance, w₁, weight vector, x₁, zero