In a natural environment, high-dimensional measurement signals lying in an F-dimensional measurement space usually represent patterns residing in a much lower, D-dimensional subspace embedded in the ambient measurement space. Mixture subclass discriminant analysis (MSDA) and its link to a restricted Gaussian model are first presented, and then two further discriminant analysis (DA) methods, fractional-step MSDA (FSMSDA) and kernel MSDA (KMSDA), are proposed. Linking MSDA to an appropriate Gaussian model also allows the derivation of a new DA method under the expectation-maximization (EM) framework (EM-MSDA), which simultaneously derives the discriminant subspace and the maximum-likelihood parameter estimates.

Dimensionality reduction (DR) is an important component of statistical pattern classifiers: it helps to overcome estimation problems in noisy high-dimensional environments and thus often results in improved classifier accuracy as well as lower storage and processing-time requirements. A fundamental DR technique is linear discriminant analysis (LDA). Despite its elegant algebraic formulation, two important shortcomings of LDA restrict its use in real-world applications: a) the LDA criterion cannot be applied directly when the within-class scatter matrix Sw is rank-deficient, a situation that occurs frequently in applications involving small sample size (SSS) data; several methods have been proposed to deal with this problem, including PCA+LDA, MMC LDA, and dICA. b) LDA faces difficulties in deriving a discriminant subspace when the classes are not linearly separable (hereafter called the nonlinearity problem); this has mostly been addressed using kernel extensions of LDA, or methods that use local linear discriminant analyzers to learn the nonlinear data structure. In both cases the SSS problem remains, and to address it, solutions similar to those discussed above are exploited for the kernel-based and local LDA variants.
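As a concrete illustration of the SSS workaround named above, the following is a minimal sketch of a PCA+LDA pipeline: when the number of features exceeds the number of training samples, the within-class scatter matrix Sw is rank-deficient and the LDA criterion cannot be applied directly, so the data are first projected onto a low-dimensional PCA subspace where the scatter matrices become well-conditioned. The data dimensions, the synthetic dataset, and the scikit-learn-based pipeline are illustrative assumptions and are not taken from the MSDA paper itself.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Synthetic SSS data: 30 samples in a 200-dimensional measurement space,
# 3 classes, so Sw has rank at most 30 - 3 = 27, far below 200.
n_per_class, n_features, n_classes = 10, 200, 3
X = np.vstack([
    rng.normal(loc=c, scale=1.0, size=(n_per_class, n_features))
    for c in range(n_classes)
])
y = np.repeat(np.arange(n_classes), n_per_class)

# Step 1 (PCA): keep at most n_samples - n_classes components so that the
# within-class scatter computed in the reduced space is nonsingular.
# Step 2 (LDA): project onto the usual C - 1 discriminant directions.
pca_dims = X.shape[0] - n_classes
clf = make_pipeline(
    PCA(n_components=pca_dims),
    LinearDiscriminantAnalysis(n_components=n_classes - 1),
)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))

The same two-step structure is what the text refers to for the kernel-based and local LDA variants: the preliminary projection is applied in the kernel-induced or local space before the discriminant criterion is optimized.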