In the previous activity we used minimum distance classification as our predictor for classifying objects: essentially, we calculated the deviation of an object's features from the mean features of each class. Such classification fails, however, when the features overlap between classes. In this activity we use linear discriminant analysis (LDA) as the predictor. We limit the number of classes to two and use the same features as in the previous activity.
Consider the phase space plots of the features below.
Figure 1: Phase space plot using area and perimeter. The regions are well segmented and we expect that simple minimum distance classification should work well as demonstrated in activity 14.
Figure 2: Phase space plot using the red and green composition of the image. As can be seen, the two classes overlap in both coordinates, so classification using these features (independent variables) alone would fail.
LDA models each class as a linear superposition of the features and therefore assumes that the features are linearly independent of each other. The aim of LDA is to minimize the error of assigning an object to a class. The classification rule is that an object belongs to the class for which it has the highest conditional probability. This probability of membership can be computed using the discriminant function below, which was derived under the assumption that the conditional probability of each class is normally distributed [1-3]:

fi = μi C⁻¹ xkᵀ − (1/2) μi C⁻¹ μiᵀ + ln(Pi)

where
- μi: mean feature (row) vector of group i
- C: pooled covariance matrix, defined as C = (1/n) Σi ni ci, where ci is the covariance matrix of group i, n is the total number of samples, and ni is the number of samples in group i
- Pi: prior probability of group i, usually taken as the number of samples in the group over the total number of samples, ni/n
- xi: feature data of group i
- xk: feature (row) vector of test object k
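As a concrete sketch, the discriminant above can be computed with NumPy. The training matrices below are made-up placeholder values for illustration, not the actual feature measurements from this activity.

```python
import numpy as np

def lda_discriminants(groups, x):
    """Compute the LDA discriminant f_i of test feature vector x for each
    training group, following
    f_i = mu_i C^-1 x^T - (1/2) mu_i C^-1 mu_i^T + ln(P_i)."""
    n = sum(len(g) for g in groups)
    # pooled covariance: C = (1/n) * sum_i n_i * c_i  (c_i biased, i.e. 1/n_i)
    C = sum(len(g) * np.cov(g, rowvar=False, bias=True) for g in groups) / n
    C_inv = np.linalg.inv(C)
    f = []
    for g in groups:
        mu = g.mean(axis=0)
        prior = len(g) / n          # P_i = n_i / n
        f.append(mu @ C_inv @ x - 0.5 * mu @ C_inv @ mu + np.log(prior))
    return np.array(f)

# placeholder feature data (e.g. red and green fractions), 5 samples per class
class1 = np.array([[0.42, 0.30], [0.45, 0.28], [0.40, 0.33],
                   [0.44, 0.31], [0.43, 0.29]])
class2 = np.array([[0.30, 0.42], [0.28, 0.45], [0.33, 0.40],
                   [0.31, 0.44], [0.29, 0.43]])

x_test = np.array([0.41, 0.31])     # lies near the class 1 cluster
f = lda_discriminants([class1, class2], x_test)
print("f values:", f, "-> class", np.argmax(f) + 1)
```

The test object is assigned to whichever class gives the larger discriminant value, which is exactly the highest-conditional-probability rule stated above.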
Using the above formulas we compute fi for each test object and assign it to the class with the highest f, using the features shown in Figures 1 and 2.
Table 1: Calculated values of f1 and f2 for different test objects using area and perimeter as features.
The classification rate is 100%, as expected, since the phase space plot of area and perimeter is already well segmented, as shown in Figure 1.
Table 2: Calculated values of f1 and f2 for different test objects using red and green composition as features.
The classification failed for 1 of the 10 test objects, giving a 90% success rate. This error is most likely due to the limited size of the training set (5 samples per class); with so few samples, the assumption of normally distributed conditional probabilities may not be satisfied.
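The effect of overlapping features on small training sets can be illustrated with synthetic data; the distributions below are invented for illustration and are not the actual red/green measurements from this activity.

```python
import numpy as np

def classify(groups, x):
    # same rule as before: assign x to the class with the highest discriminant
    n = sum(len(g) for g in groups)
    C = sum(len(g) * np.cov(g, rowvar=False, bias=True) for g in groups) / n
    C_inv = np.linalg.inv(C)
    f = [g.mean(0) @ C_inv @ x - 0.5 * g.mean(0) @ C_inv @ g.mean(0)
         + np.log(len(g) / n) for g in groups]
    return int(np.argmax(f))

# two overlapping classes: means are close relative to the spread
rng = np.random.default_rng(0)
g1 = rng.normal([0.40, 0.35], 0.05, size=(5, 2))
g2 = rng.normal([0.35, 0.40], 0.05, size=(5, 2))

# resubstitution check: classify each training sample against its own model
labels = ([classify([g1, g2], x) for x in g1]
          + [classify([g1, g2], x) for x in g2])
truth = [0] * 5 + [1] * 5
rate = sum(p == t for p, t in zip(labels, truth)) / len(truth)
print(f"success rate: {rate:.0%}")
```

With only 5 samples per class and heavy overlap, the estimated means and pooled covariance are noisy, so the success rate can fall below 100% even on the training data itself, mirroring the 90% result in Table 2.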
In this activity, I give myself a grade of 10 for understanding the basics of LDA and performing the classifications well.
I would like to acknowledge Mark Jayson Villangca and Jay Samuel Combinido for useful discussions.
References:
[1] http://people.revoledu.com/kardi/tutorial/LDA/Numerical%20Example.html
[2] Applied Physics 186 Activity manual (c)2007 Maricor Soriano
[3] http://en.wikipedia.org/wiki/LDA