Monday, September 29, 2008

A19-Probabilistic Classification

I used data from the previous activity. There were again two classes (piattos and pillows), each with four training images. Only one test image was chosen for analysis. The feature set was the same as in the previous activity.

   RG          Area    Classification
0.3130230     5428.    piattos
0.2596636     6817.    piattos
0.2721711     5223.    piattos
0.3666842     5742.    piattos
0.8614653     5689.    pillows
0.8959559     4862.    pillows
0.9718898     3951.    pillows
0.9224472     3537.    pillows






We have our global feature set

x = [0.3130230 5428.; 0.2596636 6817.; 0.2721711 5223.; 0.3666842 5742.; 0.8614653 5689.; 0.8959559 4862.; 0.9718898 3951.; 0.9224472 3537.];


and its classification vector


y = [1;1;1;1;2;2;2;2];


We separate the global feature set into per-class feature sets


x1 = [0.3130230 5428.; 0.2596636 6817.; 0.2721711 5223.; 0.3666842 5742.];
x2 = [0.8614653 5689.; 0.8959559 4862.; 0.9718898 3951.; 0.9224472 3537.];


We get the mean for each feature in each group


u1 = [0.3028855 5802.5];


u2 = [0.9129396 4509.75];


and the global mean vector


u = [0.6079125 5156.125];
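These mean vectors can be double-checked with a short NumPy sketch (Python is used here only for illustration; the activity itself was done in Scilab, as shown in the Code section at the end):

```python
import numpy as np

# Training features from the table: column 0 = RG, column 1 = Area (pixels)
x1 = np.array([[0.3130230, 5428.], [0.2596636, 6817.],
               [0.2721711, 5223.], [0.3666842, 5742.]])  # piattos
x2 = np.array([[0.8614653, 5689.], [0.8959559, 4862.],
               [0.9718898, 3951.], [0.9224472, 3537.]])  # pillows

u1 = x1.mean(axis=0)                  # class mean for piattos
u2 = x2.mean(axis=0)                  # class mean for pillows
u = np.vstack((x1, x2)).mean(axis=0)  # global mean over all eight samples
```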


Then we get the mean-corrected data (xnaught1 and xnaught2, each row minus the global mean u) and the covariance matrix of each group (c1 and c2)


xnaught1 = [-0.2948895 271.875; -0.3482489 1660.875; -0.3357414 66.875; -0.2412283 585.875];

xnaught2 = [0.2535528 532.875; 0.2880434 -294.125; 0.3639773 -1205.125; 0.3145347 -1619.125];


c1 = [0.0947876 -205.58834; -205.58834 795035.89];

c2 = [0.0946674 -224.37948; -224.37948 1111089.3];


The pooled within-group covariance matrix C is then

C = [0.0947275 -214.98391; -214.98391 953062.61];
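These matrices can also be reproduced numerically. Below is a minimal NumPy sketch (Python for illustration only; note that, following the Scilab code at the end, each group is centered on the global mean u, and the pooled matrix is the size-weighted average of c1 and c2):

```python
import numpy as np

x1 = np.array([[0.3130230, 5428.], [0.2596636, 6817.],
               [0.2721711, 5223.], [0.3666842, 5742.]])  # piattos
x2 = np.array([[0.8614653, 5689.], [0.8959559, 4862.],
               [0.9718898, 3951.], [0.9224472, 3537.]])  # pillows

u = np.vstack((x1, x2)).mean(axis=0)  # global mean vector

# Mean-corrected data: subtract the GLOBAL mean from every row
xnaught1 = x1 - u
xnaught2 = x2 - u

# Group covariance matrices (divide by the group size, n = 4)
c1 = xnaught1.T @ xnaught1 / len(xnaught1)
c2 = xnaught2.T @ xnaught2 / len(xnaught2)

# Pooled within-group covariance; group sizes are equal, so this
# size-weighted average reduces to the plain average of c1 and c2
C = (4 * c1 + 4 * c2) / 8
```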


Its inverse is

inv(C) = [21.629463 0.0048790; 0.0048790 0.0000021];


The prior probability vector is p = [4/8; 4/8];


For each class i, the linear discriminant is f_i = u_i*inv(C)*x' - (u_i*inv(C)*u_i')/2 + ln(p_i), and each sample is assigned to the class with the larger discriminant value. Evaluated on the training set (with the area feature normalized by the maximum training area, as in the code below), the linear discriminant functions yield


    f1          f2       Class
40.193166   38.215655     1
57.712379   55.641361     1
35.908829   33.609495     1
46.444826   44.89887      1
62.954231   64.805784     2
52.618282   54.544249     2
42.555135   44.824398     2
35.055327   36.902364     2


Let our test image be that of a member of the piattos group (suppose we do not know this fact). Its feature vector, with the area normalized by the maximum training area as in the code below, is

test = [0.3322144 0.8215565];

Its f1 and f2 values are 43.269644 and 41.458361, respectively. Since f1 > f2, the test image is assigned to class 1. The plot of the linear discriminant functions confirms this: the test sample falls on the piattos side, so we can say it is a piattos chip.
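As a cross-check of the whole pipeline, here is a compact NumPy version of the computation (Python for illustration; the Scilab listing below is the code actually used; the area column is normalized by the maximum training area, 6817 px, as in that listing):

```python
import numpy as np

# Training data with Area normalized by its maximum (6817 px)
x = np.array([[0.3130230, 0.7962447], [0.2596636, 1.],
              [0.2721711, 0.7661728], [0.3666842, 0.8423060],
              [0.8614653, 0.8345313], [0.8959559, 0.7132170],
              [0.9718898, 0.5795805], [0.9224472, 0.5188499]])
y = np.array([1, 1, 1, 1, 2, 2, 2, 2])  # 1 = piattos, 2 = pillows

u = x.mean(axis=0)  # global mean
u1, u2 = x[y == 1].mean(axis=0), x[y == 2].mean(axis=0)

# Pooled within-group covariance (groups centered on the global mean)
x0 = x - u
C = sum(x0[y == k].T @ x0[y == k] for k in (1, 2)) / len(x)
Cinv = np.linalg.inv(C)
p = np.array([0.5, 0.5])  # equal priors: four training images per class

def f(ui, pi, v):
    """Linear discriminant: ui*Cinv*v' - (ui*Cinv*ui')/2 + ln(pi)."""
    return ui @ Cinv @ v - (ui @ Cinv @ ui) / 2 + np.log(pi)

# Discriminant values for the training set; argmax recovers the class
ftrain = np.array([[f(u1, p[0], xi), f(u2, p[1], xi)] for xi in x])
pred = np.argmax(ftrain, axis=1) + 1

# Classify the test image: assign it to the class with the larger f
test = np.array([0.3322144, 0.8215565])
f1t, f2t = f(u1, p[0], test), f(u2, p[1], test)
# f1t > f2t, so the test image is classified as a piattos chip
```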


Code:

// Global feature set: column 1 = RG, column 2 = Area normalized by its maximum
x = [0.313023 0.7962447; 0.2596636 1.; 0.2721711 0.7661728; 0.3666842 0.842306; 0.8614653 0.8345313; 0.8959559 0.7132170; 0.9718898 0.5795805; 0.9224472 0.5188499];
y = [1;1;1;1;2;2;2;2]; // class labels: 1 = piattos, 2 = pillows

// Per-class feature sets
x1 = [0.313023 0.7962447; 0.2596636 1.; 0.2721711 0.7661728; 0.3666842 0.842306];
x2 = [0.8614653 0.8345313; 0.8959559 0.7132170; 0.9718898 0.5795805; 0.9224472 0.5188499];

// Class means and global mean (row-wise means)
u1 = mean(x1,'r');
u2 = mean(x2,'r');
u = mean(x,'r');

// Mean-corrected data: subtract the global mean from every row
xnaught1 = [];
for i = 1:size(x1,1)
    xnaught1(i,:) = x1(i,:) - u;
end
xnaught2 = [];
for i = 1:size(x2,1)
    xnaught2(i,:) = x2(i,:) - u;
end

// Group covariance matrices and the pooled within-group covariance
c1 = xnaught1'*xnaught1/size(xnaught1,1);
c2 = xnaught2'*xnaught2/size(xnaught2,1);
C = (4*c1 + 4*c2)/8;
Cinv = inv(C);

// Equal prior probabilities: four training images per class
P = [4/8; 4/8];

// Linear discriminant values over the training set
f1 = u1*Cinv*x' - (u1*Cinv*u1')/2 + log(P(1));
f2 = u2*Cinv*x' - (u2*Cinv*u2')/2 + log(P(2));

// Classify the test image: assign it to the class with the larger f
test = [0.3322144 0.8215565];
f1test = u1*Cinv*test' - (u1*Cinv*u1')/2 + log(P(1));
f2test = u2*Cinv*test' - (u2*Cinv*u2')/2 + log(P(2));

SELF-GRADE: 9/10. I enjoyed the activity, but it took me a long time to finish.
