Tuesday, December 18, 2007

Monday Morning Algorithm 7: Diffusion Maps for Dimensionality Reduction and Manifold Parametrization.

There aren't many dimensionality reduction approaches that also try to provide a parametrization of the manifold. The diffusion map approach is one of them: it approximates the Laplace-Beltrami operator on the manifold, computed through the algorithm on page 33 of Stephane Lafon's dissertation [1]. Extensions can be found here and interactive animated examples can be found here. For today's algorithm, we will use images of Buzz Lightyear rotating around a single axis. The file is here. This is obviously unoptimized code; for better implementations of diffusion maps and other dimensionality reduction techniques, this site is a good choice.

What do we do here? We first compute the pairwise distance between frames, then plug it into a kernel that eventually provides an idea of how connected each frame is to every other frame. Using a thermal heat transfer analogy, each frame is a node, and the new distance from any one node to another is inversely proportional to the number of nodes between them. The metric induced by this new distance enables a new parametrization of the manifold. In our case, the manifold is made of one object rotating around one axis, so the parametrization of the underlying manifold should yield a one-parameter family, since only one dimension (the rotation angle) varies between frames. In order to show this, we use the first nontrivial eigenfunction of the Laplace-Beltrami operator and sort each frame according to its projection on that eigenfunction. Sorting along that dimension orders the frames so that Buzz Lightyear rotates continuously from one end to the other.

clear
% Stephane Lafon, "Diffusion maps and geometric harmonics",
% Ph.D. dissertation, Yale University (May 2004)
% written by Cable Kurwitz and Igor Carron
%
% get all the files in jpg
d = dir('buzz*.jpg')
ntotal = size(d,1)
for i=1:ntotal
d(i).name
end
Img = imread(d(1).name);
s=size(Img,1);
w=size(Img,2);
hd=size(Img,3);
swhd=s*w*hd;
%reshape these images into one vector each
for ii=1:ntotal
Img = imread(d(ii).name);
b1(:,ii) = double(reshape(Img, swhd,1));
end
%
% Compressed sensing part
%sa=randn(20,swhd);
%b=sa*b1;
b=b1;
%
%
for i=1:ntotal
for j=1:ntotal
D1(i,j)=norm(b(:,i)-b(:,j),2);
end
end
%
n1=ntotal;
D_sum=0;
for l = 1:n1
D_sum = D_sum + min(nonzeros(D1(l,:)));
end
epsilon = D_sum/n1/10; %
%
% Step 1
%
kernel_1 = exp(-(D1./epsilon).^1);
%
% Step 2
%
one_V = ones(1,n1);
%
% Step 3
%
p = kernel_1*one_V';
kernel_2 = kernel_1./((p*p').^1);
%
% Step 4
%
v = sqrt(kernel_2*one_V');
%
% Step 5
%
K = kernel_2./(v*v');
%
% Step 6
%
%% Singular Value Decomposition
[U S V] = svd(K);
% Normalization
for i=1:ntotal-1
phi(:,i) = U(:,i)./U(:,1); %
end
eig_funct = phi(:,1:6);
%
% The eigenvalues of the Laplace-Beltrami operator are approximated by
% those of K, and its eigenfunctions phi_i by phi(:,i) = U(:,i)./U(:,1)
%
vv=eig(K);

% Initial order of Buzz images
figure(1)
for i=1:ntotal
subplot(3,4,i)
imshow(d(i).name)
end

X = vv(ntotal-1).*phi(:,2);
% sort the Buzz images using the first nontrivial
% eigenfunction of the Laplace-Beltrami operator
[Y,I] = sort(X);
% d(I).name
%
figure(2)
for i=1:ntotal
subplot(3,4,i)
imshow(d(I(i)).name)
end
%
figure(3)
plot(vv(ntotal-1).*phi(:,2),'.')
title(' Buzz Lightyear')
xlabel('Index')
ylabel('Lambda_1 phi_1')
%
figure(4)
plot(vv(ntotal-1).*phi(:,2),vv(ntotal-2).*phi(:,3),'.')
title(' Buzz Lightyear')
xlabel('Lambda_1 phi_1')
ylabel('Lambda_2 phi_2')
%
figure(5)
plot(eig(K),'*')
xlabel('Index of eigenvalue')
ylabel('Value of eigenvalue')
title(' Eigenvalues of K')
%
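For readers who prefer Python, the same pipeline can be sketched with NumPy. This is a hypothetical port of the steps above (the function name `diffusion_map` is ours, and it assumes each frame has already been vectorized into a row of `X`), not Cable and Igor's original code:

```python
import numpy as np

def diffusion_map(X, n_coords=2):
    """Diffusion-map embedding mirroring the normalization used above.
    X: (n_samples, n_features) array, one vectorized frame per row.
    Returns (n_samples, n_coords) diffusion coordinates."""
    n = X.shape[0]
    # pairwise Euclidean distances between frames
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    # ad-hoc scale: one tenth of the mean nearest-neighbor distance
    eps = np.mean([D[i][D[i] > 0].min() for i in range(n)]) / 10.0
    # Step 1: exponential kernel
    K1 = np.exp(-D / eps)
    # Steps 2-3: density normalization (alpha = 1)
    p = K1.sum(axis=1)
    K2 = K1 / np.outer(p, p)
    # Steps 4-5: symmetric normalization; K then satisfies K v = v
    v = np.sqrt(K2.sum(axis=1))
    K = K2 / np.outer(v, v)
    # Step 6: spectral decomposition (singular values are descending)
    U, S, _ = np.linalg.svd(K)
    # eigenfunctions, normalized by the stationary (top) eigenvector
    phi = U / U[:, [0]]
    # diffusion coordinates lambda_i * phi_i, skipping the trivial phi_0
    return S[1:n_coords + 1] * phi[:, 1:n_coords + 1]
```

Sorting the frames by the first returned coordinate reproduces the reordering done with `sort(X)` in the MATLAB listing.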


Please note the ability to test the same algorithm by first reducing the frames through random projections (one needs to remove the % signs in the "Compressed sensing part" above). Since one would obviously need a more robust metric than L2 in the first place, one could foresee the use of a specifically designed metric (Improving Embeddings by Flexible Exploitation of Side Information) on top of the reduction operated by the random projections. Such a specifically designed metric would alleviate the need to come up with the ad-hoc epsilon of this method.
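The random-projection reduction mentioned above can be sketched as follows (a hypothetical Python/NumPy version of the commented-out `sa*b1` lines; by the Johnson-Lindenstrauss lemma, a scaled Gaussian random matrix approximately preserves pairwise L2 distances, so the diffusion-map kernel is barely affected):

```python
import numpy as np

def random_project(frames, k=20, seed=0):
    """Reduce each vectorized frame to k dimensions with a Gaussian
    random matrix. frames: (n_pixels, n_frames), matching the b1
    layout in the MATLAB code above. Returns a (k, n_frames) array."""
    rng = np.random.default_rng(seed)
    n_pixels = frames.shape[0]
    # scaled so pairwise L2 distances are approximately preserved
    A = rng.standard_normal((k, n_pixels)) / np.sqrt(k)
    return A @ frames  # k numbers per frame instead of n_pixels
```

The projected matrix can then be fed to the distance computation in place of the full-resolution frames, which makes the n-by-n distance loop much cheaper.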

Diffusion maps have been taken to another level by Mauro Maggioni with diffusion wavelets, and by Sridhar Mahadevan with Value Function Approximation with Diffusion Wavelets and Laplacian Eigenfunctions.


Reference: [1] S. Lafon, “Diffusion maps and geometric harmonics”, Ph.D. dissertation, Yale University (May 2004)

1 comment:

  1. Hi,

    I read your article and it is useful to me. But I need to implement an algorithm to find the diffusion distance. If you could help me with this, I would be really thankful to you.

    http://cs.ucf.edu/~liujg/dm.html#framework

    In the above, how do the eigenvectors of 'P' correspond to the features x1, x2, x3, ..., xn? As I see it, we can't say that a given eigenvector corresponds to an exact feature like x1. Here they write psi(x1) for feature x1; how can I obtain it like this?
    Thanks in advance,

    --
    Thanks
    naidu