Image Recognition
Subspace Methods and Appearance-based Image Recognition
Francesc J. Ferri
Dept. d’Informàtica.
Universitat de València
Image Representation
We will assume that images or image parts have been processed and are to
be used (as a whole) as inputs for a given pattern recognition system.
These images can be used in their raw form or after a convenient change of
representation (global transform, normalization, global descriptors of color,
texture, etc.).
Representation spaces
One of the most important facts is that each image object is a point in a
very high dimensional space.
In the case of aligned and normalized gray-level images of faces, the
dimension of this space equals the number of pixels.
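The representation-space idea above can be sketched in a few lines. The 64x64 image size below is a hypothetical choice, not one fixed by the slides:

```python
import numpy as np

# A hypothetical 64x64 aligned, normalized gray-level face image:
# each pixel is one coordinate, so the image is a point in R^4096.
h, w = 64, 64
image = np.random.default_rng(0).random((h, w))

x = image.reshape(-1)   # flatten the image into a single vector
print(x.shape)          # one point in a 4096-dimensional space
```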
Subspace-based recognition
Discriminative methods
are based on increasing the dissimilarity or the separation among the
different categories of the considered objects.
Linear methods
consist of applying a linear transformation on the data followed by a trivial
selection of dimensions (linear projection).
Taxonomy
Feature Extraction
- linear
  - unsupervised (PCA, ICA, NMF)
  - supervised (LDA, Null Space)
- nonlinear
  - nonparametric (NDA, MDS, Sammon)
  - kernel-based (KPCA, KLDA, KICA)
  - manifold-based (Isomap, LLE)
Let X = (x1 · · · xn) be the data matrix whose columns are the samples.

Let S_x = Σ_k (x_k − µ)(x_k − µ)^T be the scatter matrix of X.

Each sample is projected as y = A^T x, where A diagonalizes the scatter matrix:

    A^T S_x A = diag(λ1, . . . , λD)

The whole data set is projected as

    (y1 · · · yn) = Y = A^T X = A^T (x1 · · · xn)

and the expected reconstruction error equals the sum of the discarded eigenvalues:

    E[ ||x − A A^T x||² ] = Σ_discarded λ_k
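A minimal sketch of this PCA construction, with samples as columns as in the slides (the data dimensions and the retained dimension d = 2 are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 200))        # D = 5 dims, n = 200 samples (columns)
mu = X.mean(axis=1, keepdims=True)
Xc = X - mu                          # centered data

Sx = Xc @ Xc.T                       # scatter matrix sum_k (x_k - mu)(x_k - mu)^T
lam, V = np.linalg.eigh(Sx)          # eigenanalysis (eigh returns ascending order)
lam, V = lam[::-1], V[:, ::-1]       # sort eigenvalues/eigenvectors descending

d = 2
A = V[:, :d]                         # keep the d leading eigenvectors
Y = A.T @ Xc                         # projected data  Y = A^T X

# A^T Sx A is diagonal with the retained eigenvalues on the diagonal
assert np.allclose(A.T @ Sx @ A, np.diag(lam[:d]))

# total squared reconstruction error = sum of the discarded eigenvalues
err = np.sum((Xc - A @ A.T @ Xc) ** 2)
assert np.isclose(err, lam[d:].sum())
```

Note that the error identity is checked on the centered data, for which the scatter-matrix formulation above holds exactly.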
Best 2D projection

[Figure: captured variance vs. number of retained dimensions (left), and the best 2D projection of the data (right).]
Fisherfaces
Let S_W = Σ_{i=1}^{c} S_i be the within-class scatter matrix (S_i is the i-th class
scatter matrix).

And let S_B = Σ_{i=1}^{c} n_i (µ_i − µ)(µ_i − µ)^T be the between-class scatter matrix
(S_x = S_W + S_B).
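The decomposition S_x = S_W + S_B can be verified numerically on toy labeled data (three classes of 30 samples each is an arbitrary choice for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(3, 90))                     # columns are samples
y = np.repeat([0, 1, 2], 30)                     # class labels, c = 3 classes

mu = X.mean(axis=1, keepdims=True)               # global mean
Sx = (X - mu) @ (X - mu).T                       # total scatter matrix

SW = np.zeros((3, 3))                            # within-class scatter
SB = np.zeros((3, 3))                            # between-class scatter
for c in np.unique(y):
    Xi = X[:, y == c]
    mi = Xi.mean(axis=1, keepdims=True)          # class mean mu_i
    SW += (Xi - mi) @ (Xi - mi).T                # i-th class scatter S_i
    SB += Xi.shape[1] * (mi - mu) @ (mi - mu).T  # n_i (mu_i - mu)(mu_i - mu)^T

assert np.allclose(Sx, SW + SB)                  # Sx = SW + SB
```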
Nonlinear extensions
PCA
Given X and its covariance matrix, Sx , solve λv = Sx v (eigenanalysis).
Then construct A = (v1 , . . . , vd ) and project data as AT x.
A linearization mapping
Let φ be a mapping into a high dimensional space. Let Sφ be the
covariance matrix corresponding to φ(X ). Then we need to solve
λw = Sφ w .
As w ∈ span(φ(X)):

    λ φ(x_i) · w = φ(x_i) · (S_φ w)   ∀i

    ∃ α_i : w = Σ_i α_i φ(x_i)

and

    w_k · φ(x) = Σ_i α_i φ(x_i) · φ(x)
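A kernel PCA sketch along these lines: everything is expressed through the α coefficients and dot products φ(x_i) · φ(x_j), collected in a kernel matrix. The Gaussian kernel below is our own choice for illustration; the slides only assume some mapping φ:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(2, 50))                 # columns are samples
n = X.shape[1]

# K_ij = phi(x_i) . phi(x_j), here via a Gaussian kernel (our assumption)
sq = np.sum((X[:, :, None] - X[:, None, :]) ** 2, axis=0)
K = np.exp(-sq / 2.0)

# center phi(X) implicitly in feature space
J = np.eye(n) - np.ones((n, n)) / n
Kc = J @ K @ J

lam, alpha = np.linalg.eigh(Kc)              # eigenvectors give the alphas
lam, alpha = lam[::-1], alpha[:, ::-1]       # sort descending

# projection of sample x_j onto eigenvector w_k:
# w_k . phi(x_j) = sum_i alpha_ik  phi(x_i) . phi(x_j) = (Kc @ alpha_k)_j
Y = (Kc @ alpha[:, :2]).T                    # first two kernel components
```

The key point mirrors the derivation: w is never formed explicitly, only kernel evaluations and the α vectors are needed.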