This paper collects some ideas targeted at advancing our understanding of the feature spaces associated with support vector (SV) kernel functions. We first discuss the geometry of feature space. In particular, we review what is known about the shape of the image of input space under the feature space map, and how this influences the capacity of SV methods. Following this, we describe how the metric governing the intrinsic geometry of the mapped surface can be computed in terms of the kernel, using the example of the class of inhomogeneous polynomial kernels, which are often used in SV pattern recognition. We then discuss the connection between feature space and input space by dealing with the question of how one can, given some vector in feature space, find a preimage (exact or approximate) in input space. We describe algorithms to tackle this issue, and show their utility in two applications of kernel methods. First, we use them to reduce the computational complexity of SV decision functions; second, we combine them with the kernel PCA algorithm, thereby constructing a nonlinear statistical denoising technique which is shown to perform well on real-world data.
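To make two of the ingredients mentioned above concrete, the following is a minimal NumPy sketch (not the paper's implementation) of the inhomogeneous polynomial kernel k(x, y) = (⟨x, y⟩ + c)^d and of kernel PCA carried out via the eigendecomposition of the centered kernel matrix. The function names, the toy data, and the parameter values (degree 2, offset c = 1) are illustrative assumptions.

```python
import numpy as np

def poly_kernel(X, Y, degree=2, c=1.0):
    """Inhomogeneous polynomial kernel k(x, y) = (<x, y> + c)^d.
    degree and c are illustrative choices, not values from the paper."""
    return (X @ Y.T + c) ** degree

def kernel_pca(K, n_components):
    """Project training points onto the leading kernel principal components.
    K is the (uncentered) kernel matrix of the training data."""
    n = K.shape[0]
    one_n = np.ones((n, n)) / n
    # center the data implicitly in feature space
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    eigvals, eigvecs = np.linalg.eigh(Kc)       # ascending eigenvalues
    idx = np.argsort(eigvals)[::-1][:n_components]
    eigvals, eigvecs = eigvals[idx], eigvecs[:, idx]
    # scale eigenvectors so the feature-space directions have unit norm
    alphas = eigvecs / np.sqrt(np.maximum(eigvals, 1e-12))
    return Kc @ alphas                          # projections of training points

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))                    # toy data, 20 points in R^3
K = poly_kernel(X, X, degree=2, c=1.0)
Z = kernel_pca(K, n_components=2)
print(Z.shape)  # (20, 2)
```

A denoising scheme of the kind described in the abstract would additionally map such projections back to input space by solving the (approximate) preimage problem; that step is omitted here.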