The Kantorovich-Rubinstein-Wasserstein metric defines the distance between
two probability measures $\mu$ and $\nu$ on $\mathbb{R}^{d+1}$ by computing
the cheapest way to transport the mass of $\mu$ onto $\nu$, where the cost
per unit mass transported is a given function $c(x, y)$ on $\mathbb{R}^{2d+2}$.
Motivated by applications to shape recognition, we analyze this transportation
problem with the cost $c(x, y) = |x - y|^2$ and measures supported on two
curves in the plane, or more generally on the boundaries of two domains
$\Omega, \Lambda \subset \mathbb{R}^{d+1}$.
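For orientation, the quadratic-cost problem can be written in the standard
Kantorovich form, where $\Gamma(\mu, \nu)$ denotes the set of joint measures
on $\mathbb{R}^{2d+2}$ with marginals $\mu$ and $\nu$:
\[
W_2^2(\mu, \nu) \;=\; \inf_{\gamma \in \Gamma(\mu, \nu)}
\int_{\mathbb{R}^{d+1} \times \mathbb{R}^{d+1}} |x - y|^2 \, d\gamma(x, y).
\]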
Unlike the theory for measures that are absolutely continuous with respect
to Lebesgue, it turns out not to be the case that $\mu$-a.e.
$x \in \partial\Omega$ is transported to a single image
$y \in \partial\Lambda$; however, we show that the images of $x$ are almost
surely collinear and parallel the normal to $\partial\Omega$ at $x$.
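One possible formalization of this statement, in the notation above (with
$\gamma$ an optimal measure, $\operatorname{spt}\gamma$ its support, and
$n(x)$ the normal to $\partial\Omega$ at $x$): for $\mu$-a.e.
$x \in \partial\Omega$,
\[
\{\, y \in \partial\Lambda : (x, y) \in \operatorname{spt}\gamma \,\}
\subset \ell_x
\quad \text{for some line } \ell_x \text{ parallel to } n(x).
\]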
If either domain is strictly convex, we deduce that the solution to the
optimization problem is unique. When both domains are uniformly convex, we
prove a regularity result showing that the images of $x \in \partial\Omega$
are always collinear, and both images depend on $x$ in a continuous and
(continuously) invertible way. This produces some unusual extremal doubly
stochastic measures.