A simple method for extracting emotion from a human face, as a form of nonverbal communication, was developed to cope with and optimize mobile communication in a globalized and diversified society. A cartoon-face-based model was developed and used to evaluate the emotional content of real faces. After a pilot survey, basic rules were defined and student subjects were asked to express emotion using the cartoon face. Their face samples were then analyzed using principal component analysis and the Mahalanobis distance method. Feature parameters considered to be related to emotions were extracted, and new cartoon faces based on these parameters were generated. The subjects again evaluated the emotion of these cartoon faces, and we confirmed that the parameters were suitable. To verify how these parameters could be applied to real faces, we asked subjects to express the same emotions, which were then captured electronically. Simple image processing techniques were also developed to extract these features from real faces, and we compared them with the cartoon face parameters. The cartoon face demonstrates that emotions can be expressed using very small amounts of information. The results show that real and cartoon faces correspond to each other, and that emotion could be extracted from both still and dynamic real face images using these cartoon-based features.
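The analysis pipeline described above (principal component analysis followed by Mahalanobis distance) can be sketched as follows. This is a minimal illustration only: the number of samples, the six feature parameters (e.g. eyebrow slope, mouth curvature), and the random values are assumptions for demonstration, not the paper's actual data.

```python
import numpy as np

# Hypothetical data: each row is one face sample described by six
# cartoon-face feature parameters (values are synthetic).
rng = np.random.default_rng(0)
samples = rng.normal(size=(50, 6))

# Principal component analysis via SVD of the centered data matrix.
mean = samples.mean(axis=0)
centered = samples - mean
_, _, vt = np.linalg.svd(centered, full_matrices=False)
components = vt[:2]                # keep the two leading components
scores = centered @ components.T   # project each sample onto them

# Mahalanobis distance of a new face's parameters to the sample set,
# using the covariance estimated from the training samples.
cov = np.cov(centered, rowvar=False)
cov_inv = np.linalg.inv(cov)
new_face = rng.normal(size=6)
diff = new_face - mean
d_mahalanobis = float(np.sqrt(diff @ cov_inv @ diff))
```

In a setup like this, samples whose Mahalanobis distance to an emotion class is small would be assigned to that class; the paper's actual classification rules may differ.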