An autonomous robot engaged in long and complex missions should be able to
generate, update, and process its own plans of action. From this perspective,
it is not plausible that the meaning of the representations used by the robot
is given from outside the system itself. Rather, the meaning of internal
symbols must be firmly anchored to the world through the perceptual abilities
and the overall activities of the robot. Starting from these premises, in this
paper we present an approach to action representation based on a "conceptual"
level of representation, which acts as an intermediate level between symbols
and data coming from sensors. Symbolic representations are interpreted by
mapping them onto the conceptual level through a mapping mechanism based on
artificial neural networks. Examples of the proposed framework are reported,
based on experiments performed on an RWI-B12 autonomous robot.
(C) 2001 Elsevier Science B.V. All rights reserved.