Human-agent interaction is a growing area of research, with many approaches that address significantly different aspects of agent social intelligence. In this paper, we focus on a robotic domain in which a human acts both as a teacher and as a collaborator to a mobile robot. First, we present an approach that allows a robot to learn task representations from its own experiences of interacting with a human. While most approaches to learning from demonstration have focused on acquiring policies (i.e., collections of reactive rules), we demonstrate a mechanism that constructs high-level task representations based on the robot's underlying capabilities. Second, we describe a generalization of the framework that allows a robot to interact with humans in order to handle unexpected situations that can occur during its task execution. Without using explicit communication, the robot is able to engage a human to aid it during certain parts of task execution. We demonstrate our concepts with a mobile robot learning various tasks from a human and, when needed, interacting with a human to get help performing them.