AN INTELLIGENT SPACE ROBOT FOR CREW HELP AND CREW AND EQUIPMENT RETRIEVAL

Citation
J.D. Erickson et al., AN INTELLIGENT SPACE ROBOT FOR CREW HELP AND CREW AND EQUIPMENT RETRIEVAL, Applied Intelligence, 5(1), 1995, pp. 7-39
Citations number
88
Categorie Soggetti
Computer Sciences, Special Topics; Computer Science, Artificial Intelligence
Journal title
Applied Intelligence
ISSN journal
0924669X
Volume
5
Issue
1
Year of publication
1995
Pages
7 - 39
Database
ISI
SICI code
0924-669X(1995)5:1<7:AISRFC>2.0.ZU;2-5
Abstract
This paper describes the development status of a prototype supervised intelligent robot for space application for purposes of (1) helping the crew of a spacecraft such as the Space Station with various tasks such as holding objects and retrieving/replacing tools and other objects from/into storage, and for purposes of (2) retrieving detached objects, such as equipment or crew, that have become separated from their spacecraft. In addition to this set of tasks in this low Earth orbiting spacecraft environment, it is argued that certain aspects of the technology can be viewed as generic in approach, thereby offering insight into intelligent robots for other tasks and environments. Some candidate requirements for the space applications are presented which will be refined by the results of the prototype development and evaluation testing. Our development approach is described, including space simulation environments used in developmental testing. Candidate software architectures and their key technical issues which enable real work in real environments to be accomplished safely and robustly are addressed. Results of computer simulations of retrieving detached objects, including the situated reasoning/reaction plan approach used, are presented, as well as the results of an air bearing floor simulation of retrieving detached objects. Also described are characterization results on the usable reduced gravity environment in an aircraft flying parabolas (to simulate weightlessness) and results on hardware performance there. These results show it is feasible to use that environment for evaluative testing of dexterous grasping based on real-time vision of freely rotating and translating objects.
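To make the situated reasoning/reaction plan idea mentioned above concrete, the following is a minimal, hypothetical sketch, not taken from the paper: a loop that repeatedly senses the state of a free-floating object and maps that situation directly to a reaction (maneuver toward the object, then grasp when close and slow). All names, gains, and the 1-D dynamics are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class State:
    position: float   # separation between robot gripper and object (m), hypothetical 1-D model
    velocity: float   # rate of change of separation (m/s)

def sense(true_state: State) -> State:
    """Stand-in for real-time vision of the drifting object; perfect perception assumed."""
    return true_state

def react(observed: State) -> float:
    """Situated reaction: map the current situation directly to a commanded acceleration."""
    if abs(observed.position) < 0.05 and abs(observed.velocity) < 0.01:
        return 0.0                                  # close and slow enough: attempt grasp
    return -0.5 * observed.position - 1.0 * observed.velocity   # illustrative PD-style reaction

def simulate(steps: int = 200, dt: float = 0.1) -> State:
    """Toy retrieval scenario: object initially 5 m away and drifting further."""
    state = State(position=5.0, velocity=0.2)
    for _ in range(steps):
        accel = react(sense(state))
        state.velocity += accel * dt
        state.position += state.velocity * dt
    return state

if __name__ == "__main__":
    final = simulate()
    print(f"final separation {final.position:.3f} m, closing rate {final.velocity:.3f} m/s")

The point of the sketch is only the control structure: sensing and reaction are interleaved on every cycle rather than executing a precomputed plan open-loop, which is the flavor of behavior the abstract attributes to the situated reasoning/reaction approach.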