Future generations of space robots will replace astronauts in deep space missions and routine operations. They will use tools and perform assembly, disassembly, and handling tasks for maintenance purposes. A key feature of autonomous, task-level commandable robots is a valid and complete representation of the application's task space, used to plan and optimize a rough action sequence in response to a sensorially classified situation. This paper presents such a representation for a robot-based automated material science experiment set-up and proposes an analysis method by which a valid and complete task space model can be obtained. Results of practical experiments with a terrestrial laboratory mock-up using the novel representation scheme are presented.