Integrating optical force sensing with visual servoing for microassembly

Citation
Y. Zhou et al., Integrating optical force sensing with visual servoing for microassembly, J INTEL ROB, 28(3), 2000, pp. 259–276
Citations number
42
Subject categories
AI Robotics and Automatic Control
Journal title
JOURNAL OF INTELLIGENT & ROBOTIC SYSTEMS
ISSN journal
0921-0296
Volume
28
Issue
3
Year of publication
2000
Pages
259–276
Database
ISI
SICI code
0921-0296(200007)28:3<259:IOFSWV>2.0.ZU;2-P
Abstract
For microassembly tasks, uncertainty exists at many levels. Single static sensing configurations are therefore unable to provide feedback with the necessary range and resolution for accomplishing many desired tasks. In this paper we present experimental results that investigate the integration of two disparate sensing modalities, force and vision, for sensor-based microassembly. By integrating these sensing modes, we are able to provide feedback in a task-oriented frame of reference over a broad range of motion with extremely high precision. An optical microscope is used to provide visual feedback down to micron resolutions, while an optical beam deflection technique (based on a modified atomic force microscope) is used to provide nanonewton-level force feedback or nanometric-level position feedback. Visually servoed motion at speeds of up to 2 mm/s with a repeatability of 0.17 µm is achieved with vision alone. The optical beam deflection sensor complements the visual feedback by providing positional feedback with a repeatability of a few nanometers. Based on the principles of optical beam deflection, this is equivalent to force measurements on the order of a nanonewton. The value of integrating these two disparate sensing modalities is demonstrated during controlled micropart impact experiments. These results demonstrate micropart approach velocities of 80 µm/s with impact forces of 9 nN and final contact forces of 2 nN. Within our microassembly system this level of performance cannot be achieved using either sensing modality alone. This research will aid the development of complex hybrid MEMS devices in two ways: by enabling the microassembly of more complex MEMS prototypes, and by supporting the development of automatic assembly machines for assembling and packaging future MEMS devices that require increasingly complex assembly strategies.
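The abstract equates nanometer-scale deflection repeatability with nanonewton-scale force resolution. It does not give the conversion, but AFM-style optical beam deflection sensors conventionally model the cantilever as a linear spring (Hooke's law, F = k·δ). A minimal sketch under that assumption; the stiffness and deflection values below are illustrative and not taken from the paper:

```python
def deflection_to_force(deflection_m: float, stiffness_n_per_m: float) -> float:
    """Estimate tip force from cantilever deflection via Hooke's law (F = k * delta).

    Assumes a linear cantilever spring model, as is conventional for
    AFM-style optical beam deflection sensors; the paper's actual
    calibration is not stated in the abstract.
    """
    return stiffness_n_per_m * deflection_m


# Illustrative values only: a soft cantilever of stiffness 0.1 N/m reading a
# 20 nm deflection corresponds to a 2 nN force, matching the scale of the
# final contact forces reported in the abstract.
force_newtons = deflection_to_force(20e-9, 0.1)
print(f"{force_newtons * 1e9:.1f} nN")  # → 2.0 nN
```

With stiffnesses in this range, a few nanometers of positional repeatability maps to sub-nanonewton force resolution, which is consistent with the abstract's claim.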