C.S. Shieh et al., Digital modeling and hybrid control of sampled-data uncertain system with input time delay using the law of mean, Applied Mathematical Modelling, 23(2), 1999, pp. 131-152
This paper utilizes an interval Padé approximation method together with interval arithmetic operations to convert a continuous-time uncertain system with input time delay into an equivalent discrete-time interval model, and transforms the robust control law designed for the continuous-time uncertain system with input time delay into an equivalent control law for the sampled-data uncertain system with input time delay. The developed discrete-time interval model tightly encloses the exact discrete-time uncertain system with input time delay. Based on the law of mean and inclusion theory, a new perturbed digital control law for the input time-delay sampled-data uncertain system is presented, so that the states of the digitally controlled sampled-data uncertain system closely match those of the originally well-designed continuous-time uncertain system. (C) 1999 Elsevier Science Inc. All rights reserved.
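The abstract gives no formulas, so the following Python sketch only illustrates the general idea of building a discrete-time interval model that encloses every admissible plant: the uncertain matrices A and B are held as interval matrices, and the matrix exponential needed for discretization is evaluated with interval arithmetic. The function names, the (lower, upper) representation, and the use of a truncated power series in place of the paper's interval Padé construction are illustrative assumptions, not the authors' method; truncation and rounding errors are not bounded here, and the input delay is not split across sampling instants as a full sampled-data model would require.

import numpy as np

def iv_matmul(A, B):
    """Interval matrix product. A and B are (lower, upper) pairs of arrays.
    Each scalar product A[i,k]*B[k,j] is enclosed by the min/max of the four
    endpoint products; summing these per-entry intervals encloses the true
    product for every point matrix inside A and B (rounding ignored)."""
    Alo, Ahi = A
    Blo, Bhi = B
    prods = np.stack([
        Alo[:, :, None] * Blo[None, :, :],
        Alo[:, :, None] * Bhi[None, :, :],
        Ahi[:, :, None] * Blo[None, :, :],
        Ahi[:, :, None] * Bhi[None, :, :],
    ])
    return prods.min(axis=0).sum(axis=1), prods.max(axis=0).sum(axis=1)

def iv_add(A, B):
    """Interval matrix sum."""
    return A[0] + B[0], A[1] + B[1]

def iv_scale(A, s):
    """Scale an interval matrix by a nonnegative scalar s >= 0."""
    return s * A[0], s * A[1]

def iv_discretize(A, B, T, terms=8):
    """Enclose G = exp(A*T) and H = (integral_0^T exp(A*s) ds) * B for all
    point matrices inside the interval matrices A and B, using a truncated
    power series evaluated in interval arithmetic (a simplified stand-in for
    the paper's interval Pade discretization)."""
    n = A[0].shape[0]
    I = (np.eye(n), np.eye(n))
    G = I                    # running sum for exp(A*T)
    S = iv_scale(I, T)       # running sum for integral_0^T exp(A*s) ds
    term = I                 # current term (A*T)^k / k!
    for k in range(1, terms + 1):
        term = iv_scale(iv_matmul(term, A), T / k)      # (A*T)^k / k!
        G = iv_add(G, term)
        S = iv_add(S, iv_scale(term, T / (k + 1)))      # A^k T^(k+1)/(k+1)!
    H = iv_matmul(S, B)
    return G, H

# Example: a scalar plant with +/-10% parameter uncertainty, sample period T = 0.1.
A = (np.array([[-1.1]]), np.array([[-0.9]]))
B = (np.array([[0.9]]), np.array([[1.1]]))
G, H = iv_discretize(A, B, T=0.1)
print("G in", G, "H in", H)

Because each interval operation returns an enclosure of all point results, the resulting (G, H) intervals are conservative: repeated use of the same uncertain matrix A inflates the bounds (the dependency problem), which is one reason a dedicated construction such as the paper's interval Padé method is preferable to this naive series evaluation.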