A PAC-learning algorithm is d-space bounded if it stores at most d examples from the sample at any time. We characterize the d-space learnable concept classes. For this purpose we introduce the compression parameter of a concept class C and design our Trial and Error Learning Algorithm. We show: C is d-space learnable if and only if the compression parameter of C is at most d. Unlike previous approaches, this learning algorithm does not produce a hypothesis consistent with the whole sample; Floyd, for example, presents consistent space bounded learning algorithms, but has to restrict herself to very special concept classes. On the other hand, our algorithm needs large samples; the compression parameter appears as an exponent in the sample size. We present several examples of concept classes that are space bounded learnable in polynomial time:
- all intersection-closed concept classes with finite VC-dimension,
- convex n-gons in R^2,
- halfspaces in R^n,
- unions of triangles in R^2.
We further relate the compression parameter to the VC-dimension and discuss variants of this parameter.
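To illustrate only the space bound in the definition above (not the Trial and Error Learning Algorithm itself, which is not described in this abstract), the following Python sketch shows a Floyd-style consistent, space bounded learner for one instance of the first listed class: axis-aligned rectangles in R^2, which are intersection-closed with finite VC-dimension. It keeps at most d = 4 examples (the extreme positive points) and its hypothesis is their closure. All names in the sketch are illustrative assumptions, not taken from the paper.

```python
# A minimal sketch, assuming the target concept is an axis-aligned
# rectangle in R^2 and examples are noise-free.  The learner stores at
# most four examples at any time (d = 4): the leftmost, rightmost,
# lowest and highest positive points seen so far.  Its hypothesis is
# the smallest rectangle containing these stored points, which is
# consistent with all positive examples processed so far.

from typing import Iterable, Optional, Tuple

Point = Tuple[float, float]


class RectangleLearner:
    """Space bounded learner for axis-aligned rectangles in R^2."""

    def __init__(self) -> None:
        # The at-most-four stored examples (extreme positive points).
        self.left: Optional[Point] = None
        self.right: Optional[Point] = None
        self.bottom: Optional[Point] = None
        self.top: Optional[Point] = None

    def update(self, x: Point, label: bool) -> None:
        """Process one labelled example; keep it only if it is extreme."""
        if not label:
            return  # negative examples are never stored in this sketch
        if self.left is None:
            # First positive point becomes all four extremes.
            self.left = self.right = self.bottom = self.top = x
            return
        if x[0] < self.left[0]:
            self.left = x
        if x[0] > self.right[0]:
            self.right = x
        if x[1] < self.bottom[1]:
            self.bottom = x
        if x[1] > self.top[1]:
            self.top = x

    def predict(self, x: Point) -> bool:
        """Hypothesis: smallest rectangle containing the stored examples."""
        if self.left is None:
            return False  # no positives seen yet: predict negative
        return (self.left[0] <= x[0] <= self.right[0]
                and self.bottom[1] <= x[1] <= self.top[1])


def learn(sample: Iterable[Tuple[Point, bool]]) -> RectangleLearner:
    """One pass over the sample using constant extra space."""
    learner = RectangleLearner()
    for x, label in sample:
        learner.update(x, label)
    return learner
```

The paper's Trial and Error Learning Algorithm differs in that it does not insist on a hypothesis consistent with the whole sample; the sketch above only shows what "stores at most d examples at any time" means for a class where a consistent space bounded learner happens to exist.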