People tend not to have perfect memories when it comes to learning, or to anything else for that matter. Most formal studies of learning, however, assume a perfect memory. Some approaches have restricted the number of items that could be retained. We introduce a complexity-theoretic accounting of memory utilization by learning machines. In our new model, memory is measured in bits as a function of the size of the input. There is a hierarchy of learnability based on increasing memory allotment. The lower-bound results are proved using an unusual combination of pumping and mutual recursion theorem arguments. For technical reasons, it was necessary to consider two types of memory: long term and short term.