The majority rule algorithm for learning binary weights for a perceptron is analysed under the uniform distribution on inputs. It is shown that even though the algorithm is demonstrably inconsistent on random samples for very small sample sizes, it nevertheless exhibits a curious and abrupt asymptotic transition to consistency at moderate sample sizes. Particular consequences are that the algorithm PAC-learns majority functions in linear time from small samples and that, while the general variant of binary integer programming embodied here is NP-complete, almost all instances of the problem are tractable given a sufficient number of inequalities to be satisfied. (C) 1996 Academic Press, Inc.
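As an illustration only, a minimal sketch of a majority-style rule of the kind the abstract refers to, assuming it takes the common clipped-Hebbian form in which each binary weight is set by a majority vote of the label-weighted inputs; the function names (majority_rule, majority_target) and the exact tie-breaking are assumptions, not the paper's specification.

```python
import numpy as np

def majority_rule(X, y):
    """Set each binary (+/-1) weight by majority vote over the sample.

    X : (m, n) array of +/-1 inputs (e.g., drawn uniformly at random).
    y : (m,) array of +/-1 labels.
    Weight i is the sign of sum_t y[t] * X[t, i]; ties are broken to +1.
    """
    corr = X.T @ y                    # per-coordinate correlation with the labels
    return np.where(corr >= 0, 1, -1)  # clip to binary weights

def majority_target(X, w_true):
    """Majority (threshold) function with binary weights w_true."""
    return np.where(X @ w_true >= 0, 1, -1)

# Usage sketch on a random sample from the uniform distribution on {-1, +1}^n.
rng = np.random.default_rng(0)
n, m = 101, 2000                      # odd n avoids ties in the target
w_true = rng.choice([-1, 1], size=n)
X = rng.choice([-1, 1], size=(m, n))
y = majority_target(X, w_true)
w_hat = majority_rule(X, y)
print("fraction of weights recovered:", np.mean(w_hat == w_true))
```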