In this paper, an algorithm for the design of functional-link single-layer neural networks with N binary inputs is described. The resulting connection weights are pure integers. These integer weights enable faster learning, since the network relies on binary operations rather than algebraic multiplications. The functional-link approach is compared with other learning algorithms, chiefly backpropagation with the delta rule, on several recognition applications. It is shown that the functional-link approach, by enhancing the input patterns, yields an algorithm that is robust for linearly non-separable classification problems in terms of processing speed and convergence. Furthermore, the neural network can be implemented using widely available off-the-shelf components and VLSI techniques.
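To illustrate the idea of input enhancement, the following minimal sketch trains a functional-link single-layer network on XOR, a classic linearly non-separable problem. The details here are assumptions for illustration, not the paper's exact algorithm: bipolar {-1, +1} encoding, a single product term x1*x2 as the link enhancement, and the integer perceptron rule with unit learning rate, which keeps all weights as pure integers.

```python
def enhance(x1, x2):
    """Expand a bipolar input pair with a bias and a product (link) term."""
    return [1, x1, x2, x1 * x2]

def train(patterns, epochs=20):
    """Integer perceptron rule on the enhanced inputs (illustrative sketch)."""
    w = [0, 0, 0, 0]  # weights remain integers: updates add or subtract +/-1 terms
    for _ in range(epochs):
        converged = True
        for (x1, x2), t in patterns:
            x = enhance(x1, x2)
            # misclassified (or on the boundary): apply the integer update w += t*x
            if t * sum(wi * xi for wi, xi in zip(w, x)) <= 0:
                w = [wi + t * xi for wi, xi in zip(w, x)]
                converged = False
        if converged:
            break
    return w

# XOR in bipolar form: target +1 when the two inputs differ
xor = [((-1, -1), -1), ((-1, 1), 1), ((1, -1), 1), ((1, 1), -1)]
w = train(xor)
```

In the plain input space no single-layer network separates these four patterns; with the added x1*x2 term the problem becomes linearly separable and the perceptron rule converges to an all-integer weight vector.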