In this letter, a constructive solution to the N-bit parity problem is provided with a neural network that allows direct connections between the input layer and the output layer. The present approach requires no training or adaptation, and it therefore permits the use of the simple threshold activation function for the output and hidden layer neurons. It has previously been shown that this choice of activation function and network structure leads to several solutions for the 3-bit parity problem obtained using linear programming. One of the solutions for the 3-bit parity problem is then generalized to obtain a solution for the N-bit parity problem using [N/2] hidden layer neurons. It is shown that, through the choice of a "staircase" type activation function, the [N/2] hidden layer neurons can be further combined into a single hidden layer neuron. (C) 1999 Elsevier Science Ltd. All rights reserved.