A mechanism suppressing the divergence of synaptic weights should be added to networks performing unsupervised learning with the Hebbian rule. To realize competitive learning in hardware, a synaptic connection circuit that keeps the sum of synaptic weights constant was fabricated with a complementary metal-oxide-semiconductor (CMOS) floating-gate process. The synaptic weights of each neuron were varied by applying optical correction signals. The sum of the synaptic weights was kept constant, and the weights were stored in a nonvolatile manner. A primitive competitive learning circuit was constructed with the synaptic connection circuits and a winner-take-all (WTA) circuit. The self-organization of the competitive circuit was confirmed experimentally and by simulation. The limitations of the operating range and the feasibility of large-scale integration are discussed.