We describe neuromorphic, variable-weight synapses on artificial dendrites that facilitate experimentation with correlative adaptation rules. Attention is focused on those aspects of biological synaptic function that could affect a neuromorphic network's computational power and adaptive capability. These include sublinear summation, quantal synaptic noise, and independent adaptation of adjacent synapses. The diffusive nature of artificial dendrites adds considerable flexibility to the design of fast synapses by allowing conductances to be enabled for short or for variable lengths of time. We present two complementary synapse designs: the shared conductance array and the self-timed synapse. Both synapse circuits behave as conductances to mimic biological synapses and thus enable sublinear summation.
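As a minimal illustration of why conductance synapses sum sublinearly (a sketch, not the authors' circuit), consider the steady-state voltage of a single passive compartment; the leak and synaptic parameters below are assumed for the example.

```python
# Minimal sketch with illustrative parameters: steady-state voltage of a single
# passive compartment driven by conductance synapses.

E_LEAK, G_LEAK = -70.0, 10.0   # mV, nS: assumed leak reversal and conductance
E_SYN = 0.0                    # mV: assumed excitatory reversal potential

def v_steady(g_syn):
    """Steady-state membrane voltage (mV) for total synaptic conductance g_syn (nS)."""
    return (G_LEAK * E_LEAK + g_syn * E_SYN) / (G_LEAK + g_syn)

v_rest = v_steady(0.0)
dv_a  = v_steady(5.0)  - v_rest   # synapse A alone (5 nS)
dv_b  = v_steady(5.0)  - v_rest   # synapse B alone (5 nS)
dv_ab = v_steady(10.0) - v_rest   # A and B co-activated

print(dv_a + dv_b, dv_ab)         # the combined response is smaller than the sum
```

Because each added conductance reduces the driving force seen by the others, the co-activated response falls short of the sum of the individual responses.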
The former achieves weight variation by selecting different conductances from an on-chip array, and the latter by modulating the length of time a constant conductance remains activated.
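The rough equivalence of the two weight-control knobs can be sketched by treating the effective weight as the charge injected while the conductance is enabled; the constant-driving-force approximation and all parameter values below are assumptions for illustration only.

```python
# Sketch of the two weight-control mechanisms, using injected charge as a proxy
# for effective weight and assuming a roughly constant driving force during the
# brief pulse. Parameter values are hypothetical.

def injected_charge_fC(g_nS, t_on_us, driving_force_mV=70.0):
    """Charge (fC) delivered while the conductance is enabled."""
    return g_nS * t_on_us * driving_force_mV * 1e-3   # nS * us * V -> fC

q_array = injected_charge_fC(g_nS=8.0, t_on_us=1.0)   # shared conductance array: select a larger g
q_timed = injected_charge_fC(g_nS=1.0, t_on_us=8.0)   # self-timed synapse: hold a unit g longer

print(q_array, q_timed)   # equal charge, hence comparable effective weight
```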
Both work with our interchip communication system, virtual wires. For the present purpose of comparing various adaptation mechanisms in software, synaptic weights are stored off chip. This simplifies the addition of quantal weight noise and allows connections from different sources to the same dendritic compartment to have independent weights.
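A hypothetical sketch of such an off-chip weight store: weights keyed by (source, compartment) remain independent per connection, and a simple software stand-in for quantal weight noise perturbs each weight in quantum-sized steps before it is applied. The table layout, quantum size, and noise rule are assumptions, not the authors' scheme.

```python
# Hypothetical off-chip weight table; names and values are illustrative.
import random

QUANTUM = 0.05                      # assumed weight quantum
weights = {                         # (source_id, compartment_id) -> weight
    (3, 7): 0.40,
    (9, 7): 0.25,                   # different source, same compartment: independent weight
}

def quantal_noise(w, p=0.1):
    """Perturb w by one quantum with probability p, then snap to the quantal grid.
    A simple software stand-in for quantal weight noise."""
    if random.random() < p:
        w += random.choice((-QUANTUM, QUANTUM))
    return round(w / QUANTUM) * QUANTUM

programmed = {conn: quantal_noise(w) for conn, w in weights.items()}
print(programmed)
```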