This article studies the computational power of various discontinuous real computational models that are based on the classical analog recurrent neural network (ARNN). The ARNN consists of a finite number of neurons, each of which computes a polynomial net function followed by a sigmoid-like continuous activation function. We introduce arithmetic networks as ARNNs augmented with a few simple discontinuous neurons (e.g., threshold or zero-test neurons). We argue that even with weights restricted to polynomial-time computable reals, arithmetic networks are able to compute arbitrarily complex recursive functions. We identify many types of neural networks that are at least as powerful as arithmetic nets; some of these are not in fact discontinuous, but instead allow additional arithmetic operations in the net function (e.g., neurons that may use division, or polynomial net functions inside sigmoid-like continuous activation functions). These arithmetic networks are equivalent to the Blum-Shub-Smale model when the latter is restricted to a bounded number of registers.
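
To make the two model classes concrete, the following minimal sketch (our illustration; the particular polynomials, weights, and saturated-linear activation are arbitrary example choices, not the formal definitions) shows one synchronous update step of a small network whose continuous neurons compute a polynomial net function followed by a sigmoid-like activation, together with one discontinuous zero-test neuron of the kind that turns an ARNN into an arithmetic network:

    # Illustrative sketch: one update of a hypothetical 3-neuron network.
    # Neurons 0 and 1 are continuous (polynomial net + sigmoid-like sigma);
    # neuron 2 is the discontinuous zero-test augmentation.

    def sigma(x):
        """Saturated-linear activation: continuous and sigmoid-like."""
        return 0.0 if x < 0.0 else (1.0 if x > 1.0 else x)

    def zero_test(x):
        """Discontinuous neuron: exact comparison against zero."""
        return 1.0 if x == 0.0 else 0.0

    def step(state, u):
        x0, x1, x2 = state
        y0 = sigma(0.5 * x0 * x1 + 0.25 * u)  # degree-2 net function
        y1 = sigma(x0 - x1 ** 2 + 0.125)      # degree-2 net function
        y2 = zero_test(x0 - x1)               # zero-test neuron
        return (y0, y1, y2)

    state = (0.5, 0.5, 0.0)
    for t in range(3):
        state = step(state, u=1.0)
    print(state)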
With respect to implementation on digital computers, we show that arithmetic networks with rational weights can be simulated with exponential precision, but even with polynomial-time computable real weights, arithmetic networks are not subject to any fixed precision bound. This is in contrast with the ARNN, which is known to demand precision that is only linear in the computation time.
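
The gap is driven by discontinuity itself; the toy snippet below (ours, not from the article) shows why no fixed precision can certify the output of an exact zero test, whereas a continuous activation maps small state errors to small output errors:

    # With a continuous activation, rounding the state keeps the output
    # error comparably small. An exact zero test has no such modulus of
    # continuity: an arbitrarily small perturbation flips its output.

    def zero_test(x):
        return 1.0 if x == 0.0 else 0.0

    for eps in (0.0, 1e-6, 1e-12, 1e-300):
        print(eps, zero_test(0.0 + eps))  # prints 1.0 only when eps == 0.0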
When nontrivial periodic functions (e.g., fractional part, sine, tangent) are added to arithmetic networks, the resulting networks are computationally equivalent to a massively parallel machine. Thus, these highly discontinuous networks can solve the presumably intractable class of PSPACE-complete problems in polynomial time.
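
One standard mechanism behind this jump in power (a minimal sketch in our own illustrative coding, not the article's construction) is that the fractional-part function lets a network read off the binary expansion of a real-valued state one bit per step:

    # frac(2*x) shifts the binary expansion of x left and drops the integer
    # part, so repeated doubling plus frac exposes one bit per iteration.
    # (Python floats only approximate the real-valued model.)

    from math import floor

    def frac(x):
        return x - floor(x)

    def bits(x, n):
        """First n binary digits of x in [0, 1)."""
        out = []
        for _ in range(n):
            x = 2.0 * x
            out.append(int(x >= 1.0))
            x = frac(x)
        return out

    print(bits(0.8125, 4))  # 0.1101 in binary -> [1, 1, 0, 1]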