Now showing items 1-2 of 2
Parameter quantization effects in Gaussian potential function neural networks
(World Scientific and Engineering Academy and Society, 2001)
In hardware implementations of Gaussian Potential Function Neural Networks (GPFNN), deviation from ideal network parameters is inevitable because of the techniques used for parameter storage and implementation of the functions ...
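The effect described in this abstract can be illustrated with a minimal sketch: a single Gaussian potential function unit whose center and width are stored at finite precision. The uniform quantizer and all parameter values below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def gaussian_potential(x, center, width):
    # Gaussian potential function unit: exp(-||x - c||^2 / (2 * sigma^2))
    return np.exp(-np.sum((x - center) ** 2) / (2.0 * width ** 2))

def quantize(value, bits, lo, hi):
    # Uniform quantization to 2**bits levels over [lo, hi],
    # emulating finite-precision parameter storage in hardware.
    levels = 2 ** bits - 1
    clipped = np.clip(value, lo, hi)
    return lo + np.round((clipped - lo) / (hi - lo) * levels) * (hi - lo) / levels

# Hypothetical parameter values chosen for illustration.
x = np.array([0.3, -0.1])
center = np.array([0.25, 0.05])
width = 0.8

ideal = gaussian_potential(x, center, width)
q_center = quantize(center, bits=4, lo=-1.0, hi=1.0)
q_width = quantize(width, bits=4, lo=0.1, hi=2.0)
quantized = gaussian_potential(x, q_center, q_width)
print(abs(ideal - quantized))  # deviation of the unit's response from ideal
```

Sweeping `bits` shows how coarser storage widens the gap between the ideal and the implemented response, which is the kind of deviation the paper analyzes.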
Fault-tolerant training of neural networks in the presence of MOS transistor mismatches
(IEEE-INST Electrical Electronics Engineers Inc, 2001)
Analog techniques are desirable for hardware implementation of neural networks due to their numerous advantages, such as small size, low power, and high speed. However, these advantages are often offset by the difficulty in ...
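One common way to make training tolerant of analog imperfections, in the spirit of this abstract, is to inject multiplicative perturbations into the weights during training so the learned solution is robust to mismatch at deployment. The sketch below is a generic noise-injection scheme on a toy linear unit, an assumption for illustration rather than the paper's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: learn y = 2*x1 - x2 with a single linear unit.
X = rng.normal(size=(200, 2))
y = X @ np.array([2.0, -1.0])

def train(X, y, mismatch_std=0.0, epochs=200, lr=0.05):
    # Gradient descent where each step evaluates the loss at weights
    # perturbed multiplicatively, emulating MOS transistor mismatch.
    w = np.zeros(2)
    for _ in range(epochs):
        noise = 1.0 + mismatch_std * rng.normal(size=w.shape)
        w_eff = w * noise                 # perturbed weights the "circuit" realizes
        grad = X.T @ (X @ w_eff - y) / len(y)
        w -= lr * grad * noise            # chain rule through the perturbation
    return w

w_robust = train(X, y, mismatch_std=0.05)
print(w_robust)  # close to the true weights despite 5% mismatch during training
```

Training under the perturbation model means the stored weights already account for the statistics of the mismatch, rather than being tuned for an ideal device.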