THESIS
2022
1 online resource (xviii, 86 pages) : illustrations (some color)
Abstract
An artificial neural network (ANN) [1-8], which features self-learning and high adaptability, is
a type of mathematical algorithmic model that simulates the behavioral characteristics of animal
neural networks and performs distributed parallel information processing. A hardware ANN
outperforms a conventional software ANN in terms of power consumption and efficiency, as its
architecture is analogous to that of a biological neuron. A variety of schemes based on two- or
three-port devices have been proposed for hardware ANNs, but they suffer from either low input
impedance or high operating voltage. The metal-oxide (MO) thin-film transistor (TFT), with its
potential four-port structure, has attracted increasing attention and is considered a promising
approach to meeting the aforementioned criteria [9].
In this work, parallel dual-gate (DG) TFTs are deployed to construct different circuit structures
that exhibit excitatory post-synaptic current (EPSC) and inhibitory post-synaptic current (IPSC)
properties, respectively. The continuous conductance states of the dual-gate transistors, combined
with dynamic random-access memory (DRAM) weight-storage units, can smoothly emulate biological
long-term potentiation (LTP) and long-term depression (LTD) within both properties. Because
parallel DG TFTs provide two separate gates for the input and weight signals, they offer extremely
high input impedance, greatly reducing signal attenuation and distortion. Such novel circuit
schemes based on parallel DG TFTs with EPSC and IPSC properties have significant ramifications
for the memory and learning functions of artificial intelligence (AI).
Following a brief introduction to the learning and memory behavior of the biological neural
system and to several state-of-the-art ANNs, this thesis focuses on designing a novel ANN
architecture based on parallel DG TFTs to achieve both the separation and the unification of the
biological EPSC and IPSC properties. At the beginning of this work, a basic 1×2 circuit array with
the IPSC property was implemented to realize the simple logic functions AND and OR. This 1×2
hardware ANN was also used to characterize retention and response time: the stored weight signal
drops by a mere 10% even after a long holding duration of 4 hours, and the rise delay and fall
delay are 220 μs and 160 μs, respectively.
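The AND and OR functions above can be understood as threshold logic: each input is scaled by a stored weight (playing the role of the DRAM-held conductance of a dual-gate TFT) and the summed "current" is compared against a firing threshold. A minimal conceptual sketch follows; the weights and thresholds are illustrative values, not device parameters from the thesis.

```python
def threshold_gate(inputs, weights, threshold):
    """Weighted sum of inputs followed by a hard threshold."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

def logic_and(a, b):
    # Both weighted inputs must contribute for the sum to reach 1.5.
    return threshold_gate([a, b], [1.0, 1.0], 1.5)

def logic_or(a, b):
    # A single active input is enough to cross 0.5.
    return threshold_gate([a, b], [1.0, 1.0], 0.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", logic_and(a, b), "OR:", logic_or(a, b))
```

The same weighted-sum unit realizes both gates; only the threshold differs, which mirrors how a single circuit topology with different stored weights can implement different logic functions.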
Afterward, this 1×2 ANN structure was enlarged into a 4×6 circuit array, and Tetris pattern
recognition demonstrated how well the enlarged network performed on classification. The trained
4×6 circuit array could precisely recognize four typical Tetris patterns out of 35 similar patterns.
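The classification principle can be sketched as a single weight layer that maps a flattened 4×6 binary pattern to per-class scores, with a winner-take-all readout. The templates below are invented stand-ins for illustration, not the actual trained weights or Tetris patterns from the thesis.

```python
import numpy as np

# Hypothetical 4x6 binary templates (not the thesis's patterns).
templates = {
    "I": np.array([[1, 1, 1, 1, 0, 0], [0] * 6, [0] * 6, [0] * 6]),
    "O": np.array([[1, 1, 0, 0, 0, 0], [1, 1, 0, 0, 0, 0], [0] * 6, [0] * 6]),
}

# Weight matrix: one row of weights per class (here, the template itself).
W = np.stack([t.flatten() for t in templates.values()]).astype(float)
labels = list(templates)

def classify(pattern):
    scores = W @ pattern.flatten()        # weighted-sum "currents", one per class
    return labels[int(np.argmax(scores))] # winner-take-all readout

print(classify(templates["I"]))
```

A pattern that merely resembles a template produces a lower score than the exact match, which is the sense in which the trained array separates the four target patterns from similar distractors.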
Subsequently, a hardware DG TFT-based ANN with only the EPSC property was designed to implement multi-layer network regression analysis. Real-time analysis of gas sensor array data was sketched with such a network, and an excellent result with a standard deviation of approximately 0.02 was obtained.
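Multi-layer regression of the kind described above can be illustrated with a tiny two-layer network trained by gradient descent on a toy target. The data, layer sizes, and learning rate here are illustrative; this is not the gas sensor setup or the circuit-level training of the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (200, 3))            # stand-in "sensor" inputs
y = X @ np.array([0.5, 0.3, 0.2])          # toy target response

W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(5000):
    h = sigmoid(X @ W1 + b1)               # hidden layer
    pred = (h @ W2 + b2).ravel()           # linear output for regression
    err = pred - y
    # Backpropagation of the mean-squared-error gradient.
    gW2 = h.T @ err[:, None] / len(X); gb2 = err.mean(keepdims=True)
    dh = err[:, None] @ W2.T * h * (1 - h)
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

print("residual std:", err.std())
```

The residual standard deviation is the figure of merit quoted in the abstract (about 0.02 for the thesis's network); the toy run above simply shows the quantity being driven down by training.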
Finally, the EPSC and IPSC properties were combined into a single circuit unit, which was trained on all 16 two-input binary logic functions, and their corresponding theoretical minimal networks were identified.
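A useful reference point for those minimal networks: of the 16 two-input binary logic functions, 14 are linearly separable and thus realizable by a single threshold unit, while XOR and XNOR require a hidden layer. The brute-force check below verifies this count; the small weight grid is an illustrative shortcut, and the thesis's circuit-level training is not reproduced here.

```python
import itertools

inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]

def separable(truth):
    """Is this truth table realizable by one weighted sum plus threshold?"""
    grid = [-2, -1, 0, 1, 2]
    for w1, w2, t in itertools.product(grid, grid, grid):
        if all((w1 * a + w2 * b >= t) == bool(o)
               for (a, b), o in zip(inputs, truth)):
            return True
    return False

single_layer = sum(separable(tt) for tt in itertools.product((0, 1), repeat=4))
print(single_layer, "of 16 functions need only a single threshold unit")
```

The two functions the search fails on (XOR and XNOR) are exactly those whose minimal network must be deeper than one layer, which is why a unit combining both EPSC and IPSC behavior is needed to cover all 16 functions compactly.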