Abstract

As the efficiency of neuromorphic systems improves, biologically inspired learning techniques are becoming increasingly appealing for a range of computing applications, from pattern and character recognition to general-purpose reconfigurable logic. Due to their functional similarities to synapses in the brain, memristors are becoming a key element in the hardware realization of perceptron-based learning systems. By pairing memristive devices with a perceptron-based neuron model, previous work has shown that an efficient, low-area neural logic block (NLB) can be developed. However, the use of a simple threshold activation function has limited the set of functions a single block can learn, requiring multiple layers to implement certain functions. This complicates the training process, decreases the scalability of the system, and increases the overall energy and delay of large networks. In this work, three novel NLB designs are presented that overcome the limitations of previous hardware NLBs. First, an Adaptive Neural Logic Block (ANLB) and a Robust Adaptive Neural Logic Block (RANLB) are proposed. By integrating an adaptive activation function into a perceptron model, these designs can rapidly learn any function in a single layer. Next, a Multi-Threshold Neural Logic Block (MTNLB) is proposed, in which a static activation function provides the same functionality with minimal overhead. Using a Verilog-AMS model of a physical memristor, the proposed NLBs are applied to implement both reconfigurable logic and an Optical Character Recognition (OCR) system. When used as a building block for the ISCAS-85 benchmark circuits, the MTNLB provides energy-delay product (EDP) improvements of over 90 percent over a standard LUT implementation on all benchmark circuits and up to a 99 percent improvement over a threshold NLB implementation. As a compromise, the ANLB and RANLB provide a smaller EDP improvement in a static system, but achieve faster training convergence for all functions. To show how the proposed designs can simplify an OCR application, a simple 8x8 digit recognition system is developed. Using only four 16-input NLBs per digit, the system develops a model of each digit in only 90 µs and correctly classifies the majority of test images.
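The limitation the abstract describes can be illustrated with a minimal Python sketch (not taken from the thesis; the unit weights, function names, and window thresholds below are illustrative assumptions). A single-threshold perceptron cannot realize XOR in one layer, whereas a window-style activation with two thresholds, loosely analogous to the multi-threshold idea, can:

    import itertools

    # Target truth table for XOR over two binary inputs.
    XOR = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

    def single_threshold(s, t):
        # Classic perceptron activation: fire when the weighted sum reaches t.
        return 1 if s >= t else 0

    def window_activation(s, t_low, t_high):
        # Fires only when the weighted sum falls inside [t_low, t_high),
        # allowing a single unit to realize non-threshold functions like XOR.
        return 1 if t_low <= s < t_high else 0

    # With unit weights, XOR's weighted sums over the four inputs are 0, 1, 1, 2.
    # No single threshold separates {1} from {0, 2}, but the window [1, 2) does.
    w = (1, 1)
    for x in itertools.product((0, 1), repeat=2):
        s = w[0] * x[0] + w[1] * x[1]
        print(x, "threshold(t=1):", single_threshold(s, 1),
              "window[1,2):", window_activation(s, 1, 2),
              "target:", XOR[x])

Running the sketch shows the single threshold reproducing OR rather than XOR, while the window activation matches the XOR target on all four input patterns.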

Library of Congress Subject Headings

Machine learning; Perceptrons; Memristors

Publication Date

8-1-2012

Document Type

Thesis

Department, Program, or Center

Computer Engineering (KGCOE)

Advisor

Kudithipudi, Dhireesha

Comments

Note: imported from RIT's Digital Media Library running on DSpace to RIT Scholar Works. Physical copy available through The Wallace Library at RIT: Q325.5 .S65 2012

Campus

RIT – Main Campus
