Abstract

Biological organisms operate under severe energy constraints, yet they remain the most powerful computational systems we know of. In contrast, modern AI algorithms typically run on power-hungry hardware such as GPUs, limiting their use at the edge. This work explores the application of biologically-inspired energy constraints to spiking neural networks to better understand their effects on network dynamics and learning, and to gain insight into the design of more energy-efficient AI. Energy constraints are modeled by abstracting the role of astrocytes in metabolizing glucose and regulating the activity-driven distribution of ATP molecules to “pools” of neurons and synapses.
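
To make the abstraction concrete, the following is a minimal Python sketch of such an activity-driven energy pool, assuming a finite budget that spikes deplete and a fixed regeneration rate; the class name EnergyPool and all parameter values are illustrative assumptions, not the thesis’s actual implementation.

# Hypothetical sketch of an activity-driven energy pool: a finite "ATP"
# budget that neuron spikes deplete and that regenerates at a fixed rate.
# Names and parameter values are illustrative, not taken from the thesis.
class EnergyPool:
    def __init__(self, capacity=100.0, spike_cost=1.0, regen_rate=0.5):
        self.capacity = capacity      # maximum stored energy (abstract ATP units)
        self.spike_cost = spike_cost  # energy consumed per neuron spike
        self.regen_rate = regen_rate  # energy restored per simulation step
        self.level = capacity

    def can_spike(self):
        # A neuron in this pool may only fire if enough energy remains.
        return self.level >= self.spike_cost

    def consume(self):
        # Deduct the metabolic cost of one spike from the pool.
        self.level -= self.spike_cost

    def step(self):
        # Replenish the pool, abstracting astrocytic glucose metabolism.
        self.level = min(self.capacity, self.level + self.regen_rate)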

First, energy constraints are applied to the fixed recurrent part (a.k.a. reservoir) of liquid state machines (LSMs), a type of recurrent spiking neural network, in order to analyze their effects on both the network’s computational performance and its ability to learn. Energy constraints were observed to significantly influence network dynamics, as measured by metrics such as the Lyapunov exponent and separation ratio. In several cases the energy constraints also increased the LSM’s classification accuracy (up to a 6.17% improvement over baseline) on two time series classification tasks: epileptic seizure detection and gait recognition. This improvement in classification accuracy was also typically correlated with the LSM separation metric (Pearson correlation coefficient of 0.9 for the seizure detection task). However, the increased accuracy was generally not observed in LSMs with sparse connectivity, highlighting the role of energy constraints in sparsifying the LSM’s spike activity, which could translate into real-world energy savings in hardware implementations.
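
For reference, an LSM separation metric of this kind is commonly computed as the ratio of inter-class centroid distance to intra-class spread of reservoir state vectors; the sketch below follows that Norton-and-Ventura-style definition, which may differ in detail from the exact formulation used in the thesis.

import numpy as np

def separation(states, labels):
    """Hypothetical reservoir separation metric: mean pairwise distance
    between class centroids divided by mean intra-class spread. Assumes
    states has shape (n_samples, n_features) and at least two classes."""
    classes = np.unique(labels)
    centroids = np.array([states[labels == c].mean(axis=0) for c in classes])
    # Inter-class distance: average distance between all centroid pairs.
    inter = np.mean([np.linalg.norm(ci - cj)
                     for i, ci in enumerate(centroids)
                     for cj in centroids[i + 1:]])
    # Intra-class spread: average distance of states from their own centroid.
    intra = np.mean([np.linalg.norm(states[labels == c] - centroids[i],
                                    axis=1).mean()
                     for i, c in enumerate(classes)])
    return inter / (1.0 + intra)  # +1 guards against division by zero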

In addition to the fixed LSM reservoir, the impact of energy constraints was also explored in the context of unsupervised learning with spike-timing-dependent plasticity (STDP). Energy constraints were observed to reduce the magnitude of synaptic weight updates by up to 72.4% on average, depending on factors such as the energy cost of neuron spikes and the energy pool regeneration rate. Under certain conditions, energy constraints were also seen to modify which input frequencies the synapses respond to, tending to attenuate or eliminate weight updates driven by high-frequency inputs. The effects of neuronal energy constraints on STDP learning were also studied at the network level to determine their impact on classification task performance.
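
As an illustration of how such attenuation can arise, a pair-based STDP rule can be gated by the remaining pool energy so that a depleted pool shrinks the weight update; the sketch below assumes standard exponential STDP windows with illustrative constants, not the thesis’s exact rule.

import numpy as np

def stdp_update(dt, energy, capacity, a_plus=0.01, a_minus=0.012,
                tau_plus=20.0, tau_minus=20.0):
    """Hypothetical energy-gated pair-based STDP: the usual exponential
    weight update is scaled by the fraction of energy left in the neuron's
    pool. All constants are illustrative assumptions."""
    # dt = t_post - t_pre: positive dt potentiates, negative dt depresses.
    if dt >= 0:
        dw = a_plus * np.exp(-dt / tau_plus)
    else:
        dw = -a_minus * np.exp(dt / tau_minus)
    # Energy gating: a depleted pool attenuates learning toward zero, which
    # disproportionately suppresses updates from high-frequency inputs.
    return dw * (energy / capacity)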

The final part of this work co-optimizes an LSM’s energy consumption and performance through reinforcement learning. A proximal policy optimization (PPO) agent is introduced into the LSM reservoir to control the level of neuronal spiking by modifying individual energy constraint parameters. The agent is rewarded based on the separation of the reservoir and additionally rewarded for reducing the reservoir’s energy consumption.
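
One plausible form of such a composite reward, sketched below under assumed weightings, adds a separation term to a term crediting energy saved relative to an unconstrained baseline; the function and its parameters are illustrative, not the thesis’s exact reward.

def ppo_reward(separation, energy_used, baseline_energy, alpha=1.0, beta=0.5):
    """Hypothetical composite reward for the PPO agent: favor high reservoir
    separation while crediting energy saved versus an unconstrained baseline.
    The weights alpha and beta are illustrative assumptions."""
    energy_saving = (baseline_energy - energy_used) / baseline_energy
    return alpha * separation + beta * energy_saving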

Library of Congress Subject Headings

Liquid crystal devices--Energy consumption; Neural networks (Computer science)--Energy consumption; Liquid crystal devices--Testing; Neural networks (Computer science)--Testing

Publication Date

8-2020

Document Type

Thesis

Student Type

Graduate

Degree Name

Computer Engineering (MS)

Department, Program, or Center

Computer Engineering (KGCOE)

Advisor

Cory Merkel

Advisor/Committee Member

Raymond Ptucha

Advisor/Committee Member

Dhireesha Kudithipudi

Campus

RIT – Main Campus

Plan Codes

CMPE-MS
