Abstract

Many real-life systems consist of multiple information signals that may interact with each other over time. These interactions can be estimated and modeled using techniques such as Pearson Correlation (PC), Time-Lagged Cross-Correlation (TLCC) and windowed TLCC, Dynamic Time Warping (DTW), and the coupled Hidden Markov Model (cHMM). Excluding cHMM, these techniques cannot capture non-linear interactions and do not work well with multivariate data. Although cHMM can capture interactions effectively, it is bound by the Markov property and by modeling choices such as the latent variables and prior distributions, which significantly influence its performance. The Recurrent Neural Network (RNN) is a variant of neural networks suited to modeling time-series data, and RNN-based architectures are the new state of the art for complex tasks like machine translation. In this research, we explore techniques to extend RNNs to model interacting time-series signals. We propose architectures with coupling and attention mechanisms. We evaluate the performance of the models on synthetically generated and real-life data sets, and we compare our proposed architectures to similar ones in the literature. The goal of this exercise is to determine the most effective architecture for capturing interaction information in interrelated time-series signals.
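As a minimal illustration of one of the baseline techniques named above, the sketch below computes a time-lagged cross-correlation (TLCC) between two signals: the Pearson correlation is evaluated at every relative shift in a lag window, and the lag with the highest correlation estimates the delay of the interaction. The function name, signal construction, and lag convention here are illustrative assumptions, not taken from the thesis itself.

```python
import numpy as np

def lagged_cross_correlation(x, y, max_lag):
    """Pearson correlation between x and y at each integer lag in [-max_lag, max_lag].

    A negative lag means y trails x (y is a delayed copy of x).
    Illustrative sketch; not the thesis's implementation.
    """
    results = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag < 0:
            a, b = x[:lag], y[-lag:]      # align x[t] with y[t - lag]
        elif lag > 0:
            a, b = x[lag:], y[:-lag]      # align x[t + lag] with y[t]
        else:
            a, b = x, y
        results[lag] = np.corrcoef(a, b)[0, 1]
    return results

# Two coupled signals: y is x delayed by 5 steps, plus observation noise.
rng = np.random.default_rng(0)
x = rng.standard_normal(500)
y = np.roll(x, 5) + 0.1 * rng.standard_normal(500)

corrs = lagged_cross_correlation(x, y, max_lag=10)
best_lag = max(corrs, key=corrs.get)  # expected to recover the 5-step delay
```

Because the correlation at each lag is still the linear Pearson coefficient, this recovers the delay but, as noted in the abstract, cannot capture non-linear or multivariate interactions; that limitation motivates the RNN-based architectures studied in the thesis.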

Library of Congress Subject Headings

Time-series analysis; Neural networks (Computer science)

Publication Date

12-2019

Document Type

Thesis

Student Type

Graduate

Degree Name

Computer Science (MS)

Department, Program, or Center

Computer Science (GCCIS)

Advisor

Ifeoma Nwogu

Advisor/Committee Member

Matthew Wright

Advisor/Committee Member

Linwei Wang

Campus

RIT – Main Campus

Plan Codes

COMPSCI-MS
