Abstract

Neural Style Transfer is a class of neural algorithms designed to redraw a given image in the style of another image, traditionally a famous painting, while preserving the underlying details. Applying this process to a video requires stylizing each of its component frames, and the stylized frames must be temporally consistent with one another to prevent flickering and other undesirable artifacts. Current algorithms accommodate these constraints at the expense of speed.
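
The per-frame objective underlying this class of algorithms is typically formulated as in Gatys et al.'s image style transfer: the output is optimized to match the content features of the input frame and the feature correlations (Gram matrices) of the style image under a pretrained CNN, and video methods add a temporal term that penalizes deviation from the previous stylized frame. The following is a minimal illustrative sketch of that image-level loss, assuming a PyTorch environment with a pretrained VGG-19; the layer indices, weights, and function names are assumptions for illustration and are not taken from this thesis.

import torch
import torch.nn.functional as F
from torchvision.models import vgg19

# Pretrained VGG-19 feature extractor, frozen; its intermediate
# activations define the content and style feature spaces.
# (Inputs are assumed to be ImageNet-normalized tensors.)
vgg = vgg19(weights="IMAGENET1K_V1").features.eval()
for p in vgg.parameters():
    p.requires_grad_(False)

# Illustrative layer choices (indices into vgg.features).
CONTENT_LAYERS = {21}               # conv4_2
STYLE_LAYERS = {0, 5, 10, 19, 28}   # conv1_1 .. conv5_1

def features(x, wanted):
    """Collect activations from the requested layers."""
    out = {}
    for i, layer in enumerate(vgg):
        x = layer(x)
        if i in wanted:
            out[i] = x
    return out

def gram(f):
    """Gram matrix of feature maps; captures style as feature correlations."""
    b, c, h, w = f.shape
    f = f.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def style_transfer_loss(generated, content_img, style_img,
                        content_weight=1.0, style_weight=1e4):
    """Weighted sum of content loss and style (Gram) loss for one frame."""
    gen_c = features(generated, CONTENT_LAYERS)
    gen_s = features(generated, STYLE_LAYERS)
    tgt_c = features(content_img, CONTENT_LAYERS)
    tgt_s = features(style_img, STYLE_LAYERS)

    content_loss = sum(F.mse_loss(gen_c[i], tgt_c[i]) for i in CONTENT_LAYERS)
    style_loss = sum(F.mse_loss(gram(gen_s[i]), gram(tgt_s[i]))
                     for i in STYLE_LAYERS)
    return content_weight * content_loss + style_weight * style_loss

In an optimization-based formulation, the generated frame would be initialized from the content frame and updated by gradient descent on this loss; video-oriented algorithms additionally penalize differences from the preceding stylized frame (warped by optical flow) to enforce the temporal consistency described above.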

We propose an algorithm called Distributed Artistic Videos and demonstrate its capacity to produce stylized videos over ten times faster than the current state-of-the-art with no reduction in output quality. Through the use of an 8-node computing cluster, we reduce the average time required to stylize a video by 92%, from hours to minutes, compared to the most recent algorithm of this kind on the same equipment and input. This allows the stylization of videos that are longer and of higher resolution than previously feasible.

Library of Congress Subject Headings

Animation (Cinematography)--Data processing; Digital video--Editing; Neural networks (Computer science)

Publication Date

5-2020

Document Type

Thesis

Student Type

Graduate

Degree Name

Computer Science (MS)

Department, Program, or Center

Computer Science (GCCIS)

Advisor

M. Mustafa Rafique

Advisor/Committee Member

Michael Mior

Advisor/Committee Member

Minseok Kwon

Campus

RIT – Main Campus

Plan Codes

COMPSCI-MS
