Data breach incidents are a growing concern in both the private and public sectors. As an increasing amount of sensitive information is made available through information systems, both the number of breaches and the damage they cause will continue to grow. There is a clear need for a framework that mitigates the danger of leaked data while still allowing authorized individuals to access the information they need. This thesis presents the concept of data decay, applied to a client-server model for retrieving sensitive data. In this framework, files tagged by the framework and stored on client machines become progressively inaccessible to the client in proportion to the time elapsed since the last connection to the server. To continue using a file that has undergone decay, the client must successfully authenticate to the central server, which begins rebuilding the file. The time required to rebuild a file is computed from the length of time since the client last connected to the server and from the file's sensitivity level: files tagged as highly sensitive decay faster than those tagged as moderately sensitive. Under this model, an adversary's window of opportunity to view sensitive information shrinks the longer the client remains disconnected from the central server. A legitimate user benefits from the ability to work remotely with the sensitive information they need, without the risk of unintentionally leaking it to unintended parties.
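The decay and rebuild rules described in the abstract can be sketched in Python. This is a minimal illustration, not the thesis's implementation: the linear decay curve, the per-level decay rates, and the `rebuild_factor` scaling are all assumed values chosen for the example.

```python
import enum


class Sensitivity(enum.Enum):
    """Per-level decay rates (fraction of file decayed per hour offline).

    The specific rates are illustrative assumptions; the thesis only
    specifies that highly sensitive files decay faster than moderately
    sensitive ones.
    """
    MODERATE = 0.01
    HIGH = 0.05


def remaining_fraction(hours_offline: float, level: Sensitivity) -> float:
    """Fraction of a tagged file still readable after hours_offline
    hours without a server connection (assumed linear decay)."""
    return max(0.0, 1.0 - level.value * hours_offline)


def rebuild_hours(hours_offline: float, level: Sensitivity,
                  rebuild_factor: float = 0.5) -> float:
    """Time to rebuild a file after the client re-authenticates.

    Proportional to both the time spent offline and the decayed
    portion, so more sensitive files and longer disconnections
    both lengthen the rebuild. rebuild_factor is a hypothetical
    tuning parameter, not one named in the thesis.
    """
    decayed = 1.0 - remaining_fraction(hours_offline, level)
    return decayed * rebuild_factor * hours_offline
```

After 10 hours offline, a highly sensitive file in this sketch retains half its content while a moderately sensitive one retains 90%, matching the abstract's claim that the adversary's window shrinks faster for more sensitive data.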
Library of Congress Subject Headings
Client/server computing--Security measures; Computer security
Department, Program, or Center
Department of Computing Security (GCCIS)
Taber, Matthew, "A Framework for data decay in client-server model" (2009). Thesis. Rochester Institute of Technology. Accessed from
RIT – Main Campus