Author

Keath Chen

Abstract

Quantum Key Distribution (QKD) is intended to be an ultimate computer security system, one that will not need periodic upgrades or overhauls. QKD allows keys of arbitrary length to be generated on demand. Coupled with the unconditionally secure "one-time pad" encryption system, QKD is unbreakable even by an eavesdropper with unlimited resources. Numerous experimental QKD prototypes have demonstrated that QKD is likely to become a reality before the quantum computer.

One step in a QKD protocol is to remove the transmission errors from the quantum communication, which typically has a high error rate (one percent or higher). An interactive error control method is used in the reconciliation phase of QKD. This procedure divides the transmitted bits into blocks, with the block size chosen so that the chance of having multiple error bits in one block is small. By checking parities and performing an interactive BINARY search whenever a parity error is found, error bits can be located and removed. Some error bits escape detection in the first pass; by repeating the procedure several times, each time randomly dividing the bits into blocks, most error bits can be detected and removed. Each parity check costs one bit, so the goal is to minimize the number of parity checks needed to locate all (or most) errors while ensuring with high reliability that the remaining bits have a very small residue error rate.

Brassard and Salvail devised a better error control procedure in 1993. By keeping track of block parity information from pass to pass, a scheme they called CASCADE, the number of parity checks per error bit is reduced. In 10 simulation runs they observed no errors remaining after four passes; however, the upper bound they offered on the residue error rate was much higher than the rate observed in simulation. There is therefore a need to find the optimal block size and to improve the residue error rate analysis of this procedure.

In this work, the reconciliation procedure is improved both in throughput (i.e., the number of input bits minus the number of parity checks) and in residue error rate. Throughput is doubled at high input error rates, with a smaller gain at low input error rates. The three most important factors enhancing throughput are: 1) using a larger block size to fully exploit the power of the CASCADE method; 2) preventing any pair of bits from staying in the same block in any later pass, which allows the use of only three passes (instead of the four passes in the BS procedure); and 3) not carrying bits that are already known to be lost in the current pass into later passes. The residue error rate at the end of the second pass (and of the third pass) is derived for small initial block sizes. The residue error rate at the end of the third pass is found to be much lower than the upper bound given by previous workers, and it is shown to be a function of the input size and the block size. This error control method shares many features and difficulties with the recent development of "turbo codes" in the error-control coding field.
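
To make the block-parity procedure concrete, the following Python sketch simulates multi-pass interactive reconciliation: positions are shuffled, split into blocks, and each block whose parities disagree is corrected by the interactive BINARY bisection described above. All names, the block-size schedule, and the parameters are illustrative assumptions rather than values from the thesis, and the sketch omits both the CASCADE back-tracking into earlier passes and the discarding of bits disclosed by parity checks.

# Illustrative sketch of interactive reconciliation with BINARY search.
# Names and parameters are invented for this example; they are not taken
# from the thesis, and the CASCADE back-tracking step is omitted.

import random


def parity(bits):
    """Parity (XOR) of a list of 0/1 values."""
    return sum(bits) % 2


def binary_search_error(alice_block, bob_block):
    """Locate one error bit in a block whose parities disagree.

    Each halving costs one extra parity exchange, mirroring the
    interactive BINARY step described in the abstract."""
    lo, hi = 0, len(bob_block)
    while hi - lo > 1:
        mid = (lo + hi) // 2
        # Alice reveals the parity of the left half; Bob compares it with his.
        if parity(alice_block[lo:mid]) != parity(bob_block[lo:mid]):
            hi = mid
        else:
            lo = mid
    return lo  # index of the erroneous bit within the block


def reconcile_pass(alice, bob, block_size, rng):
    """One pass: shuffle positions, split into blocks, fix odd-error blocks."""
    order = list(range(len(bob)))
    rng.shuffle(order)
    for start in range(0, len(order), block_size):
        idx = order[start:start + block_size]
        a_block = [alice[i] for i in idx]
        b_block = [bob[i] for i in idx]
        if parity(a_block) != parity(b_block):
            # An odd number of errors lies in this block; BINARY finds one.
            pos = binary_search_error(a_block, b_block)
            bob[idx[pos]] ^= 1


if __name__ == "__main__":
    rng = random.Random(1)
    n, qber = 4096, 0.03                 # assumed sizes, not from the thesis
    alice = [rng.randint(0, 1) for _ in range(n)]
    bob = [b ^ (1 if rng.random() < qber else 0) for b in alice]
    for p in range(3):                   # three passes, doubling block size
        reconcile_pass(alice, bob, block_size=8 * (2 ** p), rng=rng)
    print("errors left:", sum(a != b for a, b in zip(alice, bob)))

Because each pass only corrects blocks with an odd number of errors, a few errors survive; the thesis's analysis of the residue error rate after the second and third passes quantifies exactly that effect for the full CASCADE scheme.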

Library of Congress Subject Headings

Computer security--Research; Computers--Access control; Data encryption (Computer science); Cryptography; Quantum theory

Publication Date

2000

Document Type

Thesis

Department, Program, or Center

Computer Science (GCCIS)

Advisor

Radziszowski, Stanislaw

Advisor/Committee Member

Hemaspaandra, Edith

Advisor/Committee Member

Hempel, Harald

Comments

Note: imported from RIT's Digital Media Library running on DSpace to RIT Scholar Works. Physical copy available through RIT's Wallace Library at: QA76.9.A25 C53 2000

Campus

RIT – Main Campus
