Research

Current Work

My two current research areas are searching over encrypted data and designing authentication paradigms.  (My full research statement is available.)

  1. Searching over encrypted data Symmetric searchable encryption is nearing practical levels of performance. However, its security definitions are not well understood. Most definitions allow the adversary to learn some statistics about the underlying data, and researchers have used learning techniques to recover the underlying data from these statistics. I have worked on information leakage and would like to collaborate with experts in learning theory to understand the impact of leakage on searchable encryption. Additionally, I will extend search functionality. In many online applications a search returns not only exact matches but “nearby” results. For example, a search for “John” would also return “Jon”, since the edit distance between the two words is one. Previous work has added this functionality to searchable encryption schemes, but it considered general notions of closeness, resulting in inefficient schemes. I would like to target common closeness metrics and make the constructions more efficient, combining ideas from leakage analysis and extended search functionality.
  2. Improving Authentication I would like to increase the supported error rate of fuzzy extractors and demonstrate security for real sources, including the human iris. Hardware tokens present another promising authentication alternative: physical unclonable functions (PUFs) are believed to provide strong authentication. I am investigating unified solutions that combine the best aspects of passwords, biometrics, and hardware tokens.
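The “nearby results” discussed in item 1 use edit distance, the classic Levenshtein metric. As a small illustration (the function is my own sketch, not part of any searchable encryption scheme above), it can be computed with dynamic programming:

```python
def edit_distance(a, b):
    """Levenshtein distance via the classic dynamic program."""
    prev = list(range(len(b) + 1))  # distances from "" to each prefix of b
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # delete ca
                           cur[j - 1] + 1,              # insert cb
                           prev[j - 1] + (ca != cb)))   # substitute
        prev = cur
    return prev[-1]
```

For example, `edit_distance("John", "Jon")` is 1 (one deletion), which is why a closeness-aware search for “John” would also return “Jon”.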

Previous Research

Cryptographic constructions span a range of security and efficiency guarantees. On one hand lie constructions that provide strong security against a variety of threats but whose inefficiency prevents wide-scale deployment. On the other hand, systems used in practice are efficient but often carry only heuristic security claims. I believe this sharp tradeoff is unnecessary.

My research has focused on three areas: 1) how to securely communicate with a group of people, 2) how to search over encrypted data, and 3) how to provide cryptographic authentication from high-entropy sources, which often suffer from noise between repeated readings.

Secure Group Communication

Public-key encryption schemes allow parties to securely communicate in one-to-one fashion. Group communication can be achieved by individually encrypting to each desired recipient. However, such an approach is prohibitively expensive, especially when the group changes over time. Fiat and Naor formally defined the problem; Naor, Naor, and Lotspiech used a tree-based approach to efficiently handle membership changes.

Our team developed and implemented a variant of this scheme. When membership changes, our protocol first rekeys an intermediate group that is likely to be useful in the future. This provisions keys for additional groups, keeping the average cost of a membership change stable.
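To give a flavor of the tree-based idea, here is a hedged sketch of a logical key hierarchy (an illustration in the spirit of these schemes, not our deployed protocol; the class and method names are hypothetical). Each member holds the keys on its leaf-to-root path, so a membership change refreshes only O(log n) keys:

```python
import secrets

class KeyTree:
    """Binary logical key hierarchy: each member holds the keys on its
    leaf-to-root path; the root key encrypts group traffic."""
    def __init__(self, n_leaves):
        # complete binary tree in array form; node 1 is the root,
        # leaves occupy positions n_leaves .. 2*n_leaves - 1
        self.n = n_leaves
        self.keys = {i: secrets.token_bytes(16)
                     for i in range(1, 2 * n_leaves)}

    def path(self, leaf):
        node = self.n + leaf  # leaf index -> array position
        nodes = []
        while node >= 1:
            nodes.append(node)
            node //= 2
        return nodes

    def revoke(self, leaf):
        # Refresh every key the departing member knew, except its own
        # leaf key: O(log n) keys. In a full protocol each new key is
        # distributed encrypted under the surviving children's keys.
        refreshed = [n for n in self.path(leaf) if n != self.n + leaf]
        for n in refreshed:
            self.keys[n] = secrets.token_bytes(16)
        return refreshed
```

For a group of four, revoking member 0 refreshes only its parent and the root, rather than rekeying every member pairwise.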

We demonstrated our implementation using a secure chat application. Our chat system served as the primary communication tool between multiple airplanes flying in and out of range. I learned that debugging cryptography on an airplane is higher pressure than debugging a simulation in an office. It taught me that users focus on executing their task, and that cryptography must be unobtrusive and make choices without involving the user.

Searching over Encrypted Data

Once received, data must be securely stored. For most users, disk encryption can adequately protect their local data. However, users’ data no longer resides on their local machines; it has migrated to remote services. Traditional encryption destroys the search functionality integral to many of these services. It is possible to balance security and functionality by using encryption that allows for limited operations.

Deterministic public-key encryption allows a service provider to check equality, which can be used to implement keyword search. We constructed a deterministic public-key encryption scheme from any trapdoor function, unifying previous constructions.
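To illustrate how determinism enables equality-based keyword search, here is a minimal symmetric stand-in using an HMAC as a deterministic tag (a sketch only; our actual construction is public-key and built from trapdoor functions, and these function names are hypothetical):

```python
import hmac, hashlib

def det_tag(key, word):
    # Deterministic: equal words always produce equal tags, so a
    # server can test equality without seeing the words themselves.
    return hmac.new(key, word.encode(), hashlib.sha256).digest()

def search(key, query, tagged_records):
    """Return indices of records whose tag matches the query's tag."""
    t = det_tag(key, query)
    return [i for i, tag in enumerate(tagged_records) if tag == t]
```

The tradeoff is visible even in this sketch: the server learns which records are equal to each other, which is exactly the kind of statistical leakage discussed above.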

I also deployed searchable symmetric encryption, which enables search while protecting contents at rest. Systems challenges arise when such schemes are instantiated, and careful, regular communication can unite research and development. As an example, auditing of secure search was a surprising requirement; our team designed a secure audit log system that maintains the security of the underlying cryptographic protocols.
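A minimal sketch of the inverted-index style of searchable symmetric encryption (illustrative only; real schemes also encrypt the posting lists and hide their lengths, and all names here are hypothetical). The client derives opaque labels with a PRF; the server matches a search “trapdoor” against them without learning the keyword:

```python
import hmac, hashlib

def prf(key, msg):
    # Pseudorandom labels: the server sees only opaque 32-byte values.
    return hmac.new(key, msg.encode(), hashlib.sha256).digest()

def build_index(key, docs):
    """docs: {doc_id: iterable of keywords} -> label-keyed index."""
    index = {}
    for doc_id, words in docs.items():
        for w in set(words):
            index.setdefault(prf(key, w), []).append(doc_id)
    return index

def trapdoor(key, word):
    # The client reveals only this label when searching.
    return prf(key, word)

def server_search(index, td):
    # The server matches the label without learning the keyword.
    return index.get(td, [])
```

Even here the server learns access patterns, such as which label each query hits and how many documents match, which is the statistical leakage that learning attacks exploit.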

Authentication from Noisy Sources

Communicating parties need to authenticate one another to establish a secure communication channel. Two commonly used methods are passwords and the public key infrastructure (PKI). For example, when a user logs into his or her bank account over TLS, the user’s computer authenticates the bank using the PKI, while the bank authenticates the user with a password.

The drawbacks of both methods are numerous. Passwords are known to be insecure: users cannot memorize ones that are strong enough, and they often disclose their passwords. PKI, on the other hand, requires centralized trusted authorities, which are hard to establish and maintain outside of hierarchical organizations. It would be attractive to establish secure communication channels in settings where such infrastructure is not available.

There are many proposals to base authentication on an information source known only to the communicating parties. Such sources often have higher entropy than passwords and are easier to store. Unfortunately, many of them are noisy, providing similar, but not identical, secret values at each reading. Examples of such sources include biometrics, keystroke dynamics, and hardware devices.

Dodis, Ostrovsky, Reyzin, and Smith designed fuzzy extractors to derive keys from noisy sources. Previous constructions of fuzzy extractors have no proofs of security for many sources of practical importance. As an example, the human iris is thought to be the strongest biometric; however, in our work, we argue that current fuzzy extractors should not be used on irises in high-security applications.

To improve fuzzy extractors for practical sources, we started by asking which limitations were necessary. Our work described necessary and sufficient conditions for building a fuzzy extractor.
We then formulated an alternative definition of fuzzy extractors that provides security against adversaries with bounded resources (running in polynomial time). Computational security is standard in cryptography, and fuzzy extractors have no compelling need for security against unbounded attackers.

We then constructed novel fuzzy extractors under this definition. Our schemes provide two main new properties: 1) longer keys and 2) reusability across multiple providers. The only previously known reusable fuzzy extractor assumed that different providers received readings correlated in unrealistic ways; our construction only needs each provider to receive a valid reading. In addition, we proved our construction secure for a wide class of noisy sources. I believe our new construction can be used to secure sources of practical importance, including the iris.
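The basic fuzzy-extractor mechanism can be sketched with the classic code-offset construction over a simple repetition code (illustrative only; a real scheme uses a stronger error-correcting code and applies a randomness extractor or hash to the recovered string before using it as a key):

```python
import secrets

REP = 3  # repetition factor: tolerates 1 flipped bit per 3-bit block

def encode(bits):
    return [b for b in bits for _ in range(REP)]

def decode(bits):
    # Majority vote within each block corrects isolated bit flips.
    return [1 if 2 * sum(bits[i:i + REP]) > REP else 0
            for i in range(0, len(bits), REP)]

def gen(w):
    """Gen: derive a key k and public helper string from reading w."""
    k = [secrets.randbelow(2) for _ in range(len(w) // REP)]
    helper = [wi ^ ci for wi, ci in zip(w, encode(k))]
    return k, helper

def rep(w_prime, helper):
    """Rep: recover k from a noisy reading w' and the public helper."""
    return decode([wi ^ hi for wi, hi in zip(w_prime, helper)])
```

The helper string can be published: XORing it with a close-enough reading leaves only the codeword plus a few bit errors, which the code corrects, so both an enrollment reading and a later noisy reading recover the same key.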