Publications

Charting a security landscape in the clouds: data protection and collaboration in cloud storage

Summary

This report surveys different approaches to securely storing and sharing data in the cloud based on traditional notions of security: confidentiality, integrity, and availability, with the main focus on confidentiality. An appendix discusses the related question of how users can securely authenticate to cloud providers. We propose a metric for comparing secure storage approaches based on their residual vulnerabilities: attack surfaces against which an approach cannot protect. Our categorization therefore ranks approaches from the weakest (the most residual vulnerabilities) to the strongest (the fewest residual vulnerabilities). In addition to the security provided by each approach, we also consider their inherent costs and limitations. This report can therefore help an organization select a cloud data protection approach that fits its enterprise infrastructure and satisfies its security and functionality requirements.
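
The report's comparison metric lends itself to a direct illustration: enumerate each approach's residual vulnerabilities and rank by their count. A minimal sketch, where the approach names and vulnerability sets are hypothetical placeholders rather than the report's actual categorization:

```python
# Rank cloud data protection approaches by residual vulnerabilities
# (attack surfaces an approach cannot protect against); fewer is stronger.
# The approaches and vulnerability sets below are illustrative placeholders.
approaches = {
    "plaintext storage": {"network eavesdropper", "provider insider",
                          "provider breach", "client compromise"},
    "provider-side encryption": {"provider insider", "provider breach",
                                 "client compromise"},
    "client-side encryption": {"client compromise"},
}

for name, vulns in sorted(approaches.items(), key=lambda kv: len(kv[1])):
    print(f"{name}: {len(vulns)} residual vulnerabilities: {sorted(vulns)}")
```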

Cryptography for Big Data security

Published in:
Chapter 10 in Big Data: Storage, Sharing, and Security, 2016, pp. 214-87.

Summary

This chapter focuses on state-of-the-art provably secure cryptographic techniques for protecting big data applications. We do not focus on more established, commonly available cryptographic solutions. The goal is to inform practitioners of new techniques to consider as they develop new big data solutions rather than to summarize the current best practice for securing data.

SoK: privacy on mobile devices - it's complicated

Summary

Modern mobile devices place a wide variety of sensors and services within the personal space of their users. As a result, these devices are capable of transparently monitoring many sensitive aspects of these users' lives (e.g., location, health, or correspondences). Users typically trade access to this data for convenient applications and features, in many cases without a full appreciation of the nature and extent of the information that they are exposing to a variety of third parties. Nevertheless, studies show that users remain concerned about their privacy, and vendors have similarly been increasing their use of privacy-preserving technologies in these devices. Still, despite significant efforts, these technologies continue to fail in fundamental ways, leaving users' private data exposed. In this work, we survey the numerous components of mobile devices, giving particular attention to those that collect, process, or protect users' private data. Whereas the individual components have been generally well studied and understood, examining the entire mobile device ecosystem provides significant insights into its overwhelming complexity. The numerous components of this complex ecosystem are frequently built and controlled by different parties with varying interests and incentives. Moreover, most of these parties are unknown to the typical user. The technologies that are employed to protect the users' privacy typically only do so within a small slice of this ecosystem, abstracting away the greater complexity of the system. Our analysis suggests that this abstracted complexity is the major cause of many privacy-related vulnerabilities, and that a fundamentally new, holistic approach to privacy is needed going forward. We thus highlight various existing technology gaps and propose several promising research directions for addressing and reducing this complexity.

Iris biometric security challenges and possible solutions: for your eyes only? Using the iris as a key

Summary

Biometrics were originally developed for identification, such as for criminal investigations. More recently, biometrics have also been used for authentication. Most biometric authentication systems today match a user's biometric reading against a stored reference template generated during enrollment. If the reading and the template are sufficiently close, the authentication is considered successful and the user is authorized to access protected resources. This binary matching approach has major inherent vulnerabilities. An alternative approach to biometric authentication proposes to use fuzzy extractors (also known as biometric cryptosystems), which derive cryptographic keys from noisy sources such as biometrics. In theory, this approach is much more robust and can enable cryptographic authorization. Unfortunately, for many biometrics that provide high-quality identification, fuzzy extractors provide no security guarantees. This gap arises in part because of an objective mismatch. The quality of a biometric identification is typically measured using false match rate (FMR) versus false nonmatch rate (FNMR). As a result, biometrics have been extensively optimized for this metric. However, this metric says little about the suitability of a biometric for key derivation. In this article, we illustrate a metric that can be used to optimize biometrics for authentication. Using iris biometrics as an example, we explore possible directions for improving processing and representation according to this metric. Finally, we discuss why strong biometric authentication remains a challenging problem and propose some possible future directions for addressing these challenges.
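
To make the fuzzy-extractor idea concrete, below is a minimal code-offset construction (in the style of Dodis et al.) over binary readings, using a simple repetition code for error correction. This is an illustrative sketch under those assumptions, not the article's construction; practical schemes need stronger codes and a careful entropy analysis.

```python
# Code-offset fuzzy extractor sketch: derive a stable key from noisy bits.
import hashlib
import secrets

REP = 5  # each key bit is repeated REP times; majority vote corrects
         # up to 2 flipped bits per block

def _encode(bits):
    # Repetition-code encoder: repeat every bit REP times.
    return [b for b in bits for _ in range(REP)]

def _decode(bits):
    # Majority-vote decoder over consecutive blocks of REP bits.
    return [int(sum(bits[i:i + REP]) > REP // 2)
            for i in range(0, len(bits), REP)]

def _xor(a, b):
    return [x ^ y for x, y in zip(a, b)]

def gen(reading):
    """Enrollment: derive a key and public helper data from a bit-list reading."""
    k = [secrets.randbelow(2) for _ in range(len(reading) // REP)]
    helper = _xor(reading, _encode(k))       # code-offset: helper = w XOR C(k)
    key = hashlib.sha256(bytes(k)).digest()  # hash k down to a uniform-looking key
    return key, helper

def reproduce(reading, helper):
    """Authentication: recover the same key from a noisy reading plus the helper."""
    k = _decode(_xor(reading, helper))       # w' XOR helper = C(k) XOR errors
    return hashlib.sha256(bytes(k)).digest()
```

If the enrollment and authentication readings differ by fewer than REP/2 bits per block, both calls derive the identical 256-bit key, which can then drive standard cryptographic authentication instead of binary template matching.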

Unifying leakage classes: simulatable leakage and pseudoentropy

Published in:
8th Int. Conf. Information-Theoretic Security (ICITS 2015), 2-5 May 2015 in Lecture Notes in Computer Science (LNCS), Vol. 9063, 2015, pp. 69-86.

Summary

Leakage-resilient cryptography designs systems that withstand partial adversary knowledge of their secret state. Ideally, leakage-resilient systems withstand current and future attacks, restoring confidence in the security of implemented cryptographic systems. Understanding the relationships between classes of leakage functions is an important part of this goal. In this work, we consider the memory leakage model, where the leakage class contains functions over the system's entire secret state. Standard limitations include functions with bounded output length, functions that retain (pseudo)entropy in the secret, and functions that leave the secret computationally unpredictable. Standaert, Pereira, and Yu (Crypto, 2013) introduced a new class of leakage functions they call simulatable leakage. A leakage function is simulatable if a simulator can produce indistinguishable leakage without access to the true secret state. We extend their notion to general applications and consider two versions, sketched in notation below. For weak simulatability, the simulated leakage must be indistinguishable from the true leakage in the presence of public information. For strong simulatability, this requirement must also hold when the distinguisher has access to the true secret state. We show the following:
-- Weakly simulatable functions retain computational unpredictability.
-- Strongly simulatable functions retain pseudoentropy.
-- There are bounded-length functions that are not weakly simulatable.
-- There are weakly simulatable functions that remove pseudoentropy.
-- There are leakage functions that retain computational unpredictability but are not weakly simulatable.
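
In standard notation, the two variants can be sketched as follows (our rendering, not the paper's exact formalism: s is the secret state, pub the public information, f the leakage function, Sim the simulator, and the relation denotes computational indistinguishability):

```latex
% Weak vs. strong simulatability (informal sketch)
\begin{align*}
\text{Weak:}   &\quad \bigl(\mathrm{pub},\ f(s)\bigr) \approx_c \bigl(\mathrm{pub},\ \mathrm{Sim}(\mathrm{pub})\bigr)\\
\text{Strong:} &\quad \bigl(s,\ \mathrm{pub},\ f(s)\bigr) \approx_c \bigl(s,\ \mathrm{pub},\ \mathrm{Sim}(\mathrm{pub})\bigr)
\end{align*}
```

The strong variant hands the distinguisher the true secret state s itself, so the simulator's output must remain plausible even against a distinguisher that knows everything the leakage function saw.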

Cryptographically secure computation

Published in:
Computer, Vol. 48, No. 4, April 2015, pp. 78-81.

Summary

Researchers are making secure multiparty computation--a cryptographic technique that enables information sharing and analysis while keeping sensitive inputs secret--faster and easier to use for application software developers.
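
As a flavor of the underlying machinery, here is a minimal sketch of secure summation via additive secret sharing, one classical building block of secure multiparty computation. It is an illustration only, not a production protocol or any specific system from the article:

```python
# Secure summation via additive secret sharing: each party splits its
# private input into random shares, and only the total is reconstructed.
import secrets

P = 2**61 - 1  # public prime modulus (illustrative choice)

def share(x, n):
    """Split secret x into n additive shares mod P."""
    shares = [secrets.randbelow(P) for _ in range(n - 1)]
    shares.append((x - sum(shares)) % P)
    return shares

def secure_sum(inputs):
    """Party i sends share j of its input to party j; each party publishes
    only the sum of the shares it holds, revealing just the grand total."""
    n = len(inputs)
    all_shares = [share(x, n) for x in inputs]
    partial = [sum(all_shares[i][j] for i in range(n)) % P for j in range(n)]
    return sum(partial) % P

print(secure_sum([12, 30, 8]))  # -> 50; no party saw another's input
```

Any single party sees only uniformly random shares of the other inputs; the private values are never reconstructed, only their sum.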

Computing on masked data: a high performance method for improving big data veracity

Published in:
HPEC 2014: IEEE Conf. on High Performance Extreme Computing, 9-11 September 2014.

Summary

The growing gap between data and users calls for innovative tools that address the challenges posed by big data volume, velocity, and variety. Along with these standard three V's of big data, an emerging fourth "V" is veracity, which addresses the confidentiality, integrity, and availability of the data. Traditional cryptographic techniques that ensure the veracity of data can have overheads that are too large to apply to big data. This work introduces a new technique called Computing on Masked Data (CMD), which improves data veracity by allowing computations to be performed directly on masked data and ensuring that only authorized recipients can unmask the data. Using the sparse linear algebra of associative arrays, CMD can be performed with significantly less overhead than other approaches while still supporting a wide range of linear algebraic operations on the masked data. Databases with strong support of sparse operations, such as SciDB or Apache Accumulo, are ideally suited to this technique. Examples are shown for the application of CMD to a complex DNA matching algorithm and to database operations over social media data.
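
One ingredient of masking schemes in this spirit can be sketched in a few lines: a deterministic mask preserves equality, so exact-match lookups and joins can run directly on masked values. The snippet below illustrates that general idea with HMAC; it is not the paper's CMD construction, which operates on associative arrays with sparse linear algebra:

```python
# Deterministic masking preserves equality, so an untrusted database can
# answer exact-match queries without ever seeing the plaintext values.
import hmac
import hashlib

KEY = b"secret masking key"  # held by the data owner, not the database

def mask(value: str) -> str:
    return hmac.new(KEY, value.encode(), hashlib.sha256).hexdigest()

# Masked table: rows keyed by masked values, stored at the untrusted side.
records = {mask("alice"): {"score": 10}, mask("bob"): {"score": 7}}

# The owner masks the query term; the database matches blindly.
print(records[mask("alice")])  # -> {'score': 10}
```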

Authenticated broadcast with a partially compromised public-key infrastructure

Published in:
Info. and Comput., Vol. 234, February 2014, pp. 17-25.

Summary

Given a public-key infrastructure (PKI) and digital signatures, it is possible to construct broadcast protocols tolerating any number of corrupted parties. Existing protocols, however, do not distinguish between corrupted parties who do not follow the protocol, and honest parties whose secret (signing) keys have been compromised but who continue to behave honestly. We explore conditions under which it is possible to construct broadcast protocols that still provide the usual guarantees (i.e., validity/agreement) to the latter. Consider a network of n parties, where an adversary has compromised the secret keys of up to t_c honest parties and, in addition, fully controls the behavior of up to t_a other parties. We show that for any fixed t_c > 0 and any fixed t_a, there exists an efficient protocol for broadcast if and only if 2t_a + min(t_a, t_c) < n. (When t_c = 0, standard results imply feasibility for all t_a < n.) We also show that if t_c and t_a are not fixed, but are only guaranteed to satisfy the above bound, then broadcast is impossible to achieve except for a few specific values of n; for these "exceptional" values of n, we demonstrate broadcast protocols. Taken together, our results give a complete characterization of this problem.
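
The characterization is simple to evaluate mechanically. A small helper that checks the paper's bound for given n, t_a, and t_c, in the fixed-threshold setting described above:

```python
# Feasibility of authenticated broadcast with a partially compromised PKI:
# for fixed t_c > 0 and fixed t_a, broadcast exists iff 2*t_a + min(t_a, t_c) < n.
def broadcast_feasible(n: int, t_a: int, t_c: int) -> bool:
    if t_c == 0:  # standard setting: any t_a < n is feasible
        return t_a < n
    return 2 * t_a + min(t_a, t_c) < n

print(broadcast_feasible(10, 3, 1))  # True:  2*3 + min(3, 1) = 7 < 10
print(broadcast_feasible(7, 3, 2))   # False: 2*3 + min(3, 2) = 8 >= 7
```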

DSKE: dynamic set key encryption

Published in:
7th LCN Workshop on Security in Communications, 22 October 2012, pp. 1006-13.

Summary

In this paper, we present a novel paradigm for studying the problem of group key distribution, use it to analyze existing key distribution schemes, and then present a novel scheme for group key distribution which we call "Dynamic Set Key Encryption," or DSKE. DSKE meets the demands of a tactical environment while relying only on standard cryptographic primitives. Our "set key" paradigm allows us to focus on the underlying problem of establishing a confidential communication channel shared by a group of users, without concern for related security factors like authenticity and integrity, and without the need to consider any properties of the group beyond a list of its members. This separation of concerns is vital to our development and analysis of DSKE, and can be applied elsewhere to simplify the analyses of other group key distribution schemes.
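
To make the "set key" abstraction concrete, the sketch below shows the simplest possible group key distribution: a controller generates one symmetric key for the member set and wraps it under each member's individual pre-shared key. This illustrates only the abstraction; DSKE's actual scheme differs and addresses dynamic membership and tactical constraints beyond this sketch. Fernet from the Python `cryptography` package stands in for the members' pairwise keys:

```python
# Naive set-key distribution: one group key, wrapped per member.
from cryptography.fernet import Fernet

# Pre-shared individual keys, one per member of the set (illustrative).
members = {name: Fernet.generate_key() for name in ("alice", "bob", "carol")}

def distribute_set_key(member_keys):
    set_key = Fernet.generate_key()  # the confidential key shared by the set
    wrapped = {m: Fernet(k).encrypt(set_key) for m, k in member_keys.items()}
    return set_key, wrapped          # wrapped blobs can be sent over public channels

def unwrap(member_key, blob):
    return Fernet(member_key).decrypt(blob)

set_key, wrapped = distribute_set_key(members)
assert unwrap(members["bob"], wrapped["bob"]) == set_key
```

Note how the sketch concerns only confidentiality of the shared channel and a list of members, mirroring the paper's separation of the set-key problem from authenticity, integrity, and other group properties.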

Scalable cryptographic authentication for high performance computing

Summary

High performance computing (HPC) uses supercomputers and computing clusters to solve large computational problems. Frequently, HPC resources are shared systems, and access to restricted data sets or resources must be authenticated. These authentication needs can take multiple forms, both internal and external to the HPC cluster. A computational stack that uses web services among nodes in the HPC may need to perform authentication between nodes of the same job, or a job may need to reach out to data sources outside the HPC. Traditional authentication mechanisms such as passwords or digital certificates encounter issues with the distributed and potentially disconnected nature of HPC systems. Distributing and storing plain-text passwords or cryptographic keys among nodes in an HPC system without special protection is a poor security practice. Systems that reach back to the user's terminal for access to the authenticator are possible, but only in fully interactive supercomputing where connectivity to the user's terminal can be guaranteed. Point solutions can be enabled for these use cases, such as software-based roles or self-signed certificates; however, they require significant expertise in digital certificates to configure. A more general solution that is both secure and easy to use is called for. This paper presents an overview of a solution implemented on the interactive, on-demand LLGrid computing system at MIT Lincoln Laboratory and its use to solve one such authentication problem.