Publications


Recommender systems for the Department of Defense and intelligence community

Summary

Recommender systems, which selectively filter information for users, can hasten analysts' responses to complex events such as cyber attacks. Lincoln Laboratory's research on recommender systems may bring the capabilities of these systems to analysts in both the Department of Defense and the intelligence community.

Finding malicious cyber discussions in social media

Summary

Today's analysts manually examine social media networks to find discussions concerning planned cyber attacks, attacker techniques and tools, and potential victims. Applying modern machine learning approaches, Lincoln Laboratory has demonstrated the ability to automatically discover such discussions from Stack Exchange, Reddit, and Twitter posts written in English.
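
The summary above describes a supervised text-classification task: label social media posts as either containing malicious cyber discussion or not. The sketch below illustrates that general pattern with a TF-IDF and logistic regression pipeline; the tiny example corpus, features, and model choice are illustrative assumptions, not the approach published by the Laboratory.

    # Minimal sketch: flag posts that may discuss malicious cyber activity.
    # The toy corpus and the TF-IDF + logistic regression pipeline are
    # illustrative assumptions, not the publication's method.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    posts = [
        "selling fresh botnet access, DM for price",        # malicious
        "new exploit kit bypasses that AV, who wants in",   # malicious
        "how do I configure TLS on nginx",                  # benign
        "best practices for unit testing in python",        # benign
    ]
    labels = [1, 1, 0, 0]  # 1 = possible malicious cyber discussion

    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
    model.fit(posts, labels)

    print(model.predict(["anyone selling access to compromised servers?"]))  # likely [1]

In practice, such a classifier would be trained on large labeled collections of Stack Exchange, Reddit, and Twitter posts and tuned for the heavy imbalance between malicious and benign discussions.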

Threat-based risk assessment for enterprise networks

Published in:
Lincoln Laboratory Journal, Vol. 22, No. 1, 2016, pp. 33-45.

Summary

Protecting enterprise networks requires continuous risk assessment that automatically identifies and prioritizes cyber security risks, enables efficient allocation of cyber security resources, and enhances protection against modern cyber threats. Lincoln Laboratory created a network security model to guide the development of such risk assessments and, for the most important cyber threats, designed practical risk metrics that can be computed automatically and continuously from security-relevant network data.

Cloudbreak: answering the challenges of cyber command and control

Published in:
Lincoln Laboratory Journal, Vol. 22, No. 1, 2016, pp. 60-73.

Summary

Lincoln Laboratory's flexible, user-centered framework for the development of command-and-control systems allows the rapid prototyping of new system capabilities. This methodology, Cloudbreak, effectively supports the insertion of new capabilities into existing systems and fosters user acceptance of new tools.

Secure embedded systems

Published in:
Lincoln Laboratory Journal, Vol. 22, No. 1, 2016, pp. 110-122.

Summary

Developers seek to seamlessly integrate cyber security within U.S. military system software. However, added security components can impede a system's functionality. System developers need a well-defined approach for simultaneously designing functionality and cyber security. Lincoln Laboratory's secure embedded system co-design methodology uses a security coprocessor to cryptographically ensure system confidentiality and integrity while maintaining functionality.
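
As a rough illustration of the cryptographic guarantees mentioned above, the sketch below uses authenticated encryption (AES-GCM) to protect both the confidentiality and the integrity of a data blob. It is a generic software example that assumes the widely available Python cryptography package; it does not represent the Laboratory's coprocessor design or co-design methodology.

    # Minimal sketch of authenticated encryption (AES-GCM), the kind of primitive
    # a security coprocessor can provide for confidentiality and integrity.
    # Generic illustration only, not the Laboratory's design.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)   # in practice, held by the coprocessor
    aesgcm = AESGCM(key)
    nonce = os.urandom(12)

    firmware_image = b"application code and configuration"
    associated_data = b"boot-stage=2"           # authenticated but not encrypted

    ciphertext = aesgcm.encrypt(nonce, firmware_image, associated_data)

    # Decryption raises InvalidTag if the ciphertext or metadata was tampered with.
    plaintext = aesgcm.decrypt(nonce, ciphertext, associated_data)
    assert plaintext == firmware_image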

Secure and resilient cloud computing for the Department of Defense

Summary

Cloud computing offers substantial benefits to its users: the ability to store and access massive amounts of data, on-demand delivery of computing services, the capability to widely share information, and the scalability of resource usage. Lincoln Laboratory is developing technology that will strengthen the security and resilience of cloud computing so that the Department of Defense can confidently deploy cloud services for its critical missions.

Building Resource Adaptive Software Systems (BRASS): objectives and system evaluation

Summary

As modern software systems continue inexorably to increase in complexity and capability, users have become accustomed to periodic cycles of updating and upgrading to avoid obsolescence, albeit at some cost in frustration. In the case of the U.S. military, having access to well-functioning software systems and underlying content is critical to national security, but updates are no less problematic than they are for civilian users and often demand considerable time and expense. To address these challenges, DARPA has announced a new four-year research project to investigate the fundamental computational and algorithmic requirements necessary for software systems and data to remain robust and functional in excess of 100 years. The Building Resource Adaptive Software Systems, or BRASS, program seeks to realize foundational advances in the design and implementation of long-lived software systems that can dynamically adapt to changes in the resources they depend upon and the environments in which they operate. MIT Lincoln Laboratory will provide the test framework and evaluation of proposed software tools in support of this vision.

Scalability of VM provisioning systems

Summary

Virtual machines and virtualized hardware have been around for over half a century. The commoditization of the x86 platform and its rapidly growing hardware capabilities have led to recent exponential growth in the use of virtualization, both in the enterprise and in high performance computing (HPC). The startup time of a virtualized environment is a key performance metric for HPC, where the runtime of any individual task is typically much shorter than the lifetime of a virtualized service in an enterprise context. In this paper, a methodology for accurately measuring startup performance on an HPC system is described. The startup overhead of three of the most mature, widely deployed cloud management frameworks (OpenStack, OpenNebula, and Eucalyptus) is measured to determine their suitability for workloads typically seen in an HPC environment. A 10x performance difference is observed between the fastest (Eucalyptus) and the slowest (OpenNebula) framework; this difference is primarily due to delays waiting on networking in the cloud-init portion of startup. The methodology and measurements presented should facilitate the optimization of startup across a variety of virtualization environments.
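
One simple way to frame the startup measurement described above is to time the interval from the provisioning request until the guest first answers on a network port such as SSH. The sketch below shows such a timing harness; launch_instance() is a hypothetical placeholder for the framework-specific launch call (OpenStack, OpenNebula, or Eucalyptus), and the polling logic is a generic assumption rather than the paper's instrumented methodology.

    # Minimal sketch: wall-clock VM startup latency, measured as the time from the
    # launch request until the guest accepts connections on its SSH port.
    import socket
    import time

    def launch_instance() -> str:
        """Hypothetical placeholder: submit a VM launch request, return the guest IP."""
        raise NotImplementedError("replace with the cloud framework's launch API")

    def seconds_until_reachable(host: str, port: int = 22, timeout: float = 600.0) -> float:
        start = time.monotonic()
        deadline = start + timeout
        while time.monotonic() < deadline:
            try:
                with socket.create_connection((host, port), timeout=2.0):
                    return time.monotonic() - start   # guest is up and serving SSH
            except OSError:
                time.sleep(0.5)                       # not booted yet; poll again
        raise TimeoutError(f"{host}:{port} not reachable within {timeout} s")

    if __name__ == "__main__":
        ip = launch_instance()
        print(f"startup time: {seconds_until_reachable(ip):.1f} s")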

Photothermal speckle modulation for noncontact materials characterization

Summary

We have developed a noncontact, photothermal materials characterization method based on visible-light speckle imaging. This technique is applied to remotely measure the infrared absorption spectra of materials and to discriminate materials based on their thermal conductivities. A wavelength-tunable (7.5-8.7 µm), intensity-modulated, quantum cascade pump laser and a continuous-wave 532 nm probe laser illuminate a sample surface such that the two laser spots overlap. Surface absorption of the intensity-modulated pump laser induces a time-varying thermoelastic surface deformation, resulting in a time-varying 532 nm scattering speckle field from the surface. The speckle modulation amplitude, derived from a series of visible camera images, is found to correlate with the amplitude of the surface motion. By tuning the pump laser's wavelength over a molecular absorption feature, the amplitude spectrum of the speckle modulation is found to correlate with the IR absorption spectrum. As an example, we demonstrate this technique for spectroscopic identification of thin polymeric films. Furthermore, by adjusting the modulation rate of the pump beam and measuring the associated modulation transfer to the visible speckle pattern, information about the thermal time constants of surface and subsurface features can be revealed. Using this approach, we demonstrate the ability to distinguish between different materials (including metals, semiconductors, and insulators) based on differences in their thermal conductivities.

Airspace flow rate forecast algorithms, validation, and implementation

Published in:
MIT Lincoln Laboratory Report ATC-428.

Summary

This report summarizes work performed by MIT Lincoln Laboratory during the period 1 February 2015 - 30 November 2015 focused on developing and improving algorithms to estimate the impact of convective weather on air traffic flows. The core motivation for the work is the need to improve strategic traffic flow management decision-making in the National Airspace System. The algorithms developed as part of this work translate multiple weather forecast products into a discrete airspace impact metric called permeability.