Would you like to hear a Lincoln Laboratory talk?

We are pleased to present technical seminars to interested college and university groups. To arrange a technical seminar, please call (781) 981-2465 or email [email protected] and provide a rank-ordered list of requested seminars, your preferred date/time options, and a description of the target audience.

We offer the seminars in the following areas: air traffic control; communication systems; cyber security and information sciences; engineering; homeland protection; human language technology; radar and signal processing; solid state devices, materials, and processes; space control technology; and systems and architectures. Please find related seminars and abstracts below.

Air Traffic Control

Technical Seminars offered in air traffic control.

Human-Systems Integration in Complex Support Systems

Dr. Hayley J. Davison Reynolds1
MIT Lincoln Laboratory

MIT Lincoln Laboratory has a successful history of integrating decision support systems into the air traffic control and emergency management domains, even though the introduction of new technologies often meets user resistance. Because of the complex nature of the work, air traffic controllers and emergency managers rely heavily on certain technologies and decision processes to maintain a safe and effective operating environment. This reliance on familiar technology and processes makes the introduction of a new tool into the environment a difficult task, even though the tool may ultimately improve decisions and raise the safety and/or efficiency of the operation. A poorly integrated system can leave users disappointed by information they have no way to act on or, more likely, can simply go unused despite its place in the array of available information systems; neither outcome yields the benefits for which the tool was designed.

In this seminar, a practical methodology for designing and fielding decision support systems that maximizes the potential of effective integration of the system into the users' operational context will be presented. Several examples of air traffic control and emergency management decision support prototype systems designed by Lincoln Laboratory, including the Route Availability Planning Tool, the Tower Flight Data Manager, and the Hurricane Evacuation decision support tool, will be described to demonstrate this process. Included in the presentation will be areas in which the designers ran into roadblocks in making the systems effective and the combination of qualitative and quantitative techniques used to help properly integrate the system in the field, yielding measurable operational benefits.

1 PhD, Aeronautical Systems and Applied Psychology, Massachusetts Institute of Technology


Integrating Unmanned Aircraft Systems Safely into the National Airspace System

Dr. Rodney E. Cole1
MIT Lincoln Laboratory

Unmanned aircraft systems (UAS), such as the Air Force's Global Hawk and Predator, are increasingly employed by the military and Department of Homeland Security in roles that require sharing airspace with civilian aircraft. Missions include pilot training, border patrol, highway and agricultural observation, and disaster management. Given the pressure for widespread UAS access to the national airspace and the risk of collision with passenger aircraft, UAS operators must integrate with manned aircraft with a very high degree of safety. The key to safe integration of UAS into the national airspace is the development and assessment of sense-and-avoid (SAA) technologies to replace the manned aircraft pilot's ability to see and avoid other aircraft.

MIT Lincoln Laboratory is conducting research to address safe and flexible UAS integration with commercial and general aviation aircraft. Research areas include the development of sophisticated computer models that simulate millions of encounters between UAS and civilian aircraft to characterize airspace hazards and collision rates. These models can be applied to assess the performance of SAA algorithms designed to maintain separation, or "well clear," between UAS and civilian aircraft while observing and adhering to established right-of-way rules. The Laboratory is also conducting groundbreaking research in the area of collision avoidance logic and is pursuing a probabilistic approach to collision avoidance that considers the uncertainty in pilot response to alerts and uncertainty in future states of the threat aircraft. This approach offers the potential to provide increased safety with decreased false alarms over conventional techniques and is a candidate for future traffic collision avoidance systems and UAS SAA applications.
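The encounter-modeling idea can be illustrated with a deliberately simplified Monte Carlo sketch (the geometry and numbers below are invented for illustration and bear no relation to the Laboratory's actual airspace encounter models):

```python
import random

def simulate_encounters(n_trials, nmac_radius=150.0, seed=0):
    """Toy Monte Carlo encounter model: sample a horizontal miss
    distance for each simulated encounter between a UAS and an
    intruder aircraft, and count how often it falls inside a
    notional near-mid-air-collision (NMAC) radius."""
    rng = random.Random(seed)
    nmacs = 0
    for _ in range(n_trials):
        # Assumed toy geometry: the two aircraft cross a shared fix
        # with a uniformly distributed lateral offset (meters).
        miss = abs(rng.uniform(-3000.0, 3000.0))
        if miss < nmac_radius:
            nmacs += 1
    return nmacs / n_trials  # estimated NMAC probability per encounter
```

A real encounter model replaces the uniform offset with statistically representative aircraft trajectories and runs millions of such trials to estimate collision rates with and without an SAA algorithm in the loop.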

The Laboratory is also working with the Department of Defense and Department of Homeland Security to develop ground-based sense-and-avoid (GBSAA) and airborne sense-and-avoid surveillance architectures to satisfy the Federal Aviation Administration's requirement for replacing the onboard pilot's see and avoid function. Under Department of Defense sponsorship, Lincoln Laboratory has deployed a service-oriented architecture GBSAA test bed that will be utilized in operational and simulation-over-live environments to collect data and operator feedback that can then be used to support future certification with the Federal Aviation Administration. This seminar will provide a broad overview of the Laboratory's efforts in UAS airspace integration and next-generation aircraft collision avoidance algorithms and will provide an overview of the GBSAA test bed that is under development for the Department of Defense.

1 PhD, Mathematics, University of Colorado–Boulder


Radar Detection of Aviation Weather Hazards

Dr. John Y. N. Cho1
MIT Lincoln Laboratory

Bad weather is a factor in many aviation accidents and incidents. Microburst, hail, icing, lightning, fog, and turbulence are atmospheric phenomena that can interfere with aircraft performance and a pilot's ability to fly safely. For safe and efficient operation of the air traffic system, it is crucial to continuously observe meteorological conditions and accurately characterize phenomena hazardous to aircraft. Radar is the most important weather-sensing instrument for aviation. This seminar will discuss technical advances that led to today's operational terminal wind-shear detection radars. An overview of recent and ongoing research to improve radar capability to accurately observe weather hazards to aviation will also be presented.

1 PhD, Electrical Engineering, Cornell University


System Design in an Uncertain World: Decision Support for Mitigating Thunderstorm Impacts on Air Traffic

Richard A. DeLaura1
MIT Lincoln Laboratory

Weather accounts for 70% of the cost of air traffic delays—about $28 billion annually—within the United States National Airspace System. Most weather-related delays occur during the summer months, when thunderstorms affect air traffic, particularly in the crowded Northeast. The task of air traffic management, complicated even in the best of circumstances, can become overwhelmingly complex as air traffic managers struggle to route traffic reliably through rapidly evolving thunderstorms. A new generation of air traffic management decision support tools promises to reduce air traffic delays by accounting for the potential effects of convective weather, such as thunderstorms, on air traffic flow. Underpinning these tools are models that translate high-resolution convective weather forecasts into estimates of impact on aviation operations.

This seminar will present the results of new research to develop models of pilot decision-making and air traffic capacity in the presence of thunderstorms. The models will be described, initial validation will be presented, and sources of error and uncertainty will be discussed. Finally, some applications of these models and directions for future research will be briefly described.

1 AB, Physics, Harvard University

Communication Systems

Technical Seminars offered in communication systems.

Balloon Communications Relay Systems and Technology

Dr. Ameya Agaskar1
MIT Lincoln Laboratory

Today's smart devices, such as phones and watches, have allowed us to connect all aspects of our lives. However, these devices need enough radio spectrum to enable communications. As more smart devices are adopted, the same frequency spectrum will likely need to be utilized by a growing number of physically close transmitters to communicate to a base station, causing congestion and interference. A robust solution to this spectrum congestion and general interference problem is urgently needed.

This seminar provides an overview of a sparse, large-aperture array of high-altitude balloon relays, in which a collection of low-cost balloon relays continuously covers a communication area of interest. Each balloon would relay received signals down to a base station, where the signals are processed using advanced adaptive beamforming techniques. The individual user signals are extracted and decoded while the interference is suppressed. The large aperture of the balloon array allows for the spatial separation of very close transmitters on the ground. Among the difficulties that were overcome is the fact that the balloons drift with time, so their locations may not be precisely known by the base station at any given moment. The seminar covers these conceptual difficulties and their solutions, design choices and tradeoffs, and actual beamforming results achieved for interference suppression, along with videos of the balloon concept during a test flight.
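The spatial-separation claim can be made concrete with a small delay-and-sum beamforming sketch for a uniform linear array (a toy narrowband model, not the balloon-array processing itself; the element count and spacing are arbitrary choices for the example):

```python
import cmath, math

def array_response(n_elems, d_over_lambda, steer_deg, source_deg):
    """Normalized gain of a delay-and-sum beamformer on a uniform
    linear array: weights are steered to `steer_deg`, and the
    response is evaluated toward a source at `source_deg`."""
    resp = 0j
    for n in range(n_elems):
        phase = 2.0 * math.pi * d_over_lambda * n
        arrival = cmath.exp(1j * phase * math.sin(math.radians(source_deg)))
        weight = cmath.exp(-1j * phase * math.sin(math.radians(steer_deg)))
        resp += weight * arrival
    return abs(resp) / n_elems

# A larger aperture narrows the beam: with 64 elements at
# half-wavelength spacing, a user at 10 degrees is passed at full
# gain while a transmitter only 4 degrees away is strongly attenuated.
```

The balloon array exploits exactly this effect, except that its "elements" are kilometers apart, so the beam is narrow enough to separate transmitters standing close together on the ground.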

1 PhD, Engineering Sciences, Harvard University


Diversity in Air-to-Ground Lasercom: The FOCAL Demonstration

Dr. Frederick G. Walther1
MIT Lincoln Laboratory

Laser communications (lasercom) provides significant advantages compared to RF communications, including a large, unregulated bandwidth and high beam directionality for free-space links. These advantages provide capabilities for high (multi-gigabits per second [multi-Gb/s]) data transfer rates; reduced terminal size, weight, and power; and a degree of physical link security against out-of-beam interferers or detectors. This seminar addresses the key components of lasercom system design, including modeling and simulation of atmospheric effects, link budget development, employment of spatial and temporal diversity techniques to mitigate signal fading due to scintillation, and requirements for acquisition and tracking system performance. Results from recent flight demonstrations show stable tracking, rapid reacquisition after cloud blockages, and high data throughput for a multi-Gb/s communications link out to 80 kilometers. Potential technologies for further development include a compact optical gimbal that has low size, weight, and power, and more efficient modem and coding techniques to extend range and/or data rate.
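As a simplified illustration of link budget development, the sketch below tallies gains and losses in decibels for a notional free-space optical link (all parameter values are invented for the example and are not taken from the flight demonstrations described above):

```python
import math

def link_margin_db(tx_power_dbm, tx_gain_db, rx_gain_db, range_km,
                   wavelength_nm, misc_loss_db, req_power_dbm):
    """Free-space link margin in dB: transmit power plus antenna/
    telescope gains, minus free-space path loss and miscellaneous
    losses, compared against the required receiver power."""
    wavelength_m = wavelength_nm * 1e-9
    range_m = range_km * 1e3
    # Free-space path loss: 20*log10(4*pi*R / lambda)
    fspl_db = 20.0 * math.log10(4.0 * math.pi * range_m / wavelength_m)
    rx_power_dbm = (tx_power_dbm + tx_gain_db + rx_gain_db
                    - fspl_db - misc_loss_db)
    return rx_power_dbm - req_power_dbm

# Hypothetical 1550 nm link at 80 km; positive margin is the headroom
# available to ride through scintillation-induced fades.
```

In practice, part of the computed margin is reserved to absorb fading, which is why the diversity techniques covered in the seminar matter: they reduce the margin that must be held in reserve.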

1 PhD, Physics, Massachusetts Institute of Technology


Dynamic Link Adaptation for Satellite Communications

Dr. Huan Yao1
MIT Lincoln Laboratory

Future protected military satellite communications will continue to use high-transmission frequencies to capitalize on large amounts of available bandwidth. However, the data flowing through these satellites will transition from the circuit-switched traffic of today's satellite systems to Internet-like packet traffic. One of the main differences in migrating to packet-switched communications is that the traffic will become bursty (i.e., the data rate from particular users will not be constant). The variation in data rate is only one of the potential system variations. At the frequencies of interest, rain and other weather phenomena can introduce significant path attenuation for relatively short time periods. Current protected satellite communications systems are designed with sufficient link margins to provide a desired availability under such degraded path conditions. These systems do not have provisions to use the excess link margins for additional capacity when weather conditions are good. The focus of this seminar is the design of a future satellite system that autonomously reacts to changes in link conditions and offered traffic. This automatic adaptation drastically improves the overall system capacity and the service that can be provided to ground terminals.
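The idea of autonomously adapting to link conditions can be sketched as a simple rate-selection policy (the mode table and SNR thresholds below are hypothetical, not values from the system under discussion):

```python
def select_rate(link_snr_db, rate_table):
    """Return the highest data rate (Mb/s) whose required SNR the
    current link supports; 0.0 means even the most robust mode fails.
    rate_table entries are (rate_mbps, required_snr_db) pairs."""
    feasible = [rate for rate, req in rate_table if link_snr_db >= req]
    return max(feasible) if feasible else 0.0

# Hypothetical mode table: clear-sky margin supports the high-rate
# mode, while a rain fade pushes the terminal down to a robust
# low-rate mode instead of dropping service entirely.
MODES = [(2.0, 3.0), (8.0, 9.0), (32.0, 15.0)]
```

A fixed-margin design effectively runs the lowest row of such a table at all times; adaptation converts the excess clear-sky margin into capacity.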

1 PhD, Electrical Engineering, Massachusetts Institute of Technology



Implementation of an Aerial High-Capacity Backbone

Dr. Rahul Amin1
MIT Lincoln Laboratory

With the proliferation of mobile devices and cloud computing architectures, the demand for wireless capacity keeps growing at an exponential rate. There is strong demand for ubiquitous, high-data-rate connectivity for users anywhere in the world. To keep up with these expectations, developed countries are upgrading their network infrastructure with advanced 5G cellular and highly dense Wi-Fi deployments. However, there are still large pockets of the world where network infrastructure is unavailable or severely lacking. As of 2019, over 50% of the world's population still did not have internet access because of this lack of infrastructure. To provide wireless connectivity in such regions, an aerial high-capacity backbone (HCB) is one of the promising solutions. The aerial HCB provides a high-data-rate, low-latency wireless solution that would supplement the traditional satellite communications infrastructure, which can easily become oversubscribed. The aerial HCB consists of several aircraft or unmanned aerial vehicles (UAVs), each flying with a directional, high-data-rate communication system mounted underneath. Ground users connect directly to the aerially mounted communication system, which in turn is connected to ground infrastructure either directly or through a multi-hop path. Depending on the geographical coverage area, users are connected to existing ground infrastructure through a multi-hop path made up of several ground-to-air, air-to-air, and air-to-ground links.

Developing and deploying the aerial HCB is a challenging task due to the dynamism of the aerial nodes. Directional radios are required to achieve high data rates for the HCB. Once a directional radio is selected, a topology management and a routing solution that rapidly converge have to be implemented to achieve high end-to-end path availability through the aerial HCB. The topology management solution helps establish (or re-establish) ground-to-air, air-to-air, and air-to-ground links by making (node, antenna) pairing decisions and instructing the antennas to point in the required direction. Once all the links in the aerial HCB are made based on the topology manager’s decisions, the routing solution needs to establish a loop-free end-to-end data path between users and ground infrastructure. In this seminar, we present the topology management framework and software-based routing solutions that were implemented for the aerial HCB. In addition, we provide an overview of the tools that were used to develop these solutions that let us go from a lab development environment to actual field tests using the same code base.
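The topology manager's (node, antenna) pairing decisions can be illustrated with a greedy sketch (a hypothetical stand-in, not the implemented framework): rank candidate links by estimated quality and repeatedly commit the best remaining link whose endpoints still have a free directional antenna.

```python
def pair_links(candidates, antennas_per_node=2):
    """Greedy topology-management sketch. `candidates` are
    (quality, node_a, node_b) tuples; each node has a limited number
    of directional antennas, so committing a link consumes one
    antenna at each endpoint."""
    free = {}
    chosen = []
    for quality, a, b in sorted(candidates, reverse=True):
        free.setdefault(a, antennas_per_node)
        free.setdefault(b, antennas_per_node)
        if free[a] > 0 and free[b] > 0:
            free[a] -= 1
            free[b] -= 1
            chosen.append((a, b))
    return chosen

# Ground station g1, aerial nodes a1/a2, ground station g2: the
# committed links form the ground-to-air, air-to-air, and
# air-to-ground hops of a multi-hop backbone path.
links = pair_links([(0.9, "g1", "a1"), (0.8, "a1", "a2"),
                    (0.7, "a1", "a3"), (0.6, "a2", "g2")])
```

Once such links are established, the routing layer's job is to find a loop-free end-to-end path over them; the greedy pass above says nothing about routing, which is why the two solutions are developed together.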

1 PhD, Computer Engineering, Clemson University


Laser Communications to Enable Space Exploration

Dr. Farzana I. Khatri1
MIT Lincoln Laboratory

Traditionally, communications between Earth and space have relied on RF systems, which have been in use since the Apollo era, when sensor technology was primitive and the Internet did not exist. Today, commercial companies have deployed satellites with sensors that generate many terabytes of data each day, but only a fraction of this data is transmitted back to Earth due to communications bandwidth constraints. Furthermore, as humans venture deeper into space, higher communications bandwidth will be required to support them. Free-space laser communications, or lasercom, offers a high-bandwidth and low-size, -weight, and -power solution to the space bandwidth bottleneck by using the same lasers and techniques that revolutionized fiber-optic communications in the 1990s. This talk will describe Lincoln Laboratory's current efforts in designing and deploying lasercom systems with NASA, the current efforts to develop a strong industry base in lasercom, and upcoming lasercom missions.

1 PhD, Electrical Engineering, Massachusetts Institute of Technology


Research Challenges in Airborne Networks and Communications

Dr. Bow-Nan Cheng1
MIT Lincoln Laboratory

In recent years, with the emergence of ubiquitous communications and the increasing availability of unmanned aerial vehicles, there has been an increased interest in building and leveraging airborne networks to facilitate data dissemination and coordination. Although the commercial world has made significant progress in building and deploying ground wireless networks, several military needs are currently not being met by available and emerging commercial technologies, particularly for use in airborne applications.

This talk presents four major airborne networking domains, examines unique domain characteristics, and identifies interesting research challenges for improving performance and scalability in the context of military airborne systems. Topics covered include techniques for improving physical layer performance in the presence of strong interference, antenna capabilities and limitations resulting from the antenna's mounting location on the aircraft surface, and methods for integrating radio-link state information with the platform network routers for improved routing decisions. The talk also presents an overview of some work performed at MIT Lincoln Laboratory in the past few years to address the research challenges, and to identify additional open areas for research, in airborne networking.

1 PhD, Computer Science, Rensselaer Polytechnic Institute


Undersea Laser Communication — The Next Frontier

Dr. Nicholas Hardy1
MIT Lincoln Laboratory

Undersea communication is of critical importance to many research and military applications, including oceanographic data collection, pollution monitoring, offshore exploration, and tactical surveillance. Undersea communication functions have traditionally been accomplished with acoustic and low-frequency radio communication. These technologies, however, limit the speed of the undersea platform, cap data rates at a few bits per second (low-frequency radio) or multiple kilobits per second (acoustic), and radiate over a broad area. These factors lead to power inefficiencies, potential crosstalk, and multipath degradation. Undersea optical communication has the potential to mitigate these capability shortfalls, providing very-high-data-rate covert communication.

This seminar provides an overview of applications that might benefit from high-rate undersea communication and technologies, and it reports on new research into finding the ultimate performance limit of undersea optical communication. Using sophisticated physics models of laser and sunlight undersea propagation, Lincoln Laboratory researchers have found that gigabit-per-second links over appreciable ranges, in excess of 100 meters, are possible in much of the world's oceans. This capability promises to transform undersea communications. Finally, the seminar presents architectures based on currently realizable technology that can achieve the predicted ultimate performance limit.

1 PhD, Electrical Engineering, Massachusetts Institute of Technology


Waveform Design for Airborne Networks

Dr. Frederick J. Block1
MIT Lincoln Laboratory

Airborne networks are an integral part of modern network-centric military operations. These networks operate in a unique environment that poses many research challenges. For example, nodes are much more mobile in airborne networks than in ground-based networks and are often separated by great distances that cause long delays and large propagation losses. Additionally, the altitude of an airborne receiver potentially places it within line of sight of many transmitters located over a wide area. Some of these transmitters may be part of the same network as the receiver but may still cause multiple-access interference, which complicates channel access protocol design for airborne networks. Other transmitters that are outside the network may also cause interference. The waveform design must specifically account for this environment to achieve the required level of performance. Physical, link, and network layer techniques for providing high-rate, reliable airborne networks will be discussed.

1 PhD, Electrical Engineering, Clemson University

Cyber Security and Information Sciences

Technical Seminars offered in cyber security and information sciences.

Addressing the Challenges of Big Data Through Innovative Technologies

Dr. Vijay Gadepally1 & Dr. Jeremy Kepner2
MIT Lincoln Laboratory

The ability to collect and analyze large amounts of data is increasingly important within the scientific community. The growing gap between the volume, velocity, and variety of data available and users' ability to handle this deluge calls for innovative tools to address the challenges imposed by what has become known as big data. MIT Lincoln Laboratory is taking a leading role in developing a set of tools to help solve the problems inherent in big data.

Big data's volume stresses the storage, memory, and compute capacity of a computing system and requires access to a computing cloud. Choosing the right cloud is problem specific. Currently, four multibillion-dollar ecosystems dominate the cloud computing environment: enterprise clouds, big data clouds, SQL database clouds, and supercomputing clouds. Each cloud ecosystem has its own hardware, software, communities, and business markets. The broad nature of big data challenges makes it unlikely that one cloud ecosystem can satisfy all needs, and solutions are likely to require the tools and techniques from more than one cloud ecosystem. The MIT SuperCloud was developed to provide one such solution. To our knowledge, the MIT SuperCloud is the only deployed cloud system that allows all four ecosystems to co-exist without sacrificing performance or functionality.

The velocity of big data stresses the rate at which data can be absorbed and meaningful answers can be produced. Through an initiative led by the National Security Agency (NSA), the Common Big Data Architecture (CBDA) was developed for the U.S. government. The CBDA is based on the Google Big Table NoSQL approach and is now in wide use. Lincoln Laboratory was instrumental in the development of the CBDA and is a leader in adapting the CBDA to a variety of big data challenges. The centerpieces of the CBDA are the NSA-developed Apache Accumulo database (capable of millions of entries per second) and the Lincoln Laboratory–developed Dynamic Distributed Dimensional Data Model (D4M) schema.

Finally, big data variety may present both the largest challenge and the greatest set of opportunities for supercomputing. The promise of big data is the ability to correlate heterogeneous data to generate new insights. The combination of Apache Accumulo and D4M technologies allows vast quantities of highly diverse data (bioinformatics, cyber-relevant data, social media data, etc.) to be automatically ingested into a common schema that enables rapid query and correlation of elements.
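The common-schema idea can be sketched with a minimal associative-array structure in the spirit of D4M (an illustration only; the real D4M library and Accumulo tables differ substantially):

```python
class Assoc:
    """Minimal associative-array sketch: data are stored as sparse
    (row, column) -> value triples, so heterogeneous records map
    into one common table that supports query by row or column."""
    def __init__(self, triples):
        self.data = {(r, c): v for r, c, v in triples}
    def row(self, r):
        return {c: v for (rr, c), v in self.data.items() if rr == r}
    def col(self, c):
        return {r: v for (r, cc), v in self.data.items() if cc == c}

# "Exploded" schema: fold each field's value into the column key, so
# any record type (cyber logs, social media, bioinformatics) fits
# the same table and correlates through shared column keys.
logs = Assoc([
    ("evt1", "src|10.0.0.5", 1),
    ("evt1", "dst|10.0.0.9", 1),
    ("evt2", "src|10.0.0.5", 1),
])
```

A column query then acts as a correlation: asking for `src|10.0.0.5` immediately returns every event, of any type, touching that address.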

1 PhD, Electrical and Computer Engineering, The Ohio State University
2 PhD, Physics, Princeton University


CASCADE: Cyber Adversarial Scenario Modeling and Automated Decision Engine

Trang Nguyen1
MIT Lincoln Laboratory

Cyber systems are complex and include numerous heterogeneous and interdependent entities and actors—such as mission operators, network users, attackers, and defenders—that interact within a network environment. Decisions must be made to secure cyber systems, mitigate threats, and minimize mission impact. Administrators must consider how to optimally partition a network using firewalls, filters, and other mechanisms to restrict the movement of an attacker who gains a foothold on the network and to mitigate the damage that the attacker can cause. Current cyber decision support focuses on gathering network data, processing these data into knowledge, and visualizing the resultant knowledge for cyber situational awareness. However, as the data become more readily obtained, network administrators face the same problem that business analysts have faced for some time: more data or knowledge does not, by itself, lead to better decisions. There is a need for cyber decision support that can automatically recommend timely and sound data-driven courses of action. Currently, decisions about partitioning a network are made by analysts who use subjective judgment that is sometimes informed by best practices. We propose a decision system that automatically recommends an optimal network-partitioning architecture to maximize security posture and minimize mission impact.
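A toy version of the partitioning decision illustrates the kind of optimization being proposed (the objective, inputs, and brute-force search below are invented for illustration and are not the CASCADE engine):

```python
from itertools import combinations

def best_cut(edges, attack_paths, mission_flows, budget,
             w_sec=1.0, w_mis=1.0):
    """Choose up to `budget` network edges to block so as to cut the
    most attacker paths while breaking the fewest mission-critical
    flows. Paths and flows are sets of edges; blocking any edge in a
    set breaks it. Exhaustive search, for illustration only."""
    def score(blocked):
        cut = sum(1 for p in attack_paths if p & blocked)
        broken = sum(1 for f in mission_flows if f & blocked)
        return w_sec * cut - w_mis * broken
    best_score, best_set = score(frozenset()), frozenset()
    for k in range(1, budget + 1):
        for choice in combinations(edges, k):
            blocked = frozenset(choice)
            if score(blocked) > best_score:
                best_score, best_set = score(blocked), blocked
    return best_set
```

Real networks make exhaustive search infeasible, which is precisely why an automated decision engine, rather than analyst judgment alone, is needed to navigate the security-versus-mission tradeoff.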

1 MS, Columbus State University


Cryptographically Secure Computation

Dr. Emily Shen1
MIT Lincoln Laboratory

In today's big data world, the ability to collect, share, and analyze data has led to unprecedented capabilities as well as unprecedented privacy concerns. Often, people or organizations would like to collaborate and obtain results from their collective data but without divulging their individual data. Current techniques to enable such sharing require that the parties either entrust each other with their sensitive data or find a mutually trusted third party who can perform the computation on their behalf; these approaches may be undesirable or impractical.

Secure multiparty computation (MPC) is a cryptographic technique that enables parties to perform joint computations without revealing their sensitive inputs and without using a trusted third party. MPC not only can provide privacy for existing applications but also can enable new applications that are currently not possible. For example, MPC can be used to enable collaboration and information exchange between partner organizations through selective data sharing and to guarantee security of computation performed in an untrusted environment such as a cloud. In this talk, we illustrate the cryptographic ideas behind MPC. We then describe the design and optimization of MPC for a collaborative anomaly detection algorithm.
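The core idea behind MPC can be illustrated with textbook additive secret sharing, in which parties jointly compute a sum without any party revealing its input (a minimal sketch; the protocols discussed in the talk are far more involved):

```python
import random

MODULUS = 2**31 - 1

def share(value, n_parties, rng):
    """Split `value` into n additive shares that sum to it modulo
    MODULUS; any n-1 shares together reveal nothing about the value."""
    shares = [rng.randrange(MODULUS) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MODULUS)
    return shares

def secure_sum(all_inputs, n_parties=3):
    """Each input is shared across the parties; each party locally
    adds the shares it holds, and only the total is reconstructed."""
    rng = random.Random(7)
    held = [0] * n_parties
    for x in all_inputs:
        for i, s in enumerate(share(x, n_parties, rng)):
            held[i] = (held[i] + s) % MODULUS
    return sum(held) % MODULUS
```

Because each party sees only uniformly random shares, the individual inputs stay private even though the correct sum emerges, which is the property MPC generalizes to arbitrary computations.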

1 PhD, Electrical Engineering and Computer Science, Massachusetts Institute of Technology


Evaluating Cyber Moving Target Techniques

Dr. Hamed Okhravi1
MIT Lincoln Laboratory

The concept of cyber moving target (MT) defense has been identified as one of the game-changing themes to rebalance the landscape of cyber security. MT techniques make cyber systems "a moving target," that is, less static, less homogeneous, and less deterministic in order to create uncertainty for attackers.

Although many MT techniques have been proposed in the literature, little has been done on evaluating their effectiveness, benefits, and weaknesses. This seminar discusses the evaluation of the wide range of MT techniques. First, a qualitative assessment studies the potential benefits, gaps, and weaknesses for each category of MT strategy. This step identifies major gaps in order to guide future research and prototyping efforts. The findings of a case study on code-reuse defenses are presented. For the MT techniques that have been identified as potentially more beneficial in the qualitative assessment, a deeper quantitative assessment is performed by examining real exploits. Next, the seminar discusses an operational assessment of an MT technique inside a larger system and illustrates how important parameters of such techniques can be monitored for improved effectiveness. Finally, possible directions for future work in this domain are outlined.

1 PhD, Electrical and Computer Engineering, University of Illinois at Urbana-Champaign


Experiences in Cyber Security Education:
The MIT Lincoln Laboratory Capture-the-Flag Exercise

Timothy R. Leek1
MIT Lincoln Laboratory
Dr. Nickolai Zeldovich2
MIT Computer Science and Artificial Intelligence Laboratory

Many popular and well-established cyber security capture-the-flag (CTF) exercises are held each year at a variety of settings, including universities and semiprofessional security conferences. The CTF format also varies greatly, ranging from linear puzzle-like challenges to team-based offensive and defensive free-for-all hacking competitions. While these events are exciting and important as contests of skill, they offer limited educational opportunities. In particular, since participation requires considerable a priori domain knowledge and practical computer security expertise, the majority of typical computer science students are excluded from taking part in these events. The goal in designing and running the MIT Lincoln Laboratory CTF was to make the experience accessible to a wider community by providing an environment that would not only test and challenge the computer security skills of the participants but also educate and prepare those without extensive prior expertise. This seminar presents the Laboratory's self-consciously educational and open CTF, including discussions of our teaching methods, game design, scoring measures, logged data, and lessons learned.

1 MS, Computer Science, University of California–San Diego
2 PhD, Computer Science, Stanford University


Modeling to Improve the Science of Insider Threat Detection

John Holodnak1
MIT Lincoln Laboratory

Recent high-profile security breaches have highlighted the importance of insider threat detection systems for cyber security. Most of these detection systems rely on the manual or semi-automated analysis of audit logs to identify malicious activity. However, issues such as high false-positive rates and concerns over data privacy make it difficult to predict performance of these systems within an enterprise environment. These and other issues limit an organization’s ability to effectively apply these tools.

This seminar will present an approach that leverages enterprise-level modeling to predict the performance of insider threat detection systems. We provide a proof of concept of our modeling approach by applying it to a synthetic dataset and comparing its predictions to the ground truth. The output of these models that predict performance can enable enterprises to compare tools and ultimately allow them to make better informed decisions about which insider threat detection systems to deploy.
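A one-line base-rate calculation shows why false-positive rates dominate deployment decisions (the detector numbers below are assumptions for illustration, not measured results):

```python
def alert_precision(tpr, fpr, base_rate):
    """Probability that an alert corresponds to a real insider event,
    given the detector's true- and false-positive rates and the base
    rate of malicious activity (Bayes' rule)."""
    true_alerts = tpr * base_rate
    false_alerts = fpr * (1.0 - base_rate)
    return true_alerts / (true_alerts + false_alerts)

# Even a detector with a 99% detection rate and a 1% false-positive
# rate produces mostly false alarms when only 1 in 10,000 audited
# events is actually malicious.
```

Enterprise-level modeling supplies exactly the quantities this calculation needs, such as realistic base rates and event volumes, before a tool is ever deployed.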

1 PhD, Mathematics, North Carolina State University


Multicore Programming in pMatlab® Using Distributed Arrays

Dr. Jeremy Kepner1
MIT Lincoln Laboratory

MATLAB is one of the most commonly used languages for scientific computing, with approximately one million users worldwide. Many of the programs written in MATLAB can benefit from the increased performance offered by multicore processors and parallel computing clusters. The MIT Lincoln Laboratory pMatlab library allows high-performance parallel programs to be written quickly by using the distributed arrays programming paradigm. This talk provides an introduction to distributed arrays programming and will describe the best programming practices for using distributed arrays to produce programs that perform well on multicore processors and parallel computing clusters. These practices include understanding the concepts of parallel concurrency versus parallel data locality, using Amdahl's Law, and employing a well-defined design-code-debug-test process for parallel codes.
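Amdahl's Law itself is compact enough to state directly: for a fixed workload, the serial fraction bounds the achievable speedup no matter how many cores are added (Python is used here for the sketch, though the talk's context is MATLAB).

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Amdahl's Law: overall speedup on n_cores when only
    `parallel_fraction` of the serial runtime can be parallelized."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

# A program that is 90% parallelizable speeds up 6.4x on 16 cores,
# but can never exceed 10x no matter how many cores are added.
```

This bound is why the best practices above emphasize minimizing the serial portion of a distributed-array program rather than simply adding processors.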

1 PhD, Physics, Princeton University


New Approaches to Automatic Speaker Recognition and Forensic Considerations

Dr. Joseph P. Campbell1 & Dr. Pedro A. Torres-Carrasquillo2
MIT Lincoln Laboratory

Recent gains in the performance of automatic speaker recognition systems have been obtained by new methods in subspace modeling. This talk presents the development of speaker recognition systems ranging from traditional approaches, such as Gaussian mixture modeling, to novel state-of-the-art systems employing subspace techniques, such as factor analysis and iVector methods. This seminar also covers research on the means to exploit high-level information. For example, idiosyncratic word usage and speaker-dependent pronunciation are high-level features for recognizing speakers. These high-level features can be combined with conventional features for increased accuracy. The seminar presents new methods to increase the robustness and improve calibration of speaker recognition systems by addressing common factors in the forensic domain that degrade recognition performance. We describe MIT Lincoln Laboratory's VOCALINC system and its application to automated voice comparison of speech samples for law enforcement investigation and forensic applications. The talk concludes with appropriate uses of this technology, especially focusing on cautions regarding forensic-style applications, and a look at this technology's future directions.

1 PhD, Electrical Engineering, Oklahoma State University
2 PhD, Electrical Engineering, Michigan State University


Quantifying Lateral Movement Complexity in Enterprise Networks

Dr. Robert Lychev1
MIT Lincoln Laboratory

Most, if not all, of the biggest cyber security breaches involve attackers who make use of lateral movement, yet there is still no consensus among researchers and practitioners on how to quantify lateral movement at device-level and network-level granularities. Although there has been considerable work on quantifying cyber security risk in general, previous efforts propose methods that do not sufficiently address lateral movement, have prohibitive data requirements, or both.

To this end, we have designed and implemented a framework for quantifying lateral movement complexity of enterprise networks, called the Reachability Assessment and Testing System (RATS). RATS has flexible data requirements. Given a succinct description of a network topology, RATS provides lateral movement complexity assessments at device granularity and for the whole network. RATS also incorporates a module for providing short-term security suggestions that network operators could use to take appropriate security actions and make lateral movement in their networks harder. RATS leverages the Common Cyber Environment Representation, which allows for a user-friendly way to specify network topologies and makes it convenient to integrate with cyber range capabilities, such as LARIAT. The latter option allows defenders to study effects of realistic attacks in an emulated environment to better inform their security decisions.
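The internals of RATS are not detailed in this abstract, but one common proxy for device-level lateral-movement complexity is the minimum number of hops an attacker needs from a compromised foothold to a target device. Here is a minimal sketch using breadth-first search over a reachability map; the device names and topology are invented for illustration.

```python
from collections import deque

def min_hops(reachability, source, target):
    """Fewest lateral moves from source to target, where
    reachability[d] lists the devices directly reachable from d.
    Returns None if the target cannot be reached at all."""
    seen = {source: 0}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        if node == target:
            return seen[node]
        for nxt in reachability.get(node, []):
            if nxt not in seen:
                seen[nxt] = seen[node] + 1
                queue.append(nxt)
    return None

# Hypothetical topology: workstation -> file server -> domain controller
topology = {
    "workstation": ["fileserver"],
    "fileserver": ["workstation", "domaincontroller"],
    "domaincontroller": [],
}
```

Under this proxy, a hardening suggestion is anything that increases the hop count to high-value devices, or disconnects them entirely.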

Finally, the RATS modular design allows security analysts to integrate their own reachability metrics and security-recommendation engines to conduct comparisons before taking any security actions.

1 PhD, Computer Science, Georgia Institute of Technology


Quantitative Evaluation of Moving Target Technology

Paula Donovan1
MIT Lincoln Laboratory

Robust, quantitative measurement is critically needed to assess the utility, impact, and cost of cyber technologies. Our work addresses this need by developing metrics and experimental methodologies for evaluating cyber technologies. The technological focus for this presentation is moving target technology, with the goal of quantitatively measuring resiliency and agility in controlled, repeatable experiments. The evaluation aims to identify one or more specific aspects that each technique is intended to improve, then to quantitatively measure the technique’s ability to realize that improvement with respect to a relevant, realistic threat model. This presentation will describe this approach to quantitative evaluation, including methodology and metrics; results of analysis, simulation, and experiments; and a series of lessons learned.

1 MS, Applied Mathematics/Computer Science, University of Massachusetts Amherst


Secure and Resilient Cloud Computing for the Department of Defense

Dr. Nabil A. Schear1
MIT Lincoln Laboratory

The Department of Defense and the Intelligence Community are rapidly moving to adopt cloud computing to achieve substantial user benefits, such as the ability to store and access massive amounts of data, on-demand delivery of computing services, the capability to widely share information, and the scalability of resource usage. However, cloud computing both exacerbates existing cyber security challenges and introduces new challenges.

Today, most cloud data remain unprotected and therefore vulnerable to theft, deception, or destruction. Furthermore, the homogeneity of cloud software and hardware increases the risk of a single cyberattack on the cloud, causing widespread impact. To address these challenges, Lincoln Laboratory is developing technology that will strengthen the security and resilience of cloud computing so that the Department of Defense and Intelligence Community can confidently deploy cloud services for their critical missions. This seminar will detail the cyber security threats to cloud computing, the Laboratory's approach to cloud security and resilience, and results from several novel technologies demonstrating enhanced cyber capabilities with low overhead.

1 PhD, Computer Science, University of Illinois at Urbana-Champaign


Signal Processing for the Measurement of Characteristic Voice Quality

Dr. Nicolas Malyska1 & Dr. Thomas F. Quatieri2
MIT Lincoln Laboratory

The quality of a speaker's voice communicates to a listener information about many characteristics, including the speaker's identity, language, dialect, emotional state, and physical condition. These characteristic elements of a voice arise due to variations in the anatomical configuration of a speaker's lungs, voice box, throat, tongue, mouth, and nasal airways, and the ways in which the speaker moves these structures. The voice box, or larynx, is of particular interest in voice quality, as it is responsible for generating variations in the excitation source signal for speech.

In this seminar, we will discuss mechanisms by which voice-source variations are generated, appear in the acoustic signal, and are perceived by humans. Our focus will be on using signal processing to capture acoustic phenomena resulting from the voice source. The presentation will explore several applications that build upon these measurement techniques, including turbulence-noise component estimation during aperiodic phonation, automatic labeling for regions of irregular phonation, and the analysis of pitch dynamics.
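As a simplified illustration of the kind of measurement this work builds on (not the Laboratory's own algorithms), a classical autocorrelation-based estimator recovers the fundamental frequency that underlies pitch-dynamics analysis: the lag at which a voiced signal best matches a delayed copy of itself corresponds to one glottal cycle.

```python
import math

def estimate_pitch(signal, sample_rate, f_min=50.0, f_max=500.0):
    """Estimate fundamental frequency by finding the lag that
    maximizes the autocorrelation within a plausible pitch range."""
    lag_min = int(sample_rate / f_max)
    lag_max = int(sample_rate / f_min)
    best_lag, best_corr = lag_min, float("-inf")
    for lag in range(lag_min, lag_max + 1):
        corr = sum(signal[i] * signal[i - lag]
                   for i in range(lag, len(signal)))
        if corr > best_corr:
            best_lag, best_corr = lag, corr
    return sample_rate / best_lag

# Synthetic 200 Hz tone sampled at 8 kHz (0.2 s of signal)
fs, f0 = 8000, 200.0
tone = [math.sin(2 * math.pi * f0 * n / fs) for n in range(1600)]
```

Real voiced speech is far less clean than a sine wave, which is why the seminar's techniques for aperiodic and irregular phonation go well beyond this sketch.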

1 PhD, Health Sciences and Technology, Massachusetts Institute of Technology
2 ScD, Electrical Engineering, Massachusetts Institute of Technology


Speech Enhancement Using Deep Neural Networks

Dr. Jonas Borgstrom1 & Dr. Michael Brandstein2
MIT Lincoln Laboratory

Speech signals captured by law enforcement or the intelligence community are typically degraded by additive noise and reverberation. Speech enhancement aims to suppress such distortions, leading to improved intelligibility and reduced analyst fatigue. This talk will review single-channel speech enhancement and discuss recent advances made by Lincoln Laboratory that leverage deep neural networks.

1 PhD, Electrical Engineering, University of California Los Angeles
2 PhD, Electrical Engineering, Brown University


Web Security Origins and Evolution

Mary Ellen Zurko1
MIT Lincoln Laboratory

Computer systems and software should have security designed in from the start, giving them a firm foundation for the necessary security goals and principles based on confidentiality, integrity, and availability. After that, our systems and software evolve: functionality is added, new technologies are integrated, and attacks occur and evolve right along with our systems. Our security must evolve with our systems and with the attacks. Unfortunately, there are few published papers or case studies on the successful application of this security life-cycle model.

In this seminar, a successful distributed system, the World Wide Web, serves as a case study in security evolution. Initial security features will be reviewed, with lessons learned from those that were adopted and those that were not. The transition to attack response brought a different set of features, and some early security choices turned out not to meet the security goals set out for them. New features and technologies brought basic shifts in the security assumptions made in early design decisions. More recent attacks will be discussed, especially those enabled by current use and development of the Web, which were largely unimagined during the initial design of the technology and its protections. This perspective provides lessons on security considerations for systems of all sizes and at all stages of maturity.

1 MS, Computer Science, Massachusetts Institute of Technology


Engineering

Technical Seminars offered in engineering.

Advanced Structures and Materials Using Additive Manufacturing

Derek Straub1
MIT Lincoln Laboratory

MIT Lincoln Laboratory is utilizing additive manufacturing processes to create novel engineering solutions and prototypes for research and national security applications. Additive manufacturing offers the ability to build unique structures with complex external and internal geometries, producing strong, lightweight, multifunctional structures that are not possible to create using traditional manufacturing techniques. High-performance plastics are employed in a broad variety of engineering applications, including unmanned aerial vehicles and airborne sensors. The use of metals, including aluminum and titanium for high-performance lightweight cooling systems, is increasing, and research teams are developing techniques to improve mechanical properties so that printed parts approach their wrought counterparts. The development of new materials offering increased mechanical performance to meet demanding size, weight, performance, and environmental requirements for next-generation prototype systems is underway.

1 MEng, Manufacturing, Massachusetts Institute of Technology


Improved Resilience for the Electric Grid

Erik R. Limpaecher1 & Nicholas Judson2
MIT Lincoln Laboratory

The U.S. electric power grid has been described as the largest interconnected machine on Earth. Its continuous and stable operation has become essential to the day-to-day functioning of our economy and our households. However, the grid is undergoing a period of rapid change while facing an increasing number of threats. The national grid is composed of aging and outdated infrastructure, and with the addition of distributed intermittent renewables and natural gas generation, it is evolving in ways that often decrease grid resilience. As the climate changes and the potential for physical or cyber terrorist attacks grows, the likelihood of catastrophic outages due to extreme weather events and cyber incidents increases. MIT Lincoln Laboratory has been working with federal, state, and city governments as well as utilities and grid operators to improve the resilience of critical facilities and to accelerate the adoption of distributed energy resources. This seminar will describe our recent progress both on new resilience modeling tools and analysis methods and on an enabling hardware-in-the-loop test bed prototype.

1 BS, Electrical Engineering, Princeton University
2 PhD, Microbiology and Molecular Genetics, Harvard University


Optomechanical Systems Engineering for Space Payloads

Dr. Keith B. Doyle1
MIT Lincoln Laboratory

MIT Lincoln Laboratory has developed a broad array of optical payloads for imaging and communication systems in support of both science and national defense applications. Space provides a particularly harsh environment for high-performing optical instruments that must survive launch and then meet demanding imaging and pointing requirements in dynamic thermal and vibration operational settings. This seminar presents some of the mechanical challenges in building optical payloads for space and draws on a history of successful optical sensors built at MIT Lincoln Laboratory along with recent systems in the news for both imaging and communication applications. Details include an overview of recent optical payloads and their applications, optomechanical design considerations, the impact and mitigation of environmental effects, and environmental testing.

1 PhD, Engineering Mechanics, University of Arizona


Self-Driving Vehicles Using Localizing Ground-Penetrating Radar

Michael Boulet1
MIT Lincoln Laboratory

Self-driving vehicles have the potential to prevent most of the more than 35,000 vehicle fatalities in the U.S. each year. However, current implementations are not mature enough for widespread adoption. Optical sensors provide remarkably successful lane-keeping capabilities, yet they struggle in a range of conditions and scenarios. This seminar will describe in detail MIT Lincoln Laboratory's Localizing Ground-Penetrating Radar (LGPR) system, a sensor that provides real-time estimates of a vehicle's position even in challenging weather and road conditions. Because LGPR offers an independent method of localization, fusing it with traditional approaches promises significant gains in safety and robustness. Video demonstrations include closed-loop autonomous steering systems deployed in Afghanistan and Boston, as well as real-time localization at highway speeds at night in a snowstorm.

1 MS, Mechanical Engineering, Massachusetts Institute of Technology

Homeland Protection

Technical Seminars offered in homeland protection.

3D Plume Reconstruction

William Lawrence1, Clara McCreery2, Janice Crager3, & Benjamin L. Ervin4
MIT Lincoln Laboratory

MIT Lincoln Laboratory has developed a capability to reconstruct a smoke plume as a point cloud in 3D space over time. We use a ring of video cameras to record a smoke release from many different angles and then apply a segmentation algorithm to separate the smoke plume from the rest of the scene in each recorded video. The segmented plumes are projected into 3D space at each time step using a modified structure from motion algorithm. This capability is used to referee other sensors, verify theoretical models of plume behavior, and measure plume characteristics such as volume and bulk velocity over time.
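The segmentation algorithm itself is not specified in this abstract; as a minimal stand-in for the per-camera segmentation step, simple background subtraction against a smoke-free reference frame conveys the idea. The pixel values and threshold below are arbitrary illustration values.

```python
def segment_plume(frame, background, threshold=30):
    """Label pixels whose intensity differs from a smoke-free
    background frame by more than the threshold (a simple
    background-subtraction stand-in for plume segmentation)."""
    return [
        [abs(p - b) > threshold for p, b in zip(frow, brow)]
        for frow, brow in zip(frame, background)
    ]

# Tiny 2x3 grayscale example: bright smoke pixels against a dark scene
background = [[10, 10, 10],
              [10, 10, 10]]
frame      = [[10, 90, 85],
              [10, 10, 80]]
mask = segment_plume(frame, background)
```

In the actual pipeline, masks like this from a ring of cameras would then be back-projected and intersected in 3D at each time step to form the plume point cloud.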

1 PhD, Chemistry, University of California, Irvine
2 BS, Chemical Engineering, Stanford University
3 MS, Electrical Engineering, Boston University
4 PhD, Industrial Engineering, University of Illinois


Behavioral Biomarkers to Quantify Neurological and Cognitive Status

Gregory A. Ciccarelli1
MIT Lincoln Laboratory

MIT Lincoln Laboratory has been developing noninvasive biomarkers of neurological conditions based upon observable human behaviors, which hold promise for objective monitoring of psychological health, neurological function, and cognitive status. Biomarkers based on speaking behavior have been successfully applied to the assessment of a variety of conditions, such as depression, Parkinson’s disease, and cognitive fatigue. This seminar will describe prior and ongoing work on speech-based assessment of these various conditions before focusing on recent advances in the area of traumatic brain injury.

Traumatic brain injury (TBI) is prevalent in civilians and service members and is associated with a diverse set of cognitive and sensorimotor symptoms. There is a pressing need to go beyond the typical coarse severity classifications (i.e., mild, moderate, severe) when diagnosing patients with TBI. It is essential to develop technologies with the ability to disambiguate mild TBI (mTBI) phenotypes, which in turn should facilitate the design of enhanced patient-care systems for monitoring and targeted intervention. Recent advances in virtual reality—embodied in the virtual environment at the MIT Lincoln Laboratory STRIVE Center—have enabled increasingly immersive sensorimotor provocative tests that can challenge subjects with situations that elicit underlying deficits. Pairing provocative tests with neurologically relevant, mechanical models of sensorimotor control (developed at Lincoln Laboratory) provides a neurocomputational substrate that can aid in establishing causal relationships between impaired pathways and observed behaviors. Ultimately, individualized, mechanical models of sensorimotor control may aid in designing novel physiological assessments and predicting outcomes associated with rehabilitation interventions for people who have suffered a TBI.

1 PhD, Electrical Engineering and Computer Science, Massachusetts Institute of Technology


Disease Modeling to Assess Outbreak Detection and Response

Dr. Diane C. Jamrog1
MIT Lincoln Laboratory

Bioterrorism is a serious threat that has become widely recognized since the anthrax mailings of 2001. In response, one national research activity has been the development of biosensors and networks thereof. A driving factor behind biosensor development is the potential to provide early detection of a biological attack, thereby enabling timely treatment. This presentation introduces a disease progression and treatment model to quantify the potential benefit of early detection. To date, the model has been used to assess responses to inhalation anthrax and smallpox outbreaks.
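The Laboratory's disease progression and treatment model is not reproduced here, but the benefit of early detection can be illustrated with a toy SIR (susceptible-infected-recovered) sketch in which transmission drops once the outbreak is detected and a response begins. All parameters below are hypothetical.

```python
def sir_outbreak(beta, gamma, detection_day, response_factor, days=160):
    """Discrete-time SIR model of population fractions; transmission
    is cut by response_factor once the outbreak is detected."""
    s, i, r = 0.999, 0.001, 0.0
    for day in range(days):
        b = beta * response_factor if day >= detection_day else beta
        new_inf = b * s * i      # new infections this day
        new_rec = gamma * i      # new recoveries this day
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
    return r  # final attack fraction (total ever infected)

# Same hypothetical pathogen; only the detection day differs.
early = sir_outbreak(beta=0.4, gamma=0.1, detection_day=20,
                     response_factor=0.25)
late = sir_outbreak(beta=0.4, gamma=0.1, detection_day=60,
                    response_factor=0.25)
```

The earlier response intervenes while the infected fraction is still small, so the final attack fraction is substantially lower, which is the kind of benefit the seminar's model quantifies with far more clinical detail.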

1 PhD, Computation and Applied Mathematics, Rice University


Engineering Cells to Solve Sensing Challenges

Dr. Fran Nargi1
MIT Lincoln Laboratory

MIT Lincoln Laboratory has had a successful history of developing novel cell-based biosensors. These sensors combine multiple distinct genetically engineered functions to produce rapid, reliable readouts for the presence of key biothreats much more rapidly than other detection technologies. Research is ongoing to expand our repertoire of biological sensors for recognition of biological signaling molecules such as proteins, hormones, and neurotransmitters, which can serve as important markers for monitoring health, assessing the effects of therapies or environmental stressors, diagnosing disease, and guiding medical treatment. The capability to monitor biological molecules in real time on a continuous or high-frequency basis would enable adverse trends to be identified quickly, drug or stressor responses to be tracked dynamically, and molecular cycles to be characterized. Our goal is to reduce the complexity and resource requirements of current real-time sensors to enable specific recognition and measurement of biological molecules of interest.

This seminar will present a historical perspective of Lincoln Laboratory sensor development from design to transition to industry, including successes and pitfalls, preliminary results of new biosensor research, and future trends and needs.

1 PhD, Pathobiology, University of Connecticut

Machine Learning and Artificial Intelligence

Technical Seminars offered in machine learning and artificial intelligence.

Automated Machine Learning and Enabling Data Infrastructure

Dr. Swaroop Vattam1 & Dr. Pooya Khorrami2
MIT Lincoln Laboratory

Machine learning is becoming increasingly important with the abundance of data, while the supply of skilled machine-learning experts lags behind. Automated machine learning (AutoML) systems hold the promise of allowing subject-matter experts to derive insights and value from their data without necessarily having to rely on machine-learning experts or develop deep expertise in machine learning themselves. AutoML research investigates whether it is possible to automate the process of developing end-to-end machine-learning pipelines, beginning with raw data and ending with well-calibrated predictive models. This seminar will present an overview of the challenges of AutoML research and the data infrastructure required to support the development of robust AutoML methods. We will also discuss state-of-the-art AutoML systems, their limitations, and our research directions intended to push the boundaries of AutoML research.

1 PhD, Computer Science, Georgia Institute of Technology
2 PhD, Electrical and Computer Engineering, University of Illinois, Urbana-Champaign


Combating Illicit Marketplaces on the Deep and Dark Web Using Machine Learning

Dr. Charlie Dagli1, Dr. Lin Li2, & Dr. Joseph Campbell3
MIT Lincoln Laboratory

More and more illicit economic activity is being mediated online. Illicit marketplaces on the Deep Web and Dark Web provide anonymous platforms for the sale and purchase of illegal goods and services. Unfortunately, traditional technologies for indexing and search are not sufficient for combating illicit activities observed in these marketplaces. In this talk, we will describe machine learning-based approaches for enabling law enforcement to counter activities observed in these marketplaces. In particular, we will discuss technologies for multiplatform persona linking (e.g., determining whether a user in one illicit marketplace is likely to be the same person as a different user in another illicit marketplace). We will also discuss uncued discovery of organizations operating in illicit marketplaces. Additionally, we will discuss how these technologies were used in the Defense Advanced Research Projects Agency’s Memex program and how Memex is providing significant impact to law enforcement agencies combating illicit activity online.

1 PhD, Electrical and Computer Engineering, University of Illinois, Urbana-Champaign
2 PhD, Electrical and Computer Engineering, University of California, Davis
3 PhD, Electrical Engineering, Oklahoma State University


Fast AI: Enabling AI with High-Performance Computing

Dr. Vijay Gadepally1, Dr. Jeremy Kepner2, Dr. Albert Reuther3, & Dr. Siddharth Samsi4
MIT Lincoln Laboratory

Artificial intelligence (AI) has the potential to revolutionize the way we address challenges of evolving threats, data deluge, and rapid courses of action. AI solutions involve a number of different pieces that must work together to provide capabilities that can be used by decision makers, warfighters, and analysts. Over the past few years, these AI techniques have been successfully applied in numerous domains such as cyber security, health care, and humanitarian assistance and disaster relief. Many of these recent advances can be largely attributed to the availability of three critical elements: data, computing, and algorithms. Developing an AI solution requires access to massive quantities of data and the computational power to train complex models such as deep neural networks. High-performance computing (HPC) systems provide such a platform and are the bedrock of modern AI development.

The MIT Lincoln Laboratory Supercomputing Center (LLSC) is a world leader in making HPC easy to use without compromising performance. Our award-winning research in big data technologies, parallel computing, and novel algorithms is currently utilized by thousands of researchers and has been featured in the media and popular press.

In this seminar, we describe the role of HPC in enabling AI solutions. Specifically, we focus on three research thrusts within the LLSC: extreme computing, data management, and interfaces and algorithms. Our research in extreme computing is designed to address the challenges posed by deploying highly customized hardware into existing development workflows at scale. Tools developed by LLSC researchers can be used to simplify such deployments and have been tested at tens of thousands of computing cores. Along the data management dimension, our research aims to greatly reduce the nearly 80% of time that is often spent on “data munging.” Our team has developed novel techniques for heterogeneous data management, data cleaning, and data sharing that can be used by organizations. For interfaces and algorithms, we discuss our research in developing novel techniques that exploit the sparsity inherent in many neural network architectures, an approach necessary to process and store increasingly complex AI models. Our research has led to new standards currently being implemented by hardware manufacturers.

1 PhD, Electrical and Computer Engineering, The Ohio State University
2 PhD, Physics, Princeton University
3 PhD, Electrical and Computer Engineering, Purdue University
4 PhD, Electrical and Computer Engineering, The Ohio State University


Machine Learning Applications in Aviation Weather and Traffic Management

Dr. Mark S. Veillette1
MIT Lincoln Laboratory

Adverse weather accounts for the majority of air traffic delays in the United States. When weather is expected to impact operations, air traffic managers (ATMs) are often faced with an overload of weather information required to make decisions. These data include multiple numerical weather model forecasts, satellite observations, Doppler radar, lightning detections, wind information, and other forms of meteorological data. Many of these data contain a great deal of uncertainty. Absorbing and utilizing these data in an optimal way is challenging even for experienced ATMs.

This talk will provide two examples of using machine learning to assist ATMs. In our first example, data from weather satellites are combined with global lightning detections and numerical weather models to create Doppler radar-like displays of precipitation in regions outside the range of weather radar. In the second example, multiple weather forecasts are used to refine the prediction of airspace capacity within selected regions of airspace. Examples and challenges of data collection, translation, modeling, and operational prototype evaluations will be discussed.

1 PhD, Mathematics, Boston University


Machine Learning for Radio-Frequency Signal Classification and Blind Demodulation

Dr. Ameya Agaskar1
MIT Lincoln Laboratory

Platforms such as unmanned air systems and unattended ground sensors are limited by their size, weight, and power consumption requirements, and as a result, embedded signal processing is limited. Exfiltrating raw sampled data for offline processing is often not an alternative because transmitting the raw data demands significant bandwidth.

To accommodate at-sensor processing—including detection, demodulation, and decoding—of wideband data, MIT Lincoln Laboratory is proposing an intermediate level of RF data representation to significantly reduce the data rates to levels largely equivalent to the information-handling rate in the processing environment. This approach leverages sparsity in the RF environment and machine learning algorithms to discover signal structure without prior knowledge and leads to blind demodulation techniques.

This seminar presents a novel highly scalable approach, based on nonparametric Bayesian models, for learning unknown signal structure and a new blind synchronization technique. The Laboratory has characterized performance on the basis of signal-to-noise ratio (SNR) and the number of observations. The research team has observed that above a critical SNR learning threshold, the algorithms offer near-optimal levels of performance in terms of raw bit error rates.
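The learned blind demodulation techniques themselves are not reproduced here; as a point of reference for how error rates depend on SNR, the textbook bit error rate of coherent BPSK in additive white Gaussian noise shows the same qualitative behavior, with performance improving sharply once SNR crosses a usable threshold.

```python
import math

def q_function(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def bpsk_ber(snr_db):
    """Theoretical bit error rate of coherent BPSK in AWGN,
    with SNR interpreted as Eb/N0 in decibels."""
    ebn0 = 10.0 ** (snr_db / 10.0)
    return q_function(math.sqrt(2.0 * ebn0))

# Error rate falls by orders of magnitude across a modest SNR range
bers = {snr: bpsk_ber(snr) for snr in (0, 5, 10)}
```

For the blind algorithms described above, the analogous threshold is in the learning stage: below a critical SNR the signal structure cannot be recovered, and above it performance approaches that of informed receivers.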

1 PhD, Engineering Sciences, Harvard University


Persona Linking Tools for Large-Scale Cyber Entity Attribution

Dr. Lin Li1 & Dr. Charlie Dagli2
MIT Lincoln Laboratory

The challenge of associating digital personas across multiple information domains is a key problem in social media understanding and cyber entity attribution. Successful persona linking provides integration of information from multiple information sources to create a complete picture of user and community activities, characteristics, and trends. MIT Lincoln Laboratory has developed a suite of high-performance digital persona linking tools using advanced natural language processing and graph analytics. These tools aim to uncover hidden identities and covert network activities across multiple information domains, e.g., social media, online forums, deep and dark web marketplaces, and Department of Defense and law enforcement–specific data. The tools are designed to operate on large-scale data collections and exploit multiple aspects of persona signatures. Using multiple dimensions of persona matching, the suite of tools has demonstrated state-of-the-art performance on both social media and illicit marketplace persona-linking tasks. In this seminar, we will highlight our recent advancements in persona linking, including methodology, applications, performance, and future trends and needs.

1 PhD, Electrical and Computer Engineering, University of California, Davis
2 PhD, Electrical and Computer Engineering, University of Illinois, Urbana-Champaign


Synthetic Data Augmentation for AI

Dr. Pooya Khorrami1 & Dr. Michael Brandstein2
MIT Lincoln Laboratory

One common challenge when training machine-learning systems, particularly ones that use deep neural networks, is acquiring a large amount of curated labeled data. In many cases, obtaining large-scale datasets with high fidelity labels can be a time-consuming or even impossible task. As a result, many researchers have tried to address this issue by proposing techniques to efficiently generate large amounts of synthetic training data. In this talk, we examine two different paradigms for generating synthetic data (simple image transformations and generative adversarial networks) and assess the benefits they provide in two application domains: American Sign Language recognition and face recognition. Our findings show that even simple augmentation techniques can improve recognition accuracy when the amount of available data is low, but having more data allows for more complicated approaches to be successful.
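As a concrete instance of the "simple image transformations" paradigm, a horizontal flip doubles a dataset at no labeling cost. This is a generic sketch; whether flipping preserves labels depends on the task (a mirrored face is still the same face, but some gestures are handed).

```python
def horizontal_flip(image):
    """Mirror an image (a row-major list of pixel rows) left to right,
    one of the simple transformations discussed above."""
    return [list(reversed(row)) for row in image]

def augment(images):
    """Double a dataset by adding a flipped copy of every image."""
    return images + [horizontal_flip(img) for img in images]

# Tiny 2x3 example image
img = [[1, 2, 3],
       [4, 5, 6]]
```

Flipping is an involution (applying it twice restores the original), so augmented copies remain valid samples of the same underlying distribution.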

1 PhD, Electrical and Computer Engineering, University of Illinois, Urbana-Champaign
2 PhD, Electrical Engineering, Brown University

Quantum Information Sciences

Technical Seminars offered in quantum information sciences.

Quantum Communication Networks for Practical Applications

Dr. P. Benjamin Dixon1
MIT Lincoln Laboratory

Quantum communication networks enable promising advances in precision sensing and timing, powerful computation, and secure communication. These applications make use of quantum mechanical phenomena, such as superposition and entanglement, as part of well-engineered systems to achieve results that are unattainable with classical systems.

MIT Lincoln Laboratory is working to characterize and explore the full utility of these applications by developing component quantum technologies and incorporating them into a Boston-area quantum network test bed. This effort draws on both the Laboratory's significant history of successfully developing practical communication system demonstrations and strong collaborations with several Boston-area university research groups.

This seminar will explore these ongoing efforts at the Laboratory to develop a quantum communication networks test bed and to make use of it for practical applications.

1 PhD, Physics, University of Rochester

Radar and Signal Processing

Technical Seminars offered in radar and signal processing.

Adaptive Array Estimation

Keith Forsythe1
MIT Lincoln Laboratory

Parameter estimation is a necessary step in most surveillance systems and typically follows detection processing. Estimation theory provides parameter bounds specifying the best achievable performance and suggests maximum-likelihood (ML) estimation as a viable strategy for algorithm development. Adaptive sensor arrays introduce the added complexity of bounding and assessing parameter estimation performance (i) in the presence of limiting interference whose statistics must be inferred from measured data and (ii) under uncertainty in the array manifold for the signal search space. This talk focuses on assessing the mean-squared-error (MSE) performance at low and high signal-to-noise ratio (SNR) of nonlinear ML estimation that (i) uses the sample covariance matrix as an estimate of the true noise covariance and (ii) has imperfect knowledge of the array manifold for the signal search space. The method of interval errors (MIE) is used to predict MSE performance and is shown to be remarkably accurate well below estimation threshold. SNR loss in estimation performance due to noise covariance estimation is quantified and is shown to be quite different from analogous losses obtained for detection. Lastly, a discussion of the asymptotic efficiency of ML estimation is also provided in the general context of misspecified models, the most general form of model mismatch.

1 SM, Mathematics, Massachusetts Institute of Technology


Advanced Embedded Computing

Dr. Paul Monticciolo1, Dr. William S. Song2, & Matthew A. Alexander3
MIT Lincoln Laboratory

Embedded computing continues to influence and change the way we live, as evidenced by its ubiquitous use in consumer and industrial electronics and its ability to enable new growth areas, such as autonomous vehicles and virtual reality systems. Over the past several decades, the Department of Defense (DoD) has been leveraging advances in embedded computing to meet national needs in intelligence, surveillance, and reconnaissance; electronic warfare; and command and control applications that require tremendous compute capabilities in highly constrained size, weight, and power environments. Furthermore, defense platforms such as aircraft and ships can have lifetimes upwards of 50 years; hence, hardware and software architectures must be designed to provide both the performance and the flexibility to support new applications and regular technology upgrades over such a long time frame.

MIT Lincoln Laboratory’s Embedded and Open Systems Group has been a long-time leader in the design, development, and implementation of defense electronic components and systems. In this seminar, we will discuss advanced real-time hardware and software technologies along with codesigned hardware and software system solutions. First, we will provide an overview of DoD-oriented embedded computing challenges, technologies, and trends. We will then explore extremely power efficient custom application-specific integrated circuit and emerging system-on-chip solutions that parallel approaches used in smart phones. Heterogeneous system solutions that leverage field-programmable gate arrays, graphics processing units, and multi-core microprocessor components will then be addressed. Open standards-based software architectures that can meet real-time performance requirements, enable code portability, and enhance programmer productivity will be discussed. Representative implementation examples will be provided in each section. Our presentation will conclude with some prognostication on the future of embedded computing.

1 PhD, Electrical Engineering, Northeastern University
2 DSc, Electrical Engineering, Massachusetts Institute of Technology
3 BS, Computer Engineering, Northeastern University


Bioinspired Resource Management for Multiple-Sensor Target Tracking Systems

Dr. Dana Sinno1 & Dr. Hendrick C. Lambert2
MIT Lincoln Laboratory

We present an algorithm, inspired by self-organization and stigmergy observed in biological swarms, for managing multiple sensors tracking large numbers of targets. We have devised a decentralized architecture wherein autonomous sensors manage their own data collection resources and task themselves. Sensors cannot communicate with each other directly; however, a global track file, which is continuously broadcast, allows the sensors to infer their contributions to the global estimation of target states. Sensors can transmit their data (either as raw measurements or some compressed format) only to a central processor where their data are combined to update the global track file. We outline information-theoretic rules for the general multiple-sensor Bayesian target tracking problem and provide specific formulas for problems dominated by additive white Gaussian noise. Using Cramér-Rao lower bounds as surrogates for error covariances and numerical scenarios involving ballistic targets, we illustrate that the bioinspired algorithm is highly scalable and performs very well for large numbers of targets.
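As a rough illustration of the decentralized tasking idea (not the Laboratory's algorithm), the sketch below has each sensor greedily task itself on the target offering the largest predicted variance reduction, computed from a continuously rebroadcast global track file. All scenario sizes and noise values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n_targets, n_sensors = 12, 4
P = np.full(n_targets, 100.0)                             # track variances in the global track file
R = rng.uniform(1.0, 5.0, size=(n_sensors, n_targets))    # per-sensor measurement variances

def info_gain(p, r):
    # predicted variance reduction from fusing one measurement: p -> p*r/(p+r)
    return p - p*r/(p + r)

for _ in range(5):                             # several broadcast/update cycles
    for s in range(n_sensors):
        gains = info_gain(P, R[s])             # each sensor reads the shared track file...
        j = int(np.argmax(gains))              # ...and tasks itself on the neediest target
        P[j] = P[j]*R[s, j]/(P[j] + R[s, j])   # central node fuses and rebroadcasts
print(P)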
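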

1 PhD, Electrical Engineering, Arizona State University
2 PhD, Applied Physics, University of California, San Diego


Multilithic Phased Array Architectures for Next-Generation Radar

Dr. Sean M. Duffy1
MIT Lincoln Laboratory

Phased array antennas provide significant operational capabilities beyond those achievable with dish antennas. Civilian agencies, such as the Federal Aviation Administration and Department of Homeland Security, are investigating the feasibility of using phased array radars to satisfy their next-generation needs. In particular, the Multifunction Phased Array Radar (MPAR) effort aims to eliminate nine different dish-based radars and replace them with radars employing a single, low-cost MPAR architecture. Also, unmanned air system operation within the National Airspace System requires an airborne sense-and-avoid (ABSAA) capability ideally satisfied by a small low-cost phased array.

Two example phased array panels are discussed in this talk. The first is the MPAR panel—a scalable, low-cost, highly capable S-band panel. This panel provides the functionality to perform the missions of air surveillance and weather surveillance for the National Airspace System. The second example is the ABSAA phased array, a low-cost Ku-band panel for unmanned air system collision avoidance radar.

The approach used in these phased arrays eliminates the drivers that lead to expensive systems. For example, the high-power amplifier is fabricated in a high-volume foundry and mounted in a surface mount package, thereby allowing industry-standard low-cost assembly processes. Also, all our integrated circuits contain multiple functions to save on semiconductor space and board-level complexity. Finally, the systems' multilayered printed circuit board assemblies combine the antenna, microwave circuitry, and integrated circuits, eliminating the need for hundreds of connectors between the subsystems; this streamlined design enhances overall reliability and lowers manufacturing costs.

1 PhD, Electrical Engineering, University of Massachusetts–Amherst


Parameter Bounds Under Misspecified Models

Keith Forsythe1
MIT Lincoln Laboratory

Parameter bounds are traditionally derived assuming perfect knowledge of data distributions. When the assumed probability distribution for the measured data differs from the true distribution, the model is said to be misspecified; mismatch at some level is inevitable in practice. Thus, several authors have studied the impact of model misspecification on parameter estimation. Most notably, Peter Huber explored in detail the performance of maximum-likelihood (ML) estimation under a very general form of misspecification; he established consistency and asymptotic normality and derived the ML estimator's asymptotic covariance, often referred to as the celebrated "sandwich covariance."

The goal of this talk is to consider the class of non-Bayesian parameter bounds emerging from the covariance inequality under the assumption of model misspecification. Casting the bound problem as one of constrained minimization is likewise considered. Primary attention is given to the Cramér-Rao bound (CRB). It is shown that Huber's sandwich covariance is the misspecified CRB and provides the greatest (tightest) lower bound under ML constraints. Consideration of the standard circular complex Gaussian ubiquitous in signal processing yields a generalization of the Slepian-Bangs formula under misspecification. This formula, of course, reduces to the usual one when the assumed distribution is in fact the correct one. The framework is outlined for consideration of the Barankin/Hammersley-Chapman-Robbins, Bhattacharyya, and Bobrovsky-Mayer-Wolf-Zakai bounds under misspecification.
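For reference, Huber's classical result can be summarized as follows (standard textbook form; the notation here is ours, not necessarily the talk's). With true density p and assumed model f_θ, the pseudo-true parameter minimizes the Kullback-Leibler divergence, and the ML estimate from K independent samples is asymptotically normal with the sandwich covariance:

```latex
\theta_0 = \arg\min_{\theta}\; \mathrm{E}_p\!\left[\ln p(x) - \ln f_\theta(x)\right],
\qquad
\sqrt{K}\,\bigl(\hat{\theta}_{\mathrm{ML}} - \theta_0\bigr) \xrightarrow{\;d\;} \mathcal{N}\!\left(0,\; A^{-1} B A^{-1}\right),
```
```latex
A = \mathrm{E}_p\!\left[\nabla_\theta^2 \ln f_\theta(x)\right]\Big|_{\theta_0},
\qquad
B = \mathrm{E}_p\!\left[\nabla_\theta \ln f_\theta(x)\, \nabla_\theta \ln f_\theta(x)^{\mathsf{T}}\right]\Big|_{\theta_0}.
```

When the model is correctly specified, A = -B equals the negative Fisher information matrix, and the sandwich A⁻¹BA⁻¹ collapses to the familiar inverse-Fisher-information CRB.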

1 SM, Mathematics, Massachusetts Institute of Technology


Polarimetric Co-location Layering: A Practical Algorithm for Mitigation of Low Grazing Angle Sea Clutter

Molly Crane1
MIT Lincoln Laboratory

Traditional detection schemes in conventional maritime surveillance radars suffer serious performance degradation due to sea clutter, especially in low grazing angle geometries. In such geometries, typical statistical assumptions regarding sea clutter backscatter do not hold. Trackers can be overwhelmed by false alarms, while objects of interest may be challenging to detect amongst sea clutter. Despite numerous attempts over several decades to devise a means of mitigating the effects of low grazing angle sea clutter on traditional detection schemes, minimal progress has been made in developing an approach that is robust and practical.

To explore whether polarization information might offer an effective means of enhancing target detection in sea clutter, MIT Lincoln Laboratory collected a fully polarimetric X-band radar dataset on the Atlantic coast of Massachusetts' Cape Ann in October 2015. The dataset spans multiple bandwidths, multiple sea states, and various targets of opportunity. Leveraging this dataset, Lincoln Laboratory developed an algorithm dubbed polarimetric co-location layering that retains detections on objects of interest while reducing the number of false alarms in a conventional single-polarization radar by as many as two orders of magnitude. Polarimetric co-location layering is robust across waveform bandwidths and sea states. Moreover, this algorithm is practical: it can plug directly into the standard radar signal processing chain.

1 PhD, Computer Engineering, Boston University


Polynomial Rooting Techniques for Adaptive Array Direction Finding

Dr. Gary F. Hatke1
MIT Lincoln Laboratory

Array processing has many applications in modern communications, radar, and sonar systems. Array processing is used when a signal in space, be it electromagnetic or acoustic, has some spatial coherence properties that can be exploited (such as far-field plane wave properties). The array can be used to sense the orientation of the plane wave and thus deduce the angular direction to the source. Adaptive array processing is used when there exists an environment of many signals from unknown directions as well as noise with unknown spatial distribution. Under these circumstances, classical Fourier analysis of the spatial correlations from an array data snapshot (the data seen at one instance in time) is insufficient to localize the signal sources.

In estimating the signal directions, most adaptive algorithms require computing an optimization metric over all possible source directions and searching for a maximum. When the array is multidimensional (e.g., planar), this search can become computationally expensive, as the source direction parameters are now also multidimensional. In the special case of one-dimensional (line) arrays, this search procedure can be replaced by solving a polynomial equation, where the roots of the polynomial correspond to estimates of the signal directions. This technique had not been extended to multidimensional arrays because such arrays naturally generate a polynomial in multiple variables, which does not have discrete roots.

This seminar introduces a method for generalizing the rooting technique to multidimensional arrays by generating multiple optimization polynomials corresponding to the source estimation problem and finding a set of simultaneous solutions to these equations, which contain source location information. It is shown that the variance of this new class of estimators is equal to that of the search techniques they supplant. In addition, for sources spaced more closely than a Rayleigh beamwidth, the resolution properties of the new polynomial algorithms are shown to be better than those of the search technique algorithms.
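For the one-dimensional case the abstract builds on, the rooting idea is well illustrated by root-MUSIC, a standard published algorithm (not necessarily the multidimensional method of this talk): the MUSIC null spectrum of a uniform line array is a polynomial in z = e^{jω}, and its roots nearest the unit circle give the direction estimates. The scenario below (10 elements, two sources, 20 dB SNR) is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
N, K, d = 10, 500, 0.5                      # elements, snapshots, spacing (wavelengths)
thetas = np.array([-20.0, 25.0])            # true source directions (degrees)

A = np.exp(2j*np.pi*d*np.outer(np.arange(N), np.sin(np.radians(thetas))))
S = 10*(rng.standard_normal((2, K)) + 1j*rng.standard_normal((2, K)))/np.sqrt(2)
noise = (rng.standard_normal((N, K)) + 1j*rng.standard_normal((N, K)))/np.sqrt(2)
X = A @ S + noise

R = X @ X.conj().T / K
_, V = np.linalg.eigh(R)                    # eigenvalues in ascending order
En = V[:, :N-2]                             # noise subspace (two sources assumed)
C = En @ En.conj().T

# Null-spectrum polynomial: coefficient of z^k is trace(C, offset=k); root it
# instead of searching a grid of candidate directions.
coeffs = np.array([np.trace(C, offset=k) for k in range(N-1, -N, -1)])
roots = np.roots(coeffs)
inside = roots[np.abs(roots) < 1]           # roots pair up across the unit circle
signal = inside[np.argsort(1 - np.abs(inside))[:2]]  # two closest to the circle
est = np.degrees(np.arcsin(np.angle(signal)/(2*np.pi*d)))
print(np.sort(est))                         # should be close to [-20, 25]
```

Replacing the grid search with a single polynomial root-finding step is exactly the computational saving the multidimensional generalization seeks to preserve.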

1 PhD, Electrical Engineering, Princeton University


Radar Signal Distortion and Compensation with Transionospheric Propagation Paths

Dr. Scott D. Coutts1
MIT Lincoln Laboratory

Electromagnetic signals propagating through the atmosphere and ionosphere are distorted and refracted by these propagation media. The effects are particularly pronounced for lower-frequency signals propagating through the ionosphere. In the extreme case, frequencies in the high-frequency band can be so severely refracted by the ionosphere that they are reflected back toward the ground. This effect is exploited by over-the-horizon radars and high-frequency communication systems to achieve very-long-range, over-the-horizon performance. For the general radar case, the measurements of range, Doppler shift, and elevation and azimuth angles are all corrupted from their free-space values with time-varying biases.

To provide accurate radar parameter estimates in the presence of these errors, a three-dimensional method to compensate for the ionosphere refraction has been developed at Lincoln Laboratory. In this seminar, two examples of the method’s use are provided: (1) the radar returns from a known object are used to specify an unknown ionosphere electron density, and (2) an unknown satellite state vector is estimated with and without the ionosphere compensation so that the accuracy improvement can be quantified. Magneto-ionic ray tracing is used to generate the three-dimensional propagation model and propagation correction tables. A maximum-likelihood satellite-ephemeris estimator is designed and demonstrated using corrupted radar data. The technique is demonstrated using “real data” examples with very encouraging results and is applicable at radar frequencies ranging from high to ultrahigh.
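A sense of the scale of these range biases comes from the standard first-order ionospheric group-delay formula, Δr ≈ 40.3·TEC/f² (in meters, with total electron content TEC in electrons/m² and frequency f in Hz). The sketch below is illustrative only; the TEC value is an assumption, and the talk's three-dimensional magneto-ionic ray-tracing compensation goes well beyond this first-order correction.

```python
def range_error_m(tec, f_hz):
    """First-order ionospheric group-delay range error in meters.
    tec: total electron content along the path, electrons/m^2."""
    return 40.3 * tec / f_hz**2

tec = 3e17                        # ~30 TECU, an assumed moderately disturbed ionosphere
print(range_error_m(tec, 435e6))  # tens of meters at UHF
print(range_error_m(tec, 1.3e9))  # only meters at L-band
```

The 1/f² dependence is why the compensation matters most at the high-frequency through ultrahigh-frequency bands the abstract cites.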

1 PhD, Electrical Engineering, Northeastern University


Synthetic Aperture Radar

Dr. Gerald R. Benitz1
MIT Lincoln Laboratory

MIT Lincoln Laboratory is investigating the application of phased-array technology to improve the state of the art in radar surveillance. Synthetic aperture radar (SAR) imaging is one mode that can benefit from a multiple-phase-center antenna. The potential benefits are protection against interference, improved area rate and resolution, and multiple simultaneous modes of operation.

This seminar begins with an overview of SAR, giving the basics of resolution, collection modes, and image formation. Several imaging examples are provided. Results from the Lincoln Multimission ISR Testbed (LiMIT) X-band airborne radar are presented. LiMIT employs an eight-channel phased-array antenna and records 180 MHz of bandwidth from each channel simultaneously. One result employs adaptive processing to reject wideband interference, demonstrating recovery of a corrupted SAR image. Another result employs multiple simultaneous beams to increase the image area beyond the conventional limitation imposed by the pulse repetition frequency. Areas that are Doppler ambiguous can be disambiguated by using the phased-array antenna.
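Two of the basic resolution relations such an overview typically covers can be checked numerically: slant-range resolution is c/2B, and cross-range resolution is approximately λ/(2Δθ) for an aperture angle change Δθ. The bandwidth below is the 180 MHz quoted for LiMIT; the ~10 GHz X-band wavelength is our assumption.

```python
c = 299792458.0                 # speed of light, m/s
B = 180e6                       # per-channel bandwidth recorded by LiMIT (from the abstract)
range_res = c / (2 * B)         # slant-range resolution
print(range_res)                # about 0.83 m

wavelength = 0.03               # assumed X-band wavelength (~10 GHz)
target_cross_range = 1.0        # desired cross-range resolution, m
dtheta = wavelength / (2 * target_cross_range)   # required aperture angle change, rad
print(dtheta)                   # about 15 milliradians
```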

1 PhD, Electrical Engineering, University of Wisconsin–Madison

Advanced Technology

Technical Seminars offered in solid state devices, materials, & processes.

Advanced Materials at MIT Lincoln Laboratory

Dr. Jeffrey B. Chou1, Dr. Christopher M. Roberts2, & Dr. Steven A. Vitale3
MIT Lincoln Laboratory

At MIT Lincoln Laboratory, we are engineering materials with new optical and electrical properties and capabilities beyond those found in nature. Our vision is to use these fundamentally new materials to enable faster and lower-power electronics, advanced imaging applications, and high-tech wearable fabrics. Our cross-disciplinary team of engineers, programmers, physicists, and materials scientists is working together to turn these new materials into usable systems that will impact everyday life. Our research spans first-principles calculations via density functional theory to fully fabricated and integrated systems. Many of our research programs also leverage new and exciting discoveries from MIT campus faculty and graduate students, who often collaborate with us. The extensive material deposition and characterization capabilities shared between Lincoln Laboratory and the MIT campus greatly enhance our ability to develop new materials. This seminar will provide a broad overview of the diverse advanced materials programs at Lincoln Laboratory and highlight recent work, including phase-change metamaterials, dielectric metasurfaces, and two-dimensional materials for valleytronics.

1 PhD, Electrical Engineering and Computer Sciences, University of California–Berkeley
2 PhD, Physics, University of Massachusetts
3 PhD, Chemical Engineering, Massachusetts Institute of Technology


Electromagnetic Vector Antenna and Constrained Maximum-Likelihood Imaging for Radio Astronomy

Dr. Frank C. Robey1, Dr. Alan J. Fenn2, & Dr. Mark J. Silver3
MIT Lincoln Laboratory
Mary E. Knapp4, Dr. Frank D. Lind5, & Dr. Ryan A. Volz6

Radio astronomy at frequencies below 50 MHz provides a window into nonthermal processes in objects ranging from planets to galaxies. These frequencies also provide insight into the formation of the universe. Ground-based arrays cannot adequately observe astronomical sources below about 20 MHz due to ionospheric perturbation and shielding; therefore, the sky has not been mapped with high angular resolution below that frequency. Radio astronomy at these frequencies must be accomplished in space. With space-based sensing, the cost of each satellite is high, and consequently we desire to maximize the information from each satellite. This presentation discusses designs for mapping the sky from space using electromagnetic vector sensors. These sensors measure the full electric- and magnetic-field vectors of incoming radiation and enable measurement with reasonable angular resolution from a compact sensor with a single phase center. A model for radio astronomy imaging is introduced, and the constraints imposed by Maxwell's equations and vector sensing of an electromagnetic field are explored. This presentation shows that the covariance matrix inherent in the stochastic process must lie in a highly constrained subset of allowable positive definite covariance matrices. Results are shown that use an expectation-maximization algorithm to form images consistent with a covariance matrix that satisfies the constraints. A conceptual design for a spacecraft to map the sky at frequencies below the ionospheric cutoff is discussed, along with concept development progress.

1 PhD, DSc, Electrical Engineering, Washington University
2 PhD, Electrical Engineering, The Ohio State University
3 PhD, Aerospace Engineering, University of Colorado, Boulder
4 Graduate Student, Earth, Atmosphere, and Planetary Sciences Department, Massachusetts Institute of Technology
5 PhD, Geophysics, University of Washington
6 PhD, Aeronautics/Astronautics, Stanford University


Functional Fibers and Fabrics at MIT Lincoln Laboratory

Dr. Alexander Stolyarov1, Dr. Lauren Cantley2, Dr. Jeffrey B. Chou3, Dr. Lalitha Parameswaran4, & Dr. Bradford Perkins, Jr.5
MIT Lincoln Laboratory

At MIT Lincoln Laboratory, we are pushing the frontiers of advanced functional fibers and fabrics, from age-old textiles to next-generation fabric systems with capabilities spanning sensing, communications, health monitoring, and more. To enable this development, our team focuses on multimaterial fiber device fabrication—fibers with internal domains containing semiconductors, metals, and insulators. By judicious selection of materials and processing conditions, individual fibers with nanometer-scale semiconductor features can be drawn into kilometer lengths, thus enabling electronic and optoelectronic fiber devices. Weaving these fibers into fabrics enables textiles with sophisticated properties tailored for national security mission areas. This seminar will provide an overview of this rapidly emerging field and will include several advanced fabric use cases, including fabric-based communications, chemical sensing, and fabrics connected to the cloud.

1 PhD, Physics, Harvard University
2 PhD, Mechanical Engineering, Boston University
3 PhD, Electrical Engineering and Computer Science, University of California–Berkeley
4 PhD, Electrical Engineering, Massachusetts Institute of Technology
5 PhD, Physics, University of Colorado


Geiger-Mode Avalanche Photodiode Arrays for Imaging and Sensing

Dr. Brian F. Aull1
MIT Lincoln Laboratory

This seminar discusses the development of arrays of silicon avalanche photodiodes integrated with digital complementary metal-oxide semiconductor (CMOS) circuits to make focal planes with single-photon sensitivity. The avalanche photodiodes are operated in Geiger mode: they are biased above the avalanche breakdown voltage so that the detection of a single photon leads to a discharge that can directly trigger a digital circuit. The CMOS circuits to which the photodiodes are connected can either time stamp or count the resulting detection events. Applications include three-dimensional imaging using laser radar, wavefront sensing for adaptive optics, and optical communications.
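A back-of-the-envelope simulation conveys the time-stamping idea for direct-detection laser radar. This sketch (all rates and ranges invented) mimics the Geiger-mode behavior in which each pixel records only the first detection event per pulse and is then dead; histogramming the time stamps over many pulses recovers the round-trip time despite background and dark counts.

```python
import numpy as np

rng = np.random.default_rng(3)
c = 299792458.0                            # speed of light, m/s
true_range = 1500.0                        # target range in meters (invented)
t_true = 2*true_range/c                    # round-trip time, ~10 microseconds

first_times = []
for _ in range(1000):                      # laser pulses
    ts = list(rng.uniform(0.0, 20e-6, rng.poisson(2)))   # background/dark events
    if rng.random() < 0.3:                 # signal photon detected on ~30% of pulses
        ts.append(t_true + rng.normal(0.0, 0.3e-9))      # 0.3 ns timing jitter
    if ts:
        first_times.append(min(ts))        # Geiger mode: first event triggers, then dead

counts, edges = np.histogram(first_times, bins=np.arange(0.0, 20e-6, 1e-9))
t_hat = edges[np.argmax(counts)] + 0.5e-9  # center of the most-populated 1 ns bin
print(abs(c*t_hat/2 - true_range))         # range error well under a meter
```

Because only the first event per pulse is recorded, strong background biases the histogram toward early times; real systems use range gating and higher signal probabilities to manage this.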

1 PhD, Electrical Engineering, Massachusetts Institute of Technology


High-Power Laser Technology at MIT Lincoln Laboratory

Dr. T. Y. Fan1 & Dr. Darren A. Rand2
MIT Lincoln Laboratory

This seminar includes an overview of high-power laser technology development at MIT Lincoln Laboratory. Two topics will be emphasized: laser beam combining and cryogenically cooled solid-state lasers. Beam combining, taking the outputs from arrays of lasers and combining them into a single beam with near-ideal propagation characteristics, has been pursued for many decades, but its promise has started to become a reality only within the last decade. Beam combining of both fiber and semiconductor arrays using both wavelength and coherent beam combining approaches will be discussed. Cryogenically cooled solid-state lasers enable high-efficiency and high-average-power lasers while maintaining excellent beam quality. This approach is particularly applicable for developing laser sources with both high peak and average power.

1 PhD, Electrical Engineering, Stanford University
2 PhD, Electrical Engineering, Princeton University


Integrated Photonics for Sensing, Communications, and Signal Processing

Dr. Suraj Bramhavar1, Dr. Paul Juodawlkis2, Dr. Boris (Dave) Kharas3, Dr. Cheryl Sorace-Agaska4, Dr. Reuel Swint5, & Dr. Siva Yegnanarayanan6
MIT Lincoln Laboratory

Integrated photonics involves the aggregation of multiple optical or photonic components onto a common substrate using either monolithic or hybrid integration techniques. Common components include lasers, optical modulators, detectors, filters, splitters and combiners, couplers, and optical isolators. These components are typically connected using optical waveguides built into the substrate platform to create a photonic integrated circuit (PIC). Relative to optical circuits constructed from discrete components connected using optical fibers, PICs have a number of advantages, including reduced size and weight, reduced fiber interfaces and associated fiber-pigtailing cost, and improved environmental stability for coherent optical signal processing. These advantages are strengthened by combining PICs with electronic integrated circuits via several electronic-photonic integration techniques (wire bonding, flip chip, wafer bonding, or monolithic). Depending on the desired functions, PICs can be fabricated from a variety of materials including silicon, silicon-nitride, and compound semiconductors (e.g., indium phosphide, gallium arsenide, gallium nitride, and their associated ternary and quaternary compounds).

In this seminar, we will provide an introduction to integrated photonics technology including design, fabrication, packaging, and characterization. We will describe resources and capabilities to develop silicon, silicon-nitride, and compound-semiconductor PICs at MIT Lincoln Laboratory using in-house fabrication resources. We will also describe several government applications of integrated photonics (e.g., remote sensing, free-space communications, and signal processing), and how these are similar and different from commercial applications.

1 PhD, Electrical Engineering, Boston University
2 PhD, Electrical Engineering, Georgia Institute of Technology
3 PhD, Materials Science, State University of New York at Stony Brook
4 PhD, Electrical Engineering, Massachusetts Institute of Technology
5 PhD, Electrical Engineering, University of Illinois
6 PhD, Electrical Engineering, University of California Los Angeles


Microfluidics at MIT Lincoln Laboratory

Prof. Todd A. Thorsen1, Dr. Shaun R. Berry2, & Dr. Jakub Kedzierski3
MIT Lincoln Laboratory

At MIT Lincoln Laboratory, we are engineering general-purpose tools for microfluidic platforms. Cross-disciplinary teams, consisting of engineers, programmers, and biologists, are designing and developing microfluidic tools for a broad range of applications. Our vision is to apply microfabrication techniques, traditionally used in microelectromechanical system (MEMS) and integrated circuit manufacturing, to develop microfluidic components and systems that are reconfigurable, scalable, and programmable through a software interface. End users of these microfluidic tools will be able to orchestrate complex and adaptive procedures that are beyond the capabilities of today’s hardware. This talk will provide a broad overview of the diverse microfluidic programs at Lincoln Laboratory, highlighting recent work, including microhydraulic actuators, integrated "lab-on-a-chip" systems for biological exploration, and actively configurable optofluidics such as liquid microlenses and displays.

1 PhD, Biochemistry and Molecular Biophysics, California Institute of Technology
2 PhD, Mechanical Engineering, Tufts University
3 PhD, Electrical Engineering, University of California–Berkeley


New Fabrication Platforms and Processes

Dr. Bradley Duncan1, Dr. Lalitha Parameswaran2, Dr. Livia Racz3, & Dr. Melissa Smith4
MIT Lincoln Laboratory

Additive manufacturing techniques, such as 3D printing and 3D assembly, have transformed production by breaking long-established rules of economies of scale and manufacturing complexity. MIT Lincoln Laboratory has a suite of ongoing efforts to develop novel materials and processes for the construction of nonplanar 3D devices, integrated assemblies, and systems, drawing on our expertise in materials science, semiconductor fabrication, chemistry, and mechanical and electrical engineering. Our goal is to develop processes that enable "one-stop" fabrication of complete systems with both mechanical and electrical functionality. Ongoing programs include: design of materials and processes for concurrent 3D printing of dissimilar materials; development of high-resolution, radio-frequency 3D-printed structures for low size, weight, and power applications extending to the THz range; design of novel microplasma-based sputtering of conductive materials for direct write of interconnects on nonplanar substrates; and development of reconstructed-wafer processes to overcome die-size limitations and enable scaling of digital integrated circuits to large formats.

1 PhD, Organic Chemistry, Brown University
2 PhD, Electrical Engineering, Massachusetts Institute of Technology
3 PhD, Materials Science and Engineering, Massachusetts Institute of Technology
4 PhD, Materials Science and Engineering, Massachusetts Institute of Technology


Slab-Coupled Optical Waveguide Devices and Their Applications

Dr. Paul W. Juodawlkis1, Dr. Gary M. Smith2, Dr. Reuel B. Swint3, & Dr. George W. Turner4
MIT Lincoln Laboratory

For two decades, MIT Lincoln Laboratory has been developing new classes of high-power semiconductor optoelectronic emitters and detectors based on the slab-coupled optical waveguide (SCOW) concept. The key characteristics of the SCOW design include (1) the use of a planar slab waveguide to filter the higher-order transverse modes from a large rib waveguide, (2) low overlap between the optical mode and the active layers, and (3) low excess optical loss. These characteristics enable waveguide devices having large (> 5 × 5 μm) symmetric fundamental-mode operation and long length (~1 cm). These large dimensions, relative to conventional waveguide devices, allow efficient coupling to optical fibers and external optical cavities, and provide reduced electrical and thermal resistances for improved heat dissipation.

This seminar will review the SCOW operating principles and describe applications of the SCOW technology, including Watt-class semiconductor SCOW lasers (SCOWLs) and amplifiers (SCOWAs), monolithic and ring-cavity mode-locked lasers, single-frequency external cavity lasers, and high-current waveguide photodiodes. The SCOW concept has been demonstrated in a variety of material systems at wavelengths including 915, 960–980, 1060, 1300, 1550, 1650, and 2100 nm. In addition to single emitters, higher brightness has been obtained by combining arrays of SCOWLs and SCOWAs using wavelength beam-combining and coherent combining techniques. These beam-combined SCOW architectures offer the potential of kilowatt-class, high-efficiency, electrically pumped optical sources.

1 PhD, Electrical Engineering, Georgia Institute of Technology
2 PhD, Electrical Engineering, University of Illinois at Urbana-Champaign
3 PhD, Electrical Engineering, University of Illinois at Urbana-Champaign
4 PhD, Electrical Engineering, Johns Hopkins University

Ultrasensitive Mass Spectrometry Development at MIT Lincoln Laboratory

Dr. Ta-Hsuan Ong1, Dr. Jude Kelley2, & Dr. Roderick Kunz3
MIT Lincoln Laboratory

Mass spectrometry (MS) has long been regarded as one of the most reliable methods for chemical analysis because of its ability to identify molecules based on their molecular weight and fragmentation patterns combined with its sensitivity in the pico- to femtogram range. Traditionally, analytical systems that utilize mass spectrometry have coupled this method with other analytical techniques such as gas or liquid chromatography; however, the development of ambient ionization techniques and improvements in mass spectrometer design have demonstrated that this technique can function on its own as a multipurpose chemical detector.

MIT Lincoln Laboratory has been advancing these systems in an effort to develop the next generation of MS-based sensing systems focused on detection missions relevant to national security. This work has centered on ways to improve the sensitivity of MS-based systems to explosives encountered as vapors and as trace particulate residues. The vapor detection system that the Laboratory has developed has real-time sensitivity to concentrations in the parts-per-quadrillion range. Real-time sensitivity at these levels rivals that of conventional vapor detectors (canines), providing opportunities to (1) better understand the origins, dynamics, concentrations, and attenuation levels of vapor signatures associated with concealed threats and (2) help improve canine training. The Laboratory's work on detecting explosive particulate residues has focused on applying thermal desorption and atmospheric pressure chemical ionization (TD-APCI) to swipe-based surface samples across a wide range of explosive classes. For explosive threats that are challenging to detect with TD-APCI, Lincoln Laboratory researchers have developed specialized chemical reagents that can be added directly to the ionization source to improve both the specificity and the sensitivity of the technique. The information revealed by these studies is used to assess current mass-spectrometer-based explosive trace detection systems and guide future development efforts.

1 PhD, Analytical Chemistry, University of Illinois at Urbana-Champaign
2 PhD, Physical Chemistry, Yale University
3 PhD, Analytical Chemistry, University of North Carolina at Chapel Hill

Space Systems Technology

Technical Seminars offered in space systems technology.

Overview of the NASA TROPICS CubeSat Constellation Mission

Dr. William J. Blackwell1
MIT Lincoln Laboratory

Recent technology advances in miniature microwave radiometers that can be hosted on very small satellites have made possible a new class of constellation missions that provide very high revisit rates of tropical cyclones and other severe weather. The Time-Resolved Observations of Precipitation structure and storm Intensity with a Constellation of Smallsats (TROPICS) mission was selected by NASA as part of the Earth Venture–Instrument (EVI-3) program and is now in development with planned launch readiness in late 2019. The overarching goal for TROPICS is to provide nearly all-weather observations of 3D temperature and humidity, as well as cloud ice and precipitation horizontal structure, at high temporal resolution to conduct high-value science investigations of tropical cyclones, including:

  1. Relationships of rapidly evolving precipitation and upper cloud structures to upper-level warm-core intensity and associated storm intensity changes;
  2. Evolution (including diurnal variability) of precipitation structure and storm intensification in relationship to environmental humidity fields; and
  3. The impact of rapid-update observations on numerical and statistical intensity forecasts of tropical cyclones. 

TROPICS will provide rapid-refresh microwave measurements (median refresh rate better than 60 minutes for the baseline mission) over the tropics that can be used to observe the thermodynamics of the troposphere and precipitation structure for storm systems at the mesoscale and synoptic scale over the entire storm lifecycle. TROPICS will comprise a constellation of six 3U CubeSats in three low-Earth orbital planes. Each CubeSat will host a high-performance scanning radiometer to provide temperature profiles using seven channels near the 118.75 GHz oxygen absorption line, water vapor profiles using three channels near the 183 GHz water vapor absorption line, imagery in a single channel near 90 GHz for precipitation measurements (when combined with higher resolution water vapor channels), and a single channel at 205 GHz that is more sensitive to precipitation-sized ice particles and low-level moisture.
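The sub-60-minute median revisit rate can be sanity-checked with a crude orbital calculation. The sketch below, which is not mission analysis, estimates the orbital period of a TROPICS-like CubeSat from Kepler's third law and the naive revisit interval for a plane carrying two evenly spaced satellites; the 550 km altitude is an assumed value for illustration.

```python
# Rough sketch (assumed 550 km circular orbit, not actual mission parameters):
# estimate orbital period via Kepler's third law, then the naive per-plane
# revisit interval with two evenly spaced satellites sharing the plane.
import math

MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6371e3            # mean Earth radius, m
altitude = 550e3            # assumed orbit altitude, m

a = R_EARTH + altitude                        # semi-major axis of circular orbit
period_s = 2 * math.pi * math.sqrt(a**3 / MU_EARTH)
period_min = period_s / 60

# Two evenly spaced satellites per plane pass over a point under the
# orbital track roughly every half period.
naive_revisit_min = period_min / 2
print(f"Orbital period ~ {period_min:.1f} min, "
      f"per-plane revisit ~ {naive_revisit_min:.1f} min")
```

With an orbital period near 95 minutes, two satellites per plane already bring the along-track revisit interval under an hour, consistent with the constellation's stated median refresh rate.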

This observing system offers an unprecedented combination of horizontal and temporal resolution in the microwave spectrum for measuring environmental and inner-core conditions of tropical cyclones on a nearly global scale. It represents a major leap forward in the temporal resolution of several key parameters needed by advanced data assimilation systems capable of utilizing rapid-update radiance or retrieval data. This presentation will provide an overview of the mission and an update on its current status, with a focus on recent performance simulations of the observables the constellation will provide, including temperature, water vapor, rain rate, and tropical cyclone intensity indicators.

1 PhD, Electrical Engineering, Massachusetts Institute of Technology


Synoptic Astronomy with the Space Surveillance Telescope

Dr. Deborah F. Woods1
MIT Lincoln Laboratory

The Space Surveillance Telescope (SST) is a 3.5-meter telescope with a 3-degree × 2-degree field of view developed by MIT Lincoln Laboratory for DARPA for tracking satellites and Earth-orbiting space debris. The telescope has a three-mirror Mersenne-Schmidt design that achieves a fast focal ratio of F/1.0, enabling deep, wide-area searches with rapid revisit rates. These attributes also have utility for detections in time domain astronomy, in which objects are observed to vary in position or in brightness. The SST acquired asteroid observations in support of the Lincoln Near-Earth Asteroid Research program from 2014 to 2017. In the course of its observing programs, the SST amassed a treasure trove of archival image data. While the SST is currently being relocated to Western Australia, analysis of the archival data continues.

With the development of a science pipeline for processing archival data, the SST is contributing to studies in time domain astronomy. The science goals are to identify specific classes of variable stars and to use each star's period of variability, together with color information from other astronomical catalogs, to estimate its distance. Object distances enable the 3D reconstruction of compact halo groups around the Milky Way galaxy, which help inform models of galaxy formation. While the SST's primary mission is to support space situational awareness, the image data collected during the course of operations has proven valuable for astronomical applications in the time domain.
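The distance-estimation step described above can be sketched in a few lines. If a star's variability class, and hence its approximate absolute magnitude, can be inferred from its period and color, the distance follows from the standard distance modulus relation m − M = 5 log₁₀(d / 10 pc). The magnitudes below are invented example values, not SST measurements.

```python
# Illustrative distance estimate for a pulsating variable star.
# Assumptions: the star has been classified (e.g., as an RR Lyrae-type
# variable with absolute magnitude near 0.6) from its period and color;
# the apparent magnitude here is a made-up example value.
import math

apparent_mag = 17.1   # hypothetical measured mean apparent magnitude, m
absolute_mag = 0.6    # roughly typical for an RR Lyrae-type variable, M

# Distance modulus: m - M = 5 * log10(d / 10 pc)  =>  d in parsecs:
distance_pc = 10 ** ((apparent_mag - absolute_mag + 5) / 5)
print(f"Estimated distance ~ {distance_pc / 1000:.0f} kpc")
```

A distance on the order of 20 kpc would place such a star well into the Galactic halo, which is why standard-candle variables of this kind are useful tracers for reconstructing halo substructure.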

1 PhD, Astronomy, Harvard University