Would you like to hear a Lincoln Laboratory talk?

We are pleased to present technical seminars to interested college and university groups. To arrange a technical seminar, please call (781) 981-2465 or email [email protected] and provide a rank-ordered list of requested seminars, your preferred date/time options, and a description of the target audience.

We offer the seminars in the following areas: air traffic control; communication systems; cyber security and information sciences; engineering; homeland protection; human language technology; machine learning and artificial intelligence; quantum information sciences; radar and signal processing; solid state devices, materials, and processes; space control technology; and systems and architectures. Please find related seminars and abstracts below.

Air Traffic Control

Technical Seminars offered in air traffic control.

Human-System Integration in Complex Systems

Dr. Hayley J. Davison Reynolds1
MIT Lincoln Laboratory

MIT Lincoln Laboratory has a successful history of integrating decision support systems into the air traffic control and emergency management domains, even though the introduction of new technologies often meets user resistance. Because of the complex nature of the work, air traffic controllers and emergency managers rely heavily on established technologies and decision processes to maintain a safe and effective operating environment. This reliance on familiar tools and processes makes introducing a new tool into the environment a difficult task, even though the tool may ultimately improve decisions and raise the safety and/or efficiency of the operation. Poor integration can leave users disappointed by a system that provides information they have no way to use or, more likely, can result in the system's going unused despite its place in the array of available information systems; in either case, the tool fails to deliver the benefits for which it was designed.

This seminar will present a practical methodology for designing and fielding decision support systems that maximizes the potential for effective integration into the users' operational context. Several air traffic control and emergency management decision support prototypes designed by Lincoln Laboratory, including the Route Availability Planning Tool (RAPT), the Tower Flight Data Manager (TFDM), and the Hurricane Evacuation (HVX) decision support tool, will be described to demonstrate this process. The presentation will also cover roadblocks the designers encountered in making the systems effective and the combination of qualitative and quantitative techniques used to properly integrate each system in the field, yielding measurable operational benefits.

1 PhD, Aeronautical Systems and Applied Psychology, Massachusetts Institute of Technology

 

Integrating Unmanned Aircraft Systems Safely into the National Airspace System

Dr. Rodney E. Cole1
MIT Lincoln Laboratory

Unmanned aircraft systems (UAS) such as the Air Force's Global Hawk and Predator are increasingly employed by the military and the Department of Homeland Security in roles that require sharing airspace with civilian aircraft. Missions include pilot training, border patrol, highway and agricultural observation, and disaster management. Because of the pressure for widespread UAS access to the national airspace and the risk of collision with passenger aircraft, UAS operators must find a way to integrate with manned aircraft with a very high degree of safety. The key to safe integration of UAS into the national airspace is the development and assessment of "sense and avoid" (SAA) technologies to replace the manned aircraft pilot's ability to "see and avoid" other aircraft.

MIT Lincoln Laboratory is conducting research to address safe and flexible UAS integration with commercial and general aviation aircraft. Research areas include development of sophisticated computer models that simulate millions of encounters between UAS and civilian aircraft to characterize airspace hazards and collision rates. These models can be applied to assess the performance of SAA algorithms designed to keep UAS "well clear" of civilian aircraft while observing and adhering to established right-of-way rules. The Laboratory is also conducting groundbreaking research in collision avoidance logic and is pursuing a probabilistic approach that accounts for uncertainty in pilot response to alerts and in the future states of the threat aircraft. This approach offers the potential for increased safety with fewer false alarms than conventional techniques provide and is a candidate for future Traffic Alert and Collision Avoidance System (TCAS) and UAS SAA applications.
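
In broad terms (a standard decision-theoretic formulation, offered here for orientation rather than as the exact objective used in the Laboratory's logic), such a probabilistic approach selects the advisory that minimizes expected cost over the uncertain future encounter:

    a^* = \arg\min_{a} \; \mathbb{E}\big[\, c(a, s_{1:T}) \mid \text{observations} \,\big],

where a ranges over candidate advisories, s_{1:T} denotes the uncertain future states of the encounter (including the pilot's response), and the cost c penalizes both collisions and unnecessary alerts. Taking the expectation explicitly is what allows the trade between safety and false alarms to be tuned.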

The Laboratory is also working with the Department of Defense (DoD) and Department of Homeland Security to develop ground-based sense and avoid (GBSAA) and airborne sense and avoid (ABSAA) surveillance architectures to satisfy the Federal Aviation Administration's (FAA) requirement for replacing the onboard pilot's "see and avoid" function. Under DoD sponsorship, Lincoln Laboratory has deployed a service-oriented architecture GBSAA test bed that will be utilized in operational and simulation-over-live environments to collect data and operator feedback that can then be used to support future certification with the FAA. This seminar will provide a broad overview of the Laboratory's efforts in UAS airspace integration and next-generation aircraft collision avoidance algorithms and will provide an overview of the GBSAA test bed that is under development for the DoD.

1 PhD, Mathematics, University of Colorado–Boulder

 

Radar Detection of Aviation Weather Hazards

Dr. John Y. N. Cho1
MIT Lincoln Laboratory

Bad weather is a factor in many aviation accidents and incidents. Microbursts, hail, icing, lightning, fog, and turbulence are all atmospheric phenomena that can interfere with aircraft performance and a pilot's ability to fly safely. Thus, for safe and efficient operation of the air traffic system, it is crucial to continuously observe meteorological conditions and accurately characterize phenomena hazardous to aircraft. Radar is the most important weather-sensing instrument for aviation. This seminar will discuss technical advances that led to today's operational terminal wind-shear detection radars. An overview of recent and ongoing research to improve radar capability to accurately observe weather hazards to aviation will also be presented.

1 PhD, Electrical Engineering, Cornell University

 

System Design in an Uncertain World: Decision Support for Mitigating Thunderstorm Impacts on Air Traffic

Richard A. DeLaura1
MIT Lincoln Laboratory

Weather accounts for 70% of the cost of air traffic delays — about $28 billion annually — within the United States National Airspace System (NAS). Most weather-related delays occur during the summer months, when thunderstorms affect air traffic, particularly in the crowded Northeast. The task of air traffic management, complicated even in the best of circumstances, can become overwhelmingly complex as air traffic managers struggle to route traffic reliably through rapidly evolving thunderstorms. A new generation of air traffic management decision support tools promises to reduce air traffic delays by accounting for the potential effects of convective weather, such as thunderstorms, on air traffic flow. Underpinning these tools are models that translate high-resolution convective weather forecasts into estimates of impact on aviation operations.

This seminar will present the results of new research to develop models of pilot decision-making and air traffic capacity in the presence of thunderstorms. The models will be described, initial validation will be presented, and sources of error and uncertainty will be discussed. Finally, some applications of these models and directions for future research will be briefly described.

1 AB, Physics, Harvard University

Communication Systems

Technical Seminars offered in communication systems.

Balloon Communications Relay Systems and Technology

Dr. Ameya Agaskar1
MIT Lincoln Laboratory

Today's smart devices, such as phones and watches, have allowed us to connect all aspects of our lives. However, these devices need enough radio spectrum to enable communications. As more smart devices are adopted, the same frequency spectrum will likely need to be utilized by a growing number of physically close transmitters to communicate to a base station, causing congestion and interference. A robust solution to this spectrum congestion and general interference problem is urgently needed.

This seminar provides an overview of a solution based on a sparse, large-aperture array of high-altitude balloon relays, in which a collection of low-cost balloons continuously covers a communication area of interest. Each balloon relays received signals down to a base station, where the signals are processed using advanced adaptive beamforming techniques. The individual user signals are extracted and decoded while the interference is suppressed. The large aperture of the balloon array allows for the spatial separation of very close transmitters on the ground. Among the difficulties overcome were the facts that the balloons drift with time and that their locations may not be precisely known by the base station at any given moment. The seminar covers these conceptual difficulties and their solutions, design choices and tradeoffs, actual beamforming results achieved for interference suppression, and videos of the balloon concept during a test flight.
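
As a concrete illustration of the kind of adaptive beamforming involved, a minimal sketch using the standard minimum-variance distortionless response (MVDR) weights (an illustrative assumption; the actual processing in this work may differ):

    w = \frac{R^{-1} a}{a^{H} R^{-1} a},

where R is the interference-plus-noise covariance estimated at the base station, a is the array steering vector toward the user of interest (which must itself be estimated when the balloon positions drift), and the weights w pass the desired signal with unit gain while minimizing output power from interferers.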

1 PhD, Engineering Sciences, Harvard University

 

Diversity in Air-to-Ground Lasercom: The FOCAL Demonstration

Dr. Frederick G. Walther1
MIT Lincoln Laboratory

Laser communications (lasercom) provides significant advantages over radio-frequency (RF) communications, including large, unregulated bandwidth and high beam directionality for free-space links. These advantages enable multi-Gb/s data transfer rates; reduced terminal size, weight, and power; and a degree of physical link security against out-of-beam interferers or detectors. This seminar addresses the key components of lasercom system design, including modeling and simulation of atmospheric effects, link budget development, employment of spatial and temporal diversity techniques to mitigate signal fading due to scintillation, and requirements for acquisition and tracking system performance. Results from recent flight demonstrations show stable tracking, rapid reacquisition after cloud blockages, and high data throughput for a multi-Gb/s communications link out to 80 km. Potential technologies for further development include a compact optical gimbal with low size, weight, and power, and more efficient modem and coding techniques to extend range and/or data rate.
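
For readers new to link budgets, the textbook free-space form below conveys what such a budget tallies (generic terms, not values specific to this demonstration):

    P_{rx}\,[\mathrm{dBm}] = P_{tx} + G_{tx} + G_{rx} - L_{fs} - L_{atm} - L_{sys},
    \qquad L_{fs} = 20 \log_{10}\!\left(\frac{4\pi d}{\lambda}\right),

where the antenna/telescope gains G and losses L are in dB, d is the link range, and λ is the optical wavelength. Scintillation enters as a statistical fading margin on top of these mean terms, which is precisely what the spatial and temporal diversity techniques are designed to reduce.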

1 PhD, Physics, Massachusetts Institute of Technology

 

Dynamic Link Adaptation for Satellite Communications

Dr. Huan Yao1
MIT Lincoln Laboratory

Future protected military satellite communications will continue to use high transmission frequencies to capitalize on large amounts of available bandwidth. However, the data flowing through these satellites will transition from the circuit-switched traffic of today's satellite systems to Internet-like packet traffic. One of the main differences in migrating to packet-switched communications is that the traffic will become bursty (i.e., the data rate from particular users will not be constant). The variation in data rate is only one of the potential system variations. At the frequencies of interest, rain and other weather phenomena can introduce significant path attenuation for relatively short time periods. Current protected satellite communications systems are designed with sufficient link margin to provide a desired availability under such degraded path conditions, but they have no provisions to convert excess link margin into additional capacity when weather conditions are good. The focus of this seminar is the design of a future satellite system that autonomously reacts to changes in link conditions and offered traffic. This automatic adaptation drastically improves the overall system capacity and the service that can be provided to ground terminals.
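
An idealized way to see the benefit of adaptation (a Shannon-limit sketch, not the actual waveform design) is that the achievable rate tracks the time-varying link quality:

    R(t) \le B \log_2\!\big(1 + \mathrm{SNR}(t)\big),

so a system provisioned for worst-case rain attenuation but locked to a fixed rate forfeits all the capacity available whenever SNR(t) rises above the worst-case design point; dynamic link adaptation reclaims that margin.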

1 PhD, Electrical Engineering, Massachusetts Institute of Technology

 

Laser Communications to Enable Space Exploration

Dr. Farzana I. Khatri1
MIT Lincoln Laboratory

Traditionally, communication between Earth and space has relied on radio frequency (RF) systems, which have been in use since the Apollo era, when sensor technology was primitive and the Internet did not exist. Today, commercial companies have deployed satellites with sensors that generate many terabytes of data each day, but only a fraction of these data is transmitted back to Earth because of communications bandwidth constraints. Furthermore, as humans venture deeper into space, higher communications bandwidth will be required to support them. Free-space laser communications, or "lasercom," offers a high-bandwidth, low size, weight, and power solution to the space bandwidth bottleneck by using the same lasers and techniques that revolutionized fiber-optic communications in the 1990s. This talk will describe Lincoln Laboratory's current efforts in designing and deploying lasercom systems with NASA, efforts to develop a strong industry base in lasercom, and upcoming lasercom missions.

1 PhD, Electrical Engineering, Massachusetts Institute of Technology

 

Practical Capacity Benchmarking for Wireless Networks

Dr. Jun Sun1
MIT Lincoln Laboratory

Despite the prevalence of ad hoc networks in wireless mesh networks, wireless sensor networks, and tactical communication environments, there is not yet a fundamental understanding of their achieved performance relative to the optimal limit. Many evaluations of emerging wireless ad hoc networking systems have left the impression that the systems perform poorly, but these evaluations have provided neither a clear vision of what performance should be attainable nor an indication of which details of network implementation are responsible for the systems' disappointing results. The goal of the work discussed in this seminar is to develop a network benchmarking capability that can be used to evaluate the performance of emerging wireless ad hoc networks. MIT Lincoln Laboratory's focus is on the development of a network capacity benchmark.

This talk presents an upper bound on wireless network capacity that is computationally efficient to evaluate. The upper-bound calculations consider multiple wireless interference models. The seminar shows that in a wireless network with n nodes and bounded degree, the gap between the network capacity and its upper bound is O(log n) under uniform traffic and several different interference models. For the more constrained case of a wireless N × M grid network, the upper bound is at most twice the network capacity.
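
Stated compactly (reading the "gap" as a multiplicative factor), with C denoting the unknown network capacity and U the computed upper bound:

    \frac{U}{C} = O(\log n) \quad \text{(bounded degree, uniform traffic)},
    \qquad U \le 2C \quad (N \times M \text{ grid}).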

The talk shows that, in practice, Lincoln Laboratory's approach performs extremely well. A polynomial-time randomized algorithm generates the upper bound in general wireless networks. To ascertain the quality of the upper bound, the Laboratory's researchers use a cross-layer approach to develop a lower bound by considering only a subset of independent sets. The lower bound is shown to be within 95% of the upper bound on average for the primary interference model. For the 802.11, 802.16, and 2-hop interference models, the lower bound is within 70% of the upper bound. These bounds are used to quantify the impact of commonly used routing- and scheduling-layer algorithms on overall network throughput. The seminar compares the individual effects of these algorithms on overall network throughput performance.

1 PhD, Electrical Engineering and Computer Science, Massachusetts Institute of Technology

 

Providing Information Security with Quantum Physics — A Practical Engineering Perspective

Dr. P. Benjamin Dixon1
MIT Lincoln Laboratory

Quantum information technology enables promising advances in communications security. A well-engineered application of quantum mechanics achieves results that are unattainable with classical-only processing. The best-known example is quantum key distribution, a family of protocols for distributing a secret key. With a carefully designed protocol, quantum mechanics bounds the information an eavesdropper could obtain without being detected.

A verifiable random number generator (vRNG) is another important quantum information technology. A secure source of random bits is an important input for nearly all cryptographic protocols. Classical RNGs, which are subject to silent failures, potentially create security vulnerabilities. A vRNG uses the observation of a quantum phenomenon — the presence of entanglement via a violation of Bell's inequalities — to certify the entropy of the vRNG output.
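
For concreteness, the CHSH form of Bell's inequality (a standard example; the Laboratory's particular test is not specified here) combines correlations E measured under two setting choices per side:

    S = E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad |S| \le 2

for any classical (local hidden variable) source, whereas measurements on entangled states can reach |S| = 2\sqrt{2} \approx 2.83. An observed violation of the classical bound therefore certifies that the device's output contains genuine, unpredictable quantum randomness.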

MIT Lincoln Laboratory has had great success developing practical communication system demonstrations. This seminar will explore the ongoing efforts at the Laboratory to engineer quantum communication systems, exploring both theoretical and experimental advances.

1 PhD, Physics, University of Rochester

 

Research Challenges in Airborne Networks and Communications

Dr. Bow-Nan Cheng1
MIT Lincoln Laboratory

In recent years, with the emergence of ubiquitous communications and the increasing availability of unmanned aerial vehicles, there has been an increased interest in building and leveraging airborne networks to facilitate data dissemination and coordination. Although the commercial world has made significant progress in building and deploying ground wireless networks, several military needs are currently not being met by available and emerging commercial technologies, particularly for use in airborne applications.

This talk presents four major airborne networking domains, examines unique domain characteristics, and identifies interesting research challenges for improving performance and scalability in the context of military airborne systems. Topics covered include techniques for improving physical layer performance in the presence of strong interference, antenna capabilities and limitations resulting from the antenna's mounting location on the aircraft surface, and methods for integrating radio-link state information with the platform network routers for improved routing decisions. The talk also presents an overview of some work performed at MIT Lincoln Laboratory in the past few years to address the research challenges, and to identify additional open areas for research, in airborne networking.

1 PhD, Computer Science, Rensselaer Polytechnic Institute

 

Undersea Laser Communication — The Next Frontier

Dr. Nicholas Hardy1
MIT Lincoln Laboratory

Undersea communication is of critical importance to many research and military applications, including oceanographic data collection, pollution monitoring, offshore exploration, and tactical surveillance. Undersea communication has traditionally been accomplished with acoustic and low-frequency radio links. These technologies, however, limit the speed of the undersea platform, cap data rates at a few bits per second (low-frequency radio) or multiple kilobits per second (acoustic), and radiate over a broad area. These factors lead to power inefficiencies, potential crosstalk, and multipath degradation. Undersea optical communication has the potential to mitigate these capability shortfalls, providing very-high-data-rate covert communication.
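
The reach of an undersea optical link is governed largely by exponential attenuation of light in water (a standard Beer-Lambert sketch; the detailed physics models described below also account for scattering and ambient sunlight):

    P(d) = P_0 \, e^{-c\,d},

where c is the wavelength-dependent attenuation coefficient of the water, smallest in the blue-green portion of the spectrum, which is why undersea lasercom operates there; clear ocean water supports far longer ranges than turbid coastal water for the same transmit power.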

This seminar provides an overview of applications that might benefit from high-rate undersea communication and of the relevant technologies, and it reports on new research into the ultimate performance limits of undersea optical communication. Using sophisticated physics models of laser and sunlight propagation undersea, Lincoln Laboratory researchers have found that gigabit-per-second links over appreciable ranges, in excess of 100 meters, are possible in much of the world's oceans. This capability promises to transform undersea communications. Finally, the seminar presents architectures based on currently realizable technology that can achieve the predicted performance limits.

1 PhD, Electrical Engineering, Massachusetts Institute of Technology

 

Waveform Design for Airborne Networks

Dr. Frederick J. Block1
MIT Lincoln Laboratory

Airborne networks are an integral part of modern network-centric military operations. These networks operate in a unique environment that poses many research challenges. For example, nodes are much more mobile in airborne networks than in ground-based networks and are often separated by great distances that cause long delays and large propagation losses. Additionally, the altitude of an airborne receiver potentially places it within line of sight of many transmitters located over a wide area. Some of these transmitters may be part of the same network as the receiver yet still cause multiple-access interference, which complicates channel access protocol design for airborne networks. Other transmitters outside the network may also cause interference. The waveform design must specifically account for this environment to achieve the required level of performance. Physical, link, and network layer techniques for providing high-rate, reliable airborne networks will be discussed.

1 PhD, Electrical Engineering, Clemson University

Cyber Security & Information Sciences

Technical Seminars offered in cyber security and information sciences.

Addressing the Challenges of Big Data Through Innovative Technologies

Dr. Vijay Gadepally1 & Dr. Jeremy Kepner2
MIT Lincoln Laboratory

The ability to collect and analyze large amounts of data is increasingly important within the scientific community. The growing gap between the volume, velocity, and variety of data available and users' ability to handle this deluge calls for innovative tools to address the challenges imposed by what has become known as big data. MIT Lincoln Laboratory is taking a leading role in developing a set of tools to help solve the problems inherent in big data.

Big data's volume stresses the storage, memory, and compute capacity of a computing system and requires access to a computing cloud. Choosing the right cloud is problem-specific. Currently, four multibillion-dollar ecosystems dominate the cloud computing environment: enterprise clouds, big data clouds, SQL database clouds, and supercomputing clouds. Each cloud ecosystem has its own hardware, software, communities, and business markets. The broad nature of big data challenges makes it unlikely that one cloud ecosystem can satisfy all needs, and solutions are likely to require tools and techniques from more than one ecosystem. The MIT SuperCloud was developed to provide one such solution. To our knowledge, the MIT SuperCloud is the only deployed cloud system that allows all four ecosystems to coexist without sacrificing performance or functionality.

The velocity of big data stresses the rate at which data can be absorbed and meaningful answers can be produced. Through an initiative led by the National Security Agency (NSA), a Common Big Data Architecture (CBDA) was developed for the U.S. government. The CBDA is based on the Google Bigtable NoSQL approach and is now in wide use. Lincoln Laboratory was instrumental in the development of the CBDA and is a leader in adapting it to a variety of big data challenges. The centerpieces of the CBDA are the NSA-developed Apache Accumulo database (capable of ingesting millions of entries per second) and the Lincoln Laboratory–developed Dynamic Distributed Dimensional Data Model (D4M) schema.

Finally, big data variety may present both the largest challenge and the greatest set of opportunities for supercomputing. The promise of big data is the ability to correlate heterogeneous data to generate new insights. The combination of Apache Accumulo and D4M technologies allows vast quantities of highly diverse data (bioinformatics, cyber-relevant data, social media data, etc.) to be automatically ingested into a common schema that enables rapid query and correlation of elements.
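
As a toy illustration of the common-schema idea, the sketch below mimics the D4M convention of exploding records into sparse "field|value" columns (a simplified Python stand-in for the concept, not the actual D4M or Accumulo API):

    from collections import defaultdict

    # Toy associative array in the spirit of the D4M exploded schema:
    # each record becomes a sparse row whose column keys are "field|value"
    # strings, so heterogeneous sources share one queryable schema.
    table = defaultdict(dict)

    def ingest(record_id, record):
        for field, value in record.items():
            table[record_id][f"{field}|{value}"] = 1

    ingest("log-001", {"src_ip": "10.1.2.3", "user": "alice"})
    ingest("bio-007", {"gene": "BRCA2", "sample": "S17"})
    ingest("log-002", {"src_ip": "10.1.2.3", "user": "bob"})

    # Correlation reduces to matching column keys across rows,
    # regardless of which source produced each record.
    hits = [rid for rid, cols in table.items() if "src_ip|10.1.2.3" in cols]
    print(hits)  # ['log-001', 'log-002']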

1 PhD, Electrical and Computer Engineering, The Ohio State University
2 PhD, Physics, Princeton University

 

CASCADE: Cyber Adversarial Scenario Modeling and Automated Decision Engine

Trang Nguyen1
MIT Lincoln Laboratory

Cyber systems are complex; they include numerous heterogeneous and interdependent entities and actors, such as mission operators, network users, attackers, and defenders, that interact within a network environment. Decisions must be made to secure cyber systems, mitigate threats, and minimize mission impact. Administrators must consider how to optimally partition a network using firewalls, filters, and other mechanisms to restrict the movement of an attacker who gains a foothold on the network and to limit the damage that the attacker can cause.

Current cyber decision support focuses on gathering network data, processing these data into knowledge, and visualizing the resultant knowledge for cyber situational awareness. However, as the data become more readily obtainable, network administrators face the same problem that business analysts have faced for some time: more data or knowledge does not by itself lead to better decisions. Cyber decision support is needed that can automatically recommend timely and sound data-driven courses of action. Currently, decisions about partitioning a network are made by analysts using subjective judgment that is sometimes informed by best practices. We propose a decision system that automatically recommends an optimal network-partitioning architecture to maximize security posture and minimize mission impact.
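
One way to make the optimization concrete (an illustrative sketch under simplified assumptions, not the CASCADE engine itself) is to cast partitioning as a minimum-cut problem: place blocking rules on the cheapest set of links that separates likely attacker footholds from critical mission assets. The node names and costs below are hypothetical.

    import networkx as nx

    # Directed graph of the network; edge capacity = cost of blocking that
    # link (e.g., mission traffic a firewall rule there would disrupt).
    G = nx.DiGraph()
    edges = [
        ("internet", "web", 1), ("web", "app", 2), ("app", "db", 1),
        ("web", "db", 4), ("internet", "vpn", 2), ("vpn", "app", 1),
    ]
    for u, v, cost in edges:
        G.add_edge(u, v, capacity=cost)

    # Cheapest set of links whose removal separates the attacker's
    # foothold ("internet") from the critical asset ("db").
    cut_value, (attacker_side, protected_side) = nx.minimum_cut(G, "internet", "db")
    blocked = [(u, v) for u in attacker_side for v in G[u] if v in protected_side]
    print(cut_value, blocked)

A real recommendation engine must additionally weigh mission impact and explicit attacker models, which is what distinguishes the proposed system from this toy formulation.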

1 MS, Columbus State University

 

Cryptographically Secure Computation

Dr. Emily Shen1
MIT Lincoln Laboratory

In today's big data world, the ability to collect, share, and analyze data has led to unprecedented capabilities as well as unprecedented privacy concerns. Often, people or organizations would like to collaborate and obtain results from their collective data but without divulging their individual data. Current techniques to enable such sharing require that the parties either entrust each other with their sensitive data or find a mutually trusted third party who can perform the computation on their behalf; these approaches may be undesirable or impractical.

Secure multiparty computation (MPC) is a cryptographic technique that enables parties to perform joint computations without revealing their sensitive inputs and without using a trusted third party. MPC not only can provide privacy for existing applications but also can enable new applications that are currently not possible. For example, MPC can be used to enable collaboration and information exchange between partner organizations through selective data sharing and to guarantee security of computation performed in an untrusted environment such as a cloud. In this talk, we illustrate the cryptographic ideas behind MPC. We then describe the design and optimization of MPC for a collaborative anomaly detection algorithm.
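
To make the flavor of MPC concrete, the sketch below shows additive secret sharing, a building block of many MPC protocols (a minimal illustration, not the protocol used in the work described):

    import secrets

    P = 2**61 - 1  # a large prime modulus; all arithmetic is mod P

    def share(value, n_parties):
        """Split `value` into n additive shares that sum to it mod P."""
        shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
        shares.append((value - sum(shares)) % P)
        return shares

    # Two parties each secret-share a private input.
    alice_shares = share(42, 2)
    bob_shares = share(17, 2)

    # Each party locally adds the shares it holds ...
    partial_sums = [(a + b) % P for a, b in zip(alice_shares, bob_shares)]

    # ... and only the combined result reveals the sum, not the inputs.
    assert sum(partial_sums) % P == 42 + 17

Full MPC protocols build on such shares to evaluate arbitrary functions, with additional machinery for multiplication and for protecting against malicious parties.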

1 PhD, Electrical Engineering and Computer Science, Massachusetts Institute of Technology

 

Cyber Security Metrics

Dr. Robert Lychev1
MIT Lincoln Laboratory

Recent cyber attacks on government, commercial, and institutional computer networks have highlighted the need for increased cyber security measures. To better quantify, understand, and therefore more effectively combat this ever-growing threat, the U.S. government is shifting the security strategy for its computer networks and systems from one of yearly compliance checks to one of continuous monitoring and assessment. While continuous monitoring refers to the ability to maintain constant awareness of the configuration and status of computer networks and systems, assessment refers to the ability to accurately appraise the security posture of the systems and to estimate the risk associated with them. Clearly, the effectiveness of this strategy hinges on the ability to accurately assess cyber risk.

This seminar will present a methodology for producing useful cyber security metrics that are derived from realistic and well-defined mathematical attacker models. These metrics can be continuously evaluated from operational security data and, thus, support the government's new security strategy of continuous monitoring. The speaker will show how this methodology has been used to develop several important security metrics that are based upon the SANS Institute's list of 20 critical security controls. Live demonstrations will illustrate how these security metrics are used to assess risk in an operational context.

1 PhD, Computer Science, Georgia Institute of Technology

 

Evaluating Cyber Moving Target Techniques

Dr. Hamed Okhravi1
MIT Lincoln Laboratory

The concept of cyber moving target (MT) defense has been identified as one of the game-changing themes to rebalance the landscape of cyber security. MT techniques make cyber systems "a moving target," that is, less static, less homogeneous, and less deterministic in order to create uncertainty for attackers.

Although many MT techniques have been proposed in the literature, little work has been done to evaluate their effectiveness, benefits, and weaknesses. This seminar discusses the evaluation of a wide range of MT techniques. First, a qualitative assessment examines the potential benefits, gaps, and weaknesses of each category of MT strategy. This step identifies major gaps in order to guide future research and prototyping efforts. The findings of a case study on code-reuse defenses are presented. For the MT techniques identified as potentially more beneficial in the qualitative assessment, a deeper quantitative assessment is performed by examining real exploits. Next, the seminar discusses an operational assessment of an MT technique inside a larger system and illustrates how important parameters of such techniques can be monitored for improved effectiveness. Finally, possible directions for future work in this domain are outlined.
 
1 PhD, Electrical and Computer Engineering, University of Illinois at Urbana-Champaign

 

Experiences in Cyber Security Education:
The MIT Lincoln Laboratory Capture-the-Flag Exercise

Timothy R. Leek1
MIT Lincoln Laboratory
Dr. Nickolai Zeldovich2
MIT Computer Science and Artificial Intelligence Laboratory

Many popular and well-established cyber security capture-the-flag (CTF) exercises are held each year in a variety of settings, including universities and semiprofessional security conferences. The CTF format also varies greatly, ranging from linear puzzle-like challenges to team-based offensive and defensive free-for-all hacking competitions. While these events are exciting and important as contests of skill, they offer limited educational opportunities. In particular, because participation requires considerable a priori domain knowledge and practical computer security expertise, the majority of typical computer science students are excluded from taking part in these events. The goal in designing and running the MIT Lincoln Laboratory CTF was to make the experience accessible to a wider community by providing an environment that would not only test and challenge the computer security skills of the participants but also educate and prepare those without extensive prior expertise. This seminar presents the Laboratory's deliberately educational and open CTF, including discussions of the teaching methods, game design, scoring measures, logged data, and lessons learned.

1 MS, Computer Science, University of California–San Diego
2 PhD, Computer Science, Stanford University

 

Modeling to Improve the Science of Insider Threat Detection

John Holodnak1
MIT Lincoln Laboratory

Recent high-profile security breaches have highlighted the importance of insider threat detection systems for cyber security. Most of these detection systems rely on the manual or semi-automated analysis of audit logs to identify malicious activity. However, issues such as high false-positive rates and concerns over data privacy make it difficult to predict performance of these systems within an enterprise environment. These and other issues limit an organization’s ability to effectively apply these tools.
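
A quick worked example (hypothetical numbers) shows why false-positive rates dominate this problem. In an enterprise of 10,000 users containing a single true insider, even a detector with a 1% false-positive rate flags about 100 innocent users, so the precision of its alerts is roughly

    \text{Precision} = \frac{TP}{TP + FP} \approx \frac{1}{1 + 10{,}000 \times 0.01} \approx 1\%,

meaning nearly every alert is a false alarm despite the rate sounding small. Enterprise-level modeling makes this kind of base-rate effect visible before deployment.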

This seminar will present an approach that leverages enterprise-level modeling to predict the performance of insider threat detection systems. We provide a proof of concept of our modeling approach by applying it to a synthetic dataset and comparing its predictions to ground truth. The resulting performance predictions can enable enterprises to compare tools and ultimately make better-informed decisions about which insider threat detection systems to deploy.

1 PhD, Mathematics, North Carolina State University

 

Multicore Programming in pMatlab® Using Distributed Arrays

Dr. Jeremy Kepner1
MIT Lincoln Laboratory

MATLAB is one of the most commonly used languages for scientific computing, with approximately one million users worldwide. Many programs written in MATLAB can benefit from the increased performance offered by multicore processors and parallel computing clusters. The MIT Lincoln Laboratory pMatlab library allows high-performance parallel programs to be written quickly by using the distributed arrays programming paradigm. This talk introduces distributed arrays programming and describes best practices for using distributed arrays to produce programs that perform well on multicore processors and parallel computing clusters. These practices include understanding the concepts of parallel concurrency versus parallel data locality, applying Amdahl's Law, and employing a well-defined design-code-debug-test process for parallel codes.
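
For reference, Amdahl's Law gives the maximum speedup S on N processors when a fraction p of the computation parallelizes:

    S(N) = \frac{1}{(1 - p) + p/N},

so a code that is 90% parallel (p = 0.9) is limited to a 10x speedup no matter how many cores are used, which is why minimizing the serial fraction is a central best practice.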

1 PhD, Physics, Princeton University

 

Quantitative Evaluation of Moving Target Technology

Paula Donovan1
MIT Lincoln Laboratory

Robust, quantitative methods are critically needed to measure the utility, impact, and cost of cyber technologies. Our work addresses this need by developing metrics and experimental methodologies for evaluating cyber technologies. The technological focus of this presentation is Moving Target technology, with the goal of quantitatively measuring resiliency and agility in controlled, repeatable experiments. The evaluation aims to identify one or more specific aspects that each technique is intended to improve and then to quantitatively measure the technique's ability to realize that improvement with respect to a relevant, realistic threat model. This presentation will describe this approach to quantitative evaluation, including methodology and metrics; results of analysis, simulation, and experiments; and a series of lessons learned.

1 MS, Applied Mathematics/Computer Science, University of Massachusetts Amherst

 

Secure and Resilient Cloud Computing for the Department of Defense

Dr. Nabil A. Schear1
MIT Lincoln Laboratory

The Department of Defense (DoD) and the intelligence community (IC) are rapidly moving to adopt cloud computing to achieve substantial user benefits, such as the ability to store and access massive amounts of data, on-demand delivery of computing services, the capability to widely share information, and the scalability of resource usage. However, cloud computing both exacerbates existing cyber security challenges and introduces new challenges.

Today, most cloud data remain unprotected and therefore vulnerable to theft, deception, or destruction. Furthermore, the homogeneity of cloud software and hardware increases the risk of a single cyber attack on the cloud, causing widespread impact. To address these challenges, Lincoln Laboratory is developing technology that will strengthen the security and resilience of cloud computing so that the DoD and IC can confidently deploy cloud services for their critical missions. This seminar will detail the cyber security threats to cloud computing, the Laboratory's approach to cloud security and resilience, and results from several novel technologies demonstrating enhanced cyber capabilities with low overhead.

1 PhD, Computer Science, University of Illinois at Urbana-Champaign

 

Web Security Origins and Evolution

Mary Ellen Zurko1
MIT Lincoln Laboratory

Computer systems and software should have security designed in from the start, giving them a firm foundation for the necessary security goals and principles based on confidentiality, integrity, and availability. After that, our systems and software evolve: functionality is added, and new technologies are integrated. Attacks occur and evolve right along with our systems. Our security evolves with our systems and with the attacks. Unfortunately, there is a lack of papers, publications, and case studies on the successful application of this security life cycle model.

In this seminar, a successful distributed system, the World Wide Web, serves as a case study in security evolution. Initial security features will be reviewed, with lessons learned from those that were adopted and those that were not. The transition to attack response brought a different set of features, and some early security choices turned out not to meet the security goals set for them. New features and technologies brought basic shifts in the security assumptions made in early design decisions. More recent attacks will be discussed, especially those enabled by current use and development of the Web, which were largely unimagined during the initial design of the technology and its protections. This perspective provides security lessons for systems of all sizes and at all stages of maturity.

1 MS, Computer Science, Massachusetts Institute of Technology

Engineering

Technical Seminars offered in engineering.

Advanced Structures and Materials Using Additive Manufacturing

Derek Straub1
MIT Lincoln Laboratory

MIT Lincoln Laboratory is using additive manufacturing processes to create novel engineering solutions and to develop prototypes for both research and national security applications. Additive manufacturing offers the ability to build unique structures with complex external and internal geometries, producing strong, lightweight, multifunctional structures that are not possible with traditional manufacturing techniques. High-performance plastics are employed in a broad variety of engineering applications, including unmanned aerial vehicles and airborne sensors. The use of metals, including aluminum and titanium for high-performance, lightweight cooling systems, is increasing, and research teams are developing techniques to bring the mechanical properties of printed parts closer to those of their wrought counterparts. The development of new materials offering increased mechanical performance to meet demanding size, weight, performance, and environmental requirements for next-generation prototype systems is underway.

1 MEng, Engineering in Manufacturing, Massachusetts Institute of Technology

 

Improved Resilience for the Electric Grid

Erik R. Limpaecher1 & Nicholas Judson2
MIT Lincoln Laboratory

The U.S. electric power grid has been described as the largest interconnected machine on Earth. Its continuous and stable operation has become essential to the day-to-day functioning of our economy and our households. However, the grid is undergoing a period of rapid change while facing an increasing number of threats. The national grid is composed of aging and outdated infrastructure, and with the addition of distributed intermittent renewables and natural gas generation, it is evolving in ways that often decrease grid resilience. As the climate changes and the potential for physical or cyber terrorist attacks grows, the likelihood of catastrophic outages due to extreme weather events and cyber incidents increases. MIT Lincoln Laboratory has been working with federal, state, and city governments as well as utilities and grid operators to improve the resilience of critical facilities and to accelerate the adoption of distributed energy resources. This seminar will describe recent progress on new resilience modeling tools and analysis methods as well as on an enabling hardware-in-the-loop test bed prototype.

1 BS, Electrical Engineering, Princeton University
2 PhD, Microbiology and Molecular Genetics, Harvard University

 

Optomechanical Systems Engineering for Space Payloads

Dr. Keith B. Doyle1
MIT Lincoln Laboratory

MIT Lincoln Laboratory has developed a broad array of optical payloads for imaging and communication systems in support of both science and national defense applications. Space provides a particularly harsh environment for high-performing optical instruments that must survive launch and then meet demanding imaging and pointing requirements in dynamic thermal and vibration operational settings. This seminar presents some of the mechanical challenges in building optical payloads for space and draws on a history of successful optical sensors built at MIT Lincoln Laboratory along with recent systems in the news for both imaging and communication applications. Details include an overview of recent optical payloads and their applications, optomechanical design considerations, the impact and mitigation of environmental effects, and environmental testing.

1 PhD, Engineering Mechanics, University of Arizona

 

Self-Driving Vehicles Using Localizing Ground-Penetrating Radar

Michael Boulet1
MIT Lincoln Laboratory

Self-driving vehicles have the potential to prevent most of the more than 35,000 vehicle fatalities in the U.S. each year. However, current implementations are not mature enough for widespread adoption. Optical sensors provide remarkably successful lane-keeping capabilities, yet they struggle in a range of conditions and scenarios. This seminar will describe in detail MIT Lincoln Laboratory's Localizing Ground-Penetrating Radar (LGPR) system, a sensor that provides real-time estimates of a vehicle's position even in challenging weather and road conditions. Because LGPR is an independent method of localization, fusing it with traditional approaches offers significant potential gains in safety and robustness. Video demonstrations include closed-loop autonomous steering systems deployed in Afghanistan and Boston and real-time localization at highway speeds at night in a snowstorm.

1 MS, Mechanical Engineering, Massachusetts Institute of Technology

Homeland Protection

Technical Seminars offered in homeland protection.

3D Plume Reconstruction

William Lawrence1, Clara McCreery2, Janice Crager3, & Benjamin L. Ervin4
MIT Lincoln Laboratory

MIT Lincoln Laboratory has developed a capability to reconstruct a smoke plume as a point cloud in 3D space over time. We use a ring of video cameras to record a smoke release from many different angles and then apply a segmentation algorithm to separate the smoke plume from the rest of the scene in each recorded video. The segmented plumes are projected into 3D space at each time step using a modified structure-from-motion algorithm. This capability is used to referee other sensors, verify theoretical models of plume behavior, and measure plume characteristics such as volume and bulk velocity over time.

1 PhD, Chemistry, University of California, Irvine
2 BS, Chemical Engineering, Stanford University
3 MS, Electrical Engineering, Boston University
4 PhD, Industrial Engineering, University of Illinois

 

Behavioral Biomarkers to Quantify Neurological and Cognitive Status

Adam C. Lammert1
MIT Lincoln Laboratory

MIT Lincoln Laboratory has been developing noninvasive biomarkers of neurological condition based upon observable human behaviors; these biomarkers hold promise for objective monitoring of psychological health, neurological function, and cognitive status. Biomarkers based on speaking behavior have been successfully applied to the assessment of a variety of conditions, such as depression, Parkinson's disease, and cognitive fatigue. This seminar will describe prior and ongoing work on speech-based assessment of these conditions before focusing on recent advances in the area of traumatic brain injury.

Traumatic brain injury (TBI) is prevalent in civilians and service members and is associated with a diverse set of cognitive and sensorimotor symptoms. As such, there is a pressing need to go beyond the typical coarse severity classifications (i.e., mild, moderate, severe) when diagnosing patients with TBI. It is essential to develop technologies that can disambiguate phenotypes of mild TBI (mTBI), which in turn should facilitate the design of enhanced patient care systems for monitoring and targeted intervention. Recent advances in virtual reality, embodied in the virtual environment at the MIT Lincoln Laboratory STRIVE Center, have enabled increasingly immersive sensorimotor provocative tests, which can challenge subjects with situations that elicit underlying deficits. Pairing provocative tests with neurologically relevant, mechanical models of sensorimotor control (developed at Lincoln Laboratory) provides a neurocomputational substrate that can aid in establishing causal relationships between impaired pathways and observed behaviors. Ultimately, individualized, mechanical models of sensorimotor control may aid in designing novel physiological assessments, as well as in predicting outcomes associated with rehabilitation interventions for people who have suffered a TBI.

1 PhD, Computer Science, University of Southern California

 

Disease Modeling to Assess Outbreak Detection and Response

Dr. Diane C. Jamrog1
MIT Lincoln Laboratory

Bioterrorism is a serious threat that has become widely recognized since the anthrax mailings of 2001. In response, one national research activity has been the development of biosensors and networks thereof. A driving factor behind biosensor development is the potential to provide early detection of a biological attack, thereby enabling timely treatment. This presentation introduces a disease progression and treatment model to quantify the potential benefit of early detection. To date, the model has been used to assess responses to inhalation anthrax and smallpox outbreaks.

1 PhD, Computation and Applied Mathematics, Rice University

 

Engineering Cells to Solve Sensing Challenges

Dr. Fran Nargi1
MIT Lincoln Laboratory

MIT Lincoln Laboratory has a successful history of developing novel cell-based biosensors. These sensors combine multiple distinct genetically engineered functions to produce reliable readouts of the presence of key biothreats, and they can do so much more rapidly than other detection technologies. Research is ongoing to expand our repertoire of biological sensors for recognition of biological signaling molecules such as proteins, hormones, and neurotransmitters, which can serve as important markers for monitoring health, assessing the effects of therapies or environmental stressors, diagnosing disease, and guiding medical treatment. The capability to monitor biological molecules in real time on a continuous or high-frequency basis would enable adverse trends to be identified quickly, drug or stressor responses to be tracked dynamically, and molecular cycles to be characterized. Our goal is to reduce the complexity and resource requirements of current real-time sensors to enable specific recognition and measurement of the biological molecules of interest.

This seminar will present a historical perspective on Lincoln Laboratory sensor development from design to transition to industry, including successes and pitfalls, preliminary results of new biosensor research, and future trends and needs.

1 PhD, Pathobiology, University of Connecticut

Human Language Technology

Technical Seminars offered in human language technology.

New Approaches to Automatic Speaker Recognition and Forensic Considerations

Dr. Joseph P. Campbell1 & Dr. Pedro A. Torres-Carrasquillo2
MIT Lincoln Laboratory

Recent gains in the performance of automatic speaker recognition systems have been obtained by new methods in subspace modeling. This talk presents the development of speaker recognition systems, ranging from traditional approaches, such as Gaussian mixture modeling (GMM), to novel state-of-the-art systems employing subspace techniques, such as factor analysis and iVector methods. This seminar also covers research on means to exploit high-level information; for example, idiosyncratic word usage and speaker-dependent pronunciation are high-level features for recognizing speakers. These high-level features can be combined with conventional features for increased accuracy. The seminar presents new methods to increase robustness and improve calibration of speaker recognition systems by addressing common factors in the forensic domain that degrade recognition performance. We describe MIT Lincoln Laboratory's VOCALINC system and its application to automated voice comparison of speech samples for law enforcement investigation and forensic applications. The talk concludes with appropriate uses of this technology, especially cautions regarding forensic-style applications, and a look at this technology's future directions.
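
As background on the subspace techniques mentioned above, the standard iVector model (the textbook formulation) represents an utterance's GMM mean supervector M as a low-dimensional offset from a universal background model:

    M = m + T w,

where m is the universal background model's mean supervector, T is a low-rank total-variability matrix learned from data, and the posterior estimate of the latent vector w (the iVector) is the compact utterance representation compared across recordings.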

1 PhD, Electrical Engineering, Oklahoma State University
2 PhD, Electrical Engineering, Michigan State University

 

Signal Processing for the Measurement of Characteristic Voice Quality

Dr. Nicolas Malyska1 & Dr. Thomas F. Quatieri2
MIT Lincoln Laboratory

The quality of a speaker's voice communicates to a listener information about many characteristics, including the speaker's identity, language, dialect, emotional state, and physical condition. These characteristic elements of a voice arise because of variations in the anatomical configuration of a speaker's lungs, voice box, throat, tongue, mouth, and nasal airways, as well as the ways in which the speaker moves these structures. The voice box, or larynx, is of particular interest in voice quality, as it is responsible for generating variations in the excitation source signal for speech.

In this seminar, we will discuss the mechanisms by which voice-source variations are generated, appear in the acoustic signal, and are perceived by humans. Our focus will be on using signal processing to capture acoustic phenomena resulting from the voice source. The presentation will explore several applications that build upon these measurement techniques, including (1) estimation of turbulence-noise components during aperiodic phonation, (2) automatic labeling of regions of irregular phonation, and (3) the analysis of pitch dynamics.

1 PhD, Health Sciences and Technology, Massachusetts Institute of Technology
2 ScD, Electrical Engineering, Massachusetts Institute of Technology

 

Sparse Coding and Applications in Human Language Technology

Dr. Youngjune L. Gwon1
MIT Lincoln Laboratory

Machine learning has become a central component of Lincoln Laboratory's Human Language Technology portfolio. Sparse coding is a class of unsupervised machine learning methods that enables the discovery of structural patterns in data with unknown priors. Used primarily as a representation learning tool, sparse coding is data-driven and seeks an efficient, alternative form of the data suitable as input to a supervised task such as classification or regression.
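
The sparse coding optimization problem discussed in the seminar is typically posed in the following standard form (a common formulation, shown here for orientation):

    \min_{D,\,\{\alpha_i\}} \sum_i \left( \lVert x_i - D \alpha_i \rVert_2^2 + \lambda \lVert \alpha_i \rVert_1 \right),

where each data vector x_i is approximated as a sparse combination α_i of the columns (atoms) of a learned dictionary D, and λ trades reconstruction fidelity against sparsity; the sparse codes α_i then serve as the alternative representation passed to a downstream classifier or regressor.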

This seminar will explain the theory of sparse linear models and discuss different computational approaches to the sparse coding optimization problem. It will also describe how one can apply sparse coding to a broad range of machine learning problems and present sparse coding-enabled applications in seized media analytics, affective computing, and speaker and language identification. The seminar will feature a detailed case study on multimedia event detection relevant to several Lincoln Laboratory programs that help the intelligence community solve complex multimodal data analytics problems.

1 PhD, Computer Science, Harvard University

 

Speech Enhancement Using Deep Neural Networks

Dr. Jonas Borgstrom1 & Dr. Michael Brandstein2
MIT Lincoln Laboratory

Speech signals captured by law enforcement or the intelligence community are typically degraded by additive noise and reverberation. Speech enhancement aims to suppress such distortion, leading to improved intelligibility and reduced fatigue in analysts. This talk will review single-channel speech enhancement, and discuss recent advances made by Lincoln Laboratory that leverage deep neural networks.
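
A common formulation in this line of work (a standard time-frequency masking sketch; the Laboratory's specific architecture may differ) trains a deep neural network to predict a mask M applied to the noisy spectrogram:

    \hat{S}(t,f) = M(t,f)\,\big|Y(t,f)\big|, \qquad 0 \le M(t,f) \le 1,

where |Y| is the magnitude spectrogram of the degraded input and the network learns M from paired clean and corrupted speech; the enhanced signal is then resynthesized from \hat{S} using the noisy phase.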

1 PhD, Electrical Engineering, University of California Los Angeles
2 PhD, Electrical Engineering, Brown University

Machine Learning and Artificial Intelligence

Technical Seminars offered in machine learning and artificial intelligence.

Artificial Intelligence for Cyber Security

Dr. Joseph R. Zipkin1
MIT Lincoln Laboratory

Despite cyber defenders' best efforts to secure computer systems over the past several decades, cyber adversaries continue to outpace and outsmart them by hacking systems and stealing data. Recent advances in artificial intelligence (AI) can be leveraged to improve this situation by helping defenders keep pace with the flood of cyber data from networks and systems and enabling them to make progress in securing their systems. For example, AI can help security analysts focus on relevant signals in large amounts of situational awareness data, and it can be leveraged to automatically generate cyber courses of action in response to cyber threats. This seminar will present an overview of the challenges of cyber security that are amenable to AI solutions and provide a summary of recent work in analyzing online open-source data in support of attack prediction.

1 PhD, Mathematics, University of California Los Angeles

 

Combating Illicit Marketplaces on the Deep and Dark Web Using Machine Learning

Dr. Charlie Dagli1, Dr. Lin Li2, & Dr. Joseph Campbell3
MIT Lincoln Laboratory

Increasingly, illicit economic activity is mediated online. Illicit marketplaces on the Deep Web and Dark Web provide anonymous platforms for the sale and purchase of illegal goods and services. Unfortunately, traditional technologies for indexing and search are not sufficient for combating the illicit activities observed in these marketplaces. In this talk, we will describe machine learning-based approaches for enabling law enforcement to counter activities observed in these marketplaces. In particular, we will discuss technologies for multiplatform persona linking (e.g., determining whether a user in one illicit marketplace is likely to be the same person as a user in another illicit marketplace). We will also discuss uncued discovery of organizations operating in illicit marketplaces. Additionally, we will discuss how these technologies were used in the Defense Advanced Research Projects Agency's Memex program and how Memex is providing significant impact to law enforcement agencies combating illicit activity online.

1 PhD, Electrical and Computer Engineering, University of Illinois at Urbana-Champaign
2 PhD, Electrical and Computer Engineering, University of California, Davis
3 PhD, Electrical Engineering, Oklahoma State University

 

Efficient Dataset Curation and Visual Question Answering with Deep Neural Networks

Arjun Majumdar1, Dr. Ryan Soklaski2, & David Mascharka3
MIT Lincoln Laboratory

Recently, deep neural networks have been successfully applied in settings where large amounts of data have been collected and annotated to train a model. Here, success is often narrowly defined as predictive performance on a test set. However, machine learning practitioners often start with unlabeled data and success may require that a human operator can interpret the outputs of the system. For example, current practice for building a system that can find and understand the spatial relationships between cars in overhead imagery is to annotate a large dataset containing vehicles with sufficient diversity to train a machine learning algorithm. Typically, curating the dataset and training the model each happen independently, which may result in more time spent annotating data than is necessary for the given task. Furthermore, when asked to count the number of cars near a specific building, the system might output the answer as a single number, which may leave the user asking which vehicles were counted and how the region near the building was defined. In the first part of this talk, we will discuss how to efficiently label data for machine learning applications using active learning with Bayesian deep neural networks. Following this, we will describe a visual question answering system that achieves state-of-the-art performance on a benchmark dataset, while providing human-interpretable outputs that illustrate the intermediate steps in the algorithm's reasoning process.
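
A minimal sketch of the uncertainty-sampling loop behind active learning (illustrative only; the work described uses Bayesian deep neural networks rather than the simple classifier shown here):

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def active_learning_loop(X_pool, y_oracle, n_seed=10, n_rounds=20):
        """Uncertainty sampling: repeatedly label the pool example the
        current model is least sure about (assumes the random seed set
        contains at least two classes)."""
        rng = np.random.default_rng(0)
        labeled = list(rng.choice(len(X_pool), size=n_seed, replace=False))
        for _ in range(n_rounds):
            model = LogisticRegression(max_iter=1000)
            model.fit(X_pool[labeled], y_oracle[labeled])
            probs = model.predict_proba(X_pool)
            # Predictive entropy as the uncertainty score for each example.
            entropy = -np.sum(probs * np.log(probs + 1e-12), axis=1)
            entropy[labeled] = -np.inf  # never re-query labeled points
            labeled.append(int(np.argmax(entropy)))  # query the annotator
        return model, labeled

In the Bayesian deep network setting described in the talk, the predictive entropy of a posterior ensemble (e.g., Monte Carlo dropout) replaces the single model's entropy, but the query loop is the same.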

1 MS, Computer Engineering, University of Illinois Urbana-Champaign
2 PhD, Physics, Washington University in St. Louis
3 BS, Computer Science, Drake University

 

Machine Learning Applications in Aviation Weather and Traffic Management

Dr. Mark S. Veillette1
MIT Lincoln Laboratory

Adverse weather accounts for the majority of air traffic delays in the United States. When weather is expected to impact operations, air traffic managers (ATMs) are often faced with an overload of weather information required to make decisions. These data include multiple numerical weather model forecasts, satellite observations, Doppler radar, lightning detections, wind information, and other forms of meteorological data. Many of these data contain a great deal of uncertainty. Absorbing and utilizing these data in an optimal way is challenging even for experienced ATMs.

This talk will provide two examples of using machine learning to assist ATMs. In our first example, data from weather satellites are combined with global lightning detections and numerical weather models to create Doppler radar-like displays of precipitation in regions outside the range of weather radar. In the second example, multiple weather forecasts are used to refine the prediction of airspace capacity within selected regions of airspace. Examples and challenges of data collection, translation, modeling, and operational prototype evaluations will be discussed.
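
As a hedged illustration of the first example's regression setup (the feature choices and synthetic data are assumptions, not the operational system), one might map satellite, lightning, and model features to a radar-like precipitation value:

# Minimal sketch: regress a radar-like precipitation value from satellite,
# lightning, and numerical-model features. All data here are synthetic.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(1)
n = 5000
X = np.column_stack([
    rng.uniform(200, 300, n),   # IR brightness temperature (K)
    rng.poisson(0.5, n),        # lightning flash density
    rng.uniform(0, 50, n),      # model-forecast precipitation rate
])
y = 0.3 * (300 - X[:, 0]) + 4.0 * X[:, 1] + 0.2 * X[:, 2] + rng.normal(0, 2, n)

model = GradientBoostingRegressor().fit(X, y)
print(model.predict(X[:3]))  # synthetic radar-like reflectivity estimates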

1 PhD, Mathematics, Boston University

 

Machine Learning for Radio-Frequency Signal Classification and Blind Demodulation

Dr. Ameya Agaskar1
MIT Lincoln Laboratory

Platforms such as unmanned air systems and unattended ground sensors are constrained by their size, weight, and power requirements, and as a result, their embedded signal processing capability is limited. Exfiltrating raw sampled data for offline processing is often not a viable alternative because such transfers demand significant bandwidth.

To accommodate at-sensor processing—including detection, demodulation, and decoding—of wideband data, MIT Lincoln Laboratory is proposing an intermediate level of radio-frequency (RF) data representation to significantly reduce the data rates to levels largely equivalent to the information-handling rate in the processing environment. This approach leverages sparsity in the RF environment and machine learning algorithms to discover signal structure without prior knowledge and leads to blind demodulation techniques.

This seminar presents a novel, highly scalable approach, based on nonparametric Bayesian models, for learning unknown signal structure, as well as a new blind synchronization technique. The Laboratory has characterized performance on the basis of signal-to-noise ratio (SNR) and the number of observations. The research team has observed that above a critical SNR learning threshold, the algorithms offer near-optimal performance in terms of raw bit error rates.
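
As a hedged illustration of nonparametric Bayesian structure learning (a toy stand-in for the seminar's models), the sketch below lets a Dirichlet-process Gaussian mixture discover the symbol count of an unknown constellation from raw IQ samples:

# Minimal sketch: discover constellation structure without prior knowledge
# using a Dirichlet-process Gaussian mixture (a nonparametric Bayesian model).
# The QPSK data are synthetic; the seminar's models and blind synchronization
# techniques are considerably more sophisticated.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(2)
symbols = np.exp(1j * (np.pi / 4 + np.pi / 2 * rng.integers(0, 4, 2000)))
iq = symbols + 0.1 * (rng.normal(size=2000) + 1j * rng.normal(size=2000))
X = np.column_stack([iq.real, iq.imag])

dpgmm = BayesianGaussianMixture(
    n_components=16, weight_concentration_prior_type="dirichlet_process"
).fit(X)
# Components with non-negligible weight reveal the (unknown) symbol count.
print((dpgmm.weights_ > 0.01).sum())  # expect ~4 for QPSK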

1 PhD, Engineering Sciences, Harvard University

 

Quantum Information Sciences

Technical Seminars offered in quantum information sciences.

Quantum Information Science with Superconducting Artificial Atoms

Dr. William D. Oliver1
MIT Lincoln Laboratory & the Research Laboratory of Electronics

Superconducting qubits are artificial atoms assembled from electrical circuit elements. When cooled to cryogenic temperatures, these circuits exhibit quantized energy levels. Transitions between levels are induced by applying pulsed microwave electromagnetic radiation to the circuit, revealing quantum coherent phenomena analogous to (and in certain cases beyond) those observed with coherent atomic systems.
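
As a hedged illustration of the driven-transition physics described above, the minimal sketch below computes resonant Rabi oscillations of a two-level system under the rotating-wave approximation; the 5 MHz Rabi frequency is an assumed, illustrative value:

# Minimal sketch: resonant Rabi oscillation of a driven two-level "artificial
# atom" under the rotating-wave approximation. Drive strength is illustrative.
import numpy as np

omega_rabi = 2 * np.pi * 5e6        # 5 MHz Rabi frequency (assumed)
t = np.linspace(0, 1e-6, 200)       # 1 microsecond of microwave drive

p_excited = np.sin(omega_rabi * t / 2) ** 2  # excited-state population
print(p_excited.max())  # full population transfer at the pi-pulse time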

This talk provides an overview of quantum information science and superconducting artificial atoms, including several demonstrations of quantum coherence using these circuits: Landau-Zener-Stückelberg oscillations [1], microwave-induced qubit cooling to temperatures less than 3 mK (colder than the refrigerator) [2], and a new broadband spectroscopy technique called amplitude spectroscopy [3]. We then discuss in detail a highly coherent aluminum qubit (T1 = 12 µs, T2Echo = 23 µs, fidelity = 99.75%) with which we demonstrated noise spectroscopy using nuclear magnetic resonance (NMR)-inspired control sequences comprising hundreds of pulses [4, 5].

These experiments exhibit a remarkable agreement with theory and are extensible to other solid-state qubit modalities. In addition to fundamental studies of quantum coherence in solid-state systems, we anticipate these devices and techniques will advance qubit control and state-preparation methods for quantum information science and technology applications.

[1] W.D. Oliver, et al., "Mach-Zehnder Interferometry in a Strongly Driven Superconducting Qubit," Science, vol. 310, no. 5754, pp. 1653–1657, 2005.
[2] S.O. Valenzuela, et al., "Microwave-Induced Cooling of a Superconducting Qubit," Science, vol. 314, no. 5805, pp. 1589–1592, 2006.
[3] D.M. Berns, et al., "Amplitude Spectroscopy of a Solid-State Artificial Atom," Nature, vol. 455, pp. 51–57, 2008.
[4] J. Bylander, et al., "Noise Spectroscopy Through Dynamical Decoupling with a Superconducting Flux Qubit," Nature Physics, vol. 7, pp. 565–570, 2011.
[5] F. Yan, et al., "Rotating-Frame Relaxation as a Noise Spectrum Analyzer of a Superconducting Qubit Undergoing Driven Evolution," accepted for publication in Nature Communications, 2013.

1 PhD, Electrical Engineering, Stanford University

 

Toward Large-Scale Trapped-Ion Quantum Processing

Dr. John Chiaverini1
MIT Lincoln Laboratory

Quantum computers have the potential to deliver a profound computational advantage when applied to many important problems because they manipulate information in a fundamentally different way from that of current (classical) hardware. Individual atomic ions, held and manipulated using electromagnetic fields, constitute a leading candidate system to realize such a processor because of their long coherence times and uniform physical properties. Accomplishments at the few-qubit level include high-fidelity demonstrations of basic quantum algorithms, but a clear path to a large-scale processor is not fully defined. This presentation will describe Lincoln Laboratory’s efforts to develop scalable ion techniques and technologies. These efforts include the loading and manipulation of ions in two-dimensional surface-electrode-trap arrays, the integration of technology for increased on-chip control of ion qubits, and the reduction of noise that can limit multi-qubit gate fidelity; all these techniques will likely be necessary for reaching the large scales required to realize the promise of quantum computing.

1 PhD, Physics, Stanford University

Radar & Signal Processing

Technical Seminars offered in radar and signal processing.

Adaptive Array Estimation

Keith Forsythe1
MIT Lincoln Laboratory

Parameter estimation is a necessary step in most surveillance systems and typically follows detection processing. Estimation theory provides parameter bounds specifying the best achievable performance and suggests maximum-likelihood (ML) estimation as a viable strategy for algorithm development. Adaptive sensor arrays introduce the added complexity of bounding and assessing parameter estimation performance (i) in the presence of limiting interference whose statistics must be inferred from measured data and (ii) under uncertainty in the array manifold for the signal search space. This talk focuses on assessing the mean-squared-error (MSE) performance at low and high signal-to-noise ratio (SNR) of nonlinear ML estimation that (i) uses the sample covariance matrix as an estimate of the true noise covariance and (ii) has imperfect knowledge of the array manifold for the signal search space. The method of interval errors (MIE) is used to predict MSE performance and is shown to be remarkably accurate well below estimation threshold. SNR loss in estimation performance due to noise covariance estimation is quantified and is shown to be quite different from analogous losses obtained for detection. Lastly, a discussion of the asymptotic efficiency of ML estimation is also provided in the general context of misspecified models, the most general form of model mismatch.
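
For orientation, a hedged sketch of the standard MIE approximation (the notation is ours, not necessarily the speaker's): with ambiguous "interval" (sidelobe) parameter values \theta_i, capture probabilities P_i, and true parameter \theta_0,

\mathrm{MSE}(\hat{\theta}) \;\approx\; \sum_{i} P_i\,(\theta_i - \theta_0)^2 \;+\; \Bigl(1 - \sum_{i} P_i\Bigr)\,\mathrm{CRB}(\theta_0).

Below the SNR threshold, the first (gross-error) term dominates; above it, the MSE collapses to the local Cramér-Rao bound.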

1 SM, Mathematics, Massachusetts Institute of Technology

 

Advanced Embedded Computing

Dr. Paul Monticciolo1, Dr. William S. Song2, & Matthew A. Alexander3
MIT Lincoln Laboratory

Embedded computing continues to influence and change the way we live, as evidenced by its ubiquitous usage in consumer and industrial electronics and its ability to enable new growth areas, such as autonomous vehicles and virtual reality systems. Over the past several decades, the Department of Defense (DoD) has been leveraging advances in embedded computing to meet national needs in intelligence, surveillance, and reconnaissance; electronic warfare; and command and control applications that require tremendous compute capabilities in highly constrained size, weight, and power environments. Furthermore, defense platforms such as aircraft and ships can have lifetimes upwards of 50 years; hence, hardware and software architectures must be designed to provide both the performance and flexibility to support new applications and regular technology upgrades over such a long time frame.

MIT Lincoln Laboratory’s Embedded and Open Systems Group has been a long-time leader in the design, development, and implementation of defense electronic components and systems. In this seminar, we will discuss advanced real-time hardware and software technologies along with codesigned hardware and software system solutions. First, we will provide an overview of DoD-oriented embedded computing challenges, technologies, and trends. We will then explore extremely power efficient custom application-specific integrated circuit and emerging system-on-chip solutions that parallel approaches used in smart phones. Heterogeneous system solutions that leverage field-programmable gate arrays, graphics processing units, and multi-core microprocessor components will then be addressed. Open standards-based software architectures that can meet real-time performance requirements, enable code portability, and enhance programmer productivity will be discussed. Representative implementation examples will be provided in each section. Our presentation will conclude with some prognostication on the future of embedded computing.

1 PhD, Electrical Engineering, Northeastern University
2 DSc, Electrical Engineering, Massachusetts Institute of Technology
3 BS, Computer Engineering, Northeastern University

 

Bioinspired Resource Management for Multiple-Sensor Target Tracking Systems

Dr. Dana Sinno1 & Dr. Hendrick C. Lambert2
MIT Lincoln Laboratory

We present an algorithm, inspired by self-organization and stigmergy observed in biological swarms, for managing multiple sensors tracking large numbers of targets. We have devised a decentralized architecture wherein autonomous sensors manage their own data collection resources and task themselves. Sensors cannot communicate with each other directly; however, a global track file, which is continuously broadcast, allows the sensors to infer their contributions to the global estimation of target states. Sensors can transmit their data (either as raw measurements or some compressed format) only to a central processor where their data are combined to update the global track file. We outline information-theoretic rules for the general multiple-sensor Bayesian target tracking problem and provide specific formulas for problems dominated by additive white Gaussian noise. Using Cramér-Rao lower bounds as surrogates for error covariances and numerical scenarios involving ballistic targets, we illustrate that the bioinspired algorithm is highly scalable and performs very well for large numbers of targets.
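
As a hedged sketch of the information-driven tasking rule described above: each sensor, reading only the broadcast global track file, tasks itself to the target whose covariance (here a stand-in for the Cramér-Rao lower bound surrogate) would shrink most after a measurement. The geometry, noise values, and scalar trace criterion are illustrative assumptions:

# Minimal sketch of decentralized, information-driven self-tasking.
import numpy as np

rng = np.random.default_rng(3)
n_targets = 50
# Global track file: per-target position error covariances (2x2), broadcast.
track_cov = np.array([np.eye(2) * rng.uniform(1, 100) for _ in range(n_targets)])
meas_noise = 4.0 * np.eye(2)  # sensor measurement covariance (assumed)

def info_gain(P, R):
    # Reduction in trace of covariance after a Kalman measurement update.
    P_post = P - P @ np.linalg.inv(P + R) @ P
    return np.trace(P) - np.trace(P_post)

gains = np.array([info_gain(P, meas_noise) for P in track_cov])
task = int(np.argmax(gains))  # sensor tasks itself to the neediest target
print(task, gains[task])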

1 PhD, Electrical Engineering, Arizona State University
2 PhD, Applied Physics, University of California, San Diego

 

Multilithic Phased Array Architectures for Next-Generation Radar

Dr. Sean M. Duffy1
MIT Lincoln Laboratory

Phased array antennas provide significant operational capabilities beyond those achievable with dish antennas. Civilian agencies, such as the Federal Aviation Administration and Department of Homeland Security, are investigating the feasibility of using phased array radars to satisfy their next-generation needs. In particular, the Multifunction Phased Array Radar (MPAR) effort aims to eliminate nine different dish-based radars and replace them with radars employing a single, low-cost MPAR architecture. Also, unmanned air system operation within the National Airspace System requires an airborne sense-and-avoid (ABSAA) capability ideally satisfied by a small low-cost phased array.

Two example phased array panels are discussed in this talk. The first is the MPAR panel—a scalable, low-cost, highly capable S-band panel. This panel provides the functionality to perform the missions of air surveillance and weather surveillance for the National Airspace System. The second example is the ABSAA phased array, a low-cost Ku-band panel for unmanned air system collision avoidance radar.

The approach used in these phased arrays eliminates the drivers that lead to expensive systems. For example, the high-power amplifier is fabricated in a high-volume foundry and mounted in a surface mount package, thereby allowing industry-standard low-cost assembly processes. Also, all our integrated circuits contain multiple functions to save on semiconductor space and board-level complexity. Finally, the systems' multilayered printed circuit board assemblies combine the antenna, microwave circuitry, and integrated circuits, eliminating the need for hundreds of connectors between the subsystems; this streamlined design enhances overall reliability and lowers manufacturing costs.

1 PhD, Electrical Engineering, University of Massachusetts–Amherst

 

Parameter Bounds Under Misspecified Models

Keith Forsythe1
MIT Lincoln Laboratory

Parameter bounds are traditionally derived assuming perfect knowledge of data distributions. When the assumed probability distribution for the measured data differs from the true distribution, the model is said to be misspecified; mismatch at some level is inevitable in practice. Thus, several authors have studied the impact of model misspecification on parameter estimation. Most notably, Peter Huber explored in detail the performance of maximum-likelihood (ML) estimation under a very general form of misspecification; he showed consistency and normality, and derived ML estimation's asymptotic covariance, often referred to as the celebrated "sandwich covariance."

The goal of this talk is to consider the class of non-Bayesian parameter bounds emerging from the covariance inequality under the assumption of model misspecification. Casting the bound problem as one of constrained minimization is likewise considered. Primary attention is given to the Cramér-Rao bound (CRB). It is shown that Huber's sandwich covariance is the misspecified CRB and provides the greatest lower bound (tightest) under ML constraints. Consideration of the standard circular complex Gaussian ubiquitous in signal processing yields a generalization of the Slepian-Bangs formula under misspecification. This formula, of course, reduces to the usual one when the assumed distribution is in fact the correct one. The framework is outlined for consideration of the Barankin/Hammersley-Chapman-Robbins, Bhattacharyya, and Bobrovsky–Mayer-Wolf–Zakai bounds under misspecification.
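
As a hedged sketch of the central object (standard notation from the misspecification literature, not necessarily the talk's): with true density p and assumed density q_\theta,

A(\theta) \;=\; \mathbb{E}_p\!\left[\nabla_\theta^2 \ln q_\theta(\mathbf{x})\right], \qquad B(\theta) \;=\; \mathbb{E}_p\!\left[\nabla_\theta \ln q_\theta(\mathbf{x})\,\nabla_\theta \ln q_\theta(\mathbf{x})^{\mathsf{T}}\right],

\mathrm{MCRB}(\theta^\ast) \;=\; A(\theta^\ast)^{-1}\,B(\theta^\ast)\,A(\theta^\ast)^{-1},

evaluated at the pseudo-true parameter \theta^\ast that minimizes the Kullback-Leibler divergence from p to q_\theta. When p = q_{\theta_0}, one has A = -B, and the sandwich reduces to the classical CRB.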

1 SM, Mathematics, Massachusetts Institute of Technology

 

Polarimetric Co-location Layering: A Practical Algorithm for Mitigation of Low Grazing Angle Sea Clutter

Molly Crane1
MIT Lincoln Laboratory

Traditional detection schemes in conventional maritime surveillance radars suffer serious performance degradation due to sea clutter, especially in low grazing angle geometries. In such geometries, typical statistical assumptions regarding sea clutter backscatter do not hold. Trackers can be overwhelmed by false alarms, while objects of interest may be challenging to detect amongst sea clutter. Despite numerous attempts over several decades to devise a means of mitigating the effects of low grazing angle sea clutter on traditional detection schemes, minimal progress has been made in developing an approach that is robust and practical.

To explore whether polarization information might offer an effective means of enhancing target detection in sea clutter, MIT Lincoln Laboratory collected a fully polarimetric X-band radar dataset on the Atlantic coast of Massachusetts' Cape Ann in October 2015. The dataset spans multiple bandwidths, multiple sea states, and various targets of opportunity. Leveraging this dataset, Lincoln Laboratory developed an algorithm dubbed polarimetric co-location layering that retains detections on objects of interest while reducing the number of false alarms relative to a conventional single-polarization radar by as many as two orders of magnitude. Polarimetric co-location layering is robust across waveform bandwidths and sea states. Moreover, this algorithm is practical: it can plug directly into the standard radar signal processing chain.
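
As a hedged illustration of the layering idea (not the Laboratory's actual algorithm), the sketch below fuses per-channel detection maps and keeps only alarms that are co-located across polarimetric channels; the grid, alarm rates, and two-of-three rule are assumptions:

# Minimal sketch of co-location layering: keep only detections that appear in
# multiple polarimetric channels at the same range-azimuth cell.
import numpy as np

rng = np.random.default_rng(4)
# Detection maps for HH, HV, VV channels over a range-azimuth grid (True = hit)
maps = rng.random((3, 200, 360)) > 0.995          # sparse single-channel alarms
target_cell = (100, 180)
maps[:, target_cell[0], target_cell[1]] = True    # a real target hits all three

layered = maps.sum(axis=0) >= 2   # co-location layer: agreement across channels
print(maps.sum(axis=(1, 2)), layered.sum())  # per-channel vs fused alarm counts

With the rates assumed here, single-channel alarms number in the hundreds while the fused map retains only a handful, illustrating the kind of false-alarm reduction the abstract cites.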

1 PhD, Computer Engineering, Boston University

 

Polynomial Rooting Techniques for Adaptive Array Direction Finding

Dr. Gary F. Hatke1
MIT Lincoln Laboratory

Array processing has many applications in modern communications, radar, and sonar systems. Array processing is used when a signal in space, be it electromagnetic or acoustic, has some spatial coherence properties that can be exploited (such as far-field plane wave properties). The array can be used to sense the orientation of the plane wave and thus deduce the angular direction to the source. Adaptive array processing is used when there exists an environment of many signals from unknown directions as well as noise with unknown spatial distribution. Under these circumstances, classical Fourier analysis of the spatial correlations from an array data snapshot (the data seen at one instance in time) is insufficient to localize the signal sources.

In estimating the signal directions, most adaptive algorithms require computing an optimization metric over all possible source directions and searching for a maximum. When the array is multidimensional (e.g., planar), this search can become computationally expensive, as the source direction parameters are now also multidimensional. In the special case of one-dimensional (line) arrays, this search procedure can be replaced by solving a polynomial equation whose roots correspond to estimates of the signal directions, as sketched below. This technique had not been extended to multidimensional arrays because such arrays naturally generate a polynomial in multiple variables, which does not have discrete roots.
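
As a hedged illustration of the one-dimensional case, the sketch below implements root-MUSIC, a standard polynomial-rooting direction finder named here for concreteness (the seminar's estimators are more general); the array geometry, spacing, and scenario are assumptions:

# Minimal sketch: root-MUSIC for a uniform line array (spacing d wavelengths).
import numpy as np

def root_music(snapshots, n_sources, d=0.5):
    m = snapshots.shape[0]
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]  # sample covariance
    _, vecs = np.linalg.eigh(R)            # eigenvalues in ascending order
    En = vecs[:, : m - n_sources]          # noise subspace
    C = En @ En.conj().T
    # Sum the diagonals of C to form the rooting polynomial's coefficients.
    coeffs = np.array([np.trace(C, offset=k) for k in range(m - 1, -m, -1)])
    roots = np.roots(coeffs)
    roots = roots[np.abs(roots) < 1.0]     # keep roots inside the unit circle
    closest = roots[np.argsort(1.0 - np.abs(roots))[:n_sources]]
    return np.degrees(np.arcsin(np.angle(closest) / (2.0 * np.pi * d)))

# Example: two sources at -10 and +25 degrees, 8 elements, 200 snapshots.
rng = np.random.default_rng(7)
m, n = 8, 200
angles_true = np.radians([-10.0, 25.0])
A = np.exp(2j * np.pi * 0.5 * np.outer(np.arange(m), np.sin(angles_true)))
S = (rng.standard_normal((2, n)) + 1j * rng.standard_normal((2, n))) / np.sqrt(2)
X = A @ S + 0.1 * (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n)))
print(sorted(root_music(X, 2)))  # approximately [-10, 25]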

This seminar introduces a method for generalizing the rooting technique to multidimensional arrays by generating multiple optimization polynomials corresponding to the source estimation problem and finding a set of simultaneous solutions to these equations, which contain source location information. It is shown that the variance of this new class of estimators is equal to that of the search techniques they supplant. In addition, for sources spaced more closely than a Rayleigh beamwidth, the resolution properties of the new polynomial algorithms are shown to be better than those of the search technique algorithms.

1 PhD, Electrical Engineering, Princeton University

 

Radar Signal Distortion and Compensation with Transionospheric Propagation Paths

Dr. Scott D. Coutts1
MIT Lincoln Laboratory

Electromagnetic signals propagating through the atmosphere and ionosphere are distorted and refracted by these propagation media. The effects are particularly pronounced for lower-frequency signals propagating through the ionosphere. In the extreme case, frequencies in the high-frequency band can be severely refracted by the ionosphere to the point that they are reflected back toward the ground. This effect is exploited by over-the-horizon radars and high-frequency communication systems to achieve very-long-range, over-the-horizon performance. For the general radar case, the measurements of range, Doppler shift, and elevation and azimuth angles are all corrupted from their free-space values with time-varying biases.

To provide accurate radar parameter estimates in the presence of these errors, a three-dimensional method to compensate for the ionosphere refraction has been developed at Lincoln Laboratory. In this seminar, two examples of the method’s use are provided: (1) the radar returns from a known object are used to specify an unknown ionosphere electron density, and (2) an unknown satellite state vector is estimated with and without the ionosphere compensation so that the accuracy improvement can be quantified. Magneto-ionic ray tracing is used to generate the three-dimensional propagation model and propagation correction tables. A maximum-likelihood satellite-ephemeris estimator is designed and demonstrated using corrupted radar data. The technique is demonstrated using “real data” examples with very encouraging results and is applicable at radar frequencies ranging from high to ultrahigh.

1 PhD, Electrical Engineering, Northeastern University

 

Synthetic Aperture Radar

Dr. Gerald R. Benitz1
MIT Lincoln Laboratory

MIT Lincoln Laboratory is investigating the application of phased-array technology to improve the state of the art in radar surveillance. Synthetic aperture radar (SAR) imaging is one mode that can benefit from a multiple-phase-center antenna. The potential benefits are protection against interference, improved area rate and resolution, and multiple simultaneous modes of operation.

This seminar begins with an overview of SAR, giving the basics of resolution, collection modes, and image formation. Several imaging examples are provided. Results from the Lincoln Multimission ISR Testbed (LiMIT) X-band airborne radar are presented. LiMIT employs an eight-channel phased-array antenna and records 180 MHz bandwidth from each channel simultaneously. One result employs adaptive processing to reject wideband interference, demonstrating recovery of a corrupted SAR image. Another result employs multiple simultaneous beams to increase the area of the image beyond the conventional limitation that is due to the pulse repetition frequency. Areas that are Doppler ambiguous can be disambiguated by using the phased-array antenna.
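
As a hedged aside on the resolution basics mentioned above, the standard relations for a SAR with transmitted bandwidth B, wavelength \lambda, and synthetic-aperture angular extent \Delta\theta are

\delta_r \;=\; \frac{c}{2B}, \qquad \delta_{cr} \;=\; \frac{\lambda}{2\,\Delta\theta},

so, for example, the 180 MHz LiMIT bandwidth corresponds to a slant-range resolution of roughly c/(2B), or about 0.8 m.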

1 PhD, Electrical Engineering, University of Wisconsin–Madison

Solid State Devices, Materials, & Processes

Technical Seminars offered in solid state devices, materials, & processes.

3D Integrated Technology Toolbox

Donna-Ruth Yost1 & Danna Rosenberg2
MIT Lincoln Laboratory

3D integration extends 2D integrated circuits into the third dimension, increasing density, improving performance, and offering opportunities for new functionality. Integration technologies range from 3D packaging solutions, such as die stacking or flip-chip bump bonding, to advanced 3D integration technologies utilizing through-silicon vias (TSVs) or wafer bonding with high-density, low-capacitance interconnects. Performance needs, form factor, security, and cost all play a role in the decision of which packaging approach to use. MIT Lincoln Laboratory pioneered a 3D circuit integration technology for imagers that put data processing behind each pixel, enabling advanced imagers [1]. We are also currently developing advanced 3D integration of superconducting technologies for quantum computing applications [2]. MIT Lincoln Laboratory has a wide range of capabilities and facilities for prototyping new fabrication techniques that are ideally suited to the development of advanced 3D integration technologies.

Advanced focal-plane arrays were among the first applications to exploit the benefits of 3D integration technology because of the massively parallel information flow in two-dimensional imaging arrays, where information flows from the sensor tier to the processing tier(s) in the z-direction. The Laboratory's 3D integration technology has been used to fabricate many focal planes, including megapixel, four-side-abuttable imaging modules for tiling large mosaic focal planes and a three-tier Geiger-mode avalanche photodiode 3D LIDAR array using an avalanche-photodiode tier, an analog complementary metal-oxide semiconductor (CMOS) tier, and a high-speed digital CMOS tier [3-6]. Lincoln Laboratory’s 3D integration technology was made available to the circuit-design research community through multiproject fabrication runs sponsored by the Defense Advanced Research Projects Agency (DARPA). More than 100 different 3D circuit concepts were explored in these runs, including stacked memories, field-programmable gate arrays, and mixed-signal and RF circuits. We have developed an understanding of heterogeneous 3D integration issues by successfully demonstrating 3D integration of Si CMOS readout integrated circuits with short-wave infrared InGaAs photodiode wafers [7], and an understanding of mixed-fabrication-facility issues by 3D integrating Si CMOS readout integrated circuits with externally fabricated technologies.

Quantum processors will require 3D integration and packaging as the field of quantum engineering expands from simple quantum device demonstrations to more complex, highly connected systems. Most integrated qubit demonstrations to date have been limited to two dimensions. 3D integration offers the promise of increased connectivity, which must be achieved while preserving the performance of the sensitive quantum devices. At Lincoln Laboratory, we have developed a three-stack platform integrating superconducting qubits, interposers with coupling qubits and superconducting TSVs, and superconducting multi-chip modules, all fabricated with commercial tools that are compatible with 200 mm wafer processing.

This seminar will discuss the 3D integration toolbox available at Lincoln Laboratory and current 3D technology development programs including high-density integration for advanced image processors, heterogeneous wafer-scale integration for radar systems, and superconducting interposer stacks for state-of-the-art quantum processors.

[1] J.A. Burns, et al., "A Wafer-Scale 3-D Circuit Integration Technology," IEEE Transactions on Electron Devices, vol. 53, no. 10, pp. 2507–2516, October 2006.
[2] D. Rosenberg, et al., "3D Integrated Superconducting Qubits," npj Quantum Information, vol. 3, 2017.
[3] J.A. Burns, et al., "Three-dimensional Integrated Circuits for Low Power, High Bandwidth Systems on a Chip," 2001 ISSCC International Solid-State Circuits Conference, Digest of Technical Papers, vol. 44, pp. 268–269, February 2001.
[4] V. Suntharalingam, et al., "Megapixel CMOS Image Sensor Fabricated in Three-Dimensional Integrated Circuit Technology," 2005 ISSCC International Solid-State Circuits Conference, Digest of Technical Papers, vol. 48, pp. 356–357, February 2005.
[5] V. Suntharalingam, et al., "A Four-Side Tileable, Back Illuminated, 3D- Integrated Megapixel CMOS Image Sensor," IEEE 2009 ISSCC International Solid-State Circuits Conference, Digest of Technical Papers, pp. 38–39, February 2009.
[6] B. Aull, et al., "Laser Radar Imager Based on 3D Integration of Geiger-Mode Avalanche Photodiodes with Two SOI Timing Circuit Layers," 2006 ISSCC International Solid-State Circuits Conference, Digest of Technical Papers, vol. 49, pp. 304–305, February 2006.
[7] C.L. Chen, et al., "Wafer-Scale 3D Integration of InGaAs Image Sensors with Si Readout Circuits," IEEE International Conference on 3D System Integration, San Francisco, 28–30 Sept. 2009 (Best Paper Award).

1 BS, Materials Science and Engineering, Cornell University
2 PhD, Physics, Stanford University

 

Advanced Materials at MIT Lincoln Laboratory

Dr. Jeffrey B. Chou1, Dr. Vladimir Liberman2, & Dr. Daniel Nezich3
MIT Lincoln Laboratory

At MIT Lincoln Laboratory, we are engineering materials with new optical and electrical properties and capabilities beyond those found in nature. Our vision is to use these fundamentally new materials to enable faster- and lower-power electronics, advanced imaging applications, and high-tech wearable fabrics. At Lincoln Laboratory, our cross-disciplinary team of engineers, programmers, physicists, and materials scientists is working to turn these new materials into usable systems that will impact everyday life. Our research spans first-principles calculations via density functional theory to fully fabricated and integrated systems. Many of our research programs also leverage new and exciting discoveries from MIT campus faculty and graduate students, who often act as our collaborators. The extensive material deposition and characterization capabilities shared between Lincoln Laboratory and the MIT campus enable the development of new and exciting materials. This seminar will provide a broad overview of the diverse advanced materials programs at Lincoln Laboratory and highlight recent work, including phase-change metamaterials, dielectric metasurfaces, and two-dimensional materials for valleytronics.

1 PhD, Electrical Engineering and Computer Sciences, University of California–Berkeley
2 PhD, Applied Physics, Columbia University
3 PhD, Physics, Massachusetts Institute of Technology

 

Electromagnetic Vector Antenna and Constrained Maximum-Likelihood Imaging for Radio Astronomy

Dr. Frank C. Robey1, Dr. Alan J. Fenn2, & Dr. Mark J. Silver3
MIT Lincoln Laboratory
Mary E. Knapp4, Dr. Frank D. Lind5, & Dr. Ryan A. Volz6
MIT

Radio astronomy at frequencies below 50 MHz provides a window into nonthermal processes in objects ranging from planets to galaxies. These frequencies also provide insight into the formation of the universe. Ground-based arrays cannot adequately observe astronomical sources below about 20 MHz due to ionospheric perturbation and shielding; therefore, the sky has not been mapped with high angular resolution below that frequency. Radio astronomy at these frequencies must be accomplished in space. With space-based sensing, the cost of each satellite is high, and consequently we desire to maximize the information from each satellite. This presentation discusses designs for mapping the sky from space using electromagnetic vector sensors. These sensors measure the full electric- and magnetic-field vectors of incoming radiation and enable measurement with reasonable angular resolution from a compact sensor with a single phase center. A model for radio astronomy imaging is introduced, and the constraints imposed by Maxwell's equations and vector sensing of an electromagnetic field are explored. This presentation shows that the covariance matrix inherent in the stochastic process must lie in a highly constrained subset of allowable positive definite covariance matrices. Results are shown that use expectation-maximization to form images consistent with a covariance matrix that satisfies the constraints. A conceptual design for a spacecraft to map the sky at frequencies below the ionospheric cutoff is discussed, along with concept development progress.

1 PhD, DSc, Electrical Engineering, Washington University
2 PhD, Electrical Engineering, The Ohio State University
3 PhD, Aerospace Engineering, University of Colorado, Boulder
4 Graduate Student, Earth, Atmosphere, and Planetary Sciences Department, Massachusetts Institute of Technology
5 PhD, Geophysics, University of Washington
6 PhD, Aeronautics/Astronautics, Stanford University

 

Functional Fibers and Fabrics at MIT Lincoln Laboratory

Dr. Alexander Stolyarov1, Dr. Lauren Cantley2, Dr. Jeffrey B. Chou3, Dr. Lalitha Parameswaran4, & Dr. Bradford Perkins, Jr.5
MIT Lincoln Laboratory

At MIT Lincoln Laboratory, we are pushing the frontiers of advanced functional fibers and fabrics, from age-old textiles to next-generation fabric systems with capabilities spanning sensing, communications, health monitoring, and more. To enable this development, our team focuses on multimaterial fiber device fabrication—fibers with internal domains containing semiconductors, metals, and insulators. By judicious selection of materials and processing conditions, individual fibers with nanometer-scale semiconductor features can be drawn into kilometer lengths, thus enabling electronic and optoelectronic fiber devices. Weaving these fibers into fabrics enables textiles with sophisticated properties tailored for national security mission areas. This seminar will provide an overview of this new but rapidly emerging field and will include several advanced fabric use cases, including fabric-based communications, chemical sensing, and fabrics connected to the cloud.

1 PhD, Physics, Harvard University
2 PhD, Mechanical Engineering, Boston University
3 PhD, Electrical Engineering and Computer Science, University of California-Berkeley
4 PhD, Electrical Engineering, Massachusetts Institute of Technology
5 PhD, Physics, University of Colorado

 

Geiger-Mode Avalanche Photodiode Arrays for Imaging and Sensing

Dr. Brian F. Aull1
MIT Lincoln Laboratory

This seminar discusses the development of arrays of silicon avalanche photodiodes integrated with digital complementary metal-oxide semiconductor (CMOS) circuits to make focal planes with single-photon sensitivity. The avalanche photodiodes are operated in Geiger mode: they are biased above the avalanche breakdown voltage so that the detection of a single photon leads to a discharge that can directly trigger a digital circuit. The CMOS circuits to which the photodiodes are connected can either time stamp or count the resulting detection events. Applications include three-dimensional imaging using laser radar, wavefront sensing for adaptive optics, and optical communications.
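
As a hedged illustration of the time-stamping mode, the sketch below simulates a single Geiger-mode pixel: each laser pulse yields either a signal photon near the true round-trip delay or a background/dark count, and a histogram of time stamps recovers range. All rates, jitters, and the 1 ns bin width are assumptions:

# Minimal sketch of Geiger-mode laser-radar ranging via photon time stamps.
import numpy as np

rng = np.random.default_rng(5)
c = 3e8
true_range = 1500.0                       # meters (assumed target)
t_round = 2 * true_range / c              # ~10 microsecond round trip

stamps = []
for _ in range(1000):                     # 1000 laser pulses
    if rng.random() < 0.3:                # pixel fires on a signal photon
        stamps.append(rng.normal(t_round, 0.5e-9))   # jittered signal photon
    else:
        stamps.append(rng.uniform(0, 20e-6))         # dark count / background

hist, edges = np.histogram(stamps, bins=np.arange(0, 20e-6, 1e-9))
t_hat = edges[np.argmax(hist)]
print(c * t_hat / 2)                      # estimated range, ~1500 m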

1 PhD, Electrical Engineering, Massachusetts Institute of Technology

 

High-Power Laser Technology at MIT Lincoln Laboratory

Dr. T. Y. Fan1 & Dr. Darren A. Rand2
MIT Lincoln Laboratory

This seminar includes an overview of high-power laser technology development at MIT Lincoln Laboratory. Two topics will be emphasized: laser beam combining and cryogenically cooled solid-state lasers. Beam combining, taking the outputs from arrays of lasers and combining them into a single beam with near-ideal propagation characteristics, has been pursued for many decades, but its promise has started to become a reality only within the last decade. Beam combining of both fiber and semiconductor arrays using both wavelength and coherent beam combining approaches will be discussed. Cryogenically cooled solid-state lasers enable high-efficiency and high-average-power lasers while maintaining excellent beam quality. This approach is particularly applicable for developing laser sources with both high peak and average power.

1 PhD, Electrical Engineering, Stanford University
2 PhD, Electrical Engineering, Princeton University

 

Integrated Photonics for Sensing, Communications, and Signal Processing

Dr. Suraj Bramhavar1, Dr. Paul Juodawlkis2, Dr. Boris (Dave) Kharas3, Dr. Cheryl Sorace-Agaska4, Dr. Reuel Swint5, & Dr. Siva Yegnanarayanan6
MIT Lincoln Laboratory

Integrated photonics involves the aggregation of multiple optical or photonic components onto a common substrate using either monolithic or hybrid integration techniques. Common components include lasers, optical modulators, detectors, filters, splitters and combiners, couplers, and optical isolators. These components are typically connected using optical waveguides built into the substrate platform to create a photonic integrated circuit (PIC). Relative to optical circuits constructed from discrete components connected using optical fibers, PICs have a number of advantages, including reduced size and weight, reduced fiber interfaces and associated fiber-pigtailing cost, and improved environmental stability for coherent optical signal processing. These advantages are strengthened by combining PICs with electronic integrated circuits via several electronic-photonic integration techniques (wire bonding, flip chip, wafer bonding, or monolithic). Depending on the desired functions, PICs can be fabricated from a variety of materials including silicon, silicon-nitride, and compound semiconductors (e.g., indium phosphide, gallium arsenide, gallium nitride, and their associated ternary and quaternary compounds).

In this seminar, we will provide an introduction to integrated photonics technology including design, fabrication, packaging, and characterization. We will describe resources and capabilities to develop silicon, silicon-nitride, and compound-semiconductor PICs at MIT Lincoln Laboratory using in-house fabrication resources. We will also describe several government applications of integrated photonics (e.g., remote sensing, free-space communications, and signal processing) and how these are similar to and different from commercial applications.

1 PhD, Electrical Engineering, Boston University
2 PhD, Electrical Engineering, Georgia Institute of Technology
3 PhD, Materials Science, State University of New York at Stony Brook
4 PhD, Electrical Engineering, Massachusetts Institute of Technology
5 PhD, Electrical Engineering, University of Illinois
6 PhD, Electrical Engineering, University of California Los Angeles

 

Microfluidics at MIT Lincoln Laboratory

Prof. Todd A. Thorsen1, Dr. Shaun R. Berry2, & Dr. Jakub Kedzierski3
MIT Lincoln Laboratory

At MIT Lincoln Laboratory, we are engineering general-purpose tools for microfluidic platforms. Cross-disciplinary teams, consisting of engineers, programmers, and biologists, are designing and developing microfluidic tools for a broad range of applications. Our vision is to apply microfabrication techniques, traditionally used in microelectromechanical system (MEMS) and integrated circuit manufacturing, to develop microfluidic components and systems that are reconfigurable, scalable, and programmable through a software interface. End users of these microfluidic tools will be able to orchestrate complex and adaptive procedures that are beyond the capabilities of today’s hardware. This talk will provide a broad overview on the diverse microfluidic programs at Lincoln Laboratory, highlighting recent work, including microhydraulic actuators, integrated "lab-on-a-chip" systems for biological exploration, and actively configurable optofluidics such as liquid microlenses and displays.

1 PhD, Biochemistry and Molecular Biophysics, California Institute of Technology
2 PhD, Mechanical Engineering, Tufts University
3 PhD, Electrical Engineering, University of California–Berkeley

 

New Fabrication Platforms and Processes

Dr. Bradley Duncan1, Dr. Lalitha Parameswaran2, Dr. Livia Racz3, & Dr. Melissa Smith4
MIT Lincoln Laboratory

Additive manufacturing techniques, such as 3D printing and 3D assembly, have transformed production by breaking long-established rules of economies of scale and manufacturing complexity. MIT Lincoln Laboratory has a suite of ongoing efforts to develop novel materials and processes for the construction of nonplanar 3D devices, integrated assemblies, and systems, using our expertise in materials science, semiconductor fabrication, chemistry, and mechanical and electrical engineering. Our goal is to develop processes that enable "one-stop" fabrication of complete systems with both mechanical and electrical functionality. Ongoing programs include: design of materials and processes for concurrent 3D printing of dissimilar materials; development of high-resolution, radio-frequency 3D-printed structures for low size, weight, and power applications extending to the THz range; design of novel microplasma-based sputtering of conductive materials for direct write of interconnects on nonplanar substrates; and development of reconstructed wafer processes to overcome die-size limitations and enable scaling of digital integrated circuits to large formats.

1 PhD, Organic Chemistry, Brown University
2 PhD, Electrical Engineering, Massachusetts Institute of Technology
3 PhD, Materials Science and Engineering, Massachusetts Institute of Technology
4 PhD, Materials Science and Engineering, Massachusetts Institute of Technology

 

Slab-Coupled Optical Waveguide Devices and Their Applications

Dr. Paul W. Juodawlkis1, Dr. Gary M. Smith2, Dr. Reuel B. Swint3, & Dr. George W. Turner4
MIT Lincoln Laboratory

For two decades, MIT Lincoln Laboratory has been developing new classes of high-power semiconductor optoelectronic emitters and detectors based on the slab-coupled optical waveguide (SCOW) concept. The key characteristics of the SCOW design include (1) the use of a planar slab waveguide to filter the higher-order transverse modes from a large rib waveguide, (2) low overlap between the optical mode and the active layers, and (3) low excess optical loss. These characteristics enable waveguide devices having a large (>5 × 5 μm) symmetric fundamental mode and a long length (~1 cm). These large dimensions, relative to conventional waveguide devices, allow efficient coupling to optical fibers and external optical cavities, and provide reduced electrical and thermal resistances for improved heat dissipation.

This seminar will review the SCOW operating principles and describe applications of the SCOW technology, including Watt-class semiconductor SCOW lasers (SCOWLs) and amplifiers (SCOWAs), monolithic and ring-cavity mode-locked lasers, single-frequency external cavity lasers, and high-current waveguide photodiodes. The SCOW concept has been demonstrated in a variety of material systems at wavelengths including 915, 960–980, 1060, 1300, 1550, 1650, and 2100 nm. In addition to single emitters, higher brightness has been obtained by combining arrays of SCOWLs and SCOWAs using wavelength beam-combining and coherent combining techniques. These beam-combined SCOW architectures offer the potential of kilowatt-class, high-efficiency, electrically pumped optical sources.

1 PhD, Electrical Engineering, Georgia Institute of Technology
2 PhD, Electrical Engineering, University of Illinois at Urbana-Champaign
3 PhD, Electrical Engineering, University of Illinois at Urbana-Champaign
4 PhD, Electrical Engineering, Johns Hopkins University
 

Ultrasensitive Mass Spectrometry Development at MIT Lincoln Laboratory

Dr. Ta-Hsuan Ong1, Dr. Jude Kelley2, & Dr. Roderick Kunz3
MIT Lincoln Laboratory

Mass spectrometry (MS) has long been regarded as one of the most reliable methods for chemical analysis because of its ability to identify molecules based on their molecular weight and fragmentation patterns combined with its sensitivity in the pico- to femtogram range. Traditionally, analytical systems that utilize mass spectrometry have coupled this method with other analytical techniques such as gas or liquid chromatography; however, the development of ambient ionization techniques and improvements in mass spectrometer design have demonstrated that this technique can function on its own as a multipurpose chemical detector.

MIT Lincoln Laboratory has been advancing these systems in an effort to develop the next generation of MS-based sensing systems focused on detection missions relevant to national security. This work has centered on ways to improve the sensitivity of MS-based systems to explosives when they are encountered as vapors and as trace particulate residues. The vapor detection system that the Laboratory has developed has a real-time sensitivity to concentrations in the parts-per-quadrillion range. Real-time sensitivity at these levels rivals that of conventional vapor detectors (canines), providing opportunities to (1) better understand the origins, dynamics, concentrations, and attenuation levels of vapor signatures associated with concealed threats and (2) help improve canine training. The Laboratory's work on detecting explosive particulate residues has focused on applying thermal desorption and atmospheric pressure chemical ionization (TD-APCI) to swipe-based surface samples across a wide range of explosive classes. For explosive threats that are challenging to detect with TD-APCI, Lincoln Laboratory researchers have developed specialized chemical reagents that can be added directly to the ionization source to improve both the specificity and the sensitivity of the technique. The information revealed from these studies is used to assess current mass spectrometer-based explosive trace detection systems and guide future development efforts.

1 PhD, Analytical Chemistry, University of Illinois at Urbana-Champaign
2 PhD, Physical Chemistry, Yale University
3 PhD, Analytical Chemistry, University of North Carolina at Chapel Hill

Space Systems Technology

Technical Seminars offered in space systems technology.

Haystack Ultrawideband Satellite Imaging Radar Antenna

Dr. Joseph M. Usoff1
MIT Lincoln Laboratory

The Haystack facility in Westford, Massachusetts, has been in operation since 1964 and has conducted a wide range of communications, radar, and radio astronomy missions. MIT Lincoln Laboratory, under sponsorship of the United States Air Force, recently upgraded the facility, including the replacement of the 120-foot-diameter Cassegrain antenna with a new antenna and the addition of a wideband W-band radar to the existing wideband X-band radar. The upgraded antenna is of the same diameter and has the same optical parameters as the old antenna, but the surface tolerance has been significantly improved to better than 100 µm rms. The improved antenna surface tolerance permits efficient operation at higher frequencies, enabling W-band radar operations and radio astronomy experiments up to the 230 GHz band. This presentation will provide an overview of the new antenna, describe the fabrication and construction challenges, and highlight the surface alignment process.

1 PhD, Electrical Engineering, The Ohio State University

 

Miniature Ultrawideband Antennas for Low-SWaP Platforms

Dr. Raoul O. Ouedraogo1
MIT Lincoln Laboratory

The demand for miniaturized antennas that can be integrated on size, weight, and power (SWaP)-constrained platforms has significantly increased over the past decade. For various military and commercial operations, there is also increasing interest in miniaturized antennas that are multifunctional and operable over ultrawide frequency bands. While rapid advances in semiconductor, gallium nitride, and integrated circuit technologies continue to enable the design of smaller agile payloads, antennas are still largely designed as static and bulky devices. As a result, users must carry multiple antennas to service different applications or physically large antennas that provide the wide bandwidth needed to cover multiple applications. Unfortunately, neither option can be implemented on very low-SWaP platforms such as micro-unmanned aerial vehicles or wearable devices.

To overcome these difficulties, we present efforts carried out at MIT Lincoln Laboratory to develop novel antenna systems based on the integration of a pixel-based multi-level topology optimization approach with microfluidics to create highly efficient subwavelength antenna systems with up to 20:1 instantaneous bandwidths. The pixel-based optimization approach is an in-silico design methodology through which the metallization or dielectric of an antenna is parameterized into pixels (or voxels) and the topology of each pixel is altered in order to create new antenna geometries that exhibit the desired electrical characteristics. This seminar describes the design methodology and outlines the performance (analytical and measured) of various prototype directional and omnidirectional miniaturized ultrawideband and reconfigurable antennas developed under these efforts.
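
As a loose, hedged skeleton of pixel-based topology optimization (the Laboratory's multi-level approach is far more sophisticated), the sketch below flips metallization pixels under a placeholder objective; simulate_bandwidth is a hypothetical stand-in for the full-wave electromagnetic solver a real design loop requires:

# Minimal skeleton: antenna metallization as a binary pixel grid, with pixels
# flipped while a figure of merit improves (simple stochastic local search).
import numpy as np

rng = np.random.default_rng(6)

def simulate_bandwidth(pixels):
    # Hypothetical placeholder objective; a real loop calls an EM solver here.
    return -np.abs(pixels.mean() - 0.5)

pixels = rng.integers(0, 2, (16, 16))     # 16x16 metallization pixel grid
best = simulate_bandwidth(pixels)
for _ in range(2000):
    i, j = rng.integers(0, 16, 2)
    pixels[i, j] ^= 1                     # flip one pixel's metallization
    score = simulate_bandwidth(pixels)
    if score >= best:
        best = score                      # keep an improving flip
    else:
        pixels[i, j] ^= 1                 # revert a worsening flip
print(best)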

1 PhD, Electrical Engineering, Michigan State University

 

Overview of the NASA TROPICS CubeSat Constellation Mission

Dr. William J. Blackwell1
MIT Lincoln Laboratory

Recent technology advances in miniature microwave radiometers that can be hosted on very small satellites have made possible a new class of constellation missions that provide very high revisit rates of tropical cyclones and other severe weather. The Time-Resolved Observations of Precipitation structure and storm Intensity with a Constellation of Smallsats (TROPICS) mission was selected by NASA as part of the Earth Venture–Instrument (EVI-3) program and is now in development with planned launch readiness in late 2019. The overarching goal for TROPICS is to provide nearly all-weather observations of 3D temperature and humidity, as well as cloud ice and precipitation horizontal structure, at high temporal resolution to conduct high-value science investigations of tropical cyclones, including:

  1. Relationships of rapidly evolving precipitation and upper cloud structures to upper-level warm-core intensity and associated storm intensity changes;
  2. Evolution (including diurnal variability) of precipitation structure and storm intensification in relationship to environmental humidity fields; and
  3. The impact of rapid-update observations on numerical and statistical intensity forecasts of tropical cyclones. 

TROPICS will provide rapid-refresh microwave measurements (median refresh rate better than 60 minutes for the baseline mission) over the tropics that can be used to observe the thermodynamics of the troposphere and precipitation structure for storm systems at the mesoscale and synoptic scale over the entire storm lifecycle. TROPICS will comprise a constellation of six 3U CubeSats in three low-Earth orbital planes. Each CubeSat will host a high-performance scanning radiometer to provide temperature profiles using seven channels near the 118.75 GHz oxygen absorption line, water vapor profiles using three channels near the 183 GHz water vapor absorption line, imagery in a single channel near 90 GHz for precipitation measurements (when combined with higher resolution water vapor channels), and a single channel at 205 GHz that is more sensitive to precipitation-sized ice particles and low-level moisture.

This observing system offers an unprecedented combination of horizontal and temporal resolution in the microwave spectrum to measure environmental and inner-core conditions for tropical cyclones on a nearly global scale. It is a major leap forward in the temporal resolution of several key parameters needed for assimilation into advanced data assimilation systems capable of utilizing rapid-update radiance or retrieval data. This presentation will provide an overview of the mission and an update on its current status, with a focus on recent performance simulations for a range of observables to be provided by the constellation, including temperature, water vapor, rain rate, and tropical cyclone intensity indicators.

1 PhD, Electrical Engineering, Massachusetts Institute of Technology

 

Synoptic Astronomy with the Space Surveillance Telescope

Dr. Deborah F. Woods1
MIT Lincoln Laboratory

The Space Surveillance Telescope (SST) is a 3.5-meter telescope with a 3° × 2° field of view developed by MIT Lincoln Laboratory for the Defense Advanced Research Projects Agency for tracking satellites and Earth-orbiting space debris. The telescope has a three-mirror Mersenne-Schmidt design that obtains a fast focal ratio of F/1.0, enabling deep, wide-area searches with rapid revisit rates. These attributes also have utility for detections in time domain astronomy, in which objects are observed to vary in position or in brightness. The SST acquired asteroid observations in support of the Lincoln Near-Earth Asteroid Research program from 2014 to 2017. In the course of its observing programs, the SST has acquired a treasure trove of archival image data. While the SST is currently being relocated to Western Australia, analysis of archival data continues.

With the development of a science pipeline for the processing of archival data, the SST is contributing to studies in time domain astronomy. The science goals are to identify specific classes of variable stars and to use the stars' period of variability and color information from other astronomical catalogs to estimate distances to the objects. Object distances enable the 3D reconstruction of compact halo groups around the Milky Way galaxy, which help inform galaxy formation models. While the SST's primary mission is to support space situational awareness, the image data collected during the course of operations has been valuable for astronomical applications in the time domain.

1 PhD, Astronomy, Harvard University

Systems and Architectures

Technical Seminars offered in systems and architectures.

Choices, Choices, Choices (Decisions, Decisions, Decisions)

Dr. Robert T. Shin1
MIT Lincoln Laboratory

As you plan a career after graduating from a university or college, you are faced with many choices and decisions, not just now but throughout many of your most productive years. While there generally is not a right or wrong decision, it is helpful to think about and make these decisions in a more systematic way. This seminar looks at perspectives on how one might think about making a choice and making an impact, especially as an architect of future advanced systems. Also, my lessons learned in architectural thinking and in management are presented in the hope that you might find them useful as you advance in your careers. Finally, a short overview of MIT Lincoln Laboratory, a federally funded research and development center, is presented as a case study on how one can leverage such an organization to make an impact.

1 PhD, Electrical Engineering, Massachusetts Institute of Technology