Publications


A framework to improve evaluation of novel decision support tools

Published in:
11th Intl. Conf. on Applied Human Factors and Ergonomics, AHFE, 16-20 July 2020.

Summary

Organizations that introduce new technology into an operational environment seek to improve some aspect of task conduct through technology use. Many organizations rely on user acceptance measures to gauge technology viability, though misinterpretation of user feedback can lead organizations to accept non-beneficial technology or reject potentially beneficial technology. Additionally, teams that misinterpret user feedback can spend time and effort on tasks that improve neither user acceptance nor operational task conduct. This paper presents a framework developed through efforts to transition technology to the U.S. Transportation Command (USTRANSCOM). The framework formalizes aspects of user experience with technology to guide research and assessments by the organization and the development team. The transition effort is examined as a case study through the lens of the framework to illustrate how development teams can employ user-focused methodologies to systematically improve the development of new technology, user acceptance of that technology, and assessments of technology viability.

Detect-and-avoid closed-loop evaluation of noncooperative well clear definitions

Published in:
J. Air Transp., Vol. 28, No. 4, 12 July 2020, pp. 195-206.

Summary

Four candidate detect-and-avoid well clear definitions for unmanned aircraft systems encountering noncooperative aircraft are evaluated using safety and operational suitability metrics. These candidates were proposed in previous research based on unmitigated collision risk, maneuver initiation ranges, and other considerations. Noncooperative aircraft are aircraft without a functioning transponder. One million encounters representative of the assumed operational environment for the detect-and-avoid system are simulated using a benchmark alerting and guidance algorithm as well as a pilot response model. Results demonstrate the sensitivity of the safety metrics to the unmanned aircraft's speed and the detect-and-avoid system's surveillance volume. The only candidate without a horizontal time threshold, named modified tau, outperforms the other three candidates in avoiding losses of detect-and-avoid well clear. Furthermore, this candidate's alerting timeline lowers the required surveillance range, which can help lower the barrier to enabling unmanned aircraft system operations with low size, weight, and power sensors.
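For context, the modified tau quantity referenced above is conventionally defined in the detect-and-avoid literature in terms of horizontal range r, range rate, and a distance modifier DMOD; this is the standard form only, stated as background, and the paper's candidate threshold values are not reproduced here:

```latex
% Modified tau: an estimate of time to horizontal closest approach
% that remains protective at close range (closing geometry, \dot{r} < 0):
\[
  \tau_{\mathrm{mod}} = -\,\frac{r^{2} - \mathrm{DMOD}^{2}}{r\,\dot{r}}
\]
```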

Graded index dielectric superstrate for phased array scan compensation

Published in:
2020 IEEE Intl. Symp. on Antennas and Propagation and North American Radio Science Meeting, 5-10 July 2020.

Summary

This paper presents the use of additively manufactured graded index materials to improve the scan impedance variation of phased arrays. Solvent-based extrusion permits the programmatically controlled printing of chosen material properties with relative permittivity ranging from 2 to 24.5. This low-loss material shows promise for improvement of scan impedance variation for high scan angles in a smaller footprint than discrete layers.

Predicting cognitive load and operational performance in a simulated marksmanship task

Summary

Modern operational environments can place significant demands on a service member's cognitive resources, increasing the risk of errors or mishaps due to overburden. The ability to monitor cognitive burden and associated performance within operational environments is critical to improving mission readiness. As a key step toward a field-ready system, we developed a simulated marksmanship scenario with an embedded working memory task in an immersive virtual reality environment. As participants performed the marksmanship task, they were instructed to remember numbered targets and recall the sequence of those targets at the end of the trial. Low and high cognitive load conditions were defined as the recall of three- and six-digit strings, respectively. Physiological and behavioral signals recorded included speech, heart rate, breathing rate, and body movement. These features were input into a random forest classifier that significantly discriminated between the low- and high-cognitive-load conditions (AUC = 0.94). Behavioral features of gait were the most informative, followed by features of speech. We also showed the capability to predict performance on the digit recall (AUC = 0.71) and marksmanship (AUC = 0.58) tasks. The experimental framework can be leveraged in future studies to quantify the interaction of other types of stressors and their impact on operational cognitive and physical performance.
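As an illustrative sketch of the reported metric (the scores below are synthetic and have no relation to the study's data or classifier), the AUC used to summarize discrimination between two conditions can be computed directly from classifier scores via the Mann-Whitney pairwise comparison:

```python
def auc(pos_scores, neg_scores):
    """Area under the ROC curve via the Mann-Whitney statistic:
    the probability that a randomly chosen positive example is
    scored above a randomly chosen negative one (ties count 1/2)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical classifier scores: high-load trials (positive class)
# vs. low-load trials (negative class).
high_load = [0.9, 0.8, 0.7]
low_load = [0.2, 0.4, 0.85]
print(auc(high_load, low_load))  # → 0.7777777777777778 (= 7/9)
```

An AUC of 0.5 corresponds to chance-level discrimination and 1.0 to perfect separation, which is why the paper's marksmanship result (AUC = 0.58) is much weaker than its load-classification result (AUC = 0.94).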

Adjoint analysis of guidance systems for time-series inputs using Fourier analysis

Published in:
J. Guid., Control, Dyn., Vol. 43, No. 7, July 2020.

Summary

The adjoint technique is a proven method for the analysis of linear time-varying systems and is widely used in the missile design community. It is a very efficient technique that can solve for both deterministic and stochastic disturbances and can develop a miss distance budget in a single computer solution of the differential equations, without time-consuming Monte Carlo simulations. The adjoint technique is very valuable in both preliminary and more advanced missile design stages and is based upon the mathematical adjoint of the system dynamics matrix of the homing loop. Zarchan [1,2] describes extensive use of the technique for a variety of disturbances for homing missiles, and this author has developed its use for command guided missiles [3]. For adjoint analysis, the usual method of modeling maneuver disturbances to a missile guidance system starts by modeling the maneuver in the forward-time system as a delta function input into a transfer function with the same second-order statistics as the maneuver, whose output is input into the guidance system; then the system is converted into its adjoint system [1]. Bucco and Weiss [4] show that a set of nonstandard time-varying inputs cannot be treated in the normal fashion [2,5,6], and they present a new technique that enables these nonstandard inputs to be analyzed using adjoint analysis. This paper was inspired by and extends the results of Bucco and Weiss [4]. It shows that using the complex digital Fourier amplitude spectra of both the maneuver and the adjoint impulse response at the maneuver point allows adjoint analysis to address another type of nonstandard input, namely arbitrary time-series inputs such as specific target maneuvers that are not representable by an impulse input into a transfer function; heretofore, such time-series inputs have not been treatable with adjoint analysis.
Additionally, if there are several sets of arbitrary time series of target maneuvers, each with an associated probability of occurrence, the root-mean-square (rms) value of the set of probabilistic maneuvers can be calculated, another significant new capability introduced in this paper.
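A minimal sketch of the underlying relationship (notation assumed here for illustration, not taken from the paper): if h(τ) denotes the adjoint impulse response, i.e., the miss-distance sensitivity to an impulsive maneuver applied at time-to-go τ, then an arbitrary maneuver time series m(t) yields a miss given by a convolution, which is where the Fourier spectra enter:

```latex
% Miss distance as a convolution of the adjoint impulse response
% with the maneuver time series (t_F = final time):
\[
  \mathrm{miss} = \int_{0}^{t_F} h(\tau)\, m(t_F - \tau)\, d\tau ,
\]
% which, by the convolution theorem, can be evaluated from the
% discrete Fourier spectra H[k] and M[k] of the sampled sequences.
% For several maneuver sets, each with probability of occurrence p_i,
% the rms miss over the set follows as
\[
  \mathrm{miss}_{\mathrm{rms}} = \sqrt{\textstyle\sum_i p_i\, \mathrm{miss}_i^{2}} .
\]
```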

Deep implicit coordination graphs for multi-agent reinforcement learning [e-print]

Summary

Multi-agent reinforcement learning (MARL) requires coordination to efficiently solve certain tasks. Fully centralized control is often infeasible in such domains due to the size of joint action spaces. Coordination-graph-based formalizations allow reasoning about joint actions based on the structure of interactions, but they often require domain expertise in their design. This paper introduces the deep implicit coordination graph (DICG) architecture for such scenarios. DICG consists of a module for inferring the dynamic coordination graph structure, which is then used by a graph-neural-network-based module to learn to implicitly reason about joint actions or values. DICG allows learning the tradeoff between full centralization and decentralization via standard actor-critic methods, significantly improving coordination for domains with a large number of agents. We apply DICG to both centralized-training-centralized-execution and centralized-training-decentralized-execution regimes. We demonstrate that DICG solves the relative overgeneralization pathology in predator-prey tasks and outperforms various MARL baselines on the challenging StarCraft II Multi-agent Challenge (SMAC) and traffic junction environments.

Control systems need software security too: cyber-physical systems and safety-critical application domains must adopt widespread effective software defenses

Published in:
SIGNAL Mag., 1 June 2020.

Summary

Low-level embedded control systems are increasingly being targeted by adversaries, and such systems have a pressing need for stronger software defenses. The cyber-physical nature of these systems imposes real-time performance constraints not seen in enterprise computing systems, and those constraints fundamentally alter how software defenses should be designed and applied. MIT Lincoln Laboratory scientists demonstrated that current randomization-based defenses, which have low average-case overhead, can incur significant worst-case overhead that may be untenable in real-time applications, while some low-overhead enforcement-based defenses have low worst-case performance overheads, making them more amenable to real-time applications. Such defenses should be incorporated into a comprehensive resilient architecture with a strategy for failover and timely recovery in the case of a cyber threat.

COVID-19: famotidine, histamine, mast cells, and mechanisms [eprint]

Summary

SARS-CoV-2 infection is required for COVID-19, but many signs and symptoms of COVID-19 differ from common acute viral diseases. Currently, there are no pre- or post-exposure prophylactic COVID-19 medical countermeasures. Clinical data suggest that famotidine may mitigate COVID-19 disease, but both mechanism of action and rationale for dose selection remain obscure. We explore several plausible avenues of activity including antiviral and host-mediated actions. We propose that the principal famotidine mechanism of action for COVID-19 involves on-target histamine receptor H2 activity, and that development of clinical COVID-19 involves dysfunctional mast cell activation and histamine release.

75,000,000,000 streaming inserts/second using hierarchical hypersparse GraphBLAS matrices

Summary

The SuiteSparse GraphBLAS C-library implements high-performance hypersparse matrices with bindings to a variety of languages (Python, Julia, and Matlab/Octave). GraphBLAS provides a lightweight in-memory database implementation of hypersparse matrices that is ideal for analyzing many types of network data, while providing rigorous mathematical guarantees, such as linearity. Streaming updates of hypersparse matrices put enormous pressure on the memory hierarchy. This work benchmarks an implementation of hierarchical hypersparse matrices that reduces memory pressure and dramatically increases the update rate into a hypersparse matrix. Hierarchical hypersparse matrices are parameterized by the number of entries each level in the hierarchy may hold before an update is cascaded; the parameters are easily tunable to achieve optimal performance for a variety of applications. Hierarchical hypersparse matrices achieve over 1,000,000 updates per second in a single instance. Scaling to 31,000 instances of hierarchical hypersparse matrices on 1,100 server nodes of the MIT SuperCloud achieved a sustained update rate of 75,000,000,000 updates per second. This capability allows the MIT SuperCloud to analyze extremely large streaming network data sets.
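A toy Python sketch of the cascading idea described above (plain dictionaries stand in for hypersparse matrices; the class and cutoff scheme here are illustrative assumptions, not the SuiteSparse GraphBLAS implementation): each level absorbs streaming updates until its entry count exceeds a tunable cutoff, then its contents are summed into the next level and cleared.

```python
class HierarchicalSparse:
    """Toy model of hierarchical hypersparse accumulation: small,
    cache-friendly levels absorb streaming updates and cascade
    into progressively larger levels."""

    def __init__(self, cutoffs):
        # cutoffs[i] = max entries level i may hold before cascading;
        # the final level is unbounded.
        self.cutoffs = cutoffs
        self.levels = [{} for _ in range(len(cutoffs) + 1)]

    def update(self, key, value):
        # All updates land in the smallest (fastest) level first.
        top = self.levels[0]
        top[key] = top.get(key, 0) + value
        for i, cutoff in enumerate(self.cutoffs):
            if len(self.levels[i]) <= cutoff:
                break  # no cascade needed at this level
            nxt = self.levels[i + 1]
            for k, v in self.levels[i].items():  # sum into next level
                nxt[k] = nxt.get(k, 0) + v
            self.levels[i].clear()

    def get(self, key):
        # The true value of an entry is its sum across all levels.
        return sum(level.get(key, 0) for level in self.levels)
```

Streaming inserts go through `update`, while `get` reproduces the semantics of a single accumulated matrix by summing a key across levels; tuning the cutoffs trades cascade frequency against per-level size, mirroring the tunable parameters described in the abstract.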

The 2017 Buffalo Area Icing and Radar Study (BAIRS II)

Published in:
MIT Lincoln Laboratory Report ATC-447

Summary

The second Buffalo Area Icing and Radar Study (BAIRS II) was conducted during the winter of 2017. The BAIRS II partnership between Massachusetts Institute of Technology (MIT) Lincoln Laboratory (LL), the National Research Council of Canada (NRC), and Environment and Climate Change Canada (ECCC) was sponsored by the Federal Aviation Administration (FAA). It is a follow-up to the similarly sponsored partnership of the original BAIRS, conducted in the winter of 2013. The original BAIRS provided in situ verification and validation of icing and hydrometeors, respectively, within the radar domain in support of a hydrometeor-classification-based automated icing hazard algorithm. The BAIRS II motivation was to:
--Collect additional in situ verification and validation data,
--Probe further dual polarimetric radar features associated with icing hazard,
--Provide foundations for additions to the icing hazard algorithm beyond hydrometeor classifications, and
--Further characterize observable microphysical conditions in terms of S-band dual polarimetric radar data.
With BAIRS II, the dual polarimetric capability is provided by multiple Next Generation Weather Radar (NEXRAD) S-band radars in New York State, and the verification of the icing hazard, with microphysical and hydrometeor characterizations, is provided by NRC's Convair-580 instrumented research plane during five icing missions covering about 21 mission hours. The ability to reliably interpret the NEXRAD dual polarization radar-sensed thermodynamic phase of the hydrometeors (solid, liquid, mixed) in the context of cloud microphysics and precipitation physics makes it possible to assess the icing hazard potential to aviation. The challenges faced are the undetectable nature of supercooled cloud droplets (for S-band) and the isotropic nature of supercooled large drops (SLD).
The BAIRS II mission strategy was to study and probe radar-identifiable, strongly anisotropic crystal targets (dendrites and needles) with which supercooled water (and water-saturated conditions) are physically linked, as a means for dual polarimetric detection of icing hazard. BAIRS II employed superior optical array probes along with state and microphysical instrumentation and, again using NEXRAD-feature-guided flight paths, was able to advance beyond the original BAIRS in ways helpful to icing algorithm development. The key findings given thorough treatment in this report are:
--Identification of the radar-detectable "crystal sandwich" structure formed from two anisotropic crystal types stratified by in situ air temperature in association with varying levels of supercooled water,
  --with layer thicknesses observed to 2 km,
  --over hundred-kilometer scales matched to the mesoscale surveillance of the NEXRAD radars,
--Development and application of a multi-sensor cloud phase algorithm to distinguish between liquid-phase, mixed-phase, and glaciated (no icing) conditions for purposes of a "truth" database and improved analysis in BAIRS II,
--Development of concatenated hydrometeor size distributions to examine the in situ growth of both liquid and solid hydrometeors over a broad size spectrum, used in part to demonstrate differences between maritime and continental conditions, and
--Verification that the Icing Hazard Levels (IHL) algorithm's performance in icing conditions is consistent with previous work and, newly documented here, that it performs well when indicating "glaciated" (no icing) conditions.