Publications

Dissociating COVID-19 from other respiratory infections based on acoustic, motor coordination, and phonemic patterns

Published in:
Sci. Rep., Vol. 13, No. 1, January 2023, 1567.

Summary

In the face of the global pandemic caused by COVID-19, researchers have increasingly turned to simple measures to detect and monitor the presence of the disease in individuals at home. We sought to determine whether measures of neuromotor coordination derived from acoustic time series, as well as phoneme-based and standard acoustic features extracted from recordings of simple speech tasks, could aid in detecting the presence of COVID-19. We further hypothesized that these features would aid in characterizing the effect of COVID-19 on speech production systems. A protocol consisting of a variety of speech tasks was administered to 12 individuals with COVID-19 and 15 individuals with other viral infections at University Hospital Galway. From these recordings, we extracted a set of acoustic time series representative of speech production subsystems, as well as their univariate statistics. The time series were further used to derive correlation-based features, a proxy for speech production motor coordination. We additionally extracted phoneme-based features. These features were used to create machine learning models to distinguish the COVID-19-positive group from the other viral infection group, with respiratory- and laryngeal-based features resulting in the highest performance. Coordination-based features derived from harmonic-to-noise ratio time series from read speech discriminated between the two groups with an area under the ROC curve (AUC) of 0.94. A longitudinal case study of two subjects, one from each group, revealed differences in laryngeal-based acoustic features, consistent with observed physiological differences between the two groups. These results highlight the promise of nonintrusive sensing through simple speech recordings for early warning and tracking of COVID-19.
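
As a rough illustration of what "correlation-based features" of this kind can look like, the sketch below computes the eigenvalue spectrum of a time-delay correlation matrix built from a harmonic-to-noise ratio (HNR) track, one common way to summarize coordination across a signal's own delayed copies. The delay count, spacing, and downstream use are illustrative assumptions, not the paper's exact configuration.

    # Sketch (assumed parameters, not the paper's exact pipeline): eigenvalue
    # features from a time-delay correlation matrix of an HNR time series.
    import numpy as np

    def delay_correlation_eigenvalues(x, n_delays=15, delay_step=1):
        """Embed a 1-D time series at multiple delays, form the correlation
        matrix across the delayed copies, and return its eigenvalue spectrum."""
        x = np.asarray(x, dtype=float)
        n = len(x) - (n_delays - 1) * delay_step
        embedded = np.stack([x[i * delay_step : i * delay_step + n]
                             for i in range(n_delays)])    # rows = delayed copies
        corr = np.corrcoef(embedded)                        # n_delays x n_delays matrix
        return np.linalg.eigvalsh(corr)[::-1]               # descending eigenvalues

    # Example: features for one recording's HNR track (toy data).
    hnr_track = np.random.randn(2000)                       # stand-in for a real HNR series
    features = delay_correlation_eigenvalues(hnr_track)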

An emotion-driven vocal biomarker-based PTSD screening tool

Summary

This paper introduces an automated post-traumatic stress disorder (PTSD) screening tool that could potentially be used as a self-assessment or inserted into routine medical visits for PTSD diagnosis and treatment. Methods: With an emotion estimation algorithm providing arousal (excited to calm) and valence (pleasure to displeasure) levels through discourse, we select the regions of the acoustic signal that are most salient for PTSD detection. Our algorithm was tested on a subset of data from the DVBIC-TBICoE TBI Study, which contains PTSD Checklist-Civilian (PCL-C) assessment scores. Results: Speech from low-arousal and positive-valence regions provides the best discrimination for PTSD. Our model achieved an AUC (area under the curve) of 0.80 in detecting PCL-C ratings, outperforming models with no emotion filtering (AUC = 0.68). Conclusions: This result suggests that emotion drives the selection of the most salient temporal regions of an audio recording for PTSD detection.
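
A minimal sketch of the emotion-filtering idea follows: keep only speech segments whose estimated arousal is low and valence is positive, then pool their acoustic features for classification. The thresholds, the emotion estimator, and the pooling choice are assumptions made for illustration; the paper's actual selection procedure is not reproduced here.

    # Sketch with assumed thresholds and pooling; the per-segment emotion estimates
    # are taken as given (e.g., from a separate arousal/valence model).
    import numpy as np

    def select_salient_segments(arousal, valence, features,
                                arousal_max=0.4, valence_min=0.6):
        """Keep acoustic features only from low-arousal, positive-valence segments."""
        mask = (np.asarray(arousal) <= arousal_max) & (np.asarray(valence) >= valence_min)
        if not mask.any():                      # fall back to all segments if none qualify
            mask[:] = True
        return np.asarray(features)[mask]

    def recording_level_vector(segment_features):
        """Pool the selected segments (mean here) into one vector per recording,
        ready for a downstream PTSD classifier."""
        return segment_features.mean(axis=0)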

Noninvasive monitoring of simulated hemorrhage and whole blood resuscitation

Published in:
Biosensors, Vol. 12, No. 12, 2022, Art. No. 1168.

Summary

Hemorrhage is the leading cause of preventable death from trauma. Accurate monitoring of hemorrhage and resuscitation can significantly reduce mortality and morbidity but remains a challenge due to the low sensitivity of traditional vital signs in detecting blood loss and possible hemorrhagic shock. Vital signs are not reliable early indicators because of physiological mechanisms that compensate for blood loss and thus do not provide an accurate assessment of volume status. As an alternative, machine learning (ML) algorithms that operate on an arterial blood pressure (ABP) waveform have been shown to provide an effective early indicator. However, these ML approaches lack physiological interpretability. In this paper, we evaluate and compare the performance of ML models trained on nine ABP-derived features that provide physiological insight, using a database of 13 human subjects from a lower-body negative pressure (LBNP) model of progressive central hypovolemia and subsequent progressive restoration to normovolemia (i.e., simulated hemorrhage and whole blood resuscitation). Data were acquired at multiple repressurization rates for each subject to simulate varying resuscitation rates, resulting in 52 total LBNP collections. This work is the first to use a single ABP-based algorithm to monitor both simulated hemorrhage and resuscitation. A gradient-boosted regression tree model trained on only the half-rise to dicrotic notch (HRDN) feature achieved a root-mean-square error (RMSE) of 13%, an R² of 0.82, and an area under the receiver operating characteristic curve of 0.97 for detecting decompensation. This single-feature model's performance compares favorably to previously reported results from more complex black-box machine learning models. The model also provides physiological insight, because HRDN is an approximate measure of the delay between the ejected and reflected ABP waves and therefore indicates the cardiac and peripheral vascular mechanisms that contribute to the compensatory response to blood loss and replacement.
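
To make the HRDN feature concrete, here is a simplified sketch of how the delay from the half-rise point to the dicrotic notch might be estimated on a pre-segmented ABP beat, with a gradient-boosted regression tree trained on the resulting values. The notch detection is deliberately crude, and the window lengths, sampling assumptions, and model settings are illustrative; the paper's exact feature definition and training setup may differ.

    # Sketch (assumptions throughout): simplified per-beat HRDN estimate and a
    # single-feature gradient-boosted regression tree.
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    def hrdn_seconds(beat, fs):
        """Approximate delay (s) from the half-rise point on the upstroke to the
        dicrotic notch, for one pre-segmented ABP beat sampled at fs Hz."""
        beat = np.asarray(beat, dtype=float)
        peak = int(np.argmax(beat))
        half_level = beat[0] + 0.5 * (beat[peak] - beat[0])
        half_rise = int(np.argmax(beat[:peak + 1] >= half_level))   # first upstroke crossing
        # Crude notch proxy: minimum within a fixed window after the systolic peak;
        # a real detector would use derivative-based notch identification.
        search = beat[peak : peak + int(0.4 * fs)]
        notch = peak + int(np.argmin(search))
        return (notch - half_rise) / fs

    # X: one beat-averaged HRDN value per analysis window (shape [n_windows, 1]);
    # y: reference volume status per window (e.g., LBNP level).
    model = GradientBoostingRegressor()
    # model.fit(X_train, y_train); y_pred = model.predict(X_test)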

Backdoor poisoning of encrypted traffic classifiers

Summary

Significant recent research has focused on applying deep neural network models to the problem of network traffic classification. At the same time, much has been written about the vulnerability of deep neural networks to adversarial inputs, both during training and inference. In this work, we consider launching backdoor poisoning attacks against an encrypted network traffic classifier. We focus on attacks based on padding network packets, which has the benefit of preserving the functionality of the network traffic. In particular, we consider a handcrafted attack as well as an optimized attack leveraging universal adversarial perturbations. We find that poisoning attacks can be extremely successful if the adversary has the ability to modify both the labels and the data (dirty-label attacks), and somewhat successful, depending on the attack strength and the target class, if the adversary perturbs only the data (clean-label attacks).
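
The padding-based trigger can be pictured with a short sketch: pad a few packets of selected training flows by fixed amounts and, for the dirty-label variant, relabel those flows to the attacker's target class. The trigger pattern, poison rate, and feature representation below are hypothetical; the optimized, UAP-based trigger from the paper is not shown.

    # Sketch with assumed trigger and poison-rate values; packet-length sequences
    # are used as a stand-in feature representation.
    import numpy as np

    def apply_padding_trigger(packet_lengths, pad_bytes=(64, 32, 64)):
        """Pad the first few packets by fixed amounts; payloads are untouched,
        so the traffic's functionality is preserved."""
        lengths = np.array(packet_lengths, dtype=int)
        k = min(len(pad_bytes), len(lengths))
        lengths[:k] += np.array(pad_bytes[:k])
        return lengths

    def poison_dataset(X, y, target_class, poison_rate=0.05, dirty_label=True, seed=0):
        """Insert the trigger into a random fraction of flows; with dirty_label=True
        the poisoned flows are also relabeled to the target class."""
        rng = np.random.default_rng(seed)
        idx = rng.choice(len(X), size=int(poison_rate * len(X)), replace=False)
        Xp, yp = list(X), list(y)
        for i in idx:
            Xp[i] = apply_padding_trigger(Xp[i])
            if dirty_label:
                yp[i] = target_class
        return Xp, yp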

Contingent routing using orbital geometry in proliferated low-earth-orbit satellite networks

Published in:
2022 IEEE Military Communications Conf., MILCOM, 28 November - 2 December 2022.

Summary

Optimum adaptive routing in proliferated low-earth-orbit (pLEO) satellite networks requires intensive computation. The very small size, light weight, and low power of individual satellites in such networks make a centralized, terrestrial, SDN-like approach to routing computation an attractive solution. However, it is highly desirable to have a distributed backup routing capability onboard each satellite that can maintain service if the central computational node(s) fail or lose the pathway(s) needed to upload routing data frequently to each satellite. This paper presents a routing algorithm based on orbital geometry that has very low computation and storage requirements and is suitable as a backup routing capability in the event of failure of a centralized routing calculation node or nodes. Path failure rate, path latency, and link resource usage are simulated for a 360-satellite Walker Delta constellation with 4 inter-satellite link (ISL) terminals per satellite, and with up to 10% of the satellites having failed. For the fully intact constellation, the path failure rate is zero (identical to a shortest-path routing algorithm), while mean latency and average link resource usage are approximately 12% and 13% higher, respectively, than with shortest-path routing. With 10 random satellite failures in the constellation, the geometric algorithm has a path failure rate of less than 0.5%, while mean latency and link resource usage are approximately 12% and 16% higher, respectively, than with shortest-path routing.
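
The paper's specific algorithm is not reproduced here, but the flavor of geometry-based routing with minimal per-satellite state can be sketched as a greedy next-hop rule on a (plane, slot) grid: each satellite forwards over whichever of its four ISLs most reduces the wrapped plane/slot distance to the destination. The constellation dimensions and tie-breaking below are assumptions.

    # Sketch (not the paper's algorithm): greedy geometric next-hop selection on an
    # assumed 20-plane x 18-slot Walker Delta grid with 4 ISLs per satellite.
    N_PLANES, SATS_PER_PLANE = 20, 18

    def wrapped_delta(a, b, n):
        """Signed shortest step from a to b on a ring of size n."""
        d = (b - a) % n
        return d - n if d > n // 2 else d

    def next_hop(current, dest):
        """current and dest are (plane, slot) indices; returns the neighbor reached
        over one of the four ISLs (intra-plane fore/aft, cross-plane left/right)."""
        (p, s), (dp, ds) = current, dest
        plane_step = wrapped_delta(p, dp, N_PLANES)
        slot_step = wrapped_delta(s, ds, SATS_PER_PLANE)
        # Greedy tie-break: move along whichever coordinate is farther from the goal.
        if plane_step != 0 and abs(plane_step) >= abs(slot_step):
            return ((p + (1 if plane_step > 0 else -1)) % N_PLANES, s)
        if slot_step != 0:
            return (p, (s + (1 if slot_step > 0 else -1)) % SATS_PER_PLANE)
        return current    # already at the destination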

Failure resilience in proliferated low earth orbit satellite network topologies

Published in:
2022 IEEE Military Communications Conf., MILCOM, 28 November - 2 December 2022.

Summary

The vision of continuous network connectivity for users located anywhere on Earth is increasingly being enabled by satellite constellations with hundreds to thousands of satellites operating in low altitude orbits (typically somewhere between a few hundred and two thousand km). These constellations are often referred to as proliferated Low Earth Orbit (pLEO) constellations. Potential military use of such constellations would require a high degree of resilience against various types of failures. This paper examines how resilience to satellite failures in particular is affected by topology and topology management for a moderate-sized constellation of 360 low-earth-orbit satellites providing 2X-redundant global coverage. We present simulations quantifying the effects of two vs. four inter-satellite links (ISLs) per satellite, and of dynamic post-failure topology reconfiguration vs. static topology management. Simulations show differences of 65-80% in mission connectivity between 4-ISL topologies with dynamic topology reconfiguration and 2-ISL topologies with static topology management, under two different traffic scenarios.

Rate control with autoregressive forecasting for high frequency communication

Published in:
2022 IEEE Military Communications Conf., MILCOM, 28 November - 2 December 2022.

Summary

This work introduces a data-driven framework for rate control and applies it to high frequency (HF) communication systems that propagate via the Earth’s ionosphere. The rate control approach uses statistical techniques to forecast channel state with an autoregressive (AR) model, which has previously been applied to different forms of wireless fading, including "medium" timescale fading at HF. The objective of rate control is to maximize the data rate while constraining the rate of packets decoded in error. We show that under ideal assumptions, the rate controller selects the rate by backing off from the forecast average signal-to-noise ratio (SNR) by a factor of σ·Q⁻¹(β), where σ² correlates with the fading variance, β denotes the constraint on decoder errors, and Q(·) is the complementary cumulative distribution function of the Gaussian distribution. Simulation results on an HF channel model show that compared with naive schemes, AR forecasting provides a good balance between achieving high rate and ensuring reliability.
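
Under the stated assumptions, the back-off rule lends itself to a short sketch: forecast the next SNR with an AR(1) model, subtract σ·Q⁻¹(β), and pick the highest rate whose SNR requirement is still met. The AR order, rate table, and parameter values below are illustrative only.

    # Sketch (assumed AR(1) model, rate table, and parameters).
    import numpy as np
    from scipy.stats import norm

    # Hypothetical rate table: (minimum SNR in dB, data rate in bps).
    RATE_TABLE = [(0.0, 75), (5.0, 150), (10.0, 300), (15.0, 600), (20.0, 1200)]

    def forecast_snr_ar1(snr_history_db, phi=0.9):
        """One-step AR(1) forecast of SNR about its sample mean (phi assumed known)."""
        x = np.asarray(snr_history_db, dtype=float)
        mu = x.mean()
        return mu + phi * (x[-1] - mu)

    def select_rate(snr_history_db, sigma_db, beta=0.05):
        """Back off the forecast SNR by sigma * Q^-1(beta), then pick the highest
        rate whose SNR requirement is still satisfied."""
        backoff = sigma_db * norm.isf(beta)      # Q^-1(beta) via the inverse survival function
        effective_snr = forecast_snr_ar1(snr_history_db) - backoff
        feasible = [rate for snr_min, rate in RATE_TABLE if effective_snr >= snr_min]
        return max(feasible) if feasible else RATE_TABLE[0][1]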

Mission resilience experimentation and evaluation testbed

Published in:
2022 IEEE Military Communications Conf., MILCOM, 28 November - 2 December 2022.

Summary

As DoD systems grow ever more complex, the DoD continues to struggle with understanding and improving the resilience of its mission software. The Applied Resilience for Mission Systems (ARMS) Testbed is an environment that enables resilience improvement through experimentation and assessment of different mission system architectures and approaches. The Testbed consists of components for deploying mission system software for testing, capturing system performance, generating traffic, introducing disruptions into the mission system, orchestrating controlled experiments, and assessing and comparing the performance of mission systems. This paper covers the implementation of the Testbed, the analysis methods used for mission resilience comparisons, and their application to an operational terrestrial network architecture. Additionally, we introduce the Distance to Failure metric for comparing the resilience of arbitrary mission system variations.

Automated contact tracing assessment

Published in:
MIT Lincoln Laboratory Report TR-1287.

Summary

The COVID-19 pandemic placed unprecedented demands on the global public health systems for disease surveillance and contact tracing. Engineers and scientists recognized that it might be possible to augment the efforts of public health teams, if a system for automated digital contact tracing could be quickly devised and deployed to the population of smartphones. The Private Automated Contact Tracing (PACT) protocol was one of several digital contact tracing proposals offered worldwide. PACT’s mission—to preserve individuals’ privacy and anonymity while enabling them to quickly alert even nearby strangers of a likely risky exposure—was adopted by Google and Apple and realized in the Exposure Notifications (EN) service and API for mobile application development. The Exposure Notifications system, like many digital proximity tools, is based on Bluetooth signal strength estimation, and keeps much of the necessary information and computation on the smartphones themselves. It implemented a decentralized approach to contact tracing: the public health authority, and other governmental authorities, cannot access the records of an individual’s encounters with others; nor is physical location used or shared by the service. Although the service is available on most modern iOS and Android devices, it is not enabled by default; the individual must opt in to use a particular region’s implementation of the service, either by installing the regional app or by enrolling through a menu of regions in the operating system settings. Likewise, individuals must affirm their consent before the service can share anonymized infection status with the regional public health authority, and alert recent close contacts. The widespread availability of Exposure Notifications through Apple and Google’s platforms has made it a de facto world standard. Determining its accuracy and effectiveness as a public health tool has been a subject of intense interest. In July 2020, CDC’s Innovative Technologies Team designated MIT LL and the PACT team as trusted technical advisors on the deployment of private automated contact tracing systems as part of its overall public health response to COVID-19. The Innovative Technologies Team sought to answer the following key question regarding automated contact tracing: Does automated contact tracing have sufficient public health value that it is worthwhile to integrate it at scale into existing and evolving manual contact tracing systems? Rapidly rising caseloads necessitated parallel-path assessment activities of most mature systems at the time. When access to the Google and Apple Exposure Notifications system became available, MIT LL focused the assessment efforts on the systems being built and deployed. There were two immediate and significant challenges to observing and quantifying the performance of the system as a whole: first, the privacy preserving design decisions of PACT and the system implementers denied access to system-level performance metrics, and second, obtaining accurate “ground truth” data about risky encounters in the population, against which to measure the detector performance, would require an unacceptable level of effort and intrusion. 
Therefore, MIT LL designed a set of parallel research activities to decompose the problem into components that could be assessed quantifiably (Bluetooth sensor performance, algorithm performance, user preferences and behaviors), components that could be assessed qualitatively (potential cybersecurity risks, potential for malicious use), and components that could be modeled based on current and emergent knowledge (population-level effects). The MIT LL research team conducted early assessments of the privacy and security aspects of new EN app implementations and closely reviewed the available system code exercised by the apps, before conducting a series of phone-to-phone data collections both in the laboratory and in simulated real-world conditions. The data from these experiments fed into models and visualization tools created to predict and understand the risk score output of candidate “weights and thresholds” configurations for EN, i.e., to predict the performance of the system as-built against ground truth data for distance and duration of “exposure”. The data and performance predictions from this effort helped to inform the global and local community of practice in making configuration decisions, and can help to predict the performance of future versions of similar tools, or alternative implementations of the current system. We conducted a human factors and usability review of early app user interfaces and messaging from public health, and designed a follow-on large-scale survey to investigate questions about user trust and system adoption decisions. The results of the human factors, user trust, and adoption studies were used by U.S. public health jurisdictions to make adjustments to public-facing communications, and were shared with Apple and Google to improve the user interface. Information gathered from public health experts enabled us to better understand conventional contact tracing workflows and data streams, and we incorporated that information into an agent-based model of “hybrid” contact tracing plus Exposure Notifications. We then combined it with emerging reports on vaccination, mask effectiveness, social interaction, variant transmissibility, and our own data on the sensitivity and specificity of the Bluetooth “dose” estimator, to predict system-level effects under various conditions. Finally, we helped to establish a network of Exposure Notifications “practitioners” in public health, who surfaced desirable system-level key performance indicators (implemented during 2021 and 2022, in the Exposure Notifications Private Analytics system, or ENPA). At the conclusion of the program, many of the initial conditions of the pandemic had changed. The Exposure Notifications service was available to most of the world, but had only been deployed by 28 U.S. states and territories, and had not been adopted by much of the population in those regions. High case rates during the Omicron surge (December 2021 – January 2022) and newly available ENPA data offered the first hints at calculating “real” state-level performance metrics, but those data belong to the states and many are cautious about publishing. 
Although Google and Apple have stated that Exposure Notifications was designed for COVID-19, and will not be maintained in its current form after the pandemic ends, the public health and engineering communities show clear interest in using the “lessons learned” from Exposure Notifications and other similar solutions to preserve the capabilities developed and prepare better systems for future public health emergencies. The intent of this report is to document the work that has been completed, as well as to inform where the work could be updated or adapted to meet future needs.
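
For readers unfamiliar with the "weights and thresholds" configurations mentioned above, the sketch below shows the general weighted-duration style of risk scoring used in Exposure Notifications-like systems: Bluetooth attenuation readings are binned into buckets, bucket durations are weighted and summed, and an alert fires if the weighted minutes exceed a threshold. The bucket boundaries, weights, and threshold here are illustrative and are not any jurisdiction's actual configuration.

    # Simplified sketch of weighted-duration risk scoring in the style of Exposure
    # Notifications "weights and thresholds" configurations; all values are illustrative.
    ATTENUATION_THRESHOLDS_DB = (55, 63, 70)      # immediate / near / medium bucket edges
    BUCKET_WEIGHTS = (1.5, 1.0, 0.4, 0.0)         # weight per bucket (immediate .. other)
    ALERT_THRESHOLD_MIN = 15.0                    # weighted minutes needed to trigger an alert

    def bucket_index(attenuation_db):
        """Map a Bluetooth attenuation reading to its bucket (0=immediate .. 3=other)."""
        for i, edge in enumerate(ATTENUATION_THRESHOLDS_DB):
            if attenuation_db <= edge:
                return i
        return 3

    def weighted_exposure_minutes(scans):
        """scans: list of (attenuation_db, duration_minutes) from one encounter."""
        return sum(BUCKET_WEIGHTS[bucket_index(a)] * minutes for a, minutes in scans)

    def should_alert(scans, infectiousness_weight=1.0, report_type_weight=1.0):
        """Apply per-key weights and compare against the alert threshold."""
        score = weighted_exposure_minutes(scans) * infectiousness_weight * report_type_weight
        return score >= ALERT_THRESHOLD_MIN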

On randomization in MTD systems

Published in:
Proc. of the 9th ACM Workshop on Moving Target Defense, MTD ’22, 7 November 2022.

Summary

Randomization is one of the main strategies for providing security in moving-target-defense (MTD) systems. However, randomization has an associated cost, and estimating this cost and its impact on the overall system is crucial to ensuring adoption of the MTD strategy. In this paper we discuss our experience in attempting to estimate the cost of path randomization in a message transmission system that randomized the paths messages take through the network. Our conclusions are that (i) the cost depends crucially on the underlying network control technology, (ii) the cost can be reduced through better implementation, and (iii) reducing one type of cost may result in increased costs of a different type, for example a higher device cost. These observations suggest that estimating the cost of randomization is a multivariable optimization problem that requires a full understanding of the system components.