Publications


Quantification of radar QPE performance based on SENSR network design possibilities

Published in:
2018 IEEE Radar Conference (RadarConf), 23-27 April 2018.

Summary

In 2016, the FAA, NOAA, DoD, and DHS initiated a feasibility study for a Spectrum Efficient National Surveillance Radar (SENSR). The goal is to assess approaches for vacating the 1.3- to 1.35-GHz radio frequency band currently allocated to FAA/DoD long-range radars so that this band can be auctioned for commercial use. As part of this goal, the participating agencies have developed preliminary performance requirements that not only assume minimum capabilities based on legacy radars, but also recognize the need for enhancements in future radar networks. The relatively low density of the legacy radar networks, especially the WSR-88D network, has led to the goal of enhancing low-altitude weather coverage. With multiple design metrics and network possibilities still available to the SENSR agencies, the benefits of low-altitude coverage must be assessed quantitatively. This study lays the groundwork for estimating Quantitative Precipitation Estimation (QPE) differences based on network density, array size, and polarimetric bias. These factors create a Pareto front of cost-benefit for QPE in a new radar network, and these results will eventually be used to determine appropriate tradeoffs for SENSR requirements. Results of this study are presented in the form of two case examples that quantify errors based on polarimetric bias and elevation, along with a description of eventual application to a national network in an upcoming expansion of this work.

Hybrid mixed-membership blockmodel for inference on realistic network interactions

Published in:
IEEE Trans. Netw. Sci. Eng., Vol. 6, No. 3, July-Sept. 2019.

Summary

This work proposes novel hybrid mixed-membership blockmodels (HMMB) that integrate three canonical network models to capture the characteristics of real-world interactions: community structure with mixed-membership, power-law-distributed node degrees, and sparsity. This hybrid model provides the capacity needed for realism, enabling control and inference on individual attributes of interest such as mixed-membership and popularity. A rigorous inference procedure is developed for estimating the parameters of this model through iterative Bayesian updates, with targeted initialization to improve identifiability. For the estimation of mixed-membership parameters, the Cramer-Rao bound is derived by quantifying the information content in terms of the Fisher information matrix. The effectiveness of the proposed inference is demonstrated in simulations where the estimates achieve covariances close to the Cramer-Rao bound while maintaining good truth coverage. We illustrate the utility of the proposed model and inference procedure in the application of detecting a community from a few cue nodes, where success depends on accurately estimating the mixed-memberships. Performance evaluations on both simulated and real-world data show that inference with HMMB is able to recover mixed-memberships in the presence of challenging community overlap, leading to significantly improved detection performance over algorithms based on network modularity and simpler models.
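The generative side of such a hybrid model can be illustrated with a minimal sketch. The parameter values, the Pareto-distributed popularity weights, and the Bernoulli edge rule below are illustrative assumptions for a mixed-membership blockmodel with degree heterogeneity, not the paper's exact HMMB formulation or inference procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

n, K = 200, 3                      # nodes, communities
B = 0.05 * np.eye(K) + 0.005       # block connectivity (diagonal = within-community rate)
alpha = np.full(K, 0.1)            # low Dirichlet concentration -> mostly pure memberships

pi = rng.dirichlet(alpha, size=n)          # mixed-membership vector per node, shape (n, K)
theta = rng.pareto(2.5, size=n) + 1.0      # heavy-tailed popularity (degree) weights
theta /= theta.mean()

# Edge probability: popularity-scaled blockmodel rate, capped at 1
P = np.minimum(np.outer(theta, theta) * (pi @ B @ pi.T), 1.0)
A = (rng.random((n, n)) < P).astype(int)
A = np.triu(A, 1)
A = A + A.T                                # symmetric adjacency, no self-loops

print("mean degree:", A.sum() / n)
```

The small blockmodel rates keep the graph sparse, while the heavy-tailed weights produce a broad degree distribution on top of the community structure.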

The Offshore Precipitation Capability

Summary

In this work, machine learning and image processing methods are used to estimate radar-like precipitation intensity and echo top heights beyond the range of weather radar. The technology, called the Offshore Precipitation Capability (OPC), combines global lightning data with existing radar mosaics, five Geostationary Operational Environmental Satellite (GOES) channels, and several fields from the Rapid Refresh (RAP) 13 km numerical weather prediction model to create precipitation and echo top fields similar to those provided by existing Federal Aviation Administration (FAA) weather systems. Preprocessing and feature extraction methods are described to construct inputs for model training. A variety of machine learning algorithms are investigated to identify which provides the most accuracy. Output from the machine learning model is blended with existing radar mosaics to create weather radar-like analyses that extend into offshore regions. The resulting fields are validated using land radars and satellite precipitation measurements provided by the National Aeronautics and Space Administration (NASA) Global Precipitation Measurement Mission (GPM) core observatory satellite. This capability is initially being developed for the Miami Oceanic airspace with the goal of providing improved situational awareness for offshore air traffic control.

En route sector capacity model final report

Published in:
MIT Lincoln Laboratory Report ATC-426

Summary

Accurate predictions of en route sector capacity are vital when analyzing the benefits of proposed new air traffic management decision-support tools or new airspace designs. Controller workload is the main determinant of sector capacity. This report describes a new workload-based capacity model that improves upon the Federal Aviation Administration's current Monitor Alert capacity model. Analysts often use Monitor Alert sector capacities in evaluating the benefits of decision-support aids or airspace designs. However, Monitor Alert, which was designed to warn controllers of possible sector overload, sets sector capacity limits based solely on handoff workload and fixed procedural constraints. It ignores the effects of conflict workload and recurring workload (from activities such as monitoring, vectoring, spacing, and metering). Each workload type varies differently as traffic counts and airspace designs are changed. When used for benefits analysis, Monitor Alert's concentration on a single workload type can lead to erroneous conclusions. The new model considers all three workload types. We determine the relative contribution of the three workload types by fitting the model to the upper frontiers that appear in peak daily sector traffic counts from today's system. When we fit the Monitor Alert model to these same peak traffic counts, it can only explain the observed frontiers by hypothesizing large handoff workload. Large handoff workload would imply that decision-support aids should focus on handoff tasks. The new model fits the traffic data with less error, and shows that recurring tasks create significantly more workload in all sectors than do handoff tasks. The new model also shows that conflict workload dominates in very small sectors. These findings suggest that it is more beneficial to develop decision-support aids for recurring tasks and conflict tasks than for handoff tasks.

Percolation model of insider threats to assess the optimum number of rules

Published in:
Environ. Syst. Decis., Vol. 35, 2015, pp. 504-10.

Summary

Rules, regulations, and policies are the basis of civilized society and are used to coordinate the activities of individuals who have a variety of goals and purposes. History has taught that over-regulation (too many rules) makes it difficult to compete and under-regulation (too few rules) can lead to crisis. This implies an optimal number of rules that avoids these two extremes. Rules create boundaries that define the latitude an individual has to carry out their activities. This paper creates a Toy Model of a work environment and examines it with respect to the latitude provided to a normal individual and the latitude provided to an insider threat. Simulations with the Toy Model illustrate four regimes with respect to an insider threat: under-regulated, possibly optimal, tipping point, and over-regulated. These regimes depend upon the number of rules (N) and the minimum latitude (Lmin) required by a normal individual to carry out their activities. The Toy Model is then mapped onto the standard 1D Percolation Model from theoretical physics, and the same behavior is observed. This allows the Toy Model to be generalized to a wide array of more complex models that have been well studied by the theoretical physics community and that show the same behavior. Finally, by estimating N and Lmin, it should be possible to determine the regime of any particular environment.
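The 1D flavor of such a model can be illustrated with a minimal simulation sketch. The uniform rule placement, the Lmin value, and the workability criterion below are hypothetical choices in the spirit of the abstract, not the paper's actual Toy Model:

```python
import numpy as np

rng = np.random.default_rng(1)

def largest_gap(n_rules: int) -> float:
    """Place n_rules random boundaries on [0, 1]; return the widest
    remaining open segment (the latitude left for activity)."""
    cuts = np.sort(rng.random(n_rules))
    gaps = np.diff(np.concatenate(([0.0], cuts, [1.0])))
    return gaps.max()

def p_workable(n_rules: int, l_min: float, trials: int = 2000) -> float:
    """Fraction of trials in which an individual still finds a segment
    of width at least l_min, i.e., the environment remains workable."""
    return float(np.mean([largest_gap(n_rules) >= l_min
                          for _ in range(trials)]))

for n in (2, 10, 40, 160):
    print(n, p_workable(n, l_min=0.05))
```

As the number of rules grows, the probability that any segment of width Lmin survives collapses, which is the over-regulated regime; with very few rules almost all latitude remains, the under-regulated regime.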

Wind information requirements for NextGen applications, phase 3 final report

Published in:
MIT Lincoln Laboratory Report ATC-422

Summary

Many NextGen applications depend on access to high accuracy wind data due to time-based control elements, such as required time of arrival at a meter fix under 4D-Trajectory-Based Operations/Time of Arrival Control (4D-TBO/TOAC) procedures or compliance with an assigned spacing goal between aircraft under Interval Management (IM) procedures. Any errors in the ground and/or aircraft wind information relative to the truth winds actually flown through can significantly degrade the performance of the procedure. Unacceptable performance could be mitigated by improving wind information in the aircraft, for example, by using higher accuracy wind forecast models to generate wind inputs for the ground or airborne systems, updating wind information more frequently, or upgrading the way winds are handled in the avionics systems. This report summarizes the activities conducted in FY14, which build upon prior work. It (1) establishes the relationship of wind information accuracy to 4D-TBO and IM performance for a selection of operationally relevant scenarios to identify wind needs to support them, and (2) presents examples of what wind information content and update rate to the aircraft will deliver a given target performance level, to help inform concept of operations development and datalink technology needs.

Application of the Fornasini-Marchesini first model to data collected on a complex target model

Summary

This work describes the computation of scatterers that lie on the body of a real target and are depicted in radar images. A novelty of the approach is that the target echoes collected by the radar are formulated into the first Fornasini-Marchesini (F-M) state-space model to compute poles that give rise to the scatterer locations in two-dimensional (2-D) space. Singular value decomposition carried out on the data provides state matrices that capture the dynamics of the target. Furthermore, eigenvalues computed from the state transition matrices provide range and cross-range locations of the scatterers that exhibit the target silhouette in 2-D space. The maximum likelihood function is formulated with the state matrices to obtain an iterative expression for the Fisher information matrix (FIM), from which posterior Cramer-Rao bounds associated with the various scatterers are derived. The effectiveness of the 2-D state-space technique is tested on static range data collected on a complex conical target model; its accuracy in extracting target length is judged against physical measurements. The validity of the proposed 2-D state-space technique and the Cramer-Rao bounds is demonstrated through data collected on the target model.

Forecast confidence measures for deterministic storm-scale aviation forecasts

Published in:
4th Aviation, Range, and Aerospace Meteorology Special Symp., 2-6 February 2014.

Summary

Deterministic storm-scale weather forecasts, such as those generated from the FAA's 0-8 hour CoSPA system, are highly valuable to aviation traffic managers. They provide forecasted characteristics of storm structure, strength, orientation, and coverage that are very helpful for strategic planning purposes in the National Airspace System (NAS). However, these deterministic weather forecasts contain inherent uncertainty that varies with the general weather scenario at the forecast issue time, the predicted storm type, and the forecast time horizon. This uncertainty can cause large changes in the forecast from update to update, thereby eroding user confidence and ultimately reducing the forecast's effectiveness in the decision-making process. Deterministic forecasts generally lack objective measures of this uncertainty, making it very difficult for users of the forecast to know how much confidence to have in the forecast during their decision-making process. This presentation will describe a methodology to provide measures of confidence for deterministic storm-scale forecasts. The method inputs several characteristics of the current and historical weather forecasts, such as spatial scale, intensity, weather type, orientation, permeability, and run-to-run variability of the forecasts, into a statistical model to provide a measure of confidence in a forecasted quantity. In this work, the forecasted quantity is aircraft blockage associated with key high-impact Flow Constrained Areas (FCAs) in the NAS. The results from the method, which will also be presented, provide the user with a measure of forecast confidence in several blockage categories (none, low, medium, and high) associated with the FCAs. This measure of forecast confidence is geared toward helping en route strategic planners in the NAS make better use of deterministic storm-scale weather forecasts for air traffic management.

Review of Systems-Theoretic Process Analysis (STPA) method and results to support NextGen concept assessment and validation

Published in:
MIT Lincoln Laboratory Report ATC-427

Summary

This report provides an assessment of the applicability of Systems-Theoretic Process Analysis (STPA) to perform preliminary risk-based modeling of complex NextGen concepts, based on the observed application of STPA to Interval Management-Spacing (IM-S) as a case study. The report also considers the potential use of STPA as a formal tool for safety analysis at the Federal Aviation Administration. This report's sources include a report documenting the application of STPA performed by the MIT Systems Engineering Research Lab (SERL), previous reports, and input from other staff and aviation subject-matter experts.

Sector workload model for benefits analysis and convective weather capacity prediction

Published in:
10th USA/Europe Air Traffic Management Research and Development Seminar (ATM 2013), 10-13 June 2013.

Summary

En route sector capacity is determined mainly by controller workload. The operational capacity model used by the Federal Aviation Administration (FAA) provides traffic alert thresholds based entirely on hand-off workload. Its estimates are accurate for most sectors. However, it tends to over-estimate capacity in both small and large sectors because it does not account for conflicts and recurring tasks. Because of those omissions it cannot be used for accurate benefits analysis of workload-reduction initiatives, nor can it be extended to estimate capacity when hazardous weather increases the intensity of all workload types. We have previously reported on an improved model that accounts for all workload types and can be extended to handle hazardous weather. In this paper we present the results of a recent regression of that model using an extensive database of peak traffic counts for all United States en route sectors. The resulting fit quality confirms the workload basis of en route capacity. Because the model has excess degrees of freedom, the regression process returns multiple parameter combinations with nearly identical sector capacities. We analyze the impact of this ambiguity when using the model to quantify the benefits of workload reduction proposals. We also describe recent modifications to the weather-impacted version of the model to provide a more stable normalized capacity measure. We conclude with an illustration of its potential application to operational sector capacity forecasts in hazardous weather.
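A regression of this kind can be sketched as follows. The functional forms chosen for the three workload terms (handoff scaling with count over transit time, recurring with count, conflict with count squared over volume), the sector attributes, and the coefficient values are illustrative assumptions rather than the paper's actual model or data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical sector attributes: mean transit time T (min) and airspace volume V
m = 50
T = rng.uniform(5, 20, m)
V = rng.uniform(1, 10, m)

# Synthetic "observed" peak counts generated from assumed true workload weights:
# at capacity, a*n/T (handoff) + b*n (recurring) + c*n^2/V (conflict) = 1
a_true, b_true, c_true = 0.02, 0.05, 0.01
A2 = c_true / V
A1 = a_true / T + b_true
n_obs = (-A1 + np.sqrt(A1**2 + 4 * A2)) / (2 * A2)   # positive root of A2*n^2 + A1*n = 1

# Each sector at capacity contributes one linear equation in the unknown weights
X = np.column_stack([n_obs / T, n_obs, n_obs**2 / V])
coef, *_ = np.linalg.lstsq(X, np.ones(m), rcond=None)
print("recovered (handoff, recurring, conflict) weights:", coef)
```

Because each capacity observation is linear in the unknown weights, ordinary least squares suffices here; with noisy real frontiers, several weight combinations can fit nearly equally well, which mirrors the ambiguity discussed in the abstract.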