Publications

Evaluation of the baseline NEXRAD icing hazard project

Published in:
37th Conference on Radar Meteorology, 14-18 September 2015

Summary

MIT Lincoln Laboratory has developed an icing hazard product that is now operational throughout the NEXRAD network. This initial version of the Icing Hazard Levels (IHL) algorithm is predicated on the presence of graupel as determined by the NEXRAD Hydrometeor Classification Algorithm (HCA). Graupel indicates that rime accretion on ice crystal aggregates is present. It is inferred that the riming process occurs at the altitude at which HCA reports graupel, as well as through some vertical depth above it. To capture some of that depth, temperature and relative humidity interest fields are computed from meteorological model data, following the technique used in the National Center for Atmospheric Research's Current Icing Potential product, and are utilized within IHL as warranted. A critical aspect of IHL development has been verification of the presence of icing, for which two methods are used. In the first, pilot reports of icing (PIREPs) are used to score IHL performance. Because PIREPs carry inherent time and space uncertainties, a buffer of influence is associated with each PIREP when scoring IHL. Results show that IHL, as configured, is an effective indicator of a potential icing hazard when HCA graupel classifications are present. Results also show the importance of radar volume coverage pattern selection in detecting weak returns in winter weather. In the second, in situ icing missions were flown within range of a dual-polarization NEXRAD to provide quantitative data identifying the presence of supercooled liquid water. Comparisons of the in situ data to HCA classifications show that HCA graupel indications do not fully expose the icing hazard; these findings are guiding future IHL development. This paper describes the verification method and the performance assessment of the initial IHL capability.
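
As a rough illustration of this buffer-based scoring, the sketch below counts a PIREP as detected if any IHL hazard indication falls within a space/time buffer around it, yielding a simple probability-of-detection statistic. The data structures, buffer radius, and time window here are illustrative assumptions; the actual IHL verification code and thresholds are not described in this summary.

```python
"""Hypothetical sketch of PIREP-based scoring with a buffer of influence."""
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Pirep:
    lat: float
    lon: float
    time_s: float  # report time, seconds since epoch

@dataclass
class IhlDetection:
    lat: float
    lon: float
    time_s: float

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def probability_of_detection(pireps, detections, radius_km=40.0, window_s=1200.0):
    """Fraction of icing PIREPs with an IHL detection inside the assumed buffer."""
    hits = 0
    for p in pireps:
        if any(haversine_km(p.lat, p.lon, d.lat, d.lon) <= radius_km
               and abs(p.time_s - d.time_s) <= window_s
               for d in detections):
            hits += 1
    return hits / len(pireps) if pireps else 0.0
```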

Sampling large graphs for anticipatory analytics

Published in:
HPEC 2015: IEEE Conf. on High Performance Extreme Computing, 15-17 September 2015.

Summary

The characteristics of Big Data, often dubbed the three V's for volume, velocity, and variety, will continue to outpace the ability of computational systems to process, store, and transmit meaningful results. Traditional techniques for dealing with large datasets often include purchasing larger systems, greater human-in-the-loop involvement, or more complex algorithms. We are investigating the use of sampling to mitigate these challenges, specifically the sampling of large graphs. Large datasets can often be represented as graphs in which data entries are edges and attributes of the data are vertices. In particular, we present the results of sampling for the task of link prediction. Link prediction estimates the probability of a new edge forming between two vertices of a graph, and it has numerous applications in understanding social and biological networks. In this paper we propose a series of techniques for sampling large datasets. To quantify the effect of these techniques, we present the quality of link prediction on the sampled graphs and the time saved in computing link-prediction statistics on them.
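
As a minimal sketch of this workflow, the example below keeps a uniform random fraction of a graph's edges and then scores candidate node pairs with a standard common-neighbors link predictor. The paper's specific sampling techniques are not reproduced here; the uniform edge sampling, the networkx library, and the toy graph are assumptions for illustration.

```python
"""Sketch: sample a graph's edges, then run common-neighbors link prediction."""
import random
import networkx as nx

def sample_edges(g: nx.Graph, keep_fraction: float, seed: int = 0) -> nx.Graph:
    """Keep a uniform random fraction of edges (one simple sampling scheme)."""
    rng = random.Random(seed)
    kept = [e for e in g.edges() if rng.random() < keep_fraction]
    sampled = nx.Graph()
    sampled.add_nodes_from(g.nodes())
    sampled.add_edges_from(kept)
    return sampled

def common_neighbor_scores(g: nx.Graph, candidate_pairs):
    """Score each candidate pair by its number of common neighbors."""
    return {(u, v): len(list(nx.common_neighbors(g, u, v))) for u, v in candidate_pairs}

# Example: score candidate links on a 20%-sampled synthetic graph.
g = nx.barabasi_albert_graph(1000, 5, seed=1)
sampled = sample_edges(g, keep_fraction=0.2)
pairs = [(0, 1), (2, 3)]  # candidate node pairs to score
print(common_neighbor_scores(sampled, pairs))
```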

Secure architecture for embedded systems

Summary

Devices connected to the internet are increasingly the targets of deliberate and sophisticated attacks. Embedded system engineers tend to focus on well-defined functional capabilities rather than "obscure" security and resilience. However, "after-the-fact" system hardening can be prohibitively expensive or even impossible. The co-design of security and resilience with functionality must overcome a major challenge: rarely can the security and resilience requirements be accurately identified when the design begins. This paper describes an embedded system architecture that decouples secure and functional design aspects.

Improving big data visual analytics with interactive virtual reality

Published in:
HPEC 2015: IEEE Conf. on High Performance Extreme Computing, 15-17 September 2015.

Summary

For decades, the growing volume of digital data collection has made it challenging to digest large volumes of information and extract their underlying structure. Dubbed 'Big Data', this information has often been gathered inconsistently (e.g., from many sources, in various forms, at different rates). These factors impede not only processing the data, but also analyzing and displaying it efficiently to the user. Many efforts have been made in the data mining and visual analytics communities to create effective ways to improve analysis and achieve the desired understanding. Our approach to improved big data visual analytics is two-fold, focusing on both visualization and interaction. Given geo-tagged information, we are exploring the benefits of visualizing datasets in their original geospatial domain by utilizing a virtual reality platform. After running proven analytics on the data, we represent the information in a more realistic 3D setting, where analysts can achieve enhanced situational awareness and rely on familiar perceptions to draw in-depth conclusions about the dataset. In addition, a human-computer interface that responds to natural user actions and inputs creates a more intuitive environment. Tasks can be performed to manipulate the dataset and allow users to dive deeper on request, adhering to their demands and intentions. Given the volume and popularity of social media, we developed a 3D tool visualizing Twitter activity on MIT's campus for analysis. Utilizing today's emerging technologies to create a fully immersive tool that promotes visualization and interaction can help ease the process of understanding and representing big data.

Enabling on-demand database computing with MIT SuperCloud database management system

Summary

The MIT SuperCloud database management system allows for rapid creation and flexible execution of a variety of the latest scientific databases, including Apache Accumulo and SciDB. It is designed to permit these databases to run on a High Performance Computing Cluster (HPCC) platform as seamlessly as any other HPCC job. It ensures the seamless migration of the databases to the resources assigned by the HPCC scheduler and provides centralized storage of the database files when they are not running. It also permits snapshotting of databases, allowing researchers to experiment and push the limits of the technology without concern for data or productivity loss if a database becomes unstable.

Big data strategies for data center infrastructure management using a 3D gaming platform

Summary

High Performance Computing (HPC) is intrinsically linked to effective Data Center Infrastructure Management (DCIM). Cloud services and HPC have become key components of Department of Defense and corporate information technology competitive strategies in the global and commercial spaces. As a result, the reliance on consistent, reliable data center space is more critical than ever. The costs and complexity of providing quality DCIM are constantly being tested and evaluated by the United States Government and by companies such as Google, Microsoft, and Facebook. This paper demonstrates a system in which Big Data strategies and 3D gaming technology are leveraged to monitor and analyze multiple HPC systems and a lights-out modular HP EcoPOD 240a data center on a single platform. Big Data technology and a 3D gaming platform enable near-real-time monitoring of 5,000 environmental sensors and more than 3,500 IT data points, and support visual analytics of the overall operating condition of the data center from a command center over 100 miles away. In addition, the Big Data model allows in-depth analysis of historical trends and conditions to optimize operations, achieving even greater efficiency and reliability.

Portable Map-Reduce utility for MIT SuperCloud environment

Summary

The MIT Map-Reduce utility has been developed and deployed on the MIT SuperCloud to support scientists and engineers at MIT Lincoln Laboratory. With the MIT Map-Reduce utility, users can deploy their applications quickly onto the MIT SuperCloud infrastructure, and the utility can work with any application without modification. For improved performance, it provides an option to consolidate multiple input data files per compute task into a single input stream with minimal changes to the target application. This enables users to reduce the computational overhead associated with multiple application startups when dealing with more than one piece of input data per compute task. With a small change to a sample MATLAB image processing application, we observed approximately a 12x speedup by reducing the application startup overhead. Currently the MIT Map-Reduce utility works with several schedulers, including SLURM, Grid Engine, and LSF.
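
A minimal sketch of this consolidation idea, assuming a generic command-line application: group input files into batches and launch the application once per batch rather than once per file, amortizing the startup cost across many inputs. The utility's actual interface is not described in this summary, so the function names and the subprocess-based launch below are illustrative assumptions.

```python
"""Sketch: amortize application startup by batching input files per launch."""
import subprocess
from pathlib import Path

def batches(files, batch_size):
    """Yield successive fixed-size batches of input files."""
    for i in range(0, len(files), batch_size):
        yield files[i:i + batch_size]

def run_batched(app_cmd, input_dir, batch_size=32):
    """Run `app_cmd` once per batch, passing the batch's files as arguments."""
    files = sorted(str(p) for p in Path(input_dir).glob("*"))
    for batch in batches(files, batch_size):
        subprocess.run([*app_cmd, *batch], check=True)

# e.g. run_batched(["python", "process_image.py"], "inputs/", batch_size=64)
# launches the application ceil(N/64) times instead of N times.
```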

Parallel vectorized algebraic AES in MATLAB for rapid prototyping of encrypted sensor processing algorithms and database analytics

Published in:
HPEC 2015: IEEE Conf. on High Performance Extreme Computing, 15-17 September 2015.

Summary

The increasing use of networked sensor systems and networked databases has led to increased interest in incorporating encryption directly into sensor algorithms and database analytics. MATLAB is the dominant tool for rapid prototyping of sensor algorithms and has extensive database analytics capabilities. The advent of high-level, high-performance Galois field mathematical environments allows encryption algorithms to be expressed succinctly and efficiently. This work leverages the Galois field primitives found in the MATLAB Communication Toolbox to implement a mode of the Advanced Encryption Standard (AES) based on first-principles mathematics. The resulting implementation requires 100x less code than standard AES implementations and delivers speed that is effective for many design purposes. The parallel version achieves speed comparable to native OpenSSL on a single node and is sufficient for real-time prototyping of many sensor processing algorithms and database analytics.
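
The MATLAB implementation itself is not reproduced here, but the flavor of the first-principles Galois field approach can be shown with the primitive operation underlying AES: multiplying bytes in GF(2^8) modulo the AES polynomial x^8 + x^4 + x^3 + x + 1. The sketch below (in Python, for illustration only) checks itself against the worked example in FIPS-197.

```python
"""Sketch: byte multiplication in GF(2^8) with the AES reducing polynomial."""
def gf256_mul(a: int, b: int) -> int:
    """Multiply two bytes as polynomials over GF(2), reduced mod 0x11B."""
    result = 0
    for _ in range(8):
        if b & 1:
            result ^= a           # add (XOR) the current shift of a
        carry = a & 0x80
        a = (a << 1) & 0xFF
        if carry:
            a ^= 0x1B             # reduce by the AES polynomial
        b >>= 1
    return result

# MixColumns multiplies state bytes by small constants in this field:
assert gf256_mul(0x57, 0x02) == 0xAE
assert gf256_mul(0x57, 0x13) == 0xFE  # worked example from FIPS-197
```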

Using a power law distribution to describe big data

Published in:
HPEC 2015: IEEE Conf. on High Performance Extreme Computing, 15-17 September 2015.

Summary

The gap between data production and users' ability to access, compute, and produce meaningful results calls for tools that address the challenges of big data volume, velocity, and variety. One of the key hurdles is the inability to methodically remove expected or uninteresting elements from large datasets, a difficulty that often wastes valuable researcher and computational time on uninteresting parts of the data. Social sensors, or sensors that produce data based on human activity, such as Wikipedia, Twitter, and Facebook, have an underlying structure that can be described by a power law distribution; such a distribution implies that a few nodes generate large amounts of data. In this article, we propose a technique that takes an arbitrary dataset and computes a power-law-distributed background model whose parameters are based on observed statistics. This model can be used to determine whether a power law is a suitable fit or to automatically identify high-degree nodes for filtering, and it can be scaled to work with big data.
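
A hedged sketch of this idea: fit a power law exponent to an observed degree sequence with the standard maximum-likelihood estimate (the continuous approximation popularized by Clauset et al.), then flag nodes in the distribution's extreme tail as candidates for filtering. The article's actual background-model construction is not reproduced; the quantile threshold and the toy data below are assumptions for illustration.

```python
"""Sketch: fit a power-law exponent and flag high-degree nodes for filtering."""
import math

def fit_alpha(degrees, d_min=1):
    """MLE of the power-law exponent alpha for degrees >= d_min."""
    tail = [d for d in degrees if d >= d_min]
    s = sum(math.log(d / (d_min - 0.5)) for d in tail)
    return 1.0 + len(tail) / s

def high_degree_nodes(degree_by_node, quantile=0.99):
    """Return nodes above the given degree quantile (filter candidates)."""
    degrees = sorted(degree_by_node.values())
    cutoff = degrees[int(quantile * (len(degrees) - 1))]
    return [n for n, d in degree_by_node.items() if d > cutoff]

# Toy degree sequence with a heavy tail, standing in for a social-sensor graph.
degs = {f"n{i}": max(1, int(1000 / (i + 1))) for i in range(500)}
print("alpha ~", round(fit_alpha(list(degs.values())), 2))
print("filter candidates:", high_degree_nodes(degs)[:5])
```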

ASR-9 Weather Systems Processor technology refresh and upgrade

Summary

The Weather Systems Processor (WSP) is an add-on system to the Airport Surveillance Radar-9 (ASR-9) that generates wind shear detection and storm tracking products for the terminal airspace. As the original system ages and pre-purchased replacement parts in the depot are used up, it becomes increasingly difficult to procure hardware components for repairs; thus, a technology refresh is needed to sustain WSP operations into the future. This phase of the project targets the intermediate-frequency digital receiver, the radar interface module, and the digital signal processor for replacement by updated hardware platforms. At the same time, the increase in computational capability allows for an upgrade of the signal processing algorithm, which will lead to data quality improvements.