Publications

Safety analysis methodology for unmanned aerial vehicle (UAV) collision avoidance systems

Published in:
USA/Europe Air Traffic Management Seminar, 27-30 June 2005.

Summary

The integration of Unmanned Aerial Vehicles (UAVs) into civil airspace requires new methods of ensuring collision avoidance. Concerns over command and control latency, vehicle performance, reliability of autonomous functions, and interoperability of sense-and-avoid systems with the Traffic Alert and Collision Avoidance System (TCAS) and Air Traffic Control must be resolved. This paper describes the safety evaluation process that the international community has deemed necessary to certify such systems. The process focuses on a statistically-valid estimate of collision avoidance performance developed through a combination of airspace encounter modeling, fast-time simulation of the collision avoidance system across millions of encounter scenarios, and system failure and event sensitivity analysis. Example simulation results are provided for an implementation of the analysis process currently being used to evaluate TCAS on the Global Hawk UAV.
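
The heart of this process is a fast-time Monte Carlo sweep over sampled encounter geometries. The sketch below illustrates only the general idea: the encounter distribution, the toy avoidance maneuver, and all numerical values are invented for illustration and stand in for the airspace encounter models and TCAS logic used in the actual analysis. The risk ratio is the probability of a near mid-air collision with the system divided by the probability without it.

```c
/* Minimal sketch of a fast-time Monte Carlo risk-ratio estimate.
 * The encounter model and avoidance maneuver here are invented for
 * illustration; the actual analysis uses airspace encounter models
 * and the TCAS collision avoidance logic. */
#include <stdio.h>
#include <stdlib.h>

#define N_ENCOUNTERS 1000000
#define NMAC_FT      100.0   /* vertical near mid-air collision threshold (ft) */

/* Uniform random number in [lo, hi). */
static double urand(double lo, double hi) {
    return lo + (hi - lo) * ((double)rand() / ((double)RAND_MAX + 1.0));
}

int main(void) {
    long nmac_unequipped = 0, nmac_equipped = 0;
    srand(42);
    for (long i = 0; i < N_ENCOUNTERS; i++) {
        /* Sample a vertical miss distance at closest approach (ft). */
        double vmd = urand(0.0, 1000.0);
        /* Toy avoidance maneuver: an advisory adds vertical separation,
         * degraded by a random pilot/vehicle response factor. */
        double response  = urand(0.0, 1.0);          /* 0 = no response */
        double vmd_avoid = vmd + 400.0 * response;   /* added separation (ft) */
        if (vmd < NMAC_FT)       nmac_unequipped++;
        if (vmd_avoid < NMAC_FT) nmac_equipped++;
    }
    printf("P(NMAC) unequipped: %.4g\n", (double)nmac_unequipped / N_ENCOUNTERS);
    printf("P(NMAC) equipped:   %.4g\n", (double)nmac_equipped / N_ENCOUNTERS);
    printf("risk ratio:         %.3f\n",
           (double)nmac_equipped / (double)nmac_unequipped);
    return 0;
}
```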

Safety analysis for advanced separation concepts

Published in:
USA/Europe Air Traffic Management Seminar, 27-30 June 2005.

Summary

Aviation planners have called for increasing the capacity of the air transportation system by factors of two or three over the next 20 years. The inherent spatial capacity of en route airspace appears able to accommodate such traffic densities. But controller workload presents a formidable obstacle to achieving such goals. New approaches to providing separation assurance are being investigated to overcome workload limitations and allow airspace capacity to be fully utilized. One approach is to employ computer automation as the basis for the separation-assurance task. This would permit traffic densities that exceed the level at which human cognition and decision-making can assure separation. One of the challenges that must be faced involves the ability of such highly automated systems to maintain safety in the presence of inevitable subsystem faults, including the complete failure of the supporting computer system. Traffic density and flow complexity will make it impossible for human service providers to safely reinitiate manual control in the event of computer failure, so the automated system must have inherent fail-soft features. This paper presents a preliminary analysis of the ability of a highly automated separation assurance system to tolerate general types of faults such as nonconformance and computer outages. Safety-related design features are defined using the Advanced Airspace Concept (AAC) as the base architecture. Special attention is given to the impact of a severe failure in which all computer support is terminated within a defined region. The growth and decay of risk during an outage are evaluated using fault tree methods that integrate risk over time. It is shown that when a conflict-free plan covers the region of the outage, this plan can be used to safely transition aircraft to regions where service can still be provided.
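
As a rough illustration of the risk-integration step, the sketch below numerically integrates a hypothetical risk-rate curve over the duration of an outage. The curve shape and all parameters are invented for illustration and are not the fault-tree results reported in the paper.

```c
/* Sketch of integrating collision risk over a computer outage. The
 * risk-rate curve below is hypothetical; the paper derives risk from
 * fault-tree analysis of the Advanced Airspace Concept architecture. */
#include <stdio.h>
#include <math.h>

/* Hypothetical instantaneous risk rate (events per hour) t minutes after
 * the outage begins: risk grows as conflict-free plans age and decays as
 * aircraft transition out of the affected region. */
static double risk_rate(double t_min) {
    double growth = 1.0 - exp(-t_min / 5.0);   /* plan degradation */
    double decay  = exp(-t_min / 20.0);        /* traffic leaving the region */
    return 1e-3 * growth * decay;
}

int main(void) {
    const double outage_min = 30.0, dt = 0.01;  /* outage length, time step */
    double integrated_risk = 0.0;
    for (double t = 0.0; t < outage_min; t += dt)
        integrated_risk += risk_rate(t) * (dt / 60.0);  /* minutes -> hours */
    printf("integrated risk over a %.0f-minute outage: %.3g events\n",
           outage_min, integrated_risk);
    return 0;
}
```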

Megapixel CMOS image sensor fabricated in three-dimensional integrated circuit technology

Summary

This paper presents a 3-D integrated 1024 x 1024, 8-µm-pixel visible image sensor fabricated with oxide-to-oxide wafer bonding and 2-µm-square 3-D vias in every pixel. The 150-mm wafer technology integrates a low-leakage, deep-depletion, 100% fill-factor photodiode layer with a 3.3-V, 0.35-µm gate-length fully depleted (FD) SOI CMOS readout circuit layer.

Polymorphous computing architecture (PCA) kernel-level benchmarks [revision 1]

Published in:
MIT Lincoln Laboratory Report PCA-KERNEL-1, Rev. 1

Summary

This document describes a series of kernel benchmarks for the PCA program. Each kernel benchmark is an operation of importance to DoD sensor applications making use of a PCA architecture. Many of these operations are a part of the composite example applications described elsewhere. The kernel-level benchmarks have been chosen to stress both computation and communication aspects of the architecture. "Computation" aspects include floating-point and integer performance, as well as the memory hierarchy, while the "communication" aspects include the network, the memory hierarchy, and the I/O capabilities. The particular benchmarks chosen are based on the frequency of their use in current and future applications. They are drawn from the areas of signal processing, communication, and information and knowledge processing. The specification of the benchmarks in this document is meant to be high-level and largely independent of the implementation.
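
For a flavor of what a computation-oriented kernel benchmark entails, the sketch below times a simple FIR filter kernel and reports throughput. The kernel choice, problem sizes, and reported metric are assumptions made here for illustration; the report itself specifies the actual PCA kernels and workloads.

```c
/* Illustrative timing harness for a single computation kernel (a FIR
 * filter). The kernel, problem size, and metric are assumptions for
 * illustration; the report specifies the actual PCA kernel benchmarks. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N_SAMPLES 1000000
#define N_TAPS    64

static void fir(const float *x, const float *h, float *y, int n, int taps) {
    for (int i = 0; i < n; i++) {
        float acc = 0.0f;
        for (int k = 0; k < taps && k <= i; k++)
            acc += h[k] * x[i - k];
        y[i] = acc;
    }
}

int main(void) {
    float *x = malloc(N_SAMPLES * sizeof *x);
    float *y = malloc(N_SAMPLES * sizeof *y);
    float h[N_TAPS];
    for (int i = 0; i < N_SAMPLES; i++) x[i] = (float)rand() / RAND_MAX;
    for (int k = 0; k < N_TAPS; k++)    h[k] = 1.0f / N_TAPS;

    clock_t t0 = clock();
    fir(x, h, y, N_SAMPLES, N_TAPS);
    double sec = (double)(clock() - t0) / CLOCKS_PER_SEC;

    /* Throughput: 2 flops (multiply + add) per tap per output sample. */
    double flops = 2.0 * (double)N_SAMPLES * N_TAPS;
    printf("FIR kernel: %.3f s, %.1f MFLOP/s\n", sec, flops / sec / 1e6);
    free(x); free(y);
    return 0;
}
```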

Design considerations for space-based radar phased arrays

Published in:
2005 IEEE MTT-S Int. Microwave Symp. Digest, 12-17 June 2005, pp. 1631-1634.

Summary

Space Based Radar (SBR) is being considered as a means to provide persistent global surveillance. In order to be effective, the SBR system must be capable of high area coverage rates, low minimum detectable velocities (MDV), accurate geolocation, high range resolution, and robustness against electronic interference. These objectives will impose challenging requirements on the antenna array, including wide-angle electronic scanning, wide instantaneous bandwidth, large power-aperture product, low sidelobe radiation patterns, lightweight deployable structures, multiple array phase centers, and adaptive pattern synthesis. This paper will discuss key enabling technologies for low earth orbit (LEO) SBR arrays, including high-efficiency transmit/receive modules and multilayer tile architectures, and the parametric influence of array design variables on the SBR system.
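
A rough sense of the parametric trades can be had from standard aperture relations. The sketch below computes antenna gain, power-aperture product, and the projected-aperture loss at a wide electronic scan angle; all input values are assumptions chosen for illustration and are not figures from the paper.

```c
/* Rough parametric sketch: antenna gain, power-aperture product, and
 * projected-aperture loss with electronic scan angle. All numbers are
 * assumed for illustration and are not taken from the paper. */
#include <stdio.h>
#include <math.h>

int main(void) {
    const double c  = 3.0e8;            /* speed of light, m/s */
    const double pi = acos(-1.0);
    double freq_hz     = 10.0e9;        /* assumed X-band carrier */
    double aperture_m2 = 40.0;          /* assumed array aperture area */
    double avg_power_w = 1500.0;        /* assumed average transmit power */
    double scan_deg    = 50.0;          /* electronic scan angle off broadside */

    double lambda = c / freq_hz;
    double gain   = 4.0 * pi * aperture_m2 / (lambda * lambda);
    /* Projected aperture shrinks roughly as cos(scan) at wide scan angles. */
    double eff_aperture = aperture_m2 * cos(scan_deg * pi / 180.0);

    printf("wavelength:             %.3f m\n", lambda);
    printf("broadside gain:         %.1f dBi\n", 10.0 * log10(gain));
    printf("power-aperture product: %.0f W*m^2\n", avg_power_w * aperture_m2);
    printf("effective aperture at %.0f deg scan: %.1f m^2\n",
           scan_deg, eff_aperture);
    return 0;
}
```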

Dynamic buffer overflow detection

Published in:
Workshop on the Evaluation of Software Defect Detection Tools, 10 June 2005.

Summary

The capabilities of seven dynamic buffer overflow detection tools (Chaperon, Valgrind, CCured, CRED, Insure++, ProPolice and TinyCC) are evaluated in this paper. These tools employ different approaches to runtime buffer overflow detection and range from commercial products to open-source gcc enhancements. A comprehensive test suite was developed consisting of specifically designed test cases and model programs containing real-world vulnerabilities. Insure++, CCured and CRED provide the highest buffer overflow detection rates, but only CRED provides an open-source, extensible and scalable solution to detecting buffer overflows. Other tools did not detect off-by-one errors, did not scale to large programs, or performed poorly on complex programs.
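
As a concrete example of the defect class these tools target, the fragment below contains a classic off-by-one overflow of the kind such test suites exercise. It is illustrative only and is not one of the actual test cases from the evaluated suite.

```c
/* Illustrative off-by-one overflow of the kind such test suites exercise;
 * this is not one of the actual test cases from the evaluated suite. */
#include <string.h>

int main(void) {
    char src[16] = "0123456789ABCDE";  /* 15 characters plus the NUL */
    char dst[15];
    /* BUG: strcpy writes strlen(src) + 1 = 16 bytes into a 15-byte buffer,
     * a one-byte overflow that several of the weaker tools miss. */
    strcpy(dst, src);
    return (int)strlen(dst);
}
```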

Application of a development time productivity metric to parallel software development

Published in:
SE-HPCS '05, 2nd Int. Workshop on Software Engineering for High Performance Computing System Applications, 15 May 2005, pp. 8-12.

Summary

Evaluation of High Performance Computing (HPC) systems should take into account software development time productivity in addition to hardware performance, cost, and other factors. We propose a new metric for HPC software development time productivity, defined as the ratio of relative runtime performance to relative programmer effort. This formula has been used to analyze several HPC benchmark codes and classroom programming assignments. The results of this analysis show consistent trends for various programming models. This method enables a high-level evaluation of development time productivity for a given code implementation, which is essential to the task of estimating cost associated with HPC software development.
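
The metric itself reduces to a simple ratio: relative runtime speedup divided by relative programmer effort. The sketch below applies the formula to made-up serial and parallel data points, purely to show how it is computed; the numbers are not results from the paper.

```c
/* Development time productivity = relative speedup / relative effort, as
 * defined above. The serial and parallel numbers are made-up examples. */
#include <stdio.h>

int main(void) {
    double serial_runtime_s    = 1200.0;  /* assumed baseline runtime */
    double serial_effort_hrs   = 10.0;    /* assumed baseline effort */
    double parallel_runtime_s  = 75.0;    /* assumed parallel runtime */
    double parallel_effort_hrs = 40.0;    /* assumed parallel effort */

    double relative_speedup = serial_runtime_s / parallel_runtime_s;
    double relative_effort  = parallel_effort_hrs / serial_effort_hrs;

    printf("relative speedup:              %.1fx\n", relative_speedup);
    printf("relative effort:               %.1fx\n", relative_effort);
    printf("development time productivity: %.2f\n",
           relative_speedup / relative_effort);
    return 0;
}
```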

Measuring translation quality by testing English speakers with a new Defense Language Proficiency Test for Arabic

Published in:
Int. Conf. on Intelligence Analysis, 2-5 May 2005.

Summary

We present results from an experiment in which educated English-native speakers answered questions from a machine-translated version of a standardized Arabic language test. We compare the machine translation (MT) results with professional reference translations as a baseline for the purpose of determining the level of Arabic reading comprehension that current machine translation technology enables an English speaker to achieve. Furthermore, we explore the relationship between the current, broadly accepted automatic measures of performance for machine translation and the Defense Language Proficiency Test, a broadly accepted measure of effectiveness for evaluating foreign language proficiency. In doing so, we intend to help translate MT system performance into terms that are meaningful for satisfying Government foreign language processing requirements. The results of this experiment suggest that machine translation may enable Interagency Language Roundtable Level 2 performance, but is not yet adequate to achieve ILR Level 3. Our results are based on 69 human subjects reading 68 documents and answering 173 questions, giving a total of 4,692 timed document trials and 7,950 question trials. We propose Level 3 as a reasonable near-term target for machine translation research and development.

Laser beam combining for high-power, high-radiance sources

Published in:
IEEE J. Sel. Top. Quantum Electron., Vol. 11, No. 3, May/June 2005, pp. 567-577.

Summary

Beam combining of laser arrays with high efficiency and good beam quality for power and radiance (brightness) scaling is a long-standing problem in laser technology. Recently, significant progress has been made using wavelength (spectral) techniques and coherent (phased array) techniques, which has led to the demonstration of beam combining of a large semiconductor diode laser array (100 array elements) with near-diffraction-limited output (M² ~ 1.3) at significant power (35 W). This paper provides an overview of progress in beam combining and highlights some of the tradeoffs among beam-combining techniques.

Multi-PRI signal processing for the terminal Doppler weather radar, part I: clutter filtering

Published in:
J. Atmos. Ocean. Technol., Vol. 22, May 2005, pp. 575-582.

Summary

Multiple pulse repetition interval (multi-PRI) transmission is part of an adaptive signal transmission and processing algorithm being developed to aggressively combat range-velocity ambiguity in weather radars. In the past, operational use of multi-PRI pulse trains has been hampered due to the difficulty in clutter filtering. This paper presents finite impulse response clutter filter designs for multi-PRI signals with excellent magnitude and phase responses. These filters provide strong suppression for use on low-elevation scans and yield low biases of velocity estimates so that accurate velocity dealiasing is possible. Specifically, the filters are designed for use in the Terminal Doppler Weather Radar (TDWR) and are shown to meet base data bias requirements equivalent to the Federal Aviation Administration's specifications for the current TDWR clutter filters. Also an adaptive filter selection algorithm is proposed that bases its decision on clutter power estimated during an initial long-PRI surveillance scan. Simulations show that this adaptive algorithm yields satisfactory biases for reflectivity, velocity, and spectral width. Implementation of such a scheme would enable automatic elimination of anomalous propagation signals and constant adjustment to evolving ground clutter conditions, an improvement over the current TDWR clutter filtering system.
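
To illustrate the processing chain in its simplest form, the sketch below applies a crude clutter-notch FIR filter to a simulated pulse train and then forms a pulse-pair velocity estimate. It assumes a uniform PRI, a trivial 3-tap filter, and made-up radar parameters, whereas the paper's contribution is filter designs that keep bias low on multi-PRI (non-uniform) pulse trains.

```c
/* Minimal sketch: clutter filtering of a simulated pulse train followed by
 * a pulse-pair velocity estimate. A uniform PRI and a trivial 3-tap
 * high-pass filter are assumed; the paper's contribution is filter designs
 * that keep bias low on non-uniform (multi-PRI) pulse trains. */
#include <stdio.h>
#include <math.h>
#include <complex.h>

#define N_PULSES 64

int main(void) {
    const double pi     = acos(-1.0);
    const double lambda = 0.05;    /* radar wavelength, m (assumed C band) */
    const double pri    = 1.0e-3;  /* pulse repetition interval, s (uniform) */
    const double v_true = 12.0;    /* simulated weather radial velocity, m/s */

    /* Simulated slow-time samples: strong stationary clutter plus a
     * Doppler-shifted weather echo. */
    double complex x[N_PULSES], y[N_PULSES];
    for (int n = 0; n < N_PULSES; n++)
        x[n] = 5.0 + cexp(I * 4.0 * pi * v_true * n * pri / lambda);

    /* 3-tap high-pass FIR: a crude clutter notch at zero Doppler. */
    const double h[3] = { -0.25, 0.5, -0.25 };
    for (int n = 2; n < N_PULSES; n++)
        y[n] = h[0] * x[n] + h[1] * x[n - 1] + h[2] * x[n - 2];

    /* Pulse-pair estimate: velocity from the phase of the lag-1
     * autocorrelation of the filtered samples. */
    double complex r1 = 0.0;
    for (int n = 3; n < N_PULSES; n++)
        r1 += y[n] * conj(y[n - 1]);
    double v_est = lambda * carg(r1) / (4.0 * pi * pri);

    printf("true velocity: %.2f m/s, estimated: %.2f m/s\n", v_true, v_est);
    return 0;
}
```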