The technology is a nonlinear filter processor built from an array of polynomial nonlinear filters that accept and pass along data samples, producing an output through systematic, cooperative processing.

Filter processors are essential components of many digital signal processing systems. Such systems are widely used in fields such as telecommunications and medical imaging, which routinely demand filtering of high data volumes at high computation rates. As the world's data volumes keep growing, existing filter processors struggle to keep up: they lack a systematic process for handling large amounts of data and for distributing and using processing resources efficiently. This processing burden strains conventional processor systems, reduces their efficiency, and creates room for errors, leading to inaccurate results and inefficient processing.

Technology Description

The nonlinear filter processor consists of an array of polynomial nonlinear filters with a designated first and last filter. The first filter receives an input data sample, and the array systematically passes data from the first filter to the last. Each filter generates an output data sample based on the input it receives, and every filter except the last transfers its output to the adjacent filter. Every filter except the first sums the nonlinearly filtered input sample it produces with the output sample received from its neighboring polynomial nonlinear filter. What sets this technology apart is its systolic, or coordinated, architecture: each filter passes its output to the next in lockstep, creating a cascading pipeline that promotes efficiency and speed. This lets the filter processor array handle and manipulate large amounts of data, making it more reliable and efficient than comparable technologies. A minimal sketch of this chaining appears below.
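
The following Python sketch illustrates one way such a systolic chain could operate. It assumes each filter cell applies a per-tap polynomial nonlinearity and adds the result to a running partial sum passed from its neighbor; the class and method names (FilterCell, SystolicPolynomialFilter, step, process) are illustrative assumptions, not taken from the source, and the sequential loop stands in for one clock tick of the hardware array.

    # Sketch: a chain of polynomial nonlinear filter cells (assumed design).
    # Each cell evaluates p(x) = c0 + c1*x + c2*x**2 + ... on the sample it
    # currently holds and adds the result to the partial sum it receives.

    class FilterCell:
        """One polynomial nonlinear filter in the systolic chain."""
        def __init__(self, coeffs):
            self.coeffs = coeffs   # polynomial coefficients c0, c1, ...
            self.sample = 0.0      # data sample currently held by this cell

        def poly(self, x):
            # Evaluate the cell's polynomial nonlinearity (Horner's method).
            result = 0.0
            for c in reversed(self.coeffs):
                result = result * x + c
            return result

        def step(self, sample_in, partial_in):
            # Pass the previously held sample to the next cell, latch the
            # new one, and add this cell's nonlinearly filtered sample to
            # the partial sum received from the neighboring cell.
            sample_out = self.sample
            self.sample = sample_in
            partial_out = partial_in + self.poly(self.sample)
            return sample_out, partial_out

    class SystolicPolynomialFilter:
        """Array of cells: samples flow first-to-last; partial sums accumulate."""
        def __init__(self, cell_coeffs):
            self.cells = [FilterCell(c) for c in cell_coeffs]

        def process(self, x):
            # One clock tick: the first cell receives the input sample, and
            # each cell forwards its sample and partial sum to its neighbor.
            sample, partial = x, 0.0
            for cell in self.cells:
                sample, partial = cell.step(sample, partial)
            return partial         # the last cell yields the array output

    # Usage example: three cells, each with the quadratic p(x) = x + 0.5*x**2,
    # so the output at time n is p(x[n]) + p(x[n-1]) + p(x[n-2]).
    filt = SystolicPolynomialFilter([[0.0, 1.0, 0.5]] * 3)
    for n, x in enumerate([1.0, 2.0, 0.5, -1.0]):
        print(n, filt.process(x))

In hardware, every cell would execute its step simultaneously on each clock cycle; the loop above emulates that behavior sequentially and, for clarity, ignores the pipeline latency that a registered partial-sum path would introduce.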

Benefits

  • Capability to systematically process large volumes of data
  • Efficiency in handling high-speed computations
  • Improved reliability through systolic functionality
  • Efficient use of resources
  • Reduced margin for errors and inaccuracies

Potential Use Cases

  • Telecommunications, for signal filtering and processing
  • Medical imaging systems, for analyzing large volumes of data
  • Audio and video processing systems, for high-quality output
  • Data centers, for quick and efficient data sorting
  • Artificial intelligence systems, for speedy computations