Case ID: M24-026P

Published: 2024-08-07 10:10:06

Last Updated: 2024-08-07 10:10:06


Inventor(s)

Aradhita Sharma
Glen Uehara
Andreas Spanias

Technology categories

Computing & Information Technology
Physical Science

Technology keywords

PS-Computing and Information Technology
Quantum Computing
Signal Processing


Licensing Contacts

Physical Sciences Team

Quantum Autocorrelation Computation Using QFT

Quantum computing offers unprecedented computational power, enabling the solution of complex problems beyond the reach of classical computers. In signal processing, autocorrelation is used to analyze signals and extract valuable information, but computing it directly requires a multiply-and-add pass over the signal for every lag, so the cost grows quadratically with signal length. The Fast Fourier Transform (FFT) reduces this cost by working with frequency-domain representations, but there remains a need for further optimization. Quantum computing's inherent parallelism offers promising potential to further improve the efficiency of such signal processing tasks.
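The complexity gap can be seen in a small classical comparison. The sketch below (illustrative signal and lengths, not from the disclosure) computes the autocorrelation both directly, lag by lag, and via the FFT-based route that the quantum method parallels: transform, take the power spectrum, and transform back.

```python
# Classical illustration only: direct O(N^2) autocorrelation vs. the
# O(N log N) FFT-based route (inverse-transform of the power spectrum).
import numpy as np

N = 256
x = np.random.default_rng(0).standard_normal(N)   # illustrative test signal

# Direct computation: r[k] = sum_n x[n] * x[n+k], one pass per lag
r_direct = np.array([np.dot(x[:N - k], x[k:]) for k in range(N)])

# FFT route: zero-pad to avoid circular wrap-around, take |X|^2, transform back
X = np.fft.fft(x, 2 * N)
r_fft = np.fft.ifft(np.abs(X) ** 2).real[:N]

print(np.allclose(r_direct, r_fft))   # True: both give the same autocorrelation
```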

Researchers at Arizona State University have developed a method that uses the Quantum Fourier Transform (QFT) to compute signal autocorrelation. The approach pre-processes, windows, and segments the signal with overlap, computes the power spectral density using QFT operations, and then applies an inverse quantum Fourier transform (IQFT) to that spectrum to recover the autocorrelation. This quantum autocorrelation computation leverages quantum parallelism to reduce computational complexity, supports temporal and lag windowing, and accommodates overlap-and-save processing. The technology also includes quantum circuits for encoding, the QFT, and the IQFT; examines the effects of qubit precision and quantum noise; and compares quantum autocorrelation results with classical methods.
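As a rough illustration only, the sketch below mimics one pass of such a pipeline on a simulator using off-the-shelf Qiskit components: a windowed segment is amplitude-encoded into a quantum state, the textbook QFT yields measurement probabilities proportional to the power spectral density, and an inverse transform of that spectrum gives the autocorrelation. The segment length, window choice, and the final classical inverse step are assumptions for illustration, not the inventors' circuit design.

```python
# Minimal sketch, assuming Qiskit is available; all parameters are illustrative.
import numpy as np
from qiskit.circuit.library import QFT
from qiskit.quantum_info import Statevector

n_qubits = 3
N = 2 ** n_qubits                                      # 8 samples per segment
t = np.arange(N)
segment = np.hanning(N) * np.cos(2 * np.pi * t / 4)    # windowed test segment

norm = np.linalg.norm(segment)
state = Statevector(segment / norm)                    # amplitude encoding (unit norm required)
state = state.evolve(QFT(n_qubits))                    # Quantum Fourier Transform

psd = state.probabilities() * norm ** 2 * N            # rescale probabilities to |X[k]|^2
autocorr = np.fft.ifft(psd).real                       # inverse transform of the PSD -> autocorrelation

classical = np.fft.ifft(np.abs(np.fft.fft(segment)) ** 2).real
print(np.allclose(autocorr, classical))                # True: matches the classical result
```

On real hardware the power spectrum would be estimated from repeated measurements rather than read from a statevector, which is where qubit precision and quantum noise (examined in the disclosure) come into play.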

Potential Applications:

  • Signal Processing (e.g., denoising, system identification, correlograms, linear prediction)
  • Telecommunications/Communications (e.g., radar)
  • Machine Learning and AI

Benefits and Advantages:

  1. Reduced Computational Complexity
  2. Enhanced Efficiency
  3. Improved Accuracy
  4. Scalability