Case ID: M24-048P

Published: 2024-11-15 13:12:30

Last Updated: 2024-11-15 13:12:30


Inventor(s)

Yatiraj Shetty
Troy McDaniel

Technology categories

Applied Technologies
Manufacturing/Construction/Mechanical
Physical Science
Wireless & Networking

Licensing Contacts

Physical Sciences Team

Design of Device to Infer Hand Interactions by Fusing Depth Sensing and Bioacoustics

Background

Machine learning is the capability of machines to learn from data and exhibit intelligent behavior. A subset of this field is gesture recognition, which enables virtual gesture control: users manipulate a virtual environment using real-world hand gestures.

Current gesture recognition technologies rely on a variety of sensors to capture depth and bioacoustic signals. For example, a time-of-flight (TOF) sensor measures the distance between itself and an object using light waves. Similarly, an acoustic vibration sensor measures the amplitude and frequency of vibrations from a given piece of equipment. Additionally, a vision processing unit (VPU) is an ultra-low-power processor that handles visual data. However, current technology is not well suited to analyzing hand movements: wrist-worn sensors require frequent calibration whenever the device is removed and re-worn. Moreover, current technology responds only to direct hand gestures held in specific locations.
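The time-of-flight principle mentioned above can be sketched in a few lines: the sensor emits a light pulse and converts the measured round-trip time into distance. This is an illustrative calculation only, not any vendor's firmware; the function name is a hypothetical label.

```python
# Illustrative sketch of the time-of-flight principle: distance is
# derived from the round trip of an emitted light pulse.

SPEED_OF_LIGHT_M_S = 299_792_458  # speed of light in vacuum, m/s

def tof_distance_m(round_trip_time_s: float) -> float:
    """One-way distance to the target. The pulse travels out and back,
    so the one-way distance is half the round-trip path."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2

# A round trip of about 6.67 nanoseconds corresponds to roughly 1 meter.
print(round(tof_distance_m(6.67e-9), 3))
```

In practice, TOF sensors resolve these picosecond-to-nanosecond intervals with specialized timing circuitry or phase-shift measurement rather than direct clocking.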

Invention Description

Researchers at Arizona State University have developed a compact, portable gesture tracking device that accurately captures hand interactions, including both discrete gestures and continuous hand actions, with the aim of providing intuitive gesture-based control for smart devices and cameras.

The device incorporates a 2D 8×8 time-of-flight (TOF) sensor and an acoustic vibration sensor, both controlled by a small microcontroller. Using principles of multimodal sensing, this technology combines depth information (from the TOF sensor) with bioacoustic information (from the vibration sensor) to generate accurate inferences for dynamic hand gestures, micro-gestures in midair or while grasping objects, finger or palm force/squeeze, and object detection.
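The fusion idea described above can be sketched as combining the two sensor streams into a single feature vector for a downstream gesture classifier. The function name, feature choices, and window size below are illustrative assumptions, not the inventors' actual pipeline.

```python
# Hedged sketch of multimodal feature fusion: flatten an 8x8 depth frame
# from the TOF sensor and append summary statistics of a bioacoustic
# vibration window, yielding one vector a gesture classifier could consume.
import numpy as np

def fuse_features(depth_frame: np.ndarray, vibration_window: np.ndarray) -> np.ndarray:
    assert depth_frame.shape == (8, 8), "TOF sensor yields an 8x8 depth grid"
    depth_feats = depth_frame.ravel().astype(float)        # 64 depth values
    vib = vibration_window.astype(float)
    vib_feats = np.array([
        vib.mean(),                                        # DC amplitude
        vib.std(),                                         # vibration energy
        np.abs(np.fft.rfft(vib))[1:4].sum(),               # low-frequency content
    ])
    return np.concatenate([depth_feats, vib_feats])        # 67-dim vector

rng = np.random.default_rng(0)
features = fuse_features(rng.random((8, 8)), rng.standard_normal(256))
print(features.shape)  # 64 depth + 3 vibration features
```

The design point is that neither modality alone suffices: depth localizes the hand while bioacoustics captures contact and force, so concatenated (or jointly learned) features let a classifier exploit both.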

Potential Applications:

  • Smart devices
  • Healthcare: analyzing patient movements
  • Automotive: monitoring driver attentiveness
  • Sign language translation

Benefits and Advantages:

  • Cheaper alternative to using camera sensors
  • Lower processing requirements than competing approaches
  • Mitigates privacy concerns: does not record objects or background imagery
  • No impact to efficacy from occlusion