An SBIR Phase II contract was awarded to Quantum Interface in August 2022 for $539,200 USD by the U.S. Department of Defense and the United States Air Force.
Quantum Interface LLC - Quantum Interface (QI) has invented the first new human interface that can predict user intent in real time and respond simultaneously. This capability is made possible by QI's patented user interface environment and the capabilities of our Quantum User-Focus Point (QPoint). The Quantum Interactive Virtual Experience (QIVX) environment is the first new user interface environment in over 50 years. It can monitor any user action (e.g., mouse, hands, eyes, head, face, body), analyze the user's intent, and preemptively provide contextual options. QI's vector analysis of the user's motions can be applied in many creative and advanced ways: in addition to determining user intent in process-flow situations (menus, option selection, data analysis, etc.), the same data can be used to gauge the user's confidence during training and evaluation, while they are interacting (a simplified sketch of vector-based intent estimation appears at the end of this abstract).

We successfully integrated this technology into a product (360IVX) built for the USAF in a recently completed 19.1 Phase II. 360IVX takes 360 videos and provides a platform for instructors to create their own lessons from those videos in minutes and export them to students across PC, mobile, and VR devices (e.g., Oculus), ready for immediate use. The product is currently licensed by the 355th TRS and is under discussion for use by the 149th FW.

The 19th AF/PTN uses Prepar3D (P3D) for part of basic flight instruction training. This simulation environment is effective, but it does not provide an easy way to convert those simulation experiences and environments into lessons. Just as QI took 360 videos and provided a completely new immersive-academics tool for making lessons from those videos by adding media, highlighting, audio, text, images, quizzes, and other videos, the 19th AF desires a new immersive-sims product for P3D that will allow it to convert P3D files into lessons with the same capabilities as the 360IVX product, but with increased functionality and features.

This will require building a proprietary virtual camera and recorder for P3D using its open-source and binary files, transcoding the output to a cloud-based app, and providing all the new tools desired by the 19th AF/PTN to augment those files and make lessons consumable on PC, mobile, and VR environments (a high-level sketch of such a capture-and-authoring pipeline is included below). This includes a secure platform that provides course/class/student/instructor capabilities, reporting metrics of user outcomes and confidence levels (unique to QI), the ability to create quizzes in the experience, VR log-in capabilities, and reusable modules that can be moved from lesson to lesson, increasing the ease of lesson creation and consumption. The system needs to be web-based but also have offline capabilities, will integrate QI's patented gaze-based interactions and smart-object response systems using QI AI/physics engines and analytics to provide the best user information on cognition and retention, and must be easy to use.
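As a rough illustration of the vector-analysis idea mentioned above, the sketch below scores a set of candidate UI targets by how directly the pointer or gaze motion is heading toward each one. The function names, 2-D geometry, and scoring weights are assumptions chosen for clarity; they do not represent QI's patented QPoint algorithms.

```python
"""Minimal sketch of vector-based intent prediction (illustrative only).

Assumes a 2-D pointer/gaze trail and a fixed set of candidate UI targets.
"""
import math

def velocity(trail):
    """Approximate the current motion vector from the last two samples."""
    (x0, y0), (x1, y1) = trail[-2], trail[-1]
    return (x1 - x0, y1 - y0)

def score_target(position, motion, target):
    """Score a candidate target by how directly the motion points at it."""
    to_target = (target[0] - position[0], target[1] - position[1])
    dist = math.hypot(*to_target)
    speed = math.hypot(*motion)
    if dist == 0 or speed == 0:
        return 0.0
    # Cosine of the angle between the motion vector and the direction to the
    # target, weighted so nearer targets score slightly higher.
    cos = (motion[0] * to_target[0] + motion[1] * to_target[1]) / (speed * dist)
    return max(cos, 0.0) / (1.0 + dist)

def predict_intent(trail, targets):
    """Return the target the user is most likely moving toward."""
    pos, motion = trail[-1], velocity(trail)
    return max(targets, key=lambda name: score_target(pos, motion, targets[name]))

# Example: a pointer moving up and to the right, toward the "Open" menu item.
targets = {"Open": (120.0, 80.0), "Save": (40.0, 200.0), "Quit": (300.0, 300.0)}
trail = [(10.0, 10.0), (30.0, 25.0), (52.0, 41.0)]
print(predict_intent(trail, targets))  # -> "Open" for this trail
```

With a prediction like this available on every input sample, a UI can surface contextual options for the most likely target before the user actually reaches it, which is the kind of preemptive response the abstract describes.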
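To make the proposed capture-to-lesson workflow concrete, the following sketch outlines the capture, transcode, author, and publish stages described above. Every name, file extension, and data field here is a hypothetical stand-in; none are real Prepar3D APIs or QI components.

```python
"""High-level sketch of the proposed P3D capture-to-lesson pipeline (illustrative only)."""
from dataclasses import dataclass, field

@dataclass
class Lesson:
    title: str
    recording: str                                  # transcoded P3D capture
    overlays: list = field(default_factory=list)    # text, images, audio, highlights
    quizzes: list = field(default_factory=list)
    export_targets: tuple = ("pc", "mobile", "vr")

def record_p3d_session(scenario: str) -> str:
    """Stand-in for the proprietary virtual camera/recorder running inside P3D."""
    return scenario + ".raw"                        # pretend a raw capture was written

def transcode_for_cloud(raw_capture: str) -> str:
    """Stand-in for transcoding the raw capture into a web/VR-friendly format."""
    return raw_capture.replace(".raw", ".stream")

def publish(lesson: Lesson) -> None:
    """Stand-in for uploading the lesson to the cloud course/class/student platform."""
    print(f"published '{lesson.title}' ({lesson.recording}) for {lesson.export_targets}")

def build_lesson(scenario: str, title: str) -> Lesson:
    """End-to-end flow: capture -> transcode -> author (overlays/quizzes) -> publish."""
    recording = transcode_for_cloud(record_p3d_session(scenario))
    lesson = Lesson(title=title, recording=recording)
    lesson.overlays.append({"type": "text", "content": "Instructor note goes here."})
    lesson.quizzes.append({"question": "Sample quiz question", "choices": []})
    publish(lesson)
    return lesson

build_lesson("pattern_work.scenario", "Traffic Pattern Basics")
```

The reusable-module and reporting features described in the abstract would sit on top of a flow like this, with lesson components stored and tracked by the cloud platform rather than in the simulator itself.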