Patent attributes
A method and system for rendering an augmented reality scene on a mobile computing device tracks a real-world scene and/or a viewpoint of a user with one or more event-based vision sensors, and blends an augmented reality scene displayed on the mobile computing device based on the viewpoint of the user, a scene map of the real-world scene, and the tracking performed by the one or more event-based vision sensors. Event-based vision sensors offer many advantages, chiefly by intrinsically compressing the data stream and thus reducing the amount of data that a processing unit must process. Furthermore, the pixels of an event-based vision sensor continuously sense visual changes in the scene and report them with very low latency. This makes the event-based vision sensor an ideal sensor for always-on tasks such as visual tracking, smart sensor control, or data enhancement of secondary sensing modalities.
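The intrinsic compression described above can be illustrated with a minimal sketch. The snippet below is a hypothetical model (not from the patent): an event-based pixel fires only when its intensity change exceeds a contrast threshold, so a mostly static scene yields a handful of events instead of a full frame of pixel values. The `Event` record, the `frame_to_events` helper, and the threshold value are all illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Event:
    t: float      # timestamp of the change
    x: int        # pixel column
    y: int        # pixel row
    polarity: int # +1 brighter, -1 darker

def frame_to_events(prev, curr, t, threshold=0.2):
    """Emit events only for pixels whose intensity changed by more than
    the threshold -- a toy model of how an event-based sensor compresses
    the data stream relative to full-frame readout."""
    events = []
    for y, (prev_row, curr_row) in enumerate(zip(prev, curr)):
        for x, (p, c) in enumerate(zip(prev_row, curr_row)):
            if abs(c - p) > threshold:
                events.append(Event(t, x, y, 1 if c > p else -1))
    return events

# A static 4x4 scene in which a single pixel brightens: only that one
# change is reported, instead of all 16 pixel values of a full frame.
frame_a = [[0.0] * 4 for _ in range(4)]
frame_b = [[0.0] * 4 for _ in range(4)]
frame_b[1][2] = 1.0
events = frame_to_events(frame_a, frame_b, t=0.01)
print(len(events))  # → 1
```

Because unchanged pixels emit nothing, a downstream tracker only touches the few pixels that moved, which is what makes always-on visual tracking feasible on a power-constrained mobile device.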