EDGE APPLICATIONS WITH EVENT-BASED VISION
Event-based vision is a neuromorphic approach that mimics how the human eye and brain work. With this sensing method, each pixel in the sensor embeds its own intelligence, activating only when it senses motion. The result is more efficient operation: less data, lower power and lower latency, because there are no frames and no redundant data, as there are with traditional frame-based cameras.
Event-based vision is a new paradigm in the acquisition and processing of visual information. Its highly efficient acquisition of sparse data and its robustness to uncontrolled lighting conditions make event-based vision ideal for computer vision applications in industrial, safety, IoT, AR/VR and automotive settings.
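To make the data model concrete, here is a minimal sketch in plain Python/NumPy (not the Prophesee SDK; the record layout and sample values are illustrative assumptions) of what an event stream looks like and why it is so much lighter than frames:

```python
import numpy as np

# Hypothetical event stream: one record per pixel activation.
#   t    = timestamp in microseconds
#   x, y = pixel coordinates (the GenX320 is 320x320)
#   p    = +1 if the pixel got brighter, -1 if it got darker
events = np.array(
    [(12, 101, 57, 1), (15, 102, 57, 1), (40, 210, 18, -1)],
    dtype=[("t", "u8"), ("x", "u2"), ("y", "u2"), ("p", "i1")],
)

# Only pixels that saw a change produce data, so a mostly static
# scene yields a near-empty stream, while a frame camera keeps
# shipping full images regardless of what changed.
frame_pixels = 320 * 320  # 102,400 values per frame, every frame
print(f"{len(events)} events vs {frame_pixels} pixels per frame")
```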
EVENT-BASED VISION IN CONSUMER APPLICATIONS
The release of the new Prophesee GenX320 sensor marks the entrance of event-based vision into low-power, Edge vision-enabled devices. It directly addresses the power (microwatt range), latency (microsecond time resolution) and dynamic range (>120dB) requirements of Edge devices, which are often battery-powered, must operate autonomously in challenging lighting conditions, and must process and analyze data locally, on-device. Operating in the microwatt power range and delivering the time-resolution equivalent of 10,000 images per second, the GenX320 sensor improves the integrability and usability of event-based vision in embedded systems, in applications such as:
EYE TRACKING
Unlock next-generation eye-tracking capabilities with the ultra-low power and high refresh rate of Metavision® sensors. Reach 1ms sampling times for ultra-smooth eye-position tracking while optimizing system autonomy and thermal performance; a toy sketch of the sampling idea follows the figures below. Video courtesy of Zinn Labs
20mW for the entire gaze-tracking system
1kHz or higher eye-position tracking rate
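As a rough illustration of how millisecond sampling falls out of the event stream (plain NumPy; the field names, synthetic data and centroid shortcut are assumptions, since a real gaze pipeline fits the pupil rather than averaging events), the sketch below slices events into 1ms windows and emits one position per window, i.e. a 1kHz track:

```python
import numpy as np

def track_positions(events, window_us=1_000):
    """Toy 1kHz tracker: one centroid of event activity per 1ms slice.
    A real gaze pipeline would fit the pupil contour; the raw centroid
    here only illustrates how the sampling rate is obtained."""
    samples = []
    t0, t1 = int(events["t"].min()), int(events["t"].max())
    for start in range(t0, t1, window_us):
        sl = events[(events["t"] >= start) & (events["t"] < start + window_us)]
        if len(sl):
            samples.append((start, sl["x"].mean(), sl["y"].mean()))
    return samples

# Synthetic demo: 5ms of events drifting right at ~10 px/ms
rng = np.random.default_rng(0)
t = np.sort(rng.integers(0, 5_000, 400))
ev = np.zeros(400, dtype=[("t", "u8"), ("x", "f4"), ("y", "f4")])
ev["t"], ev["x"], ev["y"] = t, 160 + t * 0.01, 160 + rng.normal(0, 2, 400)
print(track_positions(ev))  # ~5 samples: one eye position per millisecond
```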
GESTURE RECOGNITION
Achieve highly robust, smooth gesture recognition and tracking thanks to Metavision® sensors' high dynamic range (>120dB), low-light cutoff (0.05 lux), high power efficiency (down to the μW range) and low latency. Video courtesy of Ultraleap
>120dB dynamic range
Power consumption down to 36μW at the sensor level
OBJECT DETECTION & TRACKING
Track moving objects in the field of view. Leverage the low data rate and sparse information provided by event-based sensors to track objects with low compute power; a simplified sketch follows the points below. Video courtesy of Restar
Continuous tracking in time: no more “blind spots” between frame acquisitions
Native foreground segmentation: analyze only motion, ignore the static background
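A minimal sketch of why foreground segmentation comes for free (plain NumPy/SciPy; the field names, 320x320 resolution and blob-labeling shortcut are illustrative assumptions, not Prophesee's pipeline): histogram a short slice of events and label the connected regions; anything static simply never shows up.

```python
import numpy as np
from scipy import ndimage

def detect_moving_blobs(events, width=320, height=320, min_events=20):
    """Toy detector: only motion generates events, so a 2D histogram
    of a short event slice already is a foreground mask; connected
    regions of active pixels are moving-object candidates."""
    counts = np.zeros((height, width), dtype=np.int32)
    np.add.at(counts, (events["y"], events["x"]), 1)  # events per pixel
    labels, n = ndimage.label(counts > 0)             # group active pixels
    blobs = []
    for i in range(1, n + 1):
        mask = labels == i
        if counts[mask].sum() >= min_events:          # drop noise specks
            ys, xs = np.nonzero(mask)
            blobs.append((float(xs.mean()), float(ys.mean())))
    return blobs  # one (x, y) centroid per moving object
```

Tracking then reduces to associating these centroids across successive event slices.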
FALL DETECTION
Detect and classify activities in real time while respecting the subject's privacy at the sensor level. Bring more intelligence to the edge and trigger alerts only on key events, such as a person falling in a hospital room, while generating 10-1000x less data and benefiting from high robustness to lighting conditions (>120dB dynamic range, 0.05 lux low-light cutoff); a crude triggering sketch follows the points below. Video courtesy of YunX
Privacy by design: Metavision® sensors do not capture images
AI-enabled: train your models on lighter datasets thanks to events' invariance to background and color
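As a crude illustration of that edge-triggering idea (the thresholds and counts below are invented; a deployed system would run a trained classifier on event features rather than a raw rate test), note that a quiet room produces almost no events, so a spike in the event rate is itself a cheap wake-up signal:

```python
def activity_trigger(counts_per_window, baseline=50, spike_factor=8):
    """Crude wake-up gate: flag only windows whose event count spikes
    far above the ambient baseline, leaving the heavier activity
    classifier (fall vs. non-fall) asleep the rest of the time."""
    for i, count in enumerate(counts_per_window):
        if count > baseline * spike_factor:
            yield i

# A quiet room, then a sudden burst of motion in window 3:
print(list(activity_trigger([12, 9, 15, 950, 14])))  # -> [3]
```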
ACTIVE MARKERS
Achieve high-speed LED frequency detection in the tens of kHz with high tracking precision. Thanks to live frequency analysis, natively filter out parasitic flickering light for optimal tracking robustness; a simplified sketch follows the specs below.
>10kHz high-speed LED frequency detection
Native parasitic-frequency filtering for optimal tracking robustness
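The sketch below illustrates the frequency-filtering idea in plain NumPy (the band limits and the median-interval estimator are assumptions for illustration, not Prophesee's implementation): estimate a blink frequency from the inter-event intervals at a pixel, then keep only pixels blinking in the expected marker band, which parasitic mains flicker at 50-120Hz can never enter.

```python
import numpy as np

MARKER_BAND_HZ = (10_000.0, 40_000.0)  # assumed active-marker LED band

def blink_frequency_hz(timestamps_us):
    """Estimate a pixel's blink frequency from same-polarity event
    timestamps: the median inter-event interval is one period."""
    dt = np.diff(np.sort(np.asarray(timestamps_us, dtype=np.float64)))
    period = np.median(dt) if len(dt) else 0.0
    return 1e6 / period if period > 0 else 0.0

def is_marker(timestamps_us, band=MARKER_BAND_HZ):
    """Accept only frequencies inside the marker band; parasitic
    flicker from mains lighting (50-120Hz) falls far outside it."""
    lo, hi = band
    return lo <= blink_frequency_hz(timestamps_us) <= hi

print(is_marker(np.arange(0, 1_000, 50)))        # 20kHz LED -> True
print(is_marker(np.arange(0, 100_000, 10_000)))  # 100Hz flicker -> False
```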
INSIDE-OUT TRACKING
Unlock ultra-fast and smooth inside-out tracking running at >10kHz and benefit from high robustness to lighting conditions (>120dB dynamic range, 0.05 lux low-light cutoff).
>10kHz high-speed pose estimation
>120dB dynamic range
We will be at CES 2024, showcasing these applications and more. Book a meeting with us to learn more!
Inviting my connections to read this and see the capabilities of neuromorphic sensing: the future of computer vision is #eventbased and #ai powered, with #lowpower and #privacyfirst compute.