Edge AI and Vision Insights

A NEWSLETTER FROM THE EDGE AI AND VISION ALLIANCE
Late February 2024 | VOL. 14, NO. 4

LETTER FROM THE EDITOR

Dear Colleague,

We’re excited to announce the initial sessions, complete with abstracts and speaker bios, for the 2024 Embedded Vision Summit program. A solid start on what will end up being 100+ sessions! Head to embeddedvisionsummit.com now for more information, and take 25% off your registration by using code SUMMIT-SEB-NL. This is the best price you’ll be able to get on the Summit, taking place May 21-23 in Santa Clara, California. Register now and share the news with your colleagues!

Brian Dipert
Editor-in-Chief, Edge AI and Vision Alliance

SMART HOMES AND CITIES

Reinventing Smart Cities with Computer Vision

Hayden AI has developed the first AI-powered data platform for smart and safe city applications such as traffic enforcement, parking and asset management. In this talk, Vaibhav Ghadiok, Co-founder and CTO of Hayden AI, presents his company’s privacy-preserving vision-based mobile solution, which it deploys in a city’s existing transportation fleet to collect real-time data. Hayden AI’s mobile intelligent camera detects objects in the environment, such as vehicles, pedestrians and license plates, while the company’s location engine fuses data from multiple sensors (such as cameras, GNSS and IMUs) to locate these objects in 3D. Ghadiok shares some of the key business and technical challenges his company has faced in developing and deploying this solution with partners like New York City’s Metropolitan Transportation Authority, and how it has overcome these challenges. He also explains how these cameras, deployed across a fleet of vehicles, can collaboratively upload data to build a real-time 3D re-creation of a city, providing insights that can be used to improve safety and efficiency.
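
To make the sensor fusion step more concrete, below is a minimal, hypothetical sketch in Python of one building block of such a location engine: projecting a single camera detection onto the map from the vehicle’s GNSS fix and IMU-derived heading, given an estimated range and bearing from the image. This illustrates the general technique only, not Hayden AI’s implementation; all names and values are made up.

```python
# Hypothetical sketch: geolocate a camera detection using the vehicle's GNSS
# position and IMU heading plus an estimated range/bearing from the image.
# Not Hayden AI's implementation; names and values are illustrative only.
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius, adequate at short ranges

def geolocate_detection(vehicle_lat, vehicle_lon, heading_deg,
                        bearing_in_frame_deg, range_m):
    """Project a detection (estimated range, bearing relative to the camera
    axis) from the vehicle's GNSS position onto the Earth's surface."""
    # Absolute bearing = vehicle heading (from the IMU/compass) + bearing of
    # the detection within the camera frame.
    bearing = math.radians(heading_deg + bearing_in_frame_deg)
    lat1, lon1 = math.radians(vehicle_lat), math.radians(vehicle_lon)
    ang = range_m / EARTH_RADIUS_M  # angular distance along the great circle

    lat2 = math.asin(math.sin(lat1) * math.cos(ang) +
                     math.cos(lat1) * math.sin(ang) * math.cos(bearing))
    lon2 = lon1 + math.atan2(math.sin(bearing) * math.sin(ang) * math.cos(lat1),
                             math.cos(ang) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lon2)

# Example: a bus heading due north sees an obstruction 15 m ahead and slightly
# to the right of the camera's optical axis.
print(geolocate_detection(37.3541, -121.9552, heading_deg=0.0,
                          bearing_in_frame_deg=12.0, range_m=15.0))
```

In a real system the range and bearing would come from the detector and camera calibration, and the position estimate would be filtered over time across many frames and vehicles rather than computed from a single observation.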

Advanced Presence Sensing: What It Means for the Smart Home

Eventually, homes will become highly autonomous—powered by ubiquitous, connected, intelligent devices (sometimes referred to as ambient computing)—but this remains a distant vision. In the meantime, advanced presence sensing innovations from component manufacturers, software developers and smart home device companies are enabling meaningful improvements in convenience, efficiency and safety. Omdia expects advanced presence detection to be adopted rapidly in the coming years, and to become a key factor in how consumers select devices and service providers. In this presentation, Jack Narcotta, Principal Analyst for the Smart Home at Omdia, provides an overview of key enabling technologies, market appetite, consumer expectations and long-term outlook for advanced presence sensing in the home.

STARTUP INSIGHTS

Navigating the Evolving Venture Capital Landscape for Edge AI Start-ups

The venture capital environment has undergone a huge shift in recent years. In this presentation, Todd Poole, Director of Venture Capital Investments for HPE Pathfinder Ventures, explains how the VC landscape has changed and the implications for start-ups. He examines the impacts on valuations, VCs’ willingness to fund, appetite for risk and leverage in deal structuring and terms negotiation. Poole explores how AI and ML hardware and software companies have been affected differently from the broader start-up ecosystem and considers the distinct challenges faced by start-ups in the edge AI sector. He offers guidance for founders, soon-to-be-founders and start-up employees on how to navigate this dynamic environment and strategize effectively in response to changing market conditions.

War Stories from the Entrepreneurial Front Lines

You have a killer idea that will change the world. You’ve thought through product-market fit and differentiation. You have seed funding and a world-beating team. Best of all, you’ve caught the attention of major players in your industry. You’ve reached peak “start-up”—that point of limitless possibility—when you go to bed with the same level of energy and enthusiasm you had when you woke. And then the first proof of concept starts… In this talk, Tim Hartley, Vice President of Product for SeeChange Technologies, lays out some of the pitfalls that await those building the next big thing. Using real examples, he shares some of the dos and don’ts, particularly when dealing with that big potential first customer. Hartley discusses the importance of end-to-end design, ensuring your product solves real-world problems. He explores how far the big companies will tell you to jump—and then jump again—for free. And, most importantly, how to build long-term partnerships with major corporations without relying on over-promising sales pitches.

UPCOMING INDUSTRY EVENTS

Accelerate Edge AI Development With NVIDIA Metropolis Microservices For Jetson - NVIDIA Webinar: March 5, 2024, 8:00 am PT

Build vs Buy: Navigating Optical Image Sensor Module Complexities - FRAMOS Webinar: March 7, 2024, 9:00 am PT

Embedded Vision Summit: May 21-23, 2024, Santa Clara, California

More Events

FEATURED NEWS

AMD Unveils the Embedded+ Architecture, Combining Embedded Processors with Adaptive SoCs to Accelerate Time-to-market for Edge AI Applications

Ambarella Expands Its Autonomous Driving AI Domain Controller Family with Two New Members

NVIDIA's RTX 2000 Ada Generation GPU Brings Performance and Versatility for the Next Era of AI-accelerated Development

eYs3D's New Edge AI Chips Transform Computer Vision for Robotics

Intel Delivers New High-level Compute Solutions in Mobile, Desktop and at the Edge

More News

EDGE AI AND VISION PRODUCT OF THE YEAR WINNER SHOWCASE

Qualcomm Cognitive ISP (Best Camera or Sensor)

Qualcomm’s Cognitive ISP is the 2023 Edge AI and Vision Product of the Year Award winner in the Cameras and Sensors category. The Cognitive ISP (within the Snapdragon 8 Gen 2 Mobile Platform) is the only ISP for smartphones that can apply the AI photo-editing technique called “Semantic Segmentation” in real-time. Semantic Segmentation is like “Photoshop layers,” but handled completely within the ISP. It will turn great photos into spectacular photos. Since it’s real-time, it’s running while you’re capturing photos and videos – or even before. You can see objects in the viewfinder being enhanced as you’re getting ready to shoot. A real-time Segmentation Filter is groundbreaking. This means the camera is truly contextually aware of what it’s seeing. Qualcomm achieved this by building a physical bridge between the ISP and the DSP – it’s called “Hexagon Direct Link”. The DSP runs Semantic Segmentation neural networks in real-time. Thanks to Hexagon Direct Link, the DSP and the ISP can operate simultaneously. The ISP captures images and the DSP assigns context to every image in real-time.
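
For readers who want a feel for what a per-pixel segmentation “layer” looks like in code, here is a rough sketch using an off-the-shelf PyTorch model on a host CPU/GPU. It illustrates the general technique only; it does not run on Qualcomm’s ISP or Hexagon DSP, and the model choice and image file are stand-ins.

```python
# Rough sketch of per-pixel semantic segmentation "layers" with an
# off-the-shelf model. Illustrative only; not Qualcomm's ISP/DSP pipeline.
import torch
from PIL import Image
from torchvision import transforms
from torchvision.models.segmentation import (
    deeplabv3_mobilenet_v3_large, DeepLabV3_MobileNet_V3_Large_Weights)

weights = DeepLabV3_MobileNet_V3_Large_Weights.DEFAULT
model = deeplabv3_mobilenet_v3_large(weights=weights).eval()

preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],   # ImageNet statistics
                         std=[0.229, 0.224, 0.225]),
])

frame = Image.open("frame.jpg").convert("RGB")  # stand-in for a camera frame
batch = preprocess(frame).unsqueeze(0)

with torch.no_grad():
    logits = model(batch)["out"]        # shape [1, num_classes, H, W]
class_map = logits.argmax(dim=1)[0]     # per-pixel class IDs

# Boolean "layers" that a camera pipeline could then tune independently,
# analogous to the real-time segmentation filter described above.
person_mask = (class_map == 15)         # class 15 is "person" in the VOC labels
print("person pixels:", int(person_mask.sum()))
```

The point of the Cognitive ISP design described above is that this kind of per-pixel classification happens on-device, frame by frame, in parallel with image capture, rather than as a separate post-processing step like this sketch.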

Please see here for more information on Qualcomm’s Cognitive ISP. The Edge AI and Vision Product of the Year Awards celebrate the innovation of the industry's leading companies that are developing and enabling the next generation of edge AI and computer vision products. Winning a Product of the Year award recognizes a company's leadership in edge AI and computer vision as evaluated by independent industry experts.
