Science Fiction to Forklifts in < 2 Years
When I first joined Voyant full time, Steve and Chris had so many ideas on where to take their incredible technology. We narrowed the opportunity by focusing on markets where Voyant could quickly introduce game-changing solutions, and where we could reach customers with immediate needs and fast adoption cycles.
Through this narrow lens (pun intended), we developed our first product, Blue. Blue uses a “hybrid” beam steering approach with on-chip fast-axis beam steering and a simple, inexpensive moving mirror to scan a one-dimensional array of optical antennas across a scene.
Our first Blue design is a smashing success, demonstrating that coherent imaging LiDAR out to hundreds of meters from an integrated photonic chip is not science fiction but an industrial solution available now. Our earliest embedded sensors using the Blue LiDAR chip have been bouncing around Manhattan and the Las Vegas Strip, taking laps on race courses, mounted on overpasses, and riding on mobile robots. We are signing up customers for early samples, and our production schedule will intercept our customers’ next designs.
Compared to software, nothing is fast in the world of hardware, but software alone does not improve the world. Somebody must make the hardware that runs the software and, in Voyant’s case, digitizes the real world for software to operate on.
Semiconductor development makes other hardware development look fast. You can think of designing optical chips as a mechanical design exercise using 3D printing: design, simulate, print, then test. While I can get my custom snowboard binding mounts printed in a day, that “print” step for semiconductors takes a season, and “test” requires examining objects the size of viruses.
In semiconductor development, you can’t miss an opportunity to explore the next design. Iteration takes too long, so along with our first tape-out of Blue we included an implementation of an idea we could not let go of: a two-dimensional array of monostatic optical antennas. This design requires many, many more components and, critically, far more layers of optical switches than Blue.
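To get an intuition for why antenna count drives switch-layer count, here is a minimal sketch. It assumes a simple binary tree of 1x2 switches, which is only one possible topology and not a description of Voyant’s actual switch network; the point is just that the number of layers grows with the logarithm of the number of antennas, and every added layer compounds insertion loss, which is why near-lossless switches matter.

import math

def switch_tree_depth(num_antennas: int) -> int:
    # Layers of 1x2 switches needed to address num_antennas outputs
    # in a binary-tree network (illustrative topology, not Voyant's design).
    return math.ceil(math.log2(num_antennas))

for n in (128, 512, 5000):
    print(f"{n:>5} antennas -> {switch_tree_depth(n)} switch layers")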
Complete on-chip beam steering is the holy grail of active sensing, paving the way for coherent imaging to go far beyond mobility, smart infrastructure, and industrial robotics and penetrate wearable augmented reality and consumer electronics. Compact, low-cost chip-based LiDAR will become a truly ubiquitous technology, like the CMOS camera sensors manufactured by the billions each year.
A lens and a simple electronics board are all you need to turn a LiDAR chip into a complete LiDAR system, which results in an absurdly simple BOM. A LiDAR system can be simpler than a CMOS camera, because there is no need to focus a lens or control an aperture.
The disadvantage of this approach is that it's freaking impossible.
Designing optical switches with enough performance to switch between hundreds of optical antennas in an on-chip LiDAR configuration was thought impossible until we released Blue. We got that working on the first attempt.
Creating switches good enough to switch light between thousands of optical antennas? We went for it; I did not get my hopes up.
Our latest chips arrived last summer. The 2D chips looked OK, but as the optical antennas are smaller than blood cells, it's hard to know by looking. We put them on a shelf. No time to test them. Deploying Blue sensors to customers was our priority… until a few weeks ago, when the photonics team had the bandwidth to mount them on some test boards.
Our two-dimensional beam steering chips work, and better than expected! The results are utterly spectacular: hundreds of digitally controllable optical switches, so lossless as to enable coherent LiDAR from a chip’s surface. Now that LiDAR is on-chip, it only gets better with every iteration.
Here is a demonstration of the dynamic scanning these devices are capable of. 2D on-chip beam steering offers incredible range and velocity precision, but it also lets an application sense in almost any arbitrary scan pattern. You could raster scan the same pattern all the time, but isn't this demonstration so much more fun?
The video shows one possible scan pattern, which spells out “VOYANT” in multiple ways, as viewed a few meters from the sensor with an IR camera.
We use a small fraction of the full chip, a 4 × 80 antenna sub-array. Our first-generation production version has just over 5000 antennas. Our API supports almost any scan pattern with a 30 Hz update rate.
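To make “almost any scan pattern” concrete, here is a minimal sketch of what driving a pattern at a fixed update rate could look like. The real Voyant API is not shown in this post, so every name below (ScanPoint, set_pattern, the FakeSensor stand-in) is hypothetical and only illustrates the idea of pushing a new list of steering angles each frame.

import math
import time
from dataclasses import dataclass

@dataclass
class ScanPoint:
    azimuth_deg: float    # fast-axis steering angle
    elevation_deg: float  # slow-axis steering angle

def lissajous_pattern(n_points: int = 320) -> list[ScanPoint]:
    # A non-raster pattern: a Lissajous figure spanning roughly +/-10 degrees.
    return [
        ScanPoint(10.0 * math.sin(3 * 2 * math.pi * i / n_points),
                  10.0 * math.sin(2 * 2 * math.pi * i / n_points))
        for i in range(n_points)
    ]

class FakeSensor:
    # Stand-in for a real sensor handle so the sketch runs end to end.
    def set_pattern(self, points: list[ScanPoint]) -> None:
        print(f"frame with {len(points)} points")

def run(sensor, seconds: float = 0.2, update_hz: float = 30.0) -> None:
    # Push a (possibly different) pattern to the sensor every frame.
    for _ in range(int(seconds * update_hz)):
        sensor.set_pattern(lissajous_pattern())
        time.sleep(1.0 / update_hz)

run(FakeSensor())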
The impact of these products on perception systems is huge. Applications can focus their scanning where they need it. A perception system could decide to track individual objects at higher rates or resolutions while scanning the background in lo-res. Perception systems can push a lot of functionality to our edge sensors by asking for hi-res data only for moving objects while maintaining a background scan to identify additional moving objects, achieving better precision with faster response times compared to a full raster scan, frame by frame…
…Or you can stick with an upper-left-to-lower-right raster scan. But your application can change its mind at 30 Hz.
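As a rough illustration of that foveated approach (the region sizes, point counts, and function names below are made up for the example, not taken from any real perception stack), a frame could combine a coarse background grid with a dense patch over each tracked object:

from dataclasses import dataclass

@dataclass
class Region:
    az_min: float
    az_max: float
    el_min: float
    el_max: float

def grid(region: Region, n_az: int, n_el: int) -> list[tuple[float, float]]:
    # Uniform grid of (azimuth, elevation) points over a region.
    pts = []
    for i in range(n_az):
        for j in range(n_el):
            az = region.az_min + (region.az_max - region.az_min) * i / max(n_az - 1, 1)
            el = region.el_min + (region.el_max - region.el_min) * j / max(n_el - 1, 1)
            pts.append((az, el))
    return pts

def build_frame(field_of_view: Region, targets: list[Region]) -> list[tuple[float, float]]:
    # Coarse background scan plus a hi-res patch over each tracked target.
    pattern = grid(field_of_view, n_az=40, n_el=8)   # lo-res background
    for t in targets:
        pattern += grid(t, n_az=20, n_el=20)         # hi-res foveation
    return pattern

fov = Region(-15, 15, -5, 5)
tracked = [Region(2, 5, -1, 1)]   # e.g., one moving object found in the last frame
print(len(build_frame(fov, tracked)), "points in this frame")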
What will you use this for?