Using Autoware as a Framework for OEM-backed Autonomous Vehicle Projects
In June 2021, we announced that Leo Drive had been elevated to premium member status in the Autoware Foundation. Since then, our commitment to open-source autonomous technology development has only grown stronger. Today, Leo Drive fully embraces the Autoware Foundation roadmap and builds its technology development and business model around that strategy.
To better explain why Leo Drive chose this path, we will try to convey our thinking by breaking some concepts about autonomous driving into parts while looking at this problem from both technology developers' and users' perspectives.
First and Foremost: What is Autoware?
The Autoware Foundation is a non-profit organization supporting open-source projects enabling self-driving mobility. In addition, the Autoware Foundation creates synergies between corporate development and academic research, enabling autonomous driving technology for everyone.
Today, Autoware is the largest open-source autonomous driving community, with 2300+ stars on GitHub and 500+ accounts on Slack. It is used by 100+ companies, runs on 30+ vehicles in 20+ countries, and Autoware courses are offered in 5 countries. Beyond that, Autoware has been used by OEMs for Mobility-as-a-Service (MaaS) development and has been qualified to run on driverless vehicles on public roads in Japan since 2017.
Autoware's track record is best seen on the official Autoware Foundation website.
Autoware.AI, Autoware.IO and Autoware.Auto
Autoware.AI was the original Autoware project, built on ROS1 and started in 2015 by Shinpei Kato at Nagoya University. It launched as a research and development platform for researchers, developers, and students interested in autonomous driving. Some early Autoware pilots ran on Autoware.AI.
Autoware.IO was conceived as an interface project to reliably extend Autoware's capabilities and reach with proprietary software and third-party libraries. Examples included device drivers for sensors, by-wire controllers for vehicles, and hardware-dependent programs for SoC boards.
And finally, Autoware.Auto was introduced as Autoware reimagined. The project is based on ROS2 and is managed by a dedicated open-source community manager. Autoware.Auto applies best-in-class software engineering practices and is built on a redesigned architecture.
With Autoware.Auto's arrival, the foundation shifted toward a use-case-based development approach, where use cases, or ODDs (Operational Design Domains), are determined by the foundation members' consensus and the foundation leadership's drive.
As can be seen from the Autoware roadmap, three ODDs were studied and developed by the community in 2020 and 2021: Autonomous Valet Parking (AVP), Cargo Delivery, and Racing. This blog post by Christian John from Tier IV explains the AVP demo's capabilities and touches on the other demos that run Autoware.Auto.
Autoware.Auto functioned as a framework for sophisticated autonomous driving pilots. Again, on the Autoware Foundation's website (in the Autoware Operational Design Domains segment), the various demos are explained in great detail.
Having covered the existing implementations, let's turn to Autoware's new paradigm and its revised strategic plan.
Autoware Core and Universe
Operating autonomous driving technology in real-world, public-road conditions is one of the most profound (and most exciting) challenges in the autonomous vehicle domain. Running an autonomous vehicle on public roads requires developers to provide stability and resilience, to enhance the system's capabilities on all fronts (perception, mapping/localization, planning, and control), and to confront the question of how the technology scales on the path to commercialization.
The newly proposed Autoware Core/Universe paradigm addresses this concern. Since the start of Autoware.Auto, codebase development has been bound to stringent requirements: best-in-class software engineering practices, 100% code coverage, intuitive coding style guidelines, and exhaustive testing. In that sense, Autoware Core will be an enhancement of the Autoware.Auto project.
Although Autoware Core enables users to develop their solutions on top of an end-to-end software framework, Autoware, as an open-source project, should not be restrictive. Instead, it should be available for all potential applications, including niche autonomous use cases, advanced R&D implementations, and university projects.
This is where Autoware Universe comes into play.
Autoware Universe uses Autoware Core as a foundation for functionality and message definitions, but it broadens Autoware's scope by letting contributors develop additional components on top of Autoware Core without being bound by the stringent development requirements. These components may take the form of ODD-based features or of tools and algorithms produced by advanced R&D studies. The relaxed development practices also serve as a preview of future ODD implementations; contributors can prototype and demonstrate niche ODDs in a streamlined fashion.
Autoware Core will be put to good use in the coming Bus ODD, to be developed during 2022. It will allow operation in public-road environments: navigating among multiple vehicles, bicycles, and pedestrians, and recognizing traffic signals and complex traffic situations.
Autoware's new approach is not limited to new development practices, however. Let's look at the alliance strategies, the drive to make Autoware more automotive-grade, and the indications of how Autoware intends to scale along the path to commercialization.
Go-To-Market Strategy and Alliances
Autoware's leadership team has achieved significant results in fostering collaboration with external parties, including other open-source consortia such as AVCC (the Autonomous Vehicle Computing Consortium), the MIH Consortium (and Open EV Alliance), and the eSync Alliance (a multi-company initiative for OTA updates and diagnostics). These newly established alliances will help Autoware contribute to and participate in new go-to-market paradigms and achieve widespread adoption.
Alongside the alliances, there are also firm efforts within the Autoware ecosystem to enable the path to commercialization. Yes, Autoware serves as an enabler that lowers the barriers to entry to autonomous vehicle development. But if it also serves as a facilitator for commercial solutions, the ecosystem will prosper, with OEMs and technology providers joining in. That's why the Autoware Foundation started the Open AD Kit initiative.
As described on the official web page, "The Autoware Open AD Kit provides a reference framework for SDV development of commercial AD solutions. Companies can integrate and promote their commercial solutions as part of the Autoware Open AD Kit, including cloud-native development pipelines and/or tools, proprietary AD microservices integrated into Autoware OSS, middleware and OS solutions, and heterogeneous compute platforms."
You can explore the Open AD Kit via the official website. This post by Daisuke Tanaka from Tier IV is also a good read.
I think now is an excellent time to dive into OEM engagements.
How to fast-track autonomy development for OEMs?
The autonomous vehicle industry boomed after Google's moonshot driverless vehicle initiative. Expectations were sky-high. Some prominent OEMs revealed plans to introduce privately owned vehicles with highly automated driving capabilities by 2017, 2018, and 2019. As we all know, it did not turn out the way it was imagined, and by 2019 the industry had a reality check: that year, Gartner placed autonomous vehicles in the trough of disillusionment in its annual Hype Cycle.
OEMs first tried to build their autonomous vehicle technology in-house, and that approach largely failed. They then quickly changed course, seeking external suppliers and pouring in billions of dollars of funding. Although industry giants such as Waymo, Cruise, Argo AI, and Aurora have yet to show major commercial success (that might be another blog post topic), this mentality stuck. So today, top-tier OEMs work with top-tier autonomous tech startups to carry them into the commercialization phase and the long-awaited revenue-generating position.
But what about the incumbents of the automotive industry? These incumbent vehicle manufacturers are doing good business, producing all sorts of land vehicles (passenger vehicles, buses, trucks, agricultural machinery, military vehicles) while adjusting their mentality to the changing mobility ecosystem. Yet these manufacturers increasingly need to adapt to the autonomous ecosystem, and that need is largely left unaddressed.
Leo Drive believes that the open-source community is a solution to these OEMs' needs.
We work with OEMs as a sparring partner, enabling their access to the world of autonomous vehicles. By using Autoware as a software framework, we profoundly lower the OEM's barrier to entry. Furthermore, we make joint efforts with our customers to bring them up to a satisfactory pace in autonomous technology development and achieve their immediate and long-term goals. Throughout this process, we provide a wide variety of consultancy and technical services within the scope of the project.
Autoware fosters collaboration between the OEM and the technology supplier, leveraging several unique advantages over the proprietary approach: a transparently shared codebase, a cost-conscious development methodology, and natural reach to a broad open-source community for support.
Here's how we do it.
Autonomy Essentials Kit
We've been working on a quick-install autonomy kit suitable for all passenger vehicles.
The Autonomy Essentials Kit serves as a reference system solution for Autoware-powered autonomous vehicle prototypes.
The product comes with a set of sensors (LiDARs, cameras, and positioning sensors) capable of exercising all pillars of autonomous vehicle development: perception, localization, planning, and control. All the sensors are purpose-built into the kit, eliminating the initial hurdles of autonomy hardware development, including sensor selection, calibration, time synchronization, and interference. A computing system of the customer's choosing (with Leo Drive's consultation) is also sourced, configured, and adapted for Autoware use, then offered as a bundle with the rooftop sensor suite.
If the platform already has drive-by-wire capabilities, fine. If it doesn't, we have several offerings to get the vehicle drive-by-wire ready, including contracting our engineering services through our sister company, Robeff (maybe more on Robeff in another post), to carry out the development and integration tasks needed to retrofit the platform with drive-by-wire capabilities.
The Autonomy Essentials Kit enables customers without prior autonomous vehicle development experience to kickstart their journey to autonomous vehicles.
We have a late-stage working prototype of the Autonomy Essentials Kit and are currently looking for early adopter partners and testers.
End-to-End Autonomous Vehicle Engineering
While the Autonomy Essentials Kit has the virtue of providing a quick testing option to new market entrants, we also work with OEMs that have more ambitious goals. If there is a partner with a concrete idea of building a fleet of autonomous vehicles (for whatever purpose: people transportation, goods transportation, delivery, etc.), Leo Drive is there for you.
This model constitutes a very firm engagement between the Leo Drive team and the prospective customer/partner. We provide an end-to-end service, taking a customer's platform from nothing to a full suite of autonomy components that demonstrate a particular application and can potentially serve its purpose in regular service.
Here's how we do it.
1. Concept and ODD Development
The first step to collaborating is understanding the customer's desires and requirements. We work with our customer's technical, business, and strategy teams to understand what they want to achieve in the autonomous vehicle realm.
Although the customer always has a target in mind for the autonomous vehicle development effort, it's important to demystify all the unknowns around autonomous concepts and applications, because vehicle automation spans everything from more conventional (though somewhat modernized) ADAS functions to L4 autonomous applications.
We first discuss and mutually agree on a high-level concept with the customer, grounded in their platform's realistic capabilities. Then we move on to the ODD (Operational Design Domain), listing all the limiting factors for the application, including road types, weather conditions, speed limits of the operation, and the vehicle's interaction with other traffic participants and road infrastructure.
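To make this concrete, here is a minimal sketch of how an ODD's limiting factors might be captured as structured data. The field names and values are purely illustrative assumptions, not an Autoware interface or a Leo Drive deliverable:

```python
# A hypothetical ODD description; names and values are illustrative only.
from dataclasses import dataclass, field


@dataclass
class OddSpec:
    """Operational Design Domain limits agreed on with the customer."""
    road_types: list = field(default_factory=lambda: ["urban_street"])
    weather: list = field(default_factory=lambda: ["clear", "light_rain"])
    max_speed_kmh: float = 30.0   # speed limit of the operation
    mixed_traffic: bool = True    # interaction with other traffic participants

    def allows(self, road_type: str, weather: str, speed_kmh: float) -> bool:
        """Check whether a candidate scenario falls inside the ODD."""
        return (road_type in self.road_types
                and weather in self.weather
                and speed_kmh <= self.max_speed_kmh)


if __name__ == "__main__":
    odd = OddSpec()
    print(odd.allows("urban_street", "clear", 25.0))  # True: inside the ODD
    print(odd.allows("highway", "clear", 90.0))       # False: outside the ODD
```

Capturing the ODD this explicitly makes the roles and responsibilities discussion that follows much easier to anchor.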
We finalize this stage by creating a roles and responsibilities document to anchor collaboration efforts. Roles and responsibilities will differ according to the customer's platform and willingness to contribute to the project.
2. Sensor and Vehicle Modeling
The first engineering step in a customer project is acquiring knowledge about the platform. Is it drive-by-wire compatible? Where should sensors and systems be placed? How does the platform behave under road conditions?
Initially, we request CAD models of the platform and import them into a 3D design workbench, where we iterate on different sensor and system positions. We then select and position all autonomy hardware, including LiDARs, cameras, positioning sensors, computing systems, power distribution systems, and data recording systems, appropriate to the platform's inherent properties and the application's requirements. Our first impulse is always to cover the full 360° around the vehicle with different sensor modalities to eliminate blind spots.
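As a rough illustration of that blind-spot reasoning (not one of our internal tools), the following sketch checks whether a set of sensors with assumed mounting yaws and horizontal FoVs covers all headings around the vehicle:

```python
# A toy blind-spot check: does a sensor suite cover every heading around the
# vehicle? Sensor yaws and FoVs below are made-up example values.
def coverage_gaps(sensors, step_deg=1):
    """sensors: list of (yaw_deg, hfov_deg); returns uncovered headings."""
    gaps = []
    for heading in range(0, 360, step_deg):
        covered = False
        for yaw, hfov in sensors:
            # smallest signed angle between this heading and the sensor axis
            diff = (heading - yaw + 180) % 360 - 180
            if abs(diff) <= hfov / 2.0:
                covered = True
                break
        if not covered:
            gaps.append(heading)
    return gaps


if __name__ == "__main__":
    cameras = [(0, 90), (120, 90), (240, 90)]   # three 90°-FoV cameras
    print("camera-only blind headings:", coverage_gaps(cameras))
    suite = cameras + [(0, 360)]                # a 360° LiDAR closes the gaps
    print("full suite blind headings:", coverage_gaps(suite) or "none")
```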
Besides the CAD model, we also model the vehicle's properties for the subsequent phases of development. Usually, a kinematic model of the vehicle is adequate (especially for a low-speed automation application). Furthermore, we work with third-party computational physics tools that can work hand in hand with various simulation environments compatible with Autoware, and with ROS2 in general, as a messaging foundation.
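For readers curious what such a kinematic model looks like in practice, here is a minimal kinematic bicycle model of the kind that is usually adequate at low speeds; the wheelbase and inputs are illustrative assumptions:

```python
# A minimal kinematic bicycle model: pose evolves from speed and steering
# angle alone, ignoring tire dynamics. Wheelbase and inputs are examples.
import math


def step(x, y, yaw, v, steer, wheelbase=2.7, dt=0.1):
    """Advance the vehicle pose (x, y, yaw) one time step."""
    x += v * math.cos(yaw) * dt
    y += v * math.sin(yaw) * dt
    yaw += v / wheelbase * math.tan(steer) * dt
    return x, y, yaw


if __name__ == "__main__":
    x = y = yaw = 0.0
    for _ in range(50):  # 5 s at 5 m/s with a constant 5° steering angle
        x, y, yaw = step(x, y, yaw, v=5.0, steer=math.radians(5))
    print(f"pose after 5 s: x={x:.1f} m, y={y:.1f} m, "
          f"yaw={math.degrees(yaw):.1f}°")
```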
3. Design and Manufacturing
The next step in the development process is to place and install the autonomy hardware onto the prototype platform. We usually follow a two-step design process.
Initially, we place the sensors and systems onto the platform in the simulation environment and reach a conceptual consensus that the placement will be adequate and meaningful for the desired application. At this stage, we compose an analysis report discussing the initial design's advantages, disadvantages, and implementation. Once consensus is reached, we move on to the prototyping phase.
For the prototyping phase, we design a simple mechanical installation frame that guarantees the agreed sensor positions on the vehicle (observing the required FoVs, possible sensor interference, etc.). Using this installation frame, we retrofit the platform and move on to the design validation phase, where we collect sensor data with the vehicle. Finally, we confirm that everything functions correctly and that data fidelity is ensured by carefully analyzing the data.
After validating the sensor architecture and sensor placement on the vehicle, our industrial design team takes responsibility for designing a sleek and seamless sensor suite that houses all the sensors and protects them against external factors.
4. Calibration & Time Synchronization
Sensor implementation for an autonomous vehicle prototype is no easy feat. Selecting sensors arbitrarily and mounting them on the platform without considering many factors (e.g., FoVs, vibration, interference between sensors) causes much harm, resulting in lost time and money.
There are many challenges to sensor implementation, and we resolve these challenges with real and virtual testing. First, we mount all sensors on the platform using the mechanical installation frame and perform all necessary pre-processing steps before using the sensory data on the whole system.
One of the most challenging preparatory steps is calibrating all the sensors mounted on the vehicle. This step is required because we want different sensors around the vehicle to observe the same scene simultaneously (with microsecond fidelity). That means an object detected by LiDAR, radar, camera, and other sensors must be represented in the same coordinate frame with pixel-correct projection.
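The geometry behind that statement can be sketched in a few lines: given a hypothetical LiDAR-to-camera extrinsic and a pinhole intrinsic matrix, a 3D LiDAR point maps to a pixel. All matrix values below are made up for illustration:

```python
# Toy camera-LiDAR projection. K, R and t are made-up example values; both
# frames are assumed to follow the camera convention (z forward, x right).
import numpy as np

K = np.array([[900.0,   0.0, 640.0],    # pinhole intrinsics: fx, fy, cx, cy
              [  0.0, 900.0, 360.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                           # extrinsic rotation, LiDAR -> camera
t = np.array([0.1, -0.2, 0.0])          # extrinsic translation in metres


def project(point_lidar):
    """Map a 3-D LiDAR point to pixel coordinates (u, v), or None if behind."""
    p_cam = R @ point_lidar + t         # express the point in the camera frame
    if p_cam[2] <= 0.0:
        return None                     # behind the image plane
    u, v, w = K @ p_cam
    return u / w, v / w


if __name__ == "__main__":
    print(project(np.array([1.0, 0.5, 10.0])))  # a point 10 m in front
```

Calibration is precisely the process of estimating R, t, and K accurately enough that this projection lands on the right pixels.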
After the preparatory tasks are completed successfully, we collect data on the road, analyze the data, and ensure all incoming data is healthy, gapless, and safely stored in the vehicle's data collection system.
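A toy version of such a health check might look like the following, flagging timestamp gaps larger than the expected sensor period (the rate and tolerance are illustrative assumptions):

```python
# A simple "gapless data" check: scan message timestamps (in seconds) and
# flag gaps larger than the expected period times a tolerance factor.
def find_gaps(stamps, expected_hz=10.0, tolerance=1.5):
    """Return (index, gap) pairs where consecutive stamps are too far apart."""
    max_dt = tolerance / expected_hz
    return [(i, b - a) for i, (a, b) in enumerate(zip(stamps, stamps[1:]))
            if b - a > max_dt]


if __name__ == "__main__":
    stamps = [0.0, 0.1, 0.2, 0.55, 0.65]  # a dropped frame around t = 0.2 s
    for i, gap in find_gaps(stamps):
        print(f"gap of {gap:.2f} s after sample {i}")
```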
5. Simulation and Fine-Tuning
The next step in the autonomous vehicle development process is to import the vehicle and its inherent characteristics into the simulation environment.
Testing the autonomous software on the road is an expensive venture. So rather than deploying the prototype vehicle on the road, the best practice is to simulate all software in a virtual environment to cut costs and drastically streamline the development and fine-tuning processes.
There are different options when it comes to simulation environments. Some companies use commercial solutions with proprietary simulation engines, while others prefer open-source simulators built on Unity or Unreal Engine. Proprietary solutions can be technically strong because the vendor customizes them to user needs. However, using proprietary simulation solutions usually requires ongoing back-and-forth between the vendor and the user, which often adds friction to using the solution at all.
Hence, Leo Drive prefers to use open-source simulation environments to alleviate any friction between the customer and the development team that could hinder the very objective of rapid prototyping and validation.
Our team starts the simulation design process by exploring the use case and the test field for the validation stages. If the situation allows, the Leo Drive team maps the entire test area and creates a high-fidelity digital twin of the environment to import into the simulation domain.
Then we move on to algorithm testing on the simulation platform, where we test, adapt, fine-tune, and further develop the components necessary to achieve the use case's required autonomous driving tasks.
How can an OEM work with Autoware?
Rather than deep diving into further technical details about autonomous driving algorithms, I want to take a step back to look at the broader picture.
Every autonomous vehicle architecture rests on four main pillars. If a company like Leo Drive works on all of them, it counts as a full-stack autonomous vehicle company. The pillars are named and briefly explained below:
1. Scene Understanding
An autonomous vehicle must replace the driver behind the steering wheel with sensors to understand what's happening around the vehicle (in near and far proximity). Scene understanding requirements vary tremendously with the use case. For example, a highway scenario at 70 km/h requires a different perception pipeline than an urban delivery scenario at 30 km/h.
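A back-of-the-envelope calculation shows why: the minimum useful detection range follows from speed, reaction time, and braking deceleration. The parameter values below are illustrative assumptions, not Leo Drive requirements:

```python
# Rough stopping-distance bound: reaction distance plus braking distance.
# Reaction time and deceleration values are illustrative assumptions.
def min_detection_range_m(speed_kmh, reaction_s=0.5, decel_mps2=4.0):
    v = speed_kmh / 3.6                       # convert to m/s
    return v * reaction_s + v * v / (2.0 * decel_mps2)


if __name__ == "__main__":
    for speed in (30, 70):  # urban delivery vs. highway scenario from the text
        print(f"{speed} km/h -> detect at least "
              f"{min_detection_range_m(speed):.0f} m ahead")
```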
2. Mapping and Localization
An autonomous vehicle must know where it is (locally and globally) at all times, to avoid making wrong decisions in the wrong places and harming its surroundings. Again, localization and mapping requirements vary with the use case. Performing localization on an airport tarmac looks nothing like performing it in a metropolitan area where GNSS is heavily denied. The sensors and techniques used for localization will likewise change with the use case.
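As a hedged sketch of that use-case dependence, a system might prefer GNSS when its reported accuracy is good (open tarmac) and fall back to LiDAR map matching in GNSS-denied areas; the threshold and interfaces below are illustrative assumptions, not Autoware's localization API:

```python
# Toy pose-source selection: trust GNSS only when its reported standard
# deviation is small, otherwise fall back to LiDAR map matching.
def select_pose(gnss_pose, gnss_stddev_m, lidar_pose, max_gnss_stddev_m=0.3):
    """Pick the pose source to trust for this cycle."""
    if gnss_pose is not None and gnss_stddev_m <= max_gnss_stddev_m:
        return gnss_pose, "gnss"
    return lidar_pose, "lidar_map_matching"


if __name__ == "__main__":
    print(select_pose((10.0, 5.0), 0.1, (10.1, 5.2)))  # open sky: GNSS wins
    print(select_pose((10.0, 5.0), 4.0, (10.1, 5.2)))  # urban canyon: LiDAR
```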
3. Path Planning and Decision Making
An autonomous vehicle must plan a route from point A to point B and make many decisions to execute that plan safely and efficiently. Path planning and decision-making require inputs from the perception, localization, and vehicle control modules, and their variety and complexity depend on the use case. For example, an autonomous delivery vehicle operating in an urban environment has different planning and decision-making tasks than an autonomous truck operating on the highway.
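As a stand-in for this pillar, here is a compact grid-based A* search. Real Autoware planners are lane- and scenario-aware; this sketch only conveys the core idea of searching for an obstacle-free path:

```python
# Minimal A* on an occupancy grid (0 = free, 1 = blocked), using a Manhattan
# heuristic. Purely illustrative; not an Autoware planner.
import heapq


def astar(grid, start, goal):
    """Return a list of grid cells from start to goal, or None if blocked."""
    rows, cols = len(grid), len(grid[0])
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])
    open_set = [(h(start), 0, start, None)]   # (f-cost, g-cost, cell, parent)
    came_from, seen = {}, set()
    while open_set:
        _, g, cell, parent = heapq.heappop(open_set)
        if cell in seen:
            continue
        seen.add(cell)
        came_from[cell] = parent
        if cell == goal:                      # walk parents back to the start
            path = []
            while cell:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and not grid[nr][nc]:
                heapq.heappush(open_set,
                               (g + 1 + h((nr, nc)), g + 1, (nr, nc), cell))
    return None


if __name__ == "__main__":
    grid = [[0, 0, 0],
            [1, 1, 0],
            [0, 0, 0]]
    print(astar(grid, (0, 0), (2, 0)))  # routes around the blocked row
```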
4. Control
After the planning and decision-making tasks, the vehicle must be commanded to execute those decisions. Control of an autonomous vehicle usually refers to lateral and longitudinal control along with signaling, which ultimately translates into vehicle actuation (i.e., steering, throttle, and braking). It is not trivial, however: many peripheral requirements must be taken into account, such as user comfort, safety, robustness, and resilience.
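To give a flavor of this pillar, here is a minimal sketch combining pure-pursuit lateral control with a proportional longitudinal controller. Gains and geometry are illustrative assumptions, and the comfort, safety, and robustness requirements mentioned above are deliberately omitted:

```python
# Pure-pursuit steering plus a proportional speed controller; a classic
# textbook combination, shown here as a sketch rather than production code.
import math


def pure_pursuit_steer(target_x, target_y, wheelbase=2.7):
    """Steering angle toward a look-ahead point given in the vehicle frame
    (x forward, y left): delta = atan(2 * L * y / ld^2)."""
    ld2 = target_x ** 2 + target_y ** 2       # squared look-ahead distance
    return math.atan2(2.0 * wheelbase * target_y, ld2)


def speed_command(v_current, v_target, kp=0.5):
    """Proportional acceleration command (positive = throttle)."""
    return kp * (v_target - v_current)


if __name__ == "__main__":
    steer = pure_pursuit_steer(10.0, 1.0)     # point 10 m ahead, 1 m left
    accel = speed_command(v_current=4.0, v_target=5.0)
    print(f"steer={math.degrees(steer):.1f}°, accel={accel:.2f} m/s²")
```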
---
These four pillars of autonomous driving can be considered a common denominator for all autonomous applications. What's important here is to understand how the functionalities of each pillar are changing to meet the use case requirements.
Autoware has worked on three ODDs so far (autonomous valet parking, cargo delivery, and racing), and the ecosystem is currently working on the Bus ODD. Each ODD is discussed within Autoware's respective working groups to identify the requirements and functionality of the solution and to create a system architecture proposal. As a result, Autoware's GitHub and GitLab repositories contain readily available architecture proposals that any OEM can evaluate as reference architectures for those ODDs and build a solution on top of.
Beyond the past implementations, anyone can monitor the ongoing implementations and request features to be added to them. OEMs can also become part of the Autoware ecosystem and appoint key people to the foundation's decision-making and governing bodies.
Becoming a member of Autoware is a convenient method for an OEM to join forces with the world's largest open-source autonomous vehicle ecosystem while contributing to the strategic and technical steering of the ecosystem.
We therefore highly encourage vehicle manufacturers and new service designers to become part of the Autoware ecosystem and accelerate their autonomous vehicle efforts conveniently, cost-consciously, and sustainably.