VUnit Integration and Verification Components
Lars Asplund
Main author and maintainer of VUnit. ASIC/FPGA/DSP Developer consulting for Q-Free ASA
This is the second in a series of posts in which I review and provide personal insights on the presentations featured in the VUnit track at FPGAworld. In this review, I cover the first session, hosted by Sebastian Hellgren, founder of CodeCache and currently engaged with Hitachi Energy. His presentation consisted of two distinct parts. The first summarized his experiences with introducing VUnit in large organisations with extensive existing code bases and a diverse range of testbench strategies. In the second, he explored VUnit verification components, a concept used extensively by his client.
Part 1
In my previous article, covering Kunal Panchal's presentation on MathLib, I reflected on how the project uses VUnit in a way that aligns very well with one of VUnit's core philosophies: test early and often. This philosophy is based on the insight that development is a process full of uncertainties, and control theory dictates that small code/test iterations with frequent feedback are the only way to successfully manage such a process.
A unit testing framework removes the obstacles to adopting such a philosophy, but what about the obstacles encountered when introducing a new tool? There are several that can prevent these short iterations, so let's address some of them.
Sebastian also used an incremental approach when introducing the tool to new colleagues, beginning with one or a select few individuals and then gradually expanding its use. Today, multiple sites at Hitachi Energy have made the transition. Another important step was to establish a forum for questions and shared experiences.
Part 2
The first priority when developing VUnit was to support the test early and often approach, as there were no tools suitable for this purpose. However, we also recognized that VHDL lacked some features needed to support the type of system-level verification approach made popular by UVM and its predecessors. While the test structures and automation provided by unit testing can be and are used for testing at all levels, a more complex system-level testbench needs some additional features.
One of these requirements is support for constrained random verification, which led us to integrate OSVVM for this specific purpose. OSVVM's focus on constrained random verification made it a fitting choice, allowing us to focus on other things. This choice reflects another of our core philosophies: be open to integration with other tools in order to achieve a more comprehensive open-source ecosystem. Being "flexible and non-invasive" is at the core of this philosophy, and the OSVVM integration is one concrete example.
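To make the idea concrete for readers coming from software, here is a minimal Python sketch of what constrained random stimulus generation means: transactions are drawn at random, but subject to constraints and weighted distributions that steer them toward interesting cases. This is purely illustrative; the real OSVVM support is a set of VHDL packages, and none of the names below belong to any actual VUnit or OSVVM API.

```python
import random

def random_bus_write(rng):
    """Draw one write transaction subject to simple constraints.

    Hypothetical example, not an OSVVM API.
    """
    # Constraint: word-aligned address within a 4 KiB window.
    address = rng.randrange(0, 4096, 4)
    # Constraint: favor corner-case data values over plain random ones.
    data = rng.choices(
        population=[0x0000, 0xFFFF, rng.randrange(0, 0x10000)],
        weights=[1, 1, 8],
    )[0]
    return {"kind": "write", "address": address, "data": data}

rng = random.Random(42)  # fixed seed keeps failing runs reproducible
stimulus = [random_bus_write(rng) for _ in range(100)]
assert all(t["address"] % 4 == 0 for t in stimulus)
```

Seeding the generator, as OSVVM also encourages, makes a failing random test reproducible, which matters for the short debug iterations discussed above.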
The second crucial element missing to support system-level verification was a simple yet feature-complete approach for managing information exchange between a set of concurrently executing verification components (VCs), such as sending a write command (transaction) to a bus functional model (BFM). Once again, we turned to software for inspiration and found the actor model and the message-passing paradigm to be a perfect fit. A more comprehensive presentation of our support for message passing can be found in this blog and the documentation.
An important attribute of message passing is its support for asynchronous communication.
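For readers more at home in software, the actor model can be sketched with Python threads and queues: each actor owns a mailbox, sends return immediately without blocking, and replies can be collected whenever convenient. This is only an analogy under assumed names; VUnit's actual actors, mailboxes, and message types are implemented in VHDL.

```python
import queue
import threading

class Actor:
    """Illustrative actor: a name plus a private mailbox."""
    def __init__(self, name):
        self.name = name
        self.mailbox = queue.Queue()

def send(sender, receiver, payload, reply_box=None):
    """Asynchronous send: returns immediately, never blocks the sender."""
    receiver.mailbox.put({"from": sender.name,
                          "payload": payload,
                          "reply_box": reply_box})

def bfm_process(bfm):
    """Consume messages one at a time, like a VC's message handler."""
    while True:
        msg = bfm.mailbox.get()
        if msg["payload"] == "stop":
            break
        if msg["reply_box"] is not None:      # request expects a reply
            msg["reply_box"].put("ack:" + msg["payload"])

test_sequencer = Actor("test_sequencer")
bfm = Actor("bfm")
worker = threading.Thread(target=bfm_process, args=(bfm,))
worker.start()

reply_box = queue.Queue()
send(test_sequencer, bfm, "read 0x10", reply_box)  # non-blocking send
# ... the test sequence is free to do other work here ...
reply = reply_box.get()                            # block only when needed
send(test_sequencer, bfm, "stop")
worker.join()
assert reply == "ack:read 0x10"
```

The key point is that the sender decides when, if ever, to wait for a reply, which is what allows several interfaces to be driven concurrently from one test sequence.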
The systems developed by Hitachi Energy come with many different interfaces, and VUnit-based verification components were heavily used to interact with them. The need for asynchronous communication to test concurrent interface interactions is something I would expect for such system-level tests. However, I was a bit surprised by the remarkable speed-up achieved: legacy testbenches that verified interfaces sequentially became up to 100 times faster when the interfaces were exercised in parallel.
Fundamental to verification components is that they separate the pin-wiggling details of the DUT from the test sequence by creating a high-level abstraction known as a transaction. For example, a BFM can provide read and write procedures dealing only with address and data. These procedures send a read or write message, in zero simulation time, to the BFM, which takes the information within the message and translates it to pin wiggling. In the case of a read, a reply message is also returned with the data that was read. Given the asynchronous nature of the communication, we have the flexibility to decide whether to block the test sequence while awaiting the reply message or to read that message at any later point in time.
Sebastian also pointed out that the verification component itself separates different concerns. The core of the verification component is independent of its transaction-level interface (the message types it can handle). For example, basic read and write transactions can be reused by many BFMs. Some BFMs may support more than the basic read and write transactions, such as burst operations; that is just an extension of the reused interface, an extra message type supported. Think of VC interfaces (VCIs) as capabilities supported by the VC: a VC provides a mix of reused and custom interfaces (message types). As a consequence, you can build your VCs incrementally. Start by supporting the basic transactions, reusing existing interfaces where possible, then add support for more transactions/messages as you verify the more complex features of the DUT interface.
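As a software analogy, this incremental growth of a VC's capabilities can be sketched as a dispatch table of supported message types that is extended over time. All names below are hypothetical; VUnit expresses the same idea with VHDL message types rather than Python dictionaries.

```python
def make_basic_bfm(memory):
    """A BFM supporting only the reused read/write interface (sketch)."""
    def read(msg):
        return memory.get(msg["address"], 0)

    def write(msg):
        memory[msg["address"]] = msg["data"]

    return {"read": read, "write": write}

def add_burst_support(bfm, memory):
    """Extend an existing BFM with a custom burst-write capability."""
    def burst_write(msg):
        for offset, word in enumerate(msg["data"]):
            memory[msg["address"] + offset] = word

    bfm["burst_write"] = burst_write  # just one more supported message type
    return bfm

def handle(bfm, msg):
    """Dispatch an incoming message on its type (its 'capability')."""
    return bfm[msg["kind"]](msg)

memory = {}
bfm = add_burst_support(make_basic_bfm(memory), memory)
handle(bfm, {"kind": "write", "address": 0, "data": 7})
handle(bfm, {"kind": "burst_write", "address": 1, "data": [8, 9]})
assert handle(bfm, {"kind": "read", "address": 2}) == 9
```

The basic read/write handlers stay untouched when burst support is added, which is the separation of concerns Sebastian highlighted: the core is extended, not rewritten.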
Some of you may recognize these concepts from object-oriented programming (OOP). I think it is important to recognize that we can make use of powerful OOP concepts in VHDL, even though VHDL doesn't natively support classes. In fact, Alan Kay, the computer scientist who coined the term "object-oriented programming," once stated:
"The notion of object oriented programming is completely misunderstood. It's not about objects and classes, it's all about messages"
When questioned about the distinctions between OOP and the actor model, his response was:
"Not a lot of difference"
That concludes our discussion for now. If you'd like to hear Sebastian's presentation directly, you can access it here.
A comment from a Software Development Consultant: "Nice to see VUnit getting widely used. It filled a gap in the industry when we released it about 10 years ago and has grown organically over the years. The core features have been very stable and proven to work well."