Formal Verification vs Simulation – Data ordering and integrity
Verifying how data moves through a digital design with basic I/O is often considered straightforward in simulation, and this approach does catch data loss and data corruption. But how do you ensure that out-of-order delivery, or a single byte dropped from a sequence, is detected? 🤔 Simulation could address this, but it would require a complex set of testbenches that might still not cover all possible cases.
Recent advances in AI and cloud-based solutions call for ever more complex hardware accelerators for data manipulation and movement. Completely verifying such designs is a crucial and challenging task: data ordering and integrity have to be checked for every single beat.
Exhaustively verifying data ordering and integrity of a complex protocol can be challenging with simulation. Fortunately, formal analysis can be easily applied here using a formal scoreboard. 🎯
How does formal scoreboarding work?
The main idea behind a formal scoreboard is to track data packets going into the design and check the ordering and the integrity of these packets once they are generated by the design.
In general, using a formal scoreboard is quite simple: it only requires helper logic that calculates the expected content of the data packets the design will produce and that drives the signal pushing them into the scoreboard in the correct order.
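For illustration, assuming the expected data is the content of a W beat of an AXI transaction (as in the example below), the helper logic might look like this rough sketch; the module name, signal names, and the strobe-masking expectation are assumptions made for the example, not something the method prescribes.

```systemverilog
// Hypothetical helper logic for an AXI write data channel: it computes the
// expected content of each W beat (here, strobe-masked write data) and raises
// the push strobe exactly once per accepted beat.
module w_beat_helper #(
  parameter int DATA_W = 32
) (
  input  logic                wvalid,
  input  logic                wready,
  input  logic [DATA_W-1:0]   wdata,
  input  logic [DATA_W/8-1:0] wstrb,
  output logic [DATA_W-1:0]   sb_data_in,  // expected data for the scoreboard
  output logic                sb_push      // push strobe for the scoreboard
);
  // One possible expectation: bytes that are not written are masked to zero.
  always_comb begin
    for (int i = 0; i < DATA_W/8; i++) begin
      sb_data_in[8*i +: 8] = wstrb[i] ? wdata[8*i +: 8] : 8'h00;
    end
  end
  // Push one expected entry into the scoreboard per accepted W beat.
  assign sb_push = wvalid && wready;
endmodule
```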
The formal scoreboard has four inputs:
data_in: Expected data computed by the helper logic, e.g. the content of a W beat of an AXI transaction.
push: Controlled by the helper logic to push the expected data into the scoreboard.
data_out: Data generated by the design after processing.
pop: Controlled by the design; asserted when the design signals that processed data is ready to be read.
Once the design receives data to be processed, the helper logic calculates the expected result and pushes it into the scoreboard, and tracking of that data packet begins. When the design has finished processing, it pops the scoreboard to announce that the packet has been produced, and the scoreboard compares the data_in that was pushed with the data_out that the design generated. Any discrepancy between the generated data and the expected data held in the scoreboard is flagged as a data ordering or integrity violation by a failing property inside the scoreboard, with the shortest possible counterexample.
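To make this concrete, below is a minimal SystemVerilog sketch of such a scoreboard, assuming a single clock, one packet per push/pop, and at least one cycle between the push and the pop of the same packet. The port names follow the list above; the symbolic index, the counters, and the module name are illustrative choices rather than a fixed recipe.

```systemverilog
// Minimal formal scoreboard sketch (assumptions noted in the comments).
module formal_scoreboard #(
  parameter int DATA_W = 32,
  parameter int CNT_W  = 8   // wide enough for the number of packets in flight
) (
  input logic              clk,
  input logic              rst_n,
  input logic              push,      // helper logic pushes expected data
  input logic [DATA_W-1:0] data_in,   // expected packet content
  input logic              pop,       // design announces a processed packet
  input logic [DATA_W-1:0] data_out   // packet content produced by the design
);

  // Symbolic index: left undriven so the formal tool treats it as a free
  // variable, and constrained to stay constant so it behaves like an
  // arbitrary, tool-chosen packet number (some flows use a module input or a
  // cutpoint for this instead).
  logic [CNT_W-1:0] tracked_idx;
  asm_stable_idx: assume property (@(posedge clk) $stable(tracked_idx));

  // Count how many packets have been pushed in and popped out so far.
  logic [CNT_W-1:0] push_cnt, pop_cnt;
  always_ff @(posedge clk or negedge rst_n) begin
    if (!rst_n) begin
      push_cnt <= '0;
      pop_cnt  <= '0;
    end else begin
      if (push) push_cnt <= push_cnt + 1'b1;
      if (pop)  pop_cnt  <= pop_cnt + 1'b1;
    end
  end

  // Remember the expected content of the tracked packet when it is pushed.
  logic              tracked_seen;
  logic [DATA_W-1:0] tracked_data;
  always_ff @(posedge clk or negedge rst_n) begin
    if (!rst_n) begin
      tracked_seen <= 1'b0;
    end else if (push && (push_cnt == tracked_idx)) begin
      tracked_seen <= 1'b1;
      tracked_data <= data_in;
    end
  end

  // The single ordering-and-integrity property: when the tracked packet
  // leaves the design, it must already have been pushed and must carry
  // exactly the expected content.
  ast_order_integrity: assert property (
    @(posedge clk) disable iff (!rst_n)
      (pop && (pop_cnt == tracked_idx)) |->
        (tracked_seen && (data_out == tracked_data))
  );

endmodule
```

Because the tracked index is an arbitrary constant chosen by the tool, proving the assertion for it proves it for every packet, which is what keeps the check exhaustive without enumerating scenarios.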
Overcoming the increased complexity
One could argue that the proof complexity of such a setup is very high. However, this setup lets the formal tool push as many data packets into the design as possible and explore every scenario while tracking any single data packet. Formal techniques such as symbolic checking can also be applied inside the scoreboard to reduce proof complexity. Moreover, a formal scoreboard does not have to be used end-to-end; it can also be wrapped around sub-blocks within the design. Additionally, several scoreboards can be used for different elements of the same data packet, e.g. one scoreboard checking the AWID of an AXI transaction and another checking WDATA, etc.
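As a sketch of that last point, the wrapper below instantiates the scoreboard from the earlier example twice, once for IDs and once for data. Every name in it is hypothetical, and it assumes in-order completion on each observed interface; in practice such a checker would typically be attached to the design with a bind statement.

```systemverilog
// Illustrative checker wrapper: separate scoreboard instances track
// different elements of the same write stream.
module write_path_checker #(
  parameter int ID_W   = 4,
  parameter int DATA_W = 32
) (
  input logic              clk,
  input logic              rst_n,
  // Observed design inputs
  input logic              awvalid, awready,
  input logic [ID_W-1:0]   awid,
  input logic              wvalid, wready,
  input logic [DATA_W-1:0] exp_wdata,   // expected data from the helper logic
  // Observed design outputs
  input logic              bvalid, bready,
  input logic [ID_W-1:0]   bid,
  input logic              out_valid, out_ready,
  input logic [DATA_W-1:0] out_data
);
  // One scoreboard per packet element: IDs and data are tracked independently.
  formal_scoreboard #(.DATA_W(ID_W)) sb_id (
    .clk(clk), .rst_n(rst_n),
    .push(awvalid && awready), .data_in(awid),
    .pop (bvalid && bready),   .data_out(bid)
  );
  formal_scoreboard #(.DATA_W(DATA_W)) sb_data (
    .clk(clk), .rst_n(rst_n),
    .push(wvalid && wready),       .data_in(exp_wdata),
    .pop (out_valid && out_ready), .data_out(out_data)
  );
endmodule
```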
What does formal scoreboarding verify?
The formal scoreboard can detect any violation of the integrity or ordering of the data generated by the design, including an incorrect number of data packets. All of this is achieved with a single, simple formal property, without writing specific cases or scenarios, and while abstracting away the timing behavior.
To conclude, formal scoreboards are widely used by formal verification engineers. At LUBIS EDA, we have used this method to catch numerous corner-case bugs that required over 20 specific events to trigger a violation—scenarios that are often missed by conventional simulation-based verification. 🎉🚀
Have you ever used a formal scoreboard in your verification process?