- 📌 Explain the complete flow of a UVM testbench from sequence to scoreboard?
In a UVM testbench, the stimulus flow begins from a sequence that generates sequence items (transactions). These transactions are sent to the sequencer, which arbitrates between competing sequences and forwards them to the driver.
The driver converts transaction-level data into pin-level activity on the DUT through a virtual interface. The DUT processes the stimulus and produces outputs, which are captured by the monitor. The monitor converts signal-level activity back into transactions and sends them to the scoreboard via analysis ports.
The scoreboard compares expected and actual outputs using a reference model and flags mismatches using UVM reporting mechanisms.
Flow summary:
Sequence → Sequencer → Driver → DUT → Monitor → Scoreboard
💡 Tip: Mentioning TLM analysis ports and transaction abstraction shows deeper understanding.
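The sequence-to-driver handshake above can be sketched as below (a minimal illustration; `my_item`, `my_seq`, and the transaction fields are hypothetical names):

```systemverilog
class my_item extends uvm_sequence_item;
  rand bit [7:0] addr;
  rand bit [7:0] data;
  `uvm_object_utils(my_item)
  function new(string name = "my_item");
    super.new(name);
  endfunction
endclass

class my_seq extends uvm_sequence #(my_item);
  `uvm_object_utils(my_seq)
  function new(string name = "my_seq");
    super.new(name);
  endfunction
  task body();
    repeat (10) begin
      my_item item = my_item::type_id::create("item");
      start_item(item);          // wait for sequencer grant
      assert(item.randomize());  // generate randomized stimulus
      finish_item(item);         // block until the driver calls item_done()
    end
  endtask
endclass
```

The `start_item`/`finish_item` pair is the handshake point where the sequencer's arbitration hands the transaction to the driver.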
- 📌 How do you identify and debug a scoreboard mismatch in a regression?
When a scoreboard mismatch occurs, a structured debug approach helps:
- Check UVM logs to identify the failing testcase and timestamp
- Analyze the waveform (FSDB/VCD) around the failure point
- Trace the transaction path: driver → DUT → monitor
- Compare expected vs. actual data in the scoreboard
- Verify reference model correctness
- Check constraints and stimulus configuration
- Identify whether the issue is an RTL bug, a testbench bug, or a constraint issue
In large regressions, prioritize first failure analysis and failure clustering to reduce debug time.
- 📌 What is the difference between functional coverage and code coverage? Which one matters more?
Code coverage measures how much RTL code is exercised during simulation (line, branch, toggle, FSM).
Functional coverage measures whether all intended design scenarios and features defined in the verification plan were exercised.
Functional coverage is more important for verification completeness because it ensures that all design behaviors and corner cases are validated, whereas code coverage only indicates that code was executed.
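A small functional coverage sketch, assuming a hypothetical 8-bit bus with `addr` and `rw_n` signals, shows how verification-plan scenarios become coverpoints and crosses:

```systemverilog
// Hypothetical coverage model: the bins encode verification-plan intent,
// which code coverage alone cannot express
covergroup bus_cg @(posedge clk);
  cp_addr : coverpoint addr {
    bins low  = {[0:63]};
    bins mid  = {[64:191]};
    bins high = {[192:255]};
  }
  cp_rw : coverpoint rw_n;              // read vs. write
  addr_x_rw : cross cp_addr, cp_rw;     // every address region in both directions
endgroup
```

Hitting 100% code coverage would not guarantee these crosses were exercised, which is why functional coverage drives closure.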
- 📌 What is a virtual interface and why is it critical in UVM environments?
A virtual interface is a handle that allows class-based UVM components (driver, monitor) to access DUT interface signals defined in modules.
Since UVM is class-based and DUT interfaces exist in static modules, virtual interfaces act as a bridge between the static and dynamic domains. They are typically passed using uvm_config_db during the build_phase.
Without virtual interfaces, reusable and scalable UVM testbenches cannot properly drive or monitor DUT signals.
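A typical set/get pattern might look like this (the interface name `bus_if`, instance `bus_if_inst`, and the path string are assumptions for illustration):

```systemverilog
// In the top module (static domain):
initial begin
  uvm_config_db#(virtual bus_if)::set(null, "uvm_test_top.env.agent.*",
                                      "vif", bus_if_inst);
end

// In the driver (dynamic domain), retrieved during build_phase:
class my_driver extends uvm_driver #(my_item);
  virtual bus_if vif;
  `uvm_component_utils(my_driver)
  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction
  function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    if (!uvm_config_db#(virtual bus_if)::get(this, "", "vif", vif))
      `uvm_fatal("NOVIF", "Virtual interface not set for driver")
  endfunction
endclass
```

Failing fast with `uvm_fatal` when the handle is missing avoids confusing null-handle crashes later in the run.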
- 📌 Explain constrained random verification and its advantages over directed testing?
Constrained Random Verification (CRV) uses randomized stimulus with constraints to explore a wide range of input scenarios and corner cases automatically.
Advantages over directed testing:
- Better coverage of corner cases
- Scalable stimulus generation
- Reduces manual testcase writing
- Improves bug-detection probability
Directed testing is useful for deterministic scenarios, while CRV is preferred for large-scale regressions and coverage closure.
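A minimal CRV sketch, with hypothetical fields and a distribution constraint biasing stimulus toward common traffic:

```systemverilog
// Hypothetical packet: constraints keep randomization legal and steer it
// toward interesting scenarios
class packet extends uvm_sequence_item;
  rand bit [7:0] len;
  rand bit [1:0] kind;   // 0: data, 1: control, 2: error
  constraint c_len  { len inside {[1:64]}; }
  constraint c_kind { kind dist {0 := 70, 1 := 20, 2 := 10}; } // bias toward data
  `uvm_object_utils(packet)
  function new(string name = "packet");
    super.new(name);
  endfunction
endclass
```

Each call to `randomize()` then yields a legal but unpredictable packet, and the `dist` weights can be tuned per test without rewriting stimulus code.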
- 📌 What causes a simulation to end prematurely in UVM, and how do you prevent it?
Premature simulation termination usually occurs when objections are not properly raised or dropped in the run_phase.
In UVM, the objection mechanism controls simulation lifetime. If no component has an active objection, the simulation ends even if sequences are incomplete.
To prevent this:
- Raise objection at start of sequence/test
- Drop objection after stimulus completion
- Ensure layered sequences manage objections correctly
Proper objection handling is critical in complex testbenches and regressions.
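A typical run_phase sketch (the sequence class and sequencer path are hypothetical):

```systemverilog
// Objections bracket the stimulus so the simulator does not exit early
task run_phase(uvm_phase phase);
  my_seq seq = my_seq::type_id::create("seq");
  phase.raise_objection(this, "Starting stimulus");
  seq.start(env.agent.sqr);                 // blocks until the sequence completes
  phase.drop_objection(this, "Stimulus done");
endtask
```

If the `raise_objection` call were missing, the run_phase could end at time zero even though `seq` never got to run.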
- 📌 How do you ensure reusability in a UVM testbench?
Reusability in a UVM testbench is achieved through:
- Modular agent architecture
- Use of factory overrides
- Configuration via uvm_config_db
- Transaction-level abstraction (TLM)
- Parameterized sequences and components
This allows the same environment to be reused across different IP configurations and test scenarios without major code changes.
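A factory-override sketch: a hypothetical derived test swaps in an `error_driver` (assumed to extend `my_driver`) without editing the environment:

```systemverilog
// The env still calls my_driver::type_id::create(), but the factory now
// returns an error_driver instance
class error_test extends base_test;
  `uvm_component_utils(error_test)
  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction
  function void build_phase(uvm_phase phase);
    // Register the override before the env builds its components
    my_driver::type_id::set_type_override(error_driver::get_type());
    super.build_phase(phase);
  endfunction
endclass
```

This is the core reuse mechanism: the same environment code serves both normal and error-injection tests.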
- 📌 What is the role of assertions (SVA) in design verification?
SystemVerilog Assertions (SVA) are used to specify and check design properties such as protocols, timing relationships, and data integrity during simulation and formal verification.
Key benefits:
- Early bug detection
- Protocol validation
- Automated error checking
- Reduced manual debug effort
Assertions are widely used for checking handshakes, FIFO behavior, and interface protocols in IP-level verification.
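A small handshake assertion sketch, assuming hypothetical `req`/`ack` signals and an active-low reset `rst_n`:

```systemverilog
// Protocol check: every request must be acknowledged within 1 to 4 cycles
property p_req_ack;
  @(posedge clk) disable iff (!rst_n)
    req |-> ##[1:4] ack;
endproperty

assert property (p_req_ack)
  else $error("ack did not follow req within 4 cycles");
```

The same property can be reused as a simulation checker and as a formal proof target, which is part of why SVA reduces manual debug effort.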
- 📌 Difference between monitor and driver in terms of abstraction?
The driver operates at the transaction-to-signal abstraction level, converting sequence items into pin-level DUT stimulus via a virtual interface.
The monitor operates in the reverse direction, observing DUT signals and converting them into high-level transactions for analysis and checking.
Driver = Active component (stimulus generation)
Monitor = Passive component (observation & analysis)
This abstraction enables scalable verification and transaction level debugging.
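A passive monitor sketch illustrating the signal-to-transaction direction (the interface and field names are hypothetical):

```systemverilog
// The monitor never drives signals; it reconstructs transactions and
// broadcasts them on an analysis port
class my_monitor extends uvm_monitor;
  virtual bus_if vif;
  uvm_analysis_port #(my_item) ap;
  `uvm_component_utils(my_monitor)
  function new(string name, uvm_component parent);
    super.new(name, parent);
    ap = new("ap", this);
  endfunction
  task run_phase(uvm_phase phase);
    forever begin
      my_item tr = my_item::type_id::create("tr");
      @(posedge vif.clk iff vif.valid);  // sample only on valid cycles
      tr.addr = vif.addr;
      tr.data = vif.data;
      ap.write(tr);                      // broadcast to scoreboard/coverage
    end
  endtask
endclass
```

Because it only observes, the same monitor works in both active and passive agent configurations.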
- 📌 How will you handle race conditions between the driver and monitor in a UVM environment?
Race conditions typically occur due to improper sampling timing or lack of synchronization between signal driving and monitoring.
To handle this:
- Use clocking blocks for proper signal sampling
- Ensure the monitor samples on stable clock edges
- Avoid zero-delay sampling
- Use non-blocking assignments in drivers
- Implement analysis FIFOs if transaction ordering is critical
- Align driver and monitor timing with the DUT protocol
Proper synchronization ensures accurate transaction reconstruction and prevents false scoreboard mismatches.
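Clocking blocks are the usual mechanism behind the first few points above; a sketch with assumed signal names:

```systemverilog
// Separate clocking blocks decouple when the driver drives and when the
// monitor samples, eliminating same-edge races
interface bus_if (input logic clk);
  logic       valid;
  logic [7:0] data;

  clocking drv_cb @(posedge clk);
    default input #1step output #2;  // drive 2 time units after the edge
    output valid, data;
  endclocking

  clocking mon_cb @(posedge clk);
    default input #1step;            // sample the stable pre-edge value
    input valid, data;
  endclocking
endinterface
```

The driver writes through `drv_cb` and the monitor reads through `mon_cb`, so both sides see deterministic values regardless of process scheduling order.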
Created by: Abinaya Senthil
Next week: SVA assertions