What is behind the rise of hardware emulation?


Five FAQs often pop up when chip designers and verification engineers ask me about hardware emulation. They are all well thought out, and the answers are worth sharing.

Today, emulation is mandatory in the design verification toolbox. Why? For two unrelated reasons: the ever-increasing demand for performance and throughput from verification teams, and significant advances in hardware emulation technology. The convergence of the two has pushed hardware emulation to a prominent place in any verification toolbox. Today's SoC designs sit at the intersection of two rising trends: the astonishing complexity of the hardware and the escalating software content. Only hardware emulation can handle the difficult task of verifying the integrity of both and tracing design errors across their boundary. The invention of virtualization technology to support hardware emulation, pioneered by IKOS Systems in the late 1990s, opened the way for new deployment modes and led to the creation of emulation data centers. (Note: IKOS was acquired by Mentor Graphics, now Siemens EDA, in 2002.)

What is the emulation value proposition?

Whether we like it or not, market dynamics are a huge force in our lives. They can generate wealth and destroy fortunes. Miss the market window for a new product in a highly competitive market at your peril: it could kill the product and undermine the company. In the world of electronic design, a missed market window is usually caused by a silicon respin, which in turn typically stems from a poor project schedule backed by insufficient manpower and design tools. The more advanced the process node, the higher the cost of a respin. And whatever a respin costs, late market entry costs far more: a product that is three months late can forfeit a third of its potential revenue. The bottom line is clear: the risk of missing the market window must be eliminated. Hardware emulation is the best verification tool for avoiding that risk. With its comprehensive and fast hardware/software verification capabilities, it can eliminate respins, accelerate the project schedule, and improve product quality, all at the same time.

From a user perspective, what are the differences between HDL simulators and emulators?

The differences come down to design size and verification workload. As long as the design under test (DUT) is in the ballpark of 100 million gates or less, and the execution workload extends to no more than one day, the HDL simulator is the preferred choice for hardware debugging. It is easy to use, quick to set up, extremely fast at compiling a DUT, and flexible for debugging hardware design errors. And, most importantly, it is inexpensive to acquire. All of this makes the HDL simulator the ideal choice for IP-level and block-level verification in the early stages of the hardware design cycle. When design sizes and workloads exceed those limits and hardware/software testing becomes necessary, HDL simulators run out of steam, leaving hardware emulation as the only option. Today, hardware emulators can take on any design size, even the multi-billion-gate designs found in AI/ML, 5G, and automotive applications. They can expose hard-to-find hardware bugs that may surface only after the billions of verification cycles required to integrate embedded software with the underlying hardware. They support multiple concurrent users and can be accessed remotely from anywhere in the world.
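To put "billions of verification cycles" in perspective, here is a back-of-the-envelope comparison of wall-clock times. The throughput figures are illustrative assumptions for a very large SoC, not benchmarks of any particular tool:

```python
# Back-of-the-envelope wall-clock time for a multi-billion-cycle workload.
# Both throughput figures below are assumptions for a very large SoC,
# not measurements of any specific simulator or emulator.

CYCLES = 2_000_000_000  # e.g., an OS boot on the DUT (assumed figure)

THROUGHPUT_HZ = {
    "HDL simulator": 100,            # assumed ~100 cycles/s on a huge design
    "hardware emulator": 1_000_000,  # assumed ~1 MHz effective speed
}

for platform, hz in THROUGHPUT_HZ.items():
    hours = CYCLES / hz / 3600
    print(f"{platform:18s}: {hours:8.1f} hours")

# HDL simulator     :   5555.6 hours  (~231 days)
# hardware emulator :      0.6 hours  (~33 minutes)
```

Under these assumptions the simulator needs most of a year for a workload the emulator finishes over a coffee break, which is the whole argument in two lines of arithmetic.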
Most importantly, despite their high cost of acquisition, their return on investment is remarkably high.

From a user perspective, what are the differences between emulators and FPGA prototypes?

In principle, FPGA prototypes share the same technological basis as hardware emulators: both use dedicated, reprogrammable hardware to accelerate the verification cycle. The hardware in an emulator is typically built from the ground up with design verification as its target; a prototype is instead based on a set of commercial FPGAs. On closer inspection, prototyping trades away quick and easy design setup and bring-up, as well as powerful DUT debugging, in exchange for significantly faster execution speed. Specifically, on the same DUT, a prototype may run 10x faster than an emulator. FPGA prototyping is the better choice for software validation, while emulators are ideal for system-level hardware verification and hardware/software integration.

Can emulators and FPGA prototypes be integrated into a unified verification/validation flow?

Definitely. They can and should be integrated. First, they should share the same compilation front-end, with only the back-end remaining tool-specific. The benefit is easier and faster DUT bring-up: if a design compiles for emulation, it is likely to compile for prototyping. Second, they should share the same DUT database so that execution can be offloaded from one platform to the other at runtime. For example, the operating system boot and the application workload can run on the prototype until an error occurs; saving the design state on the prototype and restoring it on the emulator then greatly accelerates tracking down the bug with the emulator's debug facilities. A further step in the integration roadmap adds a virtual platform based on hybrid emulation. By closely coupling best-in-class emulators, virtual prototypes, and FPGA prototypes, the verification team can implement a sophisticated and effective "shift left" strategy. Earlier this year, several announcements promoted next-generation hardware-assisted verification platforms that connect hardware emulation and virtualized prototyping with a comprehensive software testing environment, covering all the essential tools of a chip design verification flow.

Image source: Siemens EDA.
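As an illustration of the save-and-restore handoff described above, here is a minimal, self-contained sketch of the flow. Every class and method name in it (Prototype, Emulator, save_checkpoint, and so on) is a hypothetical placeholder, since real emulation and prototyping systems expose vendor-specific interfaces:

```python
# Hypothetical sketch of a prototype-to-emulator handoff over a shared
# DUT database. None of these names correspond to a real vendor API.

class Prototype:
    """Fast FPGA prototype: high speed, limited signal visibility."""

    def run(self, workload):
        # Run the workload at full prototype speed; return the cycle
        # at which a failure was detected, or None if it passed.
        print(f"running '{workload}' on the FPGA prototype")
        return 1_850_000_000  # pretend a failure appeared at this cycle

    def save_checkpoint(self, cycle):
        # Snapshot the DUT state into the shared design database.
        return {"cycle": cycle, "dut_state": "snapshot"}


class Emulator:
    """Slower emulator: full visibility and powerful debug tooling."""

    def restore_checkpoint(self, checkpoint):
        print(f"restoring DUT state at cycle {checkpoint['cycle']:,}")

    def debug_from(self, checkpoint):
        # From here, hunt the bug with full waveform visibility instead
        # of re-running billions of cycles from reset.
        print("debugging from the restored state with full visibility")


proto, emu = Prototype(), Emulator()
failing_cycle = proto.run("os-boot-plus-application")
if failing_cycle is not None:
    checkpoint = proto.save_checkpoint(failing_cycle)
    emu.restore_checkpoint(checkpoint)
    emu.debug_from(checkpoint)
```

The design point worth noting is that the expensive part, reaching the failure, happens once at prototype speed, while the debug part starts from the restored state rather than from reset.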

