M. C. Scott, M. O. Peralta, Jo Dale Carothers

In this paper, we evaluate the dependencies between tools, data, and environment in process design kits, and present a framework for systematically analyzing the quality of design tools and libraries throughout the design flow. The framework consists of a regression engine that executes sets of tests in a distributed computing environment. These tests range from simulations that validate device models and simulators, to layout-versus-schematic checks and parasitic-extraction accuracy tests, and ultimately to tests that validate the integrity of the extracted circuit against the ideal design. In particular, it is shown that test-chaining is required to obtain confidence in simulation-to-silicon equivalence. A secondary objective is to identify and quantify the peak-error injection points in the flow. Finally, future work is outlined to extend the framework to automate entire design flows and to provide capability for inter-tool constraint satisfaction and design optimization.
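To make the test-chaining idea concrete, the following is a minimal sketch in Python, assuming a simple sequential harness: each stage's checks run only if every upstream stage passed, so a failure localizes the point in the flow where error is injected. The stage names (model_sim, lvs, extraction, netlist_integrity) and the run_chain helper are hypothetical illustrations, not the authors' regression engine.

```python
# Hypothetical sketch of test-chaining: downstream checks run only when all
# upstream checks passed, so a failure pinpoints the error-injection stage.
# All names here are illustrative placeholders, not the paper's code.

def model_sim():
    """Validate device models against the simulator (placeholder)."""
    return True

def lvs():
    """Layout-versus-schematic comparison (placeholder)."""
    return True

def extraction():
    """Parasitic-extraction accuracy check (placeholder)."""
    return True

def netlist_integrity():
    """Extracted circuit vs. the ideal design (placeholder)."""
    return True

# Stages ordered as in the flow described in the abstract.
CHAIN = [
    ("model_sim", model_sim),
    ("lvs", lvs),
    ("extraction", extraction),
    ("netlist_integrity", netlist_integrity),
]

def run_chain(chain=CHAIN):
    """Run stages in order; skip downstream stages once one fails."""
    upstream_ok, results = True, []
    for name, check in chain:
        passed = check() if upstream_ok else None  # None = skipped
        results.append((name, passed))
        upstream_ok = upstream_ok and bool(passed)
    return results

if __name__ == "__main__":
    for name, passed in run_chain():
        status = "pass" if passed else ("skipped" if passed is None else "fail")
        print(f"{name}: {status}")
```

In the distributed setting the abstract describes, each stage would presumably fan many such tests out across compute nodes; the chaining constraint shown above is what ties individual per-tool results into an end-to-end simulation-to-silicon confidence argument.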