The Mythical Top-Down Design Flow

Back in 1995, the EDA companies were all touting top-down design as the methodology that would overcome the unending rise in product complexity. They were also pushing frameworks at that time. Frameworks are rarely spoken of today and, when they are, it is usually in a sarcastic manner. Top-down design, however, would seem to be more relevant than ever.

What is top-down design and why do I call it mythical?

I use the term “mythical” in relation to top-down design because so few engineers practice it. Here is how it works:

Start with a specification for your system, board, FPGA, or whatever is the top level of your design. Code your specification in behavioral VHDL or SystemVerilog if it is hardware only, or perhaps SystemC if it is hardware and software. Simulate it. Does the specification work as intended? If not, refine your spec. If it does work, you have just created a testbench and a reference design for the rest of your project.

Next, decompose the behavioral model. Depending on where you are starting, you may decompose to more behavioral models of boards, FPGAs, or ASICs, or your next step may be RTL code or board-level schematics. Continue until you have implemented your design at the level appropriate for prototyping, simulating at each level and validating the results against the original testbench.
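Continuing the earlier illustrative Python sketch (a hypothetical saturating accumulator, not anything from the post), the decomposition step might produce a register-transfer-style refinement, with explicit state updated one clock at a time. The crucial discipline is that the refinement is validated against the same behavioral model, so any mismatch is known to be an implementation error, not a spec error.

```python
# Register-transfer-style refinement of the same hypothetical accumulator:
# state lives in an explicit register and updates one clock (loop step) at a time.
SAT_MAX = 255

def spec_accumulate(samples):
    """The original behavioral reference model."""
    total = 0
    for s in samples:
        total = min(total + s, SAT_MAX)
    return total

class RtlAccumulator:
    def __init__(self):
        self.acc = 0  # the register

    def clock(self, sample):
        nxt = self.acc + sample
        self.acc = SAT_MAX if nxt > SAT_MAX else nxt  # saturating update

def rtl_accumulate(samples):
    dut = RtlAccumulator()
    for s in samples:
        dut.clock(s)
    return dut.acc

# Reuse the original checks: the lower-level design must track the spec.
for stim in ([1, 2, 3], [100, 100, 100], []):
    assert rtl_accumulate(stim) == spec_accumulate(stim)
```

Each further decomposition (down to gates, or out to board schematics) would be run through the same comparison, which is what lets high-level and low-level bugs be told apart.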

In the few cases I have seen this methodology used, all the non-participants have wondered why the responsible engineer is so slow in the beginning, when he is refining his spec and decomposing, and how he could be so fast in the end when his design sails through debug.

So why don’t more people use this methodology? I can think of two primary reasons.

I have heard some people say, “It would take me as long to write the specification as it would to complete the project.” That may be true the first time through, but you may have to “complete the project” two or three times to create a design that does what you need.

There is also the issue of impatience and the extra time it takes to reach some intermediate milestones. For example, in the early ’90s I worked on a project that involved designing a tiny, by today’s standards, GaAs IC. It was in the 3–5K gate range. I suggested we try a new methodology – VHDL simulation and synthesis. The project manager, being more progressive than most, agreed we could try it if we worked over our Christmas shutdown. That way, if it did not work out, it would not have much impact on the schedule.

I started with the specification he gave me for the chip and wrote a behavioral model and a testbench. Simulation revealed serious problems with the spec. The manager rewrote the spec and I repeated the process. This time I got a bit farther but found another problem in the spec. We went for a third iteration. We were two weeks into the process.

Then the ASIC vendor came in and talked to the manager. The vendor asked, “Why are you wasting time trying to synthesize such a small design? You could draw the schematics in two days.” So they sat down and indeed drew the schematics over a weekend.

The manager came in on Monday and told us to abandon the simulation-and-synthesis method; we had put in two weeks and had yet to produce a schematic. He was going with the old method. And he did. However, it still took him six more months to tape out because it was impossible to distinguish specification errors from schematic errors. Had we continued with top-down, we probably would have finished in three or four more weeks.

The main advantage of the “mythical” top-down methodology is that it allows you to debug high-level errors at a high level, and low-level errors at a low level, and know which is which.

2 Responses to “The Mythical Top-Down Design Flow”

  1. Mike Mintz says:

    Hi Richard,

    In this context, top down design is just good design. Separating what something is supposed to do versus how to do it has been successfully used in the software domain.

    A related concept that I like to use in verification is “ends-in” design. What this means is: start at the top and the bottom and work your way inward. I suspect that these are the iterations you were referring to in your blog.

    In my world of verification, I use ends-in design to build the top layer of verification components and the lowest layers at the same time. I know what I want on the top, because it should be that obvious. On the lowest layers, I can start on the BFMs and monitors.

    So it’s not really mythical, just known by different names to different engineers.

    Take care,

  2. Magnus Danielson says:

    For the scale of designs that I deal with, it is virtually meaningless to start doing simulations at the top level and fill in the details as we go. While we indeed use a top-down method, we mostly do it in the thought process and in the documentation trail, doing divide-and-conquer in the analysis. In my experience, if we spend sufficient time on all that, plus selected tests of critical issues, the actual coding and debugging isn’t a very painful process. We often find that we detect hard-to-fix problems in the peer review of documents. While the initial part of a project may start top-down, we also do bottom-up and middle-out analyses early on, in order to investigate the field in advance; once we know that much, the details can be set just right as you go from the top downwards.

    There is a lot of love for the latest philosophy and the tools that go with it (especially if you are a tool vendor). Few tools are effective in helping all the way; in the end you need to think for yourself, and then the methods you use in the way you go about engineering the product are the main thing. Tools are there to help, but in the end they can never be the backbone of a solution; they can, however, be a key to successful execution of the solution. The focus of most waves of system tools is to shape every problem to fit the tools, rather than to shape the tools to match the specific design problems. If we want a tool to actually work fully top-down, we need to change tool philosophy on an overall scale.
