Quality street


About ten years ago, the semiconductor intellectual property (IP) business was just getting underway. Although ARM and MIPS had carved out decent businesses for themselves selling processor cores by that time - ARM floated in April 1998 and was riding high courtesy of the Internet stock boom - there were still serious doubts over the viability of the IP business model. Ten years on, there still are.

At the recent IP-ESC conference in Grenoble, some old issues from the early days came back with a vengeance, chief among them the perennial favourite: IP quality. In one sense, the system-on-chip (SoC) industry has dealt reasonably well with the quality issue. IP is now a core part of SoC design, and it's hard to think of any SoC on the market that does not incorporate a hardware block bought from somewhere else.

STMicroelectronics has even formed its own internal IP suppliers. "We separated the SoC team from the IP team," said Francois Remond of ST, primarily to make the IP more robust. A danger with reusing blocks that were never designed for the purpose is that shortcuts taken on the original project don't show up until too late on subsequent designs.

"We had recently the experience of transforming an adult RTL block [developed internally] into IP. It has a high cost," said Remond. "It is better to start with reuse in mind."

Remond described how far IP-based design has come at the company with a set-top box chip designed for a 55nm process. The 209 million-transistor chip contains more than 50 IP cores supplied as RTL code, a further 160 delivered as hard IP - that is, ready-made layouts for the target process - and close to 500 memory blocks.

Remond claimed that, over the generations that followed a smaller chip designed in 2002 for a 130nm process, productivity at ST on these projects has increased four-fold. "At the same time, we have reduced the number of bugs inside the IP by a factor of two."

"Debugging IP at the SoC level is very costly. The SoC team does not have a deep understanding of the functionality and the potential problems. If you integrate a piece of IP that doesn't work, you are trapped. Imagine a video circuit with a DDR interface that is not working. You can't continue."

In 1999, other than ARM and some of the other reasonably experienced suppliers, it was hard to tell who had good-quality cores and who had a bundle of bugs held together by a hardware-description language (HDL) wrapper. Today, it's much easier: there are the people who are still in business and those who aren't.

The difference between the two is frequently down to whether their cores worked or not when inserted into a chip design. The story of the set-top box chip that failed because of one of the IP cores inside it is now a legend in the industry - the supplier pretty soon got out of the IP business.

The quest for quality has made IP a surprisingly expensive business. Joachim Kunkel of Synopsys said the cost of developing, for example, a USB 3.0 core is anywhere between $5m and $50m. Much of that is spent on making sure the core works, often through extensive simulation to cover as many situations as possible, backed up by manual checking.

"Design reviews continue to be one of the best ways of achieving quality," said Kunkel. "Going through the code line by line, it's amazing what you find."

"The IP vendors who are still in business are all about the rigorousness of the process," said Kathryn Kranen, president of verification tools supplier Jasper Design Automation.

Even with the attention to detail that the surviving IP vendors apply, there is still a quality issue.

"We need to move to the next level, which is integration," said Kunkel.

Quality is not so much a question of whether the core works as whether it works in a target system. A core might work to the letter of the spec but fail within the context of a full chip where it's not possible to implement the spec as written.

"One of the things that we are hearing from customers is that the typical thinking used to be 'I am going to verify to the legal input spec'," said Kranen. But that's not enough. "Even for internal block development there is great value in verifying with very much looser constraints. So that you can harden your block against changes in logic around it or other vendors' IP."

Kranen's suggestion, unsurprising for a supplier of formal-verification tools, is to use assertions and formal techniques to check whether a bad transaction can confuse an IP block or whether it will sail through with nothing more than an error signal.
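To make the idea concrete, here is a minimal sketch of that 'looser constraints' approach, expressed as randomised testing of a Python model rather than formal proof of RTL. The ToyIpBlock, its register map and its error flag are all invented for illustration; the invariant being checked is the point: an out-of-spec transaction may raise an error signal but must never corrupt state.

```python
import random

# A minimal sketch of verifying with 'looser constraints' than the legal
# input spec, in the spirit of Kranen's suggestion. The block model, its
# register map and its error flag are all invented for illustration; a
# real flow would apply assertions and formal tools to the RTL itself.
class ToyIpBlock:
    """Toy IP block: 128 byte-wide registers; anything else is illegal."""

    def __init__(self):
        self.regs = [0] * 128
        self.error = False  # models the block's error signal

    def write(self, addr, data):
        # Robust behaviour: flag out-of-spec traffic and carry on,
        # rather than corrupting state or wedging.
        if not (0 <= addr < 128 and 0 <= data <= 0xFF):
            self.error = True
            return
        self.regs[addr] = data

def hammer(trials=10_000):
    """Drive the block with traffic well outside the legal range and
    check it does nothing worse than raise its error signal."""
    block = ToyIpBlock()
    for _ in range(trials):
        addr = random.randint(-16, 512)
        data = random.randint(-16, 512)
        before = list(block.regs)
        block.write(addr, data)
        if not (0 <= addr < 128 and 0 <= data <= 0xFF):
            assert block.error, "illegal transaction went unflagged"
            assert block.regs == before, "state corrupted by bad input"
            block.error = False  # clear the flag and keep going

hammer()
```

The formal techniques Kranen advocates make the same check exhaustive rather than probabilistic: the invariant is proved for every possible transaction, not ten thousand random ones.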

According to Kranen, formal verification has even been used as a support tool. She cited a situation where a customer asked ARM about an apparently aberrant logic trace they found when simulating the core inside their proposed system. ARM support asked for the sequence of events that led up to the odd output trace, but the customer wanted to keep that confidential - ARM may call its customers 'partners' but the trust that the word implies isn't often there. So the ARM engineers used the Jasper tool to work out whether the core could get into that state and what it would take to get there.

"It's a tool that provides answers to specific reuse questions," said Kranen.

One thing is for sure. Although the IEEE has resurrected the QIP standard - a spreadsheet meant to demonstrate the approach a vendor took to guarantee quality - hardly anyone in the business still takes it seriously. You can expect more automation to help with IP integration, but a usable quality standard remains elusive.