Machine learning is a hot topic, and EDA tools process large amounts of data, yet little progress has been made in integrating machine learning techniques into EDA tools.
Many EDA problems and their solutions are essentially statistical, which suggests that machine learning is a natural fit for EDA. So why has this field been so slow to embrace techniques that have transformed other areas, such as visual recognition and search?
"You can sense that these are machine learning problems," said Jeff Dyck, vice president of technology operations at Solido Design Automation. "We have huge amounts of data, but what methods can we use to solve these problems? That is the hard part. It is not as if reading a textbook or taking a course lets you apply these methods to every problem. Engineering problems require a different perspective."
Before delving into where machine learning can be applied, let's look at some of the issues.
From rule-based techniques to deep learning
Let's first classify these techniques. "In the broadest sense, everyone in the EDA field is accustomed to rule-based techniques (RBT)," explained Ting Ku, senior director of engineering at Nvidia. "Machine learning is actually a subset of RBT, and deep learning is a subset of machine learning. RBT is deterministic; it involves no database and defines no features. Machine learning starts from statistics rather than determinism, and because it must learn from experience, it does involve a database. With machine learning we may also need predefined features, and that is the difference between machine learning and deep learning. Deep learning has no predefined features; otherwise it is the same as machine learning. So the question becomes: what is a feature?"
Once you have features and enough stored data, you have to figure out how to use them. "Searching the whole design space is impractical," said Anush Mohandass, vice president of marketing and business development at NetSpeed Systems. "Because the space is highly nonlinear, search time grows exponentially with the scope of the search. For problems like this, machine learning, in which past solutions to similar problems serve as training data for learning and predicting solutions to new problems, has enormous potential."
There are many machine learning methods, broadly divided into supervised learning, unsupervised learning, and reinforcement learning. Most EDA applications focus on supervised learning. Eric Hall, CTO of E3 Data Science, explained: "There are two types of supervised learning. Regression is used when we want to predict a numerical value, and classification is used to predict which of several possible outcomes will occur. Many machine learning algorithms can tackle these problems, but no single one solves them all."
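The regression/classification distinction Hall draws can be made concrete with a toy sketch. The data below is entirely hypothetical (invented feature names and timing values), and a 1-nearest-neighbor rule stands in for a real learning algorithm; the point is only that the same training set can answer a numeric question (regression) or a categorical one (classification).

```python
# Minimal sketch: one training set, two kinds of supervised prediction.

def nearest(train, x):
    """Return the training sample whose feature vector is closest to x."""
    return min(train, key=lambda s: sum((a - b) ** 2 for a, b in zip(s[0], x)))

# Each sample: (features = [toggle_rate, fanout], slack_ps, passes_timing)
# All values are invented for illustration.
train = [
    ([0.10, 2.0],  50, True),
    ([0.80, 8.0], -20, False),
    ([0.30, 4.0],  30, True),
    ([0.90, 9.0], -35, False),
]

query = [0.82, 8.2]
_, slack, ok = nearest(train, query)
print("regression prediction:", slack)    # a numeric value (timing slack)
print("classification prediction:", ok)   # a category (pass/fail)
```

A real flow would swap the nearest-neighbor rule for a fitted model, but the two output types, numeric versus categorical, are exactly the split Hall describes.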
There are other problems as well, Hall added: "Deep learning performs very well at modeling undiscovered nonlinear features, but it is a black box that may require long training times and is difficult to explain."
The performance of machine learning techniques depends on the data used to train them. "Machine learning is an iterative process," Ku said. "A machine learning algorithm produces outputs based on its input data. That output may not be correct, so you must validate it. Once that is done, the data is added to a database, and then you start training again. The cycle continues. At some point, we hope these iterative cycles make the model accurate enough that it makes good predictions when it sees a new case."
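The train, validate, store, retrain cycle Ku describes can be sketched in a few lines. Here the "model" is just a running-mean predictor over toy measurements; the data, tolerance, and structure are invented for illustration, not drawn from any EDA tool.

```python
# Sketch of the iterative cycle: train -> validate -> add to database -> retrain.

database = [10.0, 12.0, 11.0]          # accumulated, already-validated data

def train(data):
    """Fit a trivial model: predict the mean of the data seen so far."""
    mean = sum(data) / len(data)
    return lambda: mean

def validate(model, measurement, tol=2.0):
    """Accept the model if its prediction is within tol of the measurement."""
    return abs(model() - measurement) <= tol

model = train(database)
for measurement in [11.5, 14.0, 13.5]:  # new cases arriving over time
    if not validate(model, measurement):
        database.append(measurement)    # keep the validated data point
        model = train(database)         # retrain, and the cycle continues
```

After the loop, the database has grown by the one case the model got wrong, and the retrained model absorbs it, which is exactly the feedback loop Ku outlines.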
In many cases the data may come from previous designs, but is that enough? "Imagine 2,000 SPICE simulations running in parallel to solve a problem on a chip we have never seen, in a manufacturing process we have never seen," said Solido's Dyck. "We can gather some information from past practice and use it to build models, but we also have real-time data. This is real-time learning, building the model in real time."
Real-time learning raises many other problems. "If there is a problem with the streaming data, or a wrong answer pollutes the model, you need to filter it out or adjust for it, and that is very hard," he added. "We need automatic recovery and repair. When something goes wrong, you have to be able to debug the data."
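The combination Dyck describes, learning from a live stream while filtering polluted samples, can be sketched with an incremental mean/variance estimator (Welford's algorithm) that rejects points far from the running estimate. The stream values and rejection threshold below are invented; a production system would use far more robust statistics.

```python
# Sketch: online learning with filtering of corrupted streaming samples.

class OnlineModel:
    def __init__(self, z_reject=4.0):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0
        self.z_reject = z_reject

    def update(self, x):
        """Fold x into the model unless it looks like polluted data."""
        if self.n >= 5:                       # need a baseline before filtering
            std = (self.m2 / (self.n - 1)) ** 0.5
            if std > 0 and abs(x - self.mean) > self.z_reject * std:
                return False                  # filtered out, never learned
        # Welford's incremental update of mean and variance.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return True

model = OnlineModel()
stream = [1.0, 1.1, 0.9, 1.05, 0.95, 1.02, 50.0, 1.01]  # 50.0 is corrupt
accepted = [x for x in stream if model.update(x)]
```

The corrupt sample never reaches the model, which is the filtering step; the hard part in practice, as Dyck notes, is recovering when a bad sample slips through before the baseline exists.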
But debugging machine learning systems is relatively uncharted territory. Verification techniques, where they exist at all, are few and far between.
Other kinds of learning are involved in the EDA process. "We need to be able to capture knowledge across the design process," said Sorin Dobre, senior director of technology at Qualcomm. "EDA has a very good opportunity here. Supervised and unsupervised machine learning solutions can be extended to design flow optimization. We have senior engineers with 20 years of experience who ensure high-quality designs, but we also need to help designers who are just starting out. We cannot wait five years for them to become fully productive."
Even for experienced designers, the job is getting harder. "In the past, architects designed the interconnect based on their experience, and they made key design decisions about topology and wiring from intuition," said NetSpeed's Mohandass. "But that approach does not work for heterogeneous systems with highly diverse requirements. Given the complexity of the interactions among multiple devices, it is virtually impossible to design an interconnect that is close to optimal, works, performs well, and accounts for every case."
Data sets
"Getting a good data set can be difficult," said Plunify CEO Harnhua Ng. "The learning capability of these tools means that the more an engineering team uses them, the smarter the learning database becomes, shortening design completion time."
So can only those who already hold large data sets use these technologies, or can EDA vendors provide the initial training? "For many machine learning applications in EDA, selecting and training the algorithm's parameters must be done entirely within the computing environment of the design customer or the fab," said David White, distinguished engineer at Cadence. "In these applications, the most challenging task is creating automated training and validation methods that ensure the algorithm works as expected on the target silicon technology. In some cases, more advanced and more complex machine learning methods can deliver higher accuracy, but supporting them is the hardest part. During development, the choice of algorithm and model architecture must be weighed against the required accuracy, the amount of available training data, and other supporting constraints."
It sounds difficult, and it is. "EDA problems feature high dimensionality, high-order interactions, discontinuities, and nonlinearities. They require advanced design-of-experiments techniques, advanced learning, intelligent screening and filtering, and supervised benchmarking infrastructure," said Amit Gupta, president and CEO of Solido. "In addition, EDA problems involve high throughput and massive data volumes, which demand optimized streaming parsers, parallelizable algorithms, efficient and scalable cluster management, automatic recovery and repair, and big-data debugging."
Mohandass gives an example of the data set needed for interconnect design. "The ideal interconnect strategy depends on a very large number of SoC parameters, including the floorplan, wiring, available resources, connectivity requirements, protocol-level dependencies, clocking, process characteristics such as delay and power, and bandwidth and latency constraints. The number of strategies along these different dimensions can run into the hundreds, creating a huge design space."
There are several dimensions to this problem. "Machine learning can be used in EDA," said Hasmukh Ranjan, vice president and CIO at Synopsys, "but to maximize its benefits it should be applied both inside the tools themselves and around the tools, in the design flows that use them."
Qualcomm's Dobre agrees: "Not everything needs to be inside the EDA tool. You can use an independent machine learning solution to drive existing tools."
Shiv Sikand, executive vice president at IC Manage, offers an example: "By analyzing billions of data points from prior tapeouts, we can predict vulnerabilities and the impact of design complexity, staffing, and licenses, and compute the throughput of the current project's server clusters. By identifying bottlenecks in semiconductor design, we gain the foresight to predict and flag potential delays."
We may also need to examine the infrastructure the tools run on. "We also need to consider intelligent storage," Sikand added. "By analyzing the data flows associated with file operations, machine learning techniques such as clustering and regression analysis can continuously improve peer-to-peer networking and cache management, delivering better application performance."
Dobre's team is very familiar with these problems: "We have compute farms containing tens of thousands of CPUs. When you look at the number of designs that must be verified simultaneously, how do you use those resources optimally as resource demand explodes? That takes data management. How do you handle so much data across the design space and the foundry, and extract the knowledge and information the next design needs, so as to shorten the learning cycle?"
The machines that run the machine learning algorithms add another dimension. "Machine learning will reduce design and simulation time through existing complex algorithms," said Sachin Garg, associate director at MarketsandMarkets. "EDA tools may adopt intelligent decision-making, or go further still, but we need better hardware (CPU plus GPU) to run such complex machine learning algorithms efficiently. Each new generation of GPUs delivers massively scaled parallel acceleration and excellent performance for these workloads."
Cadence's White agrees: "Advances in massively parallel computing architectures open the door to what-if-based optimization and verification, allowing effective exploration of the design space and convergence on the most promising decisions."
Success depends on the ability to define the right set of features. "Consider design process variation, for example," Ku said. "If you want to model a probability density function, you need attributes. A feature is an attribute that distinguishes one thing from another. For people, it might be hair color, height, or gender. For design process variation, the features could be the PVT corner, the variation algorithm, and the random variables that define the components. So a feature is whatever matters to a specific problem."
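Ku's point, that a feature is just a distinguishing attribute encoded for the model, can be sketched for the PVT (process/voltage/temperature) case he mentions. The corner names, nominal values, and normalizations below are invented for illustration; a real characterization flow would define these per technology.

```python
# Sketch: turning a hypothetical PVT corner into a numeric feature vector.

PROCESS = {"ss": -1.0, "tt": 0.0, "ff": 1.0}   # slow / typical / fast corner

def features(corner):
    """Map a PVT corner description to a fixed-length feature vector."""
    return [
        PROCESS[corner["process"]],
        (corner["voltage"] - 0.8) / 0.1,        # normalized around a 0.8 V nominal
        (corner["temp"] - 25.0) / 100.0,        # normalized around 25 C
    ]

# A worst-case-ish corner: slow process, low voltage, high temperature.
vec = features({"process": "ss", "voltage": 0.72, "temp": 125.0})
```

Once every corner is a vector like this, the same statistical machinery (density estimation, regression, classification) applies regardless of where the attributes came from, which is exactly why feature definition is the problem-specific step.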
Process variation becomes pronounced at 10nm and below. "It is important as fabs roll out new process technologies," Dobre said. "Even in the digital domain, library elements need to be treated like analog designs. You must verify the design across multiple process corners. How do you achieve high quality without extraordinary resources? Machine learning can deliver a 10X productivity gain, cut weeks of characterization time, and reduce wasted resources. For recognizing patterns that lead to substandard yield, machine learning is an effective method. We see huge potential here, and real economic benefit."
EDA is trying to solve the problem. "In advanced-node designs, new silicon technologies and additional verification requirements bring greater uncertainty, raising the potential risk," White says. "In a traditional design flow, data from previous designs and layouts is not used effectively to help guide the next design. Advances in analytics let us mine prior design data and trends, and use them to guide design decisions early in the flow. The same methods can be used to find and provide the context that trains and drives machine learning engines. Such a solution will likely require large amounts of data and hundreds of machine learning components, all of which must be managed and validated. Once the data is placed in the proper context, machine learning can be used to analyze complex behavior, for example parasitics, electrical effects, and verification, with high accuracy and high speed."
Other areas of design can benefit as well. "We can use it for power or timing estimation for memories or logic gates," Hall said. "That would reduce uncertainty and complement human effort, creating more competitive products."
Another area where solutions are emerging is routing. "In the context of interconnect design, the first step is to identify the combinations of design strategies along each dimension that lead to good solutions across a large number of different SoC designs," Mohandass said. "The next step is to use this information to learn patterns and predict which strategy combinations are most likely to produce a good design."
Similar techniques can be applied to FPGA routing. "Complex FPGA designs with difficult timing and performance closure are good candidates for tools based on machine learning techniques," said Plunify's Ng, adding: "By analyzing past compilation results, machine learning tools can predict the optimal synthesis and place-and-route parameters among trillions of possibilities. They use statistical modeling and machine learning to infer which tool parameters are optimal for a design, extracting insights from data to improve the quality of results."
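The approach Ng describes, ranking tool-parameter combinations by what past compilations achieved, can be sketched as follows. The parameter names, values, and slack numbers are entirely hypothetical, and a simple historical average stands in for the statistical model a real tool would fit.

```python
# Sketch: pick the synthesis/place-and-route parameter combination with the
# best predicted timing slack, based on (invented) historical run results.

from itertools import product
from statistics import mean

# Past runs: (effort, placement_strategy) -> achieved timing slack in ps.
history = {
    ("high", "spread"): [12, 15, 9],
    ("high", "pack"):   [-5, 2, 0],
    ("low",  "spread"): [3, 1, 4],
    ("low",  "pack"):   [-10, -7, -12],
}

def predicted_slack(params):
    """Predict slack for a parameter combination from historical runs."""
    return mean(history.get(params, [0]))

candidates = product(["high", "low"], ["spread", "pack"])
best = max(candidates, key=predicted_slack)
```

With trillions of real combinations, exhaustive enumeration gives way to learned models that generalize from the runs actually tried, but the ranking structure is the same.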
Trusting the results
But compared with other machine learning applications, design faces more obstacles. "If there is a risk that a solution is too aggressive or too conservative, people will not adopt it," Dyck explained. "Machine learning tools are big estimators. You cannot simply ask people to trust them. So we need modeling techniques whose accuracy can be made explicit. Very little of this technology exists; you have to invent it. We need active learning methods that can progressively discover the relevant regions, which are often the worst-case regions. Show me the areas that could cause the chip to fail, and give me clear insight into them, so I can go straight to those areas. Targeting the problem domain is very important."
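The active-learning idea Dyck raises, spending the sampling budget where failures lurk rather than uniformly, can be sketched with a toy one-dimensional example. The function `expensive_sim` is an invented stand-in for a simulator run, and the shrinking-step refinement is a deliberately simple substitute for a real active-learning strategy.

```python
# Sketch: coarse screen first, then actively refine around the worst case.

def expensive_sim(x):
    """Toy stand-in for a SPICE run: failure margin shrinks near x = 0.87."""
    return (x - 0.87) ** 2

# Coarse uniform screen of the design space [0, 1].
samples = [(i / 10.0, expensive_sim(i / 10.0)) for i in range(11)]

# Active refinement: probe around the current worst case with a shrinking
# step, concentrating simulations where the margin is smallest.
step = 0.05
for _ in range(10):
    worst_x = min(samples, key=lambda s: s[1])[0]
    for dx in (-step, step):
        x = min(max(worst_x + dx, 0.0), 1.0)
        samples.append((x, expensive_sim(x)))
    step /= 2

worst_x, worst_margin = min(samples, key=lambda s: s[1])
```

The uniform screen alone never gets closer than 0.03 to the true worst case at 0.87; the refinement loop, which spends only 20 extra "simulations" near the current worst point, closes in on it. That concentration of effort on the failure region is what makes the technique attractive when each sample is a costly simulation.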
Dyck also noted another obstacle EDA faces: "If you cannot prove an answer is correct, people will not accept it. So you need to design algorithms for verifiability. Verification has to be built in as part of the technology, so that when you give an answer, you can show at run time that it is right."
Machine learning has begun to penetrate EDA and the design process. "Machine learning has started to play an important role in EDA," Gupta said. "It also has the opportunity to deliver disruptive technological breakthroughs for semiconductor problems."
But there is still a long way to go. "What we see today is the tip of the iceberg," Ku said. "We hope that in the future EDA can stop providing mere data. Data is good, of course, but what we really need are decisions. What is needed is a layer between the data and the decision-making, where machine learning algorithms can learn from the data to understand how decisions should be made. EDA is perfectly positioned for that job."
To maintain trust, the steps must be small. "Artificial intelligence and machine learning can give a company a unique position, but using artificial intelligence must never reduce the accuracy of the algorithms," says Synopsys' Ranjan.