Prof. Giovanni De Micheli has received two honours in the last two months: the IEEE Gustav Robert Kirchhoff Award and election to the Swiss Academy of Engineering Sciences. In the same period he has given keynote speeches at HiPEAC and DATE.

HiPEAC took the opportunity to interview Prof. De Micheli, to reflect on his long career and on the future of computing in Europe.

Can you tell us a bit about yourself?

I grew up in Milan, Italy, and I was always fascinated by science and technology. My dream as a child was to be a physicist, and so I enrolled in a nuclear engineering program, because it gave me a chance to study physics in depth. I learnt notions of solid-state and quantum physics, thermodynamics and energy, as well as circuits for data acquisition and control, that have been useful throughout my career. During my studies at Politecnico di Milano, I had two important experiences. First, I had an internship at CERN in Geneva, Switzerland, in the group of Carlo Rubbia, where I was exposed to complex experiments in particle physics and, in particular, to complex electronic system design for big data acquisition and evaluation. Next, I met Alberto Sangiovanni-Vincentelli as a student in his course on circuit theory at Politecnico; he later became my adviser at U.C. Berkeley and taught me how to do rigorous research. Indeed, after receiving my degree in nuclear engineering, I became interested in very large-scale integration (VLSI) circuits and their design. Through a Fulbright scholarship I arrived at U.C. Berkeley, still with little knowledge of what was ahead of me but with a strong basis in the fundamentals. Berkeley was a crossroads for research on circuits and systems.

My interests focused on how to design circuits and systems with computers. Electronic Design Automation (EDA) was in its infancy, mainly focused on analyzing circuits by means of simulation, and facing the challenge of ever larger circuit sizes. It was also the beginning of computer-aided synthesis, with the first tools to create physical layouts of Programmable Logic Arrays (PLAs) from Boolean specifications, thus involving logic optimization and structured layout generation. It was exciting and enriching to work with Alberto Sangiovanni-Vincentelli, Robert Brayton and Richard Newton, and I am grateful to them for the tremendous opportunity of learning this field. After my graduation, my work at IBM, Stanford and EPFL revolved around circuit and system design in many flavors. And still today, I can say that we have not yet explored all dimensions of electronic design, given the unprecedented opportunities offered by new materials, technologies and physical properties (e.g., superposition and entanglement in quantum devices) and the growing demand for computing performance.

How has the EDA landscape evolved since you started your career?

In the early days, EDA was known as computer-aided design (CAD), as the collective vision was to have tools supporting a human designer who is the decision maker at all levels of design. Then synthesis technology came along, with specific flows for the automated design of circuits from high-level specifications, requiring limited human intervention in the design process. Eventually, artificial intelligence (AI) techniques will drive circuit design, leveraging the learning experience from previous chip designs. In essence, AI-driven tools will raise the level of abstraction at which a human designer specifies an integrated system.

On the other hand, I also think that the devil is in the details, and so chips that will outperform their predecessors in power / performance / area (PPA) will require innovation, that is, breaking habits and traditions inherited from past design experience. This means that human creativity will remain necessary, even though intelligent EDA tools will be needed to cope with the very large number of design choices. The massive use of AI tools and big-data repositories will also come at the cost of much more computation and energy consumption. Eventually, end-users will have to reconcile the desire for intelligent products with the necessity of protecting the planet.

What was your most rewarding achievement in EDA?

During the 18 years spent at Stanford University I worked mainly on electronic design automation and in particular on logic and high-level synthesis. In the late eighties, synthesis from high-level specifications (like C or SystemC programs) was in its infancy. The Olympus synthesis system that my group released from Stanford was an example of an open-source package that addressed many issues, from high-level language compilation to technology mapping. Synthesis of C models with pointers was fully developed and patented, thus showing the importance of complete high-level models for design.

Circuit sizes and performance exploded in the nineties, with computational systems exploiting multiple programmable cores. Support for structured interconnect design became increasingly important. The notion of “routing packets and not wires”, as proposed by William Dally, led to the concept of the Network on Chip (NoC) as the main means of supporting communication in a multi-processor. NoCs draw design methods jointly from the communication and circuit design fields, with low latency and low power consumption as important constraints. Designing a high-performance NoC is a very challenging task, and my work with Luca Benini was to create an environment and tools for automated NoC design. The original research at Stanford bloomed and impacted academia and industry enormously. An entire community of researchers addressed the various aspects of NoC design and optimization. The xPipes model and xPipeCompiler became the backbone of the INOCS start-up, whose technology was later acquired by Arteris, the dominant provider of EDA tools for NoC design. Today, all major chip designs incorporate NoCs of various flavors and complexity, possibly named differently but essentially using the same principles. It is very rewarding to me to know that most electronic chips embody a network designed by leveraging models, methods and tools based on an engineered and improved version of our original prototype at Stanford.
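To make the idea of “routing packets and not wires” concrete, the sketch below shows dimension-ordered (XY) routing, a textbook algorithm for 2-D mesh NoCs: a packet first travels along the X dimension until its column matches the destination, then along Y. This is an illustrative example of the general principle, not the actual xPipes implementation.

```python
# Illustrative sketch: dimension-ordered (XY) routing on a 2-D mesh NoC.
# Each core sits at a (x, y) router; a packet hops router to router
# instead of using a dedicated point-to-point wire.
def xy_route(src, dst):
    """Return the list of (x, y) router hops from src to dst."""
    x, y = src
    path = [(x, y)]
    while x != dst[0]:                 # resolve the X dimension first
        x += 1 if dst[0] > x else -1
        path.append((x, y))
    while y != dst[1]:                 # then resolve the Y dimension
        y += 1 if dst[1] > y else -1
        path.append((x, y))
    return path

# Example: a packet from core (0, 0) to core (2, 1) takes 3 hops.
hops = xy_route((0, 0), (2, 1))
# hops == [(0, 0), (1, 0), (2, 0), (2, 1)]
```

Because every packet resolves dimensions in the same fixed order, XY routing is deadlock-free on a mesh, which is one reason simple dimension-ordered schemes are popular in practice.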

What happened after you left Stanford for EPFL?

I moved to Switzerland after 25 happy and fruitful years in the United States, mainly in California, driven by the desire to taste different aspects of circuit research. EPFL in Lausanne, Switzerland, gave me a wonderful opportunity to grow technically, as well as an office with a view of Mont Blanc. I wanted to return to the basics of the physics of circuits and get my hands dirty fabricating something new. I shared the joys and frustrations of the fabrication facility with my colleague Yusuf Leblebici. We embarked on the design of silicon nanowire circuits, the progenitors of today’s nanosheets used by advanced commercial fabs. Our goal was not to compete with industry, where our chances were slim, but to explore something new. We crafted the double-gate polarity-controlled transistor, i.e., a unique transistor that can be electrically programmed to be of n or p type. Having achieved working devices, we took on the challenge of designing an EDA tool chain for mapping logic specifications onto a structured array of cells that could be electrically programmed to be NAND, NOR, XOR or majority gates. The best outcome of this crossbreeding of interests between EDA and device design was the discovery of design algorithms for exploiting majority gates.
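A small sketch of why the majority gate is such a versatile primitive: the three-input majority function MAJ(a, b, c) outputs 1 when at least two inputs are 1, and fixing one input to a constant degenerates it into an ordinary AND or OR. This is the standard definition, not the actual EPFL toolchain.

```python
# The three-input majority function at the heart of majority-based logic.
def maj(a: int, b: int, c: int) -> int:
    """MAJ(a, b, c) = ab + ac + bc: 1 iff at least two inputs are 1."""
    return 1 if a + b + c >= 2 else 0

# Tying one input to a constant specializes the gate:
def and2(a, b):
    return maj(a, b, 0)   # MAJ(a, b, 0) = a AND b

def or2(a, b):
    return maj(a, b, 1)   # MAJ(a, b, 1) = a OR b

# Exhaustive check over all input pairs.
for a in (0, 1):
    for b in (0, 1):
        assert and2(a, b) == (a & b)
        assert or2(a, b) == (a | b)
```

Because AND and OR are just special cases of MAJ, synthesis algorithms that manipulate majority gates directly subsume classical AND/OR-based optimization, which is what makes a programmable majority cell attractive.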

Along a parallel path, I became interested in sensing devices, and in particular in biosensing applications. With my colleague Sandro Carrara, we developed transistors that can be turned on by the presence of specific analytes, e.g., glucose, lactose or antigens. The final challenge was to realize sensing devices in silicon, with specific additional layers, on top of the standard silicon used for computation. We addressed issues such as sensing multiple analytes (e.g., ions) and realizing circuits that enable us to disambiguate a response that may be partially corrupted by crosstalk at the fluidic level. Designing biosensors has also been an entry point into electronic systems for precision medicine, and thus into understanding the wide untapped opportunities to better society by means of smart bio-circuits and systems.
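A minimal sketch of the disambiguation idea, assuming a simple linear cross-sensitivity model (the actual sensor physics and the EPFL circuits are more involved, and the numbers below are hypothetical): if each sensor picks up a known fraction of the other analytes, the readings are y = S x, and inverting the cross-sensitivity matrix S recovers the true concentrations x.

```python
# Illustrative sketch, not the actual sensor model: recover true analyte
# concentrations from crosstalk-corrupted readings, assuming y = S x.
def disambiguate(y, S):
    """Solve the 2x2 linear system S x = y by Cramer's rule."""
    (a, b), (c, d) = S
    det = a * d - b * c
    assert det != 0, "cross-sensitivity matrix must be invertible"
    x0 = (y[0] * d - b * y[1]) / det
    x1 = (a * y[1] - c * y[0]) / det
    return [x0, x1]

# Hypothetical calibration: each sensor picks up 10% of the other analyte.
S = [[1.0, 0.1],
     [0.1, 1.0]]
true_x = [2.0, 5.0]                      # true concentrations
readings = [S[0][0] * true_x[0] + S[0][1] * true_x[1],
            S[1][0] * true_x[0] + S[1][1] * true_x[1]]
recovered = disambiguate(readings, S)    # close to [2.0, 5.0]
```

The same linear-deconvolution pattern scales to more analytes with a general linear solver, provided the cross-sensitivities are calibrated and well conditioned.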

Eventually, biosensors enabled us to design specific embedded systems that can be used for the medical monitoring of patients with chronic or acute conditions, and possibly for controlling actuators, as in the treatment of diabetes. Medical cyber-physical systems operate under many constraints. Current EDA tools can support only part of the design; extending them to support heterogeneous components (e.g., via multi-level simulation) and to verify correctness and safety properties is still a wide and exciting area of research.

In Switzerland, you started a major program for bettering health management and the environment. Can you tell us about the extent of this program?

Shortly after I arrived in Switzerland, I wanted to combine the depth of research on EDA, circuits and systems with the breadth of a wide objective. Circuits, systems and the associated software are applicable in the broader context of bettering health and monitoring the environment. This goal requires the cooperative effort of many scientists and the wide availability of funds. As research in this field has no short-term return on investment, accessing pre-competitive funds from the government (with no strings attached) is a requirement. I was fortunate to receive the full support of former EPFL President Patrick Aebischer to make a proposal to the Swiss government. The application was very successful, and the project became a national program with a line item in the national budget. It funded research over ten years for groups of scientists in Swiss universities and hospitals. Industrial partners joined without government contributions. The program was called Nano-Tera.ch, where Nano relates to the use of advanced nanotechnologies and Tera evokes the large complexity of system design.

Health monitoring and environmental protection are imperatives of modern society, to provide people with the means to live better, longer and in a safer environment. Switzerland has long pursued a policy of supporting and improving the quality of life in the country. Technologies that promote better health can reach the combined goals of higher-quality services and lower operating costs. Given the rising pressure of medical expenditure on national budgets, a technological rationalization of healthcare is very important and cannot stem from the private sector only, whose objectives are often to preserve or increase revenues (e.g., disease treatment is sometimes more economically rewarding than prevention).

Environmental protection is generally supported by public funding, as there is still limited awareness and/or motivation for private investment in this area. Protection includes, for example, the means to prevent disasters triggered by rock or ice movement, as well as air and water pollution. With global warming, increased environmental protection is necessary, and advanced technologies in sampling, sensing and communication can be important enablers.

Nano-Tera was conceived and is managed to achieve these ethical goals. This has important ramifications. First, from a political perspective, it is both good and necessary that public funds are spent on improving the conditions we live in. It is equally important that the new generation of students and researchers sees the importance of science and engineering in bettering the world. Next, the ethical dimension of research is an important attractor for students to pursue studies and careers in the engineering sciences, as well as for retaining engineers in this important domain. It is indeed crucial to train a young generation of engineers and scientists who believe that the outcome of their work, as an engineering product per se, is at least as important as the revenue that it produces. Excessive stress on financial gain has too often driven youngsters out of the engineering sciences and into the business world. It is thus another important imperative of our society to teach and promote engineering as a way to construct, renew and improve our societal infrastructure.

Success stories of this program to which I directly contributed include realizing biosensors and telemedicine chains for monitoring people with chronic conditions, as well as enabling ultrasound telesonography for fast diagnosis in remote areas.

What, for you, are the strengths and weaknesses of European technology innovation?

New breakthroughs, such as intelligent surfaces and human-computer interaction, stem from the combination of different aspects of science and technology. Europe has multicultural roots and has always been home to scientists with a Renaissance mind and a strong cultural background who are capable of looking far ahead, beyond the limits of short-term returns on investment. I think Europe has the capability to innovate and promote technologies that can leapfrog the present, precisely because we educate students to concern themselves with the intrinsic value of engineering products per se, and not just the economic opportunities. In a similar vein, the European Research Council (ERC) grant programme is highly successful because it targets high-risk, high-reward research proposals, which are key for the advancement of science and technology.

There has been a sequence of inventions and products that originated in Europe and are part of today’s fast technological and societal evolution, from the World Wide Web (developed at CERN) to wireless telephony (radio was born in Bologna) and low-power electronics (pioneered in Neuchâtel). Unfortunately, Europe lost a market where it was dominant (wireless telephony) due to the lack of an appealing interface and a failure to provide related services to clients. This setback is all the more serious because we lost the chance of using our inherent multicultural nature to promote communication across different cultural backgrounds. Looking to the future, the strongest European asset is the profound culture that defines and divides us, and that distinguishes us from a melting pot. I believe that leveraging this factor into products that cater to a multicultural society can be a major enabler for our industry to grow and prosper.

What needs to be done to build our hardware ecosystems?

An ecosystem involves multiple partners and objectives. Large companies should promote the common good not only by providing financial support for targeted research but also by enabling technical events, participating in person and co-advising doctoral students. Start-ups are often the best instrument for promoting technology, eventually benefitting the large corporations. Academic research in Europe has pinnacles of excellence and trains graduates with a solid background. Nevertheless, the hardware ecosystem in Europe is still weak and depends on overseas technology, despite excellence in many domains, such as the telecom and transportation sectors. Fostering strong circuit and architecture research groups in universities must be a priority in the coming years.

The visibility of European technology products is limited among university students. We rightly focus our educational principles on teaching the fundamentals, but we could infuse more enthusiasm into youngsters by showing the impactful outcomes of research. The hardware ecosystem is much stronger in East Asia, where an engineering culture is embraced. We Europeans very often focus on limiting factors such as starting salaries, and we don’t sufficiently value the pride of achieving top technical solutions. Overall, I think that European leaders in industry and academia should spend much more time and effort explaining their objectives, collaborating on educational programmes that value science and technology, and making sure that quality is valued above all other metrics.