Spotting the Next Big Thing in Semiconductors
Welcome to SemiLiterate, a guide to the chip industry through the lens of public policy.
BLUF: this post describes strategies for identifying emerging trends and technologies in the semiconductor industry. It observes that the easiest way to spot where this industry (or really any tech industry) is heading is to watch where smart investors and organizations are spending their money, where smart people are investing their time, what the big tech companies are doing (or not doing), and how enthusiastic hobbyist communities are about a product or idea. Using this approach, this post suggests that we should all be paying a lot more attention to developments in the chip design world (photonics, RISC-V, and EDA), and to whatever it is that Jim Keller is working on.
Introduction
There are several traditional metrics used to track and monitor emerging technologies:
Patent applications (who is inventing new things?)
Publications (who is writing journal articles? where are they?)
Mergers + Acquisitions (what emerging startups do incumbent companies think present threats/opportunities?)
Venture Capital (where does the “smart” money go?)
Public R&D + Regulatory Interest (what is the government betting on?)
Hobbyist interest (how enthusiastic are people on Reddit and similar communities about an idea?)
There are also several traditional metrics used to evaluate emerging technologies:
Technology readiness level (determine technology maturity/viability)
Potential for growth (is there a market for the technology?)
Potential for adoption (will people care about the technology?)
Speed of growth/adoption (is the technology poised to become exponentially more popular?)
Criticality (does being first confer asymmetric advantage, is winning zero-sum?)
All of these metrics are useful, but some are paywalled (M&A activity databases), some are too technical for the lay observer (patent databases), some are qualitative (criticality), and some require collaborative work (TRL assessments). For individuals interested in tracking emerging trends in the semiconductor industry, an eye for synthesizing seemingly idiosyncratic developments is essential. That is what this post is about.
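To make the tracking half of this concrete, here is a minimal sketch in Python of how one might roll these six signals up into a single watchlist score. Every weight, field name, and input value below is a hypothetical of mine for illustration, not a validated methodology; in practice the synthesis is far more qualitative than any weighted sum.

```python
from dataclasses import dataclass

# Hypothetical, illustrative signal weights -- not a validated methodology.
WEIGHTS = {
    "patents": 0.15,        # patent application momentum
    "publications": 0.15,   # journal/conference activity
    "ma_activity": 0.15,    # mergers & acquisitions interest
    "vc_funding": 0.20,     # where the "smart" money goes
    "public_rnd": 0.15,     # government R&D and regulatory interest
    "hobbyist_buzz": 0.20,  # enthusiasm in hobbyist communities
}

@dataclass
class TechSignals:
    name: str
    patents: float          # each signal normalized to the range 0-1
    publications: float
    ma_activity: float
    vc_funding: float
    public_rnd: float
    hobbyist_buzz: float

    def score(self) -> float:
        """Weighted sum of normalized tracking signals."""
        return sum(getattr(self, k) * w for k, w in WEIGHTS.items())

# Toy inputs purely for illustration -- not real data.
watchlist = [
    TechSignals("silicon photonics", 0.6, 0.7, 0.3, 0.8, 0.5, 0.4),
    TechSignals("RISC-V", 0.5, 0.6, 0.7, 0.7, 0.6, 0.9),
]
for tech in sorted(watchlist, key=lambda t: t.score(), reverse=True):
    print(f"{tech.name}: {tech.score():.2f}")
```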
Background
Before getting into the weeds, some observations Paul Graham made about ten years ago on the nature of venture capital investing are worth keeping in mind:
The two most important things to understand about startup investing are that:
(1) effectively all the returns are concentrated in a few big winners, and
(2) the best ideas look initially like bad ideas.
These are instructive observations for clarifying thinking about the future of the semiconductor industry. Until very recently, any idea that diverged radically from Moore’s Law was, by definition, a bad idea. This industry has ridden an innovation gravy train for 65 years. That gravy train has had a few big winners.
But today, these incumbent winners (e.g., Intel) are generally faltering or have been eclipsed by new winners (e.g., TSMC) riding the same gravy train. Many companies and governments think the solution to their woes is simply finding a way back onto this train, seemingly without caring that the end of the tracks is in sight. This is the sort of short-term thinking characteristic of an industry that has operated, with great success, on 18-24 month innovation cycles for six-plus decades.
This short-term thinking won’t invent the future. To invent the future, companies and governments need to survey the landscape and ask, “How do we fund ‘bad’ ideas that result in a few big winners?” This post is about long-term thinking for a short-term industry: how to accelerate the arrival of a post-Moore’s Law future.
How Do We Fund “Bad” Ideas That Result in a Few Big Winners?
A few assumptions and observations up front:
First, the future of the semiconductor industry is not going to be invented by any of its incumbent leaders. It’s hard to develop new things in big organizations. From ion implantation in the 1970s, to chemical mechanical polishing in the 1980s, to copper interconnects in the 1990s, to FinFETs in the 2000s and EUV in the 2010s, paradigm-shifting innovations have generally not started with incumbent leaders. Semiconductor incumbents are quick adopters, not necessarily adept innovators.
Second, many of the paradigm-shifting innovations in the semiconductor industry have come from parts of the industry perceived as “outliers.”
Third, big winners are the desired outcome: governments want national champions, companies want monopolies, and both pursue policies to that end.
Fourth, big winners and their breakthroughs can emerge anywhere in the “stack” that composes the semiconductor industry and change the rest of it.
Historical Interlude
If you had asked anyone in the 1980s about the utility of using a high-powered laser to incinerate falling droplets of molten tin to generate a specific wavelength of light, which would then be used to impart minute circuitry on wafers in a vacuum chamber, they would have probably said “thanks but no thanks.” Yet that is exactly when the history of extreme ultraviolet lithography (EUV) began: development started in the 1980s with fundamental research in the US and Japan on the use of soft X-rays, leading to the first demonstration of the technology in 1986. In the 1990s and early 2000s, NTT in Japan, Bell Labs and Lawrence Livermore National Laboratory in the US, and the University of Twente in the Netherlands pushed the research further. ASML (Netherlands) then sought to develop and commercialize EUV in partnership with institutions like IMEC (Belgium) and corporations including Intel (US), Samsung (South Korea), and TSMC (Taiwan). Today ASML enjoys a pure monopoly on the sale of EUV tools. Every company in the world wants to find itself in this position 30 years from now. The question is where to start looking.
Step One
Step one is identifying “bad” ideas that have the right ingredients. Peter Thiel was interviewed by Mike Pompeo and Matt Pottinger in April and had some interesting observations on the semiconductor startup ecosystem:
One of the very strange dynamics in Silicon Valley is people don’t do very much with semiconductors anymore. I am a venture capitalist. I get pitched on semiconductor startups every few years, but then it’s always, ‘I haven’t done much with it, I don’t know what is going on and it seems expensive and complicated,’ and I think one of the weird problems with 20 years of intellectual property theft, and where IP doesn’t really have as much value as you used to, is that you learn not to invest in things like that… I think we are still ahead of [China] in lots of ways on the [semiconductor] design side. It is one of the places where we can do more to block China than they can do to block us, but we have lost a lot of ground in the last 20 years.
For a famously iconoclastic thinker, this is an odd blind spot. If your priors lead you to believe investing in semiconductors is a bad idea, shouldn’t you then be on the lookout for exceptions to your rule? After all, the best ideas initially look like bad ideas. And it sounds like semiconductor investments are, in his opinion, very bad ideas.
This is a decent starting point when looking for ideas about the post-Moore’s Law future: if at any point in the future Peter Thiel invests in a semiconductor startup, that seems like a signal that the startup must really be on to something. To generalize: look for outlier investments by successful investors who have a limited history of (or known skepticism toward) betting on this industry.
One example is Luminous Computing, a small company (46 employees on LinkedIn at the time of writing) that humbly states its mission is “to create a single AI training and inference chip that will outperform the world's largest supercomputer.” They’re developing a silicon photonic AI chip (a chip that processes information via light), have been in stealth mode for years, have advertised employment vacancies indicating they have already fabricated test chips, and landed a seemingly improbable investment from Bill Gates (who is not known for big bets on semiconductor hardware). Perhaps we should all be paying more attention to photonics these days.
Step Two
Sticking with this theme of “follow the smart people”: when looking for signs of a post-Moore’s Law future, we should all simply pay more attention to wherever Jim Keller is working and whatever he is working on. The idea of a “celebrity” chip designer is laughable, but he is as close as it gets. Jim Keller is one reason AMD is enjoying its renaissance and the reason Tesla’s full self-driving chip exists. Before his stint at AMD, he helped get Apple’s silicon division off the ground, and after his departure from Tesla he spent two years trying to fix Intel’s chip division. And then six months ago he started working for Tenstorrent.
If you’ve never heard of Tenstorrent, you’re far from alone. Jim thinks “it is the most promising chip architecture out there,” which is the sort of praise you’d expect from the company’s president/chief technology officer. But it’s also a huge endorsement because, as his work history shows, he doesn’t waste his time on lost causes.
Tenstorrent is an AI chip design company that creates and designs silicon for machine learning, uses GlobalFoundries to fabricate the hardware, and works with partners to create solutions (as in, chips + system + software + optimizations for that customer). What is most interesting is that soon after Keller joined the company in January 2021, Tenstorrent licensed (in April 2021) a general-purpose, RISC-V-based CPU design from SiFive for its AI system-on-chip. Then in May 2021 Tenstorrent secured another $200 million of funding, valuing it at $1 billion. And in early June, Intel made a $2 billion offer for SiFive.
These developments are a huge vote of confidence for RISC-V, an open-source instruction set architecture (ISA) that forms the basis of chip designs. When the leading chip designer of his generation chooses your ISA as the backbone of his startup’s product, and the leading firm commercializing that ISA gets a $2 billion offer from Intel, all in the same month, the signal is hard to miss. For a long time, open RISC architectures have looked like a “bad” idea (the RISC concept dates to the 1980s; RISC-V itself to 2010 at UC Berkeley). Maybe not so much any more.
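For readers wondering what an instruction set architecture actually is: it is the published contract between software and silicon, specifying how instructions are encoded and what they do, which is precisely why anyone can implement RISC-V royalty-free. The toy Python sketch below, my own illustration covering just two real RV32I instructions, decodes and executes genuine RISC-V machine code to make the point that the ISA itself is just a specification.

```python
# Toy RV32I interpreter: decodes two real RISC-V instructions (ADDI, ADD).
# Illustrative only -- a real core implements the full spec in hardware.

regs = [0] * 32  # 32 general-purpose registers; x0 is hardwired to zero

def sign_extend(value: int, bits: int) -> int:
    """Sign-extend a `bits`-wide field to a Python int."""
    return value - (1 << bits) if value & (1 << (bits - 1)) else value

def execute(instr: int) -> None:
    opcode = instr & 0x7F
    rd     = (instr >> 7) & 0x1F
    rs1    = (instr >> 15) & 0x1F
    rs2    = (instr >> 20) & 0x1F
    # (funct3/funct7 checks omitted for brevity in this toy)
    if opcode == 0x13:            # ADDI (I-type): rd = rs1 + imm
        imm = sign_extend(instr >> 20, 12)
        result = regs[rs1] + imm
    elif opcode == 0x33:          # ADD (R-type): rd = rs1 + rs2
        result = regs[rs1] + regs[rs2]
    else:
        raise NotImplementedError(f"opcode {opcode:#x}")
    if rd != 0:                   # writes to x0 are discarded per the spec
        regs[rd] = result & 0xFFFFFFFF

execute(0x00500093)  # addi x1, x0, 5   -> x1 = 5
execute(0x00A00113)  # addi x2, x0, 10  -> x2 = 10
execute(0x002081B3)  # add  x3, x1, x2  -> x3 = 15
print(regs[1], regs[2], regs[3])  # prints: 5 10 15
```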
(Unrelated: if you have the time, listen to this podcast interview of Keller. The guy is so smart he can actually explain how chips work to normal humans.)
Step Three
Moving on from “follow the smart people” as we look for bad ideas that result in big winners, the next stop is “follow the smart organizations that have a history of funding high-risk, high-payoff ideas and incentivizing fast failure.” Or: look at what DARPA’s Microsystems Technology Office (MTO) is doing. The MTO funded the research breakthroughs of the late 1990s and early 2000s that led to the FinFET, allowing scaling to 14nm and below. DARPA helpfully describes the MTO’s current focus in very straightforward detail in the context of its Electronics Resurgence Initiative (ERI). ERI’s investments are focused on:
increasing information processing density and efficiency,
accelerating innovation in artificial intelligence hardware to make decisions at the edge faster,
overcoming the inherent throughput limits of 2D electronics,
mitigating the skyrocketing costs of electronic design,
overcoming security threats across the entire hardware lifecycle, and
revolutionizing communications (5G and beyond).
Within each of these categories, DARPA is sponsoring one or more sub-projects. For example, within the “Mitigating the Skyrocketing Costs of Electronic Design” category, DARPA is running four programs. The two program managers overseeing them come from Sandia National Laboratories and Mentor Graphics (now owned by Siemens), one of the “big three” electronic design automation (EDA) companies. Their primary focus is on accelerating hardware design cycle times, developing modular/reusable IP blocks, establishing a system to verify the security of electronics designed in open-source hardware IP ecosystems, and developing a program to fully automate the design of machine-learning-optimized chips based on real-world inputs. All four of these projects seem worthy, but it’s hard to know their viability until the period of performance is over.
Step Four
This does, however, lead to a fourth point: “look for areas where business and government are both spending their time and resources” as a signal that the environment might be changing in favor of the “bad” ideas. Challenging an oligopoly is generally a terrible idea. And Cadence, Mentor, and Synopsys have controlled the EDA market for decades, simply buying up emerging competitors before they had a chance to eat into their market share.
But U.S. export controls on EDA tools are accelerating the Chinese industry’s development of indigenous EDA tools, and the Chinese government is certainly willing to spend whatever it takes to help develop a commercially viable indigenous semiconductor ecosystem that doesn’t rely on any U.S. components. Cadence, Mentor, and Synopsys will likely slowly lose market share; it’s just a matter of time.
More interestingly, it appears Google decided it was time to partially cut out the oligopoly: Google apparently developed an in-house EDA tool to design one of its in-house chips. This is indicative of a broader trend: big technology companies are getting into the hardware game, and not in a half-hearted or one-off manner. Apple Silicon is competing with some of the leading semiconductor companies in the world, Google’s tensor processing unit changed the artificial intelligence hardware landscape, and Microsoft, Facebook, and Amazon are all placing their hardware bets too. Big tech companies can afford to invest time and money in places where they perceive market inefficiencies; EDA is a great example, and one that until recently would have been seen as a “bad” idea to invest resources in. These are, after all, large, profitable businesses precisely because they studiously avoid massively bad bets. For example, while it is notable that Google and DARPA are keen on innovating in the EDA space, it remains exceedingly unlikely that any big tech company gets into the foundry business any time soon.
These developments are arguably accelerating. A few weeks ago, researchers from Google announced in Nature that they had used AI to partially design Google’s next AI chip:
The AI now needs fewer than six hours to generate chip floorplans that match or beat human-produced designs at power consumption, performance, and area. Expert humans typically need months of iteration to do this task.
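To give a feel for the problem Google’s AI is solving: floorplanning means placing circuit blocks so that, among other objectives, the wiring between connected blocks stays short. Google’s Nature paper attacks the real version with deep reinforcement learning; the toy Python sketch below instead uses classical simulated annealing on a made-up four-block netlist, purely to illustrate the flavor of the search problem (every block, net, and parameter here is hypothetical).

```python
import math
import random

random.seed(0)

# Toy "floorplanning": place blocks on a grid to minimize total wirelength
# between connected blocks. Blocks may overlap in this simplified version.
GRID = 8                                         # 8x8 placement grid
nets = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]  # hypothetical connectivity
positions = [(random.randrange(GRID), random.randrange(GRID)) for _ in range(4)]

def wirelength(pos):
    """Sum of Manhattan distances over all nets (a common proxy objective)."""
    return sum(abs(pos[a][0] - pos[b][0]) + abs(pos[a][1] - pos[b][1])
               for a, b in nets)

temp = 5.0
cost = wirelength(positions)
for step in range(20000):
    block = random.randrange(len(positions))
    old = positions[block]
    positions[block] = (random.randrange(GRID), random.randrange(GRID))
    new_cost = wirelength(positions)
    # Accept improvements always; accept regressions with Boltzmann probability.
    if new_cost <= cost or random.random() < math.exp((cost - new_cost) / temp):
        cost = new_cost
    else:
        positions[block] = old                   # undo the move
    temp *= 0.9995                               # cool the annealing schedule

print(f"final wirelength: {cost}, placement: {positions}")
```

Even this toy hints at why the real task is brutal: actual floorplans involve thousands of macros and millions of standard cells, plus constraints on congestion, power, and timing, which is why months of expert iteration (or hours of a learned policy) are needed.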
Conclusion/Step Five: Tying It All Together
What could patent applications, publications, and M&A activity have told us about the technology readiness level of EUV in the 1990s and 2000s? Should we all have guessed it was going to work out in 2012, when ASML bought Cymer?
Recall the Historical Interlude above about how ASML’s EUV system started. The ingredients look very logical in hindsight: fundamental research on an existing technology, conducted through industry/government partnerships, pushed the research up the technology readiness level “stack.” That research pulled the technology into areas where it had not previously been applied until, eventually, the good idea became so compelling that a company decided to make a relatively big bet to commercialize it. That company then had to apply a decade of patient capital, but it ended up inventing the future and is enjoying the returns. Perhaps the developments we’re seeing today in silicon photonics, RISC-V, and EDA will look as logical in hindsight.
There are lessons to be learned, but in the short term we should all spend more time watching for iconoclastic investments, tracking where the smart people work (and what they work on), observing what the smart organizations are funding, and analyzing where incumbent tech companies are burning their cash for signs of an approaching post-Moore’s Law future.