WITH EVERYONE SO mesmerized by silver-tongued AI chatbots, it’s easy to forget that most flashy breakthroughs in science and technology depend on much less glamorous advances in the fundamentals of computing—new algorithms, different computer architectures, and novel silicon chips.
The US has largely dominated these areas of innovation since the early days of computing. But academics who study advances in computer science say in a new report that by many measures, the US lead in advanced computing has declined significantly over the past five years—especially when measured against China.
It’s well established that America no longer manufactures many of the world’s most advanced computer chips—a process that involves carving insanely intricate patterns into silicon with devilishly difficult techniques. Apple and many other companies instead outsource that work to TSMC in Taiwan or Samsung in South Korea. This is why the US government created the CHIPS Act—a $52 billion package aimed at revitalizing domestic chip-making and related technologies.
The report—from MIT; the Council on Competitiveness, a think tank; and Silicon Catalyst, an investment firm—shows that America’s share of the world’s most powerful supercomputers has also fallen a lot over the past five years.
And while the US has traditionally dominated the development of new computer algorithms, some measures of algorithmic innovation—such as the Gordon Bell Prize, awarded to outstanding scientists working on advanced computing—indicate the US has lost its edge to China. The report sums up the overall trend in its pointed title: “America’s lead in advanced computing is almost gone.”