← all news

AI as Normal Technology

AI · 1 year ago · source (normaltech.ai)

Arvind Narayanan and Sayash Kapoor's long essay argues that AI is best understood as normal technology, comparable to electricity or the internet, rather than as a coming superintelligence. The point is not that AI is unimportant; it is that impact is gated by diffusion, not by raw capability. They separate invention, innovation, and adoption, and note that in safety-critical settings the adoption lag can run for decades. Their concrete example is Epic's sepsis prediction tool, which scored well in internal testing but missed about two-thirds of sepsis cases once deployed in hospitals, because the real environment is messy and opaque. They cite a striking adoption gap: by August 2024 around 40 percent of US adults had used generative AI, but that translated to only 0.5 to 3.5 percent of work hours. From this they argue that power, meaning what a system can actually do through tools and institutions, matters more than intelligence, and that misalignment is better treated as an epistemic problem than a stochastic one. The policy stance follows: prefer resilience, decentralization, and strong downstream defenses over trying to restrict access.
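
A quick back-of-envelope check makes the adoption gap concrete. This sketch uses only the figures cited above (40 percent adoption, 0.5 to 3.5 percent of work hours); the simplifying assumption, which is mine and not the essay's, is that all generative-AI work usage comes from the adopters themselves:

```python
# Back-of-envelope check of the adoption gap cited in the essay.
# Assumption (not from the essay): all generative-AI work usage
# comes from the ~40% of adults who report having used it.

adoption_rate = 0.40                # share of US adults who had used generative AI
work_hour_share = (0.005, 0.035)    # share of all work hours it assisted (low, high)

# If usage is confined to adopters, an adopter's own work-hour share
# is the overall share divided by the adoption rate.
per_adopter = tuple(s / adoption_rate for s in work_hour_share)

print(f"Implied work-hour share per adopter: "
      f"{per_adopter[0]:.2%} to {per_adopter[1]:.2%}")
```

Even under this generous assumption, generative AI touches roughly 1 to 9 percent of an adopter's work hours, which is the diffusion-is-the-bottleneck point in miniature: widespread trial, shallow integration.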

Why it matters

This is the most cited counter to fast-takeoff and doom framings, and it makes a falsifiable claim about diffusion speed you can check against your own organization. If adoption, not capability, is the bottleneck, your AI roadmap should be planned in years of institutional change, not model releases.

Policy · Forecasting