AlphaEvolve moves from research demo to production tool, with real numbers
Google DeepMind published an update on AlphaEvolve, its Gemini-powered coding agent, and the news is less about the model than about where it has landed: real systems, with numbers specific enough to check. AlphaEvolve searches for and optimizes algorithms, and this update trades demos for deployed results.
A few results stand out. On Google Spanner, AlphaEvolve cut write amplification by 20% and storage footprint by 9%, and it contributed optimizations to next-generation TPU silicon. Outside Google, Klarna doubled the training speed of a transformer model, FM Logistic reported a 10.4% routing gain worth about 15,000 km saved per year, and Schrödinger saw roughly a 4x speedup in molecular force field work. On harder scientific problems, DeepMind cites the rate of feasible solutions for AC Optimal Power Flow rising from 14% to 88%, and quantum circuits with about 10x lower error on the Willow processor.
The team presents this as a shift from research prototype to production tool on Google Cloud, with named customers rather than internal demos alone. The full post is on DeepMind's blog.
Why it matters
If you work on infrastructure or ML systems, the useful question is which of these wins reproduce outside Google. The Spanner and customer figures are concrete enough to test against your own workloads, and that is the right way to judge a claim this broad.