
I’ve argued before that traditional hand-written coding is rapidly losing its central role. AI systems improve at a relentless pace, and it’s becoming increasingly plausible that large language models will soon produce higher-quality code than most humans. What’s striking isn’t just that this is happening, but how naturally coding fits into what these systems already do well.
The reason is disarmingly simple: programming languages are text. At their core, LLMs are machines designed to ingest vast amounts of textual data, learn its structure, and predict what comes next. Source code, with its rigid syntax and well-defined patterns, turns out to be an ideal match for this approach. When an AI writes code, it’s doing the same thing it does when answering a question: selecting the most likely next token based on everything it has seen before.
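To make “code is just text” concrete, here is a toy sketch of turning a line of source code into discrete tokens. Real LLMs use learned subword vocabularies (such as byte-pair encoding), not a regex, and the example string is my own invention, but the idea is the same: the model never sees “code”, only a stream of textual units.

```python
import re

# A line of source code is, to a language model, just a sequence of tokens.
CODE = "total = price * quantity + tax"

# Toy tokenizer: match identifiers, numbers, or any single
# non-whitespace character (operators, punctuation).
tokens = re.findall(r"[A-Za-z_]\w*|\d+|\S", CODE)

print(tokens)
# ['total', '=', 'price', '*', 'quantity', '+', 'tax']
```

Each of those units is what the model learns to predict, one at a time.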
This process doesn’t involve understanding in the human sense. Instead, the model breaks a prompt into tokens and navigates an immense mathematical landscape of learned relationships to generate output, one token at a time. Despite how abstract that sounds, it works remarkably well for programming, where consistency, repetition, and formal structure dominate. The predictability of code is precisely what makes it so amenable to statistical generation.
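The generate-one-token-at-a-time loop can be sketched with a deliberately crude stand-in for an LLM: a bigram model that always emits the token most likely to follow the previous one. The tiny training “corpus” below is invented for illustration; a real model learns relationships across billions of tokens in a high-dimensional vector space rather than a lookup table, but the sampling loop is structurally the same.

```python
from collections import Counter, defaultdict

# Tiny "training corpus" of tokenized code (illustrative only).
corpus = [
    ["def", "add", "(", "a", ",", "b", ")", ":", "return", "a", "+", "b"],
    ["def", "sub", "(", "a", ",", "b", ")", ":", "return", "a", "-", "b"],
]

# Count how often each token follows each other token.
follows = defaultdict(Counter)
for tokens in corpus:
    for prev, nxt in zip(tokens, tokens[1:]):
        follows[prev][nxt] += 1

def generate(start, length=8):
    """Greedily append the most likely next token, one step at a time."""
    out = [start]
    for _ in range(length):
        candidates = follows.get(out[-1])
        if not candidates:
            break  # no continuation seen in training
        out.append(candidates.most_common(1)[0][0])
    return out

print(" ".join(generate("def")))
# def add ( a , b ) : return
```

Because code is so repetitive, even this trivial statistical model reproduces a plausible function signature; the predictability the paragraph describes is doing all the work.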
Underneath it all is an extraordinary volume of vector math. Training and running these models requires staggering numbers of calculations, a workload perfectly suited to GPUs. That’s why the same hardware once prized for rendering realistic graphics and powering video games now sits at the heart of AI development. It may feel strange that the technology behind immersive visuals also drives automated coding, but both rely on the same core strength: massive, parallel numerical computation.
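The graphics/AI overlap comes down to one operation: the dense matrix multiply. A minimal sketch (shapes and data invented for illustration, using NumPy as a stand-in for GPU kernels) shows that transforming a batch of 3D vertices and pushing token embeddings through a model layer are structurally the same computation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Graphics workload: transform 10,000 vertices (homogeneous 4-vectors)
# by a single 4x4 matrix -- thousands of independent multiply-adds.
vertices = rng.normal(size=(10_000, 4))
transform = np.eye(4)
moved = vertices @ transform.T

# LLM workload: project 128 token embeddings through a weight matrix.
# Same operation, just with much larger dimensions in real models.
embeddings = rng.normal(size=(128, 512))
weights = rng.normal(size=(512, 512))
activations = embeddings @ weights

print(moved.shape, activations.shape)
```

Every row in both products can be computed independently of the others, which is exactly the kind of massively parallel arithmetic GPUs were built for.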
Put together, these factors explain why AI agents shine at writing software. Coding isn’t just another task they’ve learned — it aligns almost perfectly with how they already process information. As these systems continue to scale, their dominance in this domain feels less like a surprise and more like an inevitability.

