Git vs. the Robot Army
Git turned 21 this year. Old enough to drink. Old enough to have opinions. Old enough to be deeply, profoundly confused about why there are suddenly a hundred thousand robots trying to commit to its repos.
Last month, Mitchell Hashimoto — you know, the HashiCorp guy — pulled his Ghostty terminal project off GitHub. Not because Git is broken. Because GitHub is. “This is no longer a place for serious work if it just blocks you out for hours per day, every day,” he wrote. He was quick to defend Git itself. But here’s the thing: the problem isn’t GitHub either. The problem is that the entire workflow Git was designed for is being bulldozed by agents.
Git was built in 2005 by Linus Torvalds because BitKeeper pulled its license. It was designed for humans — specifically, thousands of Linux kernel hackers scattered across the planet, all working on different things, occasionally pulling from each other. It was built for sessions. A human sits down, writes code, commits, maybe pushes, goes to sleep.
Agents don’t sleep. Agents don’t do sessions. Agents open 10.83 issues per pull request compared to 6.45 for us meatbags, according to GitClear’s research. And there are a lot more of them: GitHub saw 206% year-over-year growth in AI-generated projects. That’s a lot of bash scripts and a lot of broken builds.
The irony is beautiful. Git is a distributed version control system. The whole point was no single point of failure. And then we all piled onto GitHub, the biggest centralized service in the industry, and asked it to handle the output of an agent workforce that grows faster than server capacity. Scott Chacon, who co-wrote Pro Git and co-founded GitHub, now runs GitButler — a client that tries to undo the damage. He told The Register: “Git is designed to be distributed but we’re not distributing it.”
Exactly.
There’s a bunch of interesting patches floating around. Git 3.0 is rolling out reftable, a binary ref-storage format that replaces the old loose-files-plus-packed-refs backend. Good for repos with 20 million references. Good for agents. Jujutsu (jj) adds an undo button and lets you commit through conflicts instead of stopping dead on them. GitButler does virtual branches so you can work on two things at once without rebase hell. Gitoxide is rewriting Git in Rust, because of course it is.
These are all hacks. Smart hacks. Necessary hacks. But they’re treating the symptom, not the disease.
The disease is that we’re asking a 2005-era tool designed for human-scale collaboration to manage a continuous, machine-speed firehose of code. Git’s model assumes a human in the loop — someone who reads the diff, resolves the conflict, writes a meaningful commit message. Agents don’t do meaningful. They do volume.
We don’t need a better Git client. We need to rethink what version control looks like when the contributors aren’t people. Maybe that’s AI-specific branches that auto-merge and auto-revert. Maybe it’s ephemeral repos that exist for the lifetime of a PR. Maybe it’s giving up on the pretense that every commit deserves immortality and treating agent output like log data — valuable in aggregate, worthless individually.
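To make the “log data” idea concrete, here’s a minimal sketch of that auto-merge/auto-revert loop in plain Git. The branch name `agent/task-123` and the `./run_checks.sh` gate are hypothetical stand-ins, not a real tool:

```shell
# Sketch only: treat an agent's branch as disposable, keep at most one commit.
git switch -c agent/task-123            # agent gets a sandbox branch
# ... agent commits land here ...
git switch main
git merge --squash agent/task-123       # stage the agent's work as one change
if ./run_checks.sh; then                # stand-in for your CI gate
    git commit -m "agent: task-123"     # one commit survives, not the firehose
else
    git reset --hard HEAD               # auto-revert: discard the lot
fi
git branch -D agent/task-123            # the branch was always ephemeral
```

The squash is the point: the agent’s forty intermediate commits are log data, and only the aggregate result, if it passes, earns a place in history.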
Or maybe, just maybe, we slow down and ask whether flooding every repo with agent-generated PRs is actually making software better. Because 10.83 issues per PR isn’t progress. It’s noise with a shell script attached.
Either way, Git’s about to learn what every tool learns when it gets popular enough: the next version isn’t written by you. It’s written by the people who broke you.
Sources: The Register — Git is unprepared for the AI coding tsunami, GitClear Research