Betteridge’s Law says “no”. But in a blog post last week, Jonathan Edwards says “yes”. Specifically, he says:
Software is eating the world. But progress in software technology itself largely stalled around 1996.
It’s not clear what Edwards thinks happened in 1996. Maybe he blames the introduction of the Palm Pilot? In any case, he argues that the developments since 1996 have all been incremental improvements upon existing technology. Nothing revolutionary has happened in programming languages, databases, etc.
This has real “old man yells at cloud” energy. Literally. He includes “AWS” in the list of technologies he dismisses.
Edwards sets up a strawman to knock down: maybe “[t]his is as good as it gets: a 50 year old OS, 30 year old text editors, and 25 year old languages,” he proposes. “Bullshit,” he says.
I’d employ my expletive differently: who gives a shit?
Programming does not exist for the benefit of programmers. Software is written to do something for people. The universe of what is possible with computing is inarguably broader than it was in 1996. Much of that is owed to improvements in hardware, to be sure. And you can certainly argue that some of what’s possible with computing is bad. But that’s not what’s at issue here.
I don’t see carpenters bemoaning the lack of innovation in hammers. Software development isn’t special. It’s a trade like any other. And if the tools are working, let them work.
I won’t even bother with his “open source is stifling innovation” nonsense. Rebutting that is left as an exercise for the reader.
Hear, hear!
Besides which: Elixir, Raku, Org mode, Jupyter, Julia, mainstream graph and document databases, computer vision, .NET, the entire macOS development ecosystem (which may have been an incremental improvement over NeXT in 2001, but it’s had a heck of a lot of increments in the last 20 years).
It’s almost like those conversations about music with my high school buddies. Just because you stopped paying attention 30 years ago doesn’t mean people stopped making good music.
Kids these days, Brian.