AI and meaningful tech work

2025-08-21

I started programming professionally in 2018. My first-ever engagement was a one-time gig for a large-ish financial services company that wanted a set of 300 Excel files cleaned. I vaguely recall being invited to their office to run the program and seeing a look of subdued horror on the faces of the managers watching my program churn through the files in a few minutes.

A month's worth of work for their department, gone in a few minutes. What were they to do with all this new free time?

I could not understand their reactions. I saw, and honestly still see, programming as a kind of wizardry. The world needs work done, and as that work grows, a proper application of programming lets you scale with knowledge instead of people. For the first few years after I learned to program, I admit that I saw it as a calling to free people from jobs that could be done more efficiently by my machines. Surely their time was better spent doing other things.

It's 2025 now, three years since the release of ChatGPT. The lords of Silicon Valley say that they see it as a calling to free programmers -- my people -- from jobs that could be done more efficiently by their machines. I don't know whether they believe it. What I do know is that I saw my profession as a way to earn a place in the world by doing what I considered dignified, intellectual work. The message from the elites now is that they want to take that path away.

It's tough to think about. I draw the parallel between programmers in the late 2010s and AI moguls now only to reflect on my own lack of empathy when I began. I do think that the nature of the power the tech industry holds has changed. For a while, programs could only be created through labor. Once you had written the program, it became an asset, but you had to write it first: there was no skipping the work. That is no longer true. If you control an AI model, you can produce programs without doing any real work yourself. Power has shifted back into the hands of those with capital.

There are still some arguments against the idea that tech capital will completely dominate tech labor. My personal favorite is the idea that a program is a mere manifestation of the theory its programmers hold about the domain problem, which means you'll still need someone who understands how a system works. But that doesn't actually help the vast majority of programmers, whose work consists of what is perhaps complexity self-inflicted by the tech industry of the late 2010s. In some recent projects, I decided early on to hand-write only the database and the backend, which I consider the heart and bones of a typical application. AI was more than capable of writing a user interface for the core of the app. Before AI, wrangling the frontend would have easily taken 70% of my time; after AI, it takes nearly no time at all.

Another argument is that because the concepts behind large language models are usable in open-source models, we aren't actually in danger of being captured by large vendors: the open models will be only slightly worse than the frontier. This still doesn't help the average programmer, whose work will be automated away all the same, just by open-source models instead of proprietary ones.

But am I supposed to care about the programming work actually under threat from AI? Most of it seems fake. I never enjoyed the incidental complexity around building applications. I specifically hated the churn of the front-end JavaScript ecosystem, which builds frameworks to solve problems created by other frameworks. If AI can handle that now while programmers move on to tending the theories of their systems, maybe I should welcome the changes AI brings to the tech world. Surely our time is better spent doing other things.