The progression of my opinion on AI has been ... elliptical? Not straightforward, we'll say. But as I've gotten to use it more, both at work and through some of my own personal projects, I've started to see where some of the dissonance, I think, lies – both for me and others.
To back it up, I want to talk about software personalization. Since punchcards, people have written new programs because the program they were using had something wrong with it. Did it let you type into the screen and print out what you see? Sure, but we still had Microsoft Word. Microsoft Works. Pages. LaTeX. WordPerfect. WordStar. Claris. Lotus. WordPad. TextEdit. Notepad.
Software, at its best, is thinking embodied through code. It should allow you to do whatever you were going to do, but better and faster.
In my past life, when I was the only person who knew how to code in an organization, that was literally my favorite thing about being a software engineer. I could go in, get a problem from someone, help them solve that problem through software and end up feeling like I built something and I helped them. That is literally why I became a software engineer.
And the reason we want to personalize software is because everyone works differently. You cannot just throw a general piece of software at a specific problem and expect it to work. Despite how expensive ERPs like Salesforce, etc., are, every decent-sized implementation requires the end user to hire a developer to make it work the way they want it to.
And they do it because software is expensive! Even buying into fancy, expensive software is cheaper than building your own.
Or at least, it used to be.
Now, on a personal level, I strive for efficiency. I am unwilling to admit how many e-readers and tablets I have purchased (and re-purchased) to find the "perfect method for reading." Oh, Kindle won't read epubs. Oh, the Kobo is a pain to sync new files onto if you don't buy them through Kobobooks. Oh, the eInk Android tablet works slower because it's Android trying to run on eInk. Oh, the iPad is too distracting. Oh, the Kindle ...
I don't think the problem is the device or the software. It's my mindset.
And I think everyone's got at least some piece of their life that works that way – or they've gotten lucky enough to find the thing that works for them, and they stick to it. This is why so many people wind up with clothing "uniforms" despite not being mandated. How many people have bought multiples of an item because they don't know if it'll be available the next time they need to get a new one?
And to be honest, that's how I still see most of AI's generative output, at least when it comes to purely creative efforts like writing or other content creation. It's trying to appeal to everyone (because it literally is sliced chunks of combined consciousness).
And as an engineer, when I used some of the coding agents from even six months ago, I saw much of the same thing. Sure, it could churn out basic code, but when it tried to do anything complex it veered too hard toward the generic, and software engineering is a very precise arena, as anyone who has forgotten to add or remove a crucial semicolon can tell you.
But then at work, we started to get tools that were a little bit more advanced. They required more planning than when I had been trying to vibe code my way through things. I was surprised by how nuanced the things our new code review tool could catch were, and by the complexity of the rules and concepts it checked against. And I started to realize that if we were using AI in conjunction with existing tools and putting in the level of effort that I would put into normal engineering, I could start to get some pretty cool stuff.
A quick digression into the past. I previously tried to build my own CMS. I think everyone has at one point. For about the first three or four days, it was an earnest effort: "I'm going to build this and it's going to be the next WordPress."
I quickly realized one of the reasons WordPress sucks so much is that it tries to be everything for everyone, and therefore it just winds up being a middling ball of acceptable. (Especially when they got confused as to what they actually wanted to be – a publishing platform or a Substack alternative or a Wix competitor. Gah, no time to fight old battles.) Again, trying to be everything for everyone usually winds up with the software working poorly for every individual.
So I was like, I'm going to build my own CMS. And I built it. And what it ultimately wound up being was an object lesson in why frameworks (e.g., Laravel) tend to be structured the way they are. It was super useful for me as an engineer because I got to see the process of building a large piece of software and think about things like modularity and how to enable plugins that fit easily into an existing system. Legendarily helpful for me from a learning-how-to-architect standpoint.
But from an actual usability standpoint, I hated it. Absolutely abhorred using the software.
Yesterday, I spent about 14 hours on intensely planned vibe-coding. I had my old blog, which was built on Statamic, so I had an existing content structure that Claude could work on.
And I walked that AI through half a day of building exactly the content management system that I want, both for my personal note storage and this blog (which is actually now powered by that same CMS). It took me about two hours to replace what I already had, and the rest of the time was spent building features I've been planning in my head for years. And the UX is surprisingly polished (and weird) because I want it to be polished and weird. It is customized software, fit for exactly my use case.
Originally I thought, "I'm going to build this thing and I'm going to let everybody use it, whoever wants to use it." But as I kept blowing through features I've been salivating over, I realized: I don't think anyone else would want to use this. They would say, "Hey, those are some cool ideas, but I would love it if it did this." And I have absolutely no interest in sitting down and helping somebody else work out how to make their version of this thing – unless they're going to pay me for it as a job, of course.
In an abrupt change for myself, I am now willing to vocally get behind the idea that AI can be used to build good software. But I am going to be adamant about that crucial "used to build" phrasing: I do not believe AI can build good software on its own. I think it can be used as a tool: Software engineers can use AI to build exactly the software we want.
None of the stuff it did was beyond my technical capabilities; it merely exceeded my temporal capacity: stuff I didn't have time to do.
What's especially funny to me is that the best analogy I have for the utility of AI (and this may just be a function of my career history as an engineer): It is a low-code coding tool for software engineers.
We did it, everybody! We finally managed to build a no-code tool, but it's still only functionally usable by engineers.