Mmmm. They should work in tandem. You're not so much replacing humans as enhancing their capabilities. Humans have always invented new tools to make life easier: the wheel, factories, the internet. They all came about to make things faster and easier. AI is the same; it's just the new kid on the block.
I think this ignores the reality of it, though. It hasn't actually been proven at any scale to add real value. No one is making money here; it's all investment in the expectation that this will lower costs somehow. Otherwise the investment doesn't make much sense, because there is no world where they could recoup it through increased revenues.
AI isn't the internet. It's not a revolutionary way to share information; it's just a new spin on how to access that information: nicely preloaded, filtered, cleaned, and without insight into its specific sourcing... so basically worse in every way from a truth-value standpoint. The thing is, it was the possibilities engendered by the sharing of human ideas that drove the past 25 years of growth.
AI doesn't do that. It just repackages what's already out there; it adds nothing new, and I don't see how it enables adding anything new in the long game.
I'm just not sure it actually makes anything easier that's truly worth doing. Everything I've created with it felt good at the time, but when I was done I was left with a sour taste in my mouth: either the feeling I should have done the thing by hand, or the feeling the output itself had no real inherent value.
It's a mildly amusing thing with some specific values*
*Again, aside from the highly valuable use case of specially trained LLMs, which are very cool statistical technologies and where the groundbreaking AI headlines all come from. I'm just talking about the big public model LLMs like ChatGPT/Claude/Grok.
Editing, because here I am.
What I mean is that yes, it's good at creating a research paper, but the value of creating a research paper lies in doing the research and learning the subject deeply enough to write about it. Using AI creates the output, but none of the value.
In my experience, that's pretty much all it's good at: tasks where the real value is the work that should go into the output, with the output itself just a side effect. In that world, using AI is inherently counterproductive while still appearing highly productive to those who only observe the outputs.
This would function like a time bomb in the innards of every company working this way: slowly, like a frog in a pot, until one day no one knows what anything is doing anymore.
I dunno. I've been working with it pretty deeply for a while now, and I'm just really not convinced.