People are rapidly learning how to improve model capabilities and lower resource requirements. The models we throw away as we go are the steps we climbed along the way.
People keep talking about automating software engineering and programmers losing their jobs. But I see no reason that career would be one of the first to go. We need more training data on computer use from humans, but I expect data entry and basic business processes to be the first category of office job to take a huge hit from AI. If you really can’t be employed as a software engineer then we’ve already lost most office jobs to AI.
At this point I consider Scott to have played the Internet like a fiddle. I think he knew the whole time the agent didn’t deserve any attribution. He knew it was a human driving the thing but wanted to grab people’s attention.
It’s kind of shocking the OP does not consider the most likely scenario: a human uses AI to make a PR. The PR is rejected. The human feels insecure, because the tool they thought made them as good as any developer turns out not to. They lash out and instruct an AI to build a narrative and draft a blog post.
I have seen someone I know in person get very insecure whenever anyone doubts the quality of their work, because they use so much AI and do not put in the necessary work to revise its outputs. I could see a lesser version of them going through with this blog post scheme.
LLMs also appear to exacerbate or create mental illness.
I've seen similar conduct recently from humans who are being glazed by LLMs into thinking their farts smell like roses, and that conspiracy-theory nuttery must be why they aren't having the impact their AI-validated high self-estimation leads them to expect.
And not just arbitrary humans, but people I have had a decade or more of exposure to, whose prior range of conduct I know pretty well.
AI is providing the kind of yes-man reality distortion field that previously only the wealthiest could afford, practically for free, to vulnerable people who would never otherwise have commanded wealth or power sufficient to be tempted by it.
I think it’s also for practical reasons: your dog needs to be near a person with an iPhone. If the dog is in the middle of the woods, it won’t show up. Most objects require a person to move them, so the chances of their being near an iPhone are much higher.