Gear Acquisition Syndrome is a very different problem. Even if you haven't cured the issue the new synth was meant to fix, at least you have a new synth.
Almost universally, dealing with retail customers is left to the lowest-paid, lowest-status workers, often outsourced, and now increasingly handed off to LLM chatbots.
While you obviously can't have highly paid engineers tied up dealing with user support tickets, there is a lot to be said for at least some exposure to the coal face.
> While you obviously can't have highly paid engineers tied up dealing with user support tickets,
You obviously can. It's one of the more visceral ways to make them aware of the pain their work causes real people, which sticks better than any report, or at least serves as a reminder that there are humans on the other side. There are even examples of highly paid CEOs engaging with support; you can see some of that on social media.
To shuffle it up in semi-random ways that make you think. If you're determined to hate LLMs, for any reason or purpose, just think of them as an elaborate game of Exquisite Corpse or ultimate Mad Libs.
You don't have to think LLMs are smart, or real people, to find them useful. I love it when I can make an idea clear enough in text that an LLM can completely regurgitate it and build upon it. I also love it when an LLM trips up and misses the one real novelty I've slipped into something; what better originality test than trying to choke an automatic regurgitator?
Transistors have no understanding of what I'm doing, but somehow I still find them useful.
I feel people talk to LLMs in chat format just so they feel there's someone listening. This puts that in a journaling/blogging context, hopefully delivering the same value in a different setting.
Did you catch all the security holes while you were reviewing it, or did you leave those to the machine as well?