
This is the Google Search problem all over again. When Google first came out, it was so much better than other search engines that people were finding websites (including obscure ones) that answered the questions they had. Others at the time got upset that people were drawing conclusions from whatever the search turned up. Imagine you asked whether Earth was a 4-corner 4-day simultaneous time cube. You'd find a website where someone explained that it was. Many people would then conclude that Earth was indeed a 4-corner 4-day simultaneous time cube where Jesus, Socrates, the Clintons, and Einstein lived in different parts.

But it was just a search tool: it could only tell you whether someone else had written about something. Chatbots, as presented, are a fairly sophisticated generation tool. If you ground them, they work fantastically for producing things. If you let them search, they do well at finding and summarizing what people have said.

But Earth is not a 4-corner 4-day simultaneous time cube. That's on you to figure out. Everyone I know these days has a story of a doctor searching for their symptoms on Gemini or whatever in front of them. But it reminds me of a famous old hacker koan:

> A newbie was trying to fix a broken Lisp machine by turning it off and on.

> Thomas Knight, seeing what the student was doing, reprimanded him: "You cannot fix a machine by just power-cycling it with no understanding of what is wrong."

> Knight then power-cycled the machine.

> The machine worked.

You cannot ask an LLM a question without understanding the answer and expect it to be right. The doctor understands the answer. They ask the LLM. It is right.


