
You said humans are machines that make errors and that LLMs can easily find errors in output they themselves produce.

Are you sure you wanted to say that? Or is it the other way around?



Yes. Just like humans. It's called "checking your work" and we teach it to children. It's effective.
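The generate-then-verify loop being described can be sketched with a toy stand-in for the model (`model_answer` here is hypothetical, not any real LLM API; the point is only that an independent re-check catches an occasional error):

```python
def model_answer(a, b):
    # Hypothetical "model": usually right, but makes an occasional
    # error, just like humans do.
    if (a, b) == (2, 2):
        return 5  # the occasional slip
    return a + b

def check_and_fix(a, b):
    answer = model_answer(a, b)
    expected = a + b           # independent re-derivation ("checking your work")
    if answer != expected:     # review found an error
        answer = expected      # correct it
    return answer

print(check_and_fix(2, 2))  # the check catches the slip and returns 4
print(check_and_fix(1, 3))  # already correct, passes the check: 4
```

The verifier does not need to be smarter than the generator; it only needs an independent path to the same answer, which is why reviewing one's own output can still catch mistakes.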



