Hacker News
Gormo on June 1, 2024 | on: Legal models hallucinate in 1 out of 6 (or more) b...
LLMs are only capable of hallucinating, whereas humans are capable of hallucinating but are also capable of empirically observing reality. So whatever the human hallucination rate is, it's necessarily lower than that of LLMs.