
Claude's training data cutoff is about a year more recent, which is often beneficial. The 100k token limit is fantastic for long conversations and pasting in documents. The two downsides are: 1) it seems to get confused a bit more than GPT-4, so I have to repeat instructions more often; 2) its code-writing ability is definitely subpar compared to GPT-4.

