
Are you sure? When I look at the GPT-2 examples, and remember my experience with it back when you could try it on talktotransformer.com, I would say it is.

GPT-2 would start to ramble incomprehensibly almost every time. It would forget context after a few words. It would get stuck in loops. To produce anything resembling a conversation you had to steer it heavily.



It's better, just not 100x better, which is a big difference in practice.

Quantitatively, in my usage the output is maybe 5x more useful, which is not enough to upend society.





