
ITT nobody remembers GPT-2 anymore, and that makes me sad


This model was trained on 6T tokens and has a 256k-token vocabulary, quite different from a GPT-2 model of comparable size.
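For scale, here is a rough back-of-the-envelope sketch of how a 256k vocabulary changes the embedding-table footprint versus GPT-2's 50,257-token BPE vocabulary. The hidden sizes are GPT-2's published configs; the per-model parameter totals are approximate, and the comparison assumes the rest of the architecture is held fixed:

  # Rough sketch: share of a small model's parameter budget consumed
  # by the token-embedding table at two vocabulary sizes.
  GPT2_VOCAB = 50_257   # GPT-2's BPE vocabulary
  BIG_VOCAB = 256_000   # vocabulary size cited in the parent comment

  for name, d_model, total_params in [
      ("124M-class", 768, 124e6),
      ("355M-class", 1024, 355e6),
      ("1.5B-class", 1600, 1.5e9),
  ]:
      for vocab in (GPT2_VOCAB, BIG_VOCAB):
          emb = vocab * d_model  # token-embedding parameters
          print(f"{name} d_model={d_model} vocab={vocab:>7}: "
                f"embeddings = {emb / 1e6:6.1f}M "
                f"({emb / total_params:5.1%} of ~{total_params / 1e6:.0f}M)")

At GPT-2-small scale (d_model=768), a 256k vocabulary alone is ~197M embedding parameters, i.e. larger than the entire 124M model, which is why the two aren't directly comparable even at similar headline sizes.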




