Hacker News | zaronymous1's comments

Can anyone explain why they've removed the temperature and top-p parameter controls in reasoning models, including GPT-5? It makes it harder to build small tasks that require high levels of consistency, and in the API I really value being able to set certain tasks to a low temperature.


It's because sampler settings can undermine safety/alignment tuning. That's why only top_p/top_k are still exposed rather than TFS, min_p, top-n sigma, etc., and why temperature is locked to an arbitrary 0-2 range.
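To make the sampler distinction concrete, here is a minimal sketch of how top-p (nucleus) filtering differs from min_p filtering. This is illustrative plain Python, not any lab's actual implementation; the function names are my own.

```python
def top_p_filter(probs, p=0.9):
    """Nucleus (top-p) sampling: keep the smallest set of tokens
    whose cumulative probability reaches p."""
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, cum = [], 0.0
    for i in order:
        kept.append(i)
        cum += probs[i]
        if cum >= p:
            break
    return set(kept)

def min_p_filter(probs, min_p=0.1):
    """min_p sampling: keep tokens whose probability is at least
    min_p times the probability of the most likely token."""
    threshold = min_p * max(probs)
    return {i for i, pr in enumerate(probs) if pr >= threshold}

probs = [0.5, 0.3, 0.15, 0.05]
top_p_filter(probs, p=0.9)      # → {0, 1, 2}: cumulative 0.95 reaches p
min_p_filter(probs, min_p=0.1)  # → {0, 1, 2, 3}: threshold is 0.1 * 0.5 = 0.05
```

The practical difference: top-p keeps a fixed cumulative mass regardless of how flat the distribution is, while min_p adapts its cutoff to the confidence of the top token.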

The open-source ecosystem is years ahead of these labs on samplers, which makes it all the more impressive that their models are still this good.


Temperature is the response variation control?


Yes, it controls variability by scaling the probability distribution from which the next token is sampled.
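The standard way this works is softmax-with-temperature: dividing the model's raw logits by the temperature before normalizing. A minimal sketch (illustrative only; the function name is my own):

```python
import math

def next_token_distribution(logits, temperature=1.0):
    """Turn raw logits into next-token probabilities,
    scaled by temperature (softmax-with-temperature)."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]
low = next_token_distribution(logits, temperature=0.2)
high = next_token_distribution(logits, temperature=2.0)
# Low temperature concentrates probability mass on the top token
# (more deterministic); high temperature flattens the distribution
# (more varied output).
```

This is why a low temperature gives more consistent completions: the top token dominates the distribution, so sampling almost always picks it.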

