Hacker News

I'm astonished how often this comes up and also how wrong it is.

The cost of the GPT-4 API is ballpark $0.05 / 1,000 tokens. If you include a rolling context window, which you basically HAVE TO DO if you want to maintain a persistent conversation, you will easily meet or exceed 1,000 tokens per query.

ChatGPT Plus gives you 50 GPT-4 queries every three hours. If you're using it all day, you might average about 100 daily queries. Using the GPT-4 API directly would run you approximately five dollars a day for the same thing - that's $150 a month as opposed to a flat cost of $20.
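The arithmetic pencils out like this (a rough sketch using the figures above: ~$0.05 per 1K tokens, ~1,000 tokens per query with the rolling context, ~100 queries a day):

```python
# Back-of-envelope: GPT-4 API cost vs. the $20 flat subscription.
# All numbers are the ballpark figures from the comment above.
PRICE_PER_1K_TOKENS = 0.05   # approximate GPT-4 API price, $/1K tokens
TOKENS_PER_QUERY = 1000      # rolling context easily pushes each call to 1K+
QUERIES_PER_DAY = 100        # heavy all-day usage

daily_cost = QUERIES_PER_DAY * TOKENS_PER_QUERY / 1000 * PRICE_PER_1K_TOKENS
monthly_cost = daily_cost * 30

print(f"${daily_cost:.2f}/day, ${monthly_cost:.2f}/month vs. $20 flat")
# → $5.00/day, $150.00/month vs. $20 flat
```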



I think you're being really wasteful with that kind of context window, which is why it's a bit apples and oranges. I keep stats on this: my average message is around 50 tokens, the average individual response is around 200 tokens, and my average conversation length is 1.2 messages (only counting my messages). 50 convos/day * 30 days is about $21, and I don't come close to that usage. Hell, most of the time I don't even turn on GPT-4, because 3.5-instruct is plenty good.
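For what it's worth, that lighter usage pattern works out roughly like this (a sketch with the numbers above: 50-token messages, 200-token responses, 1.2 messages per conversation, 50 conversations a day, and the same ~$0.05/1K rate; context reuse within a conversation is ignored, which barely matters at 1.2 messages each):

```python
# Rough monthly API cost at light usage (numbers from the comment above).
PRICE_PER_1K_TOKENS = 0.05   # same ballpark GPT-4 rate as the parent comment
MSG_TOKENS = 50              # average user message
RESPONSE_TOKENS = 200        # average model response
MSGS_PER_CONVO = 1.2         # average conversation length (user messages)
CONVOS_PER_DAY, DAYS = 50, 30

tokens_per_convo = MSGS_PER_CONVO * (MSG_TOKENS + RESPONSE_TOKENS)
monthly_tokens = tokens_per_convo * CONVOS_PER_DAY * DAYS
monthly_cost = monthly_tokens / 1000 * PRICE_PER_1K_TOKENS

print(f"{monthly_tokens:,.0f} tokens/month, ${monthly_cost:.2f}/month")
# → 450,000 tokens/month, $22.50/month — the same ballpark as the $21 quoted
```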


Your use case is that you don't really use it very much, you don't want the advanced features, and you're arguing over $20 a month.

We are in a different category.



