> Gas Town is also expensive as hell. You won’t like Gas Town if you ever have to think, even for a moment, about where money comes from. I had to get my second Claude Code account, finally; they don’t let you siphon unlimited dollars from a single account, so you need multiple emails and siphons, it’s all very silly. My calculations show that now that Gas Town has finally achieved liftoff, I will need a third Claude Code account by the end of next week. It is a cash guzzler.
> "New generation server parts significantly reduce operating costs and without a huge price rise, they will also drop TCO more than a little"
Looking at some outside evaluations of their capex, this does not seem to represent a significant drop, since CPU cost is only a tiny part of operating a datacenter.
Hence _T_otal _C_ost of _O_wnership. Running a data center is all about getting direct power consumption (servers) and indirect power consumption (cooling) down.
I've heard the assumption that energy costs outweigh CPU costs in TCO, but after some rough guesstimation I'm not so sure.
1 MWh of electricity is enough to power a CPU drawing 114 W on average for a year. A 24-core (48-thread) Xeon has a 165 W TDP, meaning that even such a beast is likely to draw less than that on average: most applications don't run at 100% CPU load 100% of the time, and virtualizing up to that load causes serious issues when several clients happen to peak simultaneously. In essence, 1 MWh/year should run pretty much any single CPU, and even if you're doing something weird to actually sustain that 100% load, we're not far off.
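A quick sanity check on that 114 W figure; this is pure arithmetic, assuming nothing beyond 8760 hours in a (non-leap) year:

```python
# How much continuous power does an annual budget of 1 MWh buy?
HOURS_PER_YEAR = 365 * 24          # 8760 hours
annual_budget_wh = 1_000_000       # 1 MWh, expressed in Wh

avg_watts = annual_budget_wh / HOURS_PER_YEAR
print(f"1 MWh/year sustains about {avg_watts:.0f} W continuously")  # ~114 W
```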
Of course, there are power supply losses and cooling overheads, but even if those raise total power consumption by a factor of 5 (doubtful), a several-thousand-dollar server CPU is still going to outweigh its energy costs for many years.
Furthermore, energy costs are trending downwards and CPUs are still getting more efficient; it's thus a somewhat risky investment to buy a more expensive (but more efficient) chip if that investment needs years to pay off.
So, it looks to me as though plain old CPU costs certainly are far from irrelevant; they may indeed exceed energy costs over the chip's lifetime if the energy is acquired at close to wholesale prices and cooling is efficient (say, 3x overhead?).
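To put the whole guesstimate in one place, here's a rough break-even sketch. The $3000 CPU price and $0.06/kWh near-wholesale electricity price are illustrative assumptions, not quoted figures; the 114 W draw and the 3x/5x overhead factors come from the argument above:

```python
# Back-of-envelope break-even: years of energy cost needed to match
# the CPU purchase price. All dollar figures are assumptions.
HOURS_PER_YEAR = 8760
cpu_price = 3000.0      # USD -- assumed high-end server CPU price
avg_draw_w = 114.0      # average draw, per the 1 MWh/year figure
price_per_kwh = 0.06    # USD -- assumed near-wholesale electricity

for overhead in (3.0, 5.0):  # total multiplier for PSU losses + cooling
    annual_kwh = avg_draw_w * overhead * HOURS_PER_YEAR / 1000
    annual_cost = annual_kwh * price_per_kwh
    years = cpu_price / annual_cost
    print(f"{overhead:.0f}x overhead: ${annual_cost:.0f}/year -> "
          f"{years:.1f} years to match the CPU price")
```

With these numbers, even the pessimistic 5x overhead case takes roughly a decade of energy spend to equal the purchase price, which is the point: the chip itself is nowhere near irrelevant.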
I'm not affiliated with Claude or the project linked.