Nvidia gives money to OpenAI so they can buy GPUs that don't exist yet with memory that doesn't exist yet so they can plug them into their datacenters that don't exist yet powered by infrastructure that doesn't exist yet so they can all make profit that is mathematically impossible at this point - Stolen from someone else.
Indeed. Uber Eats now makes you talk to an AI bot, among other customer-hostile changes. I've largely abandoned them. The last straw was a driver leaving the food at some random house I couldn't even identify from the picture. It made me wait 5 minutes before I could do anything at all. Then it made me talk to a bot.
When I eventually got it to issue a refund I realized they kept the service fees and driver tip. For an order I didn't even receive!
If that's the best they can do I'll just go pick it up myself.
Order directly from the restaurant. You'll get better and faster service. And it'll often be cheaper, since restaurants have to raise prices on middleman platforms to cover the fees.
Support for YouTube playlists, please... having to look up the watch ID on Google after a song is removed, just to guess what the track was, is a highly sucky part of using YouTube for music.
I disagree. For large search pages where you're building payloads from multiple records that don't change often, a cache can be beneficial: the most common results get fetched less often and returned faster.
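A minimal sketch of that idea in Python, assuming a simple in-memory cache with per-entry expiry keyed by the search query; the names `TTLCache`, `build_payload`, and `fetch_records` are hypothetical, not from any particular framework:

```python
import time


class TTLCache:
    """Tiny in-memory cache where each entry expires after a fixed TTL."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (expires_at, value)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        expires_at, value = entry
        if time.monotonic() > expires_at:
            del self._store[key]  # stale entry: evict and report a miss
            return None
        return value

    def set(self, key, value):
        self._store[key] = (time.monotonic() + self.ttl, value)


def build_payload(query, cache, fetch_records):
    """Assemble a search payload, consulting the cache before hitting storage."""
    cached = cache.get(query)
    if cached is not None:
        return cached  # common queries skip the expensive record fetch
    payload = {"query": query, "results": fetch_records(query)}
    cache.set(query, payload)
    return payload
```

Because the records change rarely, a short TTL keeps the payload acceptably fresh while popular queries are served from memory instead of rebuilding from multiple records on every request.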
> The worst thing is they worked out you can blend costs in using AWS marketplace without having to raise due diligence on a new vendor or PO. So up it goes even more.