
I am surprised that there's still money for this type of OSS SaaS company.

Isn't all the money going to AI companies these days? (Even the unicorns didn't do well with their IPOs, e.g. HashiCorp.)

That said, I love every single addition to the Go community so thumbs up from me.



There are a lot of AI startups that fall in the category of LLM API consumers (Anthropic/OpenAI wrappers). Or, as I heard the CTO of one of them joking, "we're actually more EC2 wrappers than OpenAI wrappers".

The problem we often hit when building apps on top of LLMs is managing LLM context windows (and sometimes swappable LLM providers), which requires different kinds of worker/consumer/queue setups.

TypeScript is amazing for building full-stack web apps quickly. For a decade my go-to was Django, but everything just goes so much faster with endpoints & frontend all in the same place. Finding a good job/queue service is a little more of a challenge in this world, though, than "just set up Celery". BullMQ is great, but doesn't work with "distributed" Redis providers like Upstash (Vercel's choice).

So, in a roundabout way, an offering like this is in a super-duper position for AI money :)


It does seem like some really great options are emerging in the Go community, and a lot of newer execution frameworks are supporting Go as one of the first languages. Another great addition is https://github.com/riverqueue/river.




