The interesting thing is that there exist techniques to generate coherent and even interesting stories, that do not rely on copying large amounts of examples thereof:

https://thegradient.pub/an-introduction-to-ai-story-generati...

Unfortunately, those techniques are a) virtually unknown outside academic circles and b) about to die out, because their space is now being taken over by a much more prolific technology that generates low-grade bullshit cheaply.

Life, innit. It goes in circles.



That's part of my concern with all the money being dumped into OpenAI et al. after they productized their generative models.

If all the money is in generative models that give "good looking" results, who is going to invest in the techniques that don't produce results that look as good right now, but that have potential beyond recombining their training data?



