How do the errors and debugging compare? From what I've read, this is the main pain point with Nix, and a more mature language like Guile should give a much better experience here. The article touches on this, but I'd be curious to see a more extensive comparison of this aspect.
Just a personal anecdote, but the errors from Guix are terrible. I had to reinstall because I couldn't figure out the Scheme errors for my system config.
It's generally a problem with Guile. If you get decently good with Geiser or stare at the stack traces long enough, you can figure out the problem, but I shouldn't have to do either.
And it's one where an LLM/AI model can apply its huge Lisp/Scheme training set to help solve your problem.
Nixlang is so infuriatingly obtuse that I generally have to fire up Discord and bug the local Nix acolyte when something goes wrong. I've bounced Nixlang off the LLM/AIs, but I have learned that if the AI doesn't give you the correct answer for Nixlang immediately, you need to stop; everything from there on will be increasingly incorrect hallucinations.
I suspect LLM/AIs will hallucinate far, far less with the Scheme from Guix.
What? From what I've seen, Nix configs and Guix configs look nothing alike in terms of structure. Guix uses Scheme modules, whereas Nix modules are just fancy dicts merged together lazily. Which is sad, because I like Lisps but prefer the Nix way of structuring the config.
Yes, this is the standard proof that there are infinitely many primes, but note that my prompt asked for infinitely many even primes. The point is that GPT would take the correct proof and insert "even" at sensible-looking places to get something that looks like a proof but is totally wrong.
Of course it's much better now, but with more pressure to prove something hard the models still just insert nonsense steps.
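To make the failure mode concrete (this is a reconstruction, not the model's actual output), here is roughly where inserting "even" breaks the standard Euclid argument:

```latex
% Euclid's proof with the spurious word "even" inserted where a pattern-matcher
% would plausibly put it.
Suppose there are only finitely many \emph{even} primes $p_1, \dots, p_n$, and let
\[
  N = p_1 p_2 \cdots p_n + 1 .
\]
No $p_i$ divides $N$, so $N$ has a prime factor $q \notin \{p_1, \dots, p_n\}$.
% Broken step: the "proof" now needs $q$ to be an \emph{even} prime missing from
% the list, but nothing in the construction makes $q$ even (2 is the only even
% prime), so there is no contradiction; the argument only looks right.
```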
While that seems like a convincing explanation, 750 Hz is a rather odd value to use for a timer, and, more importantly, the overflow would be at 66d6h43m43s instead of the reported ~66d12h.
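For reference, the arithmetic for a 32-bit tick counter at 750 Hz:

```latex
% Wrap-around time of a 32-bit counter ticking at 750 Hz:
\[
  \frac{2^{32}}{750\,\mathrm{Hz}} \approx 5{,}726{,}623\ \mathrm{s}
  \approx 66\,\mathrm{d}\ 6\,\mathrm{h}\ 43\,\mathrm{m}\ 43\,\mathrm{s},
\]
% whereas the reported ~66 d 12 h (about 5,745,600 s) would correspond to a tick
% rate closer to 2^32 / 5,745,600 s ~= 747.5 Hz.
```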
For example, why couldn't you use the waste heat like a power plant does? Use it to boil water, turn turbines, and generate electricity that gets sent and consumed elsewhere, adding to the heat wherever the electricity is finally consumed (ignoring various losses along the way).
“Elsewhere” is still somewhere on the Dyson sphere.
Or if you magically beam 100% of the captured energy somewhere else, now that place gets to deal with shedding the heat from however many 1e26W+ of power were consumed. God help the poor planet you aim that ray of death at.
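Back-of-the-envelope, assuming a blackbody shell at roughly 1 AU that has to radiate away the Sun's entire ~3.8e26 W output:

```latex
% Equilibrium temperature of a blackbody Dyson shell of radius ~1 AU radiating
% the Sun's full luminosity from its outer surface:
\[
  T = \left( \frac{L_\odot}{4 \pi R^2 \sigma} \right)^{1/4}
    \approx \left( \frac{3.8 \times 10^{26}\,\mathrm{W}}
      {4 \pi \, (1.5 \times 10^{11}\,\mathrm{m})^2 \cdot
       5.67 \times 10^{-8}\,\mathrm{W\,m^{-2}\,K^{-4}}} \right)^{1/4}
    \approx 390\ \mathrm{K}.
\]
% And that is the best case, with the heat spread over the whole shell; beaming
% the same power at a planet-sized target concentrates it onto a vastly smaller
% area, so the equilibrium temperature there is far higher.
```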
Well, what if it's a separate directory meant exclusively for remote systems? And what if the remote mount is read-only, perhaps with a writable overlayfs layer on top that can be discarded on logout?
It's actually far less complex than what container runtimes do. I've even implemented parts of that myself, which is why I'm able to suggest it. I'm thinking about implementing it, and was checking whether anybody else wanted to do it or foresees any problems that I don't.
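If it helps, here is a minimal sketch of the overlay part using mount(2) directly. The paths and user name are placeholders, and in practice this would be triggered at login (e.g. by a PAM session helper) rather than run by hand:

```c
/* Minimal sketch: read-only remote mount with a throwaway writable layer on top.
 * Paths are placeholders; error handling is minimal; needs CAP_SYS_ADMIN. */
#include <stdio.h>
#include <string.h>
#include <errno.h>
#include <sys/mount.h>

int main(void)
{
    /* Assumed layout:
     *   /remote/home/alice                          - read-only network mount (lowerdir)
     *   /run/user-overlay/upper, /run/user-overlay/work - tmpfs-backed scratch space
     *   /home/alice                                 - merged view the session sees   */
    const char *opts = "lowerdir=/remote/home/alice,"
                       "upperdir=/run/user-overlay/upper,"
                       "workdir=/run/user-overlay/work";

    if (mount("overlay", "/home/alice", "overlay", 0, opts) != 0) {
        fprintf(stderr, "overlay mount failed: %s\n", strerror(errno));
        return 1;
    }
    /* On logout: umount /home/alice and delete upper/work to discard all writes. */
    return 0;
}
```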