
It’s funny, I’d say the details are right, but the overall picture is still wrong.

It tries to cram too many things into this one show. Like a medley of computing history.


Yeah, the characters kind of feel like Doonesbury characters, where they just slot in wherever they're needed at a particular moment in history. Each season's story by itself feels authentic, but when you watch their character arcs from start to finish, each person involved would have to be a generational talent.

And it's not like that kind of thing never happens, like look at General Magic and its through-lines through the tech industry up until 2015 or so, but it just happens too conveniently in the show. Particularly Bosworth's role seems far-fetched to me. He's already at the end of his career in season 1, and somehow he remains relevant through the internet age?

The "Phoenix" monologue in the last episode evokes nostalgia for everything Donna and Cameron have been through, but it also breaks suspension of disbelief by pointing out just how much of history these two people have been involved with firsthand.


Take a look at the careers of Ken Thompson, Rob Pike, Brian Kernighan, Larry Wall, James Gosling, Kirk McKusick, Allen Holub, Al Aho, Marvin Minsky, Daniel Friedman, Gerald Sussman, Lance Leventhal, John Carmack, John Romero, Paul Graham, Guy Steele, Christopher Date, Bill Joy, Eric Raymond, Douglas Comer, Andrew Tanenbaum, David Patterson, Jeffrey Ullman, Fred Brooks, or Jim Keller.

To stretch the human analogy, it's short-term memory that's completely disconnected from long-term memory.

The models currently have anterograde amnesia.


I was thinking the exact same thing. Moltbook isn't that sophisticated. We're moving goal posts a lot here.

However, I do think 1 week is ambitious, even for a bad clone.


So if Reddit is just a CRUD app, what is Moltbook?

An impressive MVP of Reddit, with zero sophistication. It's a CRAP app.

My point exactly. But if you're semi-capable and have a week of spare time, you can build a better Reddit clone, or so I heard.

>IIUC, same model with same seed and other parameters is not guaranteed to produce the same output.

Models are so large that random bit flips make such guarantees impossible with current computing technology:

https://aclanthology.org/2025.emnlp-main.528.pdf


>but the code to me is a forcing mechanism into ironing out the details, and I don't get that when I'm writing a specification.

This is so on point. The spec-as-code people try again and again, but reality always punches holes in their spec.

A spec that wasn't exercised in code is like a drawing of a car: no matter how detailed the drawing is, you can't drive it, and it hides 90% of the complexity.

To me the value of LLMs is not so much in the code they write. They're usually too verbose and start building weird things when you don't constantly micromanage them.

But you can ask very broad questions, iteratively refine the answer, critique what you don't like. They're good as a sounding board.


I love using LLMs as rubber ducks as well: what does this piece of code do? How would you do X with Y? etc.

The problem is that this spec-driven philosophy (or hype, or mirage...) would lead to code being entirely deprecated, at least according to its proponents. They say that using LLMs as advisors is already outdated, we should be doing fully agentic coding and just nudge the LLM etc. since we're losing out on 'productivity'.


>They say that using LLMs as advisors is already outdated, we should be doing fully agentic coding and just nudge the LLM etc. since we're losing out on 'productivity'.

As long as "they" are people that either profit from FOMO or bad developers that still don't produce better software than before, I'm ok ignoring the noise.


I yawned when I read your comment.


他是 is tāshì, which doesn't transform, I think. Did you mean to write 你是 nǐshì? I think that transforms differently though, with 你 only dropping to a half third tone.

The classic example is 4/4 不是, which goes bùshì -> búshì.

Or 3/3 that becomes 2/3. E.g. 你好 nǐhǎo becoming níhǎo.

The 1/4 -> 2/4 transformation is, I think, specific to 一: 一个 yīgè becomes yígè.


>In particular, an error on one line may force you to change a large part of your code.

There's a simple trick to avoid that: use `.clone()` more and use fewer references.

In C++ you would probably be copying around even more data unnecessarily before optimizing. In Rust everything is move by default, so a few clones here and there can obviate the need to think about lifetimes everywhere and still put you roughly on par with ordinary C++.

You can still optimize later, once you've solved the problem.
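A minimal sketch of the idea, with made-up names: storing an owned clone instead of a reference keeps the lifetime parameter off the struct entirely.

    // Illustrative only: `Config` and `build` are made-up names.
    struct Config {
        name: String, // owned copy; a &str field would force Config<'a>
    }

    fn build(name: &str) -> Config {
        Config {
            name: name.to_string(), // clone the data instead of borrowing it
        }
    }

    fn main() {
        let raw = String::from("cache-settings");
        let cfg = build(&raw);
        drop(raw); // the original can go away; cfg owns its own copy
        println!("{}", cfg.name);
    }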


Clone doesn't work when you need to propagate data mutations, which is what you need most of the time.

Another option is to just use cells and treat the execution model similarly to JavaScript, where mutation is limited to specific scopes.
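A minimal single-threaded sketch of the cell approach (made-up example): `Rc<RefCell<T>>` hands out several handles to the same data, and a mutation through one handle is visible through all of them.

    use std::cell::RefCell;
    use std::rc::Rc;

    fn main() {
        // Shared, mutable state without threading &mut references around.
        let scores = Rc::new(RefCell::new(vec![1, 2, 3]));

        let writer = Rc::clone(&scores);
        writer.borrow_mut().push(4); // the mutation propagates to every handle

        println!("{:?}", scores.borrow()); // [1, 2, 3, 4]
    }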


>Year Of The Linux Desktop

After Win11 Microsoft really did all they could to get us there this year.


On your example, without reading into the implementation, I'm wondering whether the comment is wrong, or whether it's telling us about a hidden default, but then what does the 2 mean?

    // Create an LRU cache with a capacity of 100 entries
    let mut cache = LRUKCache::new(2);

Why 100? Why not 2?


Another case where I wish Rust had named parameters. For reading code without an IDE (which is a lot of the experience of discovering a library, learning, navigating code on GitHub...) they would be useful.
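The usual workaround in the meantime is a config struct or a builder, so the call site names the parameter itself. A rough sketch with made-up names (`LruKCache` and `k` are illustrative, not the actual library API):

    struct LruKCacheConfig {
        k: usize, // the K in LRU-K
    }

    struct LruKCache {
        k: usize,
    }

    impl LruKCache {
        fn new(config: LruKCacheConfig) -> Self {
            LruKCache { k: config.k }
        }
    }

    fn main() {
        // The field name spells out what the bare `2` meant.
        let cache = LruKCache::new(LruKCacheConfig { k: 2 });
        println!("k = {}", cache.k);
    }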


I think so too; it's beautifully designed in many ways, but this seems like an oversight.


Derp, it should be LRU instead of LRU-K.

