Right, that's what I'm saying. For whatever reason it seems publishers decided they don't want their preview-only books as part of the full-text search across all books. If they decide that, Google has to comply.
This isn't like web search, where web pages are publicly available and Google can return search results across whatever it wants. For books, it relies on publisher cooperation both to supply book contents for indexing under license and to grant permission for previews. If publishers say to turn off search, Google turns off search.
They don't do full-text search anymore, especially for copyrighted books. I wonder if this is not a regression but an intent to give them a leg up in the AI race.
The YT transcripts are linked to on the YT page itself. If they remove that, it is trivial to use a local STT model to transcribe the video. If they make it impossible to download a video, you could just have a microphone record all of the sound, and so on. Once you have the transcription of anything, summarizing is trivial. I have a local script that does this and I use it all of the time. It also produces diagrams for the YT summaries. Hours saved, per day.
The results on the left are contemporary; the ones on the right are decades old. That includes editions of the same book --- surely the newer edition is going to be preferred by most readers.
I guess. That's not immediately clear to me. However, browsing around on Google Books suggests that it's the corpus that changed, not the algorithms.
> surely the newer edition is going to be preferred by most readers.
Why? Where different editions exist, the reader will want to know which one they're getting, but they're unlikely to systematically prefer newer editions.
But also, Google Books isn't aimed at "readers". You're not supposed to read books through it. It's aimed at searchers. Searchers are even less likely to prefer newer editions.
> they're unlikely to systematically prefer newer editions
That seems wrong to me. Generally when a new edition of something is put out it's (at least nominally) because they've made improvements.
("At least nominally" because a publisher may put out different editions regularly simply because doing so gets people to keep buying them. E.g., if some university course uses edition E of book B, then students may feel that they have to get that specific edition, and the university may feel that it has to require the latest edition rather than an earlier one so that students can reliably get hold of it. So if the publisher puts out a new edition every year that's just different for the sake of being different, that may net them a lot of sales. But I don't think it's true for most books with multiple editions that later ones aren't systematically better than earlier ones.)
> But I don't think it's true for most books with multiple editions that later ones aren't systematically better than earlier ones.
Most books with multiple editions are books that have been translated multiple times. It is definitely true that later translations aren't systematically better than earlier ones.
You focused on the C++ aspect and completely failed to engage with the actual critique - what is a “simple” language that you’re evaluating Rust against as a failure?
The OS is to blame. There should be a way for the OS to tell the app "offload your state", like phones do. Paging is supposed to achieve this but does not.