He encodes bits as the signs of DCT coefficients. I feel like this is less efficient than it could be. A better approach, IMO, would be to ignore the AC coefficients altogether and instead encode several bits per block into the DC coefficient. Not using the chrominance also feels like a waste.
This actually won't work against YouTube's compression. The DC coefficient is always quantized, rounded, scaled, and otherwise altered, which means those bits are pretty much guaranteed to be destroyed immediately. If that happens for every single block, the data is unrecoverable. Also, chrominance is skipped on purpose, because it is compressed much more aggressively than luminance.
I meant choosing between multiple values, e.g. 4 of them to represent 2 bits: say 0.25, 0.5, 0.75, and 1. Then when decoding you would pick the closest valid value, so 0.20, for example, would decode as 0.25. Not using the AC coefficients would mean that, theoretically, you'd have more bitrate available for the DC ones.
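To sketch what I mean (illustrative only; the level values, the 2-bits-per-symbol choice, and the nearest-level decode are my own assumptions, not anything the author does):

    // Sketch: encode 2 bits per block as one of 4 DC levels,
    // decode by snapping the observed value to the nearest level.
    const LEVELS = [0.25, 0.5, 0.75, 1.0]; // index 0..3 carries 2 bits

    function encodeSymbol(bits: number): number {
      return LEVELS[bits & 0b11]; // bits is an integer 0..3
    }

    function decodeSymbol(observed: number): number {
      // choose the index whose level is closest to the (possibly distorted) value
      let best = 0;
      for (let i = 1; i < LEVELS.length; i++) {
        if (Math.abs(observed - LEVELS[i]) < Math.abs(observed - LEVELS[best])) {
          best = i;
        }
      }
      return best;
    }

    // e.g. a coefficient that comes back as 0.20 after compression decodes to index 0 (0.25)
    console.log(decodeSymbol(0.20)); // 0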
I’ve been told this many times in the comments, but this again is not reliable. Simply put, compression doesn’t necessarily follow a pattern, so specifying “ranges” or rounding to a specific place will not work. Compression optimizes for the eye, and doesn’t do the same thing to every value. It will round some down, some more, others less. Giving a range is simply not enough.
I'm not well-versed in the terms, so I'm not sure which part is the so-called "audio aliasing."
To me, the original has very obvious background noise which the enhanced version removes. But as the author has said, the enhanced version sounds "muffled" (and, IMHO, not just a little), which probably makes most people (including me) feel it sounds worse.
Also, shouldn't most of the music be included in the game's official OST? I assume that version would not be limited by the game medium's technical limitations at the time and should best represent the artistically intended version.
Edit: apparently in this very case, "Metroid: Zero Mission" doesn't seem to have any official OST release. Unfortunate.
What I don't get is how the author can't pin the year down to anything narrower than "between 1994 and 1997," especially considering he wrote the article in 2002, only a few years later.
I'm not at all implying the story was fake; just this particular thing feels weird.
That’s because the HTML is server-side rendered (SSR) with data-theme="dark" hardcoded on the <html> element, so on the initial page load the browser immediately renders with dark-mode styles applied. After roughly 500-600 ms, Nuxt’s JavaScript hydration kicks in (this is a Nuxt app, based on __NUXT__ at line 11,236), which detects your macOS system preference via the prefers-color-scheme media query and updates data-theme to "light".
It shouldn't need to do this. Nuxt has a @nuxtjs/color-mode module which ensures that the correct colour scheme is applied before the browser starts rendering the HTML.
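For reference, the general trick (whether done by that module or by hand) is a tiny render-blocking inline script in <head> that resolves the theme before the first paint, so hydration only confirms it instead of flipping it. This is just a sketch of the idea; the 'theme' storage key is an assumed name, not necessarily what the module uses:

    // Sketch: run inline in <head>, before anything is painted, so the
    // correct theme attribute is already set on first render.
    // The 'theme' localStorage key is an assumed name; data-theme matches the site's attribute.
    (() => {
      const stored = window.localStorage.getItem('theme'); // explicit user choice, if any
      const prefersDark = window.matchMedia('(prefers-color-scheme: dark)').matches;
      document.documentElement.setAttribute('data-theme', stored ?? (prefersDark ? 'dark' : 'light'));
    })();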
Not sure whether such "criticism" is welcome here, since it is ultimately subjective, but I will just be blunt and say: I disagree.
I like this style of writing as well, but I think this article overdoes it, to the point that it becomes somewhat irritating to read.
The part where I particularly feel this way is when the author spends two whole paragraphs discussing why YouTube (or its developers) chose to sample in 100 segments, to the extent that the author even asks, "If you work at YouTube and know the answer, please let me know. I am genuinely curious." Which, for lack of a better word, I found ridiculous.
> Not sure whether such "criticism" is welcome here, since it is ultimately subjective, but I will just be blunt and say: I disagree.
If this was my post I'd certainly appreciate criticism.
> but I think this article overdoes it
Perhaps it's overdone in places, and to your credit, the question about whether 100 was an arbitrary number was a bit much. But, as a counterpoint, I found the related pondering of "might it make sense to have variable-duration windows" interesting. The interpolation YouTube ultimately selected is deceptive, and variable density could be a way to mitigate that.
There's definitely a healthy balance to strike, and perhaps the author teeters toward the verbose end, but I mostly just wanted to say that I was surprised by the type of article it was, and not in an unpleasant way.
You are likely right that I over-rotated on the "storytelling" aspect there. My curiosity about the "100 segments" stemmed from wondering if there was a deeper statistical reason for that specific granularity (e.g., optimal binning relative to average video length) versus it just being a "nice round number."
That said, I can see how dedicating two paragraphs to it felt like over-dramatizing a constant. I will try to tighten the pacing on the next one. Thanks for reading despite the irritation!
Windows struggles even with native apps, as soon as you have monitors using different scaling settings.
I'm currently using a laptop (1920x1200, 125%) + external monitor (1920x1080, 100%) at work. Task Manager has blurry text when it's on the external monitor. It is so bad.
Yep, I've been running a Windows laptop plugged into a pair of monitors for the past ten years at work, and across multiple laptops and from Windows 10 to 11, this has always been a problem. If I undock to do some work elsewhere and come back, I either have to live with a bunch of stuff now being blurry, or I need to re-launch all the affected programs.
I also have programs that bleed from one monitor onto another when maximized. AutoCAD is one offender that reliably does this -- if it's maximized, several pixels of its window will spill past the edge onto the adjacent screen. The bar I set for Windows is pretty low, so I'm generally accepting of the jank I encounter in it vs. Linux, where I know any problem is likely something I can fix. Still, that one feels especially egregious.