My guess is that the OS assets would have to be redone after that. We are still stuck using pixel art for lots of things, and it is nice to have some standards until vector is good enough for everything.
Having said that, I still wish they were better supported in modern OSes as a hybrid solution: they start to fall apart at small sizes, but they handle scaling in the other direction (which is mostly what this topic is about anyway) well, certainly better than just upsampling a bitmap icon.
Because you still have to rasterise it, which is a relatively slow operation, and since rendering the OS assets is one of the most common tasks you'll perform, it probably needs to be a very fast routine.
It's not that simple. Normally you render significantly upscaled content into an off-screen buffer and then downscale it, for the best possible appearance, every single frame. There are some ratios you need to stick to, or you'd end up with Linux-style graphics "quality" (try a scaling factor other than 1.0). If you just scale an icon/bitmap once without thinking about this, it would most likely look ugly after the upscaling/downscaling steps (unless you got lucky).
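A minimal sketch of that render-big-then-downscale idea, using Pillow (the library and the 2x factor are my choices for illustration, not anything an OS actually does):

    from PIL import Image, ImageDraw

    SCALE = 2  # integer supersampling factor; fractional ratios are where things get ugly

    def render_icon(target_w, target_h):
        # Render at SCALE x the target size into an off-screen buffer...
        big = Image.new("RGBA", (target_w * SCALE, target_h * SCALE), (0, 0, 0, 0))
        draw = ImageDraw.Draw(big)
        draw.ellipse([0, 0, big.width - 1, big.height - 1], fill=(200, 30, 30, 255))
        # ...then downscale exactly once, with a decent filter (not NEAREST).
        return big.resize((target_w, target_h), Image.LANCZOS)

    icon = render_icon(32, 32)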
It's like when you learn to notice aliasing artifacts in photos taken with DSLRs: it cannot be unseen. So even if those issues are mostly noticeable at low resolutions, you "cannot unsee" them at higher resolutions anymore either.
The bigger problem is getting decent renderings at lower resolutions. My wife did a lot of red-lining for low-end Nokia phones. Even at the higher end, you still need to carefully produce bitmaps (make them in Illustrator, red-line the bits in Photoshop). This becomes easier at higher resolutions, but you still need to do it, I think (fonts are all hinted to avoid artifacts, and some basic shapes are fine, but do anything complicated and you'll begin to notice jaggies that you can only eliminate with a hand-tuned bitmap...).
Probably because Windows already has the worst resolution-scaling system of any operating system, and developers basically have to support every single resolution that shows up in hardware. So they try to limit the set of resolutions to make it somewhat more manageable (even if Windows still remains the worst at scaling).
As depicted in the chart in the article, 8K resolution only makes sense for displays that are very large or very close. 8K is where we stop increasing resolution and start increasing field of view, which is going to require some UI changes: games and movies will be about the only things that should take up the full screen, popup notifications in the corner might not even catch the eye if you're working in a window on the other side of the screen, and eye/head tracking might become really useful.
4K on a 28" monitor is only about 157 DPI. It is a step up, but not as nice to look at as an rMBP's 220 DPI screen (you can see the pixels everywhere!). The retina iMac has to go to 5K to look as nice as an rMBP.
The “retina” iMac is 5120 by 2880 pixels at 27" diagonal, or 218 pixels per inch. (And it’s amazing.)
4K (3840 by 2160) is pretty nice on a 24" display, but I’d rather have a 20–22" diagonal for it.
At “8K” resolution (i.e. 7680 by 4320 pixels), you’ll get a great result for a display at like 40–44 inches diagonal (220–200 pixels per inch), or even 50", as with such a big display you’ll tend to stay a bit further back.
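For the record, all of those figures drop straight out of ppi = sqrt(w^2 + h^2) / diagonal. A quick check (plain Python; the display sizes are just the ones mentioned above):

    import math

    def ppi(w, h, diagonal_inches):
        # diagonal resolution in pixels divided by diagonal size in inches
        return math.hypot(w, h) / diagonal_inches

    print(ppi(3840, 2160, 28))   # ~157 -- 4K on a 28" monitor
    print(ppi(5120, 2880, 27))   # ~218 -- the 5K retina iMac
    print(ppi(7680, 4320, 40))   # ~220 -- 8K at 40"
    print(ppi(7680, 4320, 44))   # ~200 -- 8K at 44"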
I'm yearning for the day when we can't tell if something is printed or rendered on a computer display. We need to get rid of the backlight, so maybe e-ink or OLED, and then...8K might just be good enough.
You aren't going to look at a monitor from the same distance as your phone or rMBP. "Retina" is a function of distance. A 1x1 m pixel is retina when viewed from 1 km ;-)
Retina is merely doubling the existing accepted DPI for a given device and using it as an integer-scaled HiDPI device. I look at a 24" 1920x1080 ("1080p") monitor, so doubling the DPI would give me 3840x2160 ("4K").
My 24" is already at the optimum distance (approximately 29" away), so I'm not sure why you bring up distance here.
Also, my 13" MBP Retina sits under my center monitor, so yes, I do in fact look at it at the same distance (although the optimum distance is approximately 23" for that).
As for 27" screens, their res is typically 2560x1440, thus 5120x2880 ("5k") would be the correct res, and what 27" iMac Retinas ship with (optimum distance is about 24").
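The distance dependence can be made explicit. Using the common one-arcminute acuity rule (my assumption; the "optimum distance" figures above clearly come from a different, ergonomic rule of thumb), the minimum distance at which individual pixels stop being resolvable works out to:

    import math

    ACUITY = math.radians(1 / 60)  # ~1 arcminute, the usual (assumed) visual-acuity limit

    def retina_distance_inches(w, h, diagonal_inches):
        ppi = math.hypot(w, h) / diagonal_inches
        pixel_pitch = 1 / ppi                   # inches per pixel
        return pixel_pitch / math.tan(ACUITY)   # distance at which one pixel subtends 1 arcmin

    print(retina_distance_inches(1920, 1080, 24))    # ~37" for a 24" 1080p panel
    print(retina_distance_inches(3840, 2160, 24))    # ~19" once the DPI is doubled
    print(retina_distance_inches(2560, 1600, 13.3))  # ~15" for a 13" rMBP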
8K is unlikely to surface in consumer desktops and laptops within Microsoft's usual OS upgrade cycle. Does this imply Windows 10 will be supported for longer? I suppose it's either that, or Microsoft finally intends to push the resolution envelope.
I agree that that is the most likely reason. I may also have overestimated, based on the tone of the announcement, how likely the alternative interpretations were.
Isn't that just a scaling issue? I used an 8K virtual display on Windows 7 without problems (a multi-display setup; thanks to AMD Eyefinity, Windows sees only one large screen).
Why not 16K? Microsoft has never done too well with the forward-looking bit. We were stuck at 1080p because software didn't support more and there was no demand for better monitors, and vice versa. Cell phones and Macs validated the market, and five years later MS starts to catch up. 8K displays are coming out now, so Microsoft adds support without looking forward another few years.
An image by definition would always be represented as a 2D array. Unless you are using holography or a projection field... which still in essence requires a 2D array representing the light field, often at an even higher resolution.
Images aren't always processed as 2D arrays, though; they are always represented as 2D arrays for display.
The problem is, of course, not 2D arrays per se, but the RAM and processing power wasted on processing arrays of pixels. You can still use 2D arrays, just in a more efficient fashion.
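To put rough numbers on that waste (a back-of-the-envelope sketch; the 32-bit RGBA format is my assumption):

    w, h, bytes_per_px = 7680, 4320, 4   # one 8K frame of 32-bit RGBA
    frame_bytes = w * h * bytes_per_px
    print(frame_bytes / 2**20)           # ~126.6 MiB for a single frame
    print(frame_bytes * 60 / 2**30)      # ~7.4 GiB/s just to rewrite every pixel at 60 fps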
"Microsoft Windows will support displays with up to 8K resolution" would be better, as using 8K displays would be only useful for the Matrix Architect.
Of course, nobody needs more than 640K of memory either.
Seriously though, although I'm sure 4K will become somewhat commonplace within 10 years, the rate of progress for desktop screens is ridiculously slow compared to the mobile world. I still don't quite understand what stops us from making cheap, large pixel arrays when we can make the tiniest, most colourful pixels to keep in our pockets. Is it just the general market being completely ignorant of, or indifferent to, good monitors?
They already do -- Windows 8.1 looks great on a high-DPI display and most apps support it. (Although it's annoyingly obvious when you run an old app that doesn't, because its window gets upscaled and the text is visibly blurry.)
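For what it's worth, that blur comes from DWM bitmap-stretching the windows of processes that haven't declared themselves DPI-aware (normally done in the app manifest). A runtime equivalent, sketched here from Python via ctypes (Vista and later; the manifest is the proper route):

    import ctypes

    # Declare this process DPI-aware so DWM renders it at native resolution
    # instead of bitmap-stretching (and blurring) it on high-DPI displays.
    ctypes.windll.user32.SetProcessDPIAware()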
Basically, that's been the mood in the industry for the past five years or so.
I am using a 31" Cine 4K 10-bit monitor, and barring the very latest M.2 PCIe SSDs, I can't even get realtime playback of uncompressed 14-bit RAW 4K@24fps footage on a top-end system (60 fps is sci-fi). A top-end NVIDIA GPU won't let me smoothly run any recent game at best settings in 4K or UHD either, I experience occasional signal drop-outs while driving 4K x 2K @ 60 Hz via DP or miniDP with expensive cables, and CPU performance has been improving slowly since 2004. So indeed, it looks like 8K might be the final digital TV resolution unless some dramatic technological change occurs. Cinemas with their large screens seem to be doing just fine with 4K.
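Those playback numbers are plausible. A back-of-the-envelope check (assuming single-channel 14-bit Bayer data at 4096x2160, which is my reading of "uncompressed 14-bit RAW" Cine 4K):

    w, h, bits_per_px = 4096, 2160, 14           # Cine 4K, 14-bit single-channel Bayer RAW
    frame_mib = w * h * bits_per_px / 8 / 2**20
    print(frame_mib)                             # ~14.8 MiB per frame
    print(frame_mib * 24)                        # ~354 MiB/s at 24 fps -- hard on SATA SSDs
    print(frame_mib * 60)                        # ~886 MiB/s at 60 fps -- needs PCIe storage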
My 55" UHD TV shows noticeable improvement only when I stand closer than 1.5 m; subconsciously I have a feeling of better picture quality under 3 m while driving 4:4:4 UHD@60Hz from a computer via HDMI 2.0 (YMMV).
I don't believe that for a second. My 5" phone has a 1920x1080 resolution, and it was a very noticeable step up in quality. Assuming the DPI of my phone is near the limit of what is necessary, it'd take more than 8K to fill my field of view with displays at the same DPI.
How far do you hold your phone from your eyes? Now compare that to your regular TV: how far do you watch it from? With a 55" 4K UHD set you'd get retina-style appearance from 1.5 m upwards.
I would have thought that displaying up to N pixels - where N is much higher than the number of pixels in an 8K display - would generally be possible.