Thursday, June 2, 2016
Apple has been criticized for not launching new 4K/5K monitors ever since it launched the 5K Retina Macs, but it seems that's about to change - and that the new 5K monitors will even have an embedded GPU.
Though Ultra HD 4K monitors are becoming increasingly popular, aiming for higher resolutions shows just how close we're getting to current technical limitations. Not long ago 4K monitors could only work at 30Hz; and just think about it: a 60Hz 4K screen requires something like 1.4GB of data per second! (If you're hoping to get an 8K screen, that's 5.6GB of data - each and every second - which is why newer protocols are now looking into data compression to make it feasible to consider such resolutions.)
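You can sanity-check those figures yourself. A rough back-of-the-envelope sketch, assuming 24-bit color, 60Hz, and ignoring blanking intervals and protocol overhead (the figures come out in binary gigabytes, GiB):

```python
def uncompressed_bandwidth(width, height, refresh_hz, bits_per_pixel=24):
    """Bytes per second needed to push raw, uncompressed frames.

    Ignores blanking intervals and link overhead, so real links
    need somewhat more than this.
    """
    return width * height * (bits_per_pixel // 8) * refresh_hz

GIB = 2 ** 30  # binary gigabyte

# 4K (3840x2160) at 60Hz: roughly 1.4 GiB per second
print(f"4K @ 60Hz: {uncompressed_bandwidth(3840, 2160, 60) / GIB:.1f} GiB/s")

# 5K (5120x2880) at 60Hz: roughly 2.5 GiB per second
print(f"5K @ 60Hz: {uncompressed_bandwidth(5120, 2880, 60) / GIB:.1f} GiB/s")

# 8K (7680x4320) at 60Hz: roughly 5.6 GiB per second
print(f"8K @ 60Hz: {uncompressed_bandwidth(7680, 4320, 60) / GIB:.1f} GiB/s")
```

The 8K figure alone explains the push for stream compression: it's four times the 4K number, and no mainstream cable of the time could carry that uncompressed.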
Anyhow, on the 5K Macs Apple had to "tweak" the CPU and pixel-driving circuitry to make it all work, and that's why it didn't launch a separate monitor. But now we have Thunderbolt 3, newer DisplayPort capabilities and USB 3.1, making such a monitor possible... and Apple would go even further by embedding a GPU in the monitor itself.
This is uncommon, though it goes in line with the trend of placing the GPU outside the computer (some gaming laptops are doing it as well), and it would allow even a modest computer to drive such high-resolution displays. On the other hand, unless Apple allows for a replaceable GPU (not likely), this would mean your monitor becomes obsolete faster than ever.
Monitors are one of the few computer-related things you can keep for a decade (or more). GPUs, on the other hand, easily become obsolete in a couple of years or so. (In my case, since I first got my current monitor, I've changed graphics cards 3 times - soon to be 4.) So... I'm not so sure this is the way to go, though I can imagine all monitor manufacturers would love for the market to buy new monitors every couple of years instead of doing it once a decade or so.