You’re unlikely to notice the difference with 4K TVs, so what’s the point, we ask
Are we being ‘over-pixelated’ in a quest for higher resolution?
Ever since audio-visual media was first produced for our entertainment back in the late 19th century, the race has always been on to improve its quality. We’ve seen countless innovations, from the arrival of glorious stereo sound to the mind-blowing switch from black and white to colour, but the aim has always been to make it seem more “real”, to narrow the gap between our experience of watching the screen and our experience of the world around us.
One of the more recent improvements to be marketed to us has been 4K, or Ultra HD, two terms that have become interchangeable and mean roughly the same thing: extremely high-resolution video that’s roughly 4,000 pixels across. The format has been touted for a while; back in 2015, Panasonic and LG were marketing screens by telling us that 4K “will change the way we watch television”. Thus far it has done no such thing, but that might be set to change.
In the past week, Apple, Amazon and Google have cut the price of 4K films on iTunes, Amazon Video and Google Play respectively, in the hope we will finally buy (and use) their associated 4K hardware, i.e. 4K Apple TVs, Amazon Fire TVs and Chromecast Ultras. Oh, and not forgetting a 4K television to actually watch them on.
Yes, having the full 4K experience in a domestic setting involves a big outlay for consumers, even if the price of the videos themselves has fallen. The question that those consumers have been asking – and it’s a good one – is how much improvement they will end up with once they’ve spent that money.
The jump from “normal” television to HD was sufficiently noticeable to prompt millions of people to upgrade, but 4K pushes the boundaries of what humans can actually perceive. These new television sets, with their 3,840 x 2,160 pixels, may well have four times the 1,920 x 1,080 pixels of an HD television, but in practice we’re unlikely to be able to tell the difference between them.
Optical experts tell us that a person with 20/20 vision can resolve 60 pixels per degree of vision. An analogy: if you step further and further away from a bowl of sugar, you’ll become unable to see the individual grains. For us to appreciate the extra pixels that 4K offers, we’d either have to sit much closer to the television, or our screens would have to be obscenely large (around 2.5 metres on the diagonal if we’re sitting two metres away). These solutions would significantly disrupt living spaces; we’d either have to rearrange the room or have our lives dominated by a screen that’s even bigger than the people who are watching it. This poses a dilemma for manufacturers; they’ve become used to successfully coaxing us into upgrading on the promise of better quality (even if we show initial resistance), but what happens when we no longer believe them and start to wonder if we’re being sold ‘the Emperor’s new clothes’?
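The screen-size claim above can be checked with a back-of-the-envelope calculation. This is a hedged sketch, not a vision-science model: it assumes the 60 pixels per degree acuity figure quoted above, a standard 16:9 panel, and the article’s two-metre viewing distance, and asks how big a 4K screen would have to be before its extra pixels become resolvable.

```python
import math

def min_diagonal_for_4k(distance_m=2.0, h_pixels=3840, ppd=60, aspect=16 / 9):
    """Smallest 16:9 screen diagonal (in metres) at which a viewer with
    20/20 vision, sitting distance_m away, could in principle resolve
    every horizontal pixel of a 4K panel, assuming ppd pixels per degree."""
    # Angle subtended by a single pixel, in radians (1/60 of a degree here).
    angle_per_pixel = math.radians(1 / ppd)
    # Physical pixel pitch needed so one pixel fills that angle at this distance.
    pitch = 2 * distance_m * math.tan(angle_per_pixel / 2)
    width = h_pixels * pitch          # full screen width in metres
    height = width / aspect           # 16:9 screen height
    return math.hypot(width, height)  # diagonal

print(round(min_diagonal_for_4k(), 2))  # prints 2.56
```

The result, roughly 2.56 metres on the diagonal, agrees with the “around 2.5 metres” figure quoted above; any smaller screen at that distance packs its pixels more finely than the eye can resolve.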
Historically speaking, we have tended to buy their arguments, even if the claims have been on shaky ground. Few people, under normal listening conditions, could distinguish between a high-quality MP3 and its uncompressed equivalent, but if we’re told that we’ve been emotionally short-changing ourselves by listening to MP3s, there will always be people who obediently jettison their old media and pay for the upgrade. We’ve snapped up cables with gold-plated connectors or gently curved TV screens, convincing ourselves that there’s a noticeable improvement, even if science tells us we won’t necessarily be able to see or hear it. But the claims being made for 4K look particularly hollow when you discover that most films are currently mastered in 2K and merely scaled up for the new format.
The irony is that, away from the fuss surrounding the superficially impressive pixel counts and screens that are too big to carry up a flight of stairs, a more significant technological improvement has been made more quietly. In a list drawn up by a team of video display consultants of the most important aspects of picture quality, the number of pixels (the 4K question, if you like) comes last. Matters relating to colour, such as contrast and saturation, are deemed far more important – and that’s where we’re seeing real, noticeable improvement on newer televisions, thanks to a technology called HDR, or High Dynamic Range (this is different to the HDR you’ll encounter in the world of photography – it just happens to have the same name).
Simply put, HDR pumps up the luminance of colours being displayed on a screen, making dark colours look richer and light colours much brighter. There are three competing standards battling for supremacy, but they all bring significant enhancements that, crucially, the human eye can see very easily. If any new 4K gear that you buy carries an “Ultra HD Premium” label, it’s been certified to offer you the full HDR experience – provided, of course, that whatever film you’re watching has had the HDR mastering treatment.
Given that the industry has produced a technology that makes a genuine improvement to our experience of moving pictures, why are content providers and hardware manufacturers pushing 4K rather than HDR as the next step for consumers to take?
The answer, perhaps unsurprisingly, is that big numbers sound impressive. For all HDR’s ingenuity, it’s not particularly easy to explain (as I just discovered) and you’re unlikely to get a coherent description of its benefits from a salesperson in a store.
In terms of pixels, however, we can be given the oldest sales pitch in the book: you’re getting more for your money. It’s worth remembering, however, that numbers aren’t everything. LG and Sharp have already been touting their new 8K screens at electronics shows, but in terms of human perception, the numbers have already gone as high as they can go. If you’re looking for genuine improvement in your audio-visuals, the real innovation will be taking place under the bonnet.