At this month’s Consumer Electronics Show, a large number of companies were pushing the same barrow – 4K. And no surprise, either. 4K is one of the newest technologies to hit the market, and its simple premise of a better-quality image is an easy sell to customers who were surprisingly uninterested in 3D, and have not bothered upgrading significantly since full high-definition became the default.

TV sales have been dismal over the last few years, and companies such as Panasonic and Sony need to try something new to stir customers into picking up the very latest and greatest.

Enter 4K. On paper, 4K is pretty easy to describe. It’s a significant upgrade in resolution over the more familiar forms of high definition, and is known as “ultra-high definition”. The “4K” refers to roughly 4000 pixels of horizontal resolution. This is an interesting departure from previous measures: 720p HD and 1080p full HD both refer to the number of lines, or vertical resolution.

To put it in a comparable figure, 4K would be an impressive 2160p. Because the resolution doubles in both directions, the jump isn’t a mere doubling: it’s four times the number of pixels – an obvious and significant improvement.
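As a quick back-of-the-envelope check, here is that arithmetic in a few lines of Python, using the standard consumer figures of 1920 × 1080 for full HD and 3840 × 2160 for 4K UHD:

```python
# Pixel counts for full HD versus consumer "4K" UHD,
# which doubles 1080p in each direction.
full_hd = 1920 * 1080  # 2,073,600 pixels
uhd_4k = 3840 * 2160   # 8,294,400 pixels

print(uhd_4k / full_hd)  # 4.0 - four times the pixels
```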

New-gen console manufacturers have been vague on their support for 4K. Sony has muttered about maybe possibly supporting it at some time, stating that the PlayStation 4 "doesn’t currently support 4K output for games", while 4K movie output is “in consideration”.

Whether the Xbox One will support 4K is even more unclear. Surely either console could provide a fantastic ultra-HD experience? Yes – on paper. The reality is somewhat different, and here’s where 4K’s advantages start to look vastly less appealing.

PIXEL OVERLOAD

The first problem is content. Having four times the pixels means four times the data. Even a Blu-ray, currently the largest-capacity disc-based medium, can only store about 50GB. A standard movie is approximately 30GB. A 4K movie would be closer to 120GB.

These numbers are estimates, and newer compression formats could shift them, but they serve to illustrate that a disc-based medium with the necessary capacity simply doesn’t exist yet. There are specifications in discussion, but nothing has progressed further than that.
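To make the ballpark concrete, here is a minimal sketch of that storage arithmetic in Python – the 50GB, 30GB and four-times figures are the rough estimates from above, not hard numbers:

```python
# Ballpark storage arithmetic using the estimates above.
# Real file sizes depend heavily on codec and bitrate.
blu_ray_capacity_gb = 50        # dual-layer Blu-ray disc
hd_movie_gb = 30                # typical 1080p feature film
uhd_movie_gb = hd_movie_gb * 4  # four times the pixels, roughly four times the data

print(uhd_movie_gb)                        # 120 GB
print(uhd_movie_gb > blu_ray_capacity_gb)  # True - it doesn't fit on the disc
```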

When Sony shipped the first 4K TVs, the company included a loaner hard disk drive containing several movies – anything to make the US$18,000 display useful. Sony in particular is pushing 4K content as hard as it can, including a new “Mastered in 4K” range of Blu-ray discs that has seen it cop a lot of criticism for misleading advertising.

A “Mastered in 4K” film is not in 4K; it is a standard Blu-ray (1080p) with some quality improvements that make it scale up better than a normal Blu-ray would.

The proposed solution to this content shortage leads to the next fatal flaw: online streaming services. Some streaming providers – most notably Netflix, and now Sony too – announced at the Consumer Electronics Show that they would provide 4K streams for viewers demanding higher-resolution content.

Netflix in particular will begin recording its highly-regarded House of Cards original series in 4K, and transition all other Netflix originals to 4K filming at a significant cost. With streaming services such as Netflix becoming the standard mode of viewing for many forms of visual media, it seems that this generation of upgrades may not be limited by the storage medium at all.

Except in New Zealand.

Four times the pixels means four times the bandwidth. There is no way around this; it’s a simple fact. Given that it’s rare for Kiwis to have an internet connection good enough to stream full HD without buffering, few will be able to sustain the roughly 20Mbps required for 4K content.
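The same factor-of-four logic, sketched out – note that the 5Mbps figure for a full HD stream is an illustrative assumption, since real streaming rates vary by codec and service:

```python
# Factor-of-four bandwidth arithmetic. The 5 Mbps full-HD
# figure is an assumption for illustration only.
full_hd_stream_mbps = 5
uhd_stream_mbps = full_hd_stream_mbps * 4

print(uhd_stream_mbps)  # 20 Mbps - the figure quoted above
```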

New Zealand is simply too far behind technologically for the revolution of streaming to be viable here – a galling reality that shows no signs of improvement. Here, 4K itself isn’t the issue.

On console, the bandwidth problem takes a different form. Console games don’t need streaming bandwidth or storage for their imagery: the console generates the content, so in principle it can render at any resolution. But there are still costs and consequences to drawing that many pixels. There is a reason many PlayStation 3 and Xbox 360 games don’t run at 1080p.

Even on the Xbox One, Call of Duty: Ghosts only runs at 720p and is upscaled to “full HD”. Rendering at higher resolutions is hard on the system, draining performance that is needed for other things – especially if the extra detail isn’t visible anyway.

And that is the final, fatal flaw for 4K when it comes to console games. Our eyes just aren’t good enough.

MATH IS SEXY

What none of the marketing mentions is that the majority of people in the majority of situations simply cannot see the difference between 1080p and 4K. The reason is simple in principle, but complex mathematically. It’s called angular resolution, and it comes down to this: the closer a viewer sits, the finer the detail they can see; further away, only coarser detail is visible.

Extremely high resolutions are only useful very close. Picture the smallest detail the eye can resolve as a narrow triangle extending from the eye: the further away an object sits, the larger it must be to fill that triangle – that’s the “angular” part. Angular resolution is measured in degrees, or more precisely in “arcminutes”. An arcminute is a 60th of a degree, and is roughly the finest detail a typical eye can resolve.

The amount of detail the eyes can see therefore becomes a balance of three things: the resolution, the physical size of the screen, and how close the viewer sits. None of these are fixed numbers, so all that can really be talked about is averages. The same goes for the eyes.

Some viewers will have better vision than others. The term 20/20 vision literally means a person’s eyes are average, not good: they can see at 20 feet what an average viewer can see at 20 feet. All of what follows is subject to these averages. People with exceptional vision (or exceptionally large TVs) may get a different experience.

The average distance a person sits from a TV is around nine feet – call it 2.5 metres to keep the numbers round. High-school math shows the following:

2 × π × 2500 = 15,707.96 – the circumference, in millimetres, of a circle whose radius is the 2.5 m viewing distance

15,707.96 ÷ 360 = 43.63 – millimetres per degree

43.63 ÷ 60 = 0.727 – millimetres per arcminute

What this means is that the smallest thing a human can possibly see at 2.5 m from the screen is 0.727 mm wide. This is a best case, assuming perfect contrast, and so on.

Spreading the 1920 pixels of a 1080p image across a 55 inch TV’s 1217 mm width gives a pixel size of only 0.634 mm. As shown above, that’s smaller than the eye can detect, even as a best case. The math is slightly simplified (there are spaces between pixels, for example), but it’s in the right ballpark.
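For the curious, the whole calculation can be wrapped up in a short Python sketch, using the one-arcminute acuity limit and the average figures discussed above – individual eyes and living rooms will, of course, vary:

```python
import math

# Angular-resolution arithmetic from the text. Assumes an average
# eye that resolves one arcminute, and a 16:9 55-inch panel.
ARCMINUTE_RAD = math.radians(1 / 60)  # one arcminute in radians

def smallest_visible_mm(distance_mm):
    """Smallest detail a one-arcminute eye can resolve at this distance."""
    return distance_mm * ARCMINUTE_RAD

def pixel_pitch_mm(screen_width_mm, horizontal_pixels):
    """Physical width of a single pixel on the screen."""
    return screen_width_mm / horizontal_pixels

distance = 2500    # viewing distance in mm (about 2.5 m)
width_55in = 1217  # width of a 16:9 55-inch panel in mm

print(smallest_visible_mm(distance))     # ~0.727 mm - the acuity limit
print(pixel_pitch_mm(width_55in, 1920))  # ~0.634 mm - a 1080p pixel, already below it
print(pixel_pitch_mm(width_55in, 3840))  # ~0.317 mm - a 4K pixel

# Distance at which a 4K pixel would become resolvable:
print(pixel_pitch_mm(width_55in, 3840) / ARCMINUTE_RAD)  # ~1090 mm, about 1.1 m
```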

This means that for most people, using most televisions, in most situations, 1080p is already at or beyond the limit of their vision. A 4K pixel on a 55 inch television is only 0.32 mm wide. To actually see that level of detail, a viewer would have to sit uncomfortably close: around 1.1 metres from the screen.

These numbers can be changed. Sitting much closer to the TV, for example, would have an effect. But the middle 60 per cent of our field of vision is the strongest part, and we naturally place the objects we’re looking at within it – a physiological fact. When viewing a larger TV, we tend to move further back to fit it into that 60 per cent sweet spot.

BIGGER AIN'T BETTER

Of course, having a TV resolution that sits exactly at the edge of our perception isn’t ideal. It’s better to go somewhat above it, leaving headroom for things like sub-pixel blending and other techniques that improve the image in ways unrelated to raw resolution. But these are minor factors.

It also has to be clarified that none of this applies to PC gaming. PC gamers sit roughly one metre from their screens, so in that instance resolution absolutely does make a difference, even on monitors smaller than TVs.

Some people also simply have better vision than others, and may well be able to see details others would not. Or they may prefer sitting closer than the roughly two-metre mark at which a 55 inch TV theoretically becomes “out-resolutioned”. Or, indeed, they may have a TV much bigger than 55 inches.

This is not an attempt to pooh-pooh progress, nor is it a short-sighted “640K will be enough for anyone” statement. It’s a simple fact. While the numbers might get bigger, for the vast majority of people in the vast majority of situations with the vast majority of TVs, 4K will simply not provide any visible improvement over the 1080p of Blu-ray. For console gamers, graphics can and will get better, but they will do so within the current 1080p resolution.