So you finally got around to picking up a 4K TV or monitor? Congratulations… it’ll be obsolete soon. Or at least not state of the art anymore.

The HDMI Forum has just released version 2.1 of the HDMI specification and, among other things, it enables support for 8K displays at 60 Hz.

Don’t need an 8K display? The new specification also brings support for 4K displays with refresh rates up to 120 Hz, which is double the previous maximum.

The new standard also supports dynamic HDR and resolutions up to 10K for some commercial applications.

You’ll need a new “Ultra High Speed HDMI Cable” to get top performance, since existing HDMI cables can’t deliver the 48 Gbps of throughput supported by the new standard. But the new cable is backward compatible, so once you get around to buying new HDMI cables you’ll be able to use them with your old gear as well as your new HDMI 2.1 compatible devices.
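
To put that 48 Gbps figure in perspective, here’s a rough back-of-the-envelope sketch in Python (assuming uncompressed 8-bit RGB and ignoring blanking intervals and link-encoding overhead, so real-world requirements differ somewhat) of how much raw pixel data a few common video modes generate:

```python
# Rough estimate of raw video bandwidth, assuming uncompressed 8-bit RGB
# (24 bits per pixel) and ignoring blanking intervals and encoding overhead.

def raw_bandwidth_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Return the raw pixel data rate in gigabits per second."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

modes = {
    "1080p @ 60 Hz": (1920, 1080, 60),
    "4K @ 60 Hz":    (3840, 2160, 60),
    "4K @ 120 Hz":   (3840, 2160, 120),
    "8K @ 60 Hz":    (7680, 4320, 60),
}

for name, (w, h, hz) in modes.items():
    print(f"{name:>14}: ~{raw_bandwidth_gbps(w, h, hz):.1f} Gbps")

# 8K @ 60 Hz alone works out to roughly 48 Gbps of raw pixel data, which is
# why the new Ultra High Speed cable (and, for deeper color, compression or
# chroma subsampling) is needed.
```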

It could be a little while before we start to see devices with HDMI 2.1 ports though… and it’s questionable whether you’ll actually need an 8K display in your home anytime soon. Even 4K is probably overkill unless you have a very large TV and plan to sit very close to it… although I suspect next-gen gaming and virtual reality displays will benefit from the new standard.


5 replies on “HDMI 2.1 brings support for 8K displays (at 60 Hz)”

  1. So you need a new cable?

    Then why go HDMI, why not go with USB-C?
    It’s reversible, and (will be) more ubiquitous.

    Also wondering when we will get 4K, HDR, and FreeSync together on TVs. While 60Hz is what most titles are optimised for, it would be good to also get 120Hz thrown in there for some extra breathing room.
    I mean, there’s the Xbox One X now, and it supports an HDMI 2.0 output with HDR10, 4K, 60Hz, and FreeSync all at the same time. However, there are exactly ZERO TVs that support it. And there are only TWO gaming monitors that do: the Asus VP28UQG and Samsung UH750.

    Guess we need to wait for 2018 and beyond for FreeSync 2 to come to the general market.

    1. The new HDMI cable is backwards compatible and the high-speed pairs are clocked 3 times faster. It’s a relatively simple interface change for manufacturers. All the out-of-band signals are kept, along with their features (HDMI CEC, etc…). The cost to cable manufacturers is expected to be very small, and testing the cables is straightforward since the only change is to the high-speed pairs. A smart decision in my opinion.

      FreeSync and G-Sync signals to the monitor/TV travel over the existing HDMI cable. That’s another reason to keep the new cable closely based on the previous one.

      USB-C only has 2 high-speed pairs, for a maximum of 20Gbps in one direction… that’s better than the 14.4Gbps that the current HDMI cable offers, but moving to 48Gbps makes HDMI more than twice the bandwidth of USB-C. If USB Type-C were to be upgraded to 40Gbps, it would need a new cable too. (A rough comparison of these figures is sketched after the comments.)

      Just some thoughts.

  2. I completely understand why people would want to shoot footage in 8K, but few people will ever need more than 4K for TVs or monitors.

    1. Rewind 5 years: “I completely understand why people would want to shoot footage in 4K, but few people will ever need more than 1080p for TVs or monitors.”

      1. To be fair, there was a big difference in the processor, memory, and storage requirements when upgrading from 720p footage to 1080p footage. Maybe not exactly in 2013, but at a certain point in time that was true.

        However, the actual observable jump in visual quality from 720p to 1080p wasn’t that pronounced. And even today, it isn’t a very large difference until you blow it up on a large TV and view it from close proximity.

        So yes, shooting in 8K is pointless today, in a practical sense. You can even make the same rationalisation for 4K resolution when looking up from 1080p. Case in point: look at movies released around 2002; some shot in HD look better than others shot in FHD, because resolution is only one factor in overall quality.

        Imagine using a Note8 (or a similar device) today to shoot 720p, 1080p, 4K, and 8K footage. Nothing too fancy, just a home video, let’s say at a park. 720p would be passable on a big (70″) TV, decent but obviously not the best. Jumping to 1080p we would see some improvement; however, the footage would clearly still look like it came from a phone. Bumping further to 4K, there would be almost no discernible improvement. Maybe HDR+ would make an impact, but that isn’t an improvement that comes from the resolution. And finally, let’s jump to 8K. Even if it was HDR+ compliant, our display supported it well, and the display was physically large enough to make the improvements obvious… you still would not get an improvement in quality. That’s because there are other limiting factors; in our case, the small/weak camera and lens. It would be clear that we are watching a home video rather than a production-shot film. It’s nothing like an expensive RED camera, to be frank.

        Maybe 5 years from now we can make the same arguments again,
        where 1080p has become the (low) base resolution, 4K is the mainstream resolution, and 8K is reserved for expensive (prosumer) products outside the mainstream market. However, if you dig up something you shot in 8K today with your Note8-like device, even five years later it won’t look better. You need the right hardware (lens, camera, etc.) and the right software (standard, compression, colour range, etc.) at the time of capturing the footage for it to be worthwhile later.
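
As a footnote to the bandwidth discussion in the first comment thread above, here is the rough comparison promised there. It’s a sketch only, reusing the uncompressed 8-bit RGB assumption from earlier and the link rates cited in the article and the comment (nominal figures; real links lose some capacity to encoding and blanking overhead):

```python
# A sketch only: which uncompressed 8-bit RGB modes fit within the link rates
# cited above? Real links have additional encoding and blanking overhead, so
# treat the result as illustrative rather than definitive.

LINKS_GBPS = {
    "Current HDMI cable (effective, per the comment)": 14.4,
    "USB-C, 2 high-speed pairs one-way":               20.0,
    "HDMI 2.1 Ultra High Speed cable":                 48.0,
}

MODES_GBPS = {
    "4K @ 60 Hz":  3840 * 2160 * 60 * 24 / 1e9,   # ~11.9 Gbps
    "4K @ 120 Hz": 3840 * 2160 * 120 * 24 / 1e9,  # ~23.9 Gbps
    "8K @ 60 Hz":  7680 * 4320 * 60 * 24 / 1e9,   # ~47.8 Gbps
}

for link, capacity in LINKS_GBPS.items():
    fits = [mode for mode, need in MODES_GBPS.items() if need <= capacity]
    print(f"{link} ({capacity:.1f} Gbps): {', '.join(fits) or 'none of these'}")
```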
