Saturday, 27 January 2018

1080P Blu-Ray vs. 4K UHD Blu-Ray: Does "Interstellar" actually benefit from 4K!?

Compromised quality?
Christopher Nolan is known for demanding the best from his work visually. A few weeks back, I had a look at Dunkirk and was able to appreciate the difference the 4K/UHD Blu-Ray made compared to the standard 1080P Blu-Ray. Despite the plot holes and nitpicks (it is sci-fi after all), I very much enjoyed Interstellar in a standard digital IMAX theater when it first came out in 2014. Watching the recently released UHD Blu-Ray, however, while the HDR effect looks OK (I wouldn't say great), I was rather disappointed by the apparent lack of resolution enhancement over 1080P... Even in the IMAX 70mm scenes, everything seemed less defined than I had expected coming from Dunkirk. That seemed strange, especially since this release was claimed to have been "released under the supervision of the director himself, approving the 4K scan and colour timing".

Let's have a closer look shall we?

So, I reached out to the friend who sent me the clips from Dunkirk to have a better look at what's happening here with some actual rips of the Interstellar UHD Blu-Ray...

To start, remember that Nolan likes analogue film, so the technical background is that he captured most of Interstellar in 35mm, with quite a number of scenes in 70mm IMAX (about 1/3 of the movie). You can tell the difference as the aspect ratio changes throughout the movie: anamorphic 35mm presented at 2.40:1 widescreen, opening up to 16:9 for the material filmed in 70mm IMAX (cropped from the native IMAX 1.43:1). By comparison, it's said that his previous films The Dark Knight and The Dark Knight Rises contained 28 minutes and ~1 hour of 70mm footage respectively.
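As a rough aside (my arithmetic, not anything from the disc's specs): cropping the native 1.43:1 IMAX frame to 16:9 at full width throws away about a fifth of the frame height.

```python
def imax_crop_retained(native_ar: float = 1.43, target_ar: float = 16 / 9) -> float:
    """Fraction of frame height kept when a tall aspect ratio (IMAX 1.43:1)
    is cropped at full width down to a wider one (16:9 for home video)."""
    return native_ar / target_ar

# About 80% of the IMAX frame height survives the 16:9 crop.
print(f"{imax_crop_retained():.1%} of frame height retained")
```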

For the discs themselves, the UHD Blu-Ray video was encoded in HEVC 10-bit HDR10 at ~51.7Mbps (a very good average bitrate) compared to the standard Blu-Ray's AVC at a still very decent 24.2Mbps.
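For a rough sense of scale (assuming the ~169 minute runtime; this counts video only and ignores audio tracks and disc overhead), those average bitrates work out to approximate stream sizes like so:

```python
def video_payload_gb(avg_mbps: float, runtime_min: float) -> float:
    """Approximate video stream size in GB (10^9 bytes) from average bitrate."""
    return avg_mbps * 1e6 / 8 * runtime_min * 60 / 1e9

print(f"UHD HEVC: ~{video_payload_gb(51.7, 169):.1f} GB")
print(f"BD AVC:   ~{video_payload_gb(24.2, 169):.1f} GB")
```

That's roughly 65 GB vs. 31 GB of video alone.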

Like in my previous post, I'm going to use madVR to do the conversions which include HDR-to-SDR with this setting:

Notice that I used a peak nits value of 275. This is higher than the 230 I used for Dunkirk, chosen to achieve a closer average brightness match with the regular Blu-Ray rip. What I found with Interstellar is that the color and contrast varied much more between scenes, so although I started with this value, for the comparison images below I had to make little tweaks here and there to keep the brightness reasonably comparable.
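madVR's actual tone-mapping math is its own (and more sophisticated), but as a toy sketch of what that "peak nits" parameter controls, here's an extended-Reinhard-style curve where the chosen peak is what gets mapped to SDR full white:

```python
def tone_map(nits: float, peak_nits: float = 275.0, sdr_white: float = 100.0) -> float:
    """Toy extended-Reinhard HDR-to-SDR curve (NOT madVR's algorithm).

    Luminance up to `peak_nits` is compressed into the 0..1 SDR range;
    `sdr_white` (~100 nits diffuse white) stays near the linear region.
    """
    x = nits / sdr_white
    p = peak_nits / sdr_white
    # Extended Reinhard: near-linear at the low end, rolls off highlights
    # so that nits == peak_nits maps exactly to 1.0.
    return x * (1.0 + x / (p * p)) / (1.0 + x)
```

Changing `peak_nits` shifts the overall brightness of the tone-mapped result, which is why matching the average level between rips still required per-scene tweaks.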

And then there's madVR's upscaling settings for chroma and luma:

As suggested in a comment previously (by Jarrett E Hather), I went with "NGU Sharp" this time in the "image upscaling" tab.

Also as before, I needed to find a few scenes with less action and motion blur. Plus the image has to be in reasonably sharp focus. Unfortunately, I found that many of the 35mm dialogue scenes tended to be rather softly focused, as if shot with a wide aperture: the depth of field in many shots seemed very narrow, and one has to pick the right frame to grab the sharpest image.

I. 35mm Scenes

Starting with some 35mm scenes, I captured a crop and composited them side-by-side as I did last time with Dunkirk. As before, click on the side-by-side image to have a closer look on a 4K screen or download the image and view 1:1 to appreciate the differences.
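For anyone wanting to reproduce this, the compositing step is simple once both rips are decoded to frames at the same resolution (the 1080P frame upscaled to 2160p first). A minimal NumPy sketch, with frames as height x width x 3 arrays:

```python
import numpy as np

def side_by_side(frame_a: np.ndarray, frame_b: np.ndarray,
                 top: int, left: int, h: int, w: int) -> np.ndarray:
    """Crop the identical region from two same-sized frames and paste the
    crops next to each other for a 1:1 pixel comparison."""
    crop_a = frame_a[top:top + h, left:left + w]
    crop_b = frame_b[top:top + h, left:left + w]
    return np.hstack([crop_a, crop_b])
```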

A. Teacher (00:13:10)

B. TARS Interrogation (00:25:26)

C. Computer Monitor (01:04:25)

Hmmmm... Anyone seeing anything significantly different here between the 1080P Blu-Ray and the 4K UHD?

If anything, to my eyes I would say the madVR upscaling was perhaps a little superior! For example, TARS looked a little sharper, and the image on the computer monitor may have looked a wee bit better in the 1080P version, although admittedly the HDR grading may have affected the brighter areas (eg. maybe some clipping in the HDR-to-SDR process). The scene with the teacher obviously provided an opportunity to look for improvements in the resolution of the strands of hair... Alas, I do not notice any improvement.

II. 70mm Scenes

OK then. Maybe 35mm film just isn't capable of delivering the resolution difference between 1080P and 4K in this movie's master... What about those high-resolution 65mm/15-perf IMAX scenes then?

A. Endurance (00:45:18)

Hmmm, okay, the color tone is a bit different; a more greenish tint to the UHD. I also noticed that the UHD Blu-Ray image was a little enlarged ("zoomed in") compared to the 1080P - this applies only to the 70mm scenes and was not the case with the 35mm portions. But is there a clear resolution difference? Certainly nothing close to Dunkirk.

B. Saturn (00:55:01)

This scene is interesting. You're going to have to click on the image or download it, then display it 1:1 on a high-quality screen capable of deep blacks and dark grays to really appreciate this.

What color is space? BLACK, right? Notice something rather disconcerting with the UHD Blu-Ray version... Compared to the standard 1080P, it's surprisingly not as black! I don't think this has to do with madVR's HDR-to-SDR processing, since I have not noticed the problem elsewhere (plus madVR has no issues with the pure black letterboxing of the 35mm scenes). It looks like there's some kind of black level issue here. Even if the HDR version might be enhancing the brightness, the black level looks like a dark gray with a slight greenish tinge instead of true black. You can actually see that the black level of the UHD Blu-Ray is not as dark in the Endurance example above as well, but this image of Saturn against the blackness of space makes it more obvious.
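This is easy to quantify rather than eyeball: sample a patch that ought to be pure black (a letterbox bar, or deep space here) and check its mean luma. A quick sketch of the kind of check I mean (my own illustration; in a limited-range YCbCr encode, true black sits at code 16):

```python
import numpy as np

def black_level(luma: np.ndarray, top: int, left: int, h: int, w: int) -> float:
    """Mean 8-bit luma over a region that should be pure black. Values
    meaningfully above video black (code 16 in limited range) indicate
    an elevated black level like the one visible here."""
    return float(luma[top:top + h, left:left + w].mean())
```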

Resolution-wise, I would have thought I'd see more in the rings of Saturn - maybe more subtle bands would show up, or perhaps better demarcation between the different rings. No such luck, suggesting that there's minimal resolution difference here.

Very disappointing!

C. Farm House (01:30:36)

Again, notice a significant difference in the color of the scene in the conversion to UHD Blu-Ray for the HDR release: they gave the sky a warmer, sunset-like appearance. Unlike Dunkirk, where the color temperature did not change much between the Blu-Ray and UHD Blu-Ray, this movie was clearly processed differently.

Apart from the color, if we look to the left side of the house, there does seem to be some marginally increased detail to the roof and the stuff on the front porch just left of the door in the UHD version. A little bit more detail in the wood panels and maybe more definition to the dent in the rear bumper of the truck. 

D. Romilly (01:51:24)

Since the color tone was again dissimilar, I took some liberties to try to match the color between the 1080P and UHD versions.

You see that the UHD version of this 70mm scene seemed to be slightly "zoomed" in as noted above. There's a little more detail in the texture of the spacesuit just under the "Endurance" patch on the shoulder. Otherwise, really not much difference. I wondered how much of the apparent difference in detail could just have been a result of the scaling in the 4K/UHD version.
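For transparency, the colour matching above was done by hand with little tweaks; an automated stand-in (a crude sketch, not what I actually used) would be per-channel mean/standard-deviation transfer:

```python
import numpy as np

def match_color(src: np.ndarray, ref: np.ndarray) -> np.ndarray:
    """Shift and scale each channel of `src` so its mean and standard
    deviation match `ref`. Crude, but enough to roughly neutralize a
    global colour/contrast shift before comparing fine detail."""
    out = src.astype(np.float64)
    for c in range(out.shape[2]):
        s = out[..., c]
        r = ref[..., c].astype(np.float64)
        out[..., c] = (s - s.mean()) / (s.std() + 1e-8) * r.std() + r.mean()
    return np.clip(out, 0, 255).astype(np.uint8)
```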

E. Brand (02:42:17)

Tough to distinguish any resolution difference, especially given the significantly warmer and more saturated palette of the UHD version. If we stare again at the patterns of the spacesuit, there's a bit more detail in the weaving. Also, if we look at the metal abdomen piece, there's a bit more clarity there. Again, some of the apparent detail could just be the slight "zoom" in the UHD version.

III. Thoughts...

As noted above, I can't help but feel a bit disappointed by the UHD Blu-Ray version of Interstellar.

Sure, the color and contrast of the HDR effect on a good screen distinguishes the 4K version; whether you prefer it is a subjective matter. But there's an issue with the black level not quite up to par even compared to the standard 1080P Blu-Ray!

From a resolution perspective, I honestly don't think the 35mm portions of the movie improved in resolution over the madVR upscaled 1080P. As I've said before, I'm certainly not going to run to buy a 4K version of movies filmed in 35mm unless it's one of my favourites and this movie certainly did nothing to change my perspective on this.

I did not expect much from the 35mm, but I was clearly disappointed by the 70mm portions. Wasn't 65mm/15-perf supposed to be something like "18K"? At least a difference was evident with Dunkirk, but sadly not here. It's one thing to pixel-peep from inches away at the computer monitor and notice slight differences, but sitting a few feet away from my 75" TV screen, what little improvement there is here is simply insignificant. I had a look at the 70mm scenes from The Dark Knight Rises and thought they looked better than Interstellar's.

On the digital side, the visual effects were supposed to be 4K+ renders, and all this was done in a 4K digital intermediate. Real or Fake 4K in fact rates Interstellar as "Real 4K". Again, where's the resolution enhancement??! I certainly did not see anything special with the CGI "Saturn" or "Endurance" examples above. So while on paper it might be "real", in reality this is an example of how one can't just assume that the final output meets expectations! Sometimes all the higher resolution is doing is highlighting detail in the film grain.

All this begs the question... Are we sure this is the best quality transfer of the original Interstellar material to 4K? Was this result truly what was intended? Is there no further resolution gain to be had, and did the folks in the studio doing the post-processing make sure everything was done to the highest quality level? Even if we think the resolution is the best we can get, what about that odd elevated black level between the Blu-Ray and UHD Blu-Ray!?

For the videophiles who have a copy of the UHD Blu-Ray Interstellar, what do you think? Are you guys seeing any improvement in the resolution? Are you seeing that elevated black level?

Personally, I would pass on this UHD Blu-Ray and be happy with the standard Blu-Ray (alas, I've already bought the movie and will keep it despite my dissatisfaction). Looking at other reviews online: this one on The Digital Bits gives Interstellar an A+ "Video Grade" ("Now, it should be noted that the Blu-ray version was already very good looking in 1080p HD, but the full 2160p 4K image bests it by a wide margin."), High-Def Digest gives it 5/5 for "HD Video Quality" ("often stunning HEVC H.265 encode, surpassing the Blu-ray by light-years"), and AV Nirvana likewise gives 5/5 with claims of "Now THIS is what I like to see for a catalog UHD title." I see surprisingly little critical commentary about the questionable image quality! Gee, unless I'm comparing a different master or there are regional differences, I think it's important to remain cautious with reviews in general!

Psychology is interesting, isn't it? As much as I enjoy Nolan's movies, there's a certain hype around a director willing to shoot 70mm IMAX "film". There's a strong expectation bias for the UHD Blu-Ray to show off the greatness of those "18K"-equivalent analogue frames. With a bit of ingenuity, video reviewers can do side-by-side comparisons IMO much more easily than audio reviewers can. Yet these are the kind of reviews we see, replete with more than a little hyperbole? Hmmm, maybe just like in the audiophile world, one needs to be a little cautious with video reviewers as well; heck, always be a little cautious when there's something to sell. :-)

Ultimately, just like with "high resolution audio", remember that not all content demands a bigger "bit bucket"; whether it be digital or analogue source. It goes back to the master source... And like the audio remasters, sometimes newer isn't necessarily better; but at least in the video world they're generally increasing dynamic range rather than crushing it!


Well, it's the post-holiday, post-CES lull as we head into February. Amazing how fast time flies!

The only tech "upgrades" I got recently were a couple of nVidia graphics cards for machines in the home. It was time to switch out a couple of old AMD Radeon R9 270X's in my computers. In the living room, since my kids still play a few games with visiting friends, I switched the R9 out for an ASUS GTX1060 card and for my daily workstation, the inexpensive EVGA GTX1050 has been serving me well.

For 1080P resolution, the 1060 really works very well for general gaming, even with the FPS titles I've tried. Even though I only got the 3GB model, I suspect it'll be a while yet before any games "need" more VRAM for textures; by then, I'm sure there will also be a need to upgrade the general speed of the CPU & GPU. The GTX 1060 is probably roughly equivalent to the Xbox One X, although the GPU inside that machine has access to much more VRAM, while its CPU is weak compared to any decent multi-core desktop AMD or Intel.

As for workstation purposes, the 1050 is inexpensive and remarkably power efficient! As a half-length card, it's powered entirely off the PCI-E slot; there's no need for extra 6/8-pin connectors to the power supply. It also has a TDP less than half that of the old Radeon, which is great for a machine that's essentially on 24/7. I don't need much 3D processing since I'm not going to be playing games on it. This also translates to a single very quiet fan, which reduces the ambient noise of the computer (which over the years has gradually been getting quieter even as speed improves).

One nice thing about feature maturity is, of course, things becoming easier and hassle-free. The Radeon R9 270X came out in 2013, before the widespread availability of 4K screens, so while it supported my BenQ 27" 4K monitor just fine through DisplayPort, every few hours I saw digital errors (eg. color flickering) or sync issues with the screen going blank for a second or two, to the point that at times I needed to turn the monitor off to reset it (I checked, and it was not a cable issue). No such issues in the last few weeks with the new nVidia card. Perhaps this is similar to the hassles back in the day with USB DACs and drivers. As time goes on, it has become plug'n'play with native OS drivers for most devices.

All these changes (price, power draw, size, lower noise, reliability) are the benefits of product maturity. And for us digital / computer audiophiles, it is wonderful to partake in these benefits as computing becomes faster, storage becomes more capacious, and accessibility improves at inexorably lower cost.

Have a great week ahead everyone! Hope you're all enjoying the sights and sounds...

Here's a video of Nolan being interviewed, claiming that 35mm has "at least 6K of resolution" and that he's "very much" involved in his 4K remasters...


  1. Can you show separate chroma/luma layer in your next comparison? IMO the most difference is going to be visible on chrominance layer sharpness.

    1. Hi SUBIT,
      Sounds like a neat idea. I can certainly convert to Lab and separate the image into luma/chroma pairs.

      Maybe not for the next movie I'd like to do (the reason, I think, will be clear), but for subsequent comparisons it might be useful...

  2. Thank you for an interesting test for those of us who are continuing to evaluate the 'case' to switch to 4K.

    It does seem that madVR is doing an excellent job at generating 4K images, with sometimes better results than so-called 'direct transfers'. I suspect other, less technical reviewers are not seeking to squeeze as much as you are out of 2K sources (using simpler tools to upscale... maybe just what is built into the review display), assuming they actually try to make an objective comparison rather than rely on memory...

    1. Hi Eiffel,
      Yeah, I suppose that could be the situation. Obviously each device that upsamples to 4K - be it a TV, an upsampling Blu-Ray player, or a graphics card in an HTPC - will result in different quality.

      However, it is a bit disappointing when, in a situation like this, the studio can't seem to squeeze much more detail out of the actual source itself! I've since had a look at The Dark Knight and The Dark Knight Rises. Even though 35mm is clearly inferior and I still question the value of 4K/UHD for this frame size, the 70mm portions of those slightly older movies actually look better than what I'm seeing here with Interstellar...

      I can't imagine this limited resolution especially with the 70mm scenes is something intended!

  3. If you're comparing for resolution, shouldn't you just compare raw screencaps? What if the madVR upscaling is doing something like what Sony's DSEE (Digital Sound Enhancement Engine) or audio declippers do, using predictive algorithms to guess at what the original uncompressed signal once was? EnhanceNet is another example of how predicted detail can be added by algorithm (see:
    I don't know how madVR works, so maybe I'm assuming too much and misunderstanding how you're doing the comparison, or why you need it for HDR.

    1. Hi Stalepie,
      Thanks for the link. Wow - that's some image processing! The reason I use madVR is somewhat selfish but practical, and somewhat "hypothetical"...

      On the selfish but practical side, it's because I have an nVidia GTX 1080 GPU that I use for HTPC viewing on my 4K screen. With this set-up, madVR visibly gives me the best upscaling image quality, bar none, currently. Since many enthusiasts use madVR routinely, I might as well demonstrate what the image from a 1080P Blu-Ray could look like with a high quality algorithm rather than something known to be of lower potential. From what I've seen, madVR is very good, but I'm not seeing the kind of qualitative enhancement shown in that linked article!

      As for the "hypothetical" side, my rationale is that if a UHD Blu-Ray is nothing more than an upscaled 2K --> 4K image, then the studio would presumably have used the highest quality upscaler at their disposal. So again, I might as well compare the image with what a home theater consumer can achieve and see if what's on the disk ultimately looks like an improvement.

      As shown in the Dunkirk example, a truly 4K image source will still look significantly better than any upscaling. Things like the strands of hair and grains of sand would no doubt appear superior. I've already done other comparisons of different "types" of movies which I might write up for the blog here in the weeks ahead... What I can say is that *true* 4K material consistently looks better than an excellent upscaler like madVR, as expected (the fun is in showing it :-).

  4. On an unrelated point, have you ever measured (and if not, would you) the DAC performance of the processors inside a smart TV.

    I have, for example, a recent 58" Panny that has the "Studio Master HCX" processor and I stream ISO rips of music DVD's to it, taking its optical out into my DAC. Sounds good to me.

    If you read reviews of a Panny with this processor in it, there are never any measurements of the DAC performance; it's mainly the video performance.

    So given you are an inquisitive person, this might be an interesting project just to see how good these smart TV's are, sound wise.


    1. Hey Peter,
      Do you send the audio output from the TV to a hi-fi system to listen to music? I would imagine that no matter the DAC quality, unless one actually pairs the output with a capable sound system, it would be rather moot...

      Practically, for me, I have a 55" 1080P LG TV and a 75" 4K Vizio panel. Even if the DAC inside were amazing, I wouldn't connect them up to a decent receiver for audio when I can already hook the HDMI up to the receiver and use that as my HDMI switch... This is why I've measured the DACs in the receivers instead.

      It wouldn't be hard to measure, of course. But I see the analogue output from TVs as meant for a simply "utilitarian" function rather than needing to be high fidelity :-).

      Well, let's see, I'll remember this if I run into a Panny in the future!

    2. So...

      Like you, I have a Transporter (used with an external DAC) and I front-end that with Roon... so this is my digital music chain.

      My Music DVDs are ripped as ISOs and I use Mezzemo as the DLNA server (because it understands ISOs without needing to transcode). The Panny sees Mezzemo and I browse my DVD collection via the Panny.

      I take the optical output from the Panny into my external DAC and into the rest of my system (so obviously it's all two-channel).

      So for purely selfish reasons, understanding the technical quality of the Panny's DAC output would be interesting, and I think it would be a first if someone measured the optical-out performance of a smart TV (Panny or otherwise).



  5. Very nice! I'm surprised that even the madVR-upscaled Blu-Ray is objectively better looking: finer detail reproduced, resulting in a more solid, in-focus image, and lines reconstructed better (and as they should be). I at least expected the native UHD image to be sharper. The only thing going for the UHD is less grain, if that's important to you. There's an option in the dithering panel, something about colored noise; if you disable that, the grain should be subjectively less visible.

    1. Thanks Jarrett,
      Yeah, madVR does an amazing job here. Hence I'm a bit concerned about the actual resolution of whatever master they used for this movie!

  6. Great analysis as always, Archimago! I have a suggestion though: in order to level the playing field, apply the same treatment to both the HD and the UHD. I.e., upscale them both to 8K. That may be indicative of how both formats will look in the near future when 8K comes around. This way it could be possible to eke some more detail out of the 4K. It may be difficult to convert your files to 8K directly, but you can always cut out a quarter of the image (i.e., 960x540 from HD and 1920x1080 from UHD) and blow them both up to 4K. That would be equivalent to upscaling them 4x and 2x, respectively, and would be a fairer comparison of the information content of both formats.

    1. Interesting idea Yuri,
      Yeah, further upsampling can highlight the differences even more...

      I'll see about doing this in the future. The problem is that I'd love to use the madVR algorithm to "simulate" what good quality could look like; unless I have an 8K screen, I won't really be able to do this. I could use Photoshop and bicubic upsampling, but I'm not sure how representative that would be of the "real life" situation one day if/when 8K screens are plentiful.

      As an aside, I actually don't think we'll see large-scale adoption of 8K any time soon. Maybe for computer monitors, but I think it's really hard to justify even 4K for TV screens <65" unless one sits rather uncomfortably close!

      Rest assured, I'm happy to test out 8K at some point assuming I still care about video resolution when 8K becomes a practical reality :-).

    2. I hope you won't have to wait too long for 8K :) But the point of my message was that you don't need to wait: you can do all the simulations on your 4K screen, just using the central part of the original image (1/4 of its area) and blowing it up 4x and 2x to fill up the 4K screen.

      I do believe that almost all 4K BDs are fake, as somewhere in the process there was an element of lower resolution (special effects, less-than-4K capture, etc.), and even a 4K DI is not a guarantee that the result is true 4K. And I think you are finding that too (except Dunkirk :).

      However, one advantage that even a fake 4K BD can have over the regular BD, even if that 4K BD originated from a 2K DI, would be a 4:4:4 high-bitrate JPEG2000 encode as its source, which should theoretically look better than an 8-bit, 4:2:0-only AVC encode at 25-30 Mbps (at best). Properly processed chroma should deliver visible benefits even during a simple upscaling. Thus, upscaling to a higher resolution could show us whether the 4K BD source really contains more information than its 2K BD counterpart.

    3. Right Yuri, I see what you mean with the 4K --> 8K processing for comparison using the blow-up.

      And that's a very good point about possible chroma subsampling differences at the source, even though ultimately all UHD Blu-Rays are 4:2:0. You'll see in the latest post on Pacific Rim that I've included a scene showing the chroma component for comparison...

  7. The "4:4:4 high-bitrate JPEG2000 encode" part relates to the 2K DI source, obviously :)
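As a follow-up to Yuri's crop-and-blow-up suggestion in comment 6, here's a sketch of the procedure (my own illustration; nearest-neighbour on purpose as the crudest baseline - madVR or even bicubic would look better): take the central quarter of each frame and enlarge it so a 1080P crop (4x) and a 2160p crop (2x) land at identical pixel dimensions.

```python
import numpy as np

def center_crop_blowup(frame: np.ndarray, factor: int) -> np.ndarray:
    """Take the central quarter (half the width, half the height) of a
    frame and blow it up by `factor` using nearest-neighbour repetition.
    factor=4 on a 1080P frame and factor=2 on a 2160p frame both yield
    3840x2160, making the information content directly comparable."""
    h, w = frame.shape[:2]
    crop = frame[h // 4: 3 * h // 4, w // 4: 3 * w // 4]
    return crop.repeat(factor, axis=0).repeat(factor, axis=1)
```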