Thursday 26 March 2015

MUSINGS: Gone 4K / UHD - A "Look" At Ultra-High-Definition...

This week, I thought I'd take a break from just the audio stuff and discuss a new "toy" I got 5 weeks ago. That's right, as the title suggests, a 4K / UHD screen; it's a computer monitor to be exact:

A view from behind the commander's chair :-). BenQ BL3201PH on the table.
Remember that there's a separate "body" defining 4K for movies at the local movieplex - the Digital Cinema Initiatives (DCI) group. Those specs have a "true" ~4000-pixel horizontal resolution, like 4096x1716 (2.39:1 "scope" aspect) or the very close 3996x2160 (1.85:1). For smaller screens like computer monitors and TVs, we have the UHD "Ultra High Definition" standard defined as 3840x2160 (16:9 / 1.78:1). So although it's not exactly "4K" horizontally, it's close, and I guess "4K" makes a better advertising catch-phrase than "2160P". Needless to say, 3840x2160 is 2x the linear resolution of 1080P, or 4x the actual number of pixels.
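If you'd like to see the arithmetic spelled out, here's a quick Python sanity check of those pixel counts - nothing official, just the resolutions listed above:

```python
# Quick sanity check of the resolutions mentioned above (Python 3).
formats = {
    "1080P (Full HD)":       (1920, 1080),
    "UHD '4K'":              (3840, 2160),
    "DCI 4K 2.39:1 'scope'": (4096, 1716),
    "DCI 4K 1.85:1 'flat'":  (3996, 2160),
}

ref_pixels = 1920 * 1080  # 1080P as the baseline
for name, (w, h) in formats.items():
    pixels = w * h
    print(f"{name:22s} {w}x{h}  {pixels/1e6:4.1f} MP  "
          f"aspect {w/h:.2f}:1  vs 1080P: {pixels/ref_pixels:.1f}x")
```

Running it shows UHD at ~8.3 megapixels, exactly 4x the ~2.1 megapixels of 1080P, with the DCI variants landing in the same ballpark.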


Please allow me to reminisce a little on "ancient" technology history... Back in 1989, during my university undergrad, I worked for a summer doing computer science research and saw for the first time a Sun SPARCstation 1 "pizza box" with a 20MHz processor, 16MB RAM, and a 256-color "megapixel" (1152x900) display. I was blown away! This was a "dream machine" compared to my 7MHz Motorola 68000, 512KB Commodore Amiga 1000 with 32 colors (the 4096-color HAM mode was cool but limited in application, and this was before the 64-color EHB mode) and a maximum resolution of 640x400 interlaced (which could be pushed a bit further with overscan). Back in those days, even a relatively expensive Macintosh was only capable of 640x480 with 8-bit (256) color.

The closest thing to "true color" I saw in the 80's was an old Motorola VME 68020 machine I worked on, developing a rudimentary GUI for image-recognition software driving an ancient 16-bit color Matrox frame buffer card. Although limited to 640x480 interlaced, it was impressive to see an actual digital picture on a computer screen that looked like something out of a video!

[Even back then, although the sound quality was nothing to write home about, the first PC Sound Blaster card was introduced in 1989. By then we had been living with CD audio for a number of years, and even this first-generation card was already capable of 8-bit/22kHz mono. It was just a matter of time before 16/44 stereo sampling was an option given enough memory and storage space; the Sound Blaster 16 with 16/44 stereo came just a few years later in 1992. Clearly, technology for imaging / video has always lagged behind audio in capability and relative fidelity due to complexity and storage requirements (this of course also speaks to the neurological sophistication of the visual architecture compared to audio in our brains).]

At some point in the early 1990's I saw a machine at the university with a TI "true color" 24-bit graphics card (remember the TARGA graphics file format, anyone?). By 1994, I bought myself a Picasso II graphics card for the Amiga capable of 800x600, 24-bit color (sweet!). By 1997, my first PC graphics card capable of >1024x768, 24-bit color was the Matrox Mystique. From then on, each generation of graphics card became more about 3D performance rather than 2D speed or resolution... My computer display also got upgraded through the years: from NEC MultiSync CRTs, to a 1280x1024 LCD, to Dell's UltraSharp 24" series (1920x1200), and last year the excellent 27" BenQ BL2710PT (2560x1440).

But one goal remained elusive on my desktop: a large-screen monitor (in the 30" range) with spatial "high fidelity" - an image smooth and detailed enough that my eyes/mind could no longer distinguish the individual digital pixels; in essence, something close to the limit of our visual spatial apparatus in 2D (perhaps like how CD is close to our auditory limits within the stereo domain). Although in the visual sphere there's still room for improvement in terms of color accuracy, contrast (dynamics), and black levels, it finally looks like we're "there" with pixel resolution (and at minimum a flicker-free 60Hz refresh rate with decent response time).

This goal of pixel resolution meeting biological limits is an obvious one, and technology companies have been building up towards it for years. Apple's "marketing speak" captured it nicely; they called it the "Retina Display" - a screen with pixels packed tightly enough that individual pixels would not be visible to the user. The first product they released to the public with this designation was the iPhone 4, with a screen resolution of 960x640 (3.5", 326ppi), in June 2010 (of course other phone companies also use high-resolution screens and have since surpassed Apple's, though I must credit Apple with their superb marketing prowess!). Steve Jobs back then made a presentation pegging the resolution of the human eye at around 300 dpi for cellphone use:

Realize that this number is only relevant in relation to the distance from which the screen is viewed. When we test eyesight, the "target" of 20/20 vision is the ability to discriminate two visual contours separated by 1 "arc minute" of angular resolution (1/60th of 1 degree). As I mentioned in the post a couple of weeks ago, just as with hearing acuity, there will be phenotypic variation in the population, and some folks will achieve better than 20/20 vision just as some people will have better hearing than others ("golden ears"). For those interested in the physics and calculated limits of vision, check out this page.

Coming back to technology then... As per Steve Jobs, when we use a cell phone, we generally view it at a closer distance than, say, a laptop or desktop monitor. Normally we'll view a smallish phone screen (say <6" diagonal) at about 10-12 inches. In that context, the 300 pixels-per-inch specification is about right... Just like in audio where we can argue about "Is CD Resolution Enough?", the visual resolution folks also argue over whether more is needed - witness the passion of the Cult Of Mac and their plea for "True Retina" (something like 900 ppi for the iPhone 4, and 9K for a 27" computer screen)!

Until the day we can see for ourselves whether 9K is needed (the UHD spec already goes up to 8K for those truly on the bleeding edge of technology), check out this helpful web site for calculating at what viewing distance a screen becomes "retina" grade:

http://isthisretina.com/


Enter the horizontal and vertical resolution, then the screen size, and press "CALCULATE". It'll tell you the PPI, the aspect ratio, and - most importantly for this discussion - at what distance the angular resolution of the pixels reaches the 20/20 threshold. Using this calculator, my BenQ BL3201PH, a 32" 4K/UHD (3840x2160, ~137 ppi) monitor, reaches "retina" resolution at a viewing distance of 25".
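For those who'd rather see what a calculator like that is doing under the hood, here's a minimal Python sketch of the 1-arc-minute (20/20) criterion described above. The helper names are just mine, and the screen numbers are the examples from this post:

```python
import math

ARC_MINUTE = math.radians(1 / 60)  # 20/20 vision ~ 1 arc minute of angular resolution

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch given the resolution and the diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_inches

def retina_distance_inches(pixels_per_inch):
    """Viewing distance at which a single pixel subtends 1 arc minute."""
    pixel_pitch = 1 / pixels_per_inch          # inches per pixel
    return pixel_pitch / math.tan(ARC_MINUTE)  # works out to ~3438 / ppi

benq = ppi(3840, 2160, 32)       # BenQ BL3201PH: ~138 ppi
iphone4 = ppi(960, 640, 3.5)     # iPhone 4: ~330 ppi (Apple quotes 326)
print(f"BenQ BL3201PH: {benq:.0f} ppi -> 'retina' beyond ~{retina_distance_inches(benq):.0f} inches")
print(f"iPhone 4:      {iphone4:.0f} ppi -> 'retina' beyond ~{retina_distance_inches(iphone4):.1f} inches")
print(f"A 300 ppi screen: 'retina' beyond ~{retina_distance_inches(300):.1f} inches")
```

The ~25" result for the 32" UHD screen and the ~10-11" result for a ~300 ppi phone both fall straight out of the same arc-minute threshold, which is reassuring.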


Considering that I generally sit >25" away from the monitor, it looks like I've achieved that "magic" resolution I've been hoping for all these years :-). With a 32" monitor, you actually wouldn't want to sit too close anyway, otherwise you'd be moving your head around too much to scan the screen. Subjectively, the image looks gorgeous, and it really is wonderful not being able to notice individual pixels or easily make out aliasing imperfections in text. I think I can live with this for a few years!

There's something special about achieving high fidelity (whether audio or visual). For a machine to match (and these days surpass) biological sensory limitations is a milestone. And to do it at price points within reach of most consumers is further evidence of technological maturation. In just a few years, we've witnessed "retina"-class resolution spread from handheld devices, to laptops, and now to the desktop monitor...

In the Archimago household, there remains one large screen screaming for this kind of resolution: the TV in the sound & home theater room. If I plug the numbers into the website, it looks like I'll need an 80" 4K TV :-). Well, I'll be keeping an eye on those prices then! Although I'm willing to jump into the 4K computer monitor waters at this time, I think I'll wait when it comes to the TV. HDMI 2.0, DisplayPort 1.3, and HDCP 2.2 all need to be hashed out and widely supported before I jump in with a big purchase. Also, OLED 4K could be spectacular... Maybe next year?

------

I want to say a few words about the usability of 4K monitors. I was actually a little apprehensive at first about buying one because some reviewers complained that text was too small and that it was difficult to use with Windows 8.1. I suspect this would be the case with smaller 4K screens like the 27" models (Huh!? What's with that 5K iMac at 27"?). At 32", I can actually use it even at 100% (1:1) scaling, although 125% scaling made things easier on the eyes. Note that many/most Windows programs are still not "scaling aware", which is why having the screen usable at 100% from a standard viewing distance is beneficial at this time.

Use the "scaling"!
Firefox runs great with 125% scaling and you can go into Chrome's options to set the default scaling to 125% as well. Internet Explorer looks excellent out-of-the-box.

For digital photography, 32" 4K was made for Lightroom / Photoshop! The ability to see your photos on a large screen with a full 8 megapixels is stunning. The bad news is that my quad-core Intel i7 CPU feels slower processing all those megapixels from a RAW file - though not quite slow enough to make me feel I need a CPU upgrade just yet.

There are some 90+ Mbps AVC 4K demo videos floating around providing a tantalizing taste of what 4K Blu-ray could look like in the home theater. Panasonic showed off their 4K "ULTRA HD Blu-ray" at CES 2015 recently, and I suspect that will be the best image quality we're going to get for a while, simply because of the large capacity Blu-ray discs have to offer. It looks like the new H.265/HEVC encoding standard will be used for these future releases, providing better compression efficiency - supposedly similar image quality at around 50% of the data rate of H.264/AVC. This could end up being the last copy of The Shawshank Redemption I ever buy... Hopefully :-). [Even here, we can get into a debate about analogue vs. digital... Arguably, unless the movie was filmed in 70mm, 4K should be more than adequate to capture the full image quality of any 35mm production.]
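As a rough back-of-the-envelope sketch - taking the ~90 Mbps demo bitrate above and the claimed ~50% HEVC saving at face value, and assuming a 2-hour movie (these are assumptions, not UHD Blu-ray specs):

```python
# Rough storage estimate for a hypothetical 2-hour movie at the bitrates above.
movie_seconds = 2 * 60 * 60
avc_mbps = 90                  # the ~90 Mbps AVC demo clips mentioned above
hevc_mbps = avc_mbps * 0.5     # assuming the claimed ~50% saving at similar quality

for label, mbps in [("H.264/AVC @ 90 Mbps", avc_mbps),
                    ("H.265/HEVC @ ~45 Mbps", hevc_mbps)]:
    gigabytes = mbps * 1e6 * movie_seconds / 8 / 1e9   # decimal GB, video stream only
    print(f"{label}: ~{gigabytes:.0f} GB for 2 hours (before audio/overhead)")
```

At roughly 80 GB for the AVC video stream alone, you can see why both HEVC and higher-capacity discs are part of the UHD Blu-ray story.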

For the time being, 4K YouTube streaming does look better than 1080P, but it's clear that Internet bitrates impose significant compression penalties (noticeable macroblock distortions in busy scenes). Netflix has some material but currently will not stream 4K to a computer (only to 4K TVs so far - probably due to copy protection). I have watched 4K shows like House of Cards and Breaking Bad off Netflix, but as with 4K YouTube, the quality isn't really that impressive at this point.

Finally, remember the hardware needed to run a 4K/UHD monitor. I decided to get the screen at this point because we now have reasonably-priced (~$1000) 2nd-generation screens running at 60Hz with IPS-type technology. The BenQ uses DisplayPort to achieve the 60Hz refresh rate and is SST (Single Stream Transport) rather than MST (Multi-Stream Transport), which splits the screen into two 2K "tiles". SST should be hassle-free; I have heard of folks experiencing driver issues with the tiled screens not being handled properly (imagine only half the screen displaying if the software fails to set up the tiling). Note that for a bit more money, the Samsung U32D970Q has received some excellent reviews for image quality and color accuracy.

I'm currently using an AMD/ATI Radeon R9 270X graphics card I got last year. It's not expensive and has been trouble-free for 60Hz SST operation. Just remember to buy some quality DisplayPort 1.2 cables (the BenQ has both full-sized and mini DisplayPort inputs). This is an example of a very high-speed digital interface: about 12Gbps of data needs to be transferred to achieve 3840x2160 at 24 bits/pixel and 60Hz. The 6' DP-to-miniDP cable that comes with the monitor does the job fine, but so far I have had no luck with 10' generic cables to give some extra flexibility to my setup (anyone know of a reliable 10' 4K/60Hz cable, maybe with 26AWG conductors?). Even at data rates 25x that of high-speed USB 2.0 (and over 2x USB 3.0 speed), there's no need to spend >$20 for a good 6' cable.
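The ~12 Gbps figure is easy to verify with a bit of arithmetic (raw pixel payload only - DisplayPort's blanking intervals and 8b/10b line coding push the actual on-the-wire rate higher):

```python
# Raw (uncompressed) pixel bandwidth for UHD at 60 Hz, 24 bits per pixel.
width, height = 3840, 2160
bits_per_pixel = 24
refresh_hz = 60

gbps = width * height * bits_per_pixel * refresh_hz / 1e9
print(f"Raw pixel data:        {gbps:.2f} Gbps")          # ~11.9 Gbps
print(f"vs USB 2.0 (480 Mbps): {gbps / 0.48:.0f}x")       # ~25x
print(f"vs USB 3.0 (5 Gbps):   {gbps / 5:.1f}x")          # ~2.4x
```

Just shy of 12 Gbps of pixel data alone - no wonder cable quality and length start to matter at these speeds.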

Modern high-performance gaming at 4K would really demand a more powerful graphics processor, so I haven't tried it on this machine. I suspect less demanding titles would run just fine.

------

As noted earlier, remember that pixel resolution is only one factor in overall image quality. The ability to display good contrast (like dynamic range in music) and accurate color are also very important. Clearly it's in these other areas that computer and TV displays can still improve. Note also that UHD defines an enlarged color space (ITU-R BT.2020 vs. the previous Rec. 709 for standard HDTV - see here), so the improvement in this regard is another tangible benefit.
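To put a rough number on "enlarged", here's a small sketch comparing the areas of the Rec. 709 and BT.2020 primary triangles on the CIE 1931 xy chromaticity diagram. The primaries are the published values, but keep in mind a simple xy-area ratio is only one way of expressing gamut coverage:

```python
# Compare the Rec.709 and BT.2020 gamut triangles on the CIE 1931 xy diagram.
def triangle_area(points):
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

rec709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # R, G, B primaries
bt2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

a709, a2020 = triangle_area(rec709), triangle_area(bt2020)
print(f"Rec.709 triangle area: {a709:.3f}")
print(f"BT.2020 triangle area: {a2020:.3f}")
print(f"BT.2020 / Rec.709:     {a2020 / a709:.1f}x")       # ~1.9x larger in xy
```

Of course, whether current panels (including this one) can actually reproduce much beyond Rec. 709 is another matter entirely.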

I hope you enjoyed this foray outside the usual audio technical discussions... Enjoy the music and whatever visual set-up you're running!

PS: Happy Dynamic Range Day (March 27, 2015)! Great to see a recent purchase, Mark Knopfler's Tracker, mastered at a decent DR11... Keep 'em coming - "rescuing the art form" is about preserving qualities like full dynamic range and releasing music meant for listening on systems superior to boomboxes and earbuds!

8 comments:

  1. Could you tell me what the desktop monitor speakers are please?

    Replies
    1. Hi Stephen,
      These are AudioEngine A2's. They were ~$200 when I got them many years ago...

      The A2+ with built-in DAC is available these days.

      For the size, very nice unit with decent bass response.

  2. I've been following this "4K phenomenon" since the beginning. It looks like its application will only make sense in the PC world, where you'll gain more free space to work on your desktop, and for games, where more resolution = larger field of view.

    4K for a good movie? Well, according to the "isthisretina" calculator, a Full HD 65 inch screen is retina if you sit at >2.5 meters (which is kind of "standard", unless someone enjoys some pain in the neck :)

    Honestly, all this 4K, 8K chat (for TV use; PC is another story, like I said) sounds to me like DSD512 or PCM 384/32... pure marketing stuff.

    Anyway, cool rig!

    Best regards!

    Replies
    1. Absolutely, that's why I'm willing to bite with a 4K screen on the computer desktop for the time being. I'll wait and see how it goes with 4K Blu-ray by the end of the year.

      I would not even be considering a 4K TV in my AV room unless the screen is 80"+. One area where 4K could definitely be beneficial is passive 1080P 3D (I still watch a few 3D movies, maybe a handful a year). The larger color gamut is potentially nice as well.

      Today, I was in Costco and they had a 55" 4K on display showing some UHD material. People clearly took notice. When I see seniors stop and have a good look, this tells me that it will catch on... Maybe not everyone will rush out to convert, but it's just a matter of time and price.

      I see this as much different than hi-res audio. The common consumer would not take notice when playing 24/96 or anything above (much less DSD512 or 384kHz!) compared to the same mastering in 16/44 IMO even with excellent player/amp/speakers. Technologies like 3D TV and surround sound made people notice but I think the inconvenience and price only appealed to the enthusiast.

      Just have a look at this video review of the Pono:
      https://youtu.be/6VQUFCCcQ4A

      I don't think anyone here would be surprised by these kinds of reviews. As I suggested many moons ago, the kind of hype being spewed by the high-res-audio promoters likely will backfire since it was always unrealistic. You can see the audiophile press trying their best to keep the *hope* alive... That somehow, people will flock to 24-bits and high samplerates, or to DSD with higher prices.

      In comparison, 4K is a real upgrade for anyone with decent vision - even if we have to sit a little closer :-).

    2. Exactly, 4K only makes sense on really large screens, and since I'm not a millionaire, I'll stick with 1080p ;)

      You know, I think you've touched on an interesting point when you said: "...a 55" 4K on display showing some UHD material...". Sure, to show the potential of 4K, you need... 4K!!!

      The audio industry seems to not understand this point! They are trying to sell HD audio gear using 50's, 60's and 70's recordings. It's like a 4K screen displaying the E.T. movie!! Not gonna work!!

      I don't know, I'm just a music lover and not an ultra-enthusiast of audio gear. I won't spend hundreds of $$$ on something that will give me only a little "air improvement". I think a lot of research still needs to be done on the way we record music and reproduce it. I think multichannel audio is a huge improvement (when nicely done, of course, like Channel Classics SACDs).

      Of course, this has some implications... like 4K, it costs a lot of money... a LOT more than a 4K system. But I think this is the way to go.

      Anyway, I'm just a regular guy with my 2 cents ;)

      Keep up the good work!

  3. Thanks for the article, and for reminding me of those days back in ancient history before 24 bit colour.

    I can remember, circa 1987, being obsessed with the idea of being able to store and display digital images with sufficient quality to pass as 'photographs'. Somewhere at home I still have the contraption I built: a framestore. It took in ordinary composite video and split it into R, G and B (using a surplus TV decoder board I think). I then made three boards, each of which contained a 6 bit 'flash' ADC, some static RAM chips and, for some reason, I ended up making my own 6-bit R-2R DACs. A separate board contained counters connected to the RAM address lines, and could be triggered and reset by the line and frame pulses from the decoder board. An unusual thing is that the oscillator to clock the horizontal position counter was analogue, allowing me to set the scan width with a pot. The frame and line syncs and three DAC outputs (buffered to 75R output impedance) were fed into a standard BBC computer monitor. The framestore resolution was 256x256 but the monitor was interlaced, so the same image was played out for both odd and even fields.

    In use, I got a 'live' digitised image on the monitor and could press a push button to freeze or unfreeze the image. I could read out the contents of the RAM into a BBC microcomputer and store the images on floppy disk. Later I could load the same images into the framestore to display them. I could also synthesise images in software. In other words, the objective was to have a true colour display - something you just didn't get with the PCs of the time - just as much as to be able to grab live video. I mainly grabbed images from TV, but took some snaps using a video camera, so I guess I was a very early adopter of the digital camera! Obviously there were VCRs and camcorders at the time, but these were analogue. Up-market VCRs had digital freeze frames, I think, but they didn't have provision for saving the images or re-loading them.

    Simply by waiting a few years, all of this and much more would have become available in ordinary PCs. On the other hand, a good proportion of my job now involves image acquisition and processing, so it could be seen as having been very educational.

    You have prompted me to write an article about it for my blog. I still have some of the grabbed images somewhere...

    Replies
    1. Wow! Impressive, man... That's great work for the 80's. Seeing the evolution of computing technology, from the days of coding 6502 assembly (saving to cassette tape - thank goodness I missed the punch card days) to the complexity of today's computing, has been marvelous.

      My first home image capture device was the NewTek DigiView on my Amiga:
      http://amiga.resource.cx/exp/digiviewgold

      Bought on a student budget with money from my summer job in 1988. I got an inexpensive B&W CCTV video camera and used the cheap cellophane RGB color wheel to capture pictures into the computer for editing. Each color image required 3 passes, so all I could capture were static images. Brings back memories looking at those ads in the link above!

      Looking forward to your blog post.

      https://therationalaudiophile.wordpress.com/

  4. Thanks for this post. There is a corollary to this discussion with regard to watching movies on big screens, i.e. big TVs and home theatre. IIRC the correct viewing distance is dictated by horizontal angle - I think not less than 35 degrees or more than 45 degrees. Too little angle and you don't feel 'immersed'; too much angle and you have to 'look around' the screen and can't take it all in as a whole.

    Then working backwards from that, you can work out the maximum resolution needed -- anything more and you can't discern pixels so you get no benefit from more resolution. The clever reader will have noticed that this means that the maximum usable resolution is the same no matter how big the screen or the room.

    And it's about 3000 pixels wide. Pretty much half way between 1080p and 4k. No matter what size your HT or screen is.

    If you want to get a feel for 45 degrees, sit the same distance from your TV or screen as its diagonal size. When you sit this close, 4k is worthwhile. If you sit at double that distance, 1080p is more resolution than your eye can see.
