Apple Special Event predictions from Kelly Thompson, VP of Product at 500px, on what’s next in color display technology.
For decades, Apple has been at the forefront of color and computing. Its ColorSync color management system, now almost 25 years old, gave the company an early leg up with designers — a trend that continues today.
While Apple’s displays have been consistently decent (with a few exceptions), the company remains characteristically quiet about the technologies it’s actually using for its screens (IGZO? Quantum Dots?) — and even quieter about what’s coming next (MicroLEDs?). What we do know is that these new screens can produce much better images than ever before.
This combination of system-level software married with rapidly improving display technology has brought us tantalizingly close to images being displayed in ways we couldn’t have imagined just a few years ago. We’ve long had the equipment to capture deep color images, file formats to store them (DNG/RAW/TIFF), and now displays to show them. So why ‘tantalizingly close’ and not all the way there? We still need an effective way to deliver these files, exceptionally well compressed and web-ready. Video is progressing quickly, with continually improving compression — there is no reason still imagery can’t reap the same benefits as moving pictures.
This article looks at where we’ve been, where we’re going, and what it’s going to take to get us across the finish line for the best looking images possible.
Screen resolution is one area where Apple has been an industry trendsetter. Jaws dropped when they introduced the Retina screen on the iPhone. As a photographer, seeing your work on a high DPI display for the first time is breathtaking: the space between pixels evaporates and images seem to glow with a deeper level of realness.
Apple’s updates to both iOS (in 2010) and OS X (originally hidden in OS X Lion in 2011) kept images and text the same size and used twice as many pixels in each direction to define them. Interface elements stayed the same size while looking better. This marked a change in thinking – everyone else had simply been adding more pixels and making everything smaller. Other ecosystems have struggled with very high resolutions, and many apps still aren’t HiDPI-aware (we’re looking at you, Windows 10).
On the Android side, Google had the foresight to allow Android to work on any screen resolution, which really drove adoption of the technology on phones outside the Apple ecosystem. It also meant the high resolution race was taken to the extreme with extraordinarily high resolution screens. Although they look great, there is only so much resolution the human eye can discern. At a certain point, there are diminishing returns, and it becomes a waste of battery and GPU power.1 If you’re 20 feet away from your new 50″ 4K TV, you won’t be able to perceive a difference over 1080p. Apple has largely ignored this race for even higher density screens, often to the chagrin of its users.
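As a rough sketch of why viewing distance matters: a common approximation of visual acuity is that the eye resolves about one arcminute per pixel, so the pixel density a display "needs" falls off with distance. The numbers below (the one-arcminute threshold, a 12-inch phone-holding distance) are standard rules of thumb, not figures from this article:

```python
import math

def retina_ppi(distance_inches, arcmin_per_pixel=1.0):
    """PPI at which each pixel subtends `arcmin_per_pixel` at the given distance."""
    radians = math.radians(arcmin_per_pixel / 60.0)
    return 1.0 / (distance_inches * math.tan(radians))

# A phone held ~12 inches from the eye:
print(round(retina_ppi(12)))   # 286 -- extra density beyond this is hard to see

# A TV viewed from 20 feet (240 inches):
print(round(retina_ppi(240)))  # 14 -- a 50-inch 4K panel is ~88 ppi, far past the limit
```

By this estimate, a 50″ 4K set at 20 feet is more than five times denser than the eye can resolve, which is why it looks no sharper than 1080p from the couch.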
HDR: Deep and Wide Color
If you’ve wandered into a showroom to look at a new 4K TV in the last few months, you’ll have noticed that the high-end displays support High Dynamic Range (HDR) content (lower-end 4K sets often don’t, but sets labeled “Ultra HD Premium” always do). Note that we’re not referring to the hyperrealistic, tone-mapped HDR images composited from bracketed shots.
So, what’s so great about HDR on a 4K display? If you’ve ever walked into a pitch black room and seen your TV or monitor turned on, showing a pure black screen, you’ll understand — the screen is nowhere near pure black (and in the worst case, a medium gray). That’s beginning to change. Advances in display technology have allowed us to get much closer to perfect black and brilliant whites at each end of the spectrum.
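As a back-of-the-envelope illustration, dynamic range is often expressed in photographic stops, where each stop doubles the luminance. The luminance figures below are hypothetical examples, not measurements of any particular panel:

```python
import math

def dynamic_range_stops(peak_nits, black_nits):
    """Dynamic range in photographic stops: each stop is a doubling of luminance."""
    return math.log2(peak_nits / black_nits)

# Hypothetical SDR panel: 300-nit whites, 0.3-nit blacks (1000:1 contrast)
print(round(dynamic_range_stops(300, 0.3), 1))    # 10.0 stops

# Hypothetical HDR panel: 1000-nit highlights, 0.05-nit blacks
print(round(dynamic_range_stops(1000, 0.05), 1))  # 14.3 stops
```

Deeper blacks buy as many stops as brighter highlights do, which is why the near-black performance of a panel matters so much for HDR.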
Ultra HD Premium TVs must also support ‘deep’ color. What does this mean? Almost every image or video you’ve looked at on a computer monitor uses 24 bits – 8 bits for each of the red, green, and blue channels – to represent a color, resulting in a total of 16.7 million possible shades (referred to as “True Color” on the PC and “millions of colors” on the Mac2). That’s more colors than the human eye can perceive, but there’s a catch: our eyes are designed to see edges exceptionally well. Because each color channel has only 8 bits, it can represent just 256 different shades — and we can easily perceive the difference between adjacent shades. Photographers have long struggled with this limitation as they try to render the best-looking images from their 10- or 14-bit RAW files. With 8 bits of color, images with smooth gradients can show visible lines, or “banding.”
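The arithmetic behind those numbers is straightforward: moving from 8 to 10 bits per channel quadruples the shades available in each channel, which is what smooths out banding:

```python
# Shades per channel and total colors for 8-bit vs 10-bit color
for bits in (8, 10):
    shades = 2 ** bits   # levels per channel
    total = shades ** 3  # R x G x B combinations
    print(f"{bits}-bit: {shades} shades/channel, {total:,} total colors")
# 8-bit:  256 shades/channel,  16,777,216 total colors   ("millions")
# 10-bit: 1024 shades/channel, 1,073,741,824 total colors (over a billion)
```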
These new high-end displays also support a much broader color space than the sRGB (Rec. 709 in HD video) space most of us are used to working in. If you’re new to color spaces (and the colorful “triangles” that accompany them), they’re easy to picture. Start with every color that exists. Because our display technology can’t yet reproduce all of them, a color profile specifies which subset of those colors it defines. The new, wider profile featured on high-end displays and projectors is called DCI-P3 (or just P3).
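One crude way to compare two gamuts is the area of their triangles on the CIE 1931 xy chromaticity diagram. The primary coordinates below are the published values for each space, but xy area is not perceptually uniform, so treat the ratio as a rough sketch rather than a precise measure of “how much more color” you see:

```python
def triangle_area(p1, p2, p3):
    """Shoelace formula for the area of a chromaticity triangle."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# CIE 1931 xy coordinates of the red, green, and blue primaries
srgb = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
p3   = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]

ratio = triangle_area(*p3) / triangle_area(*srgb)
print(f"P3 covers ~{ratio:.2f}x the xy area of sRGB")
```

The extra area sits mostly in the saturated reds and greens, which is why wide-gamut demo images lean on those hues.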
By having both a wider color space and more shades within it, we end up with smoother gradients, much richer colors, and, ultimately, better looking images.
Back to Apple
As improvements in resolution ran up against the limits of human perception, Apple seems to have shifted its focus elsewhere. This is where things get interesting.
In late 2015, Apple updated the Retina iMac to include a panel that covered the P3 color space. Photographers were excited – this was an exceptional display with a Mac thrown in for free. What Apple didn’t mention was that the Retina iMac also has a 10-bit color display. And it gets even better: OS X 10.11.1 included driver updates enabling 10-bit color on any panel that supported it – including some older iMacs and many of the higher-end external displays creatives were driving from their Mac Pros over DisplayPort/Thunderbolt. (To check your setup, choose “About This Mac” from the Apple menu, click “System Report”, and then select “Graphics/Displays.” ARGB8888 means 8 bits per channel; ARGB2101010 means 10 bits.)
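As an illustration of what a name like ARGB2101010 encodes, here is a sketch of the bit layout the name implies (2 alpha bits plus 10 bits per color channel in a 32-bit word); this is an illustrative packing, not Apple’s actual driver code:

```python
# Pack one pixel in an ARGB2101010-style layout:
# 2 alpha bits, then 10 bits each for red, green, and blue (32 bits total).
def pack_argb2101010(a, r, g, b):
    assert 0 <= a < 4 and all(0 <= c < 1024 for c in (r, g, b))
    return (a << 30) | (r << 20) | (g << 10) | b

pixel = pack_argb2101010(3, 1023, 512, 0)  # opaque, full red, half green
print(f"{pixel:#010x}")  # 0xfff80000
```

Note the trade-off: the same 32 bits that hold an 8-bit-per-channel pixel plus a full alpha channel instead hold 10-bit color channels with only 4 alpha levels.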
Shortly after Apple started supporting 10-bit color, Adobe jumped on board with an updated Photoshop CC to complete the loop. If you were lucky enough to have a full 10-bit pipeline on the Windows side, Photoshop supported it as far back as Windows 7 and Photoshop CS4.
Then, things took a really unexpected turn: the iPad Pro 9.7” arrived with a P3 calibrated display. This development required another piece of software that didn’t yet exist: ColorSync on iOS. Sure enough, it arrived alongside the new iPad. Even people without an iPad Pro noticed, as CMYK PDFs started to render properly on all iOS devices.
Sample image from the WebKit team showing their logo in wide-gamut color. Can’t see it? You won’t if you don’t have a wide-gamut display. Here’s a representation of what you’re missing:
Another example of extra detail achievable in the reds in the P3 space. Saved as 16-bit PNGs.
Android phones – specifically Samsung’s latest models – have gorgeous OLED screens capable of exceptional color reproduction, but without system-wide color management, they’re a bit stuck. On the S7, the “Adaptive display” mode comes with a warning: “Optimizes the colour range, saturation, and sharpness of your display automatically. This mode may not be compatible with third-party apps.”
Full color management in Android would mean not having to make these manual choices.
For Apple, the other shoe dropped at WWDC in June, when the company announced that with the release of iOS 10, devices would be able to capture 10-bit RAW images straight from their camera sensors.
So, this got us thinking. Here are our predictions for today’s announcement:
If most of these hunches come true, we’ll see a huge shift in the way people view images on the screens they use most often. Even if they don’t, the stage has been set for everyone to look at better images on dramatically better screens. But there’s a catch: we currently lack an effective mechanism for delivering these images to you, the end consumer.
For photographers, the trend toward supporting deep and wide color is significant. At 500px, our focus is on showcasing our photographers’ work as well as we possibly can, which means we want to get the tools that help you deliver these images into your hands as quickly as possible.
This is the kind of announcement that gets glossed over by most media after an Apple event. It’s technical and mundane compared to hardware announcements, yet it impacts everything we do on our Apple devices and the ecosystems around iOS. More importantly to us at 500px, and to countless other technology companies, these changes affect developers, designers, product teams, and companies at large. You can bet we’ll be watching today’s event with bated breath.
In Part II of this article, we’ll look at what this means in the real world, how it could be successfully implemented, and the significant roadblocks still in the way.
1. Perceived resolution is a function of distance – i.e., it matters how far you hold your phone from your eyes. The rise of VR has huge implications for the resolution of our phones: what used to be an extreme resolution starts to make sense when the phone is only a few inches from your eye. VR will continue to push resolutions higher.
2. You might notice that this is also (confusingly) referred to as 32-bit color. It’s actually 24-bit with an 8-bit alpha channel. This is NOT the same as the 30-bit color we’ll discuss next.