What Is 10-Bit (And 12-Bit) Color?
So, 10-bit color: It’s important and new, but…what is it? Before we dive into that, the first question is what is bit depth, and why does it matter for displays?
What Is Bit Depth?
In computer programming, variables are stored in different formats with differing numbers of bits (i.e., ones and zeros), depending on how many bits that variable needs. In general, each bit you add lets you count up to double the previous number and store double the previous amount of information. So if you need to count up to two (excluding zero), you need one bit. Two bits allow you to count up to four, three bits up to eight, and so on.
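To make that doubling concrete, here's a minimal Python sketch. Nothing in it is display-specific; it's just the basic binary math of how many distinct values a given number of bits can hold:

```python
# A minimal sketch: how many distinct values n bits can represent.
# Each added bit doubles the count of representable values.
for bits in range(1, 13):
    values = 2 ** bits
    print(f"{bits:2d} bits -> {values:5d} values (0 to {values - 1})")
```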
The other thing to note here is that, in general, the fewer bits, the better. Fewer bits mean less information, so whether you're sending data over the internet or asking your computer's processor to chew through it, you get whatever you want faster.
However, you need enough bits to actually count up to the highest (or lowest) number you want to reach. Going past your limit, over the top or under the bottom, is one of the most common software errors. It's the type of bug that caused Gandhi to become a warmongering, nuke-throwing tyrant in the original Civilization: when his "war" rating tried to go negative, it wrapped around to its maximum possible setting instead. So the number of bits you need is a balancing act; the fewer bits you use, the better, but you should never use fewer than what's required.
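As an illustration of that wraparound, here's a hypothetical Python sketch that mimics an unsigned 8-bit counter with modular arithmetic. The "rating" variable and the 8-bit size are made up for the example, not taken from Civilization's actual code:

```python
# A hypothetical illustration of wraparound, mimicking an unsigned 8-bit
# counter (values 0..255). Subtracting past zero wraps to the top of the
# range, much like the rating in the Gandhi anecdote above.
BITS = 8
LIMIT = 2 ** BITS        # 256 possible values: 0..255

rating = 1
rating = (rating - 2) % LIMIT   # try to go to -1
print(rating)                   # 255 -- wrapped around to the maximum
```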
How Bit Depth Works With Displays
With the image you’re seeing right now, your device is transmitting three different sets of bits per pixel, separated into red, green, and blue colors. The bit depth of these three channels determines how many shades of red, green, and blue your display is receiving, thus limiting how many it can output.
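Here's a minimal sketch of what per-channel bit depth buys you in practice. The bit depths listed are common panel values; the math itself is simply 2 raised to the number of bits per channel, multiplied across the three channels:

```python
# A minimal sketch of per-channel bit depth: shades per channel and the
# total number of displayable colors (red x green x blue combinations).
for bits in (6, 8, 10, 12):
    shades = 2 ** bits
    total = shades ** 3
    print(f"{bits:2d} bits/channel: {shades:5d} shades each, {total:,} total colors")
```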
The lowest-end displays (which are fairly rare now) have only six bits per color channel. The sRGB standard calls for eight bits per color channel to avoid banding. In order to match that standard, those old six-bit panels use Frame Rate Control to dither over time. This, hopefully, hides the banding.
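Here's a simplified, hypothetical sketch of how FRC-style temporal dithering works, assuming a panel limited to 6-bit levels (0 to 63) that receives 8-bit values (0 to 255). Real panels use more sophisticated patterns, but the time-averaging idea is the same:

```python
# A simplified sketch of Frame Rate Control (temporal dithering): the panel
# alternates between the two nearest 6-bit levels across a 4-frame cycle so
# the time-averaged brightness approximates the requested 8-bit value.
def frc_frames(value_8bit: int) -> list[int]:
    low = value_8bit // 4        # nearest 6-bit level at or below the target
    high = min(low + 1, 63)      # next 6-bit level up
    remainder = value_8bit % 4   # how far the target sits between the two levels
    # Show the higher level for `remainder` of the 4 frames, the lower one otherwise.
    return [high if frame < remainder else low for frame in range(4)]

print(frc_frames(130))  # [33, 33, 32, 32]: averages 32.5, i.e. ~130 in 8-bit terms
```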
And what is banding? Banding is a sudden, unwanted jump in color and/or brightness where none is requested. The higher you can count, in this case when outputting shades of red, green, and blue, the more colors you have to choose from, and the less banding you'll see. Dithering, on the other hand, doesn't have those in-between colors. Instead, it tries to hide banding by noisily transitioning from one color to another. It's not as good as a true higher bit depth, but it's better than nothing.
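To see why dithering helps, here's a toy Python sketch that quantizes a smooth gradient to a deliberately tiny number of levels: once directly, which produces hard bands, and once with a little noise added first, which trades the bands for grain:

```python
import random

# A toy sketch of banding vs. dithering: quantizing a smooth 0..1 gradient to
# very few levels produces hard steps (banding), while adding noise roughly one
# quantization step wide before rounding scatters those steps into grain.
LEVELS = 8  # deliberately tiny (3 bits) so the banding is obvious

def quantize(x: float) -> int:
    return round(x * (LEVELS - 1))

gradient = [i / 63 for i in range(64)]       # a smooth ramp of 64 samples
banded = [quantize(x) for x in gradient]     # long runs, then sudden jumps
dithered = [quantize(min(max(x + random.uniform(-0.5, 0.5) / (LEVELS - 1), 0), 1))
            for x in gradient]               # same levels, noisier transitions

print(banded)    # clear bands: runs of one value, then a jump
print(dithered)  # the jumps are broken up, which the eye reads as smoother
```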