LCD Displays and Bit Color Depth

Explaining the difference between 6-, 8- and 10-bit displays

The color range of a computer display is described by its color depth: the number of distinct colors the hardware can display. The most common color depths you'll see are 8-bit (256 colors), 16-bit (65,536 colors), and 24-bit (16.7 million colors). True color (24-bit color) is the most frequently used mode because modern hardware handles this color depth efficiently.

Some professional designers and photographers work at a 32-bit color depth, but mainly as padding to obtain more finely defined tones when the project is rendered down to the 24-bit level.

Speed vs. Color

LCD monitors face a trade-off between color and speed. Each pixel on an LCD is made up of three colored subpixels, one each for red, green, and blue. To display a color, a voltage is applied to each subpixel to drive its liquid crystals to the desired intensity, and the combined intensities produce the final color. The catch is that the crystals take time to shift between their on and off states. This transition is called the response time, and for most screens it runs around 8 to 12 milliseconds.

The problem with response time becomes apparent when an LCD monitor displays motion or video. When the off-to-on transition takes too long, pixels that should have shifted to new color levels lag behind the signal, producing an effect called motion blur. This isn't an issue for static applications such as productivity software, but with high-speed video and certain video games it can be jarring.
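
For a rough sense of the numbers involved, the sketch below compares how long each frame stays on screen with the 8-to-12 millisecond transition time quoted above. This is a simplification (real panels transition at different speeds depending on the gray levels involved), but it shows why faster refresh rates leave little headroom:

    # Compare frame duration with pixel transition time. When the response
    # time approaches or exceeds the frame time, pixels are still
    # mid-transition when the next frame arrives, which appears as blur.
    RESPONSE_TIME_MS = 12  # worst case of the 8-12 ms range above

    for refresh_hz in (60, 120, 144):
        frame_time_ms = 1000 / refresh_hz
        headroom_ms = frame_time_ms - RESPONSE_TIME_MS
        print(f"{refresh_hz:>3} Hz: frame lasts {frame_time_ms:5.2f} ms, "
              f"headroom {headroom_ms:+6.2f} ms")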

Because consumers demanded faster screens, many manufacturers reduced the number of intensity levels each subpixel renders. This reduction allows response times to drop, but it has the drawback of shrinking the overall range of colors the screen can reproduce.

6-Bit, 8-Bit, or 10-Bit Color

Color depth used to be quoted as the total number of colors a screen can render. For LCD panels, the number of levels that each color channel can render is quoted instead.

For example, 24-bit or true color is composed of three colors, each with eight bits per channel. Mathematically, this is represented as:

  • 2^8 x 2^8 x 2^8 = 256 x 256 x 256 = 16,777,216

High-speed LCD monitors typically reduce the number of bits for each color to 6 instead of the standard 8. This 6-bit color generates fewer colors than 8-bit, as we see when we do the math:

  • 2^6 x 2^6 x 2^6 = 64 x 64 x 64 = 262,144
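
The same arithmetic works for any per-channel depth. As a quick check, a few lines of Python reproduce both figures above and preview the 10-bit case discussed later:

    # Total displayable colors for a given number of bits per channel:
    # each channel has 2**bits intensity levels, and a pixel combines
    # three independent channels, so the total is (2**bits) ** 3.
    for bits in (6, 8, 10):
        levels = 2 ** bits
        print(f"{bits}-bit per channel: {levels:>4} levels -> {levels ** 3:,} colors")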

This reduction is noticeable to the human eye. To get around the problem, device manufacturers employ a technique called dithering, in which nearby pixels use slightly different shades of color that trick the eye into perceiving the desired color even though it isn't truly present. A color newspaper photo is a good way to see this effect in practice; in print, the technique is called halftoning. Using this approach, manufacturers claim to achieve a color depth close to that of true-color displays.
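
Actual panels typically dither over time in hardware (a technique often marketed as frame rate control, or FRC), but a spatial version is easier to show. Here is a minimal sketch of classic Floyd-Steinberg error-diffusion dithering in Python, quantizing one 8-bit channel down to 6 bits while pushing each pixel's rounding error onto its neighbors:

    # Quantize one 8-bit channel (0-255) to 6 bits (64 levels), diffusing
    # the rounding error to neighboring pixels with Floyd-Steinberg
    # weights. Nearby values then average out, to the eye, to a shade the
    # panel cannot show directly.
    def dither_to_6bit(channel):
        h, w = len(channel), len(channel[0])
        img = [list(map(float, row)) for row in channel]
        for y in range(h):
            for x in range(w):
                old = img[y][x]
                new = round(old * 63 / 255) * 255 / 63  # nearest 6-bit level
                img[y][x] = new
                err = old - new
                if x + 1 < w:
                    img[y][x + 1] += err * 7 / 16
                if y + 1 < h:
                    if x > 0:
                        img[y + 1][x - 1] += err * 3 / 16
                    img[y + 1][x] += err * 5 / 16
                    if x + 1 < w:
                        img[y + 1][x + 1] += err * 1 / 16
        return img

    # A flat gray of 130 is not a 6-bit level; dithering approximates it
    # by mixing the two nearest levels (about 129.5 and 133.6).
    result = dither_to_6bit([[130] * 8 for _ in range(8)])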

Why multiply groups of three? For computer displays, the RGB color space dominates, which means that for 8-bit color, the final image you see on the screen is a composite of one of 256 shades each of red, green, and blue.
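
In code, that composite is commonly packed into a single 24-bit value, one byte per channel. A minimal sketch:

    # Pack three 8-bit intensities (0-255) into one 24-bit RGB value and
    # unpack them again; each channel occupies its own byte.
    def pack_rgb(r, g, b):
        return (r << 16) | (g << 8) | b

    def unpack_rgb(value):
        return (value >> 16) & 0xFF, (value >> 8) & 0xFF, value & 0xFF

    orange = pack_rgb(255, 165, 0)  # 0xFFA500
    assert unpack_rgb(orange) == (255, 165, 0)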

There is another class of display used by professionals: the 10-bit display. In theory, with 2^10 levels per channel, it displays more than a billion colors, more than the human eye can discern.

There are some drawbacks to these types of displays:

  • The amount of data required for such high color depth demands a very-high-bandwidth connection; typically, these monitors and video cards use a DisplayPort connector. (The rough calculation after this list shows why.)
  • Even though the graphics card renders upwards of a billion colors, the display's color gamut, or the range of colors it can actually reproduce, is considerably smaller. Even ultra-wide-gamut displays that support 10-bit color cannot render all of those colors.
  • These displays tend to be slower and more expensive, which makes them less practical for home consumers.
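
To see why the bandwidth point matters, here is a rough uncompressed data-rate estimate. The 4K/60 signal is an illustrative choice, and real links add blanking and encoding overhead, so treat the results as ballpark figures:

    # Uncompressed video data rate: pixels per frame x frames per second
    # x bits per pixel (three channels). 10-bit raises the requirement by
    # 25 percent over 8-bit at the same resolution and refresh rate.
    WIDTH, HEIGHT, REFRESH_HZ = 3840, 2160, 60  # illustrative 4K/60 signal

    for bits_per_channel in (8, 10):
        bits_per_second = WIDTH * HEIGHT * REFRESH_HZ * bits_per_channel * 3
        print(f"{bits_per_channel}-bit: {bits_per_second / 1e9:.2f} Gbit/s")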

How to Tell How Many Bits a Display Uses

Professional displays often tout 10-bit color support. Once again, you have to look at the real color gamut of these displays. Most consumer displays don't say how many bits they use; instead, they tend to list the number of colors they support. A few rules of thumb help (they're turned into a small sketch after this list):

  • If the manufacturer lists the color as 16.7 million colors, assume that the display uses 8 bits per color.
  • If the colors are listed as 16.2 million or 16 million, assume that it uses a 6-bit per-color depth with dithering.
  • If no color depth is listed, assume that monitors with response times of 2 ms or faster use 6 bits, and that most panels of 8 ms or slower use 8 bits.
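
Those rules of thumb (and nothing more; they are guesses from spec-sheet numbers, not measurements of the panel) can be written as a small helper:

    # Rule-of-thumb guess at per-channel bit depth from a spec sheet,
    # encoding only the heuristics above.
    def guess_bit_depth(advertised_colors=None, response_time_ms=None):
        if advertised_colors is not None:
            if advertised_colors >= 16_700_000:
                return 8  # "16.7 million colors"
            if advertised_colors >= 16_000_000:
                return 6  # "16.2 million" suggests 6-bit plus dithering
        if response_time_ms is not None:
            return 6 if response_time_ms <= 2 else 8
        return None  # not enough information

    assert guess_bit_depth(advertised_colors=16_700_000) == 8
    assert guess_bit_depth(advertised_colors=16_200_000) == 6
    assert guess_bit_depth(response_time_ms=2) == 6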

Does It Really Matter?

Color depth matters to those who do professional graphics work; for them, the range of colors a screen can display is significant. The average consumer doesn't need this level of color representation, so for most people it probably doesn't matter. People who use their displays for video games or watching videos will likely care less about the number of colors the LCD renders than about the speed at which it can display them. As a result, it's best to determine your needs and base your purchase on those criteria.
