MYNDEX
RESEARCH

Luminance Contrast and Perception


How to Assess Luminance and Contrast.

Experimental Site Notice

This is not a production site. Some pages may have unusual formatting or colors as part of a series of live experiments in visual perception of self-illuminated displays. If you have any problems reading any pages, or if you find any are particularly easy to read, we'd love to hear about it. Please discuss at the SAPC GitHub repo discussions tab.

Introduction (Updated for 2020)

We are researching and developing contrast assessment methods to provide new and more accurate standards. This project originated in GitHub WCAG Issue 695 (link to a late-in-thread summary). Also updated for 2022: please see the Color and Contrast Resources page at our GitHub repo, which provides the very latest on this subject. The present article was written before the invention of the APCA contrast calculations, but is left here for historical reasons.

Determining visual contrast is not as straightforward as one might think. There are a number of factors that affect human perception of contrast, and contrast perception is more than the difference between two colors. Contrast can (and must) be broken down into luminance contrast (light vs dark) separately from hue or saturation contrast (red vs blue, green vs grey), not to mention spatial frequency contrast, size contrast, and so on.

CVD: Color Vision Deficiency affects a substantial portion of the population: 1% of women and 6% of men have CVD. People with CVD rely on luminance contrast rather than color contrast. We have a chart here that demonstrates that very issue, and also a CVD Simulator Web App, so you can see how someone with a color vision deficiency perceives colors.

Color: Strictly speaking, Color is not real. Light does not have "color" but light does exist in different frequencies or wavelengths. Color, as in hue, is a perception that results from the eye/brain vision system responding to different frequencies, not unlike how the ear/brain system hears the difference between a low note on the left of a piano keyboard vs a high note.

Spectral response simply means how strongly the eye responds to a particular frequency of light (i.e. a color).

Luminance is the linear measure of visible light that has been adjusted for the spectral response of the human eye.

Side note: the correct term is "luminance" not luminosity. Luminosity refers to light emitted over time, often used in astronomy. When we are talking about modern colorimetry, we use the term luminance, with relative luminance denoted Y, and absolute luminance as L.

As it happens, in 1931 the CIE created the XYZ colorspace, where Y is relative luminance. At that time they also described the human eye's spectral response as the "luminosity function", though that term has long since been replaced by "luminous efficiency function" which is a more accurate descriptor.

And while luminance takes into account the human perception of color, it does not take into account the human perception of variations in intensity. Luminance is linear as is light in the real world. But human perception of light is very non-linear, and we have other terms for describing how we perceive intensity, such as perceptual lightness which is often denoted as L* aka "Lstar." The mathematical definition that is best to use (L, Y, L*, etc.) depends on what is being modeled or predicted.

DETERMINING LUMINANCE

Luminance is a spectrally weighted but otherwise linear measure of light. The spectral weighting is based on how human trichromatic vision perceives different wavelengths of light. This was part of the measurements in the CIE 1931 experiments and resultant colorspaces such as CIEXYZ (Luminance is the Y in XYZ).

While XYZ is a linear model of light, human perception is very much non-linear. As such, XYZ is not perceptually uniform. Nevertheless, luminance is useful when discussing physical quantities of light, such as knowing what the equivalent luminance is for a particular color vs a grey patch.

Assuming you are starting with sRGB video (i.e. the web & computer standard colorspace) you first need to remove the TRC or "gamma" encoding, and then apply the spectral weighting.

I've made a lot of posts [on StackExchange] regarding gamma, but if you want a definitive explanation I recommend Poynton's Gamma FAQ.

Converting sRGB to Linear (Gamma 1.0).

1) Convert the RʹGʹBʹ values from 8 bit integer (0-255) to decimal (0.0 - 1.0) by dividing each channel individually by 255.0. The RʹGʹBʹ values must be 0 to 1 for the following math to work. Later in the code examples are snippets for converting a single number (like a 6 digit hex) into RGB channels.

2a) Linearize each channel. With each separate channel at 0.0 to 1.0, the lazy way is to simply apply a power curve of 2.2, which is how the IEC sRGB standard defines the display gamma. This is essentially how an sRGB computer monitor displays the image data, so for the purposes of judging the luminance of a color this is fine:

\begin{equation}
(\mathbf R^\prime \div 255.0)^{2.2} \quad (\mathbf G^\prime \div 255.0)^{2.2} \quad (\mathbf B^\prime \div 255.0)^{2.2}
\end{equation}

2b) An alternate and in some use cases "more accurate" method: if you are doing image processing and going back and forth between sRGB and linear, then you probably want to use the precise reverse transformation from the specification, which Bruce Lindbloom goes into, and which is also on Wikipedia.

\begin{align}
V_{[R^\prime,G^\prime,B^\prime]} & = \forall\, [\, R^\prime_\mathrm{srgb} \div 255.0,\ G^\prime_\mathrm{srgb} \div 255.0,\ B^\prime_\mathrm{srgb} \div 255.0\, ] \\
& \qquad \qquad \Downarrow \\
V_{\mathrm{Linear}[R,G,B]} & = \forall \begin{cases} V_{[R^\prime,G^\prime,B^\prime]} \div 12.92 & V_{[R^\prime,G^\prime,B^\prime]} \leq 0.04045 \\ \Bigl( \bigl( V_{[R^\prime,G^\prime,B^\prime]} + 0.055 \bigr) \div 1.055 \Bigr)^{2.4} & V_{[R^\prime,G^\prime,B^\prime]} > 0.04045 \end{cases}
\end{align}

As with the simple (2a) method, first divide each channel separately by 255.0 so the values are 0.0 to 1.0. Then, if the channel is less than or equal to 0.04045, divide it by 12.92. Otherwise, add 0.055, divide the result by 1.055, then raise that result to the power of 2.4.

Also, here's a code snippet from a spreadsheet which I use for a similar purpose (A1 is one channel of RGB, already divided by 255 to the 0.0–1.0 range).

// OpenOffice Calc
	=IF( A1 <= 0.04045 ; A1 / 12.92 ; POWER((( A1 + 0.055)/1.055) ; 2.4)) 

What this shows is for values under 0.04045 you just divide by 12.92, but for values above, you offset and apply a power of 2.4 — note that in the "lazy way" we used a 2.2 power curve, but the 2.4 curve is nearly identical due to the offset near black.

You can do either step 2a OR step 2b — but not both.

3) Finally, apply the coefficients for spectral weighting, and sum the three channels together. For sRGB or Rec709, those coefficients are Rco 0.21263901 & Gco 0.71516867 & Bco 0.07219232 and they are commonly rounded to four places (including in the official spec):

\begin{equation}
{\Large Y = \ } V_{R\mathrm{lin}} \times 0.2126 + V_{G\mathrm{lin}} \times 0.7152 + V_{B\mathrm{lin}} \times 0.0722
\end{equation}

And that gives you Y, your luminance for a given color, as a 0.0 to 1.0 value (the Y in CIEXYZ). Luminance is also symbolized as L, such as when it is an absolute measurement (i.e. cd/m² or nits). This is not to be confused with L* (Lstar), which is perceptual lightness from CIELAB, not luminance.
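
To tie the steps together, here is a minimal sketch in JavaScript that chains them: divide by 255, apply the IEC piecewise linearization from 2b, then apply the coefficients. The function name hexToY and its structure are just illustrative, not part of any standard:

    // Minimal sketch: hex string (e.g. 'ABCDEF') to relative luminance Y (0.0 to 1.0)
  function hexToY(HEXstring) {
      // IEC piecewise linearization from step 2b
    var linearize = function(val) {
      return ( val <= 0.04045 ) ? val / 12.92 : Math.pow((val + 0.055) / 1.055, 2.4);
    };
      // Parse, normalize to 0.0-1.0, and linearize each channel
    var R = linearize(parseInt(HEXstring.substr(0,2), 16) / 255.0),
        G = linearize(parseInt(HEXstring.substr(2,2), 16) / 255.0),
        B = linearize(parseInt(HEXstring.substr(4,2), 16) / 255.0);
      // Apply the spectral weighting coefficients and sum
    return 0.2126 * R + 0.7152 * G + 0.0722 * B;
  }

    // Example: hexToY('777777') returns approximately 0.184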

Determining Perceived Contrast ....

EDITED Oct 2020 to add: This article was originally written prior to the invention of the SAPC method for predicting contrast on displays, and the Advanced (Accessible) Perceptual Contrast Algorithm (APCA) for determining contrast. For a discussion on this new method, please see the APCA Resources Page and play with the live APCA Contrast Calculator.

The rest of this article is being left essentially untouched for historical reasons, as it provides a good recap of previous methods to measure the perception of contrast, and useful resources such as code snippets and terminology. If you have specific questions about SAPC, the best way to contact the author is to start a discussion at the SAPC/APCA forum.

Before we continue, I just want to mention again that contrast as a perception is more than just the measured difference between two colors. But the depth of that subject is for a separate article.

...Now back to our "Determining Contrast" chat...

If you want to determine the difference between two samples, there are a number of methods. Weber Contrast is essentially ΔL/L and has been a standard method since the 19th century (the Weber Fraction applies to all forms of human perception). But for computer monitor displayed stimuli, I suggest some more modern approaches. For instance the following modification for better perceptual results:

(Llighter – Ldarker) / (Llighter + 0.1)

There is also Michelson contrast, RMS contrast, simple ratios, Perceptual Contrast Length, Bowman-Sapolinski, and others, including some we are working on here at Myndex™ Perception Research. You can also convert to CIELAB (L*a*b*), which is based on human perception; there you just subtract L*1 from L*2.

There are a number of other factors that affect contrast perception such as font size and weight (see Spatial Frequency), padding (See Bartleson–Breneman Surround Effects), light adaptation, and other factors.



Glossary and Code Snippets

Each step in the RʹGʹBʹ to Y process is important, and must be performed in the order described or the results will fail.

DEFINITIONS:

sRGB: sRGB is a tristimulus color model which is the standard for the Web, and used on most computer monitors. It uses the same primaries and white point as Rec709, the standard for HDTV. sRGB differs from Rec709 only in the transfer curve, often referred to as gamma.

Gamma: This is a curve used with various methods of image coding for storage and transmission. It is often similar to the perception curve of human vision. In digital, gamma's effect is to give more weight to the darker areas of an image such that they are defined by more bits in order to avoid artifacts such as "banding".

Luminance (notated L or Y): a linear measure or representation of light (i.e. no gamma curve, or gamma 1.0). As a measure it is usually L in cd/m² or "nits". As a representation, it is Y as in CIEXYZ, commonly 0 (black) to 1.0 (or 100) (white). Luminance features spectral weighting, based on human perception of different wavelengths of light. However, luminance is linear in terms of light quantity: if 100 photons of light measures 10, then 200 photons would measure 20.

L* (aka Lstar): Perceptual Lightness, as defined by CIELAB (L*a*b*). Where luminance is linear in terms of the quantity of light, L* is based on perception, and so is nonlinear in terms of light quantities, with a curve intended to match the human eye's photopic vision (approximate gamma of ^0.43).

Luminance vs L*: 0 and 100 are the same in both luminance (written Y or L) and Lightness (written L*), but in the middle they are very different. What we identify as middle grey is in the very middle of L* at 50, but that relates to 18.4 in Luminance (Y). In sRGB that's #777777 or 46.7%.

Contrast: The term for defining a difference between two L or two Y values. There are multiple methods or standards for contrast. One common method is Weber contrast, which is ΔL/L, with contrast stated as a percentage (e.g. 70%).

LUMINANCE (Y) from sRGB

STEP ZERO (un-HEX)

If needed, convert a HEX color value to a triplet of 8 bit integer values where #00 = 0 and #FF = 255.


/////  Parsing RGB Values out of Hex, integer, or string.  /////

/////  Bit Shifting for Thrills and Excitement  /////

    // HEX to R,G,B Bit Shift
  var HEXcolor = 0xABCDEF;

  var R = (HEXcolor & 0xFF0000) >> 16,
      G = (HEXcolor & 0x00FF00) >> 8,
      B = (HEXcolor & 0x0000FF);

    // Going back, bit shift R, G, B into a single value
  var rgbTotal = ( R << 16 | G << 8 | B );
  
/////  Gosh, Bit Shifting is scary, what about regular math?  /////

    // Using Mod to extract (less efficient than bit shift):
  var R = Math.trunc( rgbTotal / 65536 );
  var G = Math.trunc( ( rgbTotal % 65536 ) / 256 );
  var B = rgbTotal % 256;

    // Going back using multiply add (less efficient than bit shift):
  var rgbTotal = R * 65536 + G * 256 + B;

/////  I Prefer Strings and So Does My Cat  /////

    // As HEX is often sent as a string, string parsing is common.
    // Assuming the order is RRGGBB, then .substr(0,2) parses RR
    // Using JS String Methods:
  var HEXstring = 'ABCDEF';
  
  var R = parseInt(HEXstring.substr(0,2), 16); 
  var G = parseInt(HEXstring.substr(2,2), 16);
  var B = parseInt(HEXstring.substr(4,2), 16);


STEP ONE (8 bit integer to float)

Convert 8 bit sRGB values to float (0.0 to 1.0) by dividing by 255.0:

   Rʹfloat = Rʹ8bit / 255.0       Gʹfloat = Gʹ8bit / 255.0       Bʹfloat = Bʹ8bit / 255.0

If your sRGB values are 16 bit then convert to decimal by dividing by 65535.
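
As a quick sketch of this normalization (the variable names are just for illustration):

    // 8 bit channel (0-255) to float (0.0-1.0)
  var R8bit = 130;
  var Rfloat = R8bit / 255.0;        // 0.50980...

    // 16 bit channel (0-65535) to float (0.0-1.0)
  var R16bit = 33410;
  var Rfloat16 = R16bit / 65535.0;   // 0.50980...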

STEP TWO (Version A Linearize, Simple)

Raise each color channel to the power of 2.2, the same as an sRGB display. This is fine for most applications. But if you need to make multiple round trips into and out of sRGB gamma encoded space, then use the more accurate piecewise version below.

   Rlin = Rʹ^2.2    Glin = Gʹ^2.2    Blin = Bʹ^2.2
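
For symmetry with the piecewise function in the next step, here is a sketch of this simple version as a function (the name sRGBtoLinSimple is just illustrative):

function sRGBtoLinSimple(colorChannel) {
		// Send this function a decimal sRGB gamma encoded color channel
		// between 0.0 and 1.0, and it returns a value linearized with
		// a simple 2.2 power curve (fine for judging luminance).
		return Math.pow(colorChannel, 2.2);
	}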

STEP TWO Alt (Version B Linearize, IEC Standard)

Use this version instead of the simple ^2.2 version above if you are doing image manipulations and multiple round trips in and out of gamma encoded space.

function sRGBtoLin(colorChannel) {
		// Send this function a decimal sRGB gamma encoded color channel
		// between 0.0 and 1.0, and it returns a linearized value.
		if ( colorChannel <= 0.04045 ) {
			return colorChannel / 12.92;
		} else {
			return Math.pow((( colorChannel + 0.055)/1.055),2.4);
		}
	}

STEP THREE (Spectrally Weighted Luminance)

The normal human eye has three types of cones that for simplicity we'll say are sensitive to red, green, and blue light. But our spectral sensitivity is not uniform, as we are most sensitive to green (555 nm), and blue is a distant last place. Luminance is spectrally weighted to reflect this using the following coefficients for sRGB:

   Rlin * 0.2126 + Glin * 0.7152 + Blin * 0.0722 = Y = L

Multiply each linearized color channel by their coefficient and sum them all together to find L, Luminance.
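
As a sketch, using the sRGBtoLin() function from step two alt (the wrapper name rgbToY is mine):

function rgbToY(Rfloat, Gfloat, Bfloat) {
		// Expects the three channels as floats (0.0 to 1.0) from step one.
		// Linearizes each channel, applies the sRGB coefficients, and sums to Y.
		return sRGBtoLin(Rfloat) * 0.2126 +
		       sRGBtoLin(Gfloat) * 0.7152 +
		       sRGBtoLin(Bfloat) * 0.0722;
	}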

STEP FOUR (Contrast Determination)

There are many different means to determine contrast, and various standards as well. Some equations work better than others depending on the specific application.

WCAG
The current web page standard listed in WCAG 2.0 and 2.1 is a simple contrast ratio:

   C = ((Llighter + 0.05) / (Ldarker + 0.05)) : 1

This gives a ratio, and the standards of the WCAG specify 3:1 for non-text, and 4.5:1 for text.
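
As a sketch of that ratio in code (the function name wcagContrast is mine; the Y values come from step three):

function wcagContrast(Y1, Y2) {
		// Send two relative luminance values (0.0 to 1.0);
		// returns the WCAG 2.x ratio as a number, e.g. 4.5 means 4.5:1
		var Ylighter = Math.max(Y1, Y2),
		    Ydarker  = Math.min(Y1, Y2);
		return (Ylighter + 0.05) / (Ydarker + 0.05);
	}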

However, it is a weak example for a variety of reasons. I'm on record as pointing out the flaws in a current GitHub issue (695) and have been researching alternatives.

Modified Weber
The Hwang/Peli Modified Weber provides a better assessment of contrast as it applies to computer monitors / sRGB.

   C = (Llighter – Ldarker) / (Llighter + 0.05)

There is also this further modified Weber:

   C = ((Llighter – Ldarker) / (Llighter + 0.125)) * 0.8

Note that I chose the flare factor of 0.125 instead of 0.05 based on some recent experiments. That value is TBD though, and a different value might be better.
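
Here is a sketch of both variants as code (function names are mine; the 0.05 flare term is the Hwang/Peli value, and 0.125 with the 0.8 scale is the experimental variant just described):

function weberModified(Ylighter, Ydarker) {
		// Hwang/Peli modified Weber with a 0.05 flare term
		return (Ylighter - Ydarker) / (Ylighter + 0.05);
	}

function weberModifiedAlt(Ylighter, Ydarker) {
		// Further modified version: 0.125 flare term, 0.8 scaling
		// (the 0.125 value is provisional, per the note above)
		return ((Ylighter - Ydarker) / (Ylighter + 0.125)) * 0.8;
	}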

LAB Difference
Another alternative that I happen to like more than others is converting the linearized luminance (L) to L* which is Perceptual Lightness, then just subtracting one from the other to find the difference.

Convert Y to L*:

function YtoLstar(Y) {
		// Send this function a luminance value between 0.0 and 1.0,
		// and it returns L* - perceptual lightness
		// The CIE standard states 0.008856, but 216/24389 is the intended value (0.008856451679036...)
		// The CIE standard states 903.3, but 24389/27 is the intended value (903.296296296296...)
		if ( Y <= (216/24389) ) {
			return Y * (24389/27);
		} else {
			return Math.pow(Y,(1/3)) * 116 - 16;
		}
	}

Once you've converted L to L*, then a useful contrast figure is simply:

    C = L*lighter – L*darker

The results here may need to be scaled to be similar to other methods. A scaling of about 1.6 or 1.7 seems to work well.
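
A sketch of that difference as code, using the YtoLstar() function above (lstarContrast is an illustrative name, and the 1.6 scale factor is the approximate value just mentioned):

function lstarContrast(Y1, Y2) {
		// Convert both luminances to L*, take the absolute difference,
		// and scale (about 1.6) to bring the numbers in line with other methods
		var scale = 1.6;
		return Math.abs( YtoLstar(Y1) - YtoLstar(Y2) ) * scale;
	}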

There are a number of other methods for determining contrast, but these are the most common. Some applications though will do better with other contrast methods. Some others are Michelson Contrast, Perceptual Contrast Length (PCL), LRV difference, simple ratio, and Bowman/Sapolinski.

ALSO, if you are looking for color differences beyond the luminance or lightness differences, then CIELAB has some useful methods in this regard.



SIDE NOTES:

Averaging RGB No Bueno!

There is a commonly cited equation for making a greyscale from a color:

    GRAY = round((R + G + B) / 3);

It has a bit of a problem: It is completely wrong. The spectral weighting of R, G, and B is substantial and cannot be overlooked. GREEN is a higher luminance than BLUE by an ORDER OF MAGNITUDE. You cannot just sum all three channels together and divide by three and get anything close to the actual luminance of a particular color.
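
To illustrate the difference, here's a sketch comparing the naive average with a luminance-weighted grey built from the linearization and coefficients described above (function names are mine; the final line re-encodes with a simple 1/2.2 curve):

    // Naive average — NOT luminance, shown only for comparison
  function grayAverage(R8, G8, B8) {
    return Math.round((R8 + G8 + B8) / 3);
  }

    // Luminance-weighted grey: linearize, apply coefficients, re-encode to 0-255
  function grayLuminance(R8, G8, B8) {
    var lin = function(c) {
      c = c / 255.0;
      return ( c <= 0.04045 ) ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
    };
    var Y = 0.2126 * lin(R8) + 0.7152 * lin(G8) + 0.0722 * lin(B8);
    return Math.round(Math.pow(Y, 1 / 2.2) * 255);
  }

    // Pure green vs pure blue:
    // grayAverage(0, 255, 0) -> 85     grayLuminance(0, 255, 0) -> ~219
    // grayAverage(0, 0, 255) -> 85     grayLuminance(0, 0, 255) -> ~77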

I believe the confusion over this may have come from a color control known as HSI (Hue, Saturation, Intensity). But this control is not (and was never intended to be) perceptually uniform! HSI, like HSV, is just a "convenience" for manipulating color values in a computer. Neither is perceptually uniform, and the math they use exists strictly to support an "easy" way to adjust color values in software.

Sample Colors

Using #318261 and #9d5fb0 as test colors. Here's how they look on my spreadsheet, along with each value at every step of the conversion (using the "accurate" sRGB method):

[Image: Spreadsheet of Calculations]

Both are close to middle grey of #777777. Notice also that while the luminance L is just 18, the perceptual lightness L* is 50.
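
For example, a quick check of that middle grey claim (hexToY is the illustrative helper sketched earlier; YtoLstar is the function from the LAB Difference section):

    // #777777 -> 119 / 255 = 0.46667 per channel
    // linearized: ((0.46667 + 0.055) / 1.055)^2.4 ≈ 0.1845
    // Y = 0.1845 * (0.2126 + 0.7152 + 0.0722) ≈ 0.184   (about 18%)
  var Y = hexToY('777777');     // ≈ 0.184
  var Lstar = YtoLstar(Y);      // ≈ 50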

— Andrew Somers 2019