CRI 80 vs. CRI 95: Which Would You Rather Have?

  • Jimmy
  • March 02, 2019
Color rendering index (CRI) has long been used in the fluorescent and high intensity discharge (HID) world as a metric of how faithfully a specific lamp renders color.

When I started in the lighting industry, I don’t think I fully grasped what CRI meant in practice until I attended my first training with a lamp manufacturer. I had read about CRI in theory, but it was just that: theory. The impact of CRI didn’t really resonate.

It wasn’t until that manufacturer training, when I witnessed the effect that lamps with different CRIs had on materials of different textures and colors, that I really felt comfortable talking about CRI and its impact on the lighted area.

So, How Do You Measure CRI?
Simply put, CRI is determined by comparing a test light source against a reference source of the same correlated color temperature (CCT) that is defined to have a CRI of 100 (a blackbody radiator at warmer CCTs, daylight at cooler ones). The test method evaluates the light source’s ability to render eight standardized pastel color samples.

The difference between how those samples appear under the reference source (CRI 100) and under the test source yields the numeric CRI value. So, generally speaking, the lighting industry uses a lamp’s CRI value to represent the quality of the light it emits, or the “trueness” of the way colors look under that light; the closer the CRI is to 100, the better.
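Under the hood, the arithmetic is straightforward: each of the eight samples gets a “special” score of 100 minus 4.6 times its measured color shift (ΔE, in the CIE 1964 U*V*W* space) between the reference and test sources, and the general CRI (Ra) is simply the average of those eight scores. Here’s a minimal Python sketch of that averaging step; the ΔE values below are made-up placeholders, since real ones come from spectral measurements of an actual lamp.

```python
# Minimal sketch of the general CRI (Ra) calculation.
# Assumes the per-sample color shifts (delta-E, CIE 1964 U*V*W*)
# between the reference and test source have already been measured;
# the values below are hypothetical placeholders, not real data.

def special_cri(delta_e: float) -> float:
    """Special CRI for one test color sample: R_i = 100 - 4.6 * dE_i."""
    return 100.0 - 4.6 * delta_e

def general_cri(delta_es: list[float]) -> float:
    """General CRI (Ra): the mean of the eight special indices R1-R8."""
    scores = [special_cri(de) for de in delta_es]
    return sum(scores) / len(scores)

# Hypothetical color shifts for the eight standard pastel samples (R1-R8).
delta_es = [2.1, 3.4, 1.8, 4.0, 2.6, 3.1, 2.9, 5.2]
print(f"Ra = {general_cri(delta_es):.1f}")  # closer to 100 = truer color
```

A lamp whose samples barely shift scores near 100, while large shifts on even a few samples pull the average down quickly.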

Here’s an example showing the difference between poor color rendering light (left) and high-color-rendering light (right). Which would you rather have?


[Image: strawberries under low-CRI vs. high-CRI light, with CRI notations]