DISPLAY CALIBRATION 101
Why calibrate?
Monitor calibration is an essential part of any colour-managed workflow. If a user grades an image on a display that does not correctly represent colour or gamma, the rendered result will differ from what they expect and may well contain out-of-range values. This can cause problems at QC and delivery, which costs more time and money to fix.
For example, if a monitor has a tint towards blue and the user grades that tint out, the delivered image will contain more red than intended.
How calibration works
The basics of calibration: you output a series of solid colours, or “patches”, to your reference monitor, which are then measured against a known standard. For any deviation from the standard you can either correct your display manually, provided it has controls to do so, or, most commonly, have your software generate a LUT (lookup table). This LUT is a simple transform that takes your input RGB values and modifies them to compensate for the display’s error. You can find more information on LUTs in our article on the subject.
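To make the idea of a LUT transform concrete, here is a minimal sketch of applying a per-channel 1D LUT to a pixel value using linear interpolation. The LUT values below are invented purely for illustration; real correction LUTs are produced by calibration software from probe measurements.

def apply_1d_lut(value, lut):
    """Map a normalised 0-1 value through a 1D LUT with linear interpolation."""
    position = value * (len(lut) - 1)
    lower = int(position)
    upper = min(lower + 1, len(lut) - 1)
    fraction = position - lower
    return lut[lower] + (lut[upper] - lut[lower]) * fraction

# A toy 5-entry LUT that slightly darkens the mid-tones of one channel.
red_lut = [0.0, 0.22, 0.46, 0.72, 1.0]

print(apply_1d_lut(0.5, red_lut))  # input 0.5 -> corrected output 0.46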
The tools used to measure the patches on your display are colorimeters or spectrophotometers, commonly referred to as probes. They are light-sensing devices that read each standardised patch and report a measured value for it. The software driving the probe then compares the values reported by the probe against the expected values for the patches and generates a LUT to account for any difference.
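The measure-and-correct loop can be sketched as below. Everything here is an assumption for illustration: measure_patch stands in for a real probe driver (it fakes a display whose native response is gamma 2.2), and the target gamma of 2.4 is just an example standard. Real calibration software measures many more patches and handles colour as well as greyscale.

import math

TARGET_GAMMA = 2.4  # assumed target response for this example

def measure_patch(signal):
    """Placeholder probe reading: pretend the display is really gamma 2.2."""
    return signal ** 2.2

def estimate_display_gamma(signal=0.5):
    """Derive the display's actual gamma from a mid-grey measurement."""
    measured = measure_patch(signal)
    return math.log(measured) / math.log(signal)

def build_correction_lut(num_entries=16):
    """Build a 1D LUT mapping input signal to the corrected signal that
    makes the display produce the target response."""
    display_gamma = estimate_display_gamma()
    lut = []
    for i in range(num_entries):
        signal = i / (num_entries - 1)
        target = signal ** TARGET_GAMMA            # luminance we want to see
        corrected = target ** (1 / display_gamma)  # signal that produces it
        lut.append(round(corrected, 4))
    return lut

print(build_correction_lut())

Sending corrected values through the display yields corrected ** 2.2 == signal ** 2.4, which is exactly the target response.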
Limitations when grading with the UI viewer
Most computer displays do not operate at the colour-critical tolerances or specifications required for broadcast or theatrical delivery. Anything viewed on your GUI display can be affected by the OS, the GPU, and the GPU drivers. As such, it is most common to use a video I/O device, specifically designed to output your image, untouched, from your software directly to a calibrated display.