Observations on Adobe Camera Raw for Astro-Image Processing


The traditional approach to astro-image processing is to calibrate raw exposures with bias, flats and darks. This approach works equally well with DSLR cameras and dedicated astro-cameras although in general few imagers are concerned with proper colour management. Calibrated DSLR exposures are then demosaiced (i.e. turned into colour), star aligned and stacked. Some kind of stretching (hopefully colour preserving!) is then required to make the faint structures visible.

In recent years, a modern approach to astro-image processing has been advocated, which uses the power of the raw converter and dispenses with the need for the bias, flat and dark calibration frames. The resulting colour exposures are then aligned and stacked.

The benefits of this modern approach are the following:
  • The raw converter can automatically deal with hot and dead pixels
  • Lens profiles can be applied to correct for vignetting, replacing the need for flats
  • Corrections for lens distortions and chromatic aberrations can be applied
  • Good noise reduction techniques can be applied
  • The raw converter applies white balance, chromatic adaptations and full colour space management, which should mean the result is colorimetrically correct

    However, is this "colour managed" approach everything it claims to be? Here is my attempt to find out ...

    Two Example Astro-Images

    Some data of the Pleiades and the Orion Nebula was acquired with an unmodified Canon 600D camera on a Takahashi Epsilon 180ED telescope. The resulting images are below - the same data processed using 2 different workflows.

    Image 1: Orion Nebula. Stock Canon 600D on Takahashi Epsilon 180ED telescope. 75 x 30sec exposures at F2.8 ISO 1600. Light pollution SQM reading of 20.7 magnitudes/square arcsecond

    Figure 1 - Orion Nebula - Workflow 1 (Adobe RGB image converted to sRGB for the internet)

    Figure 2 - Orion Nebula - Workflow 2 (Adobe RGB image converted to sRGB for the internet)

    Image 2: Pleiades. Stock Canon 600D on Takahashi Epsilon 180ED telescope. 60 x 60sec exposures at F2.8 ISO 1600. Light pollution SQM reading of 20.7 magnitudes/square arcsecond

    Figure 3 - Pleiades - Workflow 1 (Adobe RGB image converted to sRGB for the internet)

    Figure 4 - Pleiades - Workflow 2 (Adobe RGB image converted to sRGB for the internet)

    Depending on your preferences or what you are accustomed to, the Workflow 1 versions might appear too saturated or the Workflow 2 versions might appear too washed out.

    There are many different philosophies regarding the colour balance to be used for astro-images, but since a DSLR has been used for acquisition and a DSLR is supposed to reproduce the colours that the human eye would see, it seems a perfectly reasonable goal to produce images in "natural colour". Even if that is not your goal, it is well worth understanding what may or may not be going on inside the raw converter when you are using Adobe Camera Raw (ACR).

    Which result is more true to the original data? Are either of the two workflows anywhere near correct? Read on to find the answers.

    Skip to the conclusion if you wish ...

    If you want to know the conclusion in advance then please feel free to skip to the end and read the conclusion before continuing. However, if you prefer to read an unfolding mystery then don't jump to the end just yet.

    Generating DNG files

    It was clear that the best way to understand what was happening in the Adobe Camera Raw (ACR) raw converter was to feed it with carefully designed known data. But how to populate a raw file with known data when every camera manufacturer uses their own proprietary format? The solution was to use the Adobe Digital Negative (DNG) format. Raw files from any camera can be converted into this format using either ACR or the free Adobe DNG Converter. DNG is a publicly available archival format for storing camera raw files. The format is documented and Adobe even provides a software development kit (SDK) so a programmer can write software to generate DNG files from camera raw files or from completely synthetic data.

    Blue Step Wedge example 1

    Here is the first file I generated, containing totally synthetic data. It contains a blue step wedge alongside the white step wedge:

    Figure 5 - Blue Step Wedge (Adobe RGB image converted to sRGB for the internet)

    The DNG file, along with other test examples can be downloaded from: Google Drive Test Examples

    The file has been carefully constructed so that the blue on each step of the wedge is exactly the same colour, but each step is one exposure stop darker than the previous one. In other words, the ratios of R:G:B on each step are identical.
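As an illustration, a wedge with these properties can be generated in a few lines of code (the values below are illustrative, not the exact ones used in the downloadable DNG file):

```python
import numpy as np

# Each step halves the intensity but keeps the R:G:B proportions identical,
# i.e. constant chromaticity. Values are hypothetical 14-bit camera data.
blue_ratio = np.array([0.25, 0.50, 1.00])    # R:G:B proportions of the "blue"
top_value = 8192.0                           # brightest step, within the 0-16383 range

steps = np.array([blue_ratio * top_value / 2**i for i in range(8)])

# Verify: normalising each step by its blue value gives the same ratios throughout
normalised = steps / steps[:, 2:3]
assert np.allclose(normalised, normalised[0])
```

Any change in the measured R:G:B ratios after such a file passes through the raw converter must therefore have been introduced by the converter itself.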

    Open it up in Photoshop/ACR using the Adobe RGB colour space (or sRGB if you prefer) with the default settings and it looks quite normal as you can see above.

    However if the colour sampler tool is used to actually measure the colour ratios something odd can be seen:

    Figure 6 - Blue Step Wedge RGB ratios (Adobe RGB image converted to sRGB for the internet)

    The colour ratios change on each step of the wedge - the blue is becoming more colour saturated as the light intensity falls.

    This causes problems for a typical astro-processing workflow because the really faint structures need to be stretched out of the dark shadow areas. If this data is already colour saturated then the appearance of the faint structures made visible in the final image will also be colour saturated. This can be seen in the following example where the above image has been stretched in a colour preserving manner.

    Figure 7 - Blue Step Wedge with colour preserving stretch applied (Adobe RGB image converted to sRGB for the internet)

    You can find out more about how to apply colour preserving stretches in Photoshop here: Photoshop Colour Preserving Stretch

    As an aside, the colour ratios in Figure 6 were measured in Adobe RGB. If the DNG file is opened in a different colour space e.g. sRGB then slightly different ratios will be obtained but the overall effect is very similar.

    A Brief Introduction to Colour Science

    At this point it is worth giving an introduction to the science of digital imaging - especially the difference between linear and non-linear data because this distinction becomes very important when subtracting light pollution.

    When the eye looks at a real world scene it is capturing photons - you can consider the eye as a detector of photon rate (i.e. scene intensity). The camera, computer and display device act together as a complete imaging and display system. If it is working correctly then the eye will see the same intensities on the display device as it did from the original scene being imaged. If the RGB intensities are not in the right proportions we'll end up viewing the wrong colour on the screen.

    The camera, like the eye, captures photons. The camera stores the resulting pixel values. We call this data linear because the recorded values are directly proportional to the number of photons captured during the exposure. To create an image file (e.g. a JPG file) these values must be transformed into a known colour space such as sRGB or Adobe RGB. This involves a non-linear transformation of the data. The transformation used is a power function whose exponent is referred to as gamma. The Adobe RGB colour space uses a constant gamma of 2.2 whereas sRGB uses a variable gamma, starting off linearly (gamma = 1.0) and increasing to approximately 2.2 over a broad range of values.
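The two encoding curves can be written down explicitly. Here is a sketch using the published constants for each colour space (Adobe RGB's gamma is exactly 563/256 = 2.19921875; the sRGB constants are from its specification):

```python
import numpy as np

def adobe_rgb_encode(linear):
    """Adobe RGB (1998): a simple power function with gamma ~2.2 (2.19921875 exactly)."""
    return np.clip(linear, 0, 1) ** (1 / 2.19921875)

def srgb_encode(linear):
    """sRGB: a linear segment near black, then an offset 2.4-power curve
    giving an overall gamma of approximately 2.2."""
    linear = np.clip(linear, 0, 1)
    return np.where(linear <= 0.0031308,
                    12.92 * linear,
                    1.055 * linear ** (1 / 2.4) - 0.055)

# Near black, sRGB is linear (gamma 1.0): doubling the input doubles the output
lo = srgb_encode(np.array([0.001, 0.002]))
assert np.isclose(lo[1] / lo[0], 2.0)
```

Both functions take linear (photon-proportional) values in the range 0..1 and return the non-linear values that end up in the image file.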

    The display device (or its driver) knows how to interpret the colour space and is able to take this non-linear data and convert it back to linear intensities for displaying on the screen, whose gamma is typically set to 2.2. The image on the screen then has the same (linear) intensities as the original scene and it will appear to the eye to be just like the original scene.

    Obviously many details have been skipped in the above description, but the steps relevant to this discussion have been covered. It's worth emphasizing that the values stored in the JPG file are the non-linear values.

    Blue Step Wedge example 2

    The second example is more realistic as an example of a typical astro-imaging exposure because it includes the effects of light pollution:

    Figure 8 - Blue Step Wedge with Light Pollution (Adobe RGB image converted to sRGB for the internet)

    As before, the DNG file can be downloaded from: Google Drive Test Examples

    It is exactly the same data as the first file except that a constant light pollution level has been added. Open it in Photoshop/ACR and subtract the light pollution. The approach I took here was to use the Blacks slider in ACR to subtract as much as possible without clipping the data. Following this, within Photoshop, a curves layer was used to set the black level, thus removing the remainder of the light pollution. Here is the result:

    Figure 9 - Blue Step Wedge with light pollution subtracted in Photoshop/ACR (Adobe RGB image converted to sRGB for the internet)

    Visually it's quite obvious that parts of the white wedge are turning blue. This becomes even more obvious when the colour ratios are actually measured:

    Figure 10 - Blue Step Wedge RGB ratios with Light Pollution Subtracted (Adobe RGB image converted to sRGB for the internet)

    Notice how the proportion of blue increases as the scene intensity reduces. The reason is that the light pollution has been subtracted from non-linear data when it actually needs to be subtracted from the data in its linear state. The shift is towards blue because light pollution is generally a muddy brown colour, strongest in red and weakest in blue, so the red channel suffers the largest error from the non-linear subtraction.
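This effect is easy to reproduce numerically. In the sketch below, a neutral grey patch plus a brown light-pollution offset is encoded with a simplified constant-gamma colour space; subtracting the pollution after encoding turns the grey blue, while subtracting before encoding keeps it neutral (all pixel values are illustrative):

```python
import numpy as np

gamma = 2.2
encode = lambda x: x ** (1 / gamma)      # simplified constant-gamma colour space

scene = np.array([0.10, 0.10, 0.10])     # a neutral grey patch (R, G, B), linear
pollution = np.array([0.06, 0.04, 0.02]) # muddy brown: strong red, weak blue

captured = scene + pollution             # linear data recorded by the camera

# Correct: subtract in the linear domain, then encode
good = encode(captured - pollution)

# Incorrect: encode first, then subtract the encoded pollution level
bad = encode(captured) - encode(pollution)

assert np.allclose(good, good[0])        # still neutral: R = G = B
assert bad[2] > bad[1] > bad[0]          # blue now dominates: the grey has turned blue
```

Because the encoding curve is concave, the channel with the most pollution (red) loses the most in the subtraction, which is exactly the bluing seen in Figure 10.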

    Summary so far

    The example step wedge files have demonstrated 2 effects:
  • Colour saturation increases as scene intensity decreases
  • Subtraction of light pollution causes a bluing of the data as scene intensity decreases

    Any workflow that involves the use of the ACR raw converter will suffer from these issues, which can lead to incorrect colours.

    Raw Converter Transfer Curves

    To better understand what is happening in the raw converter it is possible to generate the transfer curves i.e. the curve representing the output (image) value for every input (camera) value. The curves in the graph below were obtained by opening a synthetic gray step wedge DNG file whose raw input values were known. The input values were 14 bit i.e. in the range 0-16383 and the output values were obtained from Photoshop by placing colour samplers on the steps of the wedge:

    Figure 11 - Adobe Camera Raw Transfer Functions

    The above chart uses logarithmic scaling on each axis. The advantage of this is that the gamma can be measured directly from the slope of the curve. On the same graph are plotted the true curves of the Adobe RGB and sRGB colour spaces.
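As a sketch of the measurement technique, the local gamma of any transfer curve can be estimated as the reciprocal of its log-log slope (this is illustrative Python, not the exact procedure used to produce Figure 11):

```python
import numpy as np

def gamma_from_slope(inputs, outputs):
    """Estimate local gamma as the reciprocal of the transfer curve's
    slope in log-log space."""
    slope = np.diff(np.log(outputs)) / np.diff(np.log(inputs))
    return 1.0 / slope

# A pure power-law encoding out = in**(1/2.2) measures gamma 2.2 everywhere
x = np.linspace(0.01, 1.0, 50)
gammas = gamma_from_slope(x, x ** (1 / 2.2))
assert np.allclose(gammas, 2.2)
```

A steep section of the curve (large slope) therefore corresponds to low gamma, which is the terminology used in the discussion below.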

    Points of note are the following:
  • The AdobeRGB colour space uses a constant gamma of 2.2
  • The sRGB colour space starts off linear (gamma=1.0) at low values and ends up very close to constant gamma 2.2 for higher values
  • The AdobeRGB curve implemented by the default process version 2012 differs quite substantially from the true AdobeRGB curve. In particular it is extremely steep for low values.
  • The sRGB curve implemented by the default process version 2012 also differs quite substantially from the true sRGB curve.

    It is clear that by default, process version 2012 is applying a number of additional adjustments even in its neutral state, at the expense of colorimetric accuracy. These adjustments do result in a more vibrant image, which is typically what photographers want to see in their images.

    It is also possible to open files using the earlier process versions: 2010 and 2003. When switching to process version 2010 it is immediately noticeable that the sliders (Blacks, Brightness and Contrast) are not in the "zero" position and that the tone curve is using the "Medium Contrast" setting. This gives some indication of the adjustments that process 2012 applies by default. To plot the graphs of process 2010 and 2003 I made sure that the sliders were all set to zero and the tone curve used the "Linear" setting instead of "Medium Contrast".

    Points of note for process versions 2010 and 2003:
  • The sRGB curve exactly follows the true sRGB curve
  • The Adobe RGB curve follows the true Adobe RGB curve very closely except it becomes linear for output values less than around 16.

    How the Transfer Functions Affect Images

    It is now possible to understand the saturation effects seen in the blue step wedge example in terms of the transfer curves. Remember that it was using AdobeRGB as implemented by process version 2012. The very saturated colours in the low intensity parts of the scene are caused by the very steep transfer curve (i.e. low gamma) in this region. The variations in blueness over the entire range of intensities are caused by variations in the gradient of the transfer curve (i.e. variations in gamma).

    While the image is still open within ACR it is possible to use the exposure slider to move the data to different positions on the transfer curve. But the simple act of pressing the "Open Image" button freezes the shape of the transfer curve into the image data. The image is no longer linear but has become non-linear, using the local gamma of the transfer curve where the data happens to be sitting.

    Notice in the above graph how the AdobeRGB curve implemented by process version 2010 has constant gamma 2.2 for all but the lowest data values. This means that we shouldn't see the variations in colour that were caused by the lumps and bumps in the process version 2012 transfer curve. Indeed that is the case - here is the blue wedge example opened with process version 2010 with all sliders set to zero and using the "Linear" setting for the tone curve:

    Figure 12 - Blue Step Wedge RGB ratios - Process version 2010 (Adobe RGB image converted to sRGB for the internet)

    Note how the colour ratios of the blue remain constant until the data values drop to the level where they are sitting on the low-gamma part of the transfer curve.

    Another example is this image of 4 colour charts synthetically created from real data. Each pane of the image is 2 stops darker than the previous one. While the brightest pane is sitting on the upper part of the transfer curve, the darkest pane will be on the low-gamma section.

    Figure 13 - Four Colour Charts (Adobe RGB image converted to sRGB for the internet)

    As before, the DNG file can be downloaded from: Google Drive Test Examples

    A rectangular selection can be drawn around the bottom right pane and a "Levels" adjustment layer applied, dragging down the white point to increase intensity.

    Figure 14 - Four Colour Charts - levels adjustment applied to bottom right pane

    Notice how the bottom right pane is much more colour saturated and more contrasty than the top left one. The "Levels" adjustment is not altering the saturation but it is making visible the saturation that is already there.
    This demonstrates that the increased saturation caused by low-gamma affects all colours, not just blue.

    Back to the Pleiades and the Orion Nebula

    We are now in a position to discuss the astro-images presented at the beginning of this article. Here are those images presented again. What is the difference between Workflow 1 and Workflow 2?

    Figure 1 - Orion Nebula - Workflow 1 (Adobe RGB image converted to sRGB for the internet)

    Figure 2 - Orion Nebula - Workflow 2 (Adobe RGB image converted to sRGB for the internet)

    Figure 3 - Pleiades - Workflow 1 (Adobe RGB image converted to sRGB for the internet)

    Figure 4 - Pleiades - Workflow 2 (Adobe RGB image converted to sRGB for the internet)

    The images produced by Workflow 1 show the symptoms of excessive colour saturation and bluing of the overall scene. Therefore it won't surprise you to learn that they used the default settings of ACR (i.e. process version 2012) and that the light pollution was subtracted partly using the Blacks slider in ACR and partly using the black level in Photoshop curves.

    There's a slight twist, however. Using ACR, it was possible to save a single Canon 600D exposure as a DNG file. This DNG file contains all the Canon 600D specific camera parameters that enable ACR to open the file and display it as colour-correctly as possible. The raw data within that DNG file was replaced with stacked data, taking great care that the stacked data was just a less noisy version of the original and introduced no colour biases. The Workflow 1 images above resulted from opening a DNG file created in this way.

    For Workflow 2 the same thing was done (i.e. the DNG file contents replaced with stacked data) except that the light pollution offset was subtracted before writing the stacked data into the file. So the ACR raw converter is applying the correct daylight white balance, the necessary chromatic adaptations and the internal Hue/Saturation/Value adjustments specific to the Canon 600D. Importantly, the process version 2010 implementation of the AdobeRGB colour space was used, with all sliders set to zero and the "Linear" setting for the tone curve. While the image was still open in ACR, the exposure slider was adjusted to ensure the nebulosity near Merope was sitting on the constant-gamma section of the transfer curve, so its RGB values were transformed using the ideal part of the curve. Hence the result is as colour correct as possible (for a Canon 600D) in a colorimetric sense, except for the very low data values. In particular, the colour of the nebulosity around the star Merope is exactly how the Canon 600D would render it if there were no light pollution. In that sense it represents the "true colour" of the Merope nebulosity.

    If you want to play with the DNG files yourself, they can be downloaded from the same place: Google Drive Test Examples

    PixInsight Workflow

    As a matter of interest, here are the same DNG files processed in PixInsight:

    Orion Nebula - My PixInsight Workflow in sRGB

    Pleiades - My PixInsight Workflow in sRGB

    The overall result is fairly similar to the images produced by ACR Workflow 2, but in both cases the result is slightly more blue.
    The difference is due to a combination of two factors:
  • I use the DXO CameraRaw to sRGB colour matrix from the "Color Response" tab here: DXO Canon 600D Measurements
  • ACR additionally applies some non-linear camera specific colour adjustments saved as a table of Hue/Saturation/Values within the DNG file as some kind of Look-Up-Table (LUT).

    This needs a bit more analysis on my part, comparing Canon Digital Photo Professional, Photoshop/Lightroom/ACR and DXO PhotoLab against "developing by hand" in PixInsight.

    A complete description of the workflow I use in PixInsight is beyond the scope of this current article and will be the subject of a future one, once I have managed to completely reconcile the differences.
    Anyway, for those interested here are the main steps:
  • Subtract bias (2048 for Canon 600D)
  • Demosaic (i.e. create colour image)
  • Apply daylight white balance
  • Multiply by DXO colour Matrix
  • Subtract light pollution
  • Apply scalar multiplier so the green channel occupies the whole 16bit data range (this will clip the top end of the red and blue channels)
  • Apply "fake gamma" of 2.2 (this adjusts the colours in each pixel in the same way as gamma but without actually applying a gamma data stretch)
  • Apply the (colour preserving) Arcsinh Stretch. I used a stretch of around 400x for those example images.
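For illustration, the final two steps might be sketched as follows. This is my own simplified reading of those steps, not the actual PixInsight implementations; in particular the luminance estimate and the arcsinh parameterisation are illustrative:

```python
import numpy as np

def fake_gamma(img, gamma=2.2, eps=1e-12):
    """Adjust each pixel's colour ratios as a gamma curve would,
    but rescale so the pixel keeps its original (linear) luminance,
    i.e. no actual data stretch is applied."""
    lum = img.mean(axis=-1, keepdims=True)           # simple luminance estimate
    encoded = np.clip(img, 0, 1) ** (1 / gamma)      # gamma changes the channel ratios
    enc_lum = encoded.mean(axis=-1, keepdims=True)
    return encoded * lum / np.maximum(enc_lum, eps)

def arcsinh_stretch(img, stretch=400.0, eps=1e-12):
    """Colour-preserving stretch: every channel of a pixel is multiplied by
    the same factor, so the R:G:B ratios are unchanged."""
    lum = img.mean(axis=-1, keepdims=True)
    factor = np.arcsinh(stretch * lum) / (np.maximum(lum, eps) * np.arcsinh(stretch))
    return img * factor
```

Because the stretch multiplies all three channels of a pixel by the same factor, faint structures are brightened without the saturation shifts seen in the ACR examples above.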


    Conclusion

    The above discussion has shown how two limitations of the ACR raw converter lead to colour inaccuracies in typical astro-images.

    The ACR limitations are:
  • It seems to be impossible to subtract the light pollution from the linear data before the gamma curve of the (non-linear) colour space is applied
  • No process version within ACR properly implements a constant gamma curve. Each one has a low-gamma section for low data values.

    The limitations lead to the following effects that are often seen in astro-images:
  • The faint structures pulled out of the shadow regions have high colour saturation.
  • Subtracting typical levels of light pollution from non-linear data results in a bluing of the data, especially in areas of low scene intensity.

    Does this mean that ACR should not be used in an astro-image processing workflow? Not at all: go ahead and use ACR if you like the convenience. Just be aware that the results may not have the colour fidelity you were hoping for.

    Final Comments

    It is possible that there exists a Lightroom/Photoshop/ACR workflow which reduces the impact of the issues observed in this analysis, but I'm not very hopeful. It's certainly an area for further investigation.

    As for myself, I'll continue to use a more traditional astro image processing workflow, avoiding Adobe Camera Raw.

    Useful Links

    Google Drive Test Examples
    Photoshop Colour Preserving Stretch

    Page Last Updated: 1 March 2018