Section 4 of this article (Working Space vs Output Space) talks about the importance of using the least restrictive colour space in your post-processing software … which leads me to the following questions:
(As made clear in this article) a specific colour space is not “baked-in” to a RAW file. However, EXIF data does include the colour space prescribed for the camera to use (for JPEG output). Does PL take any notice of the colour space info in the EXIF data … OR does it always work in its own natural working space?
Which colour space is PhotoLab using as its (natural) working space?
The article recommends the ProPhoto 16-bit colour space as the best colour space to use for post-processing work … Is this, perhaps, what PL is using as its working space?
When exporting to Nik Color Efex Pro, which ICC profile should we be using for best results?
When “Export(ing) to application” we have the (advanced setting) option to specify a custom ICC profile: Is the ProPhoto colour space available as a custom profile?
I decided long ago to work in sRGB. Fewer unexpected color changes along my workflow.
Your monitor needs to handle the color space correctly (most are sRGB).
Coming down from larger colorspaces: even if your monitor is calibrated and displays the ProPhoto colorspace correctly, once you then go to AdobeRGB and the end result is an sRGB JPEG, you keep correcting at the borders of the colorspace because of the strange color effects that occur when the gamut isn’t clipped equally across the “RGB” channels.
See the “reddish glow on raw’s from LR to PL” post.
ProPhoto and AdobeRGB are only useful when your monitor/graphics card can handle them and you are aiming at printing the end product, because for viewing on a 4K TV it is still sRGB, and you need to convert/calibrate all images to get the correct color palette.
(edit: if you don’t have a monitor that is calibrated and fit for wider-than-sRGB gamuts, you are fiddling with your eyes closed in muddy water, trying to assemble a device under water. That’s fine as long as you keep it under water, but it can be a surprise when it emerges. Adjusting color in a colorspace you can’t see fully and correctly is the same method.)
So yes, AdobeRGB in PL is better and wider, but I narrowed things down because viewing on an FHD TV is my main purpose and I edit on a non-calibrated FHD monitor, so my colors are “a guess” anyway.
BUT, for positioning DxO PL with more professional customers, ProPhoto as the maximum working colorspace could help a lot to avoid headaches when they transfer raw files between Adobe products and DxO PL. And they are proof-printing and delivering prints on several media, which needs a wider gamut/color profile.
So for that matter, extending the default widest workspace from AdobeRGB to ProPhoto RGB could be a good decision.
So the internal color working space of DPL is Adobe RGB. That is, the RAW data are converted to this color space. I will duplicate here the answer I made to Wolf in the thread mentioned above because it’s more appropriate to discuss this issue here…
I’m really wondering why ProPhoto was not chosen instead of Adobe RGB. Of course, we do know that most output devices are unable to display all the colors of the ProPhoto workspace. That’s not the problem. However, computations made in a wider color space and using 16-bit depth will be more accurate. Even if the data are eventually converted to a smaller color space at the end of the process (export, printing), it’s better to make the intermediate computations as accurate as possible. In any engineering computation, rounding should occur only at the end. It seems that DxO has chosen to round at the very beginning of the process. This is surprising and unusual when compared with other similar software.
I guess that using ProPhoto, or something like the Melissa RGB color space of Lightroom, wouldn’t have caused any particular difficulty when developing DOP/DPL. DPL could have been a good opportunity to switch to a more “classical” choice. In the DOP era, I can’t remember having read clear statements about this, with one exception in 2012 when a staff member wrote this very funny post:
In November 2012, Andy from DxO technical support wrote this:
DxO Optics Pro uses Adobe RGB as its internal working color space. As I pointed out in my previous email, you need 32-bit TIFFs for Prophoto color space. Adobe RGB is about right for 16-bit TIFFs and 16-bit TIFFs are far too small for Prophoto RGB. 32-bit TIFFs do not exist yet. What you have read on the various forums is incorrect. DxO will give you output files converted to ProPhoto color space if you wish it. However, you need to do a little more reading because a 16-bit TIFF file is not enough to hold all this information, much of it cannot be displayed on anything because it exceeds the capability of human vision, and requires a 32-bit TIFF to hold it all. We do not have 32-bit TIFF files yet, the best most expensive monitors cannot yet display the full Adobe RGB gamut, and some of the “colors” in the full Adobe RGB gamut are invisible to humans. You can do this, but you will lose information rather than have something more accurate. See the attached.
Obviously, he still had a lot to learn about color management.
It’s also worth noting that even if we can export from DPL using the ProPhoto workspace, the result will not be different from an Adobe RGB export because the internal data are already limited to that color space. Again, it’s not a matter of reproducing those colors. It’s about the accuracy of computations.
I may be wrong but I think there’s something inconsistent with this approach.
And for those who are asking “Do I need such accurate computations?”: accurate computations mean fewer artefacts and better color transitions. Let’s take a simple example…
Take a number with one decimal place, say 2.5, and multiply it 10 times by another number with one decimal place, say 3.8, with the constraint that the stored result must be an integer.
Method #1: the intermediate result is rounded to an integer value at each step. The result will be 1647258.
Method #2: the intermediate results are not rounded; only the final result is rounded to an integer value. The result will be 1569553.
Quite a big difference! That’s why all computations in image processing should be as accurate as possible, whatever the target device. Only when “realizing” the image, that is, when outputting it to some device or printing it, should the conversion to the color space suitable for that device take place.
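For anyone who wants to check this, here is a minimal sketch of the same calculation (the numbers 2.5 and 3.8 are just the toy values from the example above, nothing colour-specific):

```python
# Toy demonstration: rounding intermediate results vs. rounding once at the end.
start, factor = 2.5, 3.8

# Method #1: round to an integer after every multiplication.
rounded = start
for _ in range(10):
    rounded = round(rounded * factor)

# Method #2: keep full precision and round only the final result.
exact = start * factor ** 10

print(rounded)       # 1647258
print(round(exact))  # 1569553
```

The same effect, scaled down to 8- or 16-bit colour values, is what produces banding and colour shifts when a converter rounds too early.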
Even if there are not that many devices that can display colors outside of AdobeRGB, there is no good reason I can think of to restrict the working color space to AdobeRGB. It only means less precision for color calculations inside PhotoLab, and inside Affinity Photo when using 16-bit TIFFs exported from PL. I think it is just a historically grown decision.
I couldn’t agree more.
At some point DxO might take this into consideration for a future major release…
Thank you, I will read it too.
I found some help about color management here (in French here).
The author of that site is Arnaud Frich, a French photographer.
I am not that well-informed about colorspaces, but this much I can grasp.
Is there a difference between the calculation colorspace (the one in which the raw information is converted to “pixel RGB”) and the viewing/editing colorspace setting (the one you “see” on your screen/monitor)? After a little reading: as far as I understand, the “frame” which you lay over the horseshoe of the human visual colorspace moves within the colorspace the raw converter can handle.
So the clipping of colors is not physically gone; it is just not shown on the display/viewing device. Changing the exposure value to avoid highlight clipping, or lifting shadows to get more detail, actually brings “color values” from outside your chosen colorspace inside this “frame”.
Which brings up the next question about the blinkies (the sun and moon clipping warnings): if I change the display colorspace and I get blinkies in sRGB but not in AdobeRGB, then you get false information if you export in sRGB (the blinkie-free preview is not clipping-free in the exported file). Or are the blinkies based on the conversion colorspace? With white and exposure corrections, the light-blue color warning didn’t change between the two colorspaces, and the color saturation warning didn’t show a difference either. So the warnings seem not to be linked to the viewing colorspace of the device setting.
You have:
- the base (maximum) colorspace used for the raw-to-pixel conversion, which I now believe is Adobe RGB (a bigger one would be better, for more precise color calculation in this conversion process, because you can’t make/export a bigger colorspace than the conversion colorspace);
- the working colorspace setting, i.e. the viewer’s workspace in which you edit the colors (we can’t set this ourselves; correction: it’s in the display preferences, under “ICC profile used for display”);
- the output colorspace setting: original, sRGB, AdobeRGB or custom (chosen when exporting a file, within its possibilities).
I am not sure, but I think it’s true that when you shoot raw and set the camera menu to sRGB instead of AdobeRGB, you only limit the JPEG’s colorspace, not the raw file, because the raw isn’t “color” yet: just data about the light measured by the sensor during exposure.
So the “Original” setting in DxO only refers to the camera setting in the EXIF data for output, not to the actual workspace the raw converter works in.
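If you want to see what your camera actually wrote, here is a minimal sketch using Pillow (the file name is hypothetical; the EXIF ColorSpace tag only describes the in-camera JPEG rendering, not the raw data):

```python
# Read the EXIF ColorSpace tag to see what the camera was set to.
# This tag describes the in-camera JPEG rendering only; raw sensor
# data has no colour space baked in.
from PIL import Image

EXIF_IFD = 0x8769    # pointer to the Exif sub-IFD
COLORSPACE = 0xA001  # "ColorSpace": 1 = sRGB, 65535 = uncalibrated

img = Image.open("example.jpg")  # hypothetical file name
value = img.getexif().get_ifd(EXIF_IFD).get(COLORSPACE)

if value == 1:
    print("Camera colour space: sRGB")
elif value == 0xFFFF:
    print("Uncalibrated (on most cameras this means Adobe RGB)")
else:
    print("No ColorSpace tag found")
```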
If you start the Nik Collection plug-ins from within Affinity Photo, the previews are not colour-managed. If you start them from Ps or Photoshop Elements, or in standalone mode, they work fine with ProPhoto 16-bit TIFF images. It’s an often-reported bug in the Affinity Photo plug-in engine (it affects all the plug-ins I use, not just the Nik Collection). I reported it to them two years ago and it still hasn’t been fixed. Hopefully version 1.7 will address that issue.
After reading this, I think we are mixing up color space with bit depth a bit:
Given a constant bit depth and an image whose colors all stay inside the AdobeRGB range, there should actually be fewer rounding errors than if ProPhoto RGB were used. When the same number of bits has to encode a wider range of colors, the step between adjacent code values becomes larger, and it becomes more likely that two nearby colors are rounded to the same value in the output. ProPhoto allows covering more of the visible spectrum, while AdobeRGB allows finer gradients and more precise calculation in the region where the two color spaces intersect.
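The trade-off is easy to see with a toy one-dimensional analogy (this is not real colorimetry, just an illustration of step size vs. range): spreading the same 16-bit code values over a wider range leaves bigger gaps between adjacent encodable values.

```python
# Toy 1-D analogy: the same 2**16 code values spread over a wider
# range leave larger gaps between adjacent encodable values.
def roundtrip(x, lo, hi, bits=16):
    """Encode x in [lo, hi] with 2**bits levels, then decode it."""
    levels = 2 ** bits - 1
    code = round((x - lo) / (hi - lo) * levels)
    return lo + code / levels * (hi - lo)

x = 0.123456789  # arbitrary value inside both ranges

err_narrow = abs(roundtrip(x, 0.0, 1.0) - x)  # "AdobeRGB"-like range
err_wide   = abs(roundtrip(x, 0.0, 2.0) - x)  # "ProPhoto"-like range

print(err_narrow)  # ~4e-06
print(err_wide)    # ~1e-05: the wider range quantizes more coarsely
```

On average, doubling the encoded range doubles the quantization error at the same bit depth, which is exactly the gradient argument above.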
As you can see Adobe RGB (the red plot) doesn’t completely encapsulate the gamut my camera is capable of. This is comparable to Bill Claff’s findings with respect to an older dSLR (Nikon D70). His conclusion is quite clear:
To preserve the widest range of colors I recommend using ProPhoto RGB as the working color space.
Rendering intent will take care of gamut mapping for the output device, but out-of-gamut warnings and soft-proofing can help achieve an optimal edit.
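As an illustration of how such a warning can be computed (a sketch only, using the standard D65 matrices for linear RGB and ignoring gamma encoding and rendering intent): convert the colour to the destination space and flag any channel that falls outside [0, 1].

```python
# Sketch of an out-of-gamut check: convert a linear Adobe RGB colour
# to linear sRGB via XYZ and flag channels outside [0, 1].
import numpy as np

ADOBE_TO_XYZ = np.array([[0.5767, 0.1856, 0.1882],
                         [0.2973, 0.6274, 0.0753],
                         [0.0270, 0.0707, 0.9911]])

XYZ_TO_SRGB = np.array([[ 3.2406, -1.5372, -0.4986],
                        [-0.9689,  1.8758,  0.0415],
                        [ 0.0557, -0.2040,  1.0570]])

def outside_srgb(adobe_rgb_linear):
    """True if a linear Adobe RGB colour falls outside the sRGB gamut."""
    srgb = XYZ_TO_SRGB @ (ADOBE_TO_XYZ @ np.asarray(adobe_rgb_linear))
    return bool((srgb < 0.0).any() or (srgb > 1.0).any())

print(outside_srgb([0.2, 0.8, 0.2]))  # True: saturated green clips in sRGB
print(outside_srgb([0.5, 0.5, 0.5]))  # False: neutral grey fits everywhere
```

A soft-proofing “blinkie” is essentially this test run per pixel against the chosen output profile.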
In my eyes, the color workspace is only fixed after exporting out of the raw converter, as JPEG, TIFF or DNG.
Before that it’s changeable, because the conversion happens from a camera colorspace which is cut/clipped by the colorspace selected in the application, and you can change that in some apps.
So the Adobe RGB camera setting gives some extra “room” when processing the OOC JPEG. Maybe I’ll change my camera setting to AdobeRGB.
The question I have is: is PL’s AdobeRGB cut out of the camera’s colorspace, and is the outer section (color data that falls outside AdobeRGB) thrown away, not recoverable when fiddling with EV, hue, saturation and contrast? Because if so, then ProPhoto RGB as the base workspace is indeed a much better choice: if data is lost at raw conversion in a smaller colorspace, the clipped data is gone.
Still, I’m convinced that with my non-calibrated sRGB screen I won’t see any difference until I print something.
This is NOT my area of expertise, so I cannot say for sure, but I would say, logically speaking, that this must be the case … because AdobeRGB is smaller than the camera colorspace.
This is probably only an issue for “future-proofing”, though (in case of future technological advances), as we’re not likely to see any difference in colour detail on our screens (according to my understanding).