I too can confirm that the issue is related to the Apple Neural Engine, or at least to the combination of it and PhotoLab’s algorithms.
The first picture is processed with DeepPRIME XD and the Neural Engine, the second with the Apple M1 graphics processor. I have to admit that the color shift can’t be seen at this reduced size, but if you zoom in you can clearly see the previously mentioned strange pattern in the sky, which is gone when using the M1 GPU.
So using the M1 GPU seems to solve the problem for now, but since GPU processing takes more time than the Neural Engine, DxO should provide an update ASAP.
Btw, any official comments regarding this issue? That would be highly appreciated.
EDIT: OK, due to the compression of the images you can’t see what I was talking about.
We can point fingers all over the place. But at the end of the day, I bought a product from DxO at a higher-end price for photo editing, not from Apple. This is also a feature of the product. So as an end user, I just expect to get what I paid for. I don’t care who fixes the problem or who created the problem.
Ventura had not yet been released when PL6 was released.
Apple quite often changes things at the last minute, so even if the late PL6 betas or the RC were tested on the Ventura betas, that doesn’t guarantee it will perform 100% identically on the final Ventura release.
We will have to wait for DxO devs to do their magic.
Today I upgraded to Ventura, thanks in part to the information in this thread. I only processed 5 photos, but made some interesting observations.
With all photos having DeepPRIME turned on, I exported with the default (Neural Engine) setting and did not detect any colour cast.
I then selected the GPU and exported again. I can detect a very slight change in some yellow in one shot, but only by quickly flipping between the two versions.
Export with GPU was faster than with Neural Engine!
I suppose that export times depend on the number of parallel exports and on the image customising applied. With the No Correction preset, my M1 MacBook Air 2020 (8+8) exports 60 images in 60 seconds, or 1.29 GiB/minute, regardless of whether I set 4 or 8 parallel exports on Neural Engine/GPU/CPU. I guess that No Correction only uses the CPU.
Hmm, did you use DeepPRIME XD or just the standard DeepPRIME version? Either way, your findings seem a little strange to me, and I can’t reproduce them.
In my case (14″ MacBook Pro M1, 16 GB RAM), the export with the Neural Engine is far quicker (14 s for a single image, Lumix G9 raw file) than with M1 GPU acceleration (27 s).
I repeated the process with 5 images and got the same result: 45 s for the Neural Engine, 107 s for the GPU. So in my case the Neural Engine export is always faster. Btw, I allowed 8 parallel exports.
In addition, I don’t see any color change with DeepPRIME or DeepPRIME XD when exporting with GPU support; I only get the issue when using DeepPRIME XD in combination with the Neural Engine.