DeepPRIME XD introduces purple overlay/chromatic aberration

On 6.5 I am sometimes getting weird artifacts when using only the GPU on an M2 MacBook Pro 14.

The work with Apple that DxO reported in the release notes doesn’t seem to be very productive.

I’d put the blame firmly on Apple for that. I had a “known bug” (so I was unlikely the first to encounter it) in macOS that was fixed 842 days after I first reported the issue. That’s only a handful of users affected, probably, but I cannot imagine their developer relations are much better (in fact, I know they’re not from listening to developers’ stories).


Yes, I agree. There are many occasions when Apple does not give an F about fixing bugs or about working with developers, even large companies. That could be the case here too.

But we don’t know where the bug is: in Apple code or in DxO code. It could very well be that DxO relied on a bug, and once Apple fixed it, the fix broke the DxO Neural Engine code path.

Given the deafening silence from the DxO side about this issue, which degrades the main marketing feature of their software, I am leaning towards the bug being in DxO code.


I’m not. An app that I wrote developed a bug after a sub-version upgrade of macOS, and it meant I had to spend a not inconsiderable amount of time trying to track it down.

Apple publishes APIs, which we rely on to do the job. Those APIs are doors into a black box over which we have no control. If Apple decides to rearrange the furniture, we are pretty much stuffed.

I even have one “feature” of my app that I had to remove because the API was deprecated without any warning.


It doesn’t matter whose fault it is. PL is not developed by Apple. As a DxO customer, to me, this bug squarely falls on DxO to fix. Most companies would have developed a workaround by now, perhaps a “filter” module that analyzes and compensates for these color shifts, one that the user could easily deactivate, or a patch could remove, once Apple fixes their bug (assuming Apple sees this as a bug in the first place). I understand it is not that easy, but DxO isn’t a one-person shop.
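For what it’s worth, the kind of compensating “filter” imagined here is not far-fetched. The sketch below is purely hypothetical (it is not DxO code; the function name, thresholds, and weighting are my own invention): it estimates the average magenta excess (R and B elevated over G) in shadow pixels and subtracts it back out, fading the correction as pixels get brighter.

```python
import numpy as np

def compensate_magenta_cast(rgb, shadow_threshold=0.25, strength=1.0):
    """Hypothetical shadow magenta-cast compensation.

    rgb: float array of shape (H, W, 3), values in [0, 1].
    A magenta cast shows up as R and B elevated relative to G, so we
    measure that excess in shadow pixels and subtract it back out.
    """
    # Rec. 709 luma, used only to decide which pixels count as shadows.
    luma = rgb @ np.array([0.2126, 0.7152, 0.0722])
    shadows = luma < shadow_threshold
    if not shadows.any():
        return rgb
    mean = rgb[shadows].mean(axis=0)        # per-channel shadow means
    excess_r = max(mean[0] - mean[1], 0.0)  # R above G
    excess_b = max(mean[2] - mean[1], 0.0)  # B above G
    # Fade the correction out as pixels get brighter.
    weight = np.clip(1.0 - luma / shadow_threshold, 0.0, 1.0)[..., None]
    corrected = rgb - weight * strength * np.array([excess_r, 0.0, excess_b])
    return np.clip(corrected, 0.0, 1.0)
```

A real fix would have to work in the raw pipeline rather than on the rendered RGB, which is part of why “it is not that easy”, but the principle is the same.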

I truly believe DxO simply lacks the expertise to provide a workaround for this bug, and their silence on this whole situation speaks volumes to that fact. Either that, or they really are just waiting on Apple. It’s like waiting for the iceberg to move instead of the Titanic changing course. The Titanic being DxO and the iceberg being Apple.

Let’s face it. By the time this gets resolved, it would be time for the next greatest/latest version - PL7. I would love to be wrong about this.

Then you obviously don’t fully understand how software development works.

In this discussion and elsewhere, this has been found to be connected to the new M1 neural engine, which is entirely out of reach for non-Apple developers.

However, if you look here you will see there is a workaround, simply by switching the processing engine.


I think you are late to the conversation. And yes, I do know how software development works. I was an engineer and a software developer for a long time. I’m simply pointing out that as a developer, no matter whose fault it is, your customers are going to look to YOU (not Apple) to come up with a workaround. And it is your responsibility to find a workaround/solution for your customers.

I hope you see the name of the person who posted that workaround (as well as who/when this thread was started, for context).


I’m a novice photographer and have been greatly impressed by what DxO PhotoLab 6 can do to automatically correct and fix photos out of our Sony RX10m4. I am very new to “post-processing” photos, and waiting 10–11 seconds per 20 MB RAW for optical correction and denoising is very acceptable to me. I applied DeepPRIME to 306 photos and it took 56 minutes. DeepPRIME XD takes over 3 hours to process the same batch, but I honestly can’t see a drastic difference between the two outputs and am more than satisfied with regular DeepPRIME.

I was disappointed to learn about the color shift with Apple Neural Engine but honestly the GPU performance on my humble 16GB/512GB M1 MacBook Air is still pretty awesome, and of course faster than my older Windows 10 rig with a GTX 950. I have been considering adding an M1 Max Mac Studio or M2 Max MacBook Pro and appreciate @uncoy’s contributions to this thread. I am a little surprised the DeepPrimeXD neural engine output on my M1 is faster than uncoy’s M1 Max, although everything else is expectedly slower than the M1 Max. This is comparing the same RAW soccer photo that uncoy shared and made downloadable.

This is on Ventura 13.3.1 with PL 6.5.1 build 49.


Just to add a little more data:

The M1 Max (studio) that I recently bought is the 10c/32GPU version with 64GB of RAM.

I processed 302 images using DeepPRIME XD in “automatic” mode, with 10 images in parallel. It completed the task in 1 hr 10 min 27 s, about 14 seconds per image. For giggles, I re-ran the same images changing only to DeepPRIME, and the batch took 14 m 46 s, or just under 3 seconds per image. Because I ran this in auto mode, I can’t say for certain whether DeepPRIME used the Neural Engine, but with this speed it sure seems like it…
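Those per-image figures check out with a little arithmetic; a quick sketch using the batch sizes and times quoted above:

```python
# Sanity check of the batch timings quoted above (302 images each).
def per_image_seconds(total_seconds: float, image_count: int) -> float:
    """Average processing time per image for a batch."""
    return total_seconds / image_count

xd_total = 1 * 3600 + 10 * 60 + 27   # DeepPRIME XD: 1 h 10 min 27 s = 4227 s
dp_total = 14 * 60 + 46              # DeepPRIME:    14 min 46 s     =  886 s

print(round(per_image_seconds(xd_total, 302), 1))   # 14.0 s/image
print(round(per_image_seconds(dp_total, 302), 2))   # 2.93 s/image
```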


Yeah, in “automatic” mode it does not use the Neural Engine for now. Standard DeepPRIME is much easier to process and much faster than DeepPRIME XD.
Thank you for sharing your stats.
It seems that, with that much GPU power, the M1 Max does not even need the Neural Engine; it works just fine on the GPU.


Almost 9 months… psst… it’s time to wake up, DxO

I keep hitting the ‘check for updates’ button in vain…

I was also hoping to be added to the beta program, but while that program seems open, they may not be taking new folks.

The PhotoLab beta test program recruits participants in November/December.


Ah, that makes sense… that said, they should note that more prominently, as some of the programs are marked as “closed” and others as “open”, which leads one to believe that they may still be recruiting.

One hope I have is that the “max” number of images processed in parallel can scale with the number of cores one has…
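PhotoLab doesn’t expose such a setting today, but the heuristic is easy to imagine. A hypothetical sketch (the function and its defaults are my own invention, not anything DxO ships):

```python
import os

def suggested_parallel_images(reserved_cores: int = 2, cap: int = 16) -> int:
    """Hypothetical heuristic: scale the number of images processed in
    parallel with the machine's core count, keeping a couple of cores
    free for the UI and capping the total to limit memory pressure."""
    cores = os.cpu_count() or 4   # os.cpu_count() can return None
    return max(1, min(cores - reserved_cores, cap))
```

In practice the memory each DeepPRIME tile needs probably matters as much as core count, which may be why the current maximum is a fixed number.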

The issue seems to be fixed with the current PL 6.6 update. No color cast with ANE and Deep Prime XD anymore for my test files.


Thanks for this feedback :slight_smile:

I just updated to 6.6. Something changed: there is no circular cast all over the image anymore, but there is still purpling in the dark areas (you can see it in the images; both are processed with the Neural Engine, the first image is DP, the second is DPXD).
It seems that DxO really is working within the constraints of Apple’s bugs, and Apple is the limiting factor.
Well, at least DxO seems to be reading these forum discussions, and all our feedback was not for nothing.

Please explain further: why is Apple the limiting factor? The Lightroom AI noise reduction uses the Neural Engine without issues…

From the release notes for version 6.6, it seems there is something on the Apple side needing to be fixed:

Bug fixes
• Temporary solution for Magenta color cast on Apple Ventura while using DeepPrime and DeepPrime XD.
• Temporary solution for chessboard patterns on Apple Monterey & Ventura for Xtrans with DeepPrime XD
  o These temporary solutions will be updated when Apple delivers its own fix for macOS.