Impossible to export with AI masks

I am using a recent Dell laptop with Windows 11. I created AI masks and they work, but it is impossible to export any picture when there is an AI mask (manual or predefined): an error occurs each time.
The graphics card is an NVIDIA RTX 4050, but the bug also occurs when I select the CPU.
I am using PhotoLab 9.2.1.

And what is the GPU (graphics card) VRAM amount? 4 GB? 6 GB?
Also, is the NVIDIA driver the correct one?

Graphics card: RTX 4050 Laptop GPU with 6 GB memory
Driver: 32.0.15.7683, date 16/06/2025

I can sometimes export one picture, but I get an error when exporting another one.

You may try the NVIDIA 581.57 Studio driver or the 581.80 Game Ready driver. Some of the most severe issues were fixed by 581.57, but some still remain. Studio drivers are released more or less monthly, so a new one can be expected soon – 581.57 was released on Oct 14.


You also write: “but the bug occurs also when I select the CPU.” Hmm, pretty strange, as in ‘CPU only’ mode everything should work (at least about 32 GB of system memory is needed, ideally with as much of it free as possible). I use PL9.x in CPU mode a lot, and so far everything works. You could watch the system memory usage in Task Manager.

Regarding the GPU VRAM amount (6 GB) - I wrote a comment a few days back about the expected GPU VRAM usage. I suggest reading through the link.

But the TL;DR from that:

Described with one example:
You use ‘manual’ AI masks, DP3 NR for export, and a GPU with 6 GB VRAM.
GPU VRAM usage: PL client + ‘manual’ AI mask + DP3 export + Windows/other apps need approx. 5.5 GB of GPU VRAM. ‘Math’: 0.5 + 2.5 + 2 + 0.5 → 5.5 GB.
As the GPU in this example has 6 GB VRAM → you can be fine. Maybe on the ‘edge’ of the VRAM amount, but in general it can work.

But only IF you use ONLY manual AI masks (not pre-defined masks) and ONLY if the NR is DP3 (not DP XD2s).

Pre-defined (keyword, e.g. ‘Sky’) AI masks use a lot more GPU VRAM than ‘manual’ AI masks, and DP XD2s NR uses more GPU VRAM than DP3 NR → so it may not fit (definitely will not fit) into the 6 GB of GPU VRAM. It may need something like 10–12 GB of GPU VRAM.
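The budget arithmetic above can be sketched as a quick check. Note: the per-component GB figures below are the rough estimates quoted in this thread, not official DxO numbers, and the helper `fits_in_vram` is just an illustrative name:

```python
# Rough GPU VRAM budget check using the approximate figures from the
# post above (all values are estimates, not official DxO requirements).
DP3_MANUAL_MASK = {            # export with DP3 NR + 'manual' AI mask
    "PhotoLab client": 0.5,
    "manual AI mask": 2.5,
    "DP3 export": 2.0,
    "Windows / other apps": 0.5,
}

def fits_in_vram(budget_gb: dict, vram_gb: float) -> bool:
    """True if the summed component estimates fit within the card's VRAM."""
    return sum(budget_gb.values()) <= vram_gb

total = sum(DP3_MANUAL_MASK.values())
print(f"estimated need: {total} GB")          # 0.5 + 2.5 + 2.0 + 0.5 = 5.5 GB
print(f"fits in 6 GB:   {fits_in_vram(DP3_MANUAL_MASK, 6.0)}")  # True, barely
```

Swapping in a pre-defined mask or DP XD2s NR raises the mask/export terms well past this, which is why the thread estimates 10-12 GB for that scenario rather than 6 GB.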

More details are in there (the non-TL;DR version).