DeepPRIME XD introduces purple overlay/chromatic aberration

We can take some time over coffee to go over shared knowledge of computer architecture - I’m quite well versed in this topic.

My system is a Mac Studio, 10-core M1 CPU with maxed-out GPU/Neural Engine/memory, and on this system the CPU cores are not fully loaded, and neither is the GPU. The Neural Engine seems to be where the bottleneck is (10 simultaneous threads use about 48 GB of memory when processing R6 20-something MB RAW files), but I don’t have a diagnostic tool on this system to tell me for sure. BTW, it allows me to process up to 10 simultaneous images, but as noted elsewhere, the optimal setting is 2. The system just doesn’t seem to be stressed much by these, and that was the reason for asking the question.

Fully understand and agree. Settings beyond 2-4 don’t deliver shorter processing times. If larger numbers of parallel exports were offered, the frustration would only be bigger too :wink:
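A toy model of why extra parallelism stops helping, assuming (hypothetically) that every export has to pass through a single shared Neural Engine. The rates below are made-up illustration numbers, not measurements of DeepPRIME XD:

```shell
# Toy throughput model: throughput = min(workers * per_worker_rate, ane_capacity).
ane_cap=6        # hypothetical: the shared ANE sustains at most 6 images/min total
per_worker=3     # hypothetical: one worker pipeline can feed 3 images/min
for w in 1 2 4 8 10; do
  t=$(( w * per_worker ))
  if [ "$t" -gt "$ane_cap" ]; then t=$ane_cap; fi   # capped by the shared ANE
  echo "workers=$w -> ${t} images/min"
done
```

With these made-up rates, throughput plateaus at 2 workers; beyond that, extra workers only add memory pressure (roughly 4.8 GB per in-flight image, if 10 threads really use 48 GB).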

The GPU and the ANE are different units, and the latter does not show up in Activity Monitor.

Right - which is why I said that I don’t have the analysis tools on this machine to show it for sure.
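For what it’s worth, macOS does ship one low-level tool that can report Neural Engine activity: `powermetrics`. A sketch, with the caveat that the `ane_power` sampler name is what recent Apple Silicon builds of the tool expose; older macOS versions may not offer it (the command needs root and is macOS-only, so treat this as a pointer rather than something tested here):

```shell
# Sample CPU, GPU, and ANE power once per second (interval in ms); requires root.
sudo powermetrics --samplers cpu_power,gpu_power,ane_power -i 1000
```

Watching the ANE power figure while an export runs is about the closest thing to a "Neural Engine load meter" available without third-party tools.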

Just looking at the patterns, from when the CPU gets periodically loaded to when the GPU is partially loaded (you can almost watch the processing move through the different units), it just doesn’t seem well optimized.

I use dual 31-inch 4K monitors with a MacBook Pro 16 Max. No slowdown here. What I do with my 4K monitors, though, is run them at 1920x1080. I would prefer 2560x1440 scaling, but that quadruples the amount of memory and calculation required to render the screen. 1920x1080 is far more efficient, as it’s a simple doubling of pixels, which is Apple’s own default Retina rendering on its own screens.

macOS scaling is interesting in that the default 1920x1080 scaling does not simply double pixels. What it does in the case of your 4K screens is render all UI elements at double the size but at full 4K resolution. Video players, web content, and photo previews are also rendered at double the size but in full 4K. Problems occur when you use a resolution other than the screen’s native one (way too small) or the default 2x: it makes things noticeably blurry, and it slows things down by scaling everything to fit the screen.
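To put numbers on the scaling cost on a 3840x2160 panel (this is just the standard HiDPI backing-store arithmetic, nothing DxO-specific): at "looks like" 1920x1080, macOS renders at exactly 3840x2160 (a clean 2x) and scans it out directly; at "looks like" 2560x1440, it renders at 5120x2880 and then downsamples to 3840x2160 every frame.

```shell
# Rendered (backing-store) pixels per frame on a 3840x2160 panel.
echo "1920x1080 HiDPI: $((3840 * 2160)) rendered pixels/frame"
echo "2560x1440 HiDPI: $((5120 * 2880)) rendered pixels/frame"
echo "relative GPU work: $((5120 * 2880 * 100 / (3840 * 2160)))% of the 2x case"
```

So the 5120x2880 backing store is indeed 4x the pixels of the logical 2560x1440 desktop, and about 1.78x the pixels of the clean 2x case, plus a per-frame downscale on top.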


Right - and this is why systems can slow down when using a monitor not being scaled to a native size. The difficult part is that ‘native’ doesn’t translate to 4K in most cases, so it’s either 2K or 5K. Sigh.