Do I need to replace my NVIDIA Quadro P620 to use DeepPRIME or DeepPRIME XD/XD2s?

That’s great. I am glad you are on top of this. Don’t forget the heat generated by your processor, especially if it is very near to the new graphics card.

Mark

In the process of taking steps to ensure I can use PRIME, DeepPRIME, and DeepPRIME XD/XD2s on my Windows 11 computer, I made a couple of mistakes and miscalculations:

  • In my initial post I stated: “I was disappointed to get the ‘Low GPU memory detected. At least 1024 MB of free GPU memory are required for DeepPRIME or DeepPRIME XD/XD2s.’ message. And those options are greyed out in the app.” Technically they were not “greyed out”; the links were simply not active. I soon found the reason: I was attempting to process a JPEG file. When I opened a NEF (Nikon’s RAW) file, those links were active. Yes, with the tools using the CPU rather than the GPU, processing ran extremely slowly, but it ran. (A quick way to check how much GPU memory is actually free is sketched just after this list.)
  • I ordered an ASUS Dual NVIDIA GeForce RTX 3050 6GB card and a 500 watt PSU to replace the 260 watt PSU. While waiting for them to arrive, I learned that the ASUS card only consumes 70 watts. Since the card arrived a couple of days before the PSU was due, I went ahead and installed it. (This card does not require a separate power source, which the Dell 260 watt PSU lacks a connection for; it derives all of the power it requires from the PCIe slot alone.) It ran fine, although I did not do any benchmarking. The 260 watt PSU was adequate to power the card, and no significant increase in heat was observed anywhere in the case.

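For anyone who runs into the same “Low GPU memory detected” message, a quick way to see how much GPU memory is actually free is sketched below. It assumes an NVIDIA card with nvidia-smi on the PATH (it ships with the driver); the 1024 MB threshold is simply the figure quoted in the PhotoLab warning above.

```python
# Quick check of free GPU memory, assuming an NVIDIA card with nvidia-smi installed.
# The 1024 MB threshold is the figure quoted in the PhotoLab warning message.
import subprocess

result = subprocess.run(
    ["nvidia-smi", "--query-gpu=name,memory.free", "--format=csv,noheader,nounits"],
    capture_output=True, text=True, check=True,
)

for line in result.stdout.strip().splitlines():
    name, free_mb = [field.strip() for field in line.split(",")]
    verdict = "enough" if int(free_mb) >= 1024 else "below the 1024 MB threshold"
    print(f"{name}: {free_mb} MB of GPU memory free ({verdict})")
```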
My points are that, being a very patient person, I might have been satisfied with the performance of those Denoising tools using just the computer’s CPU, and that if I were to do it over, I would have waited to confirm that the more robust PSU was definitely needed before ordering it. (I might have saved $270. Hey, I’m on a tight budget. Before I retired, I wouldn’t have given the expense a second thought.) As it stands, the 500 watt PSU is installed and the GPU is working great, and I will continue to monitor the CPU, GPU, and other temperatures with HWMonitor as mwsilvers recommended. Thanks to all who joined this discussion and shared their experiences with me. I greatly appreciate it.


@jmcphee I am sorry that we didn’t research the RTX 3050 card more thoroughly and discover that, while certainly slower than the xx60-series cards, it needs about half the power, at least in the ASUS version you purchased, and runs without the need for the additional GPU power connector.

While I feel that, with only a CPU, you would need extreme patience every time you exported an image or used the Loupe, that is always the individual’s choice.

My tests with the 1050 Ti (4GB) and 1050 (2GB) cards produced figures that I published here.

Thank you for publishing the outcome of your purchases. While I feel that the RTX 3050 was a good investment, the power supply was perhaps a step too far; it simply shouldn’t have been necessary to upgrade it at all.

This was published in late October 2023, and I found it by asking the question “RTX 3050 does it need a GPU power connector”, prompted by your post.

Regards

Bryan

It’s never a bad idea to upgrade a PSU. The nominal power requirement for the 3050 may seem low, but several tests have shown frequent peaks way above the “average” nominal value.

A more robust PSU can cope with those transient peaks much better, without risking stability or hardware failures, and probably more efficiently (in terms of heat), thanks to the headroom.

@Lucabeer I understand what you are saying, but when money is short, prudent spending is called for. That said, the following suggests that it might be too tight for comfort!

I ran a test with my 5900X, with an RTX 2060 X2, 3 HDDs etc., exporting 40 Egypt images with PL8.1 and DP XD2s. The machine idles at about 135 watts according to a power meter.

The CPU isn’t very busy during the export, but the machine reached a maximum of 367 watts, easily within the 650W of the Seasonic Platinum power supply. The run looked a bit like this, with the GPU taking a maximum of about 161.6 watts (its TDP rating is 160 watts).
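If you want to watch the graphics card’s own draw during an export, rather than the whole machine at the wall, a rough sketch along these lines works; it assumes an NVIDIA card with nvidia-smi on the PATH and simply polls the power and utilisation readings once a second, much like watching the sensors in GPU-Z or HWMonitor.

```python
# Rough sketch: poll nvidia-smi once a second and track the peak GPU board power.
# Assumes an NVIDIA GPU with nvidia-smi on the PATH; stop it with Ctrl+C.
import subprocess
import time

peak = 0.0
try:
    while True:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=power.draw,utilization.gpu",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        first_gpu = out.splitlines()[0]                      # first GPU only
        power, util = [field.strip() for field in first_gpu.split(",")]
        peak = max(peak, float(power))
        print(f"{time.strftime('%H:%M:%S')}  {power} W  {util}% GPU  (peak so far {peak:.1f} W)")
        time.sleep(1)
except KeyboardInterrupt:
    print(f"Peak GPU power seen: {peak:.1f} W")
```

Start it just before the export and stop it afterwards; bear in mind it only sees the graphics card, so the wall meter is still the figure that matters for sizing the PSU.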

I do take the power supply very seriously, as should others, but the newer RTX 3050 does not require the additional GPU power cable and is slower than the original 3050, which did require the GPU connector.

The result of pairing my 2060 or my 3060 with the 5900X is that the GPU bottlenecks the CPU in both cases, so the 5900X is not running anywhere near its maximum speed and consumes a lot less power as a consequence.

I repeated a shorter test with the [M]aster and 10 VCs of the Egypt image with PL7.10 and got this from the 5900X:

The i5-8500 of @jmcphee is only slightly more powerful than my i7-4790K, so I ran a test on the i7, which has an RTX 2060 X1 and 4 HDDs. The idle power was about 130 watts, which went up to 338 watts while processing the [M]aster plus 10 VCs of the Egypt image with PL7.10 (PL8 is not installed on that machine).

I ran the i7 test twice because the first run had System Explorer updating the graph every 5 seconds rather than 1 second. The i7 is fitted with a 550W Seasonic Platinum power supply.

Provided that the RTX 3050 that @jmcphee purchased lives up to its 70 watt claim, it should be taking about 338 - 90 = 248 watts with the Egypt images (Fuji GFX 50S), the 90 watts being roughly the difference between the 2060’s 160 watt rating and the 3050’s 70 watts.

I will admit that is too tight for comfort on the original 260 watt PSU (with the Egypt images), but the RTX 2060 is a faster card than the RTX 3050, so the CPU will be throttled by the 3050 a bit like my 5900X is throttled by the 2060, potentially lowering the peak consumption a bit?
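For anyone wanting to repeat that estimate with their own numbers, the arithmetic is trivial; here it is as a tiny sketch, using the figures above and assuming that swapping the GPU changes the peak wall draw by roughly the difference between the two cards’ rated powers.

```python
# Tiny sketch of the PSU headroom estimate above. Figures are the ones quoted in
# this thread; the assumption is that changing the GPU shifts the peak wall draw
# by roughly the difference in the cards' rated board power.
peak_with_2060 = 338   # watts at the wall during the Egypt export on the i7
rated_2060 = 160       # RTX 2060 rated board power
claimed_3050 = 70      # claimed draw of the 6GB RTX 3050

estimated_peak = peak_with_2060 - (rated_2060 - claimed_3050)   # 338 - 90 = 248 W
for psu_watts in (260, 500):
    margin = psu_watts - estimated_peak
    print(f"{psu_watts}W PSU: about {margin}W of headroom over an estimated {estimated_peak}W peak")
```

With the original 260 watt Dell supply that leaves only a dozen watts or so of headroom on this estimate, which is why I call it too tight for comfort; the 500 watt unit leaves plenty.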

I use the Egypt images for my testing because they provide the largest load of the various benchmark images on offer.

Edit 1:-

But the i5-8500 has a lower power profile than my i7-4790K. The Passmark figures (multi-threaded score, single-threaded score, typical TDP) are:

  • i5-8500: 9545, 2451, 65W
  • i7-4790K: 8066, 2466, 88W
  • 5900X (for completeness): 39109, 3469, 105W

My i7 is never truly idle: even when it is doing nothing (I am typing this on another machine), there is background CPU activity varying between 25% and 32%, with the power meter showing 106 to 119W.

The PL7.8 DOP for the 11-“image” test run (the [M]aster plus 10 VCs) with the Egypt image is:
Egypte---copyright-Corinne_Vachon.raf.dop (100.8 KB)

Using a 7.8 DOP means that it should work for both PL7 and PL8, and the Egypt image can be downloaded via the forum benchmark site (DxO DeepPRIME Processing Times - Google Sheets)

or directly via https://www.dxo.com/media.dxo.com/photolab/deepprime/raw/Egypte---copyright-Corinne_Vachon.raf

Process Explorer is available here: Process Explorer - Sysinternals | Microsoft Learn.

GPU-Z is available here: GPU-Z Graphics Card GPU Information Utility.

It is worth knowing what your machine currently consumes, during export runs in particular, before embarking on any upgrade, so you can assess whether the cost must also include a power supply upgrade at the same time.

I am sure a power meter is available to buy in your location and it appears that my TAPO smart plugs also have an inbuilt power meter.

The meter I used is a much, much older variant of something like this

with the standard UK 13 amp socket, of course, and a torch might be useful if it is under a desk, etc.!