PL and RTX Video Cards

That is not really the case. It just depends on one’s priorities.

Mark

Oh no, certainly not unlimited.
I have been dragging out this decision about needing a new desktop since 2021.
That was when I bought secondhand 32GB DDR4 and an old secondhand 750 Ti.
Because Bitcoin mining and covid made everything much more expensive than a year before, I thought that after covid the hardware supply would grow and prices would drop. Oh boy, was I wrong. Even when China shut down Bitcoin mining, video cards stayed high priced.
(They are now starting to come down a bit in price.)
In a way there are two roads for buying hardware to run your software.
Road one: buy high end and try to coast along as long as possible until it hits rock-bottom performance. I did 13 years.
That's about 100 euros a year, less than 8 euros a month.
Road two: buy cheap, preferably secondhand, yesterday's high-end fast CPU setup and hopefully a decent midrange GPU… and coast along the same way until it's useless.

A secondhand i7-12700 system is around 700-1000 euros depending on who's selling and how old the system is… (Edit: forgot to mention "and its complementing other parts." My trigger to look at buying a "new" desktop was a secondhand refurbished i7-12700 system for 1250.00.)
Or do like @BHAYT: buy parts on secondhand markets and build your own from parts that a gamer has upgraded away from.

I knew I needed a new one, so I decided to spend my 13th-month salary earlier than December and use some savings, which I pay back into my savings account at 50 or 100 euros a month. I do months without buying clothes, or months without going out for dinner (just buy nice meat and other stuff and cook it yourself).
I have a DMC-G80 and I liked the G9 a lot, but never bought it because the shutter count on my G80 isn't high enough to justify a new/secondhand body.
So I'll wait for the next G90 and see if that drops the G9 price even more.

Like @mwsilvers said, it's about priorities and choices. Every euro/dollar rolls by once: into your pocket and out of your pocket. Sometimes you can make it larger, but leaving it sitting still in an account will certainly make it smaller (inflation).
So spare money needs to be invested in your life now, or in making money for your life later on.
I choose present life for now with this part of my spare money. And maybe I'll invest in Bitcoin/stocks with some other money to beat inflation, to speed up the refunding :thinking:.

@David_McA Certainly not the case. My setup at the beginning of 2023 was my i7-4790K and my son's i7-4790K, which I bought from him when he upgraded to a Ryzen Threadripper with a 2080 in 2019.

I spent too long dithering over a graphics card and got caught by the covid/mining crisis and paid over the odds for a 1050Ti 4GB card. A little later I added a 1050 2GB card bought second hand from eBay.

That's the way things stayed until the beginning of last year when I bought a 3060 GPU and put it into one of the i7s.

But I wondered how it would work in a new machine, as you do, so I bought a 5600G processor, Tomahawk B550 board and 32GB memory + new fan for about the same as I had paid for the 3060 card!!

Later I added an eBay 2060 card to one of the i7s and later still another eBay 2060 to the other i7, roughly £120 each.

A month or so back I bought a 5900X for the B550 board, which had always been my intention one day, for £228, exactly twice what I paid for the 5600G, and it is about twice as powerful. It worked with the 5600G fan, but I was a bit concerned so I bought a new one.

But the 5600G got ever so lonely in its box so I bought it a new motherboard and memory and used its old fan and fitted the 2060 from the now “spare” i7.

I am still using cases I have had for years and years and …

My machines are intended to be backups for each other in terms of storage, which certainly is getting more and more expensive.

I have been a computerholic for a very long time and it is as much my “hobby” as taking photos.

@OXiDant I owned a G80 some time ago and upgraded to a G90, trading in the G80 to part-fund the upgrade, but was very disappointed with the cropped video and even more with its habit of jumping between focus points when taking videos, though not disappointed with the improved quality of the sensor.

I traded it in for a second hand G9, £600 less the G90 trade-in, which is by far the best MFT I have owned.

The G9 Mkii is making second-hand G9 prices tumble, but I just looked and the number of second-hand G9s on the market has fallen a lot; a new one costs about £750 and the Mkii about £1300 in the UK.

Graphics cards are worse though.
Hop onto eBay and you can pick up sub-$1000 32-core gen 1 and 64-core gen 2 Epycs with a bunch of RAM already installed. The earlier generations had fairly low power requirements. You can also find 16- and 24-core gen 3 Threadripper Pros which are a lot faster per core but have fewer of them… They also use more power, but still not insanity levels: 280W, and it's a hard cap by default, not like when Intel sells you a 100W TDP processor and it draws 700W when it's boosting.

Anyway, the reason for doing this is that someday NVMes will be $10/2TB for 3-packs in the Walgreens up the street. A video card needs 16 PCIe lanes. An M.2 NVMe needs 4. Most consumer processors have 20 usable lanes. Tada, 2 things and you're done, without slowing everything else down. An Epyc / Threadripper Pro has 128 gen 3 (earlier) or gen 4 lanes. That leaves you with space to stick $10 4x M.2 slot adapters in all the PCIe slots and install 28 more of the things. You'll hit the lower-core-count processors not being able to issue enough millions of write commands per second before you hit any bandwidth issues. :stuck_out_tongue:
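For a rough feel of that lane arithmetic, here is a tiny sketch of my own (not from the post, just the numbers quoted above) that does the division:

```python
# Rough lane-budget sketch (my own illustration) comparing a typical consumer
# CPU (~20 usable PCIe lanes) with an Epyc / Threadripper Pro (128 lanes).
GPU_LANES = 16   # one x16 graphics card
NVME_LANES = 4   # one x4 M.2 drive

def extra_nvme_drives(total_lanes: int, gpus: int = 1) -> int:
    """How many additional x4 NVMe drives fit after the GPU takes its lanes."""
    remaining = total_lanes - gpus * GPU_LANES
    return max(remaining, 0) // NVME_LANES

print("consumer, 20 lanes:", extra_nvme_drives(20))    # 1 drive and you're done
print("Epyc/TR Pro, 128:  ", extra_nvme_drives(128))   # 28 drives via 4x M.2 adapters
```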

I tend towards overkill, but if you're building…

With gen 1 you stand a fair chance of finding a dual-processor model, but then you're getting into RAM-not-included territory, and at least 16 slots need to be populated on those to make them stable, so the price usually kills it.

I wouldn't touch current Intel chips with all the issues they're having with them frying themselves. The only reason those gen 1 & 2 Epycs are cheap is that gen 3 got AVX2 and gen 4 got AVX-512 (and the fact that a gen 3 and 4 already exist).

I'm not a fan of DDR5 because, AFAIK, all of the current dual-channel boards which have a second bank (aka 2 more slots, each on the same channel as another) also tank the max speed of the memory to an inbuilt limit on the processor if you populate all of them, so you might be in for a rude awakening later if you decide to upgrade, and larger DDR5 sticks are still expensive. Unfortunately the Threadripper 7000s are still current gen and will be too costly for a while (they're also power hogs). It's always kind of a bad time to buy computer stuff, but it's a really bad one every time a transition to a new memory tech happens.

DDR4 was so expensive for so long that I couldn't afford the upgrade from 32 to 64GB for 5 years after I built a machine from the first gen to use it. By that time the price had tanked so suddenly and out of nowhere that it made more sense to just stick another 128GB in there for $350. I think 64GB more would have been $300 or something.

My original 32GB of RAM for that computer was $300, the most expensive part, and I quickly learned that "rated XMP speed" means "HAHA SUCKER, we don't actually guarantee that", because mine only ever ran at base speed (2133) or slightly above when it was supposed to automatically set itself to 2666. The problem was that they'd programmed an extra, even higher speed into it, one that told the motherboard: hey, you know that 100MHz base clock you time literally everything to? I NEED IT TO BE 133. The motherboard would blindly obey, reboot 4 times trying to make it work, then reset all firmware settings unless I forced it to a low speed. It just crashed at 2666 too, so it's not like that was working. In my experience, the closer you get to finding memory without a stupid name, without RGB lights, and with a plain green PCB, the higher quality it probably is.

There's also a guy out in California on there who's an expert at soldering memory chips to boards and doing reworks, who takes the 11GB RTX 2080Ti graphics cards he gets for cheap, solders on another 11GB of RAM, and voila: a 22GB RTX 2080Ti, so you can have the cheapest high-VRAM card you can get. $500 shipped, and I'm pretty sure he warranties them for a while too.

Those are 250W cards, so if you ebayed a TR Pro 3000 series (actually the 5000s are going down too) you can get away with a cheap-these-days 850W PSU.

According to various tests and posts on Reddit they're quite a bit faster than the RTX 4060 (20-30%, without having the double-RAM mod done) despite being two generations older. I wouldn't go back any farther than that.

About your 64GB vs 32GB: my old i7-4770 doesn't use the whole 32GB. At full throttle it occupies around 22-24GB, 27GB max.

I wouldn't think of it like that. 27GB max means the OS had to evict almost all of its own data and all program data from memory. Windows has a caching system called "standby memory" (which you'll find people who don't know what they're talking about complaining about all over the internet as if it were a bad thing) that keeps recently used data, programs and parts of the OS in memory. When you have a ton of memory installed, everything else starts seeming to happen instantly because, for example, all 1000 photos you opened in the past hour are still in standby, and now you're scrolling through them without them needing to load from any storage except the fastest one in the system (the one you can install more of anyway).

On my machines, when I rip a 4K disc I just bought to transfer to the storage server (since getting a speck of dust on a 4K UHD BD seems to make it unreadable), the rip happens at the horrendous read speed of the optical drive, so about an hour. I usually just target the hard drive, because standby makes nothing after that matter. My next step is usually to demux the audio track and mix the Rifftrax audio into it, if that's why I bought it. FFmpeg begins displaying the demux speed (a multiple of realtime) in scientific notation at this point because it's copying straight out of RAM… and I'm talking about discs up to 100GB with that. If you only ever hit 27GB of RAM used, think of how many photos would fit in the extra 96GB.
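If you want to see the standby list doing its thing, a minimal sketch of my own (the path below is just a placeholder): read a big file twice and compare the timings; the second pass comes straight out of RAM.

```python
# Minimal sketch (not from the post above): time two reads of the same large file
# to watch the OS file cache ("standby memory" on Windows) at work.
import time

PATH = r"D:\rips\some_large_disc.mkv"   # hypothetical path; point it at any multi-GB file

def timed_read(path: str) -> float:
    start = time.perf_counter()
    with open(path, "rb") as f:
        while f.read(64 * 1024 * 1024):  # read in 64 MiB chunks until EOF
            pass
    return time.perf_counter() - start

cold = timed_read(PATH)   # first pass: data comes from the drive
warm = timed_read(PATH)   # second pass: data is served from the standby list / page cache
print(f"cold read: {cold:.1f}s, warm read: {warm:.1f}s")
```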

Just sayin’.

Yep, been there too. And I put the new-desktop project on hold because of that.
Video cards were nearly 50% of the total cost and often out of stock too.

Yes, the photographic sensor of the G90 was better, but the total package was not appealing enough to hop from a nearly new G80. DxO, with DeepPRIME, took another step in stops, so ISO 6400 became possible.

The G9 is great but also bigger and heavier, and I like the form and size of the G80: big enough for my hands, small enough for easy travelling, and less pronounced, less visibly 'pro' in your face when you're in a crowd or a museum.
That's why I still use the good old 14-140mm. (Mine had some AF misses this holiday, by the way; I can't place the origin of the failure, body or lens. I think the other lenses didn't have this issue, but it's always when contrast is low because of haze or the amount of light, so it could be a spec limitation rather than a failure.)

The 50-200mm f2.8-4 is great in IQ and size/weight, and even in secondhand pricing. So I'll pass for now.

I hope that the G90 Mk2 is such a leap forward that it could trigger me.
Not too soon, hopefully, because my hobby money has dried up for several months/year(s) after buying tools for renovating and the new desktop :rofl:

So my progression was: the G7 with the 14-140 (28 to 280) zoom lens with Panasonic cashback, and my youngest son gave me his 17mm f1.8 Olympus lens for Christmas; no IBIS.

Then onto the G80 with the 12-60 zoom lens and Panasonic cashback, and IBIS…

Added an FZ200 second hand (I started digital photography with an FZ5) which I later replaced with the FZ330 in a John Lewis deal, still have the FZ330, sold the FZ200 as part of a package deal to Park Cameras.

Bought the G90 body in another deal from John Lewis, later sold to Park for nearly the purchase price. But the G90 disappointed with the increased video crop factor and the major focus-point jumping issues I experienced when taking videos, so I bought an EM1 Mkii with a 12-200 (24-400) lens for about £1,100 in one of the final deals from Olympus.

That lens is now "glued" to my G9, but it turned out that I qualified for another deal: I bought the combo after the first deal offer had ended, but the shop honoured the deal! The next Olympus deal meant I got a free 17mm f1.2 lens as a result of the date on my invoice.

That lens is currently in use with my youngest son.

The EM1 Mkii has a UI to scream about/at, and not in a good way. It's a small and light camera, but I could never get used to the UI; I still have the body.

I considered a return to Panasonic and looked at second-hand G9s, and went to Park Cameras to see if I could handle the camera, both size and weight, with the 12-200 lens attached, i.e. I took my G90 and the 12-200 lens to do the comparison.

I had seen them offering a well-used G9 for £600 online, which they seemed to know nothing about, but they eventually found a rather used model in the stock room! Used to the point of having lost the rubber covering for the plastic cover of the card slots.

I took the risk and am still using the same camera today!

While using the FZ330 I decided there was enough room in my small camera case for the GX80 with the 17mm 1.8 lens and bought a body with the tiny pancake zoom from some stock clearance from Jessops.

It did fit and I used it from time to time but traded that in for almost what I paid for it and I now have a second hand GX9, a good camera but not quite as good as the G9.

So my line up is G9 (second hand - used), GX9 (second hand - as new), EM1 Mkii (used but only by me), FZ330 (used but only by me) with 12-60 (kit lens), 12-50 (Olympus, second-hand and used as pseudo-macro lens on the GX9), 12-32 (kit lens useful with any camera to make a small package, but particularly the Gx9), 14-140 (Kit lens), 12-200(Kit lens and my everyday lens).

The G9 Mk2 is bigger than the G9 and shares the body of the S5, which my youngest son purchased recently; he is a videographer (GH4 to GH5 to GH6, plus a BlackMagic body and now the S5, using the current offers).

I will try his S5 to see if it is a step too far for me with respect to size and weight. I then need to check whether certain features of the G9 are on the Mk2, and wait for a good offer. The second-hand value of the G9 will be too low, so I am unlikely to part with that.

So you're an active upgrader :crazy_face:
My son now uses an FZ200 which I bought before the G80, and for my daughter I bought an LX100, but she likes her phone, I think, and forgets she has it in her room. So maybe I'll take that back as a sidekick for when I have a long lens on the other body.

How does this jack-of-all-trades Oly 12-200mm lens perform on a Panasonic G90/80?
I maybe like the idea of swapping my 14-140 (Dual IS 2) for this as a walk-and-snap lens.
I think it drops back to IBIS only.
On family walks on holiday I'm often swapping lenses and delaying the group, so I use a general lens like the 14-140mm and keep my Leica 12-60mm in my shorts for when I need a closer or brighter lens. (Oly and Pana are mirrored twins, as in you like one or the other, not both.
Even the lenses zoom the other way.)
A quick look at the secondhand market: a G9 body goes for 500-600 euros at the moment; many people are diverting to Fuji X-mount.

(It's a bit sidetracked, but it's conversation, right? ;-))

The RTX 4060 to 4060 Ti performance upgrade is less than the price upgrade.

This guy says what I was thinking:

So I'll stick with the ASUS Dual GeForce RTX 4060 EVO OC Edition 8GB.

@OXiDant I apologise to the owner of the topic for wandering off topic but will continue in that for a moment. My wife has suffered for a long time waiting for me either photographing or waiting for the crowds to subside before I take a photograph!

The Olympus 12-200 provides no additional stabilization to the G9 but I have not found that to be a problem. It is heavier than the 14-140 but I like the 24mm at the wide end and the 16.6 times zoom is very useful. So the G9 + 12-200 is my normal photography rig carried in a £9 Jessops Fastnet remaindered camera case.

I didn't suggest the Ti card because it does not offer enough of an increase. My suggestion of the 4070 was driven by the fact that the power of my 5900X is simply not available during export because the 3060 bottlenecks the export process, and the same will be true for the system you are proposing, even with the 4060.

I would say that the 5600G is actually a better match for the 3060 (and 4060) when doing DxPL exports, as shown by my test times. However, the 5900X is a good machine to have because its performance is snappy in a lot of situations. Both of the new motherboards are NVMe capable, one of the two slots is Gen4 capable in both cases, and that is my most likely upgrade path at some point in the future.

I don't know what the actual impact would be if I fitted a 4070 to my system, but it would cost me close on £500 to find out, so it isn't going to happen in my case.

I think he's not angry about the hijacking of this thread. It was sleeping for a month or so.

So DxO's exporting/developing software isn't capable of balancing, aka spreading the load over CPU and GPU according to their capabilities? Is that what you're saying? It chooses either GPU or CPU? That is a compromise in how the software package is programmed, in my eyes.
Divide and conquer should be the approach.

I selected the 7900 for its power consumption versus delivery of processing power: a nice balance between being economical on the electricity bill and pure calculation power…
The X and X3D versions are much more aggressive in power consumption and therefore faster at full throttle than the non-X parts. The 7900 is a marathon runner, the 7900X is a 400-metre sprinter and the X3Ds are 100-metre sprinters.
That is, in consumption of electricity and production of heat (which is energy loss, because that's electricity which isn't used for processing data).

Faster high-end video cards are, from that point of view, the same beast.
Much more money and energy consumption diverted into heating up the card for more processing power.
The 4070 Super is twice the price of a 4060, and then it's also sucking in twice the wattage. One bridge too far for my goal/idea of a new desktop. :grin:
From my perspective, I would more easily upgrade a video card than a CPU in the future.
The goal is not per se exporting faster (lightning fast), but rather no delay when editing RAW files at any stage with the tools.
I believe that's memory speed and CPU, right?

@OXiDant It is less about balancing CPU versus GPU; rather, the GPU is used to do a job that the CPU can't do, or rather can, but very slowly. The "AI" model used for noise reduction in DP XD uses the neural engine of the Mac M chipset, I believe, or the "AI" elements of the GPU on a PC, or something along those lines.

We have no exact measures with respect to how well DP XD processing compares to the GPU benchmarks that the various websites use but they are a starting point.

If you look at the CPU usage graph I included some posts ago you will see a slight drop from ‘NO NR’ processing to DP processing and a much larger drop when DP XD processing is involved.

Control is passed backwards and forwards between the CPU and GPU during the noise reduction phase of the process, and while the GPU is being used the CPU is waiting. The faster the GPU, the less the CPU has to wait, but this comes at a price: increased cost of the card, more power consumed, more heat generated and potentially more audible noise generated.

The 7900 looks to be a good choice for the processor. As for the GPU, your selection comes down to how much of your day is spent exporting DP XD images: if it is 200 per day, then 1 second "wasted" per image is 200 seconds "lost", but if it is 1,000s of images per day then the additional time soon adds up.
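To put rough numbers on that (my own illustrative figures, not measured DxPL timings):

```python
# Back-of-the-envelope sketch: how a per-image saving from a faster GPU scales
# with daily export volume. The 1-second figure is an assumption, not a benchmark.
seconds_saved_per_image = 1.0

for images_per_day in (200, 2_000, 10_000):
    saved_minutes = images_per_day * seconds_saved_per_image / 60
    print(f"{images_per_day:>6} images/day -> about {saved_minutes:5.1f} minutes saved per day")
```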

The card I was looking at was a 4070, not the Super version, and that came in at about £500; you are right that the 4060 is closer to £260, about half the price.

I just did both, and the GPU is certainly easier, but a processor swap is O.K. providing you have some isopropyl alcohol and new thermal paste to hand and a motherboard that can take the new CPU!

It comes down to how much a second-hand GPU will raise compared to the cost of a new one. The prices of the current generation will decrease, in both the new and second-hand markets.

If you are happy with the 4060 then stick with that and use the money saved to get the NVMes.

Ah, I didn't connect that in the way you have explained it now.
The increasing use of AI programming thus leans more and more on the GPU chipset, because of its architecture and the associated programming for AI purposes.
I didn't realise that the AI they use is bound to a chipset rather than calling on calculation slots of one of the CUs.

My exporting of images is maybe 8k-10k images a year: holidays, day trips, and some random (test) shots. So if that costs me some extra seconds, well, I hit export and go do something else. :wink:

And I believe we still have the setting "use CPU only" in the preferences, although that doesn't square with the knowledge that the AI is bound to the GPU.
All in all an interesting theme, this AI and the (hardware/hardcoded chip code) infrastructure of PCs.

@OXiDant Under no circumstances whatsoever select CPU only.

I did this when users were having problems with low-powered machines and low-powered GPUs as a result of a change that DxO had made to the product, which resulted in the GPU not being selected, even though the user had set it to use the most appropriate device, and instead downgraded the product to use CPU only!

On my i7-4790K, just browsing to an image with 'DP XD' selected and using 'CPU only' brought the machine to its knees, i.e. rendering a DP XD image for display took so much of the processor that the monitoring software I used could not update its graph!!

With the iGPU selected browsing was a tiny bit stuttery and export excruciatingly long but the machine did not collapse into a heap! With the 5600G, which I had at the time (and have again now after a new rebuild), it could cope.

I repeat, this is not export, this is just browsing. If you monitor the GPU when browsing 'DP XD' images you will see a tiny "blip" on the graph, but for the CPU to deliver that tiny "blip" took all the processing power of the i7-4790K, an 8,070 Passmark CPU; the 5600G is a 19,888 Passmark CPU and coped, and the 5900X is a 39,145 Passmark CPU and I haven't bothered to test it (yet).

The time when a 4070 or even 4080 GPU becomes important is when you are exporting 1,000s per day and to a customer deadline.

More and more software is using GPU processing power to undertake certain operations, so even a cheap second-hand 2060 is worth adding to an old machine, providing the power supply can cope. All mine are 550W: two are Seasonic 550W Platinum and one is an old Maplins power supply. I now have a 3060 and two second-hand 2060s (roughly 80% for a 2060 versus 100% for a 3060, 122% for a 4060 and 190% for a 4070, according to some gaming benchmarks)!!

Although DxPL will max out whatever GPU you give it, the power requirement is not continuous, i.e. as it would be if playing games!!

PS: To reiterate, in my tables and graphs:

NO NR - Image rendering only (CPU)
DP - Image Rendering (CPU) + applying DP de-noising (needs GPU) (& “some” CPU)
DP XD - Image Rendering (CPU) + applying DP XD de-noising (needs real GPU) (& "some" CPU)

So all three need what is shown in the NO NR figures, but the de-noising then needs some CPU to feed the GPU (no way to know/measure how much CPU this actually is (?)) plus the GPU to do the "heavy lifting" of the noise reduction process.
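As a rough illustration of that flow (my own pseudo-sketch, not DxO's actual code), the three cases differ only in whether a GPU denoising step is added after the CPU rendering:

```python
# Sketch only: every image is rendered on the CPU; DP / DP XD then hand the
# denoising to the GPU while the CPU mostly waits and feeds it data.
from enum import Enum

class NoiseReduction(Enum):
    NO_NR = "no denoising"
    DP = "DeepPRIME"
    DP_XD = "DeepPRIME XD"

def export_image(raw_file: str, nr: NoiseReduction) -> None:
    image = render_on_cpu(raw_file)                 # NO NR: this is all there is
    if nr is not NoiseReduction.NO_NR:
        image = denoise_on_gpu(image, model=nr)     # DP / DP XD: the GPU does the heavy lifting
    save(image)

# Placeholder helpers so the sketch runs; in reality this is DxPL's internal pipeline.
def render_on_cpu(raw_file):      return f"rendered({raw_file})"
def denoise_on_gpu(image, model): return f"denoised({image}, {model.value})"
def save(image):                  print("saved:", image)

export_image("P1000123.RW2", NoiseReduction.DP_XD)
```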


@OXiDant Can I go back to fiddling with DxPL rather than battling wood rot!?

If it's a living area / has no closure to the inside of the house, I would think about swapping the old frame for a new one, so HR++ thermal-pane glass would fit in it.
But this looks like a conservatory / summer greenhouse, an extension of the house.
Looking closer at the construction, I see why it has started rotting.
There is no lead seal under the frame construction on the brick wall, so moisture/rain can creep under the frame, causing a wet cement spot and wet wood. It's built straight onto a slab of cement, as it looks in the picture.
It's not lead anymore that is used for this, but I believe it's still called "loodband" (lead band).

I suppose you know all this, but in case you don't, pay some extra attention to it, like this:

And if that's too much of a hassle, garden houses nowadays have this kind of composite support beam.
Place that on the brick wall with your new framing on top, so moisture in the bricks never reaches your wood again.

Did some fiddling myself a few years back on this kind of thing… ;-)

Have fun. In the end it's all worth it. :slight_smile:
