Mark, I agree with you in terms of software bugs. I can’t agree with you about new camera bodies and lenses. Those are added according to profiles. Once DxO has that data, it’s fairly easy to distribute the new profiles to older versions, without the kind of extensive testing about which you are speaking.
To take a hi-fi analogy: the profiles are just CD media, not a change to the core application. Of course, I’m with you in terms of a lack of obligation to support all cameras and lenses for eternity. DxO could win some goodwill by trailing camera and lens profile support by something like three years. Forcing upgrades based on equipment acquisition is a weak way of promoting software upgrades.
This is not a particularly hot issue with me. I’ll upgrade anyway if I’m using the software, and particularly if I’ve acquired new kit. What worries me is the mysterious DAM eating up limited development time; that is, there may be no significant improvement to RAW development performance or quality for a couple of years.
I’m afraid that the DAM is a done deal regardless of how little some of us may want it. At this juncture, it’s probably best to embrace it and make suggestions for things to be included.
I’ve noticed online in recent weeks that Luminar has been roundly thrashed for promising, and then long failing to deliver, a DAM for its software. I understand that in the last few days they have finally released a version with the DAM included. Clearly the presence of digital asset management is very important to a lot of people, and it signals that a product is a more mature offering.
DXO Photolab is a top-of-the-line, professional-quality raw image converter. What it needs to compete better against some other products out there right now is an increased feature set, a somewhat more sophisticated and mature interface, better and more up-to-date documentation, and more training materials.
They have a difficult path ahead of them competing against already feature-rich, budget-minded programs like ON1, Affinity, and Luminar, not to mention professional packages like LR, PS and Capture One Pro.
DxO has my money and is the center of my photo processing workflow now, Mark. I’m fully invested. I even do unpaid work for DxO at this point, promoting DxO software on DPReview forums.
But until DxO gives us more information on the extent of their plans for the DAM, I’m not going to drop the issue, as my productivity depends on DxO working on (1) performance improvements and (2) image quality improvements. It’s still possible to scale back the resources devoted to such a bottomless hole as a DAM.
Anyone rating their photos in DxO PhotoLab must have considerably more patience than I do, as DxO as a photo processor is very slow to switch between images (not an issue for processing but a huge issue for evaluation). I’ve made my case above for why DxO is barking up the wrong tree with DAM in PhotoLab (in the “Digital Asset Management in PhotoLab” thread) and won’t make it again. I’ve offered detailed workflow notes for other PhotoLab users on how to manage photo selection and how to manage a library of finished photos.
DAM is a phantom need, a copycat feature, and has little to do with the PhotoLab raison d’être or USP, which is ultimate image quality.
Admittedly there are lots of issues. But whether you are prepared to embrace a DAM at this point or not, it is still coming. As for a road map of what will be included and when, I doubt we will ever have one from DXO. Sharing details like that sets up expectations that may not be met in the expected time frame. It is more likely that what we will get, now and again, are hints regarding new features that are in the pipeline. I am not aware of very many software companies that give users a detailed road map of their long-term development plans.
DXO is probably running their business on a shoestring. They haven’t even been able to update the documentation completely for their core package. Photolab’s PDF download still contains a number of references to Optics Pro, and does not clearly distinguish the subtle and not-so-subtle differences between the Mac and PC versions. As a result, I’m tending to set my expectations lower than you, and I’m just hoping for the best.
When I talk about places where DxO could do some serious performance work on PhotoLab (instead of chasing this chimeric DAM), here are some concrete suggestions:
- When trying to use the Contrast sliders to create an attractive cosmetic look (lowering microcontrast and fine contrast is a great way to soften a woman’s features), the “Correction Preview” spins for something like 10 seconds, even with Noise Correction turned off. This is way too slow.
- The repair tool is glacial. In Photoshop, basic clone/repair is instantaneous.
I’m using a Mac Pro 6 core 3.46 GHz with an Nvidia GeForce GTX 980 with 4 GB VRAM, so I hope the slowdowns are not hardware related. On my Mac Pro 12 core 3.33 GHz with a Radeon RX580 with 8 GB VRAM, PhotoLab seems more responsive. Or even on my MacBook Pro 17" when the Radeon 6750M with 1 GB VRAM is active. I haven’t seen anything posted about a requirement for Radeon graphics chips.
In any case, even on the other computers, DxO PhotoLab is simply too slow when working with certain sliders like the Contrast or Noise Reduction sliders. Optimisation, while not as sexy to programmers as new features, would be a huge sell to photographers. We want as close to instantaneous response as possible. This was what Apple’s Aperture did really well. Unlike Lightroom, Aperture was really fast.
I seem to be getting somewhat better response on my i7 Windows 10 machine with 24 GB of RAM and an older nVidia 745 card with 4 GB of VRAM. Applying contrast settings to my Canon 7D Mark II raw images below 75% zoom takes perhaps two to three seconds at most, and it’s usually quicker. At 75% and above, where sharpening is visible, it can take a couple of seconds longer. The longest refresh is when I change to a different image that is above 75% zoom and has been fully processed, including PRIME noise reduction. In that case it can take as much as 8 seconds. I generally don’t run batch updates.
The most time-consuming task for me is exporting a fully processed raw image with PRIME NR to JPEG with quality at 100% and PPI set at 300. It takes around 35 seconds. My raw images average around 25 MB. All these timings are not guesses; I tested and timed them before posting. I always thought my processing speed was relatively slow, but I’m starting to think it’s not so bad. ON1 2019 on my machine is positively glacial, and I finally uninstalled it.
Same here. Luckily they gave me my cash back too, even after 3 months.
PL 2.1 can be slow and has bugs, but that has been true of nearly every graphics and photo software I have tried over the last 15 years. Ditto Windows, Mac and Linux.
It can be frustrating on occasion, but the results for me are worth it. It is still faster for me to use PL 2.1 than anything else as I get the look I am after first time and have very close control during the process.
The ability to export onwards via dng into Luminar is also very useful.
We are pretty much on the same page. With regard to ON1 I’m glad I’m not the only one who had it run dismally slow. So many people seem to be happy with its responsiveness. And I agree with your general observations about PhotoLab. I get things done to my satisfaction faster and easier than I ever could with other software, and with better results.
Colin, every piece of software on the planet has bugs. Every piece of Microsoft and Apple software has thousands of bugs. Every Banking system, airline management system, health care system, you-name-it system has identified bugs that will never get fixed. Every software developer has to triage the fixing of bugs to effectively utilise their resources. And to be clear, every bug fix creates a new version. There is no such thing as fixing the current release. The only difference is if the ‘bug’ fixes are significant enough to create a new major release or just to create an incremental minor release. Building and maintaining software is not a precise engineering exercise, it is more akin to a ‘craft’. Whether you like it or not, you are already living with it.
Export time doesn’t bother me - it’s about 30 seconds/image for Canon 5DS R images with local adjustments, touchups/repair, fine contrast adjustments and noise reduction - as well as the colour and perspective and crop fixes which are not normally slow. On my MBP 17" 2011 with 2.2 GHz quad processor, these exports take two minutes per image. Still it’s not a big deal - exports can run in the background. What matters to me is edit speed.
And edit speed is pretty atrocious if one is using the primitive repair tool (it would be nice to have separate clone and heal tools instead of an awkward all-in-one). Adding a small repair takes over five seconds of “Correction Preview”. Each small repair adds to this, causing wait times of a minute in the case of six or seven quick touchups in a row. Even worse, undoing these repairs takes just as long.
There is some serious work to be done on performance optimization for both local adjustment and repair, as well as for microcontrast adjustments and noise reduction. The results are excellent but the delays are maddening. I don’t understand how DxO can add these wonderful features but leave them nearly unusable due to a slow image pipeline, while they run off to add huge new modules (DAM).
Pro photographers value image quality first but speed comes a close second. Amateur photographers follow reviews from the pros. If DxO PhotoLab remains this slow for sophisticated processing (I’m not discussing output rendering which is less of an issue as one can do other work while waiting on the photos), it will not win the hearts of pros as their primary photo processing tool. And without the pros, there’s no one to drag the amateurs and semi-pros (like us) into the system.
I loathe Adobe for their subscription only approach and intrusive telemetry/spyware but should note that Lightroom and Photoshop perform pretty quickly. I didn’t find myself waiting on them when doing some head to head image quality tests. It may be that the quick performance has more to do with their popularity than DAM. For those interested, DxO PhotoLab won the head to head image quality test.
Again you and I are getting very different results. I just used the repair tool to quickly remove a dozen objects from an image. I worked as quickly as I could and wasn’t worried about being neat or completely accurate for this exercise. It took me 9 seconds to select and paint out the 12 objects and the image was finished refreshing around 2-3 seconds after I painted the last repair.
Edit: I just did it again randomly painting over parts of an image as fast as I could, regardless of content.
I used the repair tool around 30 times in a 15 second period and the image still finished refreshing 2-3 seconds after the last repair
Finally, I used Ctrl Z (Windows 10) to undo all 30 repairs as fast as I could move my fingers, and finished in less then 10 seconds. The image finished refreshing in the same 2-3 seconds after I was finished.
I am clearly not seeing the problem you are seeing. My 7D Mark II raw files are a lot smaller than yours, so that is probably a factor, but my lag time is only a couple of seconds even for 30 repairs, not the minute or so you are seeing for 7. You have a better video card than I do, and I have more RAM. The program is also running from my SSD, although the images are on a normal HDD. One big difference, of course, is that you are running on a Mac and I’m using an HP PC running Windows 10.
I use 3 cameras: a 7D2, a 5DS and a Sony a6000. With all three I find many tools very slow, whether correcting or undoing. The “new” local adjustments are very bad; the delays in seeing what a change will do, and then, as often as not, going back and altering it (or removing it altogether), are slow enough to make me think hard about whether to use them at all. They really haven’t had much, if any, improvement since the first version of PL.
Exporting is no problem; I make a cup of tea and read a book, watch TV or something. It’s the slowness in many of the processing elements that is so poor and really needs improving. My wife tried PL1 and found the local adjustments so difficult to use, and so slow, that she has stayed with Optics Pro.
Mark, actually I have 32 GB of RAM in this machine. The 4 GB is just VRAM (graphics card). Thanks for sharing those numbers. They are vastly different from what I’m seeing on more powerful hardware. I’ll perhaps try updating PhotoLab to the very latest version to see if that helps.
I wouldn’t assume that the new version will resolve the issue. PL2 doesn’t seem to run any faster on my machine than PL1 did. I suspect your problem lies elsewhere.
I recently uninstalled ON1 2019 because it was positively glacial, even though other people with machines similarly spec’d, or even more poorly spec’d than mine, have no performance issues.
My point is if you just update to PL2 expecting significantly better performance you might be disappointed. Your EOS 5DSr raw files are more than double the size of mine, and I suspect at least part of your problem is related to that.
I think what may be needed is an analysis of what processes are running on your machine and the resources they are consuming while Photolab is active. I’m not a Mac person so I wouldn’t even know where to start looking. Is there a way of altering the foreground priority of a specific app on a Mac to give it greater access to resources?
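For what it’s worth, macOS does let you inspect and adjust a running app’s scheduling priority from Terminal with `ps` and `renice`, much like foreground priority tweaks on Windows (though in practice disk and GPU are more often the bottleneck than CPU scheduling). A minimal sketch, using a throwaway `sleep` process as a stand-in for PhotoLab’s PID:

```shell
# Start a throwaway background process to demonstrate on
# (substitute PhotoLab's PID, found via `pgrep PhotoLab`).
sleep 60 &
PID=$!

# Show the current "nice" value; lower means higher priority, default is 0.
ps -o pid=,nice=,comm= -p "$PID"

# Lower the process's priority by raising its nice value.
# Note: *raising* priority (a negative nice value) requires sudo.
renice -n 10 -p "$PID"

# Confirm the change took effect.
ps -o nice= -p "$PID"

kill "$PID"
```

This only deprioritises or (with sudo) prioritises CPU time; it won’t help if PhotoLab’s delays come from the image pipeline or GPU, which seems likely here.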
Additionally, I have noted that with PL2 some Mac users have reported problems closing the application, and problems exporting files, if I recall correctly. These could be one-off anomalies, but I would certainly review those threads before upgrading.
Windows 10 (updated), not running anything else (other than in background)
The laptop is
Intel® Core™ i7-4900MQ CPU @ 2.80GHz
NVIDIA GeForce GT 750M
Intel® HD Graphics 4600
Samsung SSD 850 EVO 1TB and Samsung SSD 850 EVO 500GB
I think the point is you might want to consider resetting your expectations based on how software is actually developed and supported, rather than on how you believe it should be supported. It’s your choice, of course, and your time, but in the end you will continue to be frustrated when you don’t see a resolution to issues that is acceptable to you.
Good software is not based on absolutes, although there are some standards. It is mostly a creative process, not unlike writing a book.
The goal of course is to minimize bugs, but it is never to remove them all, which would be a virtually impossible thing to accomplish.
A piece of code can be a very complex bit of functionality. But on top of that, a system will have thousands of pieces of code that interact with each other in a significant number of ways. Add to that the integration with an operating system, various hardware, various other software, various resources, and end users who may attempt to do things in wholly unexpected ways, and you have a minefield of areas where bugs can occur. Testing software and fixing bugs and inconsistencies as they are discovered is actually much more time-consuming than writing the software in the first place.
Although a single seemingly small bug fix, like perhaps an incorrect error message being displayed, may look like 5 minutes of work, in reality the analysis and coding to fix it could take many hours, and the integrated systems testing necessary before putting it into production could take days. The amount of effort needed to test and distribute new versions of software is why bug fixes, functional updates, and new features are bundled together in releases, instead of being installed and distributed to users one at a time. The point of all this is that software development is extraordinarily labour intensive. That’s why it cost approximately $100 billion USD worldwide, and millions of man-hours, to remediate hundreds of millions of lines of code for Y2K. I personally worked 50-70 hours per week for two years doing nothing but that.