DxO PhotoLab vs Lightroom Classic – using PhotoLab for cataloguing

@Stenis that’s odd because I ran tests some time ago and did just that to overcome the fact that the most I could do with an external selection was about 239/240 at a time. I wanted to get 1,000 images into a project that way, i.e. it turned out to be 4 external selections, which was a bit of a pain, and I wasn’t sure what was causing the limitation!?

The limit is flagged as an OS limit. I was trying to go from FastRawViewer to PL6.3.1, and after two such batch transfers, a Ctrl+A, creating the ‘Project’ from the first group and then adding the second group, I have

  1. Two ‘External Selections’
  2. One project containing both combined

I can access the ‘External Selections’ and the ‘Project’ images equally without any problems whatsoever!? Which is exactly what I found the last time I did this.

So we have the ‘Project’

and one of the ‘External Selections’

In ‘Customize’ mode (sorry the images are not as interesting as Zanzibar)

The problem you appear to be experiencing is that the pointers are pointing at the image in the database but that image cannot then be located on disk and that makes little sense unless something is “locking” those images or they were on an external disk or …?

Does this happen when in ‘PhotoLibrary’ mode when moving from thumbnail to thumbnail or only in ‘Customize’ mode? If you terminate the program that passed the images to PL6 does the problem persist? Does the problem remain after you restart DxPL?

I have not experienced this in the past and am not experiencing it tonight!

Repeated the test with another piece of software (on another machine) that can pass more images in a single transfer; all were collected into a ‘Project’ of 1,001 images and all are present and correct as far as I can tell.

For some reason it works for me but not for you!?

Regards

Bryan

DeepOdd® is the secret project name for the DAM parts inside DxO, don’t give it away in public. :grin: sorry, couldn’t resist. Good luck with further testing, it’s very entertaining.


@JoJu If you find it entertaining then you really need to get out more - and so do I, come to think of it.

Currently I have 2 (out of 3) machines running and starting up a machine certainly automatically triggers one of the HDMI switches and the issue of which machine is displaying where at any time is getting to be “interesting” rather than entertaining!

Changing the Taskbar colour might help but I rather like the boring grey I currently use (!) but with Machine 1 (the new build) using the centre monitor and Machine 2 (the old Main machine) using the two outer screens, all of which can be changed by the push of 3 buttons, I am getting confused already, time for a coffee!

Not as I understand it, JJ …

There are affiliate links and discount codes all over that site. He’s making money on this, so I think that probably says something about his objectivity.

Re-indexing does not help in its current implementation. Check out this example.

Thanks Bryan for your effort!

When I add images from an ordinary folder to a Project it works, and I can then open these images without a problem in Customize.

If I use the External Search as a source, copy the images from that view in PhotoLibrary mode and create a Project from that search, I have not been able to open and edit any of those images in Customize mode.

Maybe you are right that the External Search doesn’t point to the folder but to a preview in the database, but that is still strange.

@Stenis I copied from the ‘External Selection’, which came from FastRawViewer in the case of the first tests or Photo Mechanic in the case of the second full test. In both cases I followed the procedure that I identified in my post, i.e. Ctrl+A on the ‘External Selection’ and ‘Create Project from current selection’ for the first group of 239 images, then the same for each additional selection but now using ‘Add current selection to Project’.


In fact both the ‘External Selections’ and the ‘Projects’ use the same database structure of ‘Projects’ and here they are

Doing it this way means that at this point I have twice the number of pointers I “need”, but I could also have created embedded ‘Projects’ etc. The limitation on passing the images is the size of the “array” used to hold the data of all the selected images being passed, and Photo Mechanic seems to be a bit better at using that space than FastRawViewer!

So ‘External Selections’ and ‘Projects’ also point to the same images in the database as each other, and those ‘Items’ in the database should reference images on disk via the ‘Sources’ table entries

which in turn reference entries in the ‘Folders’ table (‘FolderId’=40 in this case)

which then allows DxPL to find the referenced image on disk by reconstructing the directory path from the data in the ‘Folders’ table - or not, in your case!
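That lookup chain can be sketched with a toy SQLite query. To be clear, the table and column names below (‘Items’, ‘Sources’, ‘Folders’, ‘FolderId’ and friends) are my guesses from the screenshots, not DxO’s actual schema - this only illustrates the idea:

```python
import sqlite3

# Toy reconstruction of the chain: Project/External Selection -> Items
# -> Sources -> Folders -> path on disk.  Table/column names are guesses
# based on the screenshots, NOT DxO's real schema.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE Folders (FolderId INTEGER PRIMARY KEY, Path TEXT);
CREATE TABLE Sources (SourceId INTEGER PRIMARY KEY, FolderId INTEGER, FileName TEXT);
CREATE TABLE Items   (ItemId INTEGER PRIMARY KEY, SourceId INTEGER);
INSERT INTO Folders VALUES (40, 'F:/Test Directory 01/Test Sub 04');
INSERT INTO Sources VALUES (1, 40, 'IMG_0001.CR2');
INSERT INTO Items   VALUES (100, 1);
""")

# A Project entry pointing at ItemId 100 would resolve to a disk path:
row = con.execute("""
    SELECT f.Path || '/' || s.FileName
    FROM Items i
    JOIN Sources s ON s.SourceId = i.SourceId
    JOIN Folders f ON f.FolderId = s.FolderId
    WHERE i.ItemId = 100
""").fetchone()
print(row[0])  # F:/Test Directory 01/Test Sub 04/IMG_0001.CR2
```

If the ‘Folders’ row goes stale (the directory is renamed or deleted on disk), the join still succeeds but the reconstructed path no longer exists - exactly the “orphaned pointer” symptom under discussion.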

Sorry for my ignorance but what exactly do you mean by ‘External Search’?

@platypus, I don’t doubt your results but will do some testing later today or tomorrow.


Sorry for confusing you but that was just semantics.

I used “External Search” to describe an “external search path” as seen from Photolab. These images don’t need to be part of the Photolab ImageLibrary scope (index). They could come from really any application outside Photolab.

In my case it is really the same 25 000 images that both Photo Mechanic sees and Photolab sees.

This is what I do:

I open PM Plus and select 12 images in the “contact sheet” of PM, which is just the main list of images. After that I right-click on one of the images, which opens a dialog where I select “Edit selected photos with DXO.Photolab”.

What happens then is that an “External Selection” record is created under “External Selections” in Photolab’s ImageLibrary left panel. This image selection is opened in the ImageLibrary. I then select these images and create a new Project. Everything works fine, and the images in the new Project work if I open them in Customize mode - BUT NONE OF THE OLD SELECTIONS’ IMAGES IN THE LIST DO :frowning:

Well, my conclusion is that something has happened when I have upgraded Photolab since last year. I have done so a couple of times as new service releases have been released.

From what I can see now everything seems fine with the new Projects I have created for this test and I´m very relieved to see that, because this is a very important integration feature that really is crucial to me.

I will have to delete all the present External Selections I have in that list and build new ones when I need them.

So thanks again for your help Bryan!

That’s great really, because then it shouldn’t be all that complicated to “promote” an “External Selection” to a Project if one should want a Project based on such a selection.

Can I ask what kind of interface you have used to access the tables in the Photolab ImageLibrary DB? What kind of DB is DXO using?

Obviously written by DxO or by a fanboy.

Photolab is hopeless/useless as a DAM system.

However, anyone migrating from Lightroom can simply continue to use LR for the DAM even without a subscription, even simple develop tools continue to work when the subscription expires.

Just do development in Photolab and send a DNG or JPEG back for viewing in LR.


DxO uses the open-source SQLite database, and you can use any tool that supports it to open and view the database. Always make a copy of the database and work on the copy so you do not inadvertently mess up your real database. The database is just a single file, so it can easily be copied to another location (after closing PL).

Sorry this turned into a big post but the main element is that the DxPL clean-up routines are “flawed” and I am interested in what other packages do or don’t do!

@Stenis I am glad we have resolved the issue because I was very puzzled.

Exactly, but there are other simple fixes I have suggested over the last few years and they just fall on deaf ears!

Super easy to use, but don’t keep it running while you “bounce” DxPL - it keeps the database open, which sometimes causes problems - and don’t be tempted to change anything in the database!

However, using it on a “real” database is a nightmare because it is difficult (impossible) to wade through all the data. Hence I back up the “real” database and open an empty one I created some time ago for tests, to keep the data to a minimum so that I can “learn” about whatever it is I am trying to investigate, returning to the “real” database after the tests.

However, one problem your original issue highlights is the changing nature of data on disk versus the data in a database, and the fact that software can find it difficult to track the changes and any mismatches between the two, i.e.

  1. What is the software supposed to track, when it is running and when it is not?
  2. When is it supposed to track it?
  3. How is it supposed to track it and detect any changes?
  4. What is it supposed to do when anomalies are detected?

Some image databases may contain very large numbers of entries (images) and setting a reconciliation program loose on such databases is an “expensive” and (potentially) disruptive task.

One of the better programs for this kind of work is iMatch, which “insists” on doing almost everything using background worker tasks, including with delays to avoid “lock” clashes with other programs, e.g. I just opened iMatch and …

I don’t have any experience of when and how other programs detect that either a directory that contained images or an individual image or images that have been “ingested” into a database no longer exist, e.g. deleted or name changed.

However, DxO certainly doesn’t - or rather, it actually does detect a directory name change or deletion while it is active, i.e. the process it uses to detect changes in the file system and image metadata provides DxO with an opportunity to recognise changes and act accordingly.

But as tests I did some time ago showed, it does nothing with that information - or rather, what it does has a limited impact on the database. It does not adjust the database to reflect the fact that a directory has changed its name or been deleted, other than recognising the existence of the new directory.

So the following tests consist of a number of directories and sub-directories holding one image in each sub-directory.

The directory “Test Sub 04” had its name changed to “Test Sub 04 - New” with DxO running and the new directory was added immediately and the old directory vanished from the display.

But the old image from the changed directory is still in the database (image 4) and the “new” one has been added (as image 5 in ‘Items’)

and the new folder added to the ‘Folders’ table (entry 16 and the old directory is at 15)

The image from the “ - New” directory was then dragged and dropped onto a ‘Project’

and the (sub-)directory was changed back to its original name.

This then happens with the ‘Project’ entry, effectively “orphaned” from the actual image. The thumbnail is still in the cache so it can be displayed but the image itself is …!

Unfortunately the problem then gets worse, @Musashi: when I do a search on “red” I get the following

The images were “rated” according to their sub-directory, from “*” to “***”, and given “colour” labels according to the directory, with “red” being assigned to “Test Directory 01”, where there should be 4 active entries but, because DxPL has not discovered and removed the now-missing entry, we have 5.

After re-indexing “Test Directory 01” to ensure the correct number of entries in the database, we (still) have 5.

@platypus for some time, elements that you and then I have complained about have always seemed odd (counter-intuitive) to me:

No Command to Delete a Directory:-

There is no command to delete an entire directory.

Its “omission” could be to avoid the risk of a user deleting the whole library of images with a single command, but it has always seemed strange that an obvious “directory delete” command is missing. The user can delete the entire contents of a directory but leave behind an empty directory “header” - albeit such a destructive operation is “limited” to the boundaries of a single directory and does not include any other (sub-)directories!?

Or the ‘Folders’ structure is considered “sacrosanct”, hmm…

No “database only” delete capability:-

Possibly even more important than 1 is the fact that there is no command to delete only the database entries; the only command on offer is to delete (‘Remove’) everything, i.e. the image, DOP and XMP sidecar file are all eradicated!

The most obvious first implementation would have been to provide a less destructive command to delete database entries, i.e. entries within the DxPL domain and leave the images intact.

The ability to also delete the images could then have been added at a later date.

But instead DxPL can only delete everything or nothing, which doesn’t make a lot of sense @Musashi!

With such a database only delete command, the danger of deleting an image and the DOP etc. with a single command is reduced because “only” the database entries are at risk!?

How do other packages manage database to “real” world reconciliation:-

I don’t know how other packages resolve the issue of garbage collection when changes are made outside the package (with the exception of the example from iMatch I included above, which is working as I type).

But a situation where “phantom” images are located via a search, because the database has become littered with “detritus” in the absence of detection and clean-up routines, is not good (though I have only tested DxPL for this behaviour).

The (re-)indexing routine needs to be improved to allow its influence to be restricted to refreshing only directories already present in the database, rather than adding images to the database that are potentially of no interest, as I mentioned in an earlier post.

A further option could be provided to embark on a reconciliation process that should be able to clean up the database and remove detritus associated with data that is no longer present on disk.

iMatch is going to take some time reconciling the database with the disk contents (a consequence of moving test directories from F:\ to H:, I believe)

Dismissing the main import leaves a progress window

So DxPL has some way to go, but I am still interested to know what capabilities Lightroom and Capture One have for handling such situations, and I believe you have experience of both, @platypus (and @JoJu?)

@GrahamB Sorry but I don’t agree. It is certainly flawed, but nothing that cannot be fixed if DxO put their minds to it; some fixes don’t require much effort at all (I suppose I must add: in my opinion) but getting DxO to do the work is …

The only database that was ever corrupted - excluding PL5 losing my database during the upgrade from PL4 and simply creating a new one - was when I was beta testing PL5 and used a trial copy of Lightroom to set metadata to test PL5’s ability to detect the changes.

The third time I tried to use the Catalog (database) Lightroom informed me that it couldn’t open it!

As a “newbie” to Lightroom I may well have made a mistake, but there appears to be good reason for the backup prompt every time you close Lightroom - and iMatch and Capture One and … - but not DxPL!

Not having that prompt makes testing much easier but should there be an option to instruct DxPL to prompt …?

And that is exactly why DPL is not mature enough to replace Lightroom, which has the necessary tools to fix discrepancies between the DB and what’s on the drive…

All those who delete DPL’s database regularly can be (sort of) happy with DPL as long as search, which relies on the database, is not needed.


You rock, Bryan and Keith! Thanks for the link to the DB Browser. I have never used SQLite before, but I have been an MS SQL Server DBA and developer for 20 years or so. Usually I have used MS Access as a front end with an ODBC source for batch updates and ad hoc jobs, and I tried that even here, but for some reason there was something wrong with the installation of that connection - a DLL seemed to be missing. The browser seems to work fine though, and is very straightforward to use, so that’s all I need.

Nice job Bryan, and thank you for your efforts. I like that you really drill down all the way to the bottom in order to get your answers, and not just the guesses that are so common these days. You have a real developer’s heart :slight_smile:

Comment:

  1. I have also had a few database crashes with Lightroom back in the day, and that’s why I don’t understand why exporting and keeping XMP in sync isn’t mandatory. It ought to be.

  2. About re-indexing routines: I agree with you that there are very poor maintenance tools included in ImageLibrary. It is definitely good practice to put all the folders in a well-defined structure under one single “top folder” if possible, and stick to that. Then it’s super easy to re-index. The “reconciliation” process in Photolab today is really to re-index from the “top folder” - by far the easiest way to do it, at least if you have a pretty new and efficient computer.

If coming from PM Plus, Photolab 6 seems very archaic, but it works pretty well if we adapt to how it’s built. I am really impressed with “the sync system” now and it seems very stable. So far I have not experienced any problems at all when maintaining XMP/IPTC in PM and keeping Photolab open in parallel with PM, as long as I strictly update the metadata in PM. The updates are instant, so I have completely stopped worrying about orphaned data. I have bought a pretty new and fast gaming computer, because they come equipped with both fast processors and graphics cards (even though I never play games), so it doesn’t take me all that long to re-index if needed - but so far I have done it just once.

Concerning PM, it’s far more flexible than Photolab, and I always feel comfortable because all I do in PM revolves around maintaining the indexes, not affecting the image files directly themselves - I mean, for example, deleting them. If I delete a folder, I only affect the folder in the index, unless I explicitly move the files, for example, from inside PM.

Maybe DXO are reluctant to give us more powerful file-management tools inside Photolab to avoid unwanted disasters. It is one thing to handle a virtual structure like the External Selections or the Projects, and another to apply more powerful tools directly to folders and files.

Still, @platypus, I don’t see all those big problems today using Photolab as a replacement for Lightroom. In fact, I left Lightroom more than 10 years ago and have never once looked back. I was never a great friend of the metadata management in that application, which I found very inefficient and hopeless to use. I could even have lived with the metadata management in ImageLibrary in Photolab 6, but it is together with more specialized metadata tools that we really get the best of two worlds, instead of the Lightroom or Photolab as-is compromise. So the real strength of Photolab is that it scales really well with PM Plus or another tool of your choice, and even I sometimes have use for the metadata inside Photolab.

I really wonder how DXO has solved the sync, but I guess they have done it like the FotoWare DAM, where the indexing system watched for filesystem change flags to trigger updates in the receiving system. In that case it was OK to add data in any of the folders under the top folder, with really any tool, and it also created new folders on the fly for every 1,000 new images. PM Plus doesn’t have that, if you don’t count the ability to have a “hot folder” monitored by the import system (ingestion).
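The snapshot-and-compare idea behind such folder watching can be sketched in a few lines. Real implementations use OS facilities (e.g. ReadDirectoryChangesW on Windows or FSEvents on macOS) rather than polling; this toy version just diffs two directory listings:

```python
import os, tempfile

def snapshot(folder):
    """Map each file name in the folder to its modification time."""
    return {name: os.stat(os.path.join(folder, name)).st_mtime
            for name in os.listdir(folder)}

def diff(before, after):
    """Classify changes between two snapshots."""
    return {
        "added":   sorted(after.keys() - before.keys()),
        "removed": sorted(before.keys() - after.keys()),
        "changed": sorted(n for n in before.keys() & after.keys()
                          if before[n] != after[n]),
    }

# Demo: add one file between two snapshots of a temp folder.
folder = tempfile.mkdtemp()
open(os.path.join(folder, "a.raw"), "w").close()
before = snapshot(folder)
open(os.path.join(folder, "b.raw"), "w").close()
changes = diff(before, snapshot(folder))
print(changes["added"])  # ['b.raw']
```

A polling loop over `snapshot`/`diff` is crude and expensive on big trees, which is presumably why watcher-based systems like the one described above are preferred.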

I have asked Camera Bits to improve this, but I don’t think they will, since it might compromise the overall speed of the system, which is their main selling point. It’s different with a big enterprise DAM, since there the indexing system is not competing for resources with the rest of the software on your machine. With PM it is a manual process to keep the systems in sync when you add new folders and images, but at least it’s very easy to do, as long as you don’t forget it.

Feel free to “adapt to” whatever you like/need/want, @Stenis, just don’t use the word “we”. I for my part buy and use software suited to my needs, and the best software is the one I don’t have to “adapt” to. It’s typical “programmer’s thinking”: a user has to “get used to it” first. How would you feel about a salesman telling you “the sleeves are on the front and back, not at the sides of the shirt, but you surely can adapt to it”? Software has to serve my needs. I don’t need to serve its needs.

For me, that’s the main reason to use Apple macOS: I don’t have to adapt to quirky handling. Using Windows at home? No way - for that OS I would expect to be paid, as I have to work too hard to do the OS’s job.


I used to have a client whose Windows computers I used to “look after” - in other words, I would have to go in as often as once a week to fix misbehaviours and quirks.

One day, for no particular reason, the motherboards in two of the computers failed, almost simultaneously. I had recently moved to using a Mac, with Parallels for Windows development work, and I suggested he might like to try Macs. At first he was reluctant, but I showed him how much easier the UI was to work with and he agreed, so he bought a couple of 27" iMacs and a Mac mini for use as a server.

For the first two or three weeks, he was saying how things were not the same and that he wished he had stayed with Windows… then the moaning stopped, as did my nice little weekly earner, as he simply didn’t have the same problems he was having with Windows and saved himself my consultancy fees in no time at all.


When I had my first real Mac contact with OS X (OS 9 was the blueprint for Windows 3.11 and not so fab), I tried to use the knowledge “learnt the hard way” from the Windows versions I knew at the time. Luckily for me, my friend always tried things the way she expected the computer to work - and it worked that way, not the very complicated way which is “normality” for Microsoft users. Simple, elegant, with no tons of extra mouse clicks, no endless driver hunts, no lengthy setup gymnastics or minefields of virus scanners. As soon as I could afford to buy a Mac, my house became Microsoft-free and still is. And I think all the benchmark tests (which processor is the fastest) simply ignore the fact that 20% more speed only affects the time a PC needs to calculate - not the time a user needs to set it up for that calculation, pamper all the PC’s needs and get the results into a readable form.


This is something that I sometimes see creeping into PL. Possibly the Windows version of a new feature was developed first and then the Mac version “inherits” a somewhat clumsy implementation. I have often thought that the simpler stuff got written for the Mac first.

As it is, Apple provides the extremely powerful Spotlight metadata and search mechanism, which completely obviates the need to write a custom metadata database in PL. Unfortunately, because that, or anything similar, doesn’t exist in Windows, DxO have been obliged to use their own, somewhat flaky, SQLite database.


Did you know that Lightroom and Photo Mechanic Plus also use SQLite for their DAMs? It’s all in the implementation!