Two questions relating to disks

I am thinking of upgrading/changing my system to improve PL performance and have a couple of questions:

(1) Is all the information needed about a photo (adjustments, keywords, ratings, …) included in the .dop sidecar files, or is some of it stored in a central database? Put differently, if photos are stored on an external drive, edited in PL, and the drive is then moved to another computer that also has PL, will the second computer have access to all the information, presumably once the folder has been indexed? I am wondering about keeping photos on an external drive so that I can move it back and forth between two machines, doing some editing on a desktop and some on a laptop, depending on where I want to work. Or would conflicts or loss of information arise?

(2) With Lightroom, it is important that the app and catalog are on a fast drive, and it is OK to use a slower drive to store the photos, because previews and other information are all in the catalog. As PL stores stuff in .dop files, does this mean that it is more important to store photos on a faster drive?

#2: You’ll have noticeably faster indexing, searching, and folder loading with your photos and sidecar files on a fast SSD. I do all my work on C:, which is a 1 TB SSD, then move everything over to a slower HDD for archiving when I’m done. For other operations, it doesn’t really matter, IMO.

#1: This has been much discussed elsewhere. Presently, PhotoLab relies heavily on its database for a complete record of keywords, projects, and history. With PL5, I understand you can export more metadata to sidecars, but the process isn’t complete.

PhotoLab can work as you require, but it can also work in a similar manner to Lightroom; you just need to choose the appropriate settings.

Many people on this forum, myself included, do not rely on the database but instead use .dop files and in some cases .xmp files too for things like keywords etc.

And many people “not relying on a database” simply have no real-world experience of how much easier it can be to use one (at least a properly developed, stable one), and therefore hold back real progress in PL, since they are long-term users who would stay away from “this devil’s work” as far as they could.

For the DxO devs this might be a nightmare, which leads directly to the weird “.dop AND database” situation.

For you @Roger it might be important to know that if you go down the workflow path of @Egregius, you have to move, rename or delete your files within DxO PL, not in Explorer. Otherwise it can become a nightmare to point PL to their new location.


Thank you for all of these quick and helpful replies. It was the point Joachim makes that concerned me, as my thought was to edit files as Greg suggests (albeit possibly using a very fast external SSD rather than the C: drive) and then sync this with slower drives using SyncToy2 (W10) or SyncFoldersPro (Mac). I thought this could not be done within PL5 because if I right-click on a Folder in PL5 PhotoLibrary, in W10 I get options to Create and Rename, but not to Move/Delete/Cut/Paste, and in MacOS right-clicking does nothing.
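Since the edits live in the sidecars, syncing between two machines largely reduces to mirroring the .dop files, with the newer copy winning. Here is a minimal Python sketch of that idea; the folder layout and the newest-wins rule are illustrative assumptions on my part, and real sync tools like the ones mentioned above also handle deletions, renames and conflict reporting:

```python
import shutil
from pathlib import Path

def sync_sidecars(a: Path, b: Path) -> None:
    """Two-way, newest-wins sync of .dop sidecars between mirrored folders.

    A rough stand-in for what SyncToy/SyncFoldersPro would do.
    """
    for src_root, dst_root in ((a, b), (b, a)):
        for src in src_root.rglob("*.dop"):
            dst = dst_root / src.relative_to(src_root)
            # Copy if missing on the other side, or if this side is newer
            # (1 s tolerance for filesystem timestamp granularity).
            if not dst.exists() or src.stat().st_mtime > dst.stat().st_mtime + 1:
                dst.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(src, dst)  # copy2 preserves the timestamp
```

Because `copy2` preserves timestamps, running the sync twice is a no-op; only genuinely newer sidecars get copied.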

I had read some of the threads on sidecars versus database, but I had not found a list of what information is stored in XMP, in DOP, and in the database. I guess the answer is to experiment. I am not clear about the relationship between XMP and DOP files (other than that DOP files are used only by PL), or whether, if I only use PL5, there is any point in creating XMP files as well.

The real answer for me, of course, is the difficult one: to be more rigorous in pruning the rubbish from my photo archive so that it fits onto a smaller (and faster) disk. But software is improving so much that it’s always possible a photo that was unusable will later become usable.

Thanks again for the advice.

I did not experiment deeply or long enough with DxO PL to be able to determine what is stored where and, especially, where the differences are. In various threads I read that the database and .dop files can create extra trouble, as some data is stored in both, and very soon there’s a collision with the “single point of truth” concept. So, after nearly two years of happy editing with PL, I changed course again, as I’m not disciplined enough to always tag keywords on all pictures. I want to find them in a project/album structure, and PL’s attempt is not up to my needs. File folders are not flexible enough, and I hate copying big RAW files just to use the same picture in different projects.

The number of images on my drives keeps growing, and the current DxO concepts for handling them are a dead end for me.


There are two major drawbacks to relying on the database.
JoJu rightly points out that it needs to be a stable one. But DxO have never provided database management tools; these were promised years ago but, as with many things, vanished into wherever such things go.
The second, and for me significant, one is lack of flexibility. Like many, I process images when away on a laptop, where I keep the last two years of images. I use Photo Supreme as my DAM.

When I return, Photo Supreme adds all the images and any changes to its database when I run verify. PL adds images using the .dop files, and I then do a final edit and weeding. I then copy the final changes back to the laptop and verify in Photo Supreme; if I open images in PL, it will read the current .dop, as I delete the database to prevent any conflict. If PL relied only on the database, I couldn’t do this as far as I can see, for how would such changes be carried between PCs?

Thus not using the PL database enables flexibility, as the tools and verify function in Photo Supreme do not exist in PL. If you use PL as a DAM, you have to use the database; how you then move between computers others may be able to explain, but I don’t know how you would do it. Likewise, some people process on one drive (an SSD) and keep the finished results on another. How does the database-only approach in PL cover that? I know Photo Supreme can do it, though I have never tried it with that program, but PL?
To me, relying on the database is fine if you keep images on just one PC: keep backing it up and hope it doesn’t get corrupted, as there are no database tools to help if it does. If you use more than one PC, it is, to me, not a very good thing to rely on.


So far, PhotoLab is not built for parallel use on several computers. It can be done, though, and the current best bet, IMO, is to use the .dop and .xmp sidecars with PhotoLab set NOT to automatically read and write these files. Manually reading or writing sidecars puts control in our hands. The downside of working on more than one computer: PhotoLab’s databases grow endlessly, because images are added automatically, while removing images requires extra caution and comes with additional steps and limitations.


Why I keep deleting the database.

…which makes PhotoLab unusable as an archive, because search will only find the latest images… but anyone who manages their archive without PhotoLab can happily delete PhotoLab’s database whenever they wish.

I’ve tried to keep things completely within PhotoLab, but I couldn’t do it with reasonable effort.
DxO needs to provide means to fix discrepancies between what is in the database and what is out there, on the drive, in its folders etc.


Here is my workflow which has been working very well since PL1.

  1. All my recent photos are on an external SSD
  2. I back up to large-capacity drives regularly
  3. I do not rely on the database and can very easily delete it, as I exclusively rely on .dop files
  4. I use FastRawViewer to cull photos and check for sharpness and over/under-exposure
  5. I use Photo Mechanic Plus as a DAM where required
  6. My photos are arranged in folders based on date taken; I have a number of standard folders inside the top-level year folder for various purposes
  7. I can move my SSD to any computer to work on the photos without worrying about losing changes or confusing databases, as I give .dop files priority
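The date-based layout in the list above can be sketched in a few lines of Python. The folder names (YYYY/YYYY-MM-DD) and the rule of moving the .dop sidecar together with its RAW are illustrative assumptions, and file modification time stands in for the real EXIF capture date:

```python
import shutil
from datetime import datetime
from pathlib import Path

def file_photo(src: Path, archive: Path) -> Path:
    """Move a RAW file and its .dop sidecar into a YYYY/YYYY-MM-DD folder.

    Keeping the sidecar next to its RAW is what lets edits travel with
    the files when the drive moves between computers.
    """
    taken = datetime.fromtimestamp(src.stat().st_mtime)  # stand-in for EXIF date
    dest = archive / f"{taken:%Y}" / f"{taken:%Y-%m-%d}"
    dest.mkdir(parents=True, exist_ok=True)
    shutil.move(str(src), str(dest / src.name))
    sidecar = src.parent / (src.name + ".dop")
    if sidecar.exists():
        shutil.move(str(sidecar), str(dest / sidecar.name))
    return dest / src.name
```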

I hope this helps readers understand that there are very flexible ways of working with your photos without needing to rely on the database. I can easily find my photos based on my directory structure and, if required, my DAM of choice.

There are other things I do to make everything work flexibly but those details are not required for this discussion.


I suppose you proceed like this:

  1. On a computer, copy files from the camera to the external SSD
  2. Back up the external SSD to the backup volume
  3. Do your FRV, PM, customising etc., and another backup
  4. Finally, format the memory card in your camera

If people are unstructured and use the database as a spider in a web of scattered image folders, even across different storage locations as some do, they are asking for problems the day a major crash happens. So there is nothing wrong with a file-folder structure using one single top folder holding all the other session folders. Nothing wrong with a basic naming structure with descriptive names on folders either.

In fact, that can be a very good and simple start when indexing it all with a real DAM… and in fact, the indexing in PhotoLab is not bad at all. Just select the top folder and it will, for example, index all my images in that hierarchy and also create a flat list of my keywords from Photo Mechanic on the fly. If I wanted to migrate from PM Plus, this is one of the fastest and easiest ways I have seen.
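The “flat keyword list” idea can be illustrated with a crude scan of XMP sidecars for dc:subject entries. This regex sketch is my own simplification (a real tool would use a proper XMP/RDF parser), but it shows why a folder-plus-sidecar archive migrates so easily between tools:

```python
import re
from pathlib import Path

def collect_keywords(root: Path) -> set[str]:
    """Gather a flat keyword set from all .xmp sidecars under `root`.

    Crude regex scan of <rdf:li> items inside dc:subject; good enough
    for a sketch, not for production XMP parsing.
    """
    keywords: set[str] = set()
    for xmp in root.rglob("*.xmp"):
        text = xmp.read_text(errors="ignore")
        subject = re.search(r"<dc:subject>.*?</dc:subject>", text, re.S)
        if subject:
            keywords.update(re.findall(r"<rdf:li>([^<]+)</rdf:li>", subject.group(0)))
    return keywords
```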

DOP, XMP and the database are like the rings of a tree, and sooner or later they will leave users deeply confused. The .dop files are the rather old-fashioned legacy system. The database seems to be the way the manufacturers think we shall go.

The thing that is not working all that well yet is the handling of XMP. It’s better than Lightroom, which doesn’t write to XMP by default, but the solution is still far too immature. We have to be able to choose whether PhotoLab owns the XMP data or some other external software does. If we use other software for XMP maintenance, PhotoLab cannot be allowed to update that data too. We need a new section in “Preferences” to control this.

I keep my processed images on my internal 1 TB SSD. Your way is more flexible, and one day I might need to do the same. Don’t you lose some performance using an external SSD?

I have always organized my images after “sessions” and they are named like: Afghanistan 1972, Laos 1976, Afghanistan 1978, Kenya 1972, Kenya 2012 etc.

One good thing with that is that the folder names are indexed by PM Plus without my having to add any text in PM to begin with. I can reach them to add XMP data either via the path in PM or by searching. I can focus on Afghanistan 1972, or search for all Afghan images, without adding a single line of XMP. The creation date is standard data that I don’t have to add myself.

Almost! This is what I do:

  1. On a computer, copy files from the camera to the external SSD
  2. Do my FRV cull: I have viewed all photos, so no corruption!
  3. Format memory card
  4. PM, customising etc.
  5. Backup external SSD to backup volume

I have not noticed any performance issues, as the SSD is very fast and you only read RAW files, write sidecars, and export images.

My database is on the internal drive D: so no interference with image read/write operations.
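Since formatting the card here happens before the backup step, one way to reduce the risk is to verify the first copy with checksums before touching the card. A minimal sketch (paths and function names are placeholders; any real ingest tool with verification does the same thing more robustly):

```python
import hashlib
import shutil
from pathlib import Path

def sha256(path: Path) -> str:
    """Hash a file in 1 MiB chunks so large RAW files don't fill memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def copy_verified(card: Path, backup: Path) -> list[Path]:
    """Copy every file from `card` to `backup`, checksum-verifying each.

    Returns the files that failed verification; format the card only if
    this list is empty.
    """
    failures = []
    for src in sorted(p for p in card.rglob("*") if p.is_file()):
        dest = backup / src.relative_to(card)
        dest.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dest)
        if sha256(src) != sha256(dest):
            failures.append(src)
    return failures
```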

Your way of proceeding leaves you with only a single copy until the last step is done… I’d only format the memory card after the backup completes.

@platypus True, but I do not make money from my photos, as I am just a hobbyist. I have looked at all my photos and seen no corruption. RAW files are read-only, so the chance of loss is minimal, and I am prepared to take that chance.

Well, you can also backup corrupt files :wink:

Years ago I tried a 5:4 aspect ratio on a Nikon D7100. The JPEG previews looked good, but loading the files into a RAW converter showed a 1/3 magenta area in some images. This kind of camera-related issue happened more often than any SD, CF or XQD card failure, but then I never buy cards in “special offer: 50% off” situations. Lost images are simply not worth saving a couple of bucks.

Usually I download my images into my C1 archive, and before that into my DxO archive, and I know I can use a SanDisk or WD (I believe) repair tool to bring back freshly deleted files. I did so for two friends and was amazed at how well it worked.

Very important and difficult-to-understand things!
I suggest that DxO arrange a series of webinars on workflow procedures, indexing, projects and the like. Many users here need help with this, myself included.