Based on licensing changes from Capture One, just announced, I think DxO has a great opportunity to grab new customers.
It would be great if you could offer some goodwill discount for those willing to migrate.
Is that so? What kind of guarantee do you have that DxO doesn’t walk the same path in one or two years? And then still stick to their stupid “we’re the only ones able to create proper lens profiles” strategy? Wake up, @Waveuk: they all see the tons of money Adobe is raking in, and since they have no serious innovations to offer, why not become greedy as well?
Btw. Capture One’s DAM is far from excellent, but DxO’s is close to non-existent and insignificant. Too many holes to fill, and too big.
I don’t use Capture One. Do you know how long they have had a DAM?
I don’t remember exactly which version first offered sessions or a catalog (maybe 8), and I’m well aware that C1’s DAM is a half-baked, poorly thought-out piece of software. But the one from DxO is not even poorly thought-out… My problem was, I took it for granted that I would find another piece of software like Aperture, but no: everywhere only a maximum clutter of additional apps and workarounds.
Edit: If you’re not allergic to marketing babble like me, you can find various release notes here. In version 9 they were talking about “export as catalog” and “keyword libraries”, so I guess version 8 might not be completely off. 8-9 years ago, I’d say?
Thanks for the information.
With a grain of salt, please; I’m really not super-familiar with C1’s version history.
Edit: After browsing through old release notes, I think there has been a DB version of C1 since version 3. It’s a bit embarrassing that they could not improve their DAM into something able to search for content with a single full-text search within the last 15 years.
Fifteen years? I hope DxO will do better than that. They really only started serious work on their DAM a little over two years ago. They may just leave it as is, but they may surprise us by turning it into something at least usable if not feature rich.
PhotoLab v6 can manage metadata with the following tools:
Moreover, PhotoLab v6 can:
However, PhotoLab v6 does not:
While the first two groups allow some degree of metadata management, the third group lists features that I consider to be essential for robust asset management. As long as the third group features are absent, I do not consider PhotoLab to have serious asset management capabilities.
Capture One’s asset management capabilities are similar to DPL’s imo.
I’d not consider them to be better than PhotoLab’s though.
Pity. No, @platypus Capture One’s asset management capabilities are not similar to DPL’s, they are much more versatile. You don’t have to “consider them better”, it’s enough that I know they are. It starts with catalog backup, export of projects/groups/albums and import of catalogs, repair functions, more options to group links to images than PL’s lousy “projects”. Not to mention “intelligent albums” (=saved and dynamic search conditions). PL’s DAM is far from usable. At least, if workflow (instead of workaround) and reliability are of some importance.
Thanks for setting the record straight, @JoJu!
I’ve used Lr from version 1 on and I’ve never seen any database inconsistency so far.
C1 is probably a) as robust as LrC and b) ticking the boxes of the third block in my post above. It’s just that I’ve not really cared about C1’s DAM features so far.
Me neither, at least not as long as I could use Aperture. But now I have to. There have been two or three inconsistencies since I started with version 8. Version 13/20 was a happy crasher, often as spectacular as my cursing afterwards, and if the Danes ever find out how to program a working search field, they will for sure sell it as an apocalyptic innovation.
But hey, DxO PL also can’t search within projects, if I’m not completely mistaken? LR can (it’s running on my office PC). I’m rather happy that LR’s interface is rather fat and ugly, in short, Adobe-like, so I don’t have to fight the temptation to use it. But I guess you are familiar with it, and that plays a bigger role than any subjective judgement: “can I work with it?” And if the answer is yes, there’s nothing to debate.
Well, I used Lightroom from version 1.0 for many years but left it about 10 years ago, and I experienced a couple of database crashes. I got Lightroom for free after Adobe’s hostile takeover of the Danish company Pixmantec, maker of the then very popular RawShooter, which Adobe saw as dangerously fast-growing competition. The first years with Lightroom were terrible: banding when printing and very poor image quality, which dramatically improved with version 3.x, causing a need to redo a lot of earlier work. Even up to version 6.x, when I left LR, the metadata maintenance was so terribly ineffective that I found it useless. LR’s preview quality has also always been below par.
I looked at C1’s database implementation some months ago, and it has its upsides compared to PhotoLab. The most prominent one is that it’s migration-friendly, unlike PhotoLab, which can neither export nor import keyword lists. That is a must in a changing and scaling metadata world. The most irritating downside is the mandatory import process, and that goes for Lightroom too. … and C1 is even more inefficient than PhotoLab when it comes to metadata maintenance.
PhotoLab, on the other hand, integrates much better than both Lightroom and C1 with some third-party metadata tools like Photo Mechanic. In that case the lack of an import function in PhotoLab turns into a gigantic plus instead of something negative.
I agree with Joachim: I don’t think DxO will stay away from the user-taxation licensing model that has had Adobe posting all-time highs year after year since introducing it, with money just pouring into that company without it really needing to meet users with a matching offer.
That licensing model is a perfect choice for tired companies with no other ambition than skimming as much cream as they can with as little effort as possible. It might be that these converters are now beginning to reach the end of the road, and the vendors may see that it gets tougher and tougher to sell new versions by offering new and revolutionary features no one has seen before.
I also got the message from Capture One by mail:
I won’t step into the subscription model. I just upgraded to version 23 and I think I can live with that for many years to come without any problems, so in my case they won’t make any more upgrade money off me.
Some “third-party workaround fuzz” I read… How does Photo Mechanic then show the edits made in PL? NOT at all. Unless you export the RAWs and import the results into PM. What a waste of disk space. I prefer the previews and a working catalog without the need for endless keywording sessions, but to each his own way.
I’m thinking along the same lines. I have enough cameras and lenses covered by C1 by now, and all their iPad and cloud and styles blahblah is simply nothing I’m interested in. And given the “energy” and “intelligence” they have shown so far in making the DAM a well-rounded part of a powerful package with their RAW converter, I’m rather relaxed about not being surprised by any breakthroughs any time soon. I suspect these companies all want to wrap us into clouds and turn us into cows to milk.
Maybe I misunderstand you now. I’m talking metadata now. The RAW files are in the same place for both PM and PL. They both read the XMP; that’s it. There is no waste of disk space, unless you mean that PM creates smaller, more efficient JPEG previews of the RAWs. That is very well-used disk space, since it relieves PM of the performance problems PhotoLibrary has in version 6 and that have been seen in both Lightroom and C1.
I have just changed a disk and created a new index for a new database with more than 45,000 images. The disk space used in total for previews and the metadata indexes is about 19 GB, which is a piss in the Nile. A 4 TB disk cost about 100 USD; disk prices have dropped a lot in just half a year or so, at least in my country. One could have expected the opposite, given the inflation we now import with the dollar.
One of the huge advantages of using a more advanced tool for metadata editing, searching and culling is SPEED. I have two databases, one with 25,000 processed images and one with 45,000 unprocessed. In PM I can search over both of them at the same time, even more easily and flexibly than in an enterprise DAM like FotoWare, where you have to build “union indexes” in advance. In PM you just tick a box for each database (as many as you want) and search. I can scroll from top to bottom of the application’s contact-sheet view through 70,000 images in about one second, or the time it takes to grab the vertical slider and pull it from top to bottom, without even the slightest lag. Try that with PhotoLab or any other converter with a built-in PhotoLibrary/database.
So no, no problem at all. Everything happens automagically now. Just turn on “Synchronization” in PL 6. If both PM and PL are open simultaneously, a change committed in PM will immediately be replicated to PL. If PL isn’t open, the changes get replicated as soon as you open that image folder again. I don’t really care about seeing PL’s edits in PM, beyond refreshing the JPEG folders. That is not a problem.
In version 5 that was a manual process, unless it was the first time you opened that folder. Version 6 always syncs when the folder gets opened. The automation works beautifully now, so I don’t feel I need to think about it at all. Nothing stops manual IPTC updating in PL, though, which I have asked about, but as long as people are not doing updates in PL and don’t use hierarchical keywords, it works really very well. Re-exporting metadata as XMP to JPEG has worked fine since version 4 of PL.
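For what it’s worth, the sidecar-based synchronization described above boils down to a modification-time comparison. This is only a sketch of the general idea, not DxO’s or Camera Bits’ actual logic; the function name and the `last_synced` timestamp are my own invention:

```python
from pathlib import Path


def needs_reread(raw_file: Path, last_synced: float) -> bool:
    """Hypothetical sync check: re-read the .xmp sidecar only if it was
    modified after the catalog last synchronized this image."""
    sidecar = raw_file.with_suffix(".xmp")
    return sidecar.exists() and sidecar.stat().st_mtime > last_synced
```

A catalog that runs a check like this when a folder is opened picks up external keyword edits (from PM or any XMP-aware tool) without a manual import step.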
There is only one major thing to keep in mind, and that is to always do the metadata work before creating virtual copies, because PM can’t see those, and PL doesn’t propagate changes to them automatically yet. If you still have to copy metadata from the master to the virtual copy, it can be done pretty easily with Alt+Shift+C and Alt+Shift+V. Copying and pasting metadata in PL is pretty OK, I think, since you can paste just the fields you want if you don’t want them all.
I don’t do “endless” metadata or keywording sessions on images from one photo session. I do it once and for all, initially. If I need to batch a few new keywords onto a certain batch of images, it’s very easy, since the templates used for that can be rigged to append keywords after the ones already present on those images.
Tell me more about… I’m not aware of any performance problems of C1 23. Or you mean “have been seen around 2017”?
Well, it will look like that. 19 GB ÷ 45,000 ≈ 422 kB/picture (including all database stuff) doesn’t sound as if you get full 5K resolution. My catalog has 33,500 pictures and 232 GB (including 17.5 MP previews at 4 MB plus 22 kB thumbnails), and I can take it with me and work on it without the originals (which bring another 1.74 TB to the table). With PM alone you can’t work on or edit any picture, you can only tag them. To work on the pictures you need to take them with you.
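Making the arithmetic behind the two numbers explicit (using the figures from the posts above and assuming decimal units, 1 GB = 10⁹ bytes):

```python
# Per-image catalog overhead for the two setups being compared.
pm_total_bytes = 19e9           # Stenis: previews + metadata indexes
pm_images = 45_000
pm_per_image_kb = pm_total_bytes / pm_images / 1e3   # light index previews

c1_total_bytes = 232e9          # JoJu: catalog incl. 17.5 MP previews
c1_images = 33_500
c1_per_image_mb = c1_total_bytes / c1_images / 1e6   # full-size previews

print(round(pm_per_image_kb), round(c1_per_image_mb, 1))  # prints: 422 6.9
```

So the two catalogs trade roughly 422 kB per image against roughly 6.9 MB per image, which is exactly the light-preview-versus-editable-preview trade-off the two posters are arguing about.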
And I can scroll through 33.5k images in half a second. I believe neither you nor I used a precise stopwatch? I thought so. So it’s a bit of guesswork. I tried a screen recording, but it has a low frame rate, so I put a camera with a mechanical shutter next to it to get an idea what a ¼ s sounds like, and I think it’s somewhere between ½ and ¼ s. But that doesn’t matter: the moment I choose an image I can work on it, without any “send to PL” command and switching from PM to PL in between. I tried, and C1 is as fast as or faster than PM. Now, what do we do with this useless information?
My daily question is less “how fast can I scroll through 70,000 pictures in one go?” and more “how fast are they ready to be edited?”. I’m no big fan of metadata tagging. For private use it’s a time-consuming luxury which I’m simply too lazy to perform; I definitely enjoy the work on images more. And I orient myself in albums, groups and projects; maybe I use C1’s filters. “Yes, but what if you have to find a certain picture?” Then I usually know where to look for it without searching for tags and narrowing down the search more and more until I realize the butterfly’s Latin name has a typo.
Also, there’s this neat “Excire” app which claims to tag images using AI. And comes up with dolphins and orcas. On a tiny freshwater river… well, some mean ducks clearly have the character of orcas. But occasionally it hits a bull’s-eye, and it can find more faces of the same person than I could, without a single keyword. Magic. If it works… It can also find similar pictures, which is not helpful if I’m looking for the only pink-elephant picture in a bunch of typical grey ones.
I use metadata tags when I really need to identify very similar pictures, maybe for some kind of machinery, documentation, or test shots (with lenses that don’t transfer EXIF), but for my kind of shooting a working catalog structure is far better and less effort. You organize your images by keywords; I prefer something with a visible structure.
[quote=“Stenis, post:16, topic:30151”]
that have been seen both in Lightroom and C1.
[/quote]

[quote=“JoJu”]
Tell me more about… I’m not aware of any performance problems of C1 23. Or you mean “have been seen around 2017”?
[/quote]
All my converters with integrated image libraries / databases have had speed issues
Historical experiences, and present in all three of my converters. C1 23 seems to be fine, but quite a few people were at first disappointed with the performance when upgrading to version 21. That was a big and very important upgrade that initially got a bad reputation because people were complaining about the “speed”.
Lightroom has had performance problems from time to time, and I have experienced them myself too. There have been all sorts of recommendations, from changing the settings and lowering the preview quality to splitting the database by year. The latter is not all that appreciated either, because Lightroom has never been able to have more than one catalog active at a time, and then the whole idea of one image library implodes.
Even PhotoLab shows a sort of rendering problem from time to time in version 6.0.1 that they might have fixed in version 6.1, and even in version 5 we had a rendering/scaling/zooming problem. DxO tried to handle the preview rendering and scaling problems by forcing users to zoom into the image to activate full 1:1 rendering.
… and that is because of file system limitations and the cost of “on the fly” preview rendering.
With the compromises that integrated image libraries and catalogs impose on the user, the question is not IF we have to pay a rendering-time cost every time this rendering work takes place, but WHEN it will happen and how much it will affect you.
[quote=“Stenis, post:16, topic:30151”]
The diskspace used totally for previews and the indexes of the metadata is about 19 GB which is a piss on the Nile.
[/quote]

[quote=“JoJu”]
Well, and it will look like that. 19 GB ÷ 45,000 ≈ 422 kB/picture (including all database stuff) doesn’t sound as if you get 5K full resolution. My catalog has 33,500 pictures, 232 GB (including 17.5 MP previews at 4 MB and 22 kB thumbnails) and I can take it with me and work on it without the originals (which bring another 1.74 TB to the table). With PM alone you can’t work on / edit any picture, you can only tag them. To work on the pictures you need to take them with you.
[/quote]
Who says people should work with PM alone? In my case it’s integrated with PhotoLab. The people using PM also have PhotoLab installed, but the master files are in exactly the same place for both PhotoLab and PM. There is no redundancy there. Nothing stops you from taking your images with you even when using PM in the workflow. It’s just a matter of system configuration.
The whole idea behind speed-centered solutions like Photo Mechanic Plus is the indexing and the light previews. I would never want a full 4K preview when applying metadata to my images or culling them, and I’m sure nobody else wants that either. The 1:1 4K previews I only want to see in the developing part of my applications, whether that is PhotoLab’s “Customize” tab or Capture One.
[quote=“JoJu”]
My daily process is less “how fast can I scroll through 70.000 pictures? In one go?” but more “how fast are they ready to be edited?”. I’m no big fan of metadata tagging. For private use it’s time consuming luxury, which I’m simply too lazy to perform, I definitely enjoy more the work on images. And I orient myself in albums, groups and projects, maybe I use C1’s filters. “Yes, but what if you have to find a certain picture” - then I usually know where to look for it without searching for tags and narrow down the search more and more until I realize the butterfly’s latin name has a typo.
[/quote]
We have the images we have, and without any order an image collection is not an image library, just a totally useless mess. Not even the people at the City Museum of Stockholm managed to handle their images efficiently in their job before they got a DAM. Many, many times they did the same work over again, processing the same images for some new customer, because they had forgotten where they had put those goddamned images they knew very well they had already processed. I found the same images in many places when I made the initial data conversion for the digital City Museum of Stockholm, and I often did so when scrolling through all the images via the indexed views.
I have also learned a lot about my own images by looking at other people’s images on the Internet, things I have refined my own metadata with.
I must say adding metadata to my images becomes less of a pain the more of it I have done, since I know it will give me great joy when my memories are fading. I’m also doing it for my grandchildren. You need a good purpose to get over the crest, but after that the value of all your previous metadata work increases with every new piece of data you add.
Here are some of the limitations, part 1:
PhotoLab does not manage to display any image at all when I open the top folder of one of my two indexes, “Bilder Behandlade”; that top folder is the basis of a PM index.
Here are some of the limitations, part 2:
Capture One does not manage to display any image at all either when I open the top folder of one of my two indexes, “Bilder Behandlade”.
Sure, if I want a catalog approach to the underlying folder structure, PM also has a “Browse” mode, BUT the big difference is that it is NOT opening the folders in the file system. Instead it goes to its blazingly fast index over the same file system structure.
… and you are not the only one dreaming of smart AI-driven keyword tagging, but that is no real option for me or for people using controlled vocabularies. Two systems, one of them without any control, would in the worst case most likely just create a total mess. How much time did we save then, net?
An example of browsing a file structure in Photo Mechanic: it uses not the slow real folder structure but a much faster indexed, mirrored structure, and when we select some images in PM and send them with the Edit command to be opened in PhotoLab, not all images in the folder are opened, just the ones in the selection. That is a far smarter way than rendering new 4K previews every time for all images in any folder you open in PhotoLab. This got even worse with the “frozen images” problem in the first version of PL 6 (6.0.1, I think).
@Stenis, a word of advice first: don’t take my reply too seriously.
I’m sorry, but your post is a mess when it comes to proper quoting. I’m also not the best at HTML, but it’s hard to see where my quoted post stops and your reply starts. That makes your statements about DAM and so on appear weaker than they are. Form also matters…
I do make a big distinction between collaborative work on a database and single-user work in something like C1, which is not a well-made, well-thought-out DAM, just a bit better than PL. Heck, none of these hobby hackers understands how to program a full-text search into their so-called catalogs. It’s embarrassing.
So, for a museum and its basic tasks it is essential to bring its assets into a well-thought-out structure, so that with a bit of training everybody can find specific items. There’s no point whatsoever in bringing up your Stockholm museum over and over, but maybe I’ll catch myself in recursive posts too when I get older. For now, I work on a daily basis with a content management system to create, edit, translate, publish and lay out manuals and technical documents with a very high percentage of reuse of information, graphics and data. We are tagging! Even more so, we need to make sure we find and connect the correct information into documents of up to 600 pages, so please don’t talk to me about a bit of image research.
You are just confusing a multi-user situation with a single-user situation. Nobody knows more or better about my pictures than I do, and I’ll keep it that way. If you want to make your heirs happy, I hope they’ll appreciate it, and also use some system (of their own) to find images, but you can’t be too sure how long PM will exist and how well other apps can read your tags. Only ten years ago I wouldn’t have thought Apple would give up Aperture… Oh, and by the way: how do your relatives treat their own pictures? Putting them in the cloud, and what else? OK, what do I know…
I don’t question your choice of PM, but to work with such a super-ugly app I’d want to get paid. Or drowned in chocolate. No, actually, I don’t want that, I’d just refuse; it looks worse than Windows on a bad day.
What you’re calling “index” appears to be a folder, no?
Pardon, Joachim, but the forum app’s quoting doesn’t do what I expect it to, even though as far as I can see I’m not doing anything wrong. I don’t know why it’s like this.
I do too. It’s a big difference, organizing the work in a single-user environment versus in enterprise systems… but I don’t agree that searching doesn’t work in PhotoLab. It does for me, and surprisingly well; it even reports where it finds the matches, which is a nice bonus. I don’t miss anything, because I don’t use structured keywords.
I think it works very well for my needs. It’s just that once they have indexed my images (and they have), it ought to display all images in the database, but it doesn’t behave the way I expected. My top folder “Bilder Behandlade” (meaning processed images) is the folder holding my entire folder hierarchy. If I select it I get nothing, since it contains only other folders.
The same happens if I select “Barbados 2013”, which also lacks image files but contains a few other folders. I get zero hits! In PM this selection would have displayed 223 RAW files and 223 JPEGs, a total of 446!
But if I select one of the folders inside the main folder, like “Barbados 2013” or “Barbados 2013_4K”, I will in this case get 223 images.
Since I know the images have been indexed with exactly the same technique of pointing out the top folder that contains it all, I don’t understand why it’s not possible to see everything beneath a top folder, or a “main folder” lacking images but holding, for example, different derivatives of the files in subfolders, like we can in PM.
Look at how differently this works in PM. Here I see everything in an index if I select the top folder. If I had selected the main folder “Barbados 2013”, I would have got 466 images: 233 RAW in the subfolder “Barbados 2013” and 233 JPEG in the “4K” folder.
If I use the Search function instead and use the default search “X”, I get 69,990 images if both my catalogs are ticked, and 24,498 if I just tick my “Bilder Behandlade” before searching for all.
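The single full-text search JoJu misses earlier in the thread can be illustrated with a toy version: one query string matched against every metadata field of every indexed record. This is a naive sketch with a made-up record layout, not how PM, C1 or PL actually implement their search:

```python
def full_text_search(records, query):
    """Naive full-text search: return files whose metadata contains
    the query string in ANY field (case-insensitive)."""
    q = query.lower()
    return [r["file"] for r in records
            if any(q in str(v).lower() for v in r.values())]


# A tiny pretend index of image metadata records.
index = [
    {"file": "IMG_0001.raf", "keywords": "orca; dolphin", "caption": "river"},
    {"file": "IMG_0002.raf", "keywords": "duck", "caption": "mean ducks"},
]
print(full_text_search(index, "orca"))   # ['IMG_0001.raf']
```

A real implementation would of course use an inverted index rather than scanning every record, but the point stands: one search box over all fields, instead of per-field filters.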
No, the Stockholm museum, or any other museum that knows how to use DAM tech to its full potential, is not about a well-thought-out physical structure like catalog structures on disks anymore. Frankly, it’s the total opposite, as in most other larger DAM systems today. A physical order is not really necessary at all, since everything is handled virtually according to the metadata markup on the file objects. There is dedicated process metadata used for that.
No, I’m not confusing multi-user and single-user usage at all, and neither I nor anybody else will be dependent on PM Plus itself, as long as XMP is there as a standard and supported by other vendors of image DAM systems. As long as we have our files, we can always index them with any other XMP- and IPTC-compliant image DAM.
I have already migrated all my images’ metadata in my image structure from PM to PhotoLab, simply by pointing PhotoLab’s indexing function at the top folder. It read all the XMP sidecar files for my RAWs and all the XMP in the JPEG files, and it created the keyword list in PL perfectly and automatically. I have nothing at all to complain about with that function.
A month ago I decided to test the Image Library in Capture One. The images are still in exactly the same place. I had always used C1 with the session model before, but now I found it very simple and effective to create a monolithic catalog for all my images by just linking them into this database. Even Capture One did a good job of creating a keyword list automatically by using the image metadata.
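The indexing step described here, walking a top folder, reading the XMP sidecars, and merging their keywords into one list, can be sketched roughly like this. It assumes keywords live in the usual `dc:subject` bag of an XMP sidecar; the function name is mine, not PhotoLab’s:

```python
import os
import xml.etree.ElementTree as ET

RDF = "{http://www.w3.org/1999/02/22-rdf-syntax-ns#}"
DC = "{http://purl.org/dc/elements/1.1/}"


def collect_keywords(top_folder):
    """Walk a photo tree and merge every dc:subject keyword found
    in .xmp sidecar files into one sorted keyword list."""
    keywords = set()
    for dirpath, _dirs, files in os.walk(top_folder):
        for name in files:
            if not name.lower().endswith(".xmp"):
                continue
            try:
                root = ET.parse(os.path.join(dirpath, name)).getroot()
            except ET.ParseError:
                continue  # skip malformed sidecars rather than abort
            subject = root.find(f".//{DC}subject")
            if subject is not None:
                keywords.update(li.text for li in subject.iter(f"{RDF}li")
                                if li.text)
    return sorted(keywords)
```

Embedded XMP in JPEGs needs an extra extraction step that this sketch skips, but for sidecar-based RAW workflows this is essentially all a keyword-list import has to do.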
The only problem I have had was that I lost the keyword list in PM when my old computer crashed,
but since C1 has a function for exporting and importing keyword lists, that was possible to solve. I still think it’s a must that PhotoLab gets such a function too.
If Aperture had had a proper migration path and your images had been possible to tag with XMP, even those images could have been migrated; but proprietary applications without support for IPTC/XMP are, and will remain, a dead end.
Cloud storage is nothing for me, since I have too many images. I have tried it with my backup solution, EaseUS, and I wouldn’t like to try to restore image libraries over the net from cloud storage.
What my relatives do with their mobile images they have to decide for themselves. All I can do is try to give them some advice if they ask for it.
I don’t question any person’s choice of metadata tools whatsoever, or whether they want to use metadata at all, and PM is definitely not a sensible choice for most people, not necessarily because it is “ugly” but because the threshold is too high. From my own experience with Lightroom, Capture One and PhotoLab, I have found that blunt and ineffective tools will eventually make you give up tagging altogether. Tools for applying metadata have to be very effective, otherwise the job won’t get done after a while, and you have to have good reasons to endure it over time. One for me is that my “picture-illustrated stories” will be easier for others to find on the Internet, and that ought to be interesting for most reasonably serious photographers. Another is to help me remember all the things I have done and seen during 50 years of photography and 15 years of backpacking.
And there is no need to stray from the subject, Joachim, or to get unnecessarily condescending. Age is really not a particularly useful variable when discussing this matter, and if you don’t want me to take your posts seriously in general, then I don’t know if there is much point in commenting on them at all. What’s the point of having this discussion then?
… and the limitations I’m talking about do not come just from me, even though I normally use sessions, because I mainly use C1 for tethering and more demanding retouching. An early reason was also the performance and stability issues C1 had. As you can also see, I deleted the C1 catalog, since I decided to continue using PhotoLab as my main converter; it integrates far better with PM than C1 does. … and I will always use the folder approach when post-processing, because I always keep the images from a single session in the same folder and the derivatives in subfolders. For that reason it’s essential that the application can show all images downstream of a specific folder, regardless of whether it happens to be a top folder or a main session folder with subfolders. Sometimes that is the most convenient way to handle the images. I use it a lot in PM, since it gives me a natural way to see and tag both the RAWs and the corresponding JPEGs with specific common details after I have done most of the core batching. Sorting them by capture time gives me that possibility.
So sometimes the search tools are optimal, and sometimes a session-based folder approach is. Nothing strange about that, I think, and I don’t think I’m very different from most users when it comes to that kind of workflow.
And the XMP index I am talking about is an index, and definitely not a folder, but it can manifest itself both as the backbone of the search functions and as a virtual folder structure (in PM’s Browse function) that has nothing at all to do with the physical folder structure on the disk once the index is built. That is one of the reasons it can be so much faster in many situations than scanning the physical folder structure, like PhotoLab does when opening a physical folder. The other part is the lightweight pre-rendered previews.
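The difference described here, a one-time index versus rescanning the physical folders on every open, can be sketched as follows. This is a toy model (the real PM index also holds previews and metadata, which these few lines obviously don’t); it also illustrates the “show everything downstream of a top folder” behaviour discussed earlier:

```python
import os

# Extensions treated as images in this sketch (an arbitrary selection).
IMAGE_EXTS = (".jpg", ".jpeg", ".dng", ".nef", ".cr3", ".raf")


def build_index(top_folder):
    """One-time walk over the physical tree; afterwards every
    'folder open' is a dictionary lookup, not a disk scan."""
    index = {}
    for dirpath, _dirs, files in os.walk(top_folder):
        index[dirpath] = sorted(f for f in files
                                if f.lower().endswith(IMAGE_EXTS))
    return index


def browse(index, folder, recursive=False):
    """Virtual-folder view: list images in a folder, optionally
    including everything downstream of it."""
    if not recursive:
        return index.get(folder, [])
    return [f for d, files in index.items()
            if d == folder or d.startswith(folder + os.sep)
            for f in files]
```

With `recursive=True`, selecting a top folder like “Bilder Behandlade” returns all images beneath it, including those in derivative subfolders, which is exactly what the complaint about PL and C1 top-folder selection is about.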