I just can't stand using Windows as an OS anymore - Microsoft has gone too far - where is the PhotoLab for Linux?

No, the reason Windows still dominates the corporate world today is the APPLICATIONS and NOT the OS itself. It started with MS-DOS applications that were gradually replaced by Windows applications in a very soft and fuzzy transition in the early 1990s. Under Windows, almost all MS-DOS applications could coexist side by side with Windows applications, at least since version 2.21, which is the earliest I have personal experience of. Version 1.0 or 1.03, the first version I ever saw, was more limited. Windows was a perfectly natural but technically "ugly" extension of MS-DOS.

Microsoft’s DOS lineage began with 86-DOS, created by Seattle Computer Products in 1980. Microsoft acquired the rights in July 1981, refined it, and provided it to IBM as PC-DOS 1.0, released together with the IBM PC on August 12, 1981. When Microsoft started offering its own version to other PC makers, it adopted the name MS-DOS, with the first broadly distributed release being MS-DOS 1.25 in 1982. That unassuming command line became the technological bedrock for the PC era. That is the beginning of Microsoft as a “platform” company. What secured Microsoft’s success was that every PC was bundled with MS-DOS. That was a monopoly right from the start.

A few years later, Microsoft introduced its graphical future: Windows 1.0, released on November 20, 1985. It wasn’t a full operating system, more a graphical shell resting on top of MS-DOS. Windows 1.0 didn’t allow overlapping windows, giving it a tiled, experimental feel, but it provided the conceptual jump that shaped everything that followed—Windows 2.x, the breakthrough with Windows 3.0 in 1990, and eventually the sprawling Windows ecosystem we recognize today.

These two releases—MS-DOS in the early 1980s and Windows in 1985—form the twin origins of the modern PC landscape, small steps that became an entire digital world.

So the origin of this dominance is in fact some 20 years older than what happened in the early 2000s.

What might have started the “PC” desktop revolution was the birth of the spreadsheet revolution with the VisiCalc application in 1979. It was made initially for the Apple II, not for MS-DOS and the IBM PC. Corporate accountants in particular had grown increasingly tired of depending on the IBM mainframe IT staff in their white coats to get data for business analyses and longed for a freer solution, and exactly that was what VisiCalc promised. But after only a couple of years, VisiCalc was marginalized by MS-DOS alternatives.

The original Apple Macintosh — the very first version — was released on January 24, 1984.

The glory days of the Apple II and VisiCalc were short. The later totally dominant PC spreadsheet app Lotus 1-2-3 arrived in the world in January 1983. It landed on the IBM PC like a meteor and instantly became the machine’s “killer app,” the thing that made businesses say: we need one of those beige boxes, now.

If you’re tracing the evolution of productivity software, this release sits right next to VisiCalc (1979) and precedes the great migration into the land of Windows and Excel in the 1990s—each step a little monument to human impatience with arithmetic. Excel was the first killer app for Windows.

Ah, nostalgia! I did use Lotus in the late 80s, but not for long; SuperCalc was the main spreadsheet application where I worked, until Excel took over.


Lotus screwed themselves completely when Windows 3.0 was released with so-called DOS protected-mode memory management that monopolized memory control to Windows. Lotus, who thought they owned the PC world, had just released a version of 1-2-3 with its own memory management scheme based on VCPI. The result was a severe memory conflict that made 1-2-3 totally incompatible with Windows: in order to use Lotus 1-2-3 you were forced to start it without Windows. There the Lotus decline started, because that convinced many to migrate to MS Excel, and Microsoft gladly helped them with a feature that converted Lotus macros to Excel macros. People preferred one unified environment from which to start programs, print, and handle their files.

TL;DR but the very thing that started this thread was… that applications gravitate towards the dominant OS. :man_shrugging:


It is sometimes not all that simple to mix Windows and Mac in an “international” corporate environment. I have seen at least two reasons to avoid mixed environments in the real world.

  • The first is that a mixed environment gets way more complicated and way more expensive to maintain. And it is not all about two different systems demanding different skills. The Mac world has never been relevant as a server software alternative. Windows, on the other hand, uses practically the same software core for laptops, workstations and servers, and with Windows 8 (heaven help us) they even tried to turn the interface into a mirror of their Windows Phone interface, before they woke up. The fact that it is practically the same basic user interface from laptops to big multiprocessor datacenter servers can’t be overestimated. That dramatically reduces the learning curve throughout an organization.

  • The other thing, which I know about from hard personal experience but which most Anglo-Saxons are not aware of at all, since they rarely use languages other than English, is the character encoding problem.

In the 1990s, most of my work at the IT product distributors Esselte Datasoft and Scribona (the biggest distributor in the Nordic countries from the nineties until 2009, when it was bought by the American software distribution giant TechData) was to help almost all of the era’s biggest US software companies sort out these character problems.

The problem was that the rest of the world used different 8-bit ASCII extensions. In practice, MS-DOS had totally country-specific extended character sets (code pages), Windows relied on Windows ANSI, and many Unix systems used ISO 8859-1. On top of that, EBCDIC text from IBM mainframes poured into Windows through the Windows applications used to access those mainframes in many corporations. I helped Crosstalk and IBM with that for purely personal reasons, since I had problems using IBM software to replicate mainframe data to our Windows data warehouse.

Opening a text file from the “wrong” world often produced digital surrealism: stray symbols, mysterious boxes, or a German ü that had shapeshifted into a mathematical oddity. It was a total mess, and I won’t even go into the printing problems we had.
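That mismatch is easy to reproduce today. A minimal Python sketch (the sample word “Grüße” is just my own illustration): the same German text pushed through the wrong code page shows exactly this shapeshifting, with Windows ANSI bytes read as DOS code page 850 turning ü into a superscript ³.

```python
# The classic 8-bit mismatch: the same bytes, two different code pages.
text = "Grüße"

# Saved under Windows ANSI (cp1252), then read as if it were DOS cp850:
win_bytes = text.encode("cp1252")          # b'Gr\xfc\xdfe'
print(win_bytes.decode("cp850"))           # 'Gr³▀e' - the ü is now a superscript 3

# The reverse trip: saved under DOS cp850, then read as Windows ANSI.
# In cp850 the ü is byte 0x81, which is undefined in cp1252.
dos_bytes = text.encode("cp850")
print(dos_bytes.decode("cp1252", errors="replace"))   # 'Gr�áe'
```

Python’s `cp500`/`cp037` codecs can stage the same demonstration for EBCDIC text coming off a mainframe.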

In my final job I was working at the City Museum of Stockholm. Stockholm had standardized its 30,000-plus computers on Windows for a reason, BUT two or three of the museum’s photographers insisted on using Macs.

Result: when they sent their pictures into the museum’s digital asset management system, FotoWare, they caused the system to stop because it could not handle some of the Mac characters.

That system is still up and running, and the problems still exist in many legacy systems that don’t use the more modern Unicode encoding (UTF-8).

So in an international enterprise world, characters still matter and are still able to fuck up the days of the people trying to manage international corporate IT systems.

UTF-8 is only a panacea if everything is using it. We’re still, I think, decades away from that time.

But even with UTF-8, Microsoft will still ruin your day by turning straight quotes into “smart” quotes and other such “helpful” substitutions.


You do know that in the MS Office applications you can turn off those sorts of auto-changes?


I can turn them off in my installation. The problem comes when other people use Word as a text input tool then copy that text into a server-based tool. Arguably, the server software should sanitise the input, but age and cost means many, even most, don’t.
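Sanitising that input on the server side is only a few lines. This is a hypothetical Python sketch (the `sanitize` name and the mapping are my own, not any particular product’s implementation) covering the most common Word substitutions:

```python
# Map common "smart" substitutions back to plain ASCII.
SMART_MAP = str.maketrans({
    "\u201c": '"', "\u201d": '"',   # curly double quotes
    "\u2018": "'", "\u2019": "'",   # curly single quotes / apostrophe
    "\u2013": "-", "\u2014": "--",  # en dash, em dash
    "\u2026": "...",                # horizontal ellipsis
    "\u00a0": " ",                  # non-breaking space
})

def sanitize(text: str) -> str:
    """Replace typographic characters with plain-ASCII equivalents."""
    return text.translate(SMART_MAP)

print(sanitize("\u201cSmart\u201d quotes don\u2019t break parsers"))
# "Smart" quotes don't break parsers
```

`str.translate` walks the string once, so even a large pasted document costs essentially nothing to clean.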


I really appreciate the fact that some are trying to explain that a VM or a dual boot will do the trick… to people who express a desire to completely ditch Windows. Especially since they’re talking about abandoning it because Windows 11 (the current version, maintained with security and bug fixes) is getting worse with each update “they” release.

Furthermore, Proxmox uses its own kernel, so you’re recommending breaking the Linux environment of the person you’re suggesting this to. Unless you’re seriously suggesting investing in dedicated Proxmox hardware for a single VM for a single software?

I could also write a long piece about my professional past with Atari, DOS, Windows, blah blah, which wouldn’t really add much, even if it does bring back pleasant memories: obviously, we’ll never see DXO products if the decision to develop or not is based on market share. It’s wrong, however, to say that this criterion applies to every company: in that case, there would be no professional solutions in the GNU/Linux world, since a few years ago the desktop market share was even smaller. Yet there are (many) such solutions, and some are distributed for free (which I’m not asking for here regarding DXO, please don’t digress)… and some cover cutting-edge areas.

Before it’s a question of market share, it’s a question of mentality and therefore of will in the truest sense. To be frank, if I had read this forum before acquiring a PL license, I wouldn’t have acquired one. Because by browsing through various threads on different topics, one thing becomes clear: DXO has its own vision (which is perfectly normal), a bit too dogmatic in justifying certain choices (that’s starting to smell fishy), supported by a segment of users who act like fans, forming a bulwark against it.
The sum of all this is toxic for anyone who comes here and deviates from the established path.

Personally, I’m starting to consider completely changing software. To begin with, because I have no use for a license for 2 activations when I would only need one by switching from Windows to Linux… without the possibility of having PL on it.

Some say that DXO is a small team with limited resources. I respect that!
I also have to apply that to myself: a team of one, with limited resources. If I can save the cost of a license used at 50%, then I have to. Because the initial intention (remember: getting rid of Windows) won’t change, and the requests from some people here won’t come to fruition either.

Regards


See this post I made recently:

… or Google, when passing data between Microsoft Office applications and Google’s apps, which I often have to do for various reasons - applications that are only semi-compatible.

I don’t know if it is all that wise to add more layers and system resource demands to Windows and PhotoLab, when many of these systems struggle to handle the AI masking tools as it is.

Already in the nineties: at my job at a Swedish software distributor we got a new guy in our support team. He was asked to introduce himself and said: “I hate everything Microsoft makes” (he was a fan of the Apple II - not the Macintosh, but the Apple II). It was a little strange that he had passed the “tests”, since Microsoft was our biggest software vendor even then. But he turned out to be a very good technician in the long run, despite those attitudes. :slight_smile:


You brought up some very good points. But…

As a computer service technician myself, even after about 10 years I need help from the ever-present internet search engine to solve some issues that come up from time to time. Nothing different here from Windows.

Linux for users basically consists of two main branches: Debian (Ubuntu, Mint, etc.) and Red Hat.


Notable exception: DaVinci Resolve. Apparently VFX studios have relatively high Linux usage, so supporting Linux makes sense for them.

Which is exactly the same argument, I think, for why none of the RAW editor vendors do. A VFX studio has WAY bigger demands of their systems for which the (relative) pain of Linux may be justified.

I recently had my late father’s old 8mm movie films digitised. The company (really one guy) who did it has a Linux-based system, but that’s only because he built a bespoke software and hardware solution to get the best possible quality, rather than just buy the same gear/software that everyone else does. He even contacted one of the open source software projects and worked with them to customise some aspects for his operation.

It makes total sense to me why he went with Linux. He’s doing stuff no-one else is and he needed a very high level of control. Which is not where you find DxO’s customers. Even professional photographers probably just want something that works consistently.


Perhaps that is precisely the reason: the perspective on the systems is simply different.

Yeah, I fully agree with the logic.

Would still love to see a Linux version myself, if only because while I use macOS (and recently got a newish Mac, investing back in that ecosystem for a few years), I also like Linux and would love to have an escape hatch from whatever monopolistic nonsense Microsoft and Apple tend to do. But it does look very unlikely from DxO for both market share and technical debt reasons. I guess I’ll have to be on the lookout for other solutions, maybe something less bad than DarkTable will pop up in the next few years ^^

In my opinion, there have only been two significant reasons to use Linux as a professional platform: maximum customization options for systems with a high number of server processes (e.g. video systems) and Linux implementations designed to put price pressure on Microsoft in order to obtain cheaper licenses.

You forgot the insane amount of telemetry that Windows sends back, plus the way they keep trying to lock you in, the idiotic enforcement of TPM that stops perfectly fine older PCs from running Win11, etc.

MS is not doing anything for consumers. They have become greedier and more focused on stupid AI than ever. As much as I actually like using Win11 Pro, I hate everything that MS has been doing since WinXP.

I’m not sure if TPM was only pushed by Microsoft. Without the dependency on TPM, the market for PC manufacturers would have continued to decline because the systems are already powerful and there would have been significantly less demand for new PCs. The need for new hardware is determined by AI, i.e., mass storage, system memory, and GPU performance.

I think that telemetry and the use of the data collected have now found such widespread application (not only at Microsoft) that a return to the old ways would only be possible at the expense of greater inconvenience for users. But people love the comfortable life…