Linux: You can mostly stick to the GUI to install software, touch the terminal only for obscure command-line applications and GPU drivers, and you have a functioning system
Windows: Forced to go into regedit and services.msc to fix high resource usage on a fresh install, run debloat scripts to strip out the bloat, update the system, and scower the internet for drivers and all the software you need
I can see why I got fed up very fast trying to use Windows 11 in QEMU tbh…never trying that shitshow again…
Edit: the only packages I had to install through Bash are Neofetch, Htop, OpenSeeFace, Brave Browser, Wine, the Nvidia drivers and ProtonVPN. Linux is very user friendly imo
And keeping your software up to date is a giant pain.
Use Winget or Chocolatey. If you use an app that’s not packaged yet, it’s easy to package it yourself.
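For what it’s worth, keeping everything current is a one-liner with either tool (run from an elevated PowerShell prompt):

```shell
# winget: upgrade every package it knows about in one pass
winget upgrade --all

# Chocolatey equivalent
choco upgrade all -y
```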
But on Ubuntu I don’t have to use the terminal to update my apps?
Scour
7 packages from the command line isn’t that many, but you’re failing to account for the fact that to most Windows users, the amount they’ll realistically install is 0, both because they don’t know how to use the command line and because they don’t know what to install. See also: https://xkcd.com/2501/
I mean that only matters for people like us.
99.99% of the Windows user base doesn’t give the tiniest semblance of a shit about any of that. Hell, I still run Windows on my gaming PC and have never had cause to do any of that.
what if you wanted to show a presentation but windows said

That is on you for using a Home license and not a Pro license.
I’m going to be honest with you, as often as this has been memed and for as long as I have been using Windows on my work computer, I have never once been forced to restart on the spot by an automatic update.
I’m sure those who have will be quick to reply but at this point I’m 90% confident it’s a loud minority.
I’ve seen an entire factory shut down for hours because two critical Win10 computers tried and failed to update. It’s never an issue until it becomes one.
Plus a failed update is the whole reason I nuked my C: drive and switched to Manjaro (now running Arch, put down the pitchforks).
Well, running Windows 10, a consumer user-oriented operating system, to control mission-critical machines is mistake number 1.
This wouldn’t have happened if they had used Windows Server or something actually designed for that task (like Linux!).
Well, running Windows 10, a consumer user-oriented operating system
Huh… I wonder why there’s versions of it called “Enterprise” then. You might want to talk with Microsoft about their clear mistake. I mean clearly https://www.microsoft.com/en-us/microsoft-365/windows/windows-11-enterprise is in error since you’re correct right?
I work in IT. Windows 10/11 Enterprise is still a bad choice. If it’s a mission-critical system and you must choose Windows, pony up the cash for Windows Server.
The difference between Windows Enterprise and Pro/Home editions is that there are features on it that make my job easier, but it’s still the same shitty operating system under the hood. Windows Server is much more robust and reliable in my experience. Still shit, but slightly less so. It’s designed to run on machines with 24/7 uptime. Windows Enterprise still expects you to regularly restart it for updates and upgrades. That’s alright since we can just set Susan from Finance’s computer to update at 03:00. It’s not okay if that computer controls the entire factory.
I too work in IT… Just because I have some HR users that need to run Quickbooks, I don’t buy them Windows server 2022.
Or even just running the critical machines offline
Neither of those options was available. It was written by a third party for some old .NET Framework version, and the server and GUI components were written as a single application. Putting it on a server wasn’t an option either, because the application’s GUI was constantly used for the management of assembly machines, and other applications were used for monitoring and administrative stuff.
If you had been there, you’d know why this was a low-priority risk. That place was bleeding from a thousand wounds. At least this had some redundancy, for all it was worth in the end…
(edit) I actually contributed to that software, even though it’s not open-source! I managed to nail down an issue where loading a project file using one locale would result in a crash, but not in others. The .NET stack trace was printed to an XHR response’s payload, and I used that to locate a `float.ToString()` call where `CurrentCulture` was passed as the cultureInfo instead of `InvariantCulture`, so depending on the computer’s locale, it would try to parse CSV data either using a decimal dot or a decimal comma. I mailed this to the maintainer and the fix was released within the month.

Windows Server is an option.
The operating system is called “Windows Server”. It doesn’t necessarily have to run on a mainframe. It has the regular Windows GUI (with a few differences, the first you’ll notice is “Ctrl+Alt+Del to log in”) and can run regular Windows programs.
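Side note on the `float.ToString()` locale bug mentioned above: the same pitfall is easy to reproduce in any shell, since printf’s float formatting follows the locale’s decimal separator (the first line assumes a comma-decimal locale such as de_DE.UTF-8 is generated on the machine):

```shell
# Comma-decimal locale: prints "3,00" - a parser expecting a decimal dot chokes
LC_NUMERIC=de_DE.UTF-8 printf '%.2f\n' 3

# "Invariant culture" equivalent: the C locale always prints a decimal dot
LC_NUMERIC=C printf '%.2f\n' 3
```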
Ive not had “must update on the spot right this very second,” but ive had countless, “we will update the second u power off or attempt a restart. If you try and restart into ur linux partition, we will somehow ensure u fail to boot right up until u got thru with our forced update.” Which also sometimes goes hand in hand with, “oops, i was supposed to update, but i shit myself instead. Youre going to need to try again at least once or twice. Dont worry, whether the update goes thru or not, itll only take a maximum of 90 minutes.”
Windows can fuck its facehole thru its ass as far as its auto updates are concerned for all i care.
Use LTSC and this and never worry about Windows deleting shit in Home or updates breaking something, do it ONLY when YOU want to update.
Nearly 25 years a Linux desktop user, I only use Windows when, for example, helping out family or needing to do crap on Windows at work, that sort of thing. I’ve seen this so SO many times, especially when you want to shut down or reboot now, but WAIT! THERE IS MORE! Windows is updating without asking for the next 30 minutes, don’t shut down, screw your planned date. This must have happened more than 30% of the small number of times I touched Windows, and it taught me to stay the f away from that stuff. Don’t want to touch it with a 10-foot pole.
I am with you but it’s the wrong place to be discussing this.
It’s more like when you shut the laptop down, then turn it on only to be greeted with such a message. So, I also haven’t seen many of those back then, but only due to the unhealthy habit of maximizing uptime.
Yes, because even once is too many.
At a corporate job, I spent an hour and a half every morning waiting for Windows to update. Then my coworker handed me a Fedora DVD and I never looked back.
I’m saying it’s never happened to me. Not once. Zero times. Zero is less than one.
Normal Windows updates don’t take an hour long. Give me a break. The ones that do are the version upgrades. That’s like the equivalent of a distro upgrade.
Normal Windows updates don’t take an hour
Correct. But who can tell beforehand the difference between a normal update and an abnormal one? The problem is Windows tends to hide those details. I’ve sat on support calls where a server needs to be rebooted for some configuration change, and Windows insists on applying updates because hey, you’re rebooting anyway, so what if it takes half an hour to do a thing that should take 5 minutes…
Sure, your experience may be different.
That happened in 2013 with a random laptop they gave me. I kid you not, it took that long; could have been a bug somewhere in the OEM image, never cared enough to find out.
But my experience is just as real as much as yours.
Sure Windows gives you warning, but after a while it FORCES you to install, even if for whatever reason that new branch bricks your computer. I had a good 6 months of that where every time my computer got shut off, it would force the update and fail like 40 times before it finally let me revert and use my computer. There was no way to tell it to STOP UPDATING
deleted by creator
Open services.msc, stop the Windows Update service, and set its start type to Disabled. Then go to C:\Windows\SoftwareDistribution and delete the entire folder.
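The same procedure from an elevated PowerShell prompt, for anyone who prefers not to click through services.msc (a sketch of the steps above; disabling updates entirely is at your own risk):

```shell
# Stop the Windows Update service and keep it from starting again
Stop-Service wuauserv
Set-Service wuauserv -StartupType Disabled

# Delete the update cache
Remove-Item -Recurse -Force C:\Windows\SoftwareDistribution
```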
Winaero Tweaker makes it as simple as a checkbox.
I just spent the last 6 hours trying to get my home assistant VM to run on boot up because I’ve spent the last 6 months unable to get Linux to stop automatically rebooting for unattended upgrades.
I’m far from a power user but it shouldn’t be so fucking hard. It’s like 3 clicks to disable automatic upgrades/reboots in Windows.
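For the record, on Ubuntu/Debian the reboot behaviour lives in unattended-upgrades’ config, so you can keep automatic security updates while killing the automatic reboots, or switch the whole thing off (stock file paths from the unattended-upgrades package):

```shell
# Option 1: keep unattended upgrades but never auto-reboot - in
# /etc/apt/apt.conf.d/50unattended-upgrades set:
#   Unattended-Upgrade::Automatic-Reboot "false";

# Option 2: turn unattended upgrades off entirely
sudo dpkg-reconfigure -plow unattended-upgrades
sudo systemctl disable --now unattended-upgrades.service
```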
Unpopular opinion: The Windows Registry, a centralized, strongly typed key:value database for application settings, is actually superior to hundreds of individual dotfiles, each one written in its own janky customized DSL, with its own idea of where it should live in the file system, etc.
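The contrast is easy to see side by side (the app names and paths here are just illustrative):

```shell
# Registry: one query syntax works for any application's settings (PowerShell)
Get-ItemProperty -Path 'HKCU:\Software\SomeApp' -Name SomeSetting

# Dotfiles: every app picks its own location and its own format
cat ~/.gitconfig               # INI-ish
cat ~/.vimrc                   # vimscript
cat ~/.config/foo/config.toml  # TOML, if you're lucky
```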
deleted by creator
deleted by creator
To be real with you, I find windows more complicated, the syntax…
Ah, a man of culture I see 😌.
You right click the candy crush icon and press remove.
Whotf does the other two things?
100% agreed. Once you disable all the unnecessary stuff Windows comes with, you’ll be left with a stable system that is compatible with everything from professional to hobbyist software. Meanwhile, under Linux you’ll spend all your days getting a basic system to run properly (on some distros) or trying Wine, virtualization and other subpar hacks to get any kind of productivity and ability to cooperate with others. :)
Nah, Wine runs most XP-times software better, Windows 10+ sometimes not at all. Some people do hacks like running wine in WSL because of this.
There goes your compatibility. Not to mention that Windows is probably the most incompatible OS (to the rest) in wide use (filesystem, non-POSIX, drivers, etc.)
You’re full of shit. Wine still fails at basic Win32 API calls available since Windows 95, and most things that fail under modern Windows versions are usually GPU-related tasks like games, or the fact that you don’t have specific DirectX or DirectPlay features available on your modern OS and/or GPU, and for those cases there’s dgVoodoo2.
Do you really want to talk about compatibility? Just try to install MS Office 2003 and Photoshop 6 under Windows 11 - you’ll find that both will work just fine without hacks.
They have app specific “hacks” in place for old popular software. Like Winamp and the ones you mentioned.
This is highly dependent on what you do. If you do graphics or video editing, then you are right; if you do non-Windows-specific coding, it’s the exact opposite.
do you do graphics or video editing then you are right
Not just that, same for every advanced MS Office product, any other enterprise desktop MS application, architecture, a lot of engineers…
do you do non-Windows specific coding? Its the exact opposite.
Jetbrains is available for all platforms and runs equally well on all of them.
It’s not about the IDE lol. It’s about everything else. Windows does not conform to the standards everyone else does, probably deliberately.
Also: WTF is a QWORD? And why are we SHOUTING in C? Even “qword” would have been more sensible, but no, MS decided to shout in a primarily lowercase language because why not.
Ahaha, while you aren’t wrong, you’re a bit off as well. I know lots of developers doing their jobs under Windows, and they’d be really annoyed if forced to deal with Linux for their jobs. Not everything is low-level shenanigans, and to be fair, almost nothing is low-level shenanigans nowadays. Your run-of-the-mill average developer is doing stuff in web-related technologies and can get by just fine under Windows. After all, there’s always WSL and Docker, and Microsoft did a nice job with Windows Terminal.
You may find it hard to believe but Microsoft positioned themselves very well to take a large chunk of the current and future developer ecosystem, be it via WSL, nice tools, IDEs or whatever.
a stable system
Until next update, where they may just blacklist your CPU just because
Or an update where they delete all your files again.
2018: https://www.windowscentral.com/windows-10-october-2018-update-file-deletion-bug-story 2020: https://www.howtogeek.com/658194/windows-10s-new-update-is-deleting-peoples-files-again/
These both happened to me. I was already looking into switching to Linux in 2020, and that last update did it for me. I’ve only been using Linux at home ever since.
Just wait for the next version of insert-non-debian-linux-distro and you won’t be able to boot after trying an update. :)
I find it very interesting that I always see a lot of people complaining about those kinds of updates breaking Windows all the time. In my experience I’ve only seen it happening with old-ass, cheap hardware. Never had issues myself with mid-range hardware from reputable brands. If you have a computer from AliExpress or some Chinese brand, oh well, you get what you paid for.
This is some weird tribal emotional stuff.
I run both on the same machine in VMs. I’ve had this Fedora install since 2019 and kept up with the version upgrades every year. It’s just worked without issue during that whole period. I use Office in the web app. Windows is there to run CAD/CAM software. It feels more gross with all the Candy Crush etc. that you have to remove, but it works fine to run the software I need it for.
What’s with all the hyperbole above? Did someone hurt you?
I run both on the same machine in VMs
And now you suddenly have to manage two operating systems with all their quirks. Nice!
Going full Linux desktop kinda adds the same pains as going macOS, but 10x. Once you open the virtualization door your productivity suffers greatly, your CPU/RAM requirements are higher, and suddenly you have to deal with issues in two operating systems instead of just one. And… let’s face it, nothing with GPU acceleration will ever run decently unless big companies start fixing things - GPU passthrough and getting video back into the main system are a pain and add delays.
Why not just give in and manage A SINGLE yet productive OS that is widely supported by every vendor and tool you might need? To Microsoft’s credit, they made WSL and Windows Terminal very well, and it’s way easier to run the 1 or 2 Linux-only applications on those than the other way around.
Err, OK. I pass through a card to each and switch with a KVM. It’s like having 2 native machines. According to you I have loads of issues; I guess I just haven’t found them yet? What should I be giving in to?

This is really weird.

PS: I sometimes game on either system and still can’t tell any performance difference from when it was bare metal. I guess I could be super lucky, considering all those issues I should be having. Or maybe things aren’t quite as dramatic as you’ve portrayed them.
How are you getting your video back into the main system? Some kind of remote desktop protocol? That adds delay. Unless your VM is attached to a dedicated screen, you’ll have issues there.
The host is headless, no video output. The 2 VMs each have a GPU passed directly through via VFIO, so there is no additional delay. Both GPUs connect to the same 2 monitors and USB by a KVM, so it’s one button press to flip between systems. Though I often run the CAD software over RDP, as a little extra latency when using that doesn’t bother me.
insert-non-debian-linux-distro and you won’t be able to boot after trying an update. :)
Sure buddy, I’ll take your unsourced claim as an equivalent to my sourced one! /s
In my experience I’ve only seen it happening with old ass , cheap hardware computers.
Do you know what anecdotal evidence is?
deleted by creator
I won’t consider using dozens of random debloating scripts made by reverse engineering Windows binary files a stable experience.
The thing is that this isn’t correct.
Windows’ bloat/spyware can be disabled via group policy, and it works really well because it was designed to allow it. There are countless companies and government agencies that force Microsoft to have group policy settings to disable the “spyware”, otherwise they couldn’t use it.
Microsoft provides very detailed documentation into the bloat that you can follow to disable what you don’t want. https://learn.microsoft.com/en-us/windows/privacy/manage-connections-from-windows-operating-system-components-to-microsoft-services. Those “dozens of random debloating scripts” are usually just following that guide, not much else.
I’m not saying Windows is good, I’m just saying it delivers and for the hassle that it takes to run Windows-only software (that most people require) under Linux, most people might be better off by spending a quarter of that time debloating Windows.
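As a concrete example, a lot of those scripts ultimately poke the documented telemetry policy from that guide, which boils down to one registry value (elevated PowerShell; note that the value 0, “Security”, has historically only been honored on Enterprise/Education/LTSC editions):

```shell
# Create the policy key if it doesn't exist, then set the telemetry level to 0
New-Item -Path 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\DataCollection' -Force
Set-ItemProperty -Path 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\DataCollection' -Name AllowTelemetry -Value 0 -Type DWord
```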
Reserved bandwidth??
It’s used for updates. I’m not sure if it works all the time.
I think that it used to be called `superfetch` in the old days. https://answers.microsoft.com/en-us/windows/forum/all/superfetch-service-disable-helps-to-increase-speed/3c4d5b4b-edef-4eb7-9456-52fd304e606c

If you’re using an “unofficial” license, it’s probably normal to disable updates and related services.
I remember from years ago when I was modding Windows XP installations with nLite to try to purge all the unnecessary bits and install some useful stuff. Superfetch was this annoying service that supposedly ruined online gaming due to lag. :)
Prefetch and superfetch are just obnoxious services that waste disk space. You can safely disable them; there is no downside to not using prefetch or superfetch on modern SSDs. On regular spinning drives, yes, they did make loading programs a bit faster.
Superfetch was keeping an index of file relationships in RAM and pre-loading files you were probably going to use next. It didn’t ping your network at all, but it could easily eat up a ton of disk resources and RAM. It was really only an issue on old 5400rpm laptop HDDs from what I remember.
Might be thinking of windows search indexing.
Yes, disable Windows search indexing as well. No point in having that on an SSD, it’s pointless, it just wastes disk space.
https://www.makeuseof.com/windows-limit-reservable-bandwidth/
It’s not as scary as it sounds.
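Concretely, the “Limit reservable bandwidth” policy behind that article maps to a single registry value, so it can be changed even on editions without gpedit (elevated PowerShell; 0 means reserve nothing, while the implicit default reserve is 20%):

```shell
# QoS Packet Scheduler policy: percentage of bandwidth the system may reserve
New-Item -Path 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\Psched' -Force
Set-ItemProperty -Path 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\Psched' -Name NonBestEffortLimit -Value 0 -Type DWord
```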
It’s not, and in a vacuum I don’t think anyone would mind. It is the fact that it is concealed that is really shitty.
“It reserves bandwidth for high-priority tasks such as Windows Update over other tasks that compete for internet bandwidth, like streaming a movie”
As much as I’d like to keep my system up to date (and I really do), if I’m watching a movie then that is my priority. Any task I’m currently using the bandwidth on should be considered my system’s priority. This is akin to rebooting the computer when it determines it is necessary, with the user having little control to stop it; its intent isn’t malicious, and it is meant to protect the user, but all it achieves is upsetting the user and making us find ways around it or turn it off completely.
Some sort of hidden, concealed, clandestine internal QoS implementation in Windows. Reserving a portion of network bandwidth for high priority traffic sounds like a good concept, but I don’t like the fact that this is so hidden (I’ve been working with computers for many years and I’ve never heard of it until now), and that the mechanism to determine the priority of a packet is unknown.
We know Windows’ spyware traffic has top priority.
I love shitting on Windows as much as anyone, but that is a completely baseless, fictitious accusation. And if not, give me a credible source.
If anything, I’d keep spyware traffic as low-profile as reasonable in Microsoft’s place.
I tend to agree.
Nevertheless, some unknown implementation can have bugs and things can go wrong and there’s nothing you can do about it, short of “rebooting” or d̷o̶w̸s̸i̷n̴g̸ ̴t̶h̸e̷ ̸h̵a̵r̵d̷ ̵d̷r̶i̴v̶e̷ ̵w̶i̴t̸h̷ ̸̞̺͠h̵̺͙̎̍o̸͔͠ͅḻ̷̀̇y̵͚͍̎ ̷͉̅̅w̸͎̔a̷̧̫̒́t̶̼̉̓ę̵̾͗r̶̫͑͑ ̴̣̿͒(̷͙̎a̸̬̺͝͝n̸̞̓̓d̴̬͌̍ ̸͇͕͌͝s̷̡̯̓͝u̸̡̳̇͝b̴̳͜͠s̷͍̘̽ë̵̜q̷̝͐̄ȕ̵̞̐e̷̲̠̐́ń̴̨̙͝t̸̛̬͝l̶̮̔͠y̴͕̪̑͝ ̵̖̆ḃ̴̪̟u̶̢͓͑̌y̵̜̤͌̏i̵̦̋ň̴̨͚̀g̸͓͑ ̴͍̬̽à̶͜ ̴͇͔̓n̴̬͂͜ì̷̢̛̯c̴̤̖̈́e̶̼̫̐̊ ̵̹̏͝f̸̙̀̑r̷̪̩͆͆e̸̤̫͛͋s̷̢̙̏h̷͇͔́ ̸̭̆͝N̷̰͗͛͜V̶͇͒̚M̸̟̍͜ě̷̛̟ ̸̢̞́͝a̷͙͔͒͒n̷̻͇͝d̸̘̥͌̾ ̴̜͓͑p̷̬͑͊ŭ̸̮̏t̸̲̀t̴̡͚̽í̶͎͓̑n̴͕̘̒̈́g̴͓̰̓͝ ̵͓̎a̴̻̼͗ ̷̦̍̈́s̷̥̅̈l̴̝̂e̴̞̅͊ḛ̴̊̅k̷͚̕ ̵̛̼̬͗D̴̻̾̽e̵̙͂̊b̷̝͘ī̵̢͇ą̵̂n̴͖̑ ̶̼̚h̴̼͂͑e̷̲͆̆a̵̡̋d̸̢͔̈l̶͕̍̍e̸̛͕̙̒s̶̞͔̀͠s̸̯͖̕ ̵͍̦̈́̉ ̸̨̨̓i̸̙͖͗̌ņ̶̯̍s̸̡̖͗̇ṯ̷́̒ä̵̦́̎l̶̼̄l̵̨͊̊ ̴̳͑͗ó̵͎̅ǹ̴͈̚ ̷͖͊͝i̷̠͇̊t̷̼̞͒͘)̵͎̤̔͌
Well, not spyware per se, but over the years they found, over and over, bugs which are really just highways left open into your system, ready to be exploited. But to be honest, that’s not limited to Windows.
Yeah, if I were Microsoft I would implement spyware in a way that is least intrusive to the user experience. Prioritizing the telemetry data using QoS would only incentivize users to find ways to disable the telemetry, while providing no benefit to Microsoft. What’s the use of them receiving the telemetry data slightly faster? It’s much more important to them that it arrives at all.
deleted by creator
Windows is easy to use if you don’t care about privacy.
I am currently dual booting and trying to get feature parity in my Linux install as a relative newbie.
So far the largest hurdle I’ve been able to solve was getting my RAID array recognized. That sent me down a rabbit hole.
To get it working in Linux I needed to:
- switch from LMDE to Mint proper
- add a PPA repository
- install the RAID driver
- manually edit my grub config file to ignore AHCI
- run a command to apply the change
- reboot
- format the volume
To get it working in Windows I needed to:
- format the volume (Windows gave me a popup with a single button to do this on login)
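For the curious, the Linux list above translates to roughly this (the PPA and package names are placeholders, not the actual ones used):

```shell
# Add the driver PPA and install the RAID driver (names are hypothetical)
sudo add-apt-repository ppa:some-vendor/raid-driver
sudo apt update && sudo apt install some-raid-dkms

# Tell the kernel not to bind the controller as plain AHCI:
# edit GRUB_CMDLINE_LINUX_DEFAULT in /etc/default/grub, then apply and reboot
sudo update-grub
sudo reboot

# Finally, put a filesystem on the array (device name is an assumption)
sudo mkfs.ext4 /dev/md126
```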
Are you using hardware RAID? Yeah, that doesn’t go too well with Linux… it works perfectly in Windows though, cuz their softraid solutions are shit.
Server-level hardware RAID is fine on Linux. It has to, because manufacturers would cut out a huge chunk of their market if they didn’t. Servers are moving away from that, though, and using filesystems with their own software RAID, like zfs.
Cheapo built-in consumer motherboard RAID doesn’t work great on Linux, but it’s also hot garbage that’s software RAID with worse performance than the OS implementation could give you. I guess if you’re dual booting, you’d have to do it that way since I don’t think you can share software RAID between Windows and Linux. It’s still not great.
Cheapo built-in consumer motherboard RAID doesn’t work great on Linux
That is what I actually meant.
I guess if you’re dual booting, you’d have to do it that way since I don’t think you can share software RAID between Windows and Linux. It’s still not great.
That’s why you don’t do RAID at all on a daily driver. You make/buy a NAS for that kind of thing. Maybe just RAID1 in hardware, cuz that’s easy to set up and generally just works, even with low end hardware solutions.
It’s called FakeRAID for a reason.
After a while, you’ll hit a point where parity is impossible going the other way.
I’m running a striped partition and a mirrored partition with only two drives, and using an SSD to bcache the whole thing. I’ve even got snapshotting set up so I can take live backups.
I have no idea where to start with that setup on Windows.
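A rough sketch of that layout with mdadm and bcache (device names are placeholders; the snapshots would come from LVM or a filesystem like btrfs layered on top):

```shell
# Striped (RAID0) and mirrored (RAID1) arrays from two partitions on each disk
sudo mdadm --create /dev/md0 --level=0 --raid-devices=2 /dev/sda1 /dev/sdb1
sudo mdadm --create /dev/md1 --level=1 --raid-devices=2 /dev/sda2 /dev/sdb2

# Put the SSD in front as a cache (-C) for the backing array (-B)
sudo make-bcache -C /dev/nvme0n1 -B /dev/md1
```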
I had a hell of a time just trying to get Mint to write to an external drive, including unmounting and remounting the drive countless times trying to get it to mount as rewritable (adding it as a mount option wouldn’t work in terminal or in “Disks”), it would just refuse to let me write to it, I could still read everything fine. I finally quit, got a second drive, backed all my stuff up and reformatted the first one, which Mint now sees and writes to just fine despite being configured exactly the same way it was before.
That is a massively condensed story, and if I ever have to look at fstab again I might just have an aneurysm. Y’know how hard it is to write things to an external drive in Windows? You plug it the fuck in.
Anyone who says Linux is ready for the masses is deluding themselves. It’s fine for nerds, people who like to work on their computer, but it is absolutely not ready for people who like to do work on their computer. Not when something as simple as “yes I’d like to save this to an external drive please” turns into a days-long rabbithole of bullshit that culminates in me buying an extra 8TB drive off Amazon.
Why have I never thought about this? Dual boot and bit by bit work on feature parity while still having an OS that’s my daily driver.
Beware of the W̷̞̬̍̌͘͜ĭ̴̬̹̟͕̒̆̈́n̸̢̧̙̈́̅̂̆̕͜ͅd̵̟̟̪͎̀̀ő̴̼̺̺́̐̂͘w̵̨͊̀s̵̡͎̭̊ ̸͔̬͔̜̊́̈́̌̈́ͅŬ̴͉͚̳̌̉͘͝p̸̼̅̆͐̃̑d̸̜͂ǎ̵̛̯̏͝ť̷̰é̸͇͝ as it can screw up/overwrite your other bootloader completely.
Kinda sucks, when you’ve got a meeting/work and you find out that forced update made your system unbootable/partially unbootable and you now get to live boot in and go fixing the EFI partition manually, in the CLI.
That happened to me once and that’s when I decided feature parity was less important than a reliable system that “just works” for getting things done on a schedule. (I removed windows completely, in case that wasn’t clear)
Anyhow, make sure you install Windows to a separate drive that can’t see any others during the Windows install; that will keep the bootloader separate.
I ran into similar issues before. My plan was to install Linux on a separate M.2 so Windows won’t interfere, and manually boot the OS I want.
I’ve been using Linux for too long because that list of steps sounded completely reasonable to me.
You’d normally use a software RAID implementation these days, and Linux has a number of those. But yeah, dual booting can expose some quirks, and filesystems and disk setup in general are among the most prominent.
This. How an advanced use case is accomplished is not a point against a system’s usability.
The point I was trying to make is that if you ever want to do something that is not covered by an out-of-the-box install, it’s typically far harder to do in Linux than in Windows (although my ~15 years as a windows sysadmin probably bias my opinion)
Windows is turning into a telemetry nightmare because about 10 years ago Microsoft figured out that they could sell ad space and monetize user data, so I’m trying to get off the platform before my LTSC install hits EOL. But I have to admit it’s a hard path.
(although my ~15 years as a windows sysadmin probably bias my opinion)
So basically: it’s not any harder in linux, but you have more than a decade of muscle memory in windows, so it’s harder for you.
That’s like saying “Japanese is a less efficient language than English, all of the words are different, and when I want to say a word, I have to learn it first, but in English I just know the words! English is so much better! (My 30 years speaking english probably bias my opinion)”
Things are certainly different, but it’s hard to compare which is “harder” for the advanced use cases.
There’s no shame in having long term experience with one platform and having that shape your expectation about how a solution should look.
Congrats on taking the plunge. I suspect there are others like you.
I’m actually kind of envious. The joy and frustration and joy again of exploring something new was something I relished in my early Linux years. Back then you had to use a text editor to configure your video card before even getting started, so it was kind of insane haha. But totally worth it later, as all of those skills translated.
But in your example, the RAID controller driver was covered by the out-of-the-box install in Windows. If it wasn’t, you’d still need to do pretty much the same. Also, there were a couple of weird steps in your Linux list, like switching distros to run a couple of CLI commands and disabling AHCI for some reason.
Now do a raid like it’s typical for Linux and get it to work on Windows.
Oof, the hoops you have to jump through to get two disks in a mirror on Windows still haunt my dreams sometimes
For advanced, power user stuff, I find Linux to be much friendlier and faster. Just being able to do everything in a Terminal instead of having to mess around with a mix of inconsistent GUI menus in the two different control panels, gpedit, regedit (which is an entire headache by itself), a mix of cmd and Powershell (and whatever Windows Terminal is) is just so much less of a headache.
Also I find things easier to script in Linux compared to Windows.
Not to mention the mess that is Windows Update, which doesn’t even upgrade third party software, and takes a long time to actually do the updates. Package management is a godsend. Windows has chocolatey and winget, but those are poor substitutes.
And I say all of this as someone who is technically proficient in both.
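That difference is easy to appreciate in practice: on most distros, one command refreshes metadata and then upgrades the OS and every installed application together (apt shown here; dnf and pacman have equivalents):

```shell
# Debian/Ubuntu: update package lists, then upgrade everything in one pass
sudo apt update && sudo apt full-upgrade
```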
This is kind of the same situation I’m in, but I’m not quite as tech savvy and I’m more resistant to learning linux even though I’ll still probably want to migrate over at some point.
What I don’t really understand is, or, what I understand, but I suppose I find mildly amusing, or confusing, is how many criticisms I’ve seen of windows that kind of just don’t apply to LTSC as much, if at all? It’s kind of to the point where I wonder why anyone would really use any other version.
20 years ago it was PCMCIA wifi drivers.
Now it seems like it’s always some kind of disk boot filesystem issue.
Proprietary RAID in linux is a shit show. That’s why everyone uses software RAID.
The only real issue I’ve had with Linux is trying to get my old Drobo 5C to work. (It’s a self-managed, dynamically adjustable/resizable RAID array that just presents itself as a single 70TB USB hard disk. The company that made them dissolved a few years ago.)
It’s formatted in ntfs and loaded with 25tb+ of data from when I ran windows primarily.
It’ll mount and work temporarily, but quickly stops responding, with anything that tries to access it frozen. Particularly docker containers.
Then it’ll drop into some internal data recovery routine (it’s a ‘black box’ with very little user control; definitely wouldn’t be my choice again, but here we are), refusing to interact with the attached system for half an hour or so. When it finally comes back, Linux refuses to mount it: ‘dirty filesystem’, but ntfsfix won’t touch it either. Off to Windows and chkdsk, then rinse and repeat.
I gave up when one of those attempts resulted in corrupt data (a bunch of mkvs that wouldn’t play from the beginning, but would play if you skipped past the first second or two). I can’t backup this data, (no alternative storage or funds to acquire it) so that was enough tempting fate.
I ended up attaching it to an old windows laptop that’s now dedicated to serving it via samba :(
Really looking forward to setting up a proper raid array eventually, but till then I’m stuck with 11mbps. I’d love to rent storage temporarily so I can move the data and try a different fs on the drobo…
You could probably get a Gbit LAN USB card added to that so you could at least get 30 MB/s out of the thing 🤷.
I’d need a windows system to put it in. The Drobo isn’t upgradable beyond stuffing more drives in it, and the laptop is an old hp craptop…
I’ve got a second desktop that’s got usb3 (drobo is usb3), so that’d probably improve things, just not by a lot (pretty sure the slowdown is in the samba share, but I need to do more testing and see where exactly the issue is), and I kinda want to keep that system free for other experiments.
Idk, still thinking on it.
Ah, if the thing has USB 3.0, then the NIC in the laptop is probably 100Mbit (lower-end models had 100Mbit even if they were newer); that’s your main issue, not SMB. SMB is a TCP/IP protocol; it has nothing to do with the hardware implementation and has no speed limits (at least none that I’m aware of). It goes as fast as the slowest part in the chain.