Yeah… but if the packagers don't test it, or ship "stable" KDE Plasma 5.27 which will simply not get most bugfixes (Debian, MX Linux and many more will have these issues for 4 years!), then it actually is important which distro you choose.
It doesn't matter if
- your desktop relies on Xorg garbage, which is "stable" and will not evolve
- your desktop is minimal and distros orient their release schedule around it (GNOME)
This doesn't apply to
- KDE
- Cosmic
- Hyprland, Sway, Wayfire
- LXQt, which is probably getting Wayland support
Both are important. I can’t tell you how many times I’ve had to resort to containers, VMs, or compiling from source, just because some application decided to only provide packages for Arch or Debian.
Switch to NixOS, sometimes it feels like we have every package there is.
Well sure. My approach when looking for a distro was usually "which ones have KDE and pacman", and only after that did I start comparing.
And for me, AUR.
Which distros have pacman but not AUR?
I just wanted to mention that if a distro (somehow) had AUR but not pacman, I wouldn’t care.
And for new users, choosing a distro with a big user base (and thus a better support system) should be a top priority. Instead, newbies are often advised to use an obscure distro that in theory might be a good fit, but isn't. Probably those who do the recommendations are Linux testers (using VZ) rather than Linux users, and mostly evaluate a distro based on the install process and out-of-the-box usage.
Configuring a big distro to your needs is much better than choosing a niche distro.
The most important thing for most new Linux users would be a pathway to getting support. Because of this, the distro you use matters much more than the DE, since each of the major distros has different pipelines that funnel users into getting support. The package manager lock-in is distro dependent, and depending on the philosophy they subscribe to, it can be the difference in how many steps a new user has to take to get a working system up and running. Thankfully, with flatpak, appimage and snap being more popular than ever, package availability is much more streamlined, but that is another layer on top of an already overwhelming package system for new users. The defaults for all of this depend on your distro, which can be different. Heck, we haven't even gotten to support cycles, which depending on user needs can be different, because not every user has or wants what comes with, for example, maintaining a rolling release distribution. Did they set up their system to have snapshots so they can roll everything back when the new kernel update breaks something system-critical and they have a presentation at 2:00? None of these things are really DE dependent, but they are baked into the defaults you subscribe to when you choose a distro. The good part is that if you don't like how something is configured, you can change everything easily, depending on how well documented it is. This is why it's more important to choose a distro with good documentation, or at least an active enough community, so when you run into hangups you can get some sort of resolution.
Loved your comment, but please, next time use some paragraphs. It was a hard read.
Getting "Linux" support online usually means Ubuntu, but I ran into a Mint problem back in the day (I wanna say about 2014 or so…), and Clem himself replied to me personally with not just a link to a fix, but an actual "copy and paste this exact thing into the terminal" reply, and it totally fixed me up. Clem being the guy who is in charge of Mint.
Always left me with a warm feeling about Mint, and I keep coming back.
Using LMDE 6 Cinnamon on one of my boxes for that reason.
I switched to Arch[-based distros] when I realized I had been getting 90% of my support from the Arch wiki for years
I understand the argument being made, but I kind of disagree. Yes, picking a DE in which you'll be comfortable is really important (and often an undervalued aspect of using Linux for the first time), but I think the time you need to spend self-maintaining your distro is more important, and is also prone to make or break your first-time Linux experience. That's the most important factor in whether a new user says "I love Linux and want to continue using it" or "I fricking hate Linux, it's filled with a bunch of problems, I'd rather just use Windows instead". And that's why it's important to recommend beginner-friendly distros, to avoid frustrating newcomers, because those distros are more manageable (unless those newcomers want the frustration of managing something that they don't quite understand :)
Does it matter which one in specific? No, and it’s probably at this point that the DE and visual looks should kick in.
Just hopped back over to Linux Mint again after years of making do with Windows
- Went with cinnamon cuz pretty.
- switched to CobiWindowList so I could see all windows on either of my monitor menu bars.
- switched to CinnVIIStarkMenu for a more familiar menu system.
Not much change, I can lean on the habits I’ve gotten from windows, and now my switch is pretty much unnoticeable to me.
Funny enough, Lutris has made it a lot easier for me to access games I usually would just have downloaded, like my itch.io library. Proton has tackled all my other games fine. Hell, I even got Tarkov running smoothly, even though you can only do offline raids on Linux ATM.
Linux users fall into three categories. People who want stability over everything else, people who want everything to be bleeding edge, and people who don’t use desktop environments.
The most important thing for a new user is to understand which of those three they are.
What about people who want something up to date AND stable? I don't want to be stuck on an ancient Debian base when I want up-to-date goodness for running newer packages. This is what Manjaro promises, but I think we all know the problems with what they're trying to do. Fedora is probably the one distro that most closely fits, imho, but I've never liked RPM distros; too many bad memories from 25 years ago.
EndeavourOS or raw Arch would both fit that bill, you don’t need to run updates every day just because they’re available. Manjaro delays packages to “increase stability”, but that’s what causes it to break.
I’m quite happy with Tumbleweed. It’s best of both for me, but still RPM based.
I just want to get away from the future hell that will be AI-controlled Win 12
I'll be honest: unless you have been using Linux for… a long time, or your job requires you to manage servers, you're probably not that last category.
If you enrolled in the Windows Insider/test doohickey, then you might want to look into the rolling release distros. If not, something with a standard release cadence will be better.
Myself? All of the servers I manage have no desktop environment (core infrastructure does not need graphics). But if I am on a workstation? LMDE, because I care about the graphics getting out of my way so I can do my job.
To a certain point. I'll give you that this applies to the Debian and Ubuntu distros. Gentoo, on the other hand, is a completely different animal and will have a far greater impact on user experience than the DE.
You look at your DE all day, and your distro holds everything together. OP didn't say the distro is unimportant, and I agree it makes sense for new users to look at images and videos of different desktops first, maybe try a live CD, and then choose the backend that matches how much they're willing to interact with it.
If your electricity and time are cheap, you want to learn, and your PC is your playground rather than a productivity tool, Gentoo is a valid option. In this case, your choice of DE impacts your compile time massively, and knowing the alternatives beforehand gives you options.
Nowadays there are so many options. GNOME and Plasma are nice but heavy, same for DDE (Deepin) and other fancy DEs; I know why they're heavy, but Xfce and LXQt work better on my PC, and you can make Xfce look beautiful and stay fast too.
For the WM guys: I’ll try some day, for now only DEs :3
Try hyprland, learn the shortcuts, and you never want a DE again.
I knew only a WM guy would reply lol
I feel like the window manager is important, but for newbies I also consider the package manager and overall installation process to be very important.
I’ve had pretty distros that are basically busted after a package fails to install or video drivers are mucked with. An advanced user could fix most of these issues, but this is usually where a new user may go running back to their previous OS.
A good computing experience for me is all my hardware working with minimal fuss and all the software I expect to be available being a few terminal commands away (e.g. steam, developer tools, etc.)
I'm not sure if it is, but I don't see it as a hot take. And it sounds reasonable, especially when some distros offer different "flavours" out of the box and give you the option of different DEs before you've even installed it.
Fair. But “Lukewarm take” just doesn’t have the same punch.
It’s certainly not a hot take. Every “which distro should I try thread” is just a discussion of the different DEs out there. I would like to hear about different package managers. I always seem happiest with apt, and I don’t know why.
I'm a noob using the default Ubuntu DE for a few months now and I've gotten used to it; at this point I'm afraid to ask what the other DEs are and whether I should swap over
Test them out on a virtual machine
https://wiki.archlinux.org/title/Desktop_environment
You can use the list there to look up images or videos of the DEs
If you think you'd prefer one, you can try it, but you aren't likely to find a big advantage over what you're used to (there are exceptions, like old hardware wanting something lighter weight); it's mostly preference.
If you changed your Window Manager to i3 then you would probably hate it just for being so different
I particularly like Cinnamon, it’s very simple and nothing fancy (while still looking great and modern).
The other popular choices include:
- Gnome
- KDE (customizable to hell)
- XFCE (very easy on resources, good for old hardware, or if you like simplistic DE)
- LXDE (similar to XFCE in the resources department, but looks more modern, IMO)
There are others, but I can't speak for them as I've never tried them. I can't really describe modern Gnome either, because the last version I used was 3 and it doesn't look at all like the same DE, so someone else will have to provide that info.
I've recently used LXQt in a project. Very cool, and AFAIK the successor of LXDE, at least for Lubuntu.
modern gnome is simpler to learn and more polished than basically all other DEs. i think its better for someone that wants something new and for people who just started using a computer, because of just how easy it is to use. its not good if youre switching from windows or mac and want something similar.
I tried it recently and it was confusing as hell.
first time i tried it, i felt it was easier than any other de ive tried, though different people of course wont have the same experience
Don’t. It’s a trap. Most of them have compatibility issues with software. Stock Ubuntu is the benchmark for every piece of software these days. Deviating is fun until it isn’t.
Unless you want to go a non Debian based distro, always pick Ubuntu.
Compatibility issues at the desktop environment level? This is the first time I've ever heard of that.
Tried switching to KDE Plasma and then OpenCV broke because of an outdated Qt version or some shit. Same with another distro. And I couldn't install two versions at the same time.
It’s all fun until you get dependency conflicts.
Bro. I think you would benefit from sticking to Chrome OS.
That’s a great insult, I love it :D
Nice comeback when you get evidence of how a different DE breaks software compatibility.
It’s clear that this is a forum of people that only install Linux to open their terminal and type neofetch.
Fair, that reply above is not helpful at all. I mean yeah, I have had my fair share of dependency hell as well, mostly when trying to install an external deb package. I know how to prevent it nowadays, but it ain't user friendly at all.
Also I would be hesitant to use Linux as a workstation. If I had the luxury of time I would for ideological reasons alone. But I don’t have that kind of time. Troubleshooting can become costly when you get paid by the hour.
Depends on what you do, most of the deep-learning world and scientific computing is based on Ubuntu. And not just Ubuntu but currently 22.04. Even upgrading the distro can bring compatibility conflicts.
I have a massive hate boner for development on Windows for things such as the \ in the paths and needing to install a 10gig IDE to do cpp development. Or they tell you WSL “just works” while it doesn’t “just work” because it can’t cv2.imshow your images because there’s no X11 passthrough etc.
Yeah I agree.
Ubuntu is shit. It used to only be shit under the hood if you were an enterprise sysadmin building your own packages and managing versioned repos for thousand machine fleets, but now it is shit from a user experience, too. Fuck snaps, fuck walled gardens, and fuck vendors attempting lock-in.
I hate everything but Matlock!
Stock Ubuntu is the benchmark […]
…for nothing these days. The only people using Ubuntu now are dinosaurs and system managers running cheap servers or locked into Canonical's ecosystem, and the latter are using headless servers, remotely managed, not the DE. Variety is the spice of life. All mainstream DEs are perfectly serviceable, 100% compatible with everything, and completely stable and reliable. FFS, Ubuntu's snaps don't even work well on their own DE. Stop fearmongering for Canonical; let people live life.
You do you. Just stop wasting other people’s time with this worthless false hope. What I’m saying here is what I would have liked people to tell me before I wasted my time troubleshooting issues caused by custom Desktop Environments. What’s next you’re going to tell me Wayland already runs without issues too?
The stock Ubuntu environment looks pretty decent to begin with.
Wow, you really are aggressive and hostile for no reason. You can use Ubuntu all you want. But don’t go around spreading lies just because you are too cognitively challenged to change your DE without breaking the OS. Most people are fine making a fresh install with the DE they want to try preinstalled and it works fine 100% out of the box. It’s trying to make two different DE live on the same system at the same time that is only partially supported and thoroughly discouraged by every single DE developer. Most of the time installing a new DE on a system and uninstalling the old one is a pretty straightforward, although dirty process. Guess who is particularly bad and incompatible with that process? Ubuntu. It has the worst support for alternative DEs, because Ubuntu is not the benchmark for squat shit anymore. Use a real end user distro, and you’ll be able to change DE to your heart’s content without issue.
Because advice like this is an enormous waste of time. Calling people dinosaurs for using Ubuntu instead of KDE is a pretty out there take. The only more modern option is Arch-based distros like Manjaro, but since every programming tutorial assumes you have APT and are running Ubuntu, I don't see much of a reason to deviate from that.
it seems you should be using debian or distros based on it. ubuntu, as far as i know, uses apt as a mirror to snap, so as long as the tutorials youre following letter for letter arent too recent, you really should be using debian for actual apt packages, since ubuntu used those a couple of years ago.
you can also use fedora or arch, but it seems you dont want to check what package youre downloading at all, and just want to follow tutorials blindly.
People here are under the illusion that a DE changes nothing about the base OS. It seems like those people have never actually been using their OS.
Dude… I build my desktops from a bare Debian text-only terminal by installing it piece by piece with only what I need. This current install has been running fine for three years, and I have no issues installing and configuring anything you can on Ubuntu.
This is a skill issue on your part, not an OS issue. At a certain point, if you’ve been using it enough, the distro literally doesn’t matter anymore. Linux is Linux is Linux.
That’s like saying that you run Gentoo but you don’t even have the street cred of running Gentoo.
I've only been on Linux for a few months (as a daily driver; I always used headless servers before that), and I'm almost certain that my Fedora install came with both KDE and Gnome, in Wayland and X11 flavors, preinstalled out of the box, and I could just choose between them at the login screen. Or am I wrong, and do I just not remember installing the other manually? I mean, that's also possible, it's been a while.
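I suppose I could check next time I'm at that machine. Something like this should show it, assuming Fedora's usual layout (not 100% sure the group names match what the installer used):

```
# List the session files the login screen can offer
ls /usr/share/xsessions/ /usr/share/wayland-sessions/

# See which desktop groups dnf thinks are installed
dnf group list --installed
```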
Noobs gonna noob
Haha, true. Started with Mint, now on Kubuntu. Same pig, different makeup.
isnt kubuntu worse for installing flatpaks? thats the only thing i can think of that differs and i wanted to know.
You do have to add flathub to the discover store, but that’s a one time thing and you’re good afterwards
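If it helps, the one-time setup is usually something like this (assuming Kubuntu's defaults; package names may vary slightly between releases):

```
# Install Flatpak support plus the Discover backend, then add the Flathub remote (one time only)
sudo apt install flatpak plasma-discover-backend-flatpak
flatpak remote-add --if-not-exists flathub https://flathub.org/repo/flathub.flatpakrepo
# You may need to log out and back in before Flathub apps show up in Discover
```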
Ubuntu is VERY heavily invested in snaps at a very basic level. I think the recommendation is to not mix snaps and Flatpaks as they may not interact well. As a new Ubuntu user, I’m slowly discovering some of the random problems with snaps.
For example, just the other day, I was trying to configure my fish shell using the browser-based fish_config utility, but it just wouldn't work. Of course, I assumed the problem was with my fish install. After a couple hours fiddling with it, I finally came across a Stack Exchange comment indicating that the snap version of Firefox simply can't access the /tmp/ directory, which is where fish_config creates its HTML configuration page. WTF? Also, you can't even install a non-snap version of Firefox via apt, because the official apt repository just links back to the snap version! I finally installed an apt-based version of LibreWolf, but had to get it from a non-Ubuntu repository, and then magically I could access the fish_config HTML page. That's a pretty long workaround just to view a simple HTML page!
So, if snaps have problems like this just interacting with the base Linux file system, I wouldn’t be surprised if random weird behavior cropped up when trying to use Flatpaks.
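For what it's worth, the workaround I've seen cited most often (if you want Firefox itself as a .deb rather than LibreWolf) is the Mozilla Team PPA plus an apt pin, so Ubuntu doesn't quietly swap the snap back in. Roughly, and treat this as a sketch rather than gospel:

```
# Add the Mozilla Team PPA, which ships Firefox as a regular .deb
sudo add-apt-repository ppa:mozillateam/ppa

# Pin the PPA above the Ubuntu archive so apt prefers the .deb over the snap transition package
sudo tee /etc/apt/preferences.d/mozilla-firefox >/dev/null <<'EOF'
Package: firefox*
Pin: release o=LP-PPA-mozillateam
Pin-Priority: 1001
EOF

sudo apt install firefox
```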
Great take. But you know the real sneaky one that trips you up? File system.
I wouldn’t call myself a beginner, but every time I install a Linux system seriously I see those filesystem choices and have to dig through volumes of turbo-nerd debates on super fine intricacies between them, usually debating their merits in super high-risk critical contexts.
I still don’t come away with knowing which one will be best for me long-term in a practical sense.
As well as tons of “It ruined my whole system” or “Wrote my SSD to death” FUD that is usually outdated but nevertheless persists.
Honestly nowadays I just happily throw BTRFS on there because it’s included on the install and allows snapshots and rollbacks. EZPZ.
For everything else, EXT4, and for OS-shared storage, NTFS.
But it took AGES to arrive to this conclusion. Beginners will have their heads spun at this choice, guaranteed. It’s frustrating.
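Since I brought up snapshots: under the hood a manual one is just a couple of commands, assuming / is actually on a BTRFS subvolume (Timeshift or snapper automate all of this, which is what I actually rely on):

```
# Take a read-only, timestamped snapshot of the root subvolume
sudo mkdir -p /.snapshots
sudo btrfs subvolume snapshot -r / /.snapshots/root-$(date +%F)

# See what exists, and clean up old ones when you're done with them
sudo btrfs subvolume list /
sudo btrfs subvolume delete /.snapshots/root-2024-01-01
```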
Makes sense to go as simple as possible on a home PC and even a home server. It matters more with RAID and production capacity planning or enterprise stuff.
Yes, I listened to a podcast about that recently. Linux was far ahead with XFS or something, but then Apple came along, improved their HFS, actually made tools for it, and it got better.
BTRFS is just as established as ext4, just not as damn old. It also just works, and it has advanced features that are crucial for backups. But I have no idea how to use btrbk, and there is no GUI, so nobody uses that.
But as a filesystem that just works like ext4, plus the automatically configured snapshots in both regular and atomic Fedora systems and OpenSuse, BTRFS is awesome.
Only outdated distros that fear change stick with ext4, at least that's my opinion.
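And the automatic snapshot side is mostly snapper doing the work. Day to day it's roughly this (a sketch based on openSUSE's setup; Fedora needs snapper installed and configured first, and the snapshot numbers below are placeholders):

```
sudo snapper list                                    # show existing snapshots
sudo snapper create --description "before upgrade"   # take a manual snapshot
sudo snapper undochange 42..43                       # revert file changes between two snapshots
```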
Lending my voice to this as well: for most people, my thought is ext4 without LVM, or deferring to the distro's preferred FS. It is a mature, stable, and reliable choice, and logical volumes complicate things too much for beginners.
If dual-booting, yeah, definitely an NTFS partition for shared storage (just be aware that Windows can be weird with file permissions and ownership).
Ext4 is the safe bet for a beginner. The real question is with or without LVM. Generally I would say with but that abstraction layer between the filesystem and disk can really be confusing if you’ve never dealt with it before. A total beginner should probably go ext4 without LVM and then play around in a VM with the various options to become informed enough to do something less vanilla.
and then play around in a VM with the various options to become informed enough to do something less vanilla.
This part is skippable, right? Any reason a user should ever care about this?
(note: never heard of LVM before this thread)
It’s all skippable if you want… Just put a large / filesystem on a partition and be on your way. There are good reasons for using it in some cases (see my response now).
It makes adding space easier down the road, either by linking disks or if you clone your root drive to a larger drive, which tends to not be something most “end users” (I try not to use that description but you said it heh) would do. Yes, using LVM is optional.
What is LVM?
This would absolutely be my thinking too. When I was still newish to linux, I remember lots of confusion with LVM and trying to reformat drives.
Can you explain LVM in practice to me? I used ext4 and now Fedora Kinoite with BTRFS; the filesystem never gives me any problems and some fancy features just work.
I should also point out that some modern filesystems like btrfs and zfs have these capabilities built into the filesystems natively so adding LVM into the mix there wouldn’t add anything and could, in fact, cause headaches.
In practice, you would split a disk up to keep /home separate from / and probably other parts of the filesystem too, like /var/log. This has long been an accepted practice to keep a full disk from bringing something production offline completely and/or complicating the recovery process. Now, you could use partitions, but once those are set, it's hard to rearrange them without dumping all the data and restoring it under the new tables. LVM stands for Logical Volume Manager and puts an abstraction layer between the filesystems and the partitions (or whole disk if you are into that). This means you can add disks arbitrarily in the future and add parts of those disks to the filesystems as required. This can really minimize or even eliminate downtime when you have a filesystem getting filled up and there's nothing you can easily remove (like a database).
It’s good to know but with the proliferation of cloud and virtual disks it’s just easier on those systems to leave off LVM and just keep the filesystems on their own virtual disks and grow the disk as required. It is invaluable when running important production systems on bare metal servers even today.
Hope this helps.
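If you want to see what that looks like in actual commands, here's a rough sketch (device names and the volume group name are made up for illustration):

```
# Initial setup: turn a partition into a physical volume, group it, carve out logical volumes
sudo pvcreate /dev/sda2
sudo vgcreate vg0 /dev/sda2
sudo lvcreate -L 30G -n root vg0
sudo lvcreate -L 10G -n var_log vg0
sudo mkfs.ext4 /dev/vg0/root

# Later, when something fills up: add a new disk and grow the filesystem with no dump/restore
sudo pvcreate /dev/sdb
sudo vgextend vg0 /dev/sdb
sudo lvextend -r -L +50G /dev/vg0/var_log   # -r also resizes the filesystem in one step
```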
Thanks! So BTRFS does something similar with volumes, but baked in.
I did NTFS because both windows and Linux can read it. Do I know literally any other fact about formatting systems? Nope. I’m pretty sure I don’t need to, I’m normie-adjacent. I just want my system to work so I can use the internet, play games, and do word processing.
I once tried to install my Steam Library in Linux to an NTFS partition so I wouldn’t have to install things twice on a dual boot system. Protip: don’t do that.
Oo! That’s definitely a gotcha. Good tip!
I once heard that the trick to this is you need to let Steam “update” every game before you switch OSs. If it doesn’t get to finish this, it will bork. That’s also highly impractical I feel though.
So yeah on my dual boot Linux is for making things and doesn’t see my main Steam library. Win10 is just for games. :p
EDIT: Win11 or 12 won’t be a problem because I’m confining them to a VM for only the most stubborn situations, and doing everything including gaming with Linux. :D
chkdsk /f (or /r, or whatever the third option is), reboot twice, but do it multiple times, because Steam on Linux asks you to reinstall the games in the exact same spot and you accidentally do it because you're not paying close attention due to the mild panic Windows threw at you?
https://github.com/ValveSoftware/Proton/wiki/Using-a-NTFS-disk-with-Linux-and-Windows
There is a guide here that says you can do it, but my experience was that I installed the games in Windows on my D drive, mounted the drive in Linux (Mint, I think), and when I tried to play them, the system locked up. Rebooting into Windows, Steam said the game files were corrupt and I had to reinstall them. I've always just kept two separate game libraries on any dual boot systems ever since.
Interesting. I was able to use the files perfectly fine from linux, but windows threw a tantrum when I tried to boot and removed everything linux had touched.
Honestly, I’d say the defaults most distros use will be fine for most users… If they don’t know why they should use one filesystem over another, then it’s almost certainly not going to matter for them
I’m still figuring it out. I know ExFAT works across all desktop OS’s, NTFS works with Linux and Windows, and ext4 only works with Linux.
But it took a half hour of googling to figure out you can’t install Linux on NTFS. I planned to do that to ease cross platform compatibility. Oops. I’m also attempting a RAID 1 array using NTFS. It seems to work, but I’m not sure how to automatically mount it on boot. I feel like I might have picked the wrong filesystem.
Hey there friend! Sorry to hear about your woes. From my understanding in practice, ExFAT is usually better as more of a universally readable storage system for external drives. Think, using the same portable drive between your PS5, friend’s mac, and whatever else. Great for large files and backups! Maybe not as much for running your OS from.
My approach and recommendation would be that you don’t want OS’s seeing each others’ important business anyway. Permissions and stuff can get wonky for instance.
So your core Linux install can be something like EXT4 or BTRFS. I like BTRFS personally because you can set up recovery snapshots without taking tons of space. It does require a little extra understanding and tooling, but it's worth looking into. (There's GUI-based BTRFS tools now though. Yay!)
EXT4 is nice and reliable and basic. Not much to say, really! Both can do RAID 1.
Next, a /home mounted separately, this COULD be NTFS if you really wanted that sharing. (BTW there’s some Windows drivers that can read EXT4 I think?)
BUT I feel more organized using a different way:
What I do personally is keep an NTFS partition I call something like “DATA” or “MAIN_STORAGE” and I mount this into my /home on Linux. It’s usually a separate, chunky 4TB HDD or something.
On Windows this is my D:\ drive, and it’s also where I store my project files, media, and whatever else I want easily accessible. Both OSs see those system-agnostic files, but are safely unaware of each other’s core system files.
In Linux, you can mount any folder anywhere, really! You can mount it on startup by amending your FSTAB on an existing install or setting the option during installs sometimes.
So the file path looks something like /home/MonkeMischief/DATA/Music
It’s treated just like any other folder but it’s in fact an entirely separate drive. :)
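The fstab line for something like that ends up looking roughly like this (the UUID is a placeholder, grab the real one with `lsblk -f`; the uid/gid options just make the files owned by your user):

```
# /etc/fstab entry: mount the NTFS data drive into the home folder at boot
UUID=XXXXXXXXXXXXXXXX  /home/MonkeMischief/DATA  ntfs-3g  defaults,uid=1000,gid=1000,windows_names  0  0
```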
I hope this was somewhat helpful and not just confusing. In practice, it’ll start to make more sense I hope! The important thing is to make sure your stuff is backed up.
… Perhaps to a big chonky brick formatted as ExFAT if you so choose. ;)
I am experimenting with Linux on two devices: My daily driver laptop and a desktop.
The laptop is set up as a dual boot across 2 SSDs. The first SSD contains Windows and has one 2TB NTFS partition. The other SSD has a 250GB partition for ext4 where Ubuntu lives and a 750GB partition for ExFAT.
The desktop has a 500GB SSD with ext4 for the OS, and has two 4 year old 2TB HDDs for data. This is why I’m trying to run them in RAID 1. For cross compatibility (and what they were already formatted as), they are in NTFS.
What do you think of that? Am I using adequate filesystems?
I settled on btrfs a year ago and I'm happy with it. I like the compression and async trim.