I want to create a backup of my Linux system, including user files, from the command line. I tried using Timeshift but it doesn’t have a CLI argument to include a folder.

I found a guide on dev.to that explains how to use Timeshift from the command line, but it doesn’t mention how to include user files. According to ItsFOSS, Timeshift is designed to protect system files and settings, not user data, so user home directories are excluded by default.

I came across a list of backup programs for Linux on Slant, and BackInTime appears to be the best.

Has anyone used BackInTime to back up the whole system including user files? Are there any other tools that you would recommend?

Edit: would also be nice if it had similar features to Timeshift, like weekly snapshots, the ability to list, restore, and delete snapshots, etc.
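
For reference, this is roughly what I already do with Timeshift from the CLI (the snapshot names below are just placeholders); what I can’t do is tell it to include /home:

    sudo timeshift --list
    sudo timeshift --create --comments "weekly snapshot"
    sudo timeshift --restore --snapshot '2024-01-01_00-00-00'
    sudo timeshift --delete --snapshot '2024-01-01_00-00-00'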

  • @[email protected]
    2 years ago

    So… I’m not going to answer your question, feel free to ignore me.

    It’s of course possible to do so, and the most obvious way is to use dd since on Linux devices, including disks, are files. Consequently you can indeed “save” the whole system from the CLI (rough sketch below).
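
    A minimal sketch, assuming the whole disk is /dev/sda (check with lsblk first) and a target drive mounted at /mnt/backup; best run from a live USB so nothing is writing to the disk:

        # identify the right disk first; /dev/sda below is an assumption
        lsblk
        # raw image of the entire disk, compressed on the fly
        sudo dd if=/dev/sda bs=4M status=progress | gzip > /mnt/backup/full-disk.img.gz
        # restoring reverses the pipe (this overwrites /dev/sda entirely)
        gunzip -c /mnt/backup/full-disk.img.gz | sudo dd of=/dev/sda bs=4M status=progress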

    That being said, I would argue it’s a bit of a waste of time unless you have a very specific, and usually rare, use case, e.g. testing OSes themselves. Most likely I imagine (and again I’m not directly answering your question here so please do feel free to fix my assumptions or ignore this entirely) you “just” want to “quickly” go from a “broken” state to one where you can “work” again.

    It might be because you are doing something “weird”, e.g. tinkering with the OS itself, or because you lack “trust” in your current setup.

    Here my recommendation would be instead to have a “work” OS and then other partitions, or even virtual machines (not containers), dedicated to testing, because that’s truly a great way to learn BUT it shouldn’t come at the risk of your data or your time.

    Finally, one of the bounding resources is the speed of your disk and your time to focus. I find that installing a “fresh” OS from a modern USB stick is fast, like two-coffees fast. I installed Ubuntu just yesterday, twice, so I’m rather confident about that comment.

    What is indeed slow is copying YOUR files, because they are large and numerous.

    So… finally, the “trick”: do NOT copy your files despite reinstalling the system! Instead, have a dedicated /home partition so that if you reinstall the OS, your files are untouched. Yes, you might have to reinstall a couple of packages, but if you keep track of them via e.g. your shell history (~/.bash_history, which BTW survives in that situation since it lives in /home), you will be able to e.g. grep it for your apt install commands and be back on track in minutes (quick sketch below).
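
    A quick sketch of what I mean, assuming bash and an apt-based distro (the grep pattern and package names are just illustrations):

        # pull the install commands you ran out of your history
        grep -oE 'apt(-get)? install [^|;&]*' ~/.bash_history | sort -u
        # then replay them on the fresh system, e.g.
        sudo apt install git vim htop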

    TL;DR: a /home partition that is not deleted on OS reinstallation is often, IMHO, the most efficient way to go.

    • @[email protected]
      2 years ago

      PS: obviously all the backup tools others recommended are still useful. I personally use rdiff-backup to save important data to my NAS (SSDs over Ethernet). Once again it’s all about speed, but only after you have identified what actually matters to you, and in the vast majority of cases the whole system ain’t it.
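
      For what it’s worth, my rdiff-backup invocation looks roughly like this (hostname, user, and paths are placeholders, and the exact syntax depends on your rdiff-backup version; it needs rdiff-backup on both ends plus SSH access):

          # incremental backup of ~/Documents to the NAS over SSH
          rdiff-backup ~/Documents backupuser@nas::/mnt/ssd/backups/Documents
          # list the increments that exist on the NAS
          rdiff-backup --list-increments backupuser@nas::/mnt/ssd/backups/Documents
          # restore the state as of 7 days ago into a local folder
          rdiff-backup -r 7D backupuser@nas::/mnt/ssd/backups/Documents ~/Documents.restored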