Basically title. I’m in the process of setting up a proper backup for my configured containers on Unraid and I’m wondering how often I should run my backup script. Right now I have a cron job set to run on Monday and Friday nights; is this too frequent? What’s your schedule, and do you strictly back up your appdata (container configs), or is there other data you include in your backups?
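For reference, the cron entry looks roughly like this (the script path here is just a placeholder):

    # run the backup script at 23:30 on Mondays and Fridays
    30 23 * * 1,5 /boot/config/scripts/backup_appdata.sh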

  • Andres S
    2 months ago

    @Sunny Backups are done weekly, using Restic (and with ‘--read-data-subset=9%’ to verify that the backup data is still valid).
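    In restic terms the weekly job boils down to something like this (repository location and paths here are just placeholders):

        export RESTIC_REPOSITORY=sftp:backup-host:/srv/restic-repo
        export RESTIC_PASSWORD_FILE=/root/.restic-pass
        restic backup /mnt/user/appdata           # weekly backup run
        restic check --read-data-subset=9%        # read back a random 9% of the repository data to verify it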

    But that’s also in addition to doing nightly Snapraid syncs for larger media, and Syncthing for photos & documents (which means I have copies on 2+ machines).

  • hendrik
    2 months ago

    Most backup software allows you to configure backup retention. I think I went with something pretty standard: one snapshot per day, kept for a week. After that they get deleted and it keeps just one per week of the older ones, for one or two months. After that it’s down to monthly snapshots. I think that aligns well with what I need. Sometimes I find out something broke the day before yesterday, but I don’t think I’ve ever needed a backup from exactly the 12th of December or something like that. So I’m fine with them getting sparser over time. And I don’t need full backups more often than necessary; an incremental backup will do unless there’s some technical reason to do full ones.
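    With restic, for example, such a retention policy could be expressed roughly like this (the exact numbers are up to you):

        # keep one snapshot per day for a week, one per week for two months,
        # then one per month for a year, and prune everything else
        restic forget --keep-daily 7 --keep-weekly 8 --keep-monthly 12 --prune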

    But it entirely depends on the use case. Maybe for a server or stuff you work on, you don’t want to lose more than a day. While it can be perfectly alright to back up a laptop once a week. Especially if you save your documents in the cloud anyway. Or you’re busy during the week and just mess with your server configuration on weekends. In that case you might be alright with taking a snapshot on Fridays. Idk.

    (And there are incremental backups, full backups, filesystem snapshots. On a desktop you could just use something like Time Machine… You can do different filesystems at different intervals…)

  • @[email protected]
    2 months ago

    I classify the data according to its importance (gold, silver, bronze, ephemeral). The frequency of the ZFS snapshots (every 15 minutes to every several hours) and their retention time (days to years) on the server depends on this. The more important data that I cannot restore, or can only restore with great effort (gold and silver), I then send to another server once a day. For bronze, the ZFS snapshots and a few days of retention on the server are enough for me, as it is usually data that I can restore (build artifacts or similar) or that is simply not that important. Ephemeral is for unimportant data such as caches or pipelines.
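    As a rough sketch, with plain ZFS commands (pool and dataset names are made up):

        zfs snapshot tank/gold@auto-20250115-1215     # gold: every 15 minutes, retained for years
        zfs snapshot tank/silver@auto-20250115-1200   # silver: hourly, retained for months
        zfs snapshot tank/bronze@auto-20250115        # bronze: every few hours, retained for days
        zfs destroy tank/bronze@auto-20250110         # pruning an expired bronze snapshot
        # gold and silver are additionally replicated to the second server once a day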

  • @[email protected]
    2 months ago

    I honestly don’t have too much to back up, so I run one backup job every Sunday for the different directories I care about. The jobs check each directory and only back up changed or new files. I don’t have the space to back up everything, so I only take the smaller and most important stuff. The backup software also allows live monitoring if I enable it, so I have that turned on for some of my jobs since I didn’t see any reason not to. To save money, I reuse the NAS drives that report errors after I’ve replaced them with new ones. So far, so good.

    The backup software is Bvckup 2; Reddit was a huge fan of it years ago, so I gave it a try. It was super cheap for a lifetime license at the time, and it’s super lightweight. Sorry, there is no Linux version.

  • battlesheep
    2 months ago

    I back up all of my Proxmox LXCs/VMs to a Proxmox Backup Server every night, and sync those backups to another PBS in another town. A second Proxmox backup runs every day at noon to my NAS. (I know, the 3-2-1 rule is not reached…)
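    Per guest, each run is essentially a vzdump job against the respective storage; something like this (VMID and storage names are made up):

        vzdump 101 --storage pbs-local --mode snapshot                      # nightly, to the local Proxmox Backup Server
        vzdump 101 --storage nas-backups --mode snapshot --compress zstd    # the extra noon run, to the NAS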

  • @[email protected]
    2 months ago

    I continuously back up important files/configurations to my NAS. That’s about it.

    IMO people who mirror or back up their media are insane… It’s such an incredible waste of space. Having a robust media library is nice, but there’s no reason you can’t just start over if you have data corruption or something. I have TB upon TB of media that I could redownload in a weekend if something happened (if I even wanted to). No reason to waste backup space, IMO.

    • @[email protected]
      2 months ago

      Maybe for common stuff, but some don’t want 720p YTS or YIFY releases.
      There are also some releases that don’t follow TVDB aired order (which Sonarr requires), and matching 500 episodes manually with deviating names isn’t exactly what I call ‘fun time’.
      And there are also rare releases that just aren’t seeded anymore in that specific quality, or aren’t present on Usenet.

      So yes: backing up some media files may be important.

      • @[email protected]
        2 months ago

        Data hoarding random bullshit will never make sense to me. You’re literally paying to keep media you didn’t pay for because you need the 4k version of Guardians of the Galaxy 3 even though it was a shit movie…

        Grab the YIFY; if it’s good, then get the 2160p version… No reason to data-hoard like that. It’s frankly just stupid considering you’re paying to store this media.

        • @[email protected]
          2 months ago

          This may work for you and please continue doing that.

          But I’ll get a moderate-bitrate 1080p version of whatever I actually want in the first place, rather than grabbing whatever I can to fill up my disk.

          And as I mentioned: matching 500 episodes (e.g. Looney Tunes and Disney shorts) manually isn’t fun.
          Much less so if you also want the exact release of a certain piece of media (music, for example) and need to play detective on MusicBrainz.

          • @[email protected]
            2 months ago

            Matching 500 episodes (e.g. Looney Tunes and Disney shorts) manually isn’t fun.

            With tools like TinyMediaManager, why in the absolute fuck would you do it manually?

            At this point, it sounds like you’re just bad at media management more than anything. 1080p H.265 video is at most 1.5-2 GB per file. That means that with even a modest network connection speed (500 Mbps, let’s say) you can realistically download about 5 TB of data over 24 hours… You could redownload your entire media library in less than 4-5 days if you wanted to.
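            (500 Mbps is roughly 62.5 MB/s, which works out to about 5.4 TB per day, so a 20 TB library is on the order of four days of downloading.)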

            So why spend ~$700 on two 20 TB drives, one to be used only as redundancy, when you can simply redownload everything you previously had (if you wanted to) for free? It’ll just take a little bit of time.

            Complete waste of money.

            • @[email protected]
              2 months ago

              I prefer Sonarr for management.
              The problem is the auto-matching.
              It just doesn’t always work.
              Practical example: Looney.Tunes.and.Merrie.Melodies.HQ.Project.v2022

              Some episodes are either not in the correct order, or their names deviate from how TVDB sorts them.
              Your best regex/automatching can do nothing about it if Looney.Tunes.Shorts.S11.E59.The.Hare.In.Trouble.mkv should actually be named Looney.Tunes.Shorts.S1959.E11.The.Hare.In.A.Pickle.mkv to be automatically imported.

              At some point fixing multiple hits becomes so tedious it’s easier to just clear all auto-matches and restart fresh.

    • @[email protected]
      2 months ago

      It becomes a whole different thing when you yourself are a creator of any kind. Sure you can retorrent TBs of movies. But you can’t retake that video from 3 years ago. I have about 2 TB of photos I took. I classify that as media.

      • @[email protected]
        2 months ago

        It becomes a whole different thing when you yourself are a creator of any kind.

        Clearly this isn’t the type of media I was referencing…

  • @[email protected]
    2 months ago

    Right now I have a cron job set to run on Monday and Friday nights; is this too frequent?

    Only you can answer this. How many days of data are you prepared to lose? What are the downsides of running your backup scripts more frequently?

  • Lucy :3
    2 months ago

    Every hour, automatically

    Never on my laptop, because I’m too lazy to create a mechanism that detects when it’s possible.

    • @[email protected]
      2 months ago

      I just tell it to back up my laptops every hour anyway. If it’s not on, it just doesn’t happen, but it’s generally on enough to capture what I need.

  • @[email protected]
    2 months ago

    No backup for my media. Only redundancy.

    For my Nextcloud data, any time I make major changes.

  • SavvyWolf
    2 months ago

    Daily backups here. Storage is cheap. Losing data is not.

  • @[email protected]
    2 months ago

    Local ZFS snap every 5 minutes.

    Borg backs everything up every hour to 3 different locations.

    I’ve blown away Docker folders of config files a few times by accident. So far I’ve only had to dip into the ZFS snaps to bring them back.
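    Roughly what the two layers look like (dataset, repository and path names are made up):

        zfs snapshot tank/appdata@auto-20250115-1205    # local snapshot, taken every 5 minutes
        borg create --stats user@offsite1:/backups/appdata-repo::'{hostname}-{now}' /mnt/appdata
        # the borg run happens hourly and is repeated for each of the 3 locations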

    • Avid Amoeba
      2 months ago

      Try ZFS send if you have ZFS on the other side. It’s insane. No file I/O, just a snapshot plus the time for the network transfer of the delta.
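      A minimal sketch of an incremental send (names made up); only the blocks that changed between the two snapshots go over the wire:

          zfs snapshot tank/appdata@2025-06-02
          # backup-host must already have the @2025-06-01 snapshot from the previous run
          zfs send -i tank/appdata@2025-06-01 tank/appdata@2025-06-02 | ssh backup-host zfs receive backup/appdata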