Hello selfhosted! Sometimes I have to transfer big files or large amounts of small files in my homelab. I used rsync, but specifying the IP address and the folders and everything is a bit fiddly. I thought about writing a bash script, but before I do that I wanted to ask you about your favourite way to achieve this. Maybe I am missing out on an awesome tool I wasn’t even thinking about.

Edit: I settled on SFTP in my GUI file manager for now. When I have some spare time I will look into the other options too. Thank you for the helpful information.

  • @[email protected]
    • sftp for quick shit like config files off a random server, because it’s easy and on by default with sshd in most distros
    • rsync for big one-time moves (see the sketch after this list)
    • smb for client-facing network shares
    • NFS for SAN usage (mostly storage for virtual machines)
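
    For the one-time rsync moves, a rough sketch with made-up paths and a made-up hostname:

      # archive mode, resume partial transfers, show progress;
      # the trailing slash copies the *contents* of the source directory
      rsync -avP /data/media/ [email protected]:/srv/media/
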
  • @[email protected]

    Ye old samba share.

    But I do like using Nextcloud. I use it for syncing my video projects so I can pick up where I left off on another computer.

  • @[email protected]

    rsync if it’s a from/to I don’t need very often

    More common transfer locations are done via NFS
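
    For reference, a rough sketch of such an NFS mount (server name, export path, and mount point are made up):

      # one-off mount
      sudo mount -t nfs nas.lan:/export/data /mnt/data
      # or permanently, via a line like this in /etc/fstab:
      # nas.lan:/export/data  /mnt/data  nfs  defaults,_netdev  0  0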

  • @[email protected]

    By “homelab”, do you mean your local network? I tend to use shared folders, kdeconnect, or WebDAV.

    I like WebDAV, which I can activate on Android with DavX5 and Material Files, and I use it for Joplin.

    The nice thing about this setup is that I also have a certificate-secured OpenVPN, so in a pinch I can access it all remotely by activating that VPN, then disconnecting when I’m done.
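
    For the desktop side, a rough sketch of mounting a WebDAV share with davfs2 (made-up URL and mount point; assumes the davfs2 package is installed and credentials are entered when prompted):

      sudo mount -t davfs https://home.example.net/dav /mnt/dav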

  • hendrik

    I’d say use something like zeroconf(?) for local computer names. Or give them names in your DNS forwarder (router), hosts file, or ssh config. Along with shell autocompletion, that might do the job. I use scp and rsync, I have an NFS share on the NAS, and I keep some bookmarks in GNOME’s file manager, so I just click on those or type scp or rsync with the target computer’s name.
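
    A rough sketch of the ssh config route — the host name, IP, and paths are made up, and both scp and rsync pick the alias up automatically:

      # ~/.ssh/config
      Host nas
          HostName 192.168.1.50
          User me

      # afterwards, no IP address needed:
      scp backup.tar.gz nas:/srv/backups/
      rsync -avP ~/photos/ nas:/srv/photos/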

  • @[email protected]

    People have already covered most of the tools I typically use, but one I haven’t seen listed yet that is sometimes convenient is python3 -m http.server, which runs a small web server sharing whatever is in the directory you launched it from. I’ve used it to download files onto my phone when I didn’t have the right USB cables/adapters handy, as well as to get data out of VMs when I didn’t want to bother setting up something more complex.
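
    A rough sketch of that trick (directory and port are made up; the server listens on all interfaces by default):

      cd ~/stuff-to-share
      python3 -m http.server 8000
      # then open http://<this machine's IP>:8000 from the phone, VM, or other box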

    • @[email protected]

      As I understand it, establishing the connection relies on a relay server. So this would not work on a local network without a relay server and would, by default, try to reach a server on the internet to make connections.

  • @[email protected]

    rsync is indeed fiddly. Consider SFTP in the GUI file manager of your choice. I mount the remote folder in my file browser and grab the files I need. No terminal needed, and I can put the folders as favorites in the sidebar.
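
    A rough sketch with a made-up host and path — GVfs-based file managers such as GNOME Files accept the same sftp:// location in their address bar, and sshfs gives you a plain directory from a terminal:

      # mount over SFTP via GVfs (shows up in the file manager sidebar)
      gio mount sftp://[email protected]/srv/media
      # or as an ordinary directory with sshfs
      mkdir -p ~/mnt/nas
      sshfs [email protected]:/srv/media ~/mnt/nas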

    • Lv_InSaNe_vL

      If you want to use the terminal though, there is scp, which is supported on both Windows and Linux.

      It’s just scp [file to copy] [username]@[server IP]:[remote location]
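
      For example, with made-up names (add -r for whole directories):

        # local file to the server
        scp backup.tar.gz [email protected]:/srv/backups/
        # and the other way around
        scp [email protected]:/srv/backups/backup.tar.gz .
        # whole directory
        scp -r ~/photos [email protected]:/srv/photos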

  • @[email protected]

    WinSCP for editing server config

    Rsync for manual transfers over slow connections

    ZFS send/receive for what it was meant for

    Samba for everything else that involves mounting on clients or other servers.
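
    For the ZFS send/receive part, a rough sketch with made-up pool, dataset, and host names (typically run as root):

      # snapshot, then replicate to another machine over SSH
      zfs snapshot tank/data@2024-06-01
      zfs send tank/data@2024-06-01 | ssh backuphost zfs receive backup/data

      # later, only send what changed since that snapshot (incremental)
      zfs snapshot tank/data@2024-07-01
      zfs send -i tank/data@2024-06-01 tank/data@2024-07-01 | ssh backuphost zfs receive backup/data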