It’s Sunday somewhere already so why wait?
Let us know what you set up lately, what kind of problems you currently think about or are running into, what new device you added to your homelab or what interesting service or article you found.
I’ll post my ongoing things later/tomorrow but I didn’t want to forget the post again.
Why is it so hard to send large files?
Obviously I can just dump it on my server and people can download it from a browser but how are they gonna send me anything? I’m not gonna put an upload on my site, that’s a security nightmare waiting to happen. HTTP uploads have always been wonky, for me, anyway.
Torrents are very finicky with 2-peer swarms.
instant.io (torrents…) has never worked right.
I can’t ask everyone to install a dedicated piece of software just to very occasionally send me large files.
Sending is someone else’s problem. They have all sorts of different understandings and tools and I can’t deal with them all. So the only alternative is to set them up with an account in (e.g.) Nextcloud or just accept whatever Google service they use to send you a large file.
Sending other people files is easy in Nextcloud, just create a shared link and unshare when done. Set a password on the file itself.
Sending is someone else’s problem.
It becomes my problem when I’m the one who wants the files, and no free service is going to accept an 80 GB file.
It is exactly my point that I should not have to deal with third parties or something as massive and monolithic as Nextcloud just to do the internet equivalent of smoke signals. It is insane. It’s like someone tells you they don’t want to bike to the grocer 5 minutes away because it’s currently raining and you recommend them a monster truck.
OK 80 GB is for sure an edge case. Nextcloud won’t even work for that due to PHP memory limits, I think.
Interesting problem. FTP is an option, with careful instructions to an untutored user. Maybe rsync over a VPN connection if it is always the same sender.
Not even sure what else would reliably work, except Tanenbaum’s adage (never underestimate the bandwidth of a station wagon full of tapes).
Could you set a ‘password’ on the uploads? So the server will only accept and start the upload if the password is present. The password is a passphrase to make it easy to type in.
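One common way to do exactly that is HTTP basic auth in front of the upload path, so the server rejects anything without the passphrase before a single byte is stored. A minimal nginx sketch; the location, backend port, and htpasswd path are all made up for illustration:

```nginx
# Require a passphrase before any upload to /incoming/ is accepted
location /incoming/ {
    auth_basic           "uploads";
    auth_basic_user_file /etc/nginx/upload.htpasswd;  # htpasswd -c /etc/nginx/upload.htpasswd upload
    client_max_body_size 0;                      # don't cap large files
    proxy_pass           http://127.0.0.1:8080;  # whatever actually handles the upload
}
```

The sender then needs nothing beyond a browser or `curl -T bigfile -u upload:correct-horse-battery-staple https://example.com/incoming/`, and a dictionary passphrase is easy to read out over the phone.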
I sometimes create them a Nextcloud account and send them the credentials
On a related note, it would be nice if there was a shared storage option for self hosting. It wouldn’t be the same as self hosting, but more like distributed hosting where everyone pools storage they have available and we could have an encrypted sharing option.
You’re describing the world wide web, except giving others write access
I think that openssh or any ssh or ftp app should facilitate this.
Thanks for the mention :>
Yeah, copyparty was my attempt at solving this issue - a single python-file for receiving uploads of infinitely large files, usually much faster than other alternatives (ftp, sftp, nextcloud, etc.) especially when the physical distance to the uploader is large (hairy routing).
I’m not gonna put an upload on my site, that’s a security nightmare waiting to happen.
curious to hear your specific concerns on this; maybe it’s something that’s already handled?
I already saw copyparty but it appears to me to be a pretty large codebase for something so simple. I don’t want to have to keep up with that because there’s no way I’m reading and vetting all that code; it becomes a security problem.
It is still easier and infinitely more secure to grab a USB drive, a bicycle and just haul ass across town. Takes less time, too.
You could always toss it in a sandbox for some isolation :> but yeah I get you, all of the optional features does mean more code.
It’s a shame that browsers make stuff like chunked uploading so tricky, so even just the essentials would be a fair bit of logic, and you won’t get optimal upload speeds without sending chunks in parallel. And the corruption detection is also worth its weight in gold… Ah well, it is what it is hehe
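The chunk-and-verify idea itself is simple outside the browser; this is a sketch with standard tools of what copyparty automates (file names and sizes made up), where each chunk can be transferred and retried independently:

```shell
# Sender: split a large file into parts and hash everything
cd /tmp && rm -rf chunkdemo && mkdir chunkdemo && cd chunkdemo
dd if=/dev/urandom of=bigfile bs=1M count=8 2>/dev/null
split -b 3M bigfile chunk_              # fixed-size chunks: chunk_aa, chunk_ab, ...
sha256sum chunk_* bigfile > manifest.sha256

# ...each chunk is transferred (and retried/parallelized) on its own...

# Receiver: stitch the parts back together and detect corruption
cat chunk_* > reassembled
sha256sum bigfile reassembled           # hashes must match
```

A corrupted or truncated chunk shows up as a single failed hash in the manifest, so only that chunk needs resending.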
Currently trying to figure out how to create and maintain an internal CA in order to enable pod to pod TLS communication, while using letsencrypt for my public ingresses.
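If it helps, a bare-bones internal CA can be bootstrapped with nothing but openssl; a sketch where the names, paths, and lifetimes are placeholders (and on Kubernetes, cert-manager with a CA issuer is the more common way to automate the per-pod certs):

```shell
mkdir -p /tmp/internal-ca && cd /tmp/internal-ca

# 1. CA key + self-signed CA certificate (this is what pods must trust)
openssl req -x509 -newkey rsa:4096 -nodes -keyout ca.key -out ca.crt \
  -days 3650 -subj "/CN=homelab-internal-ca"

# 2. Server key + CSR for a pod's in-cluster service name
openssl req -newkey rsa:2048 -nodes -keyout svc.key -out svc.csr \
  -subj "/CN=myservice.default.svc.cluster.local"

# 3. Sign the CSR with the CA
openssl x509 -req -in svc.csr -CA ca.crt -CAkey ca.key -CAcreateserial \
  -days 365 -out svc.crt

# 4. Verify the chain
openssl verify -CAfile ca.crt svc.crt
```

The CA cert gets distributed to the pods' trust stores while Let's Encrypt keeps handling the public ingresses; the two chains never need to interact.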
Tried to set up custom domains using Nginx Proxy Manager and Let’s Encrypt DNS-01 challenges so I wouldn’t have to open any ports, and it worked!.. except not really?
Proxy Manager shows everything was successful but the domains don’t go anywhere. It seems to be because the TP-Link router from my ISP does DNS Rebinding protection… with no option to turn it off apparently… why…
So now I don’t know where to go. I’m not really fancying hosting DNS myself but if I can’t fix this any other way then I guess I’ll do it. Or maybe I should ditch the ISP TP-Link and get something I could flash OpenWRT on?
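For what it's worth, rebind protection only bites when a public DNS answer resolves to a private IP, so one workaround short of hosting full DNS is to answer those names locally so the ISP router never sees them. A dnsmasq sketch (domain and IP made up), e.g. on a Pi-hole or any always-on LAN box set as the DNS server via DHCP:

```conf
# /etc/dnsmasq.d/homelab.conf
# Answer the homelab domain locally with the proxy's LAN address,
# bypassing the ISP router's DNS rebind filtering entirely
address=/home.example.xyz/192.168.1.50
```

The Let's Encrypt DNS-01 side is unaffected, since that validation happens against the public zone, not your LAN resolver.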
Is the ISP supplied box also your wifi?
If not, IMHO I’d use the ISP equipment as a pass-through modem (if possible on that model?) and have a separate OpenWRT / pfSense firewall do all the heavy lifting for DHCP, DNS, ad blocking, etc
Depends if you’d then need another WAP, of course
It is also my Wifi, yeah. I didn’t even consider that’d complicate things further. It does have a “pass-through” option though.
Presuming you can put OpenWRT on it, it’ll be fine as a single box
IMHO, I just prefer having it all as separates and then fix / change / upgrade parts as I go - but I soon run out of places to hide them
A couple of days ago, after testing it myself for a few months to make sure I understood how everything works, I made the switch to NextCloud Calendar, and will no longer use Google Calendar.
This is the best part though… I somehow convinced my wife to do the same. She let me install the NextCloud app (optional for Calendar stuff, but it makes the setup easier) and DAVx5 on her phone (both from F-Droid, so DAVx5 was free). I exported and imported her calendar, and made sure the notifications were set up to her preferred default.
It’s multiple days later, and she hasn’t complained!
I’ve also moved all of my contacts over to NextCloud, but have yet to coerce my spouse to do the same.
Which calendar client did you use?
I thought the switch to Nextcloud Calendar was going to be simple, but DAVx5 is… not a clean-cut app.
- Did you find a way to sync from device to NC?
- Were you able to merge Google’s dumb export of 3 calendars?
I’ve been using Fossify Calendar for a while now and it’s been pretty great. I moved to it after the whole Simple apps getting sold drama when it happened.
I’m trying to figure out setting up TrueNAS scale and docker for the first time. Building a NAS and self hosting a few things from an old all in one mini PC.
I spun up a new Plex server with a decent GPU - and decided to try offloading Home Assistant’s Preview Voice Assistant TTS/STT to it. That’s all working as of yesterday, including an Ollama LLM for processing.
Last on my list is figuring out how to get Home Assistant to help me find my phone.
Got any links for howtos on this?
Sure! I mostly followed this random youtuber’s video for getting Wyoming protocols offloaded (Whisper/Piper), but he didn’t get Ollama to use his GPU: https://youtu.be/XvbVePuP7NY.
For getting the Nvidia/Docker passthrough, I used this guide: https://www.bittenbypython.com/en/posts/install_ollama_openwebui_ubuntu_nvidia/.
It’s working fairly great at this point!
I’m eternally sitting here putting off migrating my homelab from Docker to rootless Podman due to some rather janky patterns I use. It might be super smooth or it might not, so instead I just wait in endless decision paralysis.
Looking for a self-hosted period tracking app with companion android app. Have done literally zero investigation at this point but it’s on my todo.
period tracking app surveillance… how did we as society come to accept this?
That’s definitely one of those things I found bizarre and awful yet…entirely unsurprising. I can see how selling that data probably sounds like such a lucrative edge to marketing companies.
how did we as society come to accept this?
By not establishing ethical ~~lines~~ high-voltage containment fences on the advertising industry quickly enough, and letting them convince us “this is just how business works”, when their entire existence is about finding the scummiest ways to hack free will for profit.
Did System76 doing COSMIC light a fire under the GNOME devs’ asses?
Hehe I think you might have been replying to a different thread. :)
No idea how this happened lol
I have set up an Immich docker container and am slowly moving users and images over from Google Photos.
Replacing Google Photos is still on my to-do list. How do you like Immich so far? Did you compare it to any alternatives?
Interested in this too - immich gets so much viral hype I’m a little suspicious of it
I set it up a couple weeks ago. It’s alright; facial recognition works pretty well, the files are easy to manage, and setup was pretty straightforward (using docker).
Searching for images works fairly well, as long as you’re searching for content and not text. Searching ‘horse’ for example does a pretty good job showing you your pictures of horses, but often misses images containing the word horse. Not always, but it’s noticeable to me.
The mobile apps work well too; syncing files in the background as they appear, optionally creating albums based on folders. Two things I find missing though are the ability to edit faces/people in an image (you’ve gotta do that from a browser), and the ability to see what albums an image is in and quickly navigate to one.
It’s a developing project that’s well on its way. A good choice imo.
I’m 3 time zones away from my server and it hasn’t crashed yet after being gone for 3 days. I’m very proud of it.
I feel you. I did not expect mine to crash but I am in Japan and streamed a movie from my server on the West coast of North America.
That’s such a nice feeling
The absolute bliss
Same with me when I was in Brazil, it was chugging along just fine back in New England
My big problem is remote stuff. None of my users have aftermarket routers to easily manipulate their DNS. One has an Android modem thing which is hot garbage. I’m using a combination of making their Pi their DHCP server, and one user is running on Avahi.
Chrome, the people’s browser of choice, really, really hates HTTP, so I’m putting them on my garbage ######.xyz domain. I had plans to one day deal with HTTPS, just not this day. Locally I just use the domain for Vaultwarden, so the domain didn’t matter. But if people are going to be using it then I’ll have to get a more memorable one.
System updates have been a faff. I’m SSHing over Tailscale. When Tailscale updates it kicks me out, naturally. Which interrupts the session, naturally. Which stops the update, naturally. Also, it fucks up dpkg beyond what `dpkg --configure -a` can repair. I’ll learn to update in the background one day, or include Tailscale in unattended-upgrades. Honestly, I should put everything into unattended-upgrades.
Locally works as intended though, so that’s nice. Everything also works for my fiancee and I remotely all as intended, which is also nice. My big project is coalescing what I’ve got into something rational. I’m on the make it good part of the “make it work > make it good” cycle.
System updates have been a faff. I’m SSHing over Tailscale. When Tailscale updates it kicks me out, naturally. Which interrupts the session, naturally. Which stops the update, naturally.
Have a look at Screen. You can create a persistent terminal to start your update in, disconnect (manually or by connection loss), and resume the session when you reconnect, with it having completed the update while you were gone.
Slowly building up my self hosted test env in a VM on my gaming PC.
Most recently playing with homepage so I don’t have to remember as many sub domains.
Eventually I will get the *arr stack going so my jellyseerr instance is more automated.
I’m trying to install Docker (only Docker) on the external HDD… I have some tutorials, but I can’t get it to work.
What exactly are you trying and on which operating system are you?
I am setting up the server on a Raspberry Pi 4 with RaspiOS. I want to download torrents and I have connected an external USB 3 HDD for it… I was told that you can change the Docker directory to the external HDD so the containers live there. That way the microSD would get less wear, and in case of failure I would only have to reinstall RaspiOS and change the directories again… All the configuration, Docker containers, etc. would be on the HDD… So far I have not succeeded, although I have followed 2 or 3 tutorials.
You can also mount everything on the Raspberry Pi, leaving the microSD only for booting, but that is more complicated…
Excuse my DeepL english
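For reference, the usual way to move Docker's storage onto the external drive is the `data-root` setting in the daemon config; a sketch, assuming the HDD is mounted at `/mnt/hdd` (adjust to your actual mount point):

```json
{
  "data-root": "/mnt/hdd/docker"
}
```

That goes in `/etc/docker/daemon.json`. Stop Docker first (`sudo systemctl stop docker`), optionally copy the old `/var/lib/docker` contents over to keep existing images, then start it again. One gotcha: the drive generally needs a Linux filesystem like ext4; images and volumes won't work properly on FAT/NTFS.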
I haven’t tried that but good luck!
Half-finished projects
Same as it ever was.
Same as it ever was.
Crazy enough, I have everything going that I want to on my server!
- *arr suite and jellyfin
- traefik reverse proxy with crowdsec + bouncer for some sites (e.g. not documents or media)
- paperless-ngx for documents
- immich for photos
- leantime to manage personal projects
- BookStack for a personal wiki
- calibre-web for my library
- syncthing for file and music syncing so I don’t have to stream music
- valheim server for me and my friends
- boinc for turning my server to a productive heater in the winter
- home assistant for my in-renovation smart home
As far as my server goes, I have everything I need. Maybe setting up something for sharing files over the web if needed. I used nextcloud for that before it killed itself completely and I realized I never really needed it.
Next is working on my smart home, because we had to fully strip the house to renovate. KNX first, Z-Wave for things that KNX doesn’t have or that are crazy expensive, ESPHome for everything the other two can’t accomplish. Minimal 2.4 GHz interference, and as little reliance as possible on flaky wireless in a brick house.