🇨🇦

  • 3 Posts
  • 43 Comments
Joined 1 year ago
Cake day: July 1st, 2023



  • Sure, Cloudflare provides other security benefits, but that’s not what OP was talking about. They just wanted/liked the plug-and-play aspect, which doesn’t need Cloudflare.

    Those ‘benefits’ are also really not necessary for the vast majority of self-hosters. What are you hosting, from your home, that garners that kind of attention?

    The only things I host from home are private services for myself or a very limited group, which, as far as ‘attacks’ go, just get the occasional script kiddie looking for exposed endpoints. Nothing that needs mitigation.




  • I set up borg around 4 months ago using option 1. I’ve messed around with it a bit, restoring a few backups, and haven’t run into any issues with corrupt/broken databases.

    I just used the example script provided by borg, but modified it to include my docker data, and write info to a log file instead of the console.

    Daily at midnight, a new backup of around 427 GB of data is taken. At the moment that takes 2-15 minutes to complete, depending on how much data has changed since yesterday, though the initial backup was closer to 45 minutes. Then old backups are trimmed: backups less than 24 hours old are kept, along with 7 dailies, 3 weeklies, and 6 monthlies. Anything outside that scope gets deleted.

    With the compression and de-duplication borg does, the 15 backups I have so far (5.75 TB of data) currently take up 255.74 GB of space. 10/10 would recommend on that aspect alone.

    /edit, one note: I’m not backing up Docker volumes directly, though you could do that just fine. Anything I want backed up lives in a regular folder that’s then bind-mounted into a Docker container (including things like paperless-ngx’s database).
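The trim step described above (keep everything under 24 hours, plus 7 dailies, 3 weeklies, and 6 monthlies) can be sketched in Python. This is an illustrative simplification of what `borg prune --keep-within 24H --keep-daily 7 --keep-weekly 3 --keep-monthly 6` does, not borg’s exact algorithm; the function name and structure here are made up:

```python
from datetime import datetime, timedelta

def prune_keep(timestamps, now, within_hours=24, daily=7, weekly=3, monthly=6):
    """Return the subset of backup timestamps to keep (simplified sketch)."""
    # Sort newest-first so "the latest backup in each period" is seen first.
    ts = sorted(timestamps, reverse=True)
    # Rule 1: keep everything taken within the last `within_hours`.
    keep = {t for t in ts if now - t <= timedelta(hours=within_hours)}

    def keep_latest_per_period(key, count):
        seen = []
        for t in ts:
            k = key(t)
            if k not in seen:  # first (= newest) backup of this period
                seen.append(k)
                if len(seen) <= count:
                    keep.add(t)

    keep_latest_per_period(lambda t: t.date(), daily)              # dailies
    keep_latest_per_period(lambda t: t.isocalendar()[:2], weekly)  # weeklies
    keep_latest_per_period(lambda t: (t.year, t.month), monthly)   # monthlies
    return keep
```

Anything not returned by `prune_keep` is what the trim step would delete.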



  • Darkassassin07@lemmy.ca to Selfhosted@lemmy.world · Backup solutions · 4 months ago

    After reading this thread and a few other similar ones, I tried out BorgBackup and have been massively impressed with its efficiency.

    Data that hasn’t changed, is stored under a different location, or otherwise is identical to what’s already stored in the backup repository (both in the backup currently being created and all historical backups) isn’t replicated. Only the information required to link that existing data to its doppelgangers is stored.

    The original set of data I’ve got being backed up is around 270 GB; I currently have 13 backups of it. Raw, that’s 3.78 TB of data. After compression with zlib, that’s down to 1.56 TB. But the incredible bit is after de-duplication (the process described in the paragraph above), the raw data stored on disk for all 13 of those backups is just 67.9 GB.

    I can mount any one of those 13 backups to the filesystem, or extract any of the 3.78 TB of files directly, from that backup repository of just 67.9 GB of data.
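The de-duplication described above can be illustrated with a toy content-addressed chunk store: identical chunks are stored once, and a backup is just a list of chunk hashes. Borg actually uses content-defined chunking plus encryption and compression; this fixed-size-chunk sketch only shows why repeated data costs almost nothing extra:

```python
import hashlib

class DedupStore:
    """Toy content-addressed chunk store (illustrative, not borg's format)."""

    def __init__(self, chunk_size=4):
        self.chunk_size = chunk_size
        self.chunks = {}   # hash -> chunk bytes, each unique chunk stored once
        self.backups = {}  # backup name -> ordered list of chunk hashes

    def backup(self, name, data: bytes):
        refs = []
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            h = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(h, chunk)  # dedup: only new chunks cost space
            refs.append(h)
        self.backups[name] = refs

    def restore(self, name) -> bytes:
        return b"".join(self.chunks[h] for h in self.backups[name])

    def stored_bytes(self) -> int:
        return sum(len(c) for c in self.chunks.values())
```

A second backup of mostly-unchanged data adds only the chunks that are actually new, which is how 13 backups of 270 GB can fit in under 70 GB.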





  • Interesting, that I was not aware of. I’ve never run into a scenario where I’ve had to add/edit while offline.

    When using Vaultwarden, however, you can add/edit without internet access as long as the client can still reach the server (i.e. they’re on the same LAN, or are the same machine). You’d still be fine to add/edit while your home WAN is out, for example, just not on the go.

    Plus there’s the no-internet package mentioned in that link, but it’s limited to the desktop application.


  • Bitwarden is (primarily) a single database synced between devices via a server. A copy is kept locally on each device you sign into.

    Changes made to an offline copy will sync to the server and your other devices once back online (with the most recent change to each individual item being kept if there are multiple changes across several devices). /edit: the local copy is for offline access to your passwords; edits must be made with a connection to the server your account resides on, be that Bitwarden’s or your own.

    If you host your own sync server via Vaultwarden, you can easily maintain multiple databases (called vaults), either with multiple accounts or with a single account and the Organizations feature (options for creating vaults separate from your main one and sharing those vaults with multiple accounts). You can do this with regular Bitwarden as well, but you have to pay for the privilege.

    Using Vaultwarden also gives you all the paid features of Bitwarden for free (as it’s self-hosted instead of using the public servers).

    I’ve been incredibly happy with it after setting it up ~3 months ago. Worth looking into.
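The per-item “most recent change wins” merge described above can be sketched like this. It’s an illustrative model only, not Bitwarden’s actual sync protocol; here a vault is modelled as a dict of item id → (revision timestamp, value):

```python
def merge_vaults(local, remote):
    """Per-item last-write-wins merge of two vault copies (sketch).

    For each item id, the side with the newer revision timestamp wins;
    items present on only one side are kept as-is.
    """
    merged = dict(local)
    for item_id, (ts, value) in remote.items():
        if item_id not in merged or ts > merged[item_id][0]:
            merged[item_id] = (ts, value)
    return merged
```

Running this in both directions is what lets several devices converge on the same vault after being offline.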





  • Darkassassin07@lemmy.ca to Selfhosted@lemmy.world · Hosting private UHD video · edited · 5 months ago

    Emby, Jellyfin, and Plex will all detect connection speed, adjust quality settings, and transcode the media so it plays back without buffering.

    I wouldn’t recommend Plex. They’ve been steadily moving away from self-hosted private media servers and towards just serving commercial content to you.

    I myself run Emby, as I’m rather fond of their development team and their attitude towards privacy. It does require payment for ‘Emby Premiere’ (i.e. the installable client apps and transcoding features), but it has single-payment lifetime licenses as well as monthly plans.

    Jellyfin is a popular open-source option built on a fork of Emby’s older open-source code, from before Emby went closed source.

    Either would work for you.
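The speed-detection/quality-adjustment step all three servers do boils down to picking the highest quality tier the measured bandwidth can sustain, then transcoding to it. A sketch of that decision, with a made-up bitrate ladder and headroom factor (not any server’s real configuration):

```python
# Illustrative quality tiers: (label, required Mbps). Real servers derive
# these from the source file and client capabilities.
BITRATE_LADDER = [
    ("2160p", 25.0),
    ("1080p", 8.0),
    ("720p", 4.0),
    ("480p", 1.5),
]

def pick_quality(measured_mbps, ladder=BITRATE_LADDER, headroom=1.2):
    """Pick the highest tier whose bitrate (with headroom for bandwidth
    fluctuation) fits the measured connection speed; fall back to the
    lowest tier otherwise."""
    for label, mbps in ladder:
        if measured_mbps >= mbps * headroom:
            return label
    return ladder[-1][0]
```

If the chosen tier is below the source quality, the server transcodes down to it; otherwise it can direct-play the original file.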