• 2 Posts
  • 62 Comments
Joined 1 year ago
Cake day: June 18th, 2023


  • I use Technitium DNS as both the DHCP and DNS server on my network. My ISP router’s DHCP is turned off, and its primary DNS IP points to the Technitium server. I have roughly 66-67 network devices online at any given time, mostly wireless (think WiFi locks, lights, outlets, etc.), plus phones, gaming systems, and whatever else happens to be connected.

    To manage my IPs I use an Airtable-style database via Baserow, also self-hosted. From my router’s records, I copied every MAC address I found into a column of a Baserow table, added the device name (or a friendly name) to another column, and recorded the static IP I want each device to use. I keep the ranges organized: 192.168.1.1-10 is mobile devices, 192.168.1.11-30 is IoT, etc.
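
    As an illustration of that range plan (not my actual setup, just a minimal sketch with made-up names and addresses), Python’s standard ipaddress module can check that each assignment landed in its intended range:

    ```python
    import ipaddress

    # Hypothetical range plan mirroring the scheme above (adjust to taste).
    RANGE_PLAN = {
        "mobile": ("192.168.1.1", "192.168.1.10"),
        "iot":    ("192.168.1.11", "192.168.1.30"),
    }

    def category_for(ip):
        """Return the category whose range contains this IP, if any."""
        addr = ipaddress.ip_address(ip)
        for name, (start, end) in RANGE_PLAN.items():
            if ipaddress.ip_address(start) <= addr <= ipaddress.ip_address(end):
                return name
        return None

    # Example: flag assignments that landed outside their intended range.
    assignments = {
        "aa:bb:cc:dd:ee:01": ("phone-sarah", "192.168.1.4", "mobile"),
        "aa:bb:cc:dd:ee:02": ("wifi-lock-front", "192.168.1.42", "iot"),
    }
    for mac, (name, ip, intended) in assignments.items():
        actual = category_for(ip)
        if actual != intended:
            print(f"{name} ({mac}) is at {ip}, which is {actual!r}, not {intended!r}")
    ```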

    Setting the network up this way accomplishes a few things. Any new device that powers on and asks the router for an IP assignment gets nothing, since DHCP is turned off there; it then searches the network for an available DHCP server, which lands squarely on the Technitium DNS server, and it gets its address from there. I also have ad blocking enabled on the same server, so I get network-wide ad blocking that actually works. (You’d be amazed at how much telemetry a TV sends out for every single remote keypress!) I’ve been able to block all of that. The DNS server also lets you define DHCP address ranges; it really is a very full-featured server and probably overkill for a home network. I’ve only scratched the surface of what it can do.

    If you don’t want to fuss with Technitium DNS, there’s AdGuard Home or even Pi-hole; both block ads (or you can simply disable that function) and both can also act as a DHCP server.

    Or, if you’re willing to spend a few hours configuring it, you could run your own DHCP server in a VM or on a dedicated device such as a Raspberry Pi.

    Whichever route you take, it’s important to set your DHCP lease time long enough that if you have to reboot the DHCP server for a kernel update, or it crashes, devices won’t lose their leases; some poll regularly to check for connectivity (my Linux computer does this a lot, though I don’t remember whether it’s KDE or Arch doing it). Running the DNS server also lets you build your own custom “domain” scheme, if you will, so you could assign, say, your self-hosted calendar to http://calendar.local or http://calendar.internal.
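
    As a quick sanity check that such a custom internal name resolves through the self-hosted server, here’s a minimal sketch using the dnspython library; the server IP and hostname below are placeholders, not my real setup:

    ```python
    import dns.resolver  # pip install dnspython

    # Point the resolver directly at the self-hosted DNS server (placeholder IP).
    resolver = dns.resolver.Resolver(configure=False)
    resolver.nameservers = ["192.168.1.2"]

    # Placeholder internal name; substitute whatever your DNS server actually serves.
    answer = resolver.resolve("calendar.internal", "A")
    for record in answer:
        print("calendar.internal ->", record.address)
    ```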

    By setting up a dedicated DHCP server, whether manually or via one of the ad-blocking systems, you can also refuse leases to ‘foreign’ devices, i.e. anything not in your DHCP table. This adds a small element of extra security for your WiFi, though it’s far from bulletproof if someone knows your IP range and subnet mask. It also makes a future router upgrade or replacement easier, since there are only two settings to change on the new router (DHCP server off, and optionally point DNS at your self-hosted server).


  • Why not use a different DDNS service? There are plenty out there. :) I think that may solve your issue. I used freemyip.com for a while and never had a problem issuing Let’s Encrypt SSL certificates with it. At the moment I’m on Cloudflare Tunnels, so it’s all automatic with them; I know that’s a huge trust issue for a lot of people, but I don’t mind it for my stuff. I do still like to keep my DDNS around as a backup service.
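
    As an aside, most DDNS providers expose a simple HTTP update endpoint you can hit from a cron job or script. Here’s a rough Python sketch of that pattern; the URL format and token below are placeholders (check your provider’s docs for the real update URL):

    ```python
    import urllib.request

    # Placeholder values; substitute your provider's real update URL and token.
    UPDATE_URL = "https://example-ddns-provider.com/update?token=YOUR_TOKEN&domain=yourname.example.com"

    def update_ddns():
        """Hit the provider's update endpoint so the record tracks our current public IP."""
        with urllib.request.urlopen(UPDATE_URL, timeout=10) as response:
            print("DDNS update response:", response.read().decode().strip())

    if __name__ == "__main__":
        # Typically run from cron or a systemd timer every few minutes.
        update_ddns()
    ```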


  • I’ve been using Tailscale connected to my own domain, with Authentik as my OIDC/SSO identity provider, which gives me the MFA-backed custom OIDC login Tailscale lets you use. All I needed to do was set up a WebFinger endpoint; once Tailscale verified my domain, I could hand over my OIDC settings. Tailscale has been quite simple to use over the last year or so, and being able to log into my admin console, and enroll any device, through Authentik’s front end gives me peace of mind that it’s quite secure. (All of this runs on a Proxmox server, BTW.)
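
    For anyone curious what that WebFinger piece looks like, here’s a minimal sketch of a handler serving the kind of response Tailscale’s custom-OIDC flow expects at /.well-known/webfinger. The account and issuer URL are placeholders (your Authentik issuer depends on your application slug), so treat this as a rough outline rather than a drop-in config:

    ```python
    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Placeholders: use your real account and your Authentik application's issuer URL.
    ACCOUNT = "acct:you@example.com"
    ISSUER = "https://authentik.example.com/application/o/tailscale/"

    class WebFingerHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if not self.path.startswith("/.well-known/webfinger"):
                self.send_error(404)
                return
            body = json.dumps({
                "subject": ACCOUNT,
                "links": [{
                    # Points the relying party at the OIDC issuer for this domain.
                    "rel": "http://openid.net/specs/connect/1.0/issuer",
                    "href": ISSUER,
                }],
            }).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/jrd+json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("0.0.0.0", 8080), WebFingerHandler).serve_forever()
    ```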

    One may argue for self-hosting WireGuard instead, and I agree it’s quite easy to do, especially with something like wg-easy, which makes adding phones to your network simple. My concern was having to poke a hole in my firewall for the WireGuard traffic to reach the server; since moving to Tailscale, I don’t have any open ports on the router at all. I think that’s primarily why the Jupiter Broadcasting guys push it so much on their podcasts. Not to mention one of the hosts is a Tailscale employee, so that probably helps a bit.

    As for funding, both Nebula and Tailscale cater to enterprise customers, so you have some assurance that they’d have to answer to those customers if they ever revoked or ruined the service. :)

    With Tailscale, you get up to 100 devices for free, and it’s a simple command to install on any client via the CLI, including an Apple TV for example. On my phone, Tailscale stays connected 24/7 to my exit node, which is my Proxmox server, with my Raspberry Pi as a backup exit node. So even when I’m on the road or away from home, I’m always on my home network (unless blocked by overzealous sysadmins on their public WiFi networks). There’s not much to manage from the phone; it’s ‘set and forget’ really. Once you have it all configured it just runs in the background, and they don’t decrypt your traffic, much less care what goes through it.


  • I took a quick read of the comments and I apologize in advance if this has been suggested already.

    I use a self-hosted DNS server (AdGuard Home). I ran Technitium DNS for a long while but moved over recently so I could do some more fine-grained blocking as needed (an adult special-needs member of the household sometimes needs limited internet). It also acts as a DHCP server, so it takes DHCP assignments away from the router entirely. As it happens, I got to experience the benefit of this setup live this week when my main router went down: I switched to a spare router (an ISP-provided one), and all I had to do was turn its DHCP off, optionally point its DNS to my AdGuard Home address, and set up the SSIDs, and I was back in business. All of my devices happily reconnected and grabbed their assigned IPs.

    In short, if you have a spare computer, an SBC such as a Raspberry Pi, or whatnot, you can easily host something like that and never have to worry about setting those assignments again.




  • I’ve seen a few mentions of Pi-hole and AdGuard Home. I started on Pi-hole, then moved to AdGuard Home for ad blocking, and then heard about and switched to Technitium DNS, which is sort of overkill for our needs, but with the right ad lists it’s fantastic at blocking advertisements on my home network. Super fast install too, even on a Raspberry Pi 2. :) I run that alongside Proxmox VE (protected behind an OIDC login) and several other containers on my cranky old Dell desktop server.

    Mostly Vaultwarden, plus a few other services for private home use, such as PairDrop for sharing between systems and Pingvin Share, a self-destructing file sharing server for the rare occasions we need to send documents to our attorney’s office.

    I also run:

    • Home Assistant
    • Transmission (Dockerized) so I can help contribute to the Linux community and share the ISOs.
    • Authentik, for some of my externally facing sites. It can act sort of like a reverse proxy if you configure it to, and I love that I can simply identify myself with my WebAuthn device, skipping passwords entirely. :)

    With Authentik set up, I can log in to things like my FreshTomato router and Technitium DNS (both use HTTP auth headers) and Memos, which uses OIDC/SSO and is meant to replace our Google Keep notes.

    • Tailscale, which I connect to from my phone when away from home so I’m always on my own network. Some hotspots block it, so I avoid those as much as possible.
    • Wallos, to help keep track of our recurring subscriptions.
    • Grafana and Prometheus - both are staged and ready for configuration, and I’ll get around to them eventually.
    • InfluxDB - I plan on moving Home Assistant logging to it soon, which should tie in nicely with Grafana later.
    • Ben Phelps’ Homepage - the main dashboard my wife and I use to access our server. Quite simply one of the best dashboards, IMHO.
    • Wyze Cam Bridge - one of the better services for logging into your Wyze cams and easily converting their streams to RTSP, RTMP, or HLS. I feed that into my Home Assistant security dashboard.
    • Baserow - a good Airtable alternative. I use it to keep track of my static IP assignments, a sleep tracker (I suffer from insomnia), and other data points. It’s pretty amazing. I even created a pain-logging form for my wife: she just opens it, answers basic questions about her pain levels, and it pushes the answers to the database for later retrieval.
    • Joplin Server - sorry, I don’t have the link, but it’s installed via Compose. I use Joplin on my phone and computer to keep my code snippets. I’ve tried Obsidian, which didn’t really meet my needs, and also Anytype, but that’s not self-hosted. Joplin Server works for me, and it’s come in handy a time or two on the road.
    • BookStack - my grand plan is to build a wiki for my family to use in the event something happens to me, so they’ll know how to manage the server, with nice screenshots and step-by-step instructions. It’s protected behind Authentik’s OIDC login.
    • IT-Tools - hands down one of the coolest self-hosted tool sets you can use.
    • Webcheck - an all-in-one OSINT tool for analyzing any website; https://web-check.xyz/ is their demo site. :)
    • Stirling PDF - kind of like a Swiss-army knife for PDFs. :)
    • Dozzle - for those times when you really need to see your Docker logs but are too lazy to run a docker logs --follow command.

    I still use Portainer CE and am happy with it. I may try Dockge or the others someday, but it’s fine for what I need (it’s also protected by OIDC).

    I’m sure I’ve missed a few, but this post has gone on long enough. :)


  • You can always use something like SSHwifty. It keeps your logins in your browser’s session data, never on the server, and it lets you remote into your local systems from anywhere on the web if you want to. With Tailscale, once you’re connected to your tailnet you can SSH into pretty much any of your devices, as long as subnet routing is enabled, I believe. I’ve never had an issue with mine refusing SSH connections.
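
    Once the tailnet is up, an SSH connection over it is just an ordinary SSH connection to the device’s Tailscale IP. A minimal Python sketch with the paramiko library; the 100.x.y.z address and username below are placeholders for whatever your tailnet assigns:

    ```python
    import paramiko  # pip install paramiko

    # Placeholder Tailscale IP and username; use your device's actual tailnet address.
    HOST = "100.64.0.10"
    USER = "pi"

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # fine for a home lab
    client.connect(HOST, username=USER)  # key-based auth assumed

    stdin, stdout, stderr = client.exec_command("uptime")
    print(stdout.read().decode().strip())
    client.close()
    ```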




  • node815@lemmy.world to Selfhosted@lemmy.world · WeatherStar 4000+ Emulator (7 months ago)

    From their readme. I asked about that last night and the dev replied and pointed me to it. :)

    Kiosk mode

    Kiosk mode can be activated by a checkbox on the page. Note that there is no way out of kiosk mode (except refresh or closing the browser), and the play/pause and other controls will not be available. This is deliberate, as a browser’s kiosk mode is intended not to be exited or significantly modified.

    It’s also possible to enter kiosk mode using a permalink. First generate a Permalink, then to the end of it add &kiosk=true. Opening this link will load all of the selected displays included in the Permalink, enter kiosk mode immediately upon loading and start playing the forecast.


    I didn’t see IIS mentioned, but I didn’t take a close look at the code. They give you a docker run command to set it up, and I converted it to a docker compose file so I can run it later. All of this runs on a Debian 12 system, so if IIS is needed at all, I’d wager that’s only for a Windows setup.

    I have mine embedded in Home Assistant now as an iframe using the kiosk mode setting, which works.
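
    Building the kiosk link, as described in the readme excerpt above, is just a matter of appending a query parameter to the generated permalink. A trivial Python illustration; the permalink below is made up:

    ```python
    from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

    def add_kiosk(permalink):
        """Append kiosk=true to an existing permalink's query string."""
        parts = urlparse(permalink)
        query = dict(parse_qsl(parts.query))
        query["kiosk"] = "true"
        return urlunparse(parts._replace(query=urlencode(query)))

    # Made-up permalink for illustration only.
    print(add_kiosk("https://weatherstar.example.com/?latLonQuery=Some+Town"))
    ```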



  • Authentik is my IdP, so I put it in front of all my publicly facing apps that support OIDC login. For example, I can log into my Portainer instance from an external network, but to do so I log into Authentik first, which then sends me on to the service.

    For the apps that only support HTTP auth headers, like I said, Pomerium acts as the service that passes my credentials on to the device. Admittedly, Authentik can do this too without Pomerium (through its flow settings), but I found Pomerium much easier to set up for this and haven’t looked back or felt the need to change it.
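
    To make the header-passing idea concrete, here’s a minimal sketch of what an upstream app sees when an auth proxy injects identity headers. The header names vary by proxy and configuration, so the ones below are placeholders rather than Pomerium’s or Authentik’s exact defaults:

    ```python
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class UpstreamApp(BaseHTTPRequestHandler):
        def do_GET(self):
            # Placeholder header names; check your proxy's docs for the real ones.
            user = self.headers.get("X-Forwarded-User")
            email = self.headers.get("X-Forwarded-Email")
            if user is None:
                # No identity header means the request didn't come through the proxy.
                self.send_error(401, "Missing identity headers")
                return
            body = f"Hello {user} <{email}>".encode()
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("127.0.0.1", 9000), UpstreamApp).serve_forever()
    ```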


  • Along with that, I use Pomerium for apps that accept HTTP auth headers, for example my FreshTomato-flashed router, which only has an HTTP login dialog. This lets me log in from the road if I need to manage something like rebooting it or updating firewall rules.

    My access flow is this :

    router.example.com —> Cloudflare Tunnel —> Pomerium —> Authentik —> Router’s GUI

    It works flawlessly. I don’t use it often, but when I do, it helps. I also had it enabled for AdGuard Home, but I moved to Technitium DNS, which I prefer; that doesn’t do HTTP header auth, so as far as I’m aware it isn’t fully compatible with Pomerium.



  • I am a former IT desktop drone…er…support worker. I used to swap towers for my local municipality back when Windows XP was being replaced with 7. I saw passwords on post-its stuck to monitors and mouse pads, and even under keyboards or in keyboard drawers (I had to get under desks to do the swaps). Our policy was to remove those whenever we saw them and trash them in a can across the building, or in a different building entirely. They had a standard 90-day password cycle and most people couldn’t handle it; I often answered the phone to “unlock” an account after three failed attempts. My all-time favorite, when helping an end user with software, was encountering someone’s “God Mode” icon from the registry hacks that used to float around. Everyone had admin privileges anyway (ironically), so it wasn’t really needed.

    Their primary server admins and IT folks in the main office were top notch, though. There was never any downtime, and the main security guy was very strict about making sure policy was adhered to. As desktop support, we didn’t have the master password to decrypt a GPG-protected laptop and had to bring it to him whenever a user locked themselves out. With great consternation, only a few machines were allowed to stay on XP, and those were VLAN’d and isolated from the outside world.

    The rest of the server admins seemingly handled everything with ease. The fun part was when a third party came in to do a security audit. There were no problems on the server side, but the audit wasn’t a complete success. They did the ol’ drop-a-flash-drive-in-random-locations test: knowing human nature, they knew someone would pick one up, plug it in, and be baited by an Excel file that looked like financials. Unbeknownst to the user, it pinged their reporting server with the drive ID, which was later reported back. They also did physical security penetration tests, the walk-in-behind-you kind. I remember a group of guys without company ID badges trying to follow me into the main IT office. I stopped them and asked who they were and what they wanted (this was a government building), and the look of confusion mixed with satisfaction that I had stopped them was priceless. I let the head IT guy know who was at the door and left it up to him to decide whether to unlock it for them.

    I now work a help desk position for a software company and miss those days of desktop support. But I know for a fact that IT guys and gals don’t get enough recognition. They are the understated backbone of a company’s well-being, especially since holidays and weekends are prime time for systems to fail and they’re practically on call no matter what.


  • I’m testing it, and it seems to run a sync every 5 minutes. It handles standard IMAP and POP inboxes. There’s no auth on the main page, so they appropriately caution against exposing it to the public web. They’re planning to add more support for Gmail and the like:

    https://github.com/bandundu/email-archiver/issues/6

    It installs in debug mode by default, which may or may not be a red flag depending on your security model.

    The email search is fast but could use work; I will say it is VERY early in development. For downloading email for later storage, though, it should do. It stores your emails in a SQLite database in the same directory as the installer, so with a small tweak to the compose file you should be able to point it at whatever storage directory you want. While I was at it, I also added a TZ= environment variable so my log timestamps at least match my timezone, something they haven’t added themselves.
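
    Since it’s just SQLite, you can poke at the archive directly without assuming anything about its schema. A minimal sketch with Python’s built-in sqlite3 module; the database filename is a placeholder, so point it at whatever file the project actually writes:

    ```python
    import sqlite3

    # Placeholder path; point this at the SQLite file the archiver creates.
    DB_PATH = "email_archive.db"

    conn = sqlite3.connect(DB_PATH)
    try:
        # List the tables so you can see how the project lays out its data.
        tables = conn.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name"
        ).fetchall()
        for (name,) in tables:
            count = conn.execute(f"SELECT COUNT(*) FROM {name}").fetchone()[0]
            print(f"{name}: {count} rows")
    finally:
        conn.close()
    ```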

    If you wish to access it remotely before they add a public-facing login, protect it with an SSO solution or some other front-end login so it isn’t openly accessible, or access it securely via WireGuard, Tailscale, or Headscale.


  • node815@lemmy.world to Selfhosted@lemmy.world · Proxmox vs. TrueNAS Scale (8 months ago)

    I use Proxmox and don’t use TrueNAS. My setup is basically Cockpit installed on the host via apt-get, plus the 45Drives cockpit file-sharing plugin, which provides the NFS and Samba sharing I need. I host Home Assistant in a VM, and my Docker workloads live in a few LXC containers that each run about 10 containers. In combination with the helper scripts at https://tteck.github.io/Proxmox/ you can set up pretty much anything you need from there.

    This is on, in computer terms, ancient hardware: a 13-year-old Dell OptiPlex 990 with 16 GB of RAM, running software such as Authentik and Vaultwarden in their own dedicated LXC containers. I never have issues with overloaded system resources or running out of memory. It’s pretty much rock solid.