• 0 Posts
  • 9 Comments
Joined 1 year ago
Cake day: June 11th, 2023




  • ricecake@sh.itjust.works to Animemes@ani.social · *Permanently Deleted* · 4 months ago

    Not everything is an evolved functional trait, like the first poster was saying.
    Losing our hair and getting greasy was a functional adaptation. That grease getting stinky is just a byproduct that didn’t introduce a negative selection pressure.

    Evolution doesn’t have a plan, it just takes the shortest path towards better that doesn’t make things worse.
    Giraffes have a nerve that runs from the brain, down into the torso, then back up to the top of the neck. There’s no reason or benefit to this; it’s purely because when the structure that nerve serves first evolved, back in our fish ancestors, it sat at the top of the torso. The neck got longer and the nerve followed, since there’s no pressure selecting against a stupid nerve layout.

    There’s a species of toad that evolved to become so small that its ear bones can’t actually pick up its own species’ mating chirp. They still chirp, but none of them can hear it; instead they signal based on seeing the motion of chirping.


  • People have been cleaning themselves for essentially forever. Bathing was not as common as it is today, but we know people have been washing their hands, feet and face regularly for many thousands of years.
    Cleanliness features very heavily in religion dating back thousands of years, and the earliest soap recipe is from ~2500 BC, although we know they were making it at scale hundreds of years before then.
    Wells to make water available in places where there’s no stream or river date back even further, to around 8000 BC.

    Most people weren’t rocking perfumed soaps and immersion in hot water, but washing your clothes with a homemade soap, scrubbing your feet, hands and face with cold water and a rag every day or so, and likewise your body roughly weekly, was available to most people at a minimum. If you lived near a river or other body of water, as humans tend to prefer to, washing your feet, hands and face every morning and a weekly scrub was perfectly comfortable.

    Primates are generally very conscious of grooming. Humans are unique in regularly washing with water, but we’re also unique in being nearly hairless, remarkably greasy, and clever. It tracks that we’d figure out the water thing pretty fast.



  • It’s not a simple task, so I won’t list many specifics, but more general principles.

    First, some specifics:

    • disable remote root login via SSH.
    • disable password login, and only permit SSH keys.
    • run fail2ban to lock people out automatically.
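    A sketch of those three bullets on a Debian-flavored system (package names, paths and the service name are assumptions; adjust for your distro):

```shell
# Drop-in sshd config keeps hardening separate from the distro's main file.
sudo tee /etc/ssh/sshd_config.d/99-hardening.conf <<'EOF'
PermitRootLogin no
PasswordAuthentication no
PubkeyAuthentication yes
EOF
sudo sshd -t && sudo systemctl reload ssh   # validate first; service is "sshd" on RHEL-alikes

# fail2ban with its stock sshd jail: ban an IP for an hour after 5 failures.
sudo apt install -y fail2ban
sudo tee /etc/fail2ban/jail.local <<'EOF'
[sshd]
enabled = true
maxretry = 5
bantime = 1h
EOF
sudo systemctl enable --now fail2ban
```

    Test key-only login from a second terminal before closing your working session, so a typo can’t lock you out.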

    Generally:

    • only expose things you must expose. It’s better to do things right and secure than easy. Exposing a webservice requires you to expose port 443 (https). Basically everything else is optional.
    • enable every security system that you don’t have reason to disable. SELinux giving you problems? Don’t turn it off; learn how to write rules to let your application do the specific things it needs. Only make firewall exceptions where needed, rather than disabling the firewall.
    • give system users the minimum access they require to function.
    • set folder permissions as restrictively as possible. File ACLs will help, because they let you be much more nuanced.
    • automatic updates. If you have to remember to do it, it won’t happen. Failure to automate updates means your software is out of date.
    • consider setting up a dedicated authentication service like Authelia or Keycloak. Applications tend to, frankly, suck at security. It’s not what they’re making, so it’s not as good as a dedicated security service. There are other follow-on benefits.
    • if it supports two factor, enable it.
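    A few of these translate directly into commands. A hedged sketch, assuming firewalld, a Debian-style unattended-upgrades setup, and a hypothetical service account `appuser` with a config folder at `/srv/app/config`:

```shell
# Firewall: open only what's needed instead of turning it off.
sudo firewall-cmd --permanent --add-service=https   # 443 and nothing else
sudo firewall-cmd --reload

# SELinux: generate a policy from real denials rather than disabling it.
# sudo ausearch -m AVC -ts recent | audit2allow -M myapp && sudo semodule -i myapp.pp

# Automatic updates (Debian/Ubuntu); RHEL-alikes use dnf-automatic instead.
sudo apt install -y unattended-upgrades
sudo dpkg-reconfigure -plow unattended-upgrades

# Restrictive folder permissions, plus a file ACL for one extra user.
sudo chmod 750 /srv/app/config                  # owner rwx, group rx, others nothing
sudo setfacl -m u:appuser:r-x /srv/app/config   # read-only entry just for appuser
sudo getfacl /srv/app/config                    # inspect the result
```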

    You mentioned using Cloudflare, which is good. You might also consider configuring your firewall to disallow connections from your server to the rest of your local network. That way, if your server gets owned, it can’t be used to poke at other things on your network.
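    One way to do that on the host itself is an outbound firewall rule. A hedged nftables sketch, assuming your LAN is 192.168.1.0/24 and your gateway is 192.168.1.1 (adjust both):

```shell
# Allow traffic to the gateway (routing/DNS), then drop any new connection the
# server tries to open toward the rest of the LAN. Established replies to
# inbound connections are unaffected.
sudo nft add table inet isolation
sudo nft add chain inet isolation output '{ type filter hook output priority 0 ; }'
sudo nft add rule inet isolation output ip daddr 192.168.1.1 accept
sudo nft add rule inet isolation output ip daddr 192.168.1.0/24 ct state new drop
```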


  • So, you’re going to run into some difficulties, because a lot of what you’re dealing with is, I think, specific to CasaOS, which makes it harder to know what’s actually happening.

    The way you’ve phrased the question makes it seem like you’re following a more conventional path.

    It sounds like maybe you’ve configured your public traffic to route to the Nginx Proxy Manager admin interface instead of to nginx itself.
    Instead of having your router send traffic on 80/443 to 81, try having it send that traffic to 80/443, which nginx itself should be listening on.
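    For reference, Nginx Proxy Manager’s port layout usually looks something like this (a typical compose snippet, assumed rather than taken from your CasaOS setup):

```yaml
services:
  npm:
    image: jc21/nginx-proxy-manager:latest
    ports:
      - "80:80"    # HTTP  - forward your router's port 80 here
      - "443:443"  # HTTPS - forward your router's port 443 here
      - "81:81"    # admin UI only - keep LAN-only, never forward from the router
```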

    Systems that promise to manage everything for you are great for getting started fast, but they have the unfortunate side effect of making it so you don’t actually know what it’s doing, or what you have running to manage everything. It can make asking for help a lot harder.


  • You’ll be fine enough as long as you enable MFA on your NAS, and ideally configure it so that anything “fun”, like administrative controls or remote access, is only available on the local network.

    Synology has sensible defaults for security, for the most part. Make sure you have automated updates enabled, even for minor updates, and ensure it’s configured to block multiple failed login attempts.

    You’re probably not going to get hackerman poking at your stuff, but you will get bots trying to SSH in and log in to the WordPress admin console, even if you’re not using WordPress.

    A good rule of thumb for securing computers is to minimize access/privilege/connectivity.
    Lock everything down as far as you can, turn off everything that makes it possible to access it, and enable every tool for keeping people out or dissuading attackers.
    Now you can enable port 443 on your NAS to be publicly available, and only that port, because you don’t need anything else.
    You can configure your router to forward only port 443 to your NAS.

    It feels silly to say, but sometimes people think “my firewall is getting in the way, I’ll turn it off”, or “this one user needs read access to one file, so I’ll give every user on the system read/write/execute privileges on this folder and every subfolder”.

    So as long as you’re basically sensible and use the tools available, you should be fine.
    You’ll still poop a little the first time you see that 800 bots tried to break in. Just remember: they’re already doing that now; there’s just nothing listening to write down that they tried.

    Incidentally, the person who suggested putting Cloudflare in front of GitHub Pages and using something like Hugo gave a great example of “opening as few holes as possible” and “using the tools available”.
    It’s what I do for my static sites, like my recipes and stuff.
    You can get a GitHub Action configured that’ll compile the site and deploy it whenever a commit happens, which is nice.
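    A minimal workflow along those lines (branch name and action versions are assumptions; check each action’s page for current releases):

```yaml
# .github/workflows/deploy.yml - build the Hugo site and publish to GitHub Pages
name: Deploy site
on:
  push:
    branches: [main]
permissions:
  contents: read
  pages: write
  id-token: write
jobs:
  build-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: peaceiris/actions-hugo@v2
        with:
          hugo-version: 'latest'
      - run: hugo --minify          # compiled site lands in ./public
      - uses: actions/upload-pages-artifact@v3
        with:
          path: ./public
      - uses: actions/deploy-pages@v4
```

    This assumes the repo’s Pages settings are set to deploy from GitHub Actions rather than from a branch.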