With all the CDNs and content being served from several locations for a single web page, would it be possible to implement a maintainable whitelist in something like a proxy? Does it make sense? Or would I break half of the websites?

  • moonpiedumplings@programming.dev · 13 hours ago (edited)

    Yeah, you probably want a proxy-based solution. Have a network that has no internet access except through a proxy that you control.

    You would also have to lock DNS down. The problem with DNS-based blocks is that things like DNS over HTTPS allow people to use an alternative DNS server. But if you control the devices you are managing, then you can also control which DNS server they use.
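    As a rough sketch of the DNS lockdown part (assuming an iptables gateway, a hypothetical LAN of 192.168.1.0/24, and your own resolver at 192.168.1.1 — adjust to your network), forcing plain DNS through your server could look like:

    ```shell
    # Redirect all plain-DNS traffic (UDP/TCP 53) from the LAN
    # to the resolver you control. Addresses are hypothetical examples.
    iptables -t nat -A PREROUTING -s 192.168.1.0/24 -p udp --dport 53 \
        -j DNAT --to-destination 192.168.1.1
    iptables -t nat -A PREROUTING -s 192.168.1.0/24 -p tcp --dport 53 \
        -j DNAT --to-destination 192.168.1.1

    # Drop DNS over TLS (port 853) so clients cannot bypass the redirect
    iptables -A FORWARD -p tcp --dport 853 -j DROP
    ```

    Note that DNS over HTTPS rides on port 443, so it cannot be blocked by port alone — which is why the proxy, not the firewall, has to do the real enforcement.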

    • dontblink@feddit.it OP · 11 hours ago

      Do you think a proxy would be better in this regard compared to a firewall? I was trying to watch the logs of ufw today to see if I could do something there, but the incoming and outgoing connections are A LOT, and I would essentially like to whitelist both per domain and per IP.

      How much maintenance would this require? I wonder how often IPs change today, but with all the NAT, dynamic DNS and CDNs around, maintaining a whitelist with only IP addresses looks like a nightmare…
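      This is the core advantage of whitelisting at the proxy layer: the check runs on the hostname the client asked for, so CDN IP churn never touches the list. A minimal sketch of such a domain check (domain names are made-up examples) could be:

      ```python
      # Sketch of a domain-based allowlist check, as a proxy might apply it.
      # Domains here are hypothetical examples, not a recommended list.
      ALLOWED_DOMAINS = {"example.com", "cdn.example.net"}

      def is_allowed(host: str) -> bool:
          """Allow a host if it equals, or is a subdomain of, an allowed domain."""
          host = host.rstrip(".").lower()
          return any(
              host == d or host.endswith("." + d)
              for d in ALLOWED_DOMAINS
          )

      print(is_allowed("static.example.com"))  # True  (subdomain match)
      print(is_allowed("evil-example.com"))    # False (suffix check needs the dot)
      ```

      The `endswith("." + d)` detail matters: without the leading dot, `evil-example.com` would slip past a whitelist entry for `example.com`.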

      A Squid proxy with SquidGuard could be a better option than trying to work with a firewall, maybe?
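      For reference, the whitelist side of this in plain Squid is fairly compact — a hedged sketch of a `squid.conf` fragment (the file path and domains are placeholder examples):

      ```
      # Allow only destinations listed in the whitelist file, deny the rest.
      # /etc/squid/whitelist.txt holds one domain per line, e.g. ".example.com"
      # (the leading dot also matches subdomains).
      acl allowed_sites dstdomain "/etc/squid/whitelist.txt"
      http_access allow allowed_sites
      http_access deny all
      ```

      Since the match is on `dstdomain`, the list survives CDN IP changes without any upkeep on your side.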