
Providing a Whitelisted Wireless Hotspot?

Ploxis writes "I volunteer some of my day managing a small network (and a ragtag band of computers) for a local nonprofit. I have been asked to set up a second, open, independent wireless network on site that will provide cost-free broadband Internet access to patrons. The catch is that they want to provide access only to a select group of about 25 websites while disallowing everything else. No objectionable sites, no mundane but non-relevant sites such as online banking or YouTube, and no other activities such as P2P or IM. They only want HTTP and HTTPS activity from a set of whitelisted websites." For the rest of Ploxis's question and his initial thoughts on making this happen, read on below.
"They'd also like any non-whitelisted URL to be redirected to a 'splash page,' which would just be some HTML providing a list of allowed sites by category. I'd host this page internally on the network. Their primary concerns are liability for access of illegal/objectionable materials and conserving their bandwidth, while still providing access to specific relevant tools online. My initial thought was simply an open wireless router, a set of remarkably restrictive firewall rules, and an in-house server as a custom DNS ... but that's pretty shaky (i.e. anyone specifying their own DNS can still get at whatever they want). I assume they'll need a router with some pretty significant traffic management capabilities as well, but that's not something I've investigated before. Anyone's experiences, recommendations, case studies, or maps of similar networks would be greatly appreciated."
  • Why not take an old PC and turn it into a router, and make a domain for those folks?
    • There's no need for domains. The firewall just has to forward any non-approved IPs to their "non-approved hosts" page. Easy as pie, and (while non-trivial) not worth our time.

  • Squid (Score:5, Informative)

    by eln ( 21727 ) on Thursday August 21, 2008 @05:58PM (#24696401)

    Configure a linux box as a router, put squid on it, set up your whitelist, and you're all set.

    • Re:Squid (Score:5, Informative)

      by eln ( 21727 ) on Thursday August 21, 2008 @06:06PM (#24696525)

      I should also add there's some iptables stuff involved too, but if you know the terms "squid" and "transparent proxy", Google will give you plenty of pages telling you how to set it up.
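
      A minimal sketch of how that combination might look, assuming squid 2.6+ on the gateway box; the interface name (wlan0), the splash-page address (10.0.0.1), and the whitelist file path are placeholders:

        # /etc/squid/squid.conf (fragment) -- whitelist-only transparent proxy
        acl whitelist dstdomain "/etc/squid/whitelist.txt"   # one domain per line, e.g. .example.org
        http_access allow whitelist
        # non-whitelisted requests get bounced to the locally hosted splash page
        deny_info http://10.0.0.1/splash.html all
        http_access deny all
        http_port 3128 transparent

        # iptables: push all outbound port-80 traffic from the hotspot interface through squid
        iptables -t nat -A PREROUTING -i wlan0 -p tcp --dport 80 -j REDIRECT --to-port 3128
        # squid can't transparently inspect HTTPS, so either drop 443 (as here) or allow it
        # only to the whitelisted sites' IPs; drop everything else except DNS
        iptables -A FORWARD -i wlan0 -p udp --dport 53 -j ACCEPT
        iptables -A FORWARD -i wlan0 -j DROP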

      • I concur.
        This is the way I operate our lab connection (allow connections only to equipment vendor sites for software).
        -nB

        • Re: (Score:3, Informative)

          I would suggest that adding a pac.localdomain entry to DNS might (might, mind you) be better. Write a proxy auto-configuration file and don't permit any access to the outside world (i.e. turn off ip_forward and use iptables rules to enforce it just in case). That way you don't have to worry about transparent connections (which sometimes cause issues with certain pages - like "submit" on /. every now and then for me here) and you can sort-of monitor HTTPS connections as well (a rough PAC sketch follows below).

          Just a thought, annihilate at will.
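
          For reference, such a proxy auto-configuration file is just a small JavaScript function; a minimal sketch, where the proxy address and hostnames are placeholders:

            // proxy.pac -- served from pac.localdomain; browsers evaluate this for each URL
            function FindProxyForURL(url, host) {
                // the splash/landing page lives on the local web server, fetch it directly
                if (dnsDomainIs(host, "splash.localdomain"))
                    return "DIRECT";
                // everything else must go through the filtering proxy; with ip_forward
                // turned off there is no other route out of the network anyway
                return "PROXY 10.0.0.1:3128";
            }
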
          • Re: (Score:3, Interesting)

            by networkBoy ( 774728 )

            Sure that looks like a better solution, but squid over a linux router is easier and "good enough".
            My caveat is that we have a strict usage policy and if you are caught circumventing my "good enough" solution you are not going to like the written warning. If you want general internet access you are expected to use your notebook and WiFi connection, and not connect to my lab network.
            -nB

    • Re: (Score:3, Informative)

      by mikael_j ( 106439 )

      Squid was my first thought as well: configure it as a transparent proxy and redirect all non-allowed traffic to the splash page. Combine that with firewall rules that block all non-DNS and non-HTTP traffic.

      /Mikael

    • Re: (Score:3, Informative)

      by cbiltcliffe ( 186293 )

      pfSense has got this built in.
      Install it on an old Pentium 266-400 or so, with 256MB RAM, if you can, and check the captive portal section.
      Set your client WAP up on a NIC by itself, and you can configure captive portal on that interface. Ensure your login page has no login options...just a "You can't go here" type of thing.
      Then, set up your allowed sites in the captive portal whitelist.
      Problem solved, and you've stopped another machine from ending up in a landfill.

  • pfSense (Score:3, Informative)

    by Fez ( 468752 ) * on Thursday August 21, 2008 @05:59PM (#24696423)

    Sounds like something that pfSense [pfsense.org] might be able to do, between squid and maybe the captive portal.

    • Re: (Score:3, Informative)

      Not even that complex. I wrote a little tutorial, here [bfccomputing.com] - just invert the meaning of the block rule and add a default deny.

  • mod_proxy (Score:5, Informative)

    by ak_hepcat ( 468765 ) <slashdotNO@SPAMakhepcat.com> on Thursday August 21, 2008 @06:00PM (#24696437) Homepage Journal

    mod_proxy, mod_rewrite

    your friends at apache have most of the work done for you. All you have to do is slap it together and write some custom rules.

    Linux as a firewall, to make sure that all http/https traffic gets redirected through the proxy

    if the hostname in the url doesn't match what's in your rewrite rules (aka, to pass through) then rewrite it to your custom splash page.

    no need for wacky dns tricks here.
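
    One way this could be wired up, assuming port 80 from the hotspot is already redirected to this Apache instance and that mod_rewrite and mod_proxy are loaded; the hostnames and splash path are placeholders:

      # httpd.conf fragment -- whitelisted hosts are proxied out, the rest get the splash page
      <VirtualHost *:80>
          RewriteEngine On
          # pass requests for whitelisted hosts through mod_proxy unchanged
          RewriteCond %{HTTP_HOST} (^|\.)(example\.org|example\.net)$ [NC]
          RewriteRule ^/(.*)$ http://%{HTTP_HOST}/$1 [P,L]
          # everything else is answered with the internal splash page
          RewriteRule ^ /splash.html [R=302,L]
      </VirtualHost>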

    • This is only good for HTTP, and he mentioned that the nonprofit wants to block all objectionable traffic, which can include FTP, IRC, and other protocols. You'd also have to block all non-HTTP traffic.
  • by techsoldaten ( 309296 ) on Thursday August 21, 2008 @06:08PM (#24696567) Journal

    Tell them no and strike a blow for Net Neutrality!

    M

    • Tell them no and strike a blow for Net Neutrality!

      M

      That's not net neutrality; do you even know what that means? They're not running an ISP, they're just trying to provide free access to a handful of websites.

  • Untangle Pro (Score:2, Informative)

    by Russianspi ( 1129469 )
    Untangle's pro version should allow this. [untangle.com] Maybe they have a discount for non-profits?
  • by Anonymous Coward

    You need a web proxy and a DNS proxy: The web proxy to restrict the URLs to those which are whitelisted and the DNS proxy to stop "clever" people from tunneling through DNS.

    • by Zan Lynx ( 87672 )

      The only way a "clever" person could fool a transparent proxy would be to corrupt the DNS of the proxy.

      Or perhaps you're talking about one of those things that uses DNS as a protocol transport?

      query TXT GET.www.slashdot.org.proxy.home.server.org
      query TXT 0..proxy.home.server.org
      query TXT 1..proxy.home.server.org
      2,3,4,5,...,session_end.

      Like that?

      Most of these wireless AP thingies have a DNS proxy included already; it gets used to redirect people to the AP IP for the usage agreement page.

    • by Hatta ( 162192 )

      You'd need a firewall too, to drop non HTTP traffic. Otherwise a clever user could just use SSH to tunnel their HTTP traffic and even related DNS requests.

      • Re: (Score:2, Funny)

        by Anonymous Coward

        Absolutely. Another essential ingredient is electricity. And an internet uplink. Who are you? Captain Obvious?

    • Use squid with a domain-based whitelist. Set /etc/resolv.conf with "nameserver 127.0.0.1" and use dnsmasq to provide DNS lookup for squid, which has the added benefit of using your /etc/hosts exclusively for lookups. You then set up a script to look up the IP of each of the desired sites automatically against a trusted external DNS server, probably with nslookup though there might be a more efficient method, on an hourly basis. Dnsmasq handles your dhcp as well, adding further simplicity to the
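
      A sketch of the hourly refresh script described above; the resolver address (192.0.2.53), the hostnames, and the file layout are placeholders:

        #!/bin/sh
        # refresh /etc/hosts entries for the whitelisted sites from a trusted external resolver
        WHITELIST="www.example.org www.example.net"
        OUT=/etc/hosts.whitelist

        : > "$OUT"
        for host in $WHITELIST; do
            for ip in $(dig +short @192.0.2.53 "$host" A | grep -E '^[0-9.]+$'); do
                echo "$ip $host" >> "$OUT"
            done
        done
        # hosts.static holds the local entries (splash page, gateway, etc.)
        cat /etc/hosts.static "$OUT" > /etc/hosts
        # dnsmasq re-reads /etc/hosts on SIGHUP
        kill -HUP "$(cat /var/run/dnsmasq.pid)"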

  • tinyproxy (Score:5, Informative)

    by argent ( 18001 ) <peter@slashdot . ... t a r o nga.com> on Thursday August 21, 2008 @06:33PM (#24696971) Homepage Journal

    Instead of squid, use tinyproxy. You're not primarily interested in caching, you're interested in access control. Tinyproxy gives you much finer control of that, and it's also ... well ... tiny.

    Just set up a "no proxy" rule for the sites you want them to get to, and redirect everything else to a 404 server.
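
    A sketch of the relevant tinyproxy.conf pieces, assuming a build with filtering compiled in; the paths and the example regex are placeholders:

      # /etc/tinyproxy/tinyproxy.conf (fragment)
      Port 8888
      Filter "/etc/tinyproxy/whitelist"   # one regex per line, e.g. (^|\.)example\.org$
      FilterDefaultDeny Yes               # anything the filter file doesn't match is denied
      # serve the local splash HTML as the error page for denied requests
      DefaultErrorFile "/var/www/splash.html"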

  • Use the IP addresses of the allowed sites.

    Use any commercial router and access point, or even a WRT-54G. Drop the list of allowed IPs into an access list.

    Deny traffic for all other IPs.

    Use separate rules to deny traffic to ports other than 80 and 443.

    • Just using a firewall; nice idea. You'd have to keep on top of DNS lookups though.

      The router I got from my ISP actually allows you to do this by default. It also lets you redirect to another page, which would allow an error message to be displayed. Can you think of a way to do this with kit available in normal routers?

      • I would expect that a simple shell script on a workstation or laptop could handle maintaining the IP list; or make a Linux virtual machine (or put it in the cloud) and only ever run the VM while updating the list. It would be sensible to have the script format its output so it can be pasted straight into the device whenever a DNS change requires an update (see the sketch below).

        It is fairly commonplace to have script-generated firewall configurations like this, esp. when a single site has several firewalls (i.e. for backu
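
        A sketch of such a helper, with the output format purely illustrative (adjust it to whatever access-list syntax your device actually uses):

          #!/bin/sh
          # resolve each whitelisted hostname and print paste-ready access-list lines
          for host in www.example.org www.example.net; do
              for ip in $(dig +short "$host" A | grep -E '^[0-9.]+$'); do
                  echo "permit tcp any host $ip eq 80"
                  echo "permit tcp any host $ip eq 443"
              done
          done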

  • OpenDNS? (Score:3, Informative)

    by jabithew ( 1340853 ) on Thursday August 21, 2008 @07:07PM (#24697429)

    OpenDNS were talking about adding this as a pay-for service [opendns.com], which would be cheaper and easier than setting up a dedicated Linux box (the standard solution proposed for any problem posed to Slashdot).

    Incidentally, the thread I linked has some other solutions posted in it.

    • That's at DNS level.

      Anyone with a static DNS server entry (or enough knowledge to look) would get around it instantly.
  • I would suggest a Linksys WRT54GL/Buffalo/Asus/etc. wifi router running OpenWRT. If you're only allowing access to a relative handful of sites (~25), iptables rules wouldn't be too cumbersome. Add on a captive portal package (wifidog, nocat, etc.) and you're good. The basic captive portal redirection could be handled simply with iptables too, but one of the packages could make things easier to administer/monitor. I know that wifidog uses libhttpd as a web server that runs on the router, so you could run the

  • by coryking ( 104614 ) * on Thursday August 21, 2008 @09:30PM (#24698909) Homepage Journal

    Whatever you do, make sure you whitelist any dependencies these 25 websites use. I'm thinking of things like google-analytics, any kind of third-party-hosted javascript library (Google Code or YUI), and ad code. If you don't whitelist those as well, your patrons' browsers might act a little funky depending on your solution.

    • by Z-MaxX ( 712880 )

      Why would you whitelist google-analytics? Isn't it some sort of usage-tracking service? I'd rather minimize the information collected on me.

      I use the NoScript Firefox 3 extension, and I set google-analytics to UNTRUSTED. I have no problems browsing with that.

  • Mikrotik will do everything you need and more.

    You would need to build your own using an RB/411A, CA/411, R52H, AC/SWI and a 12-24 volt power supply, and you would be all set.

    http://www.mikrotik.com/ [mikrotik.com]
    http://forum.mikrotik.com/ [mikrotik.com]

    The guys over at http://www.quicklinkwireless.com/ [quicklinkwireless.com] sell preassembled AP's and will even walk you through configuring it.

  • Dans Guardian (Score:2, Interesting)

    Set up a transparent proxy and use dansguardian [dansguardian.org]. I've set this up and had it running for several months. It *easily* supports whitelisted/blacklisted sites and domains (even using regular expressions), and mime types. It can also block objectionable content based on keyword groups, ratings, etc. Very good indeed.

  • I have a router that does that. Provides for up to 40 white-listed URLs, and only those. Dual firewalls, all the latest, even QoS (not that it matters). $100 @ Newegg. D-Link DIR-655.

    Does not provide a bounce page, as far as I'm aware.
  • DD-WRT will allow you to do this using their "hotspot" option. You set a list of sites that are allowed without logging in; when users try to go to other sites, it brings up the "login" page. You can customize that page to whatever you want.
  • Simplest, quickest way to do it, and does everything you're looking to do.

    They put a relatively decent shell interface on top of linux that hides a lot of the complexity, and also have a good GUI management utility (I don't use it myself, but it can do everything the shell can).

    It'll run on most hardware, including x86. You'd have to buy a license, $45, but it's worth the time saved figuring out how to get all the different parts tied in together.

    And there is an active community forum with helpful people in
