users can bypass url.txt

For general issues related to PWB v2.

Moderators: Tyler, Scott, PWB v2 Moderator

gvozd
Spectator
Posts: 1
Joined: Thu May 01, 2003 11:36 am


Post by gvozd »

In my url.txt file, I only have the line
+fedex.com

If I type http://slashdot.org?fedex.com, the web site loads.
It's annoying to browse the internet like this, but it seems that many sites ignore the text after the ?.
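PWB's exact matching logic isn't documented here, but the behavior gvozd describes is what you'd see if the filter did a simple substring test on the full URL. A minimal sketch of that assumption (the function name and patterns are hypothetical):

```python
# Hypothetical substring-based allow filter mimicking the observed behavior:
# "+fedex.com" allows any URL that merely *contains* "fedex.com" anywhere,
# including in the query string after "?".
def allowed(url: str, allow_patterns: list[str]) -> bool:
    return any(pattern in url for pattern in allow_patterns)

patterns = ["fedex.com"]
print(allowed("http://fedex.com/track", patterns))         # True: intended
print(allowed("http://slashdot.org?fedex.com", patterns))  # True: the bypass
```

Since many servers ignore an unexpected query string, the second URL loads Slashdot normally even though only FedEx was meant to be allowed.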

I haven't tried this, but I suspect if I had control of a DNS server, I could have it translate http://fedex.com.mydomain.com to a web proxy.

Is there a way to protect against this without explicitly specifying IP addresses for allowed hosts?

- David

Scott
Site Admin
Posts: 2539
Joined: Mon Dec 16, 2002 12:31 pm
Location: Rochester, MN
Contact:

Post by Scott »

PWB adds "http://" to the beginning of all URLs, so to tighten the filter you can use "+http://www.fedex.com" instead of "+fedex.com". You may also try adding "-?" to the filter file.
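Assuming deny rules are checked before allow rules and both match as substrings (an assumption, since the filter internals aren't shown here), the two suggestions above could work together like this:

```python
# Sketch of an allow/deny URL filter: "-" rules deny first, then "+"
# rules allow. The "-?" rule blocks any URL containing a query string,
# which is what the bypass relies on.
def allowed(url: str, rules: list[str]) -> bool:
    for rule in rules:
        if rule.startswith("-") and rule[1:] in url:
            return False  # deny rule matched
    return any(rule.startswith("+") and rule[1:] in url for rule in rules)

rules = ["+http://www.fedex.com", "-?"]
print(allowed("http://www.fedex.com/track", rules))     # True: no "?", prefix matches
print(allowed("http://slashdot.org?fedex.com", rules))  # False: "-?" denies it
```

Note the trade-off: "-?" also blocks legitimate FedEx pages that use query strings, such as tracking links.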

We are working on wildcard support in the filter files for version 2.05. This should help close this loophole.

Otherwise, you can use the IP filter instead of the URL filter, which does not suffer from this problem because PWB resolves the URL to an IP address before checking the filter file. To keep people on the FedEx site, you can use "+191.81" in the IP filter file.
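The key point is that the check runs against the resolved address, not the URL text, so "?fedex.com" in the query string no longer helps. A sketch of that check, assuming the filter compares the dotted-quad address against allowed prefixes (the function and sample addresses are illustrative; "191.81" is the prefix suggested above):

```python
# Sketch of an IP-prefix allow check, run after DNS resolution.
def ip_allowed(ip: str, prefixes: list[str]) -> bool:
    # Match on octet boundaries so "191.81" does not also match "191.810.x.x".
    return any(ip == p or ip.startswith(p + ".") for p in prefixes)

# http://slashdot.org?fedex.com resolves to Slashdot's address,
# which fails the prefix test:
print(ip_allowed("191.81.2.10", ["191.81"]))    # True: within the allowed prefix
print(ip_allowed("66.35.250.151", ["191.81"]))  # False: some unrelated address
```

This also defeats the DNS trick David mentions: a hostname like fedex.com.mydomain.com would resolve to the proxy's address, not a FedEx one.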

--Scott
