In my url.txt file, I only have the line
+fedex.com
If I type http://slashdot.org?fedex.com, the web site loads.
It's annoying to browse the internet like this, but it seems that many sites ignore the text after the ?.
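For illustration, here is a minimal sketch of why a substring-style allowlist passes that URL. This is hypothetical Python, assuming url.txt entries are matched anywhere in the requested URL, which the behavior above suggests:

# Hypothetical reconstruction of a substring-based URL allowlist.
# "+fedex.com" allows any URL that merely contains "fedex.com".
def allowed(url, entries):
    for entry in entries:
        if entry.startswith("+") and entry[1:] in url:
            return True
    return False

entries = ["+fedex.com"]
print(allowed("http://www.fedex.com/", entries))          # True (intended)
print(allowed("http://slashdot.org?fedex.com", entries))  # True (the bypass)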
I haven't tried this, but I suspect if I had control of a DNS server, I could have it translate http://fedex.com.mydomain.com to a web proxy.
Is there a way to protect against this without explicitly specifying IP addresses for allowed hosts?
- David
PWB adds "http://" to the beginning of all URLs. To further tighten the filter, you can use "+http://www.fedex.com" instead of "+fedex.com". You may also try adding "-?" to the filter file.
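To see how those two entries would interact, here is a sketch in the same hypothetical style, assuming "-" (deny) entries are checked before "+" (allow) entries and that both still match as substrings:

# Sketch of the tightened filter: deny entries are checked first.
def allowed(url, entries):
    for entry in entries:
        if entry.startswith("-") and entry[1:] in url:
            return False
    for entry in entries:
        if entry.startswith("+") and entry[1:] in url:
            return True
    return False

entries = ["-?", "+http://www.fedex.com"]
print(allowed("http://www.fedex.com/us/", entries))       # True
print(allowed("http://slashdot.org?fedex.com", entries))  # False: "?" is denied
print(allowed("http://www.fedex.com/track?id=1", entries))# False: "-?" blocks query strings too

Note that "-?" blocks every URL containing a question mark, so pages on the allowed site that rely on query strings will stop working as well.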
We are working on wildcards in the filter files for version 2.05. This should help eliminate this exploit.
Otherwise, you can use the IP filter instead of the URL filter, which does not suffer from this problem because PWB resolves the URL to an IP address before checking the filter file. To keep people on the FedEx site, you can use "+191.81" in the IP filter file.
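For illustration, a minimal sketch of that resolve-then-check flow (hypothetical Python; the resolve-before-check order is as described above, and the prefix match mirrors the "+191.81" entry):

import socket
from urllib.parse import urlparse

# Resolve the hostname to an IP first, then match the dotted-quad
# string against "+" prefixes from the IP filter file.
def ip_allowed(url, ip_entries):
    host = urlparse(url).hostname
    ip = socket.gethostbyname(host)
    return any(e.startswith("+") and ip.startswith(e[1:]) for e in ip_entries)

ip_entries = ["+191.81"]
# True only if the resolved address falls under 191.81, as the entry intends.
print(ip_allowed("http://www.fedex.com/", ip_entries))
# A decoy like http://fedex.com.mydomain.com resolves to the proxy's
# own IP, which fails the "+191.81" prefix check.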
--Scott