Forum

Use "vanilla" 404 for not-found CGI scripts?

James
24 January 2018, 00:42
I have CGIhandler = /var/www/cgi-bin/url.sh:url

the contents of url.sh are:

#!/bin/sh

#TODO idk
status="302 Found"

#TODO parsing of other than raw-plaintext .URL files
if true; then
    url="$(cat "${1}")"
fi

printf 'Status: %s\nLocation: %s\n\n' "${status}" "${url}"
cat "${1}"


and additionally I have it enabled in cgi-wrapper.conf.
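
(For reference, "enabled" there just means the handler is also whitelisted in cgi-wrapper.conf; as far as I remember the syntax, that's a line like the following, but double-check the cgi-wrapper manual page:

CGIhandler = /var/www/cgi-bin/url.sh
)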

This seemingly works perfectly; for instance, I can just run, as my user:

echo "http://website.org/favorite_page" > ~/public_html/bookmarks/website_page.url

to create a "hyperlink" to that page in the directory listing, which works very well:

$ curl -sv http://www.example.com/~james/bookmarks/website_page.url
* Trying ::1...
* TCP_NODELAY set
* Connected to www.example.com (::1) port 80 (#0)
> GET /~james/bookmarks/website_page.url HTTP/1.1
> Host: www.example.com
> User-Agent: curl/7.52.1
> Accept: */*
>
< HTTP/1.1 302 Found
< Date: Tue, 23 Jan 2018 23:35:54 GMT
< Server: Hiawatha v10.7
< Connection: keep-alive
< Transfer-Encoding: chunked
< Location: http://website.org/favorite_page
<
http://website.org/favorite_page
* Curl_http_done: called premature == 0
* Connection #0 to host www.example.com left intact
$


However, if I try to visit for instance http://www.example.com/~james/bookmarks/asdflmaofilenotfound.url

I get a 404 error, but not a "clean" one:

curl -sv http://www.example.com/~james/bookmarks/asdflmaofilenotfound.url
* Trying ::1...
* TCP_NODELAY set
* Connected to www.example.com (::1) port 80 (#0)
> GET /~james/bookmarks/asdflmaofilenotfound.url HTTP/1.1
> Host: www.example.com
> User-Agent: curl/7.52.1
> Accept: */*
>
< HTTP/1.1 404 Not Found
< Date: Tue, 23 Jan 2018 23:38:33 GMT
< Server: Hiawatha v10.7
< Connection: keep-alive
< Transfer-Encoding: chunked
<
* Curl_http_done: called premature == 0
* Connection #0 to host www.example.com left intact


That is, instead of just displaying the usual Hiawatha 404 error page, there's a suspicious blank page.

So, I am wondering, what do I need to do to get a "Normal" 404 page for URLs ending in a CGI extension?
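
One thing I could do on the script side (just a sketch, and it only helps if Hiawatha actually invokes the handler for a missing file, which the trace above suggests it doesn't) would be to have url.sh emit its own 404 when the target isn't readable:

#!/bin/sh

# sketch: answer with a CGI 404 instead of a redirect when the .url file is missing or unreadable
if [ ! -r "${1}" ]; then
    printf 'Status: 404 Not Found\nContent-Type: text/plain\n\n'
    printf 'Not Found\n'
    exit 0
fi

status="302 Found"
url="$(cat "${1}")"
printf 'Status: %s\nLocation: %s\n\n' "${status}" "${url}"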
James
24 January 2018, 00:48
The biggest issue is that when I'm using "unorthodox" things like a custom CGI handler script, which may very well have vulnerabilities (how do I know whether some shell script I've written in two minutes is bulletproof?), I would rather not leak the fact that it's there, if possible.

So for instance, if an attacker says "Hmm, it seems that all URLs with this extension are being handled strangely", that gives immediate indication of a possible attack vector.

Whereas if it just shows the usual inscrutable 404 page, there's less information leakage.
(Not that I'm uninterested in making my script as secure as possible.)

Consider the analogy with ShowIndex = no: directories return a 404 rather than a 403, to avoid information leakage. You don't want a potential attacker knowing that there is a directory there if there's no index and listing isn't permitted; likewise, it seems better to give no indication that a CGI handler can be so trivially invoked. Better yet would be to avoid invoking the CGI handler at all when the result is just a 404: if the file doesn't exist, there's no reason to run its parser. (Is that possible?)
James
2 February 2018, 10:07
Revised the script; it now works for Windows-style .url files as well as plaintext ones. I went ahead and put it in a Gist [gist.github.com]:

#!/bin/sh

while read -r line; do
    case "${line}" in
        "http"*)
            url="${line}"
            ;;
        "URL="*)
            url="${line#URL=}"
            ;;
    esac
    # ${url+memes} expands to a non-empty string once url has been set, i.e. once a match was found
    if [ -n "${url+memes}" ]; then
        status="302 Found"
        printf 'Status: %s\nLocation: %s\n\n' "${status}" "${url}"
        break
    fi
done <"${1}"

cat "${1}"
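
For example, a Windows-style shortcut dropped into the same bookmarks directory should now be picked up via the URL= branch (hypothetical filename, and created here with Unix line endings; real Windows .url files use CRLF, which this doesn't deal with):

cat > ~/public_html/bookmarks/windows_style.url <<'EOF'
[InternetShortcut]
URL=http://website.org/favorite_page
EOF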


Should be at least hopefully maybe a tiny bit more bulletproof (Shellshock 2.0 notwithstanding), but my original question still stands:

It would be nice if extension probing didn't obviously and immediately reveal which extensions are or aren't configured as CGI handlers; but is this possible?
Hugo Leisink
4 February 2018, 16:45
Perhaps the 'TriggerOnCGIstatus' option is what you need.
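For example:

TriggerOnCGIstatus = yes

With that enabled, Hiawatha serves its own error page when the CGI returns an error status.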
This topic has been closed.