You are right that blocking connections by User-Agent at the webserver level is neither standard practice nor generally recommended.
But it turns out to be a very effective and popular method:
http://perishablepress.com/press/2009/03/29/4g-ultimate-user-agent-blacklist/
As that blog states: "...serious reduction in wasted bandwidth, stolen resources, and comment spam." This may hold for Hiawatha just as it does for Apache. In my experience, bad User-Agents cannot be stopped by BanOnFlooding, BanOnMaxPerIP, or similar options alone.
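For Apache, the approach that blacklist takes boils down to something like the sketch below (a minimal illustration, not the full 4G list; the matched agent strings are just examples taken from my logs):

```apache
# Minimal sketch of User-Agent blacklisting in an Apache .htaccess,
# in the spirit of the 4G blacklist. The patterns here are examples only.
SetEnvIfNoCase User-Agent "ZmEu" bad_bot
SetEnvIfNoCase User-Agent "Toata dragostea" bad_bot
<Limit GET POST HEAD>
  Order Allow,Deny
  Allow from all
  Deny from env=bad_bot
</Limit>
```

The point is that the request is rejected by the webserver itself, before any application code runs.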
You're also right that "..selecting the output based on certain headers in the request can easily be done in the website code..", but I can't secure code that I didn't write.
I'm just hosting webapps like TWiki, TRAC, OTRS, Roundcube, Hobbit, Squirrelmail, phpMyAdmin, Horde, Joomla, Typo3... I don't have the time and resources to make sure that all that code is secure enough. I even have a hard time keeping all that stuff up to date. Every security-oriented webserver option is a huge help here.
Bad User-Agents should be stopped from requesting anything as soon as possible. My logfiles are filled with entries like:
... User-Agent: Made by ZmEu @ WhiteHat Team - www.whitehat.ro ...
... User-Agent: Toata dragostea mea pentru diavola ...
These attempts are clearly malicious; there is no reason why the webserver or web application should even respond to them. They create unnecessary load on the server and exhaust my allowed-connections limit. Even if all applications were absolutely secure, I'd still want the "MatchBrowser" option as a security/performance feature.
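To see how much of the traffic such agents account for, a quick tally over the access log is enough (a sketch assuming the User-Agent appears in each log line; the log path and the two patterns are placeholders, not a real blacklist):

```shell
#!/bin/sh
# Count requests from known-bad User-Agents in an access log.
# The path and patterns are examples; adjust them to your setup.
LOG=${1:-/var/log/hiawatha/access.log}
grep -E -c 'ZmEu|Toata dragostea' "$LOG"
```

Running this after a day of traffic makes it easy to show how many requests a User-Agent filter would have rejected outright.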
Anyway, thank you for making Hiawatha simply the best webserver out there.