Forum

9.13 ChallengeClient bans browsers and bots?

Anton
14 June 2015, 22:59
I have tried the following setting on a real site with traffic.

ChallengeClient = 25, httpheader, 1800
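
(My reading of the three parameters, based on the manual; I may be wrong:)

# ChallengeClient = <connections threshold>, httpheader|javascript, <ban time>
#   25         - start challenging clients once 25 simultaneous connections are reached
#   httpheader - challenge via an HTTP header/cookie (javascript mode sends a piece of JS instead)
#   1800       - ban clients that fail the challenge for 1800 seconds
ChallengeClient = 25, httpheader, 1800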

It bans search engine crawlers and other bots.
The option is very good for finding really bad bots; I have found scrapers working through proxy servers.
But it also bans real users, which is very strange. Maybe I'm doing something wrong again.

I checked this in real time by watching system.log and running
grep '1.1.1.1' access.log
to see who was banned and for what.
Hugo Leisink
15 June 2015, 11:15
Perhaps the reverse proxy is blocking cookie headers?
Anton
16 June 2015, 00:29
There is no reverse proxy; I use only Hiawatha. I have also tried the javascript option.
It bans me as well.

Could you recommend configuration rules, including the ChallengeClient option, that were tested successfully while this option was implemented?

Could a large dedicated server provider do such tricks with a reverse proxy in front of client servers?
That gives me the idea to test a Hiawatha site on different hosting or at home.
Anton
16 June 2015, 00:44
hiawatha -v

shows

Hiawatha v9.13, cache, IPv6, Monitor, reverse proxy, TLS (1.3.10), Tomahawk, URL toolkit, XSLT

I thought you probably meant Hiawatha's reverse proxy. I don't use any reverse proxy options.
Anton
16 June 2015, 00:46
config

ServerId = www-data
ServerString = Server
ConnectionsTotal = 10000
ConnectionsPerIP = 50
SystemLogfile = /var/log/hiawatha/system.log
GarbageLogfile = /var/log/hiawatha/garbage.log
ExploitLogfile = /var/log/hiawatha/exploit.log

Binding {
	Port = 80
}

BanlistMask = deny 66.249.64.0/19
BanOnGarbage = 1800
BanOnFlooding = 50/1:1800
BanOnMaxPerIP = 1800
KickOnBan = yes
RebanDuringBan = yes
ReconnectDelay = 200
ChallengeClient = 25, httpheader, 1800


gives these problems.
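
For reference, a variant I plan to try (a sketch only; the threshold of 100 and the ban time of 300 are my guesses to reduce false positives, not tested values):

# Challenge clients only under heavier load and ban challenge failures more briefly.
ChallengeClient = 100, httpheader, 300
# Crawlers can't answer a cookie challenge, so keep their networks exempt
# from banning (66.249.64.0/19 is Google's crawler range, as above).
BanlistMask = deny 66.249.64.0/19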
This topic has been closed.