View Full Version : Protecting a server from excessive requests

12-17-2011, 10:05 AM
I'm asking both out of curiosity (since I know very little about this topic) and to potentially fix a somewhat annoying situation with my site.

I have an old site with a forum that is generally inactive. It's still online in case anyone wants to go back and look something up, but it's not used on a daily basis.

I switched servers about 6 months ago, and every once in a while I receive error emails from the forum (maybe 1 time per week on average, but sometimes up to 3 in one day, and sometimes not for several weeks).

The forum is set to send error messages to the administrative email when it can't connect to the database. The email says "Warning: forum cannot connect to database. Check with your host."

But every time I try to see what's wrong, it's all working fine. I've NEVER duplicated the problem as a visitor to the site.

Recently I decided to put some time into it and I modified the forum's error email (of course embedded deep within many layers of unlabeled includes), so that now I get an email like before but with the extra information of: 1) Page requested (URI); 2) Browser information (and IP); 3) Time.

That's a lot more helpful, but unfortunately this only happens once in a while so I've only received two messages since I updated that.

The first browser string was for an MSN search bot. The second was for a normal Windows user as far as I can tell.

So, my question is:
1) Are these users overloading the system? Maybe the MSN bot (and the user?) is making many requests per second, so the database hits a connection quota.

2) If they are, how can I stop them from bothering the server?

This is a shared hosting account with GoDaddy, and honestly I don't mind that much if once in a while there are a few seconds of database downtime on my personal website. It's not high traffic.

But I'd like to know for the future in case this happens again and also just to stop getting the emails every week or so. (I could disable them, but that seems counterproductive.)

12-18-2011, 03:49 AM
It's certainly possible that you're getting periodic, heavy "spurts" of traffic - it's not really that difficult to exceed the allowed number of concurrent MySQL connections. The MySQL default is 151 concurrent connections (your host may well have a much lower limit), and connections are _not_ immediately available to be re-opened as soon as they're closed.
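The connection cap is easy to picture as a fixed pool of slots. Here's a minimal Python sketch (nothing from any forum package; the cap of 3 is illustrative, standing in for the real 151 or your host's lower limit) of what a burst of simultaneous requests looks like when none of them has released its connection yet:

```python
import threading

# Sketch of why a traffic spurt exhausts connections: the server accepts
# at most max_connections at once (MySQL default: 151; a shared host may
# allow far fewer). An illustrative cap of 3 stands in for that limit.
MAX_CONNECTIONS = 3
slots = threading.BoundedSemaphore(MAX_CONNECTIONS)

# Five "simultaneous" page requests each try to open a connection while
# none has closed yet; acquire(blocking=False) mimics the server turning
# a new connection away instead of queueing it.
attempts = [slots.acquire(blocking=False) for _ in range(5)]
print(attempts)  # -> [True, True, True, False, False]
```

The last two requests are exactly the "cannot connect to database" error your forum mails you about: a bot re-requesting pages several times a second can use up the slots the same way.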

I'm sure you already follow some of these suggestions:

1. Only use as many connections per page as necessary. If at all possible, shoot for one connection per user HTTP request.

2. As soon as you're sure you won't need the connection anymore, explicitly close it.

3. If you're still having problems, look at caching the results of your most common SQL queries. That avoids re-running them, lets you close the connection earlier, and can remove the need to open one at all. Use a caching extension (adodb (http://adodb.sourceforge.net/), for example) or write your own functions if you don't need such a heavy solution.

Take a look at the traffic records for your site and see if heavy traffic really is what you're up against.


12-18-2011, 04:36 AM
Thanks for the info. That's very useful for the future, but at the moment I'm using forum software that I didn't write and don't really want to modify to that extent. Any ideas for when you're stuck with prebuilt software?

But I do think that the software uses only one connection. It's used continuously while the page loads, so it stays open during the whole request, unfortunately.

And in this case I'm most interested in finding out how to block users that take up too many resources, rather than in making the website more efficient. Both could work, but since the site never gives me trouble as a normal user, I'm happy with it as it is; anyone abusing it should simply be blocked.
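For what it's worth, one common way to block resource hogs without touching the forum code much is a per-IP rate limiter in front of it. The sketch below is a generic sliding-window limiter in Python; the names, limit, and window are all illustrative assumptions (on GoDaddy shared hosting, simply denying the offending IPs in .htaccess may be the easier route):

```python
import time
from collections import defaultdict, deque

# Sliding-window rate limiter: an IP making more than LIMIT requests in
# any WINDOW-second span gets refused (e.g. with an HTTP 429 response).
LIMIT = 5          # max requests allowed...
WINDOW = 10.0      # ...per this many seconds
_hits = defaultdict(deque)   # ip -> timestamps of recent requests

def allow(ip, now=None):
    now = time.time() if now is None else now
    q = _hits[ip]
    while q and q[0] <= now - WINDOW:
        q.popleft()                 # drop requests older than the window
    if len(q) >= LIMIT:
        return False                # over the limit: refuse this request
    q.append(now)
    return True

# A burst of 7 requests from one IP in the same second: 5 pass, 2 refused.
burst = [allow("203.0.113.9", now=1000.0) for _ in range(7)]
```

Once the window slides past, the same IP is allowed through again, so a well-behaved search bot only gets slowed down, not banned outright.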