Protecting a server from excessive requests
I'm asking both out of curiosity (since I know very little about this topic) and to potentially fix a somewhat annoying situation with my site.
I have an old site with a forum that is generally inactive. It's still online in case anyone wants to go back and look something up, but it's not used on a daily basis.
I switched servers about 6 months ago, and every once in a while I receive error emails from the forum (maybe once per week on average, though sometimes three in one day, and sometimes none for several weeks).
The forum is set to send error messages to the administrative email when it can't connect to the database. The email says "Warning: forum cannot connect to database. Check with your host."
But every time I try to see what's wrong, it's all working fine. I've NEVER reproduced the problem as a visitor to the site.
Recently I decided to put some time into it, so I modified the forum's error email (buried, of course, deep within many layers of unlabeled includes). Now I get the same email as before, but with three extra pieces of information: 1) the page requested (URI); 2) the browser string (and IP); 3) the time.
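The forum itself is a tangle of includes, but the extra logging amounts to something like the sketch below. (Written in Python purely for illustration; the forum's actual language is different. The variable names follow the standard CGI-style environment, and the function name is made up.)

```python
import os
from datetime import datetime, timezone

def build_error_report(environ=None):
    """Assemble the diagnostic text for the error email:
    requested URI, browser string plus IP, and a timestamp.
    (Hypothetical helper; not the forum's real code.)"""
    env = environ if environ is not None else os.environ
    return "\n".join([
        "Warning: forum cannot connect to database.",
        f"URI: {env.get('REQUEST_URI', 'unknown')}",
        f"User-Agent: {env.get('HTTP_USER_AGENT', 'unknown')}",
        f"IP: {env.get('REMOTE_ADDR', 'unknown')}",
        f"Time: {datetime.now(timezone.utc).isoformat()}",
    ])

# Example with a fake request environment:
report = build_error_report({
    "REQUEST_URI": "/forum/viewtopic.php?t=123",
    "HTTP_USER_AGENT": "msnbot/2.0b (+http://search.msn.com/msnbot.htm)",
    "REMOTE_ADDR": "203.0.113.7",
})
print(report)
```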
That's a lot more helpful, but since the error only happens once in a while, I've only received two messages since making that change.
The first browser string was for an MSN search bot. The second was for a normal Windows user as far as I can tell.
So, my questions are:
1) Are these users overloading the system? Maybe the MSN bot (and the user?) are making many requests per second, so the database hits the shared host's connection limit?
2) If they are, how can I stop them from bothering the server?
This is a shared hosting account with GoDaddy, and honestly I don't mind much if the database on my personal website is down for a few seconds once in a while. It's not high traffic.
But I'd like to know for the future in case this happens again, and also just to stop getting the emails every week or so. (I could disable them, but that seems counterproductive.)
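For what it's worth, one low-effort mitigation I could try, if the MSN bot turns out to be the culprit, is a `Crawl-delay` directive in robots.txt, which Microsoft's crawler honors (the delay value here is just an illustration):

```
User-agent: msnbot
Crawl-delay: 10
```

That wouldn't help with misbehaving bots that ignore robots.txt, but it seems like the cheapest first step on shared hosting, where I can't configure the web server itself.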