Server Config Suggestions, Web, DB, MySQL, PHP



Lightson
11-30-2006, 09:08 PM
A few questions for the hardware and DB design gurus out there.

Business is growing rapidly and we are experiencing heavy traffic on several of our sites now. I guess it is a good problem to have. We are not having major problems yet, but to prevent server slowdowns that affect our in-house operations down the road, I want to look harder at restructuring our server setup, i.e., separating the web and DB servers. Some of the bottleneck we are seeing now comes from huge, improperly indexed data files and old legacy code with inefficient queries.
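
For reference, here's the kind of fix I mean on the indexing side; a minimal sketch in MySQL, where the table and column names are made up for illustration:

    -- Check a suspect query with EXPLAIN:
    EXPLAIN SELECT * FROM orders WHERE customer_id = 1234;
    -- "type: ALL" in the output means a full scan of every row,
    -- i.e. no usable index on customer_id.

    -- Add an index on the column the WHERE clause filters on:
    ALTER TABLE orders ADD INDEX idx_customer (customer_id);

    -- Re-running EXPLAIN should now show "type: ref" with
    -- "key: idx_customer" -- the full scan is gone.

That part is a code/schema fix rather than a hardware one, which is why I'm treating it separately from the server question below.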

I'm already going down the path of increasing bandwidth to rule it out as a bottleneck, since we host our own servers in-house, but I'm also looking for suggestions on restructuring our hardware so that lack of processing power isn't a bottleneck either.

Here is what I'm thinking - feel free to offer your input and suggestions:

A single web server per site. This ensures that each site's ability to serve pages is affected only by that site's own traffic.

A shared DB server (loaded with RAM, two or more processors, SCSI drives) to handle all of the heavy MySQL query traffic. I've heard that separating the DB work from the actual serving of the final page increases efficiency when pages are loaded. And I'm not just talking about serving pages to external traffic: our internal systems are custom written and served up with PHP/MySQL as well, so there are plenty of processor-intensive internal applications hitting the web servers too.
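
On the application side the split looks small; here's a rough PHP sketch of what I have in mind, where the web boxes connect to the DB box over the internal network instead of localhost (the hostname, credentials, and table below are placeholders, not our real setup):

    <?php
    // Web server talks to the dedicated DB box over the LAN
    // instead of a local socket; host/credentials are placeholders.
    $db = mysqli_connect('db1.internal.lan', 'app_user', 'secret', 'maindb');
    if (!$db) {
        die('DB connection failed: ' . mysqli_connect_error());
    }

    // The query runs on the DB server's CPUs and disks; the web
    // server only turns the result set into HTML, so the two
    // workloads stop competing for the same hardware.
    $result = mysqli_query($db, 'SELECT id, title FROM articles LIMIT 10');
    while ($row = mysqli_fetch_assoc($result)) {
        echo htmlspecialchars($row['title']) . "<br>\n";
    }
    mysqli_close($db);
    ?>

The trade-off, as I understand it, is that every query now crosses the wire, so the web-to-DB link would need its own gigabit segment, and the DB box's RAM would go to MySQL buffers (key_buffer_size and friends in my.cnf) rather than to Apache.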

I could go a step further and create separate web and DB servers for each site, which would be great for our main site that pulls the most traffic, but that seems like overkill for our lower-traffic sites that don't really have this problem.

In short, I never want lack of bandwidth or processing power to be the reason a crawler stays away from our sites, or to have it adversely affect our internal operations. I know throwing money at a problem doesn't solve everything, but it seems that some fairly snappy servers, plus separating the DB work from the page serving, would go a long way toward preventing slowdowns.

It would be great to know if anyone has been down this path before, and I look forward to your views on this.