What are web gardens?
Web gardens are different from Web farms. A Web garden is configured on a single server by specifying multiple worker processes for an application pool, whereas a Web farm uses multiple servers for a Web site.
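On IIS 7 and later, for example, a Web garden can be enabled by raising an application pool's maxProcesses setting with Appcmd.exe (the pool name below is just a placeholder):

```shell
:: Run the application pool "MyAppPool" (placeholder name) with
:: five worker processes, turning it into a web garden.
%windir%\system32\inetsrv\appcmd.exe set apppool "MyAppPool" /processModel.maxProcesses:5

:: Verify the setting:
%windir%\system32\inetsrv\appcmd.exe list apppool "MyAppPool" /text:processModel.maxProcesses
```

Setting maxProcesses back to 1 returns the pool to a single worker process.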
Creating a Web garden for an application pool can also enhance performance in the following situations:
- Robust processing of requests: When a worker process in an application pool is tied up (for example, when a script engine stops responding), other worker processes can accept and process requests for that application pool.
- Reduced contention for resources: When a Web garden reaches a steady state, each new TCP/IP connection is assigned, according to a round-robin scheme, to a worker process in the Web garden. This helps smooth out workloads and reduce contention for resources that are bound to a single worker process.
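The round-robin scheme above can be sketched in a few lines. This is a simplified model of the idea, not IIS's actual scheduler:

```python
from itertools import cycle

def assign_connections(connections, worker_count):
    """Assign each incoming connection to a worker process in
    round-robin order, cycling back to worker 0 after the last one."""
    workers = cycle(range(worker_count))
    return [(conn, next(workers)) for conn in connections]

# Five connections spread across a three-process web garden:
print(assign_connections(["c1", "c2", "c3", "c4", "c5"], 3))
# [('c1', 0), ('c2', 1), ('c3', 2), ('c4', 0), ('c5', 1)]
```

Because assignment happens per connection, no single worker is bound to all of the traffic, which is what smooths out the load.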
When do we need to go for a Web garden?
A Web garden is not free: the same workload spread across several worker processes costs more memory and CPU than a single process, so use one only when the robustness benefits above outweigh these costs.

More RAM: If you have a single process serving a site/application and compare it to a web garden of five processes serving the exact same traffic load, you'll generally see more RAM consumed on the server. Part of the reason is that just spinning up a .NET application, even with no load, requires a certain amount of memory overhead. Another factor is application caching: multiple copies of the cached data (one per process) need to be stored.
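Back-of-the-envelope, that duplication scales linearly with the process count. The figures below are purely illustrative assumptions, not measurements:

```python
def garden_memory_mb(processes, base_overhead_mb=60, cache_mb=200):
    """Rough total RAM for a web garden: each worker process pays its
    own runtime overhead and holds its own copy of the in-process
    cache. Both per-process figures are hypothetical defaults."""
    return processes * (base_overhead_mb + cache_mb)

print(garden_memory_mb(1))  # 260 MB for a single process
print(garden_memory_mb(5))  # 1300 MB for a five-process garden
```

With these (made-up) numbers, the five-process garden needs five times the RAM to serve the same traffic, which is the trade-off to weigh.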
More CPU: Multiple processes often consume more processor capacity in a web garden even with consistent traffic loads. In general, the more processes an OS has to track and manage, the more resources it takes. I'm not sure exactly why, but we've seen processor usage climb drastically as web garden process counts are increased. On a busy web server this can actually overload the server, where the same load might be handled perfectly by a single process with a much lower overall resource footprint.