minivend server specs (LONG)

On Tue, 17 Aug 1999, betty c wrote:

> ******    message to minivend-users from betty c <bc90007@yahoo.com>     ******
> 
> Kyle,
>    What's your spec idea for a minivend server.
>    Currently, I am shopping/putting together one for a grocery website.
> 

I assume that I am the Kyle you are talking to here :-)

What kind of server you build is going to depend on the kind of
interactions you expect.

You need to answer several questions:

1) how many people will come play with the MV part of the site?

2) how many people at peak times?

3) How complex is the MV part of the site?

4) what is an acceptable delay in user response time?

5) what percentage of users will make it all the way through the ordering
process?

6) how long does the average user interact with the system?

You need to calculate capacity based on the peak.  Assume everyone hits
the checkout page at once.
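
As a rough back-of-the-envelope sketch (the numbers below are only
placeholders; plug in your own answers to the questions above), the
arithmetic looks something like this:

    # Back-of-the-envelope peak sizing.  All numbers are illustrative.
    peak_users       = 5      # question 2: users hitting checkout at once
    seconds_per_page = 3.0    # time to build your slowest (checkout) page

    # If everyone hits the checkout page at the same moment, that many
    # requests are in flight at once, and the server has to clear them at
    # roughly this rate to keep the queue from growing:
    required_pages_per_second = peak_users / seconds_per_page

    print("concurrent requests at peak:", peak_users)
    print("required throughput: %.2f pages/second" % required_pages_per_second)

If the required rate is more than one box can sustain on its worst page,
you need either faster pages or more capacity.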

I am finding the following characteristics:

I have a P-II 400 with 256MB of RAM and a 2GB IDE drive.

I get access times that are usually sub-second until I get to the checkout
page.  There are two parts to that cost: the first is the SSL set-up time,
and the second is the time MV needs to render the page into HTML.  I have a
very complex checkout page (over 20K, most of it MV tags) compared to most
I have seen.  The total time of SSL set-up plus checkout rendering is about
three seconds :-(
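
If you want to see where those seconds go on your own box, a quick (and
admittedly crude) timing sketch like the one below can split the connect,
SSL set-up, and page-generation/transfer times apart.  The host and path
here are made up; point it at your own secure checkout page.

    import socket, ssl, time

    host = "secure.example.com"                  # placeholder host
    path = "/ord/checkout.html"                  # placeholder checkout page

    t0 = time.time()
    raw = socket.create_connection((host, 443))
    t1 = time.time()                             # TCP connect finished
    conn = ssl.create_default_context().wrap_socket(raw, server_hostname=host)
    t2 = time.time()                             # SSL handshake finished
    conn.sendall(("GET %s HTTP/1.0\r\nHost: %s\r\n\r\n" % (path, host)).encode())
    while conn.recv(4096):                       # read until the server closes
        pass
    t3 = time.time()                             # whole page received
    conn.close()

    print("TCP connect:          %.2fs" % (t1 - t0))
    print("SSL set-up:           %.2fs" % (t2 - t1))
    print("render plus transfer: %.2fs" % (t3 - t2))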

We do not expect large numbers of users.  Of the people who want to order,
we think nearly all will complete the process.  We have an existing
e-commerce site, and that seems to be the case there.

My assumptions are that I will get no more than five users at peak and
that we will get something like 500 users per day.  We have sold our
products in more than 160 countries with most of our sales being outside
the US.  Because our customers and potential customers are spread so
widely, we don't see any single peak of hits during the day.  Usually we
can just barely tell when it is "prime time" in the US (one mild peak),
Europe (another mild peak), and Asia (again a mild peak).  However, certain
areas, like Eastern Europe and South America, are getting more and more hits.

I do a lot of database stuff behind the scenes, so my pages tend to be
slower than plain static HTML would be.  Also, nearly all names, products
and prices are determined dynamically based on the reseller from which the
customer arrived.  Even so, all the pages are quite fast except the checkout
page.  I don't think I can make a static page for that, so I am stuck with
it being the bottleneck.

My pages are probably not the best indication of maximum MV performance.

There is no hard and fast rule about capacity planning for web sites.  It
completely depends on what you are doing on your site and how your users
behave.  I know of sites (not using MV) that do completely dynamic pages
and yet manage to serve 500k impressions per day on a P-II 300 machine
with 256MB of RAM.
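
To put a figure like that in perspective (the 4x peak factor below is just
an assumption, not a measurement):

    impressions_per_day = 500000
    average_rps = impressions_per_day / 86400.0   # seconds in a day
    peak_rps    = average_rps * 4                 # assumed peak-to-average ratio
    print("average: %.1f req/s, assumed peak: %.1f req/s"
          % (average_rps, peak_rps))
    # about 5.8 req/s on average, low twenties at the assumed 4x peak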

I used to do this kind of capacity planning as part of my former life with
IBM Global Services e-Commerce division.  It isn't easy and there are all
kinds of things that can affect the end result.  Remember, machines are
actually quite cheap these days.  $1500 can buy a fairly powerful Linux
server.  If there is any question whether you have enough capacity, buy
another machine.

My rules of thumb are:

1) a few (low) hundred simultaneous users per machine if you are serving
static HTML and have a good multithreading/multiprocess web server.  This
drops as the pages become more dynamic.  If you are using IIS, divide by a
factor of two or three at a minimum because it tends to do stupid things
at high loads.  This is assuming a P-II class machine.  A stacked Sun
Enterprise E10000 will support a lot more.  An IBM S/390 can support
something like 6-10k simultaneous connections, but who has a few million
dollars to spend on a web server?

2) reduce web-server dependencies on external machines.  This helps reduce
traffic, but more importantly often makes it easier to scale by simply
adding new machines.

3) Hardware is cheap; losing customers is not.  RAM helps keep things from
hitting the disk.  A good network card is very important.  USE 100Mbit.
The hubs and cards are relatively cheap now, so there is no excuse.  The
10Mbit cards tend to be really bad for latency and end throughput.

4) Each page click loses a few people.  Make sure people can do what they
want with the fewest possible clicks.

5) plan for growth.  The Internet is still growing by more than a factor
of two each year.  If your business is global, or at least has a wide
reach, you should plan to grow that much or more (a rough projection is
sketched just after this list).

6) Identify those elements of your system that cannot be easily scaled by
adding more machines (such as central databases for inventory). Figure out
how you are going to work around them or how to increase the capacity of
those systems.

7) The end user experience is composed of many factors such as
look-n-feel, speed etc.  If the user likes the experience on your site,
you might just be bookmarked.  End users don't care whether you run Linux,
Solaris, BeOS, Windows etc.  They only see the HTML.

8) critical success factors are end-user happiness and making enough money
to eat and grow.  It is easy to get into religious arguments about various
technologies etc.

9) Think security from day one.  Your webserver is probably outside any
firewall that might protect it.  It is exposed.  We get port scanned and
checked quite often.  If you have important data, seriously consider
keeping it on a separate server that is more protected.  If you run Linux,
make sure that you have NFS, imap etc. turned OFF.  Check /etc/inetd.conf
and close all the holes there.  
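
For the inetd.conf point in (9), a quick way to see what is still switched
on is a little audit like the one below; it just assumes the classic
one-service-per-line inetd format.

    # List services still enabled in /etc/inetd.conf (anything not
    # commented out).  Assumes the classic inetd format: service name
    # in the first column of each non-comment line.
    with open("/etc/inetd.conf") as conf:
        for line in conf:
            line = line.strip()
            if line and not line.startswith("#"):
                print("still enabled:", line.split()[0])

Anything that prints and that you don't actually need should get a '#' in
front of it; then HUP inetd so it re-reads the file.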

etc.
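
Going back to point (5), the projection I mean is nothing fancier than
this (the starting figure and the doubling rate are just examples):

    users_per_day = 500          # illustrative starting point
    growth_factor = 2            # "more than a factor of two each year"
    for year in range(1, 4):
        users_per_day *= growth_factor
        print("year %d: roughly %d users/day" % (year, users_per_day))
    # year 1: 1000, year 2: 2000, year 3: 4000 -- size your hardware with
    # that curve in mind, not with today's numbers.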

Obviously a lot of this stuff applies more to large-scale installations
than to the odd single MV system.  However, the Internet has a way of
quickly destroying all preconceptions about how fast your site can grow.
A very important point to remember is that all your competitors are
basically as easy to reach as you are.  If you don't have the fastest site
or something unique to keep customers coming back, they won't.

You can do incredible things with very simple servers.  Yahoo serves more
than a billion pages (many billions if I remember correctly) from about
1200 servers.  I have seen some of their facilities in co-location
companies around the San Francisco Bay Area.  It is impressive!

Your mileage may vary.

Best,
Kyle


