An inherent problem with sharing in hosting
The regular hosting package
Traditionally, hosting packages have two very basic, measurable dimensions: disk space and the amount of monthly data transfer. Alongside these comes a plethora of other features, which are basically permissions to use specific server-based applications: email, PHP, MySQL, ASP, FTP and many others.
While this might seem to most people like the complete picture of a shared hosting package, it really isn’t. Something (or some things, actually) is missing from this picture. It will be quite surprising for some of you to find out what’s missing. Most likely it never crossed your mind, because it’s just too basic to think of hosting without it.
The missing link
Once you sign up with a host you will be allocated a slice of space on the server’s hard disk. The next thing you do is upload your website, set things up and you’re done. Once the new nameservers propagate, your website will be up and running.
All you have to do next is to promote your website and get as many visitors as you can, right? Right! However, as your website grows it might need more space or more data transfer allowance. The solution is simple: upgrade to a bigger account, with more space and more data transfer.
Do you see what’s missing from this picture yet? Do you see anything else that your website is using besides data transfer and space? A missing link? Most likely you don’t see it, but your website is using something else. Imagine that your website is transferring 10,000GB per month. Still not finding what is missing from this picture? Well, that something is generally referred to in the hosting industry as "resources".
Come again? Resources? What are those? They’re the things that make the whole server work: basically the CPU (the processing unit, the "brain" of the computer) and the internal memory. The fact is that the whole "shared hosting" concept almost revolves around these two things, as they are two of the most important things that all the websites hosted on a server share.
Because the access to these shared resources is basically unrestricted, a website or script will use as much CPU and/or memory as it needs – sometimes to the point where the other websites on the server are affected.
Obviously the administrator cannot and will not let a client use resources up to that point. In the Terms of Service the host will reserve the right to temporarily suspend and even completely terminate an account that uses too many resources. It’s basically a "fair use" policy. After all, you cannot expect to use the whole processing power of a CPU when you’re paying only for a share.
However, the trouble is that no one knows what that share is (besides the host, of course), so this opens a very big door for abuse.
No matter how you look at it, the CPU and the memory are oversold. Why? Well, for one, because there’s no real or exact limit to their use. The host will approximate an "average" or "normal" usage of resources per unit of transferred data and will suspend a website that uses too much over that value, to the point where it affects the performance of the server. It will then ask the customer to upgrade. Websites will generally be allowed to use higher-than-average amounts of resources from time to time, but not all the time.
Somehow this whole thing reminds me of a Seinfeld episode where George dipped his chip, took a bite and then dipped again. Let’s assume that the data transfer and the hard disk space are the chips and the dip represents "the resources".
What a host does is sell you and the other clients a number (quantity) of chips, put a bowl of dip in front of all of you and say: "Dipping is free, but the bowl is only this big. Everyone can dip according to his needs, but if we notice that someone abuses dipping, up to the point where it’s affecting the other clients using that same bowl, they’ll get a slap on the wrist. If they do it again we’ll not only slap them, but we’ll show them the door."
Sure, this example is about products rather than services (as is the case with hosting), but it’s just meant to give you a more vivid picture.
Going a bit further now, one cannot overlook the fact that these resources that we’re talking about are often strong selling points. Many hosts mention the fast processors that they use and the vast amounts of internal memory that their servers are equipped with. This supposedly means that the client gets access to a powerful computing system so his website will "fly like the wind".
Well, assuming that the CPU power were not oversold, just how much of it should a customer be allowed to use? Here’s an example:
Say a CPU is shared by 200 websites (equal accounts in terms of space and data transfer). That means a website should be allowed to use a maximum of 1/200 (or 0.5%) of the processing power of the CPU. Less even, because a server becomes unstable at high CPU usage, so a maximum figure of 0.25% is more likely.
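The arithmetic above can be sketched in a few lines of Python. The numbers are the hypothetical ones from the example (200 equal accounts), and the "stability headroom" factor is an illustrative assumption, not a measured value:

```python
# Illustrative sketch: the fair per-account CPU share when one server's
# processor is split equally among N equal hosting accounts.

def fair_cpu_share(accounts: int, usable_fraction: float = 1.0) -> float:
    """Return the per-account CPU share as a percentage.

    usable_fraction models the headroom a server needs to stay stable:
    at 0.5, only half the CPU is considered safely usable (an assumed
    figure for illustration).
    """
    return 100.0 * usable_fraction / accounts

print(fair_cpu_share(200))         # naive equal split: 0.5 (% per account)
print(fair_cpu_share(200, 0.5))    # with stability headroom: 0.25
```

The exact headroom a server needs varies with its workload; the point is only that the fair share per account is a fraction of a percent, far below the 5–10% limits some TOS documents mention.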
However, a host will not close websites for using just 0.25% of the processor, even on a consistent basis. You might get to read about limits of 5% or 10% CPU usage in the TOS of some hosting companies. (Yet another sign of overselling.)
Most hosts allow the customer to use high amounts of resources if he did not use much of his data transfer quota. This is done in an effort to show that they’re not overselling data transfer, which may very well be true, but in this way they actually prove that they’re overselling the other resources.
Translation: one user can "consume" more CPU than others, and even much more than his money has actually bought. If customers pay for 0.25% each but are allowed to use up to 5% on a consistent basis, it’s obvious that the host is betting that very few websites will actually use their 0.25% share of the processing power. That is the very basis of overselling: because users on average don’t actually use what they pay for, the provider can promise a lot more – as long as only a few of the customers actually use it.
What if all or most of those websites suddenly start to use 1-2% of the CPU? Obviously the server will not be able to cope with this, and the company will have to redistribute the websites to other servers and cover the costs, but for a while it will be unable to provide the services. Why? Because they are sold beyond the means of delivery or, in a single word, oversold.
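The bet the host is making can be shown with the same hypothetical numbers (200 sites, a 0.25% fair share, a 5% advertised limit):

```python
# Illustrative sketch of the overselling bet. All figures are the
# hypothetical ones from the example, not real host data.

def total_cpu_demand(n_sites: int, pct_each: float) -> float:
    """Total CPU demand (in %) if n_sites each use pct_each percent."""
    return n_sites * pct_each

# If every one of the 200 sites used only its 0.25% fair share,
# the server would sit exactly at the "safe" 50% load:
print(total_cpu_demand(200, 0.25))   # 50.0

# But if just a tenth of the sites push to the advertised 5% limit,
# those 20 sites alone demand the entire CPU:
print(total_cpu_demand(20, 5.0))     # 100.0

# And the scenario from the text: most sites creeping up to 1-2%
# already oversubscribes the processor several times over:
print(total_cpu_demand(200, 1.5))    # 300.0
```

The host’s promise only holds while usage stays far below the advertised limits; the moment average usage rises, the sums simply stop adding up.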
Extremes make great examples
Many hosts boast that they allow users to consume high amounts of "bandwidth" and "resources" without asking them to upgrade, yet they still have a "resource abuse" policy in their TOS.
Translation: "we do our best not to limit usage but because we’re overselling the resources, at some point we have to tell you that you’re using much more than what you paid for and ask you to upgrade."
When hosts create their packages, they basically add up the costs of running the server and decide on a selling price. That price is then divided into pieces (accounts) according to allocated hard disk space and bandwidth.
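That pricing logic can be sketched roughly as follows. Every number here is invented, and the equal weighting of disk and bandwidth is an assumption for illustration; the notable part is what the formula leaves out, since CPU and memory appear nowhere in it:

```python
# Hedged sketch of the pricing split described above (all figures are
# hypothetical). The account price depends only on disk and bandwidth;
# CPU and memory are absent from the formula entirely.

def account_price(server_cost: float,
                  disk_total_gb: float, bw_total_gb: float,
                  disk_gb: float, bw_gb: float,
                  margin: float = 0.3) -> float:
    """Price one account as its share of the server's (marked-up) cost.

    disk and bandwidth are weighted equally here -- an arbitrary choice
    made only to keep the illustration simple.
    """
    share = 0.5 * disk_gb / disk_total_gb + 0.5 * bw_gb / bw_total_gb
    return server_cost * (1.0 + margin) * share

# A $200/month server with 1,000GB of disk and 10,000GB of transfer,
# sold as an account with 5GB of disk and 50GB of transfer:
print(round(account_price(200.0, 1000.0, 10000.0, 5.0, 50.0), 2))  # 1.3
```

Two accounts with identical disk and bandwidth get identical prices under this scheme, even if one runs a static brochure site and the other a CPU-hungry script, which is exactly the gap the next paragraph describes.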
The trouble is that a website using 100GB of monthly data transfer could very well need a whole server in terms of CPU usage. Sure, when you’re paying for 100GB of data transfer on a shared server you’re not supposed to use 100% of the CPU, but as long as CPU and memory usage are not clearly defined, limited, measured and sold, you are at the mercy of the host. If the host says that you were using too much, then that’s it. You will either have to upgrade (often this means getting a dedicated server) or you’ll have to change hosts, hoping (as strange as it might sound) that the current host lied in an attempt to make you pay more.
Sure, most hosts will not like my statement, but some will agree with it, I’m sure. There are hosts out there that are definitely against any kind of overselling, and they most likely know by now that this resources thing is a gray area. They oversell neither space nor data transfer, but they can’t really do anything about the resources, because tools for measuring their usage the way bandwidth and space are measured are not available.
Sure, a VPS (virtual private server) might seem like a good solution, but that’s not really shared hosting, and very small VPSs are not really efficient. Plus, a VPS generally requires more knowledge on the part of the user, knowledge that the average user simply doesn’t have.
Is overselling a bad thing?
Tough question to answer! As this is a very controversial issue, all I can do is express my opinion on the matter (again). I do believe that overselling can be done smartly, even when it comes to resources. One proof is that even though all "shared hosts" oversell the CPU’s capacity, there are lots of good, successful hosts out there.
This article is not about overselling being "bad"; it’s about the fact that as a hosting customer you’re not given the specifics of your account. You don’t know what your share of CPU and memory is, the way you do with space and bandwidth.
Probably sometime in the future (though not a very near one) this issue will be dealt with. However, I don’t see a movement toward this end, neither on the part of the hosting companies nor on the part of the customers. Most people seem to be satisfied with the current status quo. Without a driving force behind it, such a change will not happen.
I do believe that a real, complete package (with the CPU and memory usage clearly specified in some way) is something that could help hosts and customers alike, and I’m looking forward to seeing it become a reality. All I can do, though, is write this article, wait, and hope that it will help people realize that there are things that need to be improved.