In the modern hosting world, there are three basic choices if you want access to a complete server environment: virtual private servers, cloud servers, and dedicated servers. Each has benefits and potential drawbacks, and it’s important that application developers try to see through the marketing hype to the real-world capabilities of each platform.
It might seem as though the default choice these days is a cloud server — but, while the cloud does offer clear advantages in many scenarios, it’s often not the optimal choice. I’d like to take a look at a few of the reasons many application developers choose dedicated hardware instead of opting for a cloud server.
Better Per-Dollar Performance
It wouldn’t be strictly true to say dedicated servers are less expensive than cloud servers. It’s possible to use a low-spec cloud server for next to no money and impossible to find a decent enterprise-grade dedicated server at that price.
However, a dedicated server will almost always offer a better price-to-performance ratio for high-load applications with predictable resource demands. If you were to compare a cloud server with similar capabilities to a dedicated server, any cost advantage would disappear.
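As a back-of-the-envelope illustration of the price-to-performance comparison, here is a minimal sketch in Python. The monthly prices and core counts are invented for the example, not quotes from any real provider; the point is only that an always-on cloud instance with comparable specs tends to cost more per unit of capacity.

```python
# Crude price-to-performance metric: monthly cost per CPU core.
# All figures below are hypothetical, for illustration only.
def cost_per_core(monthly_price, cores):
    """Return the monthly cost per CPU core."""
    return monthly_price / cores

dedicated = cost_per_core(monthly_price=200.0, cores=16)  # hypothetical dedicated box
cloud     = cost_per_core(monthly_price=400.0, cores=16)  # hypothetical comparable cloud instance

print(f"dedicated: ${dedicated:.2f}/core, cloud: ${cloud:.2f}/core")
```

With these assumed numbers the dedicated server comes out at half the per-core cost; plug in real quotes for a meaningful comparison.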
A dedicated server makes all of its resources available to a single client: there is no contention for RAM or processing power, as there is on cloud platforms, which are shared hosting environments. Dedicated servers don’t run a hypervisor or guest operating systems, which can consume a considerable chunk of a physical server’s resources.
There are also built-in performance limitations with cloud platforms that simply aren’t an issue for dedicated hardware.
Let’s take a look at one such limitation.
Dedicated servers typically have their storage media attached over very fast internal buses. Cloud platforms work differently: storage is connected to server instances over a Storage Area Network (SAN). Hundreds of cloud servers may share the same SAN, and therein lies the problem: the bandwidth connecting those servers to their storage is shared, so each server competes with its neighbors on the same network connection.
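The contention effect is easy to sketch with fair-share arithmetic. The 10 Gbit/s uplink speed and the server counts below are illustrative assumptions, not figures from any particular platform:

```python
# Back-of-the-envelope sketch: each server's fair share of a SAN uplink
# shrinks as more servers on the same link become active.
def per_server_bandwidth_mbps(link_gbps, active_servers):
    """Fair-share bandwidth in Mbit/s if active servers split the link evenly."""
    return (link_gbps * 1000) / active_servers

print(per_server_bandwidth_mbps(10, 1))    # one server: the full link
print(per_server_bandwidth_mbps(10, 100))  # heavy contention: a hundredth of it
```

Real SANs are more sophisticated than an even split, but the underlying constraint is the same: shared bandwidth divided among however many tenants are busy at once.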
As has often been pointed out, cloud servers may be prone to latency because of degraded IO performance. That simply isn’t an issue for dedicated servers.
It is possible to “upgrade” the IO performance on cloud platforms, but it is so expensive to do so that any cost benefit is substantially reduced.
Transparency And Control
Developers like to know what is happening with their hardware, especially when they suspect that their application’s performance issues are caused by the network or the server rather than their code. On a cloud platform, there are so many layers of abstraction between the cloud server and the hardware that it’s almost impossible to debug performance issues. Beneath the hypervisor, cloud users have no insight or control.
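One of the few hypervisor-level signals a cloud guest can observe is CPU “steal” time, reported on Linux in `/proc/stat`: ticks during which the hypervisor ran other guests instead of yours. The sketch below parses a made-up sample of the aggregate `cpu` line; on a real Linux guest you would read the first line of `/proc/stat` instead.

```python
# Sketch: compute CPU steal time from a Linux /proc/stat "cpu" line.
# High steal suggests hypervisor contention. The sample line is invented.
sample = "cpu  74608 2520 24433 1117073 6176 4054 0 8362 0 0"

fields = sample.split()
# Field order in /proc/stat: user nice system idle iowait irq softirq steal ...
names = ["user", "nice", "system", "idle", "iowait", "irq", "softirq", "steal"]
ticks = dict(zip(names, map(int, fields[1:9])))

total = sum(ticks.values())
steal_pct = 100.0 * ticks["steal"] / total

print(f"steal: {steal_pct:.2f}% of CPU time")
```

On dedicated hardware, steal time is always zero; there is no hypervisor to take the cycles.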
Should The Cloud Be Avoided?
Cloud platforms are beneficial in some scenarios. If you want to quickly deploy a development or testing server, use a cloud or virtual private server. If your app requires an elastic platform — one that can deploy and delete servers in minutes — choose a cloud platform. But if you want the most cost-effective, reliable, and predictable platform with the best performance, consider using a dedicated server for your application.