The way we use the web has changed beyond recognition since its inception two decades ago. In the mid-90s, a webpage was a simple collection of HTML and static assets, almost all of which could be relied upon to reside on the same server. Loading a webpage was a simple matter of establishing an HTTP connection with that server, which would send the necessary files. A modern web application functions quite differently, with dozens of assets spread across multiple servers to be delivered at different times. HTTP 1.1, which was designed for bulk data transfers, does not perform well when tasked with transferring many small files from different locations.
In our Black Friday promotion, all new Virtual Private Server accounts are eligible for a 25% discount with the promotional code: BF25.
As the holiday season approaches, it may seem like 2013 is winding down and it’s time to leave new projects until the New Year, but, in fact, December is the perfect time to begin laying the foundations of next year’s projects.
That’s why we’re excited to be offering new customers a 25% discount on all of our Virtual Private Server plans. A VPS is the perfect platform for developing, testing, and eventually hosting a website or application. If you’re going to need extraordinary hosting for years to come, with unparalleled support and unbeatable reliability, check out our virtual private server hosting plans, but before you do, here’s why we think you’ll love our VPS plans.
User authentication presents a number of problems for web developers. As the web has become richer, moving from static sites to interactive services, the need for identifying users has become prevalent. In theory, the problem is not a difficult one to solve: the user presents an identifying token, a username, for example, and a shared secret such as a password, which are matched against entries in a user database. Unfortunately, in practice, there’s a lot that can go wrong, from insecure transmission of tokens to the database breaches we hear about all too often. In fact, many security experts advise against sites attempting to implement their own authentication procedures if it can be avoided. There is too much at stake and the chances of making a mistake are too high to risk it.
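The match-against-a-database step described above should never compare raw passwords; the standard defense is to store only a salted, slow hash. Here is a minimal sketch of that idea using Python’s standard library (the function names and iteration count are illustrative, not a production recommendation):

```python
import hashlib
import hmac
import os

def hash_password(password: str, iterations: int = 200_000) -> tuple[bytes, bytes]:
    """Derive a storable hash from a password using PBKDF2-HMAC-SHA256."""
    salt = os.urandom(16)  # unique per user, stored alongside the hash
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password: str, salt: bytes, stored: bytes,
                    iterations: int = 200_000) -> bool:
    """Re-derive the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, stored)

salt, stored = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, stored))  # True
print(verify_password("wrong guess", salt, stored))                   # False
```

Even a sketch like this shows why experts advise caution: the salt, the slow key-derivation function, and the constant-time comparison are each easy to omit, and omitting any one of them weakens the whole scheme.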
Single sign-on services offer an alternative to self-designed log-in systems. Anyone with a social media account is familiar with how single sign-on services work. In that case, the social media platform acts as the identity provider, verifying the identity of signed-on users for the service provider. Single sign-on provides a number of benefits, but it isn’t an unproblematic authentication mechanism.
The Benefits Of Single Sign-On
The obvious benefit to developers of using a single sign-on service is that they merely have to implement the code that links their service to the authentication provider’s, Facebook Connect, for example. That process is much less complex and time-intensive than building an authentication system from scratch. It’s also much less likely to result in a flawed authentication system: Facebook and the other SSO providers are likely to have significantly more resources to invest in getting it right than the average web service startup. An additional advantage is that web services don’t have to provide their own support for lost or forgotten usernames and passwords.
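Most modern SSO integrations, Facebook Connect included, are built on an OAuth-style authorization flow: the first step is redirecting the user to the identity provider with your client ID, a callback URL, and an anti-CSRF state token. The sketch below shows just that first step; the endpoint, client ID, and redirect URI are hypothetical placeholders, and a real integration uses the values issued by the provider:

```python
import secrets
from urllib.parse import urlencode

# Hypothetical values -- a real integration uses the credentials and
# authorization endpoint issued by the identity provider.
AUTH_ENDPOINT = "https://idp.example.com/oauth/authorize"
CLIENT_ID = "my-app-client-id"
REDIRECT_URI = "https://myapp.example.com/sso/callback"

def build_authorization_url() -> tuple[str, str]:
    """Build the URL the user is redirected to, plus the state token the
    app must store and re-check when the provider redirects back."""
    state = secrets.token_urlsafe(16)  # anti-CSRF token, kept in the session
    params = {
        "response_type": "code",   # request an authorization code
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "scope": "openid email",
        "state": state,
    }
    return f"{AUTH_ENDPOINT}?{urlencode(params)}", state

url, state = build_authorization_url()
print(url)
```

The provider authenticates the user, then redirects back to the callback URL with a short-lived code that the service exchanges server-to-server for the user’s identity, so the service never handles the user’s password at all.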
Google wants the web to be faster. A faster web means a better experience for the billions of users that head from its search engine results pages out into the wider web every day. That translates to more page views for web masters and more advertising revenue for them and for Google.
Mod_pagespeed is an Apache module that implements a number of technologies to speed up page-load times. The module performs a series of best practice optimizations by filtering the content of a page, rewriting some of it, and altering how other parts are served, all with the aim of making a page load faster (and appear to load faster).
Mod_pagespeed and its sibling for the Nginx web server, ngx_pagespeed, have been around for several years, seeing incremental improvements to the ways in which they speed up page loads. The most recent version brings new techniques that in some cases can lead to a 2x increase in the speed at which pages render.
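Once the module is installed, enabling it and choosing which rewriting filters to apply takes only a few lines of Apache configuration. The directive names below come from the mod_pagespeed documentation, but the filter set shown is only a sketch and should be tuned per site:

```apache
<IfModule pagespeed_module>
    ModPagespeed on
    # Rewriting filters: inline small stylesheets and combine CSS files
    # to reduce the number of requests a page needs.
    ModPagespeedEnableFilters inline_css,combine_css
    # Strip whitespace and comments from the served HTML.
    ModPagespeedEnableFilters collapse_whitespace,remove_comments
</IfModule>
```

A sensible approach is to start with the default "core" filter set and enable additional filters one at a time, verifying that each one doesn’t break the site’s rendering.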
If any of you have ever ventured into the basement of a small or medium business over the past couple of decades, you’ll have no doubt noticed a creaky old piece of equipment lurking down there, faithfully — or quite often, not faithfully — managing telephony for the business.
That machine is a PBX, and it’s about time we sent the PBX the way of the telex and the rotary telephone: “Thanks for your service, but you’re expensive to buy, a pain to maintain, and we have something that works much better.”
Anyone who has ever used Skype knows what VoIP is. Instead of sending phone calls directly over the lines, conversations are converted into data packets and transmitted over the Internet. The result is a much more flexible, economical, and reliable way to manage a business’s telephony infrastructure.
Google Webmaster Tools Introduces Security Issues Feature, Provides Expanded Malware And Spam Information
It’s a nightmare scenario for webmasters. You wake up one morning to an inbox full of worried users complaining that Google Chrome is showing them a big red malware warning when they try to visit your site or an email from Google itself via Webmaster Tools letting you know that they’ve identified malicious code on some of your pages.
Obviously, that can be bad for a site’s reputation, and it can seriously impact traffic as Google stops sending search users its way. But the biggest concern is ridding the site of malware and removing the vulnerability that allowed hackers to place malicious code on the site in the first place. I’ve heard many stories of frantic webmasters trying to clean out their site, only to be told again and again that they have failed to remove the malware.
While a managed server running Linux is very secure compared to the alternatives, no operating system is invulnerable.
When it comes to choosing an operating system to use on a server, security is of paramount importance. General scuttlebutt has it that Linux is vastly more secure than alternatives — alternatives that aren’t based on Unix, at least. To a degree, this is true, but it’s far from the case that Linux is completely invulnerable. To think otherwise is to risk taking a complacent attitude to server security, which can lead to server owners being taken unawares when they end up being hacked.
We’re going to cut through the misleading flimflam promulgated by fanboys and the ill-informed so that you can make informed decisions about the security issues that come with running a Linux server.
The shift in computing habits from desktop to mobile over the last few years has been nothing short of staggering. It may be hyperbole to talk about the post-PC world, but we’re rapidly approaching the time when the majority of Internet use occurs on mobile devices. Some countries are already there, particularly in the developing world, where infrastructure deployment for wireless access outstrips traditional wired connections.
Earlier this year, it was reported that over 40 percent of Internet time in the US was spent on mobile devices. The growth of mobile devices as a primary point of access to the Internet presents numerous business opportunities for web developers and designers, as well as mobile app developers. However, at the current rates of growth, demand is going to outstrip available bandwidth. The technology currently in use and the spectrum available for wireless communications are limited. Anyone who has attended a conference where large numbers of people are trying to get online, either through the venue’s Wi-Fi or over 3G and 4G connections, can attest to the fact that bandwidth is a finite resource.
Dropbox is an enormously useful service that has all sorts of different uses for businesses and individuals alike. In principle, what it does is fairly simple, syncing one or more folders to a cloud service and from there to as many devices as users choose to connect. That ability, coupled with the ubiquity of mobile devices and the need to share data between numerous different people, has made Dropbox one of the cornerstones of the cloud computing revolution.
However, not everyone is satisfied with the idea of sharing their data with a third-party service and relying on that service to keep it safe. While Dropbox is relatively secure, that’s often not good enough for businesses that have to adhere to regulatory requirements, want to keep their data private, or simply dislike the idea of putting all their eggs in one basket under someone else’s control.
BitTorrent Sync is a new service from the minds that created the BitTorrent file sharing protocol, and it offers a way to sync folders without the need for a third-party service. One drawback of BitTorrent Sync is that, without a cloud component, there is no way to continue syncing data if the connected devices are lost, stolen, or simply turned off. However, if you set up BitTorrent Sync on a Virtual Private Server, you get the best of both worlds: a private, always-on syncing service over which you have complete control.
The technology works similarly to the BitTorrent peer-to-peer protocol, with each of the connected devices acting as both a server and a client to efficiently transfer data. One of the most important aspects of the Sync protocol is its security. BitTorrent Sync encrypts all transfers between devices with an AES cipher and a 256-bit key. BitTorrent Sync works very well on a Linux VPS and has client software for Linux, Windows, Android, and iOS.
In addition to straightforward syncing of data between multiple devices, the service also has some handy additional features for security-conscious users, including read-only access, where data will sync to a device but changes made on that device will not affect other devices, and one-time secrets, which provide single-use access to a folder.
If you want to set up your own Dropbox replacement, you’ll need a Linux Virtual Private Server with the appropriate version of BitTorrent Sync installed. There’s a great tutorial for setting up Sync with Linux, and you can find full instructions for using the service on the BitTorrent Sync site.
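On a headless VPS, BitTorrent Sync is driven by a JSON configuration file. The sketch below follows the general shape of the sample config shipped with the Linux build; the paths and device name are placeholders, and the folder secret is generated by the btsync binary itself rather than chosen by hand:

```json
{
  "device_name": "my-vps-sync-node",
  "listening_port": 0,
  "storage_path": "/home/sync/.sync",
  "shared_folders": [
    {
      "secret": "GENERATED_FOLDER_SECRET",
      "dir": "/home/sync/shared",
      "use_relay_server": true,
      "use_tracker": true
    }
  ]
}
```

Because the VPS is always on, any device holding the same folder secret can sync against it at any time, which is exactly the role the cloud plays in Dropbox-style services.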
For most modern businesses, data is either their product or an essential component in the design, creation, sale, and marketing of their products. Catastrophic data loss almost always means lost revenue. Every system administrator, IT technician, and executive knows that maintaining regular backups is strictly necessary for ensuring business continuity.
A recent study found that data loss accounted for $400 million in lost revenues annually, and, troublingly, showed that much of that data loss was preventable. A survey conducted by Carbonite revealed that half of small businesses are hit by data loss, with inadequate backups being the most frequently cited cause. Furthermore, many of the businesses that suffer irretrievable data loss are immediately put out of business, with a significant proportion failing within two years. Having adequate and well-tested backups is a crucial part of business continuity planning.
However, backups come in various forms. Traditionally, backups have been made daily, weekly, or even monthly. In the event of a hardware failure, data created in the period between backups can be lost, and frequently that means work done in that period is rendered worthless. Lost data equates to lost man-hours and lost opportunities for revenue.
There is an alternative to atomic periodic backups: continuous data protection. Unlike traditional backups, continuous data protection (CDP) strategies use an approach that results in much more finely grained backups.
CDP works by incrementally capturing changes to data as they are made, rather than gathering the sum of those changes and creating a copy of the original data. Capturing just the deltas results in numerous benefits, including the ability to roll back to a previous state, lower bandwidth requirements, and more efficient use of backup storage: there’s no need to replicate 1 GB if only one byte has changed on the disk.
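The delta-capture idea can be illustrated with a toy block-level example: split the data into fixed-size blocks, hash each block, and store only the blocks whose hashes have changed since the last snapshot. This is a deliberately tiny sketch, not how any particular CDP product is implemented, and real systems use blocks of kilobytes rather than bytes:

```python
import hashlib

BLOCK_SIZE = 4  # tiny for the demo; real systems use 4 KB or larger blocks

def block_hashes(data: bytes) -> list[str]:
    """Hash each fixed-size block so unchanged blocks can be recognized."""
    return [hashlib.sha256(data[i:i + BLOCK_SIZE]).hexdigest()
            for i in range(0, len(data), BLOCK_SIZE)]

def capture_delta(old: bytes, new: bytes) -> dict[int, bytes]:
    """Return only the blocks that changed: {block_index: new_bytes}."""
    old_h, new_h = block_hashes(old), block_hashes(new)
    delta = {}
    for i in range(len(new_h)):
        if i >= len(old_h) or new_h[i] != old_h[i]:
            delta[i] = new[i * BLOCK_SIZE:(i + 1) * BLOCK_SIZE]
    return delta

old = b"AAAABBBBCCCC"
new = b"AAAABxBBCCCC"          # one byte changed, in the middle block
delta = capture_delta(old, new)
print(delta)                   # only block 1 is captured
```

Even in this toy, a one-byte change produces a delta of one block rather than a full copy, which is exactly why CDP needs so much less bandwidth and storage than repeated full backups.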
Continuous data protection is often used with MySQL to prevent data loss. CDP for MySQL is an effective method of guarding against data loss caused by hardware failure, and it ensures minimal business interruption even in the case of a catastrophic failure.
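A common building block for CDP-style recovery in MySQL is the binary log, which records every change to the data and allows the server to be rolled forward from the last full backup to any point in time. A minimal my.cnf sketch might look like this (the paths and retention period are illustrative):

```ini
[mysqld]
# Record every change so the server can be rolled forward
# from the last full backup to any point in time.
log_bin       = /var/log/mysql/mysql-bin.log
server_id     = 1
binlog_format = ROW
# Keep a week of logs; full backups must happen at least this often.
expire_logs_days = 7
```

With the binary log enabled, point-in-time recovery is a matter of restoring the last full backup and replaying the logged changes up to the moment just before the failure.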
CDP is not a replacement for atomic backups and should not be relied on as a business’s sole backup method. There should always be multiple backups of important data, including on-site and off-site backups in concert with continuous data protection, but CDP provides an additional level of protection and the assurance that data loss can be limited to a very small period of time.
Future Hosting’s Future Protect automated backup solution uses continuous data protection technology to ensure that clients’ data is always available and up-to-date in the case of a node failure.