The web runs on PHP. The most popular content management systems, including WordPress, are PHP. The most widely used eCommerce applications are PHP, including Magento and WooCommerce. If your business operates a custom web application, it’s probably built on PHP (although Node and other modern server-side languages are making inroads). What’s more, many of these websites, eCommerce stores, and web applications use PHP 5.6, which reaches the end of its life when 2018 comes to a close.
Category Archives: Web Hosting
All web hosting is an allotment of resources on an internet-connected server. What differs is how that allotment is carved out from the server’s resources and the software that does the carving. Both impact the capabilities and features of a web hosting account. It is awkward to move between types of web hosting once a site is established, so it pays to understand their benefits and limitations.
There are three basic hosting options for the server-side component of SaaS applications: colocation of owned hardware, dedicated server hosting, and the cloud. Each has its advantages, but there’s a tendency for SaaS developers to turn to cloud platforms without giving much consideration to using bare metal.
A couple of months ago, I wrote about the dangers of using old versions of PHP. Web hosting companies that provide customers with out-of-date software are a liability, both to themselves and the web. My focus was on hacked web applications, but it can get much worse than that. In June, a story broke that shows exactly why web hosting customers should make sure their hosting company provides an up-to-date software stack.
Korean web hosting company Nayana has agreed to pay attackers $1 million to retrieve the data of 34,000 web hosting clients on 153 Linux servers that were attacked by ransomware. Although it’s not yet clear how the attackers compromised the servers, it appears likely that ancient versions of the Linux kernel, the Apache web server, and other software may be to blame.
If you’re starting a small eCommerce store, website, or content business, you have two options for hosting: shared hosting and virtual private servers. There are many other forms of hosting, but it’s unlikely you’ll need the power of a dedicated server or the complexity of a cloud platform until your business grows considerably.
Your decision can affect the future of your business, so it’s worth taking some time to think about the implications of choosing shared hosting or a VPS.
Update: We’ve added four more cool things you can do with a VPS!
A virtual private server is a great platform for web hosting, but it’s also useful for many other things. A VPS is an always-on, always-connected server capable of running anything that can be run on a standard Linux box (provided it doesn’t need a GUI). That makes it useful for any number of projects.
Let’s take a look at some of the things you might consider doing with yours – any of which can be done in addition to hosting a website.
The open-source nature of Linux means that there exists a downright staggering selection of different distributions, each one designed with a slightly different purpose in mind. While one of the platform’s greatest strengths, it could also be considered one of its most glaring weaknesses, as new users might feel somewhat overwhelmed by the sheer volume of choices available to them. This holds particularly true when you’re trying to select a Linux distro for your server, as choosing the wrong one could have dire consequences.
Picking between the various Java application servers can be a challenge for developers who are new to developing online Java apps and Java-based sites. The big three are Tomcat, Glassfish, and JBoss. All of them are excellent platforms upon which to develop and deploy applications, but they have different strengths. Making the wrong choice can result in more work than necessary, so we’re going to cut through the confusion with a quick guide to which application server will best suit your needs.
In 2013, the JBoss application server was renamed WildFly, but the old name is still widely used, especially by those running older versions, so we’ll stick to calling it JBoss for the moment.
Those choosing a Linux distribution for their virtual private server or dedicated server have an almost limitless number of options. Because the Linux ecosystem is open source, anyone with the necessary skills can build and release their own distribution. There’s even a distribution called Linux From Scratch, which is a set of instructions for putting together a distribution from source — although I wouldn’t recommend anyone use it for their web hosting server.
In spite of the number of Linux distributions, two are dominant on servers: CentOS and Ubuntu Server. They are both excellent choices, but when choosing between them it’s useful to know the ways in which they are different. I want to have a quick look at the origins of each and the differences between them.
We’re offering a $50 per month reduction in the price of one of our most popular dedicated server plans. The E5-2420 dedicated server will be available to new clients for just $100 a month from November 29th using coupon code HEXCORE50.
Only 300 servers are available as part of the promotion, and we expect them to sell like hotcakes.
There are two options for web hosting clients who need more resources than a shared hosting account can offer: a Dedicated Server or a Virtual Private Server (VPS). Let’s start by taking a look at how Dedicated Servers and Virtual Private Servers are similar, before investigating the differences that will help you choose.
Both Dedicated Servers, often called bare metal servers, and Virtual Private Servers offer a full server environment. A server is essentially a computer like your laptop, except that servers are specialized for tasks like hosting websites or web applications. Both Dedicated and Virtual Private Servers have a set of resources and an operating system. Both allow a user to install software of their choice.
New web hosting clients are often confused by the way the domain name system and the domain registration system work. More specifically, they’re confused that there is a difference between registering a domain name and actually linking it up with their site via a DNS hosting service.
I’d like to take a look at the three services that work together to ensure that when a user puts a web address into their browser, the appropriate web site appears. Those three services are web hosting, the domain registration system, and DNS, the Domain Name System.
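The last link in that chain, DNS resolution, can be seen from code. Here's a minimal sketch using Python's standard library (the hostnames are illustrative; a real site's name would be answered by whichever DNS host the registrar points at):

```python
import socket

def resolve(hostname):
    """Ask the system's DNS resolver for an IPv4 address.

    This is the final step the post describes: the domain was
    registered with a registrar, the registrar delegates to DNS
    servers, and those servers answer lookups like this one.
    """
    return socket.gethostbyname(hostname)

# "localhost" is resolved locally from the hosts file, so this runs
# without touching the network; try your own domain to see a real lookup.
print(resolve("localhost"))
```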
Most medium and small business websites are hosted on the Linux operating system. Linux is a (usually) free platform composed of open source software — including the Linux kernel and GNU tools — created by thousands of development projects over many years. In the modern web hosting world, building hosting plans on a Linux distribution like CentOS is almost the default choice.
But that doesn’t mean there aren’t other options, including Microsoft Windows. Windows is better known as a desktop operating system, but Microsoft also develops an excellent suite of tools that include a server version of their operating system and a web server.
Your site’s files are stored on a server in a data center. When a browser requests a page, it connects to the web server — usually via lots of other machines like routers and switches — which will send the browser the necessary files. If the browser is close to the server, it won’t take long for the files to traverse the internet. If the browser is on a computer on the other side of the world — or even just the other side of the country — it might take a relatively long time. The browser has to send requests to the server, and the server has to send data to the browser, a process repeated many times for every page request.
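Some back-of-the-envelope arithmetic shows why those repeated round trips matter. The figures below are assumptions for illustration, not measurements:

```python
def page_load_floor(round_trips, rtt_ms):
    """Lower bound on load time imposed by latency alone:
    every request/response pair costs at least one round trip,
    regardless of how much data is transferred."""
    return round_trips * rtt_ms

# Assume ~30 sequential round trips to fetch a page's HTML,
# stylesheets, scripts, and images (illustrative figures).
nearby = page_load_floor(30, 20)      # browser close to the server
far_away = page_load_floor(30, 200)   # browser on the other side of the world

print(nearby, "ms vs.", far_away, "ms, before any transfer time")
```

The same page takes ten times longer to assemble purely because of distance, which is why hosting location (and CDNs) matter.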
According to The Register’s Trevor Pott, Windows 10 is a resounding “MEH” from a sysadmin perspective. To be fair, Pott likely wrote his blog before Microsoft began aggressively pushing the operating system onto its users, resorting to everything from incessant nagging to outright deceit. If you think Microsoft’s behaving like a purveyor of malware or adware, you aren’t alone in that.
“It appears as if Microsoft designed the Windows 10 upgrade mechanisms in a way that makes it very complicated for users to block the upgrade offer for good on machines running previous versions of Windows,” writes Martin Brinkmann of Ghacks. “This persistence is similar to how malware evolves constantly to avoid detection or come back after it has been removed from operating systems.”
The vast majority of web hosting companies base their hosting plans on CentOS, which is essentially a free clone of Red Hat’s hugely successful Red Hat Enterprise Linux. Web hosts choose CentOS because it provides a stable and secure platform. CentOS is a conservative distribution, with major releases happening only once every few years.
CentOS is a stable foundation for web and application hosting, and much of that stability is the result of the glacially slow — or sensibly cautious, depending on your perspective — rate at which new software versions are incorporated into the distribution’s repositories.
If you ask five different Linux developers which distribution is best for server management, you’re liable to receive five different answers. Because it’s open-source, Linux has one of the most diverse development ecosystems of any operating system on the market. Anyone with the necessary skills and time can code their own distribution (and many do).
From the perspective of a sysadmin, that means that your choices are effectively limitless when it comes to choosing a hosting distro.
That isn’t to say you can just pick any old distribution and spin up your server, of course (that’s something I wouldn’t really recommend). It just means you’ve got plenty of options. On the one hand, that’s pretty great; more choices means you’re that much more likely to find something that’s a perfect fit for you.
When I say “content marketing,” most small business owners hear “blog”. A blog is often the foundation of content marketing strategies. Blogging has a low barrier to entry, and, over time, can prove an excellent search engine optimization, audience building, and social media marketing resource. But blogging is far from the only content marketing strategy within reach of small businesses. Ebooks have also proven enormously successful for businesses of all sizes.
If you’ve been following the tech news of late, you’ll have heard about a serious vulnerability in the Linux kernel that could allow an attacker to gain root access. The media has treated the story with its usual restraint: headlines abound about the vulnerability of millions of servers and Android phones. I’d like to take a more level-headed look at the vulnerability and the impact it might have on web hosting clients.
Many bloggers and small business site owners opt for shared hosting when they first create a new website. Shared hosting tends to be inexpensive, and it often requires little in the way of technical experience to get a site up and running. But once a site generates more traffic, and once the site owners become confident managing their site, it’s time to consider moving to a more capable form of hosting.
First, I want to take a look at why shared hosting is not a good option for moderate to high traffic sites, and then I’ll take a look at what the alternatives are.
If you were a web designer in the mid-to-late 2000s, you’ll remember the problems inherent in creating a site with great typography. Suffice to say that since the advent of web fonts and web font hosting services, we’ve had it good. Every web designer and developer has a choice of tens of thousands of typefaces — many of them free.
Fontdeck was one of the first and most popular of the font hosting services. It gathered together the best fonts from a diverse range of foundries and made them available for a price within reach of most projects. Sadly, Fontdeck recently announced that it will not allow the creation of new accounts from 1st December 2015 and that it will close altogether on 1st December 2016.
PHP developers have any number of options when choosing a development environment, from command-line editors like Vim, through richly featured code editors like Sublime Text and Atom, to IDEs like Zend Studio and PhpStorm. It’s the latter I’d like to take a look at in this article. PhpStorm is a full-featured IDE that has become increasingly popular over the last couple of years, largely because developers have been looking for an alternative to the sluggish performance of the most popular PHP IDEs. Earlier this month, PhpStorm hit version 10, and with it came a boatload of new features to complement its existing feature-set.
Last month, President Barack Obama met with Chinese President Xi Jinping to discuss the strained relations between their two countries – primarily the prevalence of digital espionage.
As you may recall, the two nations have a long history of being at one another’s throats in the digital realm. Neither is entirely blameless in this, of course – while China has made a habit of targeting American businesses with aggressive hacking campaigns, digital espionage in the States is nothing to sneeze at, either. It’s hoped by everyone involved that the pact signed between Obama and Xi will serve to warm relations between the two countries, and make aggressive digital attacks a thing of the past.
Ever since social media marketing became a thing, its practitioners have cast about for a measurement that reflects the value of what they do. Follower counts (along with shares) seem an obvious candidate.
A follower represents someone who has expressed an interest in a brand and its content. The more people who are interested in what a brand shares, the more chances to convert leads into buyers. Follower counts also have a snowball effect. As an account’s follower or friend count increases, so does the likelihood that content will be shared, which will attract even more followers. And they have a reputational effect; a brand with 100,000 followers is clearly more awesome than a brand with 10,000 followers, and we can look with pity on the businesses with sub-1000 follower counts — clearly they won’t be around for long.
Black Friday and Cyber Monday are almost upon us, and to mark the occasion we’re slashing the price of our Virtual Private Server and Dedicated Server plans by 75% for the first month.
The promotional price applies across our range of VPS plans, including managed and unmanaged servers, Pure SSD VPSs, Hybrid Servers, and our SaaS Hosting plans. All Dedicated Server plans are included too.
CSS has numerous limitations. You can build any website it’s possible to imagine in CSS (with the addition of various other technologies), but CSS doesn’t make life easy for developers. It’s too easy to make mistakes, to build massively complex style sheets that are a nightmare to maintain, and to end up with a mess that no designer can properly get their head around.
Tools like Sass were created to mitigate some of the problems with CSS. They add features like mixins, variables, and mathematical notation that make it easier to write elegant and maintainable stylesheets, which can then be processed into valid CSS browsers can understand.
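The preprocessing idea is easier to see in miniature. This is a toy sketch of variable substitution (Python, purely for illustration; Sass itself is not implemented this way and offers far more than simple text replacement):

```python
def compile_vars(source, variables):
    """Toy preprocessor pass: replace $name tokens with their values,
    the way a preprocessor expands variables into plain CSS."""
    for name, value in variables.items():
        source = source.replace("$" + name, value)
    return source

# Define a brand color once; reuse it everywhere in the stylesheet.
sheet = "a { color: $brand; } .btn { background: $brand; }"
print(compile_vars(sheet, {"brand": "#c0ffee"}))
```

Change the variable once and every rule that uses it updates, which is the maintainability win preprocessors deliver.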
WordPress and most other popular content management systems store content in a relational database like MySQL. While there are advantages to using a database, it adds a layer of complexity that isn’t necessary for many sites. An alternative is to simply store content, template, and configuration files on the server’s filesystem. Content management systems that take this approach are known as flat-file CMSs. Over the last couple of years, a number of flat-file CMSs have entered the market, most notably Statamic, which now has competition from Grav, a new flat-file CMS from RocketTheme.
Small businesses are usually started by people who are passionate about doing one thing well. Whether it’s making the best cupcakes, taking awesome photos, or building a service that empowers users, we start businesses because we see a market for something that we can do well — perhaps better than anyone else.
Once it’s time to move from idea to execution, small business owners have a problem. It’s not enough to be great at what you do, you also have to let potential customers know you’re great. We turn to web hosting and web design, to social media, and to marketing to get the word out and build a customer base. But there’s the rub: being able to bake delicious cupcakes won’t make you a good marketer.
Technically, you don’t need a framework. You can easily handle the ins and outs of development on your own. You can create your own libraries, download independent modules, and tweak free-flowing code however you see fit.
The question is, why would you really want to?
“A framework is not absolutely necessary: it’s “just” one of the tools that is available to help you develop better and faster,” reads a post on the Symfony Blog. “Better, because a framework provides you with the certainty that you are developing an application that is in full compliance with business rules, is structured, and both maintainable and upgradeable. And faster, because it allows developers to save time by reusing generic modules so they can focus on other areas.”
First things first, why 4.0.0? Most users of Node are using a version that hasn’t yet hit 1.0. I don’t want to get far into the weeds on this issue, but the nutshell explanation is that last year Node was forked into a couple of competing versions: Node.js, which was overseen by Joyent, the company that had sponsored Node’s development from the start, and io.js, which was a community project. The fork occurred because the community was unhappy with the pace of development.
The question of whether to invest in the development of a native application has vexed publishers over the last few years. I’ll lay my cards on the table: I’m in favor of the web, but in this article I’d like to take a look at both the pros and the cons of each choice — web or native?
What Are The Benefits Of A Native Application?
The most obvious benefit of native applications is their access to device features that aren’t available to web applications and sites. For publishers, most of those features are irrelevant (the camera and sensors), but one in particular is important: native apps are faster than the web.
In the early years of software development, applications were complex monoliths with logic and UI inextricably entangled. In the late 70s, the designers of the Smalltalk programming language developed a different way to architect applications — one that separates the “business” logic of software from the elements that deliver the interface to the user and accept their instructions.
That principle of software architecture — MVC, or Model-View-Controller — later became the design pattern for the vast majority of web applications and the frameworks on which they are based. PHP frameworks like CodeIgniter, CakePHP, and Laravel are designed to make building MVC-style applications as easy as possible.
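The separation is easiest to see in miniature. Here's a hedged sketch of the three roles (Python for illustration; the class and method names are invented, not taken from any of the frameworks mentioned):

```python
class Model:
    """Business logic and data; knows nothing about presentation."""
    def __init__(self):
        self.posts = ["Hello, world"]

    def add_post(self, title):
        self.posts.append(title)

class View:
    """Turns model data into output for the user."""
    @staticmethod
    def render(posts):
        return "\n".join("* " + p for p in posts)

class Controller:
    """Accepts the user's instructions and mediates between model and view."""
    def __init__(self, model, view):
        self.model, self.view = model, view

    def create(self, title):
        self.model.add_post(title)
        return self.view.render(self.model.posts)

app = Controller(Model(), View())
print(app.create("MVC in miniature"))
```

Because the model never touches presentation, you can swap the view (HTML, JSON, plain text) without rewriting the business logic, which is the point of the pattern.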
One can’t help but notice the irony here.
As one of the most popular image hosting sites on the web, Imgur’s got everything from movie screenshots to unusual thrift store purchases to how-to guides to awesome scenery. There’s plenty of NSFW content there too, of course (no, we aren’t going to link to it). Mostly, though, it’s pictures of cats.
Turns out, however, that for a short time, there was something else on the platform too: malicious code.
Since it was released four years ago, Bootstrap has taken the web design world by storm. So much so that a bare Bootstrap site has come to be a byword for quick and easy — or lazy — design.
Bootstrap’s popularity stems from its easy-to-use grid, but in the modern web design space, you can’t throw a stick without hitting a grid framework. Its popularity is maintained because of its plethora of extra features, from sane CSS defaults to user interface elements that make it relatively straightforward to build a range of common site layouts.
There are two major groups of performance optimizations that can be implemented on websites (if we set aside network optimizations). The first reduces the weight of a page — compressing HTML files and optimizing images are examples. The second doesn’t change the amount of data that the site sends to browsers. Instead it involves taking control of when elements on the page load to improve users’ perceptions of load times.
Lazy loading is an example of optimizing perceived performance. Modern sites tend to be image heavy — designers favor large full-bleed images that inflate the size of their pages. For long pages, that means a significant amount of image data has to be downloaded before the page becomes usable.
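The decision a lazy loader makes can be sketched in a few lines. This is a toy model (real implementations run in the browser and react to scroll events; the pixel figures here are assumptions for illustration):

```python
def images_to_load(image_positions, scroll_top, viewport_height, margin=200):
    """Toy lazy-loading decision: fetch only images at or near the
    visible region, defer the rest until the user scrolls them into
    view. Positions are pixel offsets from the top of the page."""
    visible_limit = scroll_top + viewport_height + margin
    return [pos for pos in image_positions if pos <= visible_limit]

# A long, image-heavy page: an image every 800px down an 8000px page.
positions = list(range(0, 8000, 800))
initial = images_to_load(positions, scroll_top=0, viewport_height=900)
print(len(initial), "of", len(positions), "images fetched up front")
```

Only the images near the top are fetched on first paint, so the page becomes usable long before all the image data arrives.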
In the early days of the web, everyone used HTML tables for layouts. At the time, it made sense. Print media has always made heavy use of grid layouts. The HTML table element was a straightforward way to bring some of the power of print media’s grids to web design. As we all know now, using HTML tables for page layout isn’t a good idea, and any designer who has been in the business long enough will remember tables nested within tables nested within tables, all of which was muddled up with the page’s content.
When CSS came along, we learned to separate design and content. Unfortunately, CSS’ layout tools were not great — they relied on low-level fiddling with nested divs, absolute positioning, and floats. The box model is not intuitive, and many a designer has lost hair trying to figure out why their divs refused to appear in the right place.
I’m an admirer of Ev Williams. When he founded it, Blogger was an innovative platform — one of the first to bring accessible online publication to non-technical users. I use Twitter — also co-founded by Williams — dozens of times a day. His most recent project, Medium, continues his commitment to make publishing online as easy and elegant an experience as possible.
As a PHP developer, it can be difficult to decide which framework to use – there is, after all, a rather extensive list of them. Today, we’re going to help you sort through that daunting selection. We’ll be going over some of the most popular PHP development frameworks on the market and taking a look at their strengths and weaknesses.
Once you’ve learned a little about what each framework can do, you’ll be better equipped to determine which is the best option. Now, at this point, there’s one thing worth mentioning – although no two PHP frameworks are created equal, and every framework has certain projects it’s better suited for, the majority can be used for whatever projects you desire. Which you choose is largely a matter of how you code and what you want to accomplish.
This year will be the twentieth anniversary of PHP’s first release. Back in 1995, its creator, Rasmus Lerdorf, could have had no conception of the impact it would have. What was originally called Personal Home Page Tools is now the foundation of much of the modern web. WordPress alone accounts for 24% of all websites. Publishers, eCommerce merchants, and businesses the world over rely on PHP every day. Many thousands of developers make their living from PHP. All of which means that the imminent release of a new major version of PHP is a big deal.
PHP 7, which is currently in beta and is expected to have its final release at the end of this year, brings some significant changes. But first, why PHP 7? The current stable version of PHP is 5.6. There will be no public release of PHP 6, which briefly existed as a development project, but was abandoned before it reached completion. In order to avoid confusion with the aborted version, the new release will skip 6 and go straight to 7.
The number of freelance businesses has increased enormously over the last few years: developers, writers, photographers, designers, customer service operatives, personal assistants, eCommerce retailers — the list of jobs amenable to the freelance business model is long and growing.
Many freelancers find work through sites like oDesk, but once they’ve established a reputation, much of it will come from referrals — do a great job for a client and they’ll refer someone else.
Not so long ago, you could count the number of generic top-level domains on your fingers and toes. The domain names of most sites ended in “.com,” “.net,” “.org” or one of a few other familiar domains. Added to that were the several dozen country-code top-level domains, most of which are localized to a geographic region by search engines; sites with those domains would rank higher in their specific region, all things being equal. And then there were the country-code domains which were treated like ordinary generic domains in search: “.tv,” “.me,” and “.bz” are in this category.
The way these domains were treated in search was well understood, and, in spite of the usual flimflam from the less honorable parts of the SEO industry, few site owners were likely to buy or avoid a domain name with the misconception that it would harm or benefit their site’s performance in the SERPs.
Web fonts opened up a new world of typographic possibilities for web designers. No longer did we have to rely on a handful of “web safe” fonts. We could choose from thousands of fonts offered by services like Google Fonts and Typekit. We could experiment with fonts to our hearts’ content, finding the perfect combinations without having to spend big money on licenses. In short, web fonts are awesome.
However, web font services are not without their limitations, one of the most pernicious of which is the dreaded Flash Of Unstyled Text (FOUT). We’ve all experienced this: we load up a web page, and for just a second, we see an ugly fallback font, which is quickly — or not so quickly — replaced by the designers’ chosen font.
In this article, I’d like to have a look at what Node.js is for, its major benefits, and why you might want to think about adding yet another tool to your web development kit.
Small and medium businesses are the most important employers and money generators in the international economy. In terms of economic impact, the small, sustainable manufacturers and service providers who sell to limited markets make a big difference. 98% of US businesses have fewer than 20 employees. The Apples, Googles, and Ubers of the world make the headlines, but the real economic powerhouses are the thousands of little businesses that no one but their customers has ever heard of.
If you follow the startup scene at all, you’ll have noticed that hardly anyone is aiming at small and sustainable. They want the next Google or Facebook. They want to ride the next unicorn to an astronomical payout. But most companies will not be the next Google or Facebook. There’s an inherent contradiction here: most companies will never be huge, and yet thousands of startups every year choose a business model that can only work at scale: free to the user.
If you’re a follower of this blog, you’ll know that I’m a fan of static site generators like Jekyll. Creating a basic Jekyll site is very simple: Jekyll will do the heavy lifting for you. But there’s more to a modern website than a basic scaffold of folders and files. Most depend on a number of external libraries, frameworks, and other tools.
The process of beginning a new project often involves heading to the sites of tools like Bootstrap, Foundation, and jQuery to grab the most recent version, unzipping them and dropping them where you want them in your project folder. That doesn’t seem especially onerous, and it isn’t, but it can be a drag when you do it for every new project.
We’ve talked about some of the leading static site generators several times on this blog. I’m a big fan of the way they provide a framework for people to learn about the technology underlying the web, without making them do all the hard work of managing a non-CMS controlled site.
They offer a happy medium: once a static site generator and its site are configured, adding new content can be as simple as dropping a text file in a directory. It’s often simpler than publishing to a full-fledged CMS like WordPress, with all its bells and whistles. But in creating the site, bloggers and site owners have to learn about the underlying technology of HTML and CSS and develop an intimate understanding of how their site is created.
Freelancing is more popular than it has ever been. The number of people creating and managing their own micro-businesses has grown sharply since the financial crisis of a few years ago and shows no sign of waning as the economy recovers. A recent survey showed that there were more than 53 million freelance workers in the US alone.
It’s hardly surprising: the autonomy accorded to freelancers is unlikely to be matched by more traditional employment — even if freelancers work just as long hours. The Internet has made it possible for freelancers to be “present” wherever they are needed.
Historically, call and contact centers have been notoriously expensive and complex to build and manage. It’s not just the telecoms technology — which is complicated enough on its own — it’s also the software to manage dozens of calls, to feed operators a stream of contact data, to handle complex dialing scenarios, to provide intelligent call scripts with automatically filled placeholders, and so on.
A contact center, whether it’s used for promotion, product support, or customer feedback, is something that smaller businesses of all kinds, from manufacturers to realtors to software developers, would find useful. But few want the expense and hassle of building a solution from scratch or buying an expensive proprietary software suite.
It wasn’t so long ago that Laravel was the new kid on the block – an irrelevant upstart of a framework that couldn’t possibly hope to challenge the then-titans: Symfony, CakePHP, and CodeIgniter. Oh, how the times have changed. When 2013 rolled around, Laravel almost immediately began a period of explosive – some might even say meteoric – growth.
Now, it’s right up there with the other most popular frameworks – some even believe that, as 2015 moves forward, it might overtake them in both user interest and adoption.
But how did we get to this point? What was it that made Laravel grow from a relatively unimportant framework coded by a single developer into one of the most widely-used PHP programming tools in the world?
On the surface, PHP development isn’t really all that different from any other technical profession in the world. It’s a given that there are certain rules you’ll have to follow; certain best practices you must adhere to. And really, you’ve no reason not to follow them – doing so will make your life as a developer significantly easier, and make you better at your job besides.
That’s what we’re here to talk about today, folks. We’re going to go over some of the best pieces of advice ever given about PHP development – best practices that any developer worth their salt should definitely have in place. Take a look, and see which you use (and which you probably need to implement).
In my “Six Things You Need To Know If You’re Starting A WordPress Blog” post from a couple of months ago, one of the non-essential but beneficial improvements I suggested was to install a caching plugin like W3 Total Cache. Caching helps make a WordPress site faster, but there’s a way to make it even faster still — you can turn it into a completely static site.
If you’re a follower of this blog, you’ll know that we’ve talked about static sites often, usually in the context of static site generators like Jekyll. I’m a fan of static site generators, because I like to encourage people to ditch the content management systems that put a layer between them and the technology they’re using. But static site generators aren’t for everyone, and for anything more than a personal blog or portfolio site, they’re lacking in much-needed functionality. A major reason we use content management systems in the first place is to make it easier to handle more complex publishing workflows, and static site generators aren’t great for that.
Product pages are a perennial source of headaches for eCommerce retailers, especially mid-sized retailers that have large catalogues but not the manpower to create a unique experience on each page. Creating compelling product pages can be a significant investment of both time and money, but I’d suggest that it’s more than worthwhile in many cases because the result will be more search traffic and increased conversion rates.
In this article, I’m going to take a look at a number of strategies that eCommerce retailers can use to improve the quality of their product pages and, hopefully, their sales.
One of the interesting (and some might say depressing) details to come out of the recent collapse of The New Republic was the comment of new editor Guy Vidra that he thought “the magazine was boring and that he couldn’t bring himself to read past the first 500 words of an article.” The battle between a century-old cultural institution and Silicon Valley’s “disruptive” mentality aside, what struck me about this was Vidra’s apparent distaste for longform content.
If you look at the most successful publishing ventures on the Internet at the moment, stalwarts like the New York Times aside, we see Buzzfeed and Upworthy: outlets that focus on short content that generates clicks and shares. On the other hand, there’s a strong movement to promote longform content on sites like Longform and Medium. And, as there is for every movement espousing a niche interest, there’s a reaction against the rising popularity of longform. The result is acres of content — from the longform to the tweet — being published about the correct length of content and the benefits of writing a lot of it, to which this (short) article is about to add.
A web robot, commonly referred to as a bot, is any non-human web user. They are usually scripts or computer programs that access web pages for a variety of different reasons. The most familiar bot is Googlebot, which accesses and analyses web pages for inclusion in Google’s search index. Most site owners want Googlebot to come calling, but there are also many bots with less friendly intent, including those that visit a site and attempt to hack it or scrape the content for use on other sites.
In fact, a significant majority of web traffic is generated by bots rather than humans. Towards the end of 2013, Incapsula estimated that about 61.5 percent of all web traffic was bots.
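One crude way to separate bot traffic from human traffic is to inspect the User-Agent header. The sketch below is a toy heuristic with an illustrative, incomplete marker list; real bots can forge their User-Agent, so serious verification (e.g. reverse DNS for Googlebot) is needed in practice.

```python
# Toy user-agent heuristic: the marker list is illustrative, not complete,
# and a malicious bot can simply lie about its User-Agent.
KNOWN_BOT_MARKERS = ("googlebot", "bingbot", "crawler", "spider", "bot/")

def looks_like_bot(user_agent: str) -> bool:
    """Return True if the User-Agent string contains a known bot marker."""
    ua = user_agent.lower()
    return any(marker in ua for marker in KNOWN_BOT_MARKERS)

print(looks_like_bot("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"))  # True
```

A log analyzer could run every request line through a check like this to estimate the human share of a site’s traffic.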
When configuring your virtual private server, one of the most important decisions you’ll make involves your operating system. It’ll form the core of your server, regulating which apps you can install, how long it takes to tune everything, and even how well your server performs. It goes without saying that it’s not a choice you should make lightly.
But maybe not for the reasons you’d think.
Our hosting platform and extensive international infrastructure allows us to offer you some of the fastest and most reliable managed hosting available. That same infrastructure can also be leveraged by agencies, businesses, and other organizations to offer web hosting to their clients. Until now that hasn’t been as easy as it should be, which is why we’re happy to announce that we have partnered with WHMCS to bring the best in web hosting automation to Future Hosting.
The WHMCS hosting automation service will be available on our virtual private server and dedicated server plans for an additional charge of $11.95 per month.
WHMCS is the industry leader in hosting automation. It significantly reduces the burden of providing web hosting services by automating numerous hosting workflows.
Although there’s mounting evidence that page speed doesn’t have as much of an effect on PageRank as was initially believed, that doesn’t mean it’s not extremely important to your success in the online arena. A slow site can still have an adverse effect on your traffic, as frustrated readers decide to take their business elsewhere. It goes without saying, then, that if you’ve got a slow website, you want to do everything you can to address the problem.
We’ll lend you a hand with that. In today’s piece, we’re going to go over five of the most common causes of website slowdown. More importantly, we’ll detail how you can deal with each one.
If you’re looking to set your business up with a dedicated or virtual private server, you’ve a very important decision to make, perhaps even more important than which hosting company you choose. It involves how much money you’re willing to spend in the interest of convenience. More importantly, it involves how much work you intend to put into your server – and how much control you’ll have over its operation, besides.
I am speaking, of course, about choosing between managed and unmanaged hosting – literally, choosing between having a host run a server for you and running that same server yourself. It’s important that you understand the differences – and strengths – of each approach as well as your business’s resources and needs. Otherwise, you might end up making the wrong choice.
We’re here today to ensure that doesn’t happen.
Let’s have a chat about frameworks. Although certainly not the best choice for every development project, when used in the proper context, a framework can be incredibly powerful. By using one of the many frameworks available on the web, a savvy developer can significantly reduce development time while simultaneously creating compliant, structured, and easily-maintained apps.
There are a ton of different frameworks available to you as a developer, regardless of the language you’re programming in. Today, we’re going to focus on one in particular: Yii. Designed for the development of high-performance, Web 2.0 applications, the open-source Yii Framework has been around since 2008.
When the web was first developed, it was designed to meet the needs of academic and scientific institutions like CERN, for which Tim Berners-Lee worked when he laid the web’s foundations. There was no concept of secure communications baked into the protocols that underlie the web. The web and the world have changed since the early 90s, and the need for secure encrypted connections is clear.
We have a reasonable technological solution for providing encrypted connections between clients (often browsers) and servers. SSL (more properly known as TLS) and HTTPS are conceptually sound, even if the implementation sometimes leaves a little to be desired. But although SSL works, it is a million miles away from being user friendly: even technically adept people have trouble implementing SSL on their domains, which is why many don’t bother.
Tracking is the holy grail of the online advertising industry. Randomly throwing advertising at users has a very low success rate. The better advertisers can predict what a user will be interested in, the more likely they are to serve advertising that gets more clicks that convert to more sales. To target advertising, networks need to develop profiles of users, and the most common way to do that is with cookies. A cookie is placed in the user’s browser containing a unique identifying number, and whenever a browser visits a site that belongs to the advertising network, code on the page looks at the cookie. In this way, advertising networks can track users across the web — and if those users are logged in to a service like Google or Facebook, the tracking can be all the more accurate, because they can associate it with much richer data.
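The mechanics described above are just a Set-Cookie header carrying a random unique identifier. A minimal sketch using Python’s standard library follows; the cookie name "uid" and the domain are hypothetical examples, not any particular network’s actual values.

```python
from http.cookies import SimpleCookie
import secrets

# Build the Set-Cookie header an ad network might send. The name "uid" and
# the domain ".ad-network.example" are illustrative assumptions.
cookie = SimpleCookie()
cookie["uid"] = secrets.token_hex(16)            # 32-hex-char unique identifier
cookie["uid"]["domain"] = ".ad-network.example"  # sent back from every site in the network
cookie["uid"]["path"] = "/"

header = cookie.output()  # e.g. 'Set-Cookie: uid=...; Domain=.ad-network.example; Path=/'
print(header)
```

Because the Domain attribute covers every site that embeds the network’s code, the same identifier comes back on each page view, which is what makes cross-site tracking possible.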
If you’re a regular reader of this site, the question in the title may seem a little basic, but for millions of non-technical people with a desire to blog or create a site, choosing a content management system from the plethora of available options is a serious business. So, in this article, we’re going to have a look, in basic terms, at what WordPress is.
So, you’re a small business owner or webmaster thinking of setting up your first website, and you’re looking for a hosting partner to help you run things. Chances are pretty good that whichever company you end up going with, you’ll select a shared hosting plan. After all, everyone’s always saying this is where a small business should start, right?
What if I were to tell you that isn’t always the case?
While it’s certainly true that most first-time webmasters will want to start with shared hosting – it’s far cheaper than any other plan, and far simpler to manage – that doesn’t mean it’s the Holy Grail of hosting for every small business. Before you settle for a shared plan, there are a few questions you need to ask yourself.
To anyone working in the web design and development fields, it probably seems like old news, but it’s worth taking a moment to acknowledge that HTML5 has been officially recognized as a standard by the W3C, the body in charge of web standards (for the most part).
Of course, HTML5 has been in active use for years, but the modern version of HTML now has the imprimatur of the organization overseen by the web’s inventor, Tim Berners-Lee.
“Today we think nothing of watching video and audio natively in the browser, and nothing of running a browser on a phone,” said the W3C Director. “We expect to be able to share photos, shop, read the news, and look up information anywhere, on any device. Though they remain invisible to most users, HTML5 and the Open Web Platform are driving these growing user expectations.”
In what is becoming a worryingly frequent occurrence, a vulnerability has been reported in the SSL protocol used to encrypt connections between web clients and servers. The good news in this case is that the vulnerability occurs in a relatively ancient version of SSL (so old that it was still called SSL, and not the more modern TLS). The bad news is that the way SSL is implemented on modern browsers and other clients means that the ancient protocol is sometimes still used.
Cutely named Poodle (Padding Oracle On Downgraded Legacy Encryption) and officially named CVE-2014-3566, the vulnerability has the potential to allow an attacker to read plaintext versions of data that should be encrypted.
MySQL has a great deal to offer your server – there’s a reason it’s among the most widely-used database platforms in the world. Offered by the majority of Linux-based web hosts, the open-source database is incredibly fast and lightweight, in addition to being closely integrated with PHP. This makes it ideal for a wide variety of different installations including data warehouses, web hosting, web development, application management, and digital storefronts.
Even better, it’s open-source. That means that not only does it provide you with a great deal of freedom concerning how you use it, but at an incredibly low total cost of ownership. Factor in that it’s one of the more intuitive platforms on the market, and there’s a clear case in favor of installing it.
Now, it’s worth mentioning that – even though it is known to be one of the simpler database platforms in terms of usability – it isn’t necessarily easy to work with if you don’t know what you’re doing. Today, we’re going to walk you through what’s involved if you want to install MySQL on a virtual private server. There are several ways you can do this, and which one you use depends on what operating system your server is running.
Today, we’re going to talk about a growing – and incredibly important – tool in the world of web hosting: the content delivery network. You’ve probably heard the term, at least in passing – and wondered if your website could use one. Today, we’re here to help you decide.
First things first: let’s talk about exactly what a CDN does.
A CDN works by caching your content at several points of presence spread out across a global network. When a user accesses something on your site, the CDN taps into whichever point of presence is closest to them. This has several effects.
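Real CDNs route users via DNS and anycast rather than explicit distance math, but picking the nearest point of presence can be illustrated with a simple great-circle calculation. The PoP locations below are hypothetical examples.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points on Earth, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 3959 * asin(sqrt(a))  # 3959 = Earth's mean radius in miles

# Hypothetical points of presence: name -> (latitude, longitude)
POPS = {"London": (51.5, -0.1), "New York": (40.7, -74.0), "Tokyo": (35.7, 139.7)}

def nearest_pop(user_lat, user_lon):
    """Pick the PoP with the shortest great-circle distance to the user."""
    return min(POPS, key=lambda p: haversine_miles(user_lat, user_lon, *POPS[p]))

print(nearest_pop(48.8, 2.3))  # a user in Paris is served from London
```

Shorter physical distance means fewer network hops and lower latency, which is where the CDN’s speed benefit comes from.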
Tracking is the holy grail of online marketers and businesses that rely on accurate information about their users. The motivation for tracking is hardly ever as suspicious as some privacy advocates would have us believe. Companies use the information to provide better services. Nevertheless, users should be able to decide for themselves whether to allow their online activity to be tracked. That many decide to install tracking blockers and deny the use of third-party cookies is evidence that there’s a proportion of Internet users that dislike the idea of being tracked.
ETags are a method used by some site owners to circumvent user choice where tracking is concerned, and they are an interesting illustration of how tracking works.
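The trick relies on the ETag caching mechanism: the server gives each first-time visitor a unique ETag, and the browser faithfully echoes it back in an If-None-Match header on later visits, turning a cache validator into an identifier. Here is a toy sketch of the server side; real HTTP servers would return a 304 Not Modified response, which this simplification only hints at.

```python
import secrets

seen = {}  # toy server-side store: etag -> number of visits

def serve(if_none_match=None):
    """Hand each new visitor a unique ETag; recognise the visitor when the
    browser echoes it back via If-None-Match on subsequent requests."""
    if if_none_match in seen:
        seen[if_none_match] += 1      # returning visitor identified
        return if_none_match          # in real HTTP: a 304 Not Modified
    etag = secrets.token_hex(8)       # unique per first-time visitor
    seen[etag] = 1
    return etag

tag = serve()      # first visit: a fresh tag is issued
serve(tag)         # revisit: the echoed tag identifies the same user
print(seen[tag])   # 2
```

Notably, clearing cookies does nothing here; the identifier survives as long as the browser cache does, which is exactly why ETag tracking circumvents user choice.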
Business Continuance Protection helps companies avoid one of the little-considered consequences of a DDoS attack: a huge bandwidth bill.
Denial of service attacks are the bane of modern site owners and web service companies. It seems like every month we hear a story about the biggest ever DDoS attack disrupting some service or other. DDoS attacks are especially pernicious because they can be so difficult to mitigate. Most hacking attacks exploit vulnerabilities in software that can be fixed if they’re discovered. But denial of service attacks turn the fundamental technologies of the web into a weapon to be used against online businesses.
Click-baiting works if your goal is simply to drive traffic, but businesses that don’t survive on page views and advertising impressions should avoid click-baiting if they’re to create the right impression.
As a writer of online content, I pay attention to the tactics that my fellow writers and publishers use to attract traffic. In recent months, I’ve seen an alarming rise in the number of business sites that engage in click-baiting. The virtues of click-baiting for a certain type of publisher may be debatable, but for businesses that rely on consumer trust, click-baiting can be a big mistake.
It was a great idea in theory: some writers can be relied on to publish authoritative content on particular topics. But the implementation was flawed, and so Google Authorship goes the way of all Google programs that have failed to demonstrate their value to the search giant.
In what will not be much of a surprise to those who have been following Google Authorship, the system for linking content to an author via a Google Plus profile has been retired. The writing had been on the wall for Google Authorship since last month, when it was announced that the rich authorship snippets that Google has been including in the SERPs for the past couple of years were to be slimmed down with the removal of the byline head shots that had been Authorship’s major draw.
There are lots of things we can do to make our sites faster: caching, image optimization, compression, and, of course, low-latency web hosting. They’ll all have an impact on load times, but often, taking a scattergun approach to performance optimization isn’t the most effective method. To get the best performance from a site, we need to think about what needs to load to present a usable site as quickly as possible and then load that first.
Aside from choosing your host, selecting your control panel is one of the most important decisions you’ll make as a client. Whichever control panel you ultimately settle on, it’ll have a direct impact on both cost of ownership and functionality. Save yourself the headache, and make the right choice right at the start.
At this point, some of you are probably leaning towards cPanel. I don’t blame you. It’s the current industry leader, backed by a passionate and thriving community and equipped with a comprehensive set of features. It’s reliable, it’s durable, and it’s powerful; it’s not hard to see why it’s got the biggest market share.
Cybersquatting in the age of hundreds of generic top-level domains could be a huge problem, but is it really something that businesses have to worry about?
Cybersquatting has long been a headache for brands. Less-than-honest third parties have an interest in securing domain names that are similar to the trademarks of existing brands. It can allow them to misrepresent themselves as that brand, to harm that brand’s image, or to attempt to force a company to pay a significant premium to secure a domain name that could be damaging in malicious hands.
Often the solution has simply been to register all the domain names that contain a trademark. That’s not ideal and it can be expensive, but the alternative has the potential to be worse — domains are usually “first come, first served” and having an infringing domain suspended or transferred can be expensive and time-consuming.
When you’re dealing with sensitive information in the digital realm, you encrypt it. That’s one of the most basic facts of security, right up there with controlling access and running regular audits. Sending unencrypted data between two points would be tantamount to walking through a windstorm bearing an open briefcase filled with privileged documents.
Unfortunately, while traditional encryption methods are perfectly capable of protecting our information, they’re anything but perfect. There are glitches. There are flaws, exploits, and backdoors. As the recent Heartbleed scandal so unnervingly revealed to us, our information may not be quite as safe and secure as we think it is.
Over the last few years “the cloud” industry has been on a serious marketing blitz. The basic aim of cloud marketing is to convince consumers that whatever their question, the cloud is the right answer. That makes sense for cloud vendors, but it’s not clear that the cloud is always the right choice for site owners. In this article, I want to think about whether the cloud is the right solution for moderate to high-traffic websites and eCommerce stores. I’ll lay my cards on the table: for the vast majority of websites of the type we’re talking about, I think a dedicated server is a much better choice. Here’s why.
If you’re new to the world of web hosting, selecting the right plan for your business can be more than a little overwhelming. There are many factors that go into determining which hosting option is best for an organization, and they vary by the client. Confronted with an overwhelming array of choices, how does one avoid making an incorrect – and costly – choice?
At the end of the day, it’s ultimately about understanding what your business needs. What are you planning to use your hosting plan for? And just as important, what sort of budget is available to you?
If you’re running an unmanaged VPS instance that deals with sensitive data or private information, it goes without saying that you want to keep it as secure as possible. As a server administrator, there are a few basic measures you should take to ensure the information you’re dealing with stays securely in your hands. Ignore these procedures at your own peril, as they are the key to protecting your VPS.
One of the most significant challenges in web hosting is calculating how much bandwidth your website needs. Give yourself too little, and your website’s performance will drag down to a crawl under heavy load. Too much, and you’re unnecessarily funneling away money that could be going elsewhere. Accurate estimation of your bandwidth requirements is a must if you’re to strike a balance between cost and performance.
Thankfully, this isn’t all that difficult to do…provided you know which questions to ask.
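A common back-of-envelope starting point multiplies expected page views by average page weight, then pads the result for spikes. The sketch below assumes that approach; the 1.5x headroom factor is an illustrative assumption, not an industry standard, and redirects, APIs, and media downloads would need adding on top.

```python
def monthly_bandwidth_gb(page_views, avg_page_mb, headroom=1.5):
    """Rough monthly transfer estimate: views x page weight, padded for
    traffic spikes. The 1.5x headroom factor is an assumed safety margin."""
    return page_views * avg_page_mb * headroom / 1024  # MB -> GB

# e.g. 100,000 monthly page views at an average page weight of 2 MB
print(round(monthly_bandwidth_gb(100_000, 2.0)))  # ~293 GB/month
```

Comparing that figure against a plan’s transfer allowance is the balance between cost and performance the paragraph above describes.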
Given the number of hosting options available, it’s no surprise that some organizations have a bit of difficulty choosing the one that’s right for them. How can one tell which choice is best-suited for their needs? How can one avoid an incorrect choice, and a costly mistake?
It’s all about knowing what your business’s needs are. If you’re a small to mid-sized business with a fairly well-trafficked website, there’s a good chance that shared hosting isn’t for you – and an equally good chance that you aren’t quite large enough to justify a dedicated server hosting plan. Unless you’re running a series of incredibly resource-intensive web apps, it’s quite likely that a virtual private server is the perfect choice.
Given that there’s not exactly a dearth of hosting options available, it can sometimes be difficult to determine which is the best choice. While making the wrong choice probably won’t cause a business to collapse, it could still result in lost revenue. Knowing the difference between the different breeds of hosting is the first step towards making the right one.
In theory, every site on the Internet is accessible to every user. Packets flow across networks from server to browser unhindered — although it won’t be like that for long if some companies have their way. Every point on the network is connected to every other point, so in principle it shouldn’t matter where a site is hosted: London, Novgorod, and Cape Town are just as reachable from Chicago as anywhere else with a connection.
Unfortunately, physics, and technical and economic realities get in the way of the Utopian view of the Internet. We all know that site speed is important. Slow sites are ineffective, regardless of their purpose. Conversion rates are eroded by poor performance, bounce rates increase, time-on-site takes a dive, and even SEO takes a hit.
The Domain Name System is one of the most important components of the Internet’s infrastructure. It’s what allows websites to use human-readable web addresses rather than hard-to-remember and unbrandable IP numbers. The DNS is responsible for translating domain names into IP addresses, which are then used by machines to route packets of information around the Internet. Site owners are frequently under the impression that they “own” their domain name, but that’s not really the case. No one owns domain names; they merely pay for the use of them for a while.
The public deals with domain name registrars, which often take the form of web hosting providers or other entities that provide online services. Users pay registrars to register their domain names, but who do the registrars register them with?
Today, I’d like to start with a story. Bill is a website owner whose site is experiencing a period of unprecedented traffic. Unfortunately, his servers are ill-prepared for such a spike; unable to withstand the strain, they’ve come grinding to a halt. Suddenly, all those users who were discovering or visiting his website are instead being directed to an error page.
It’s a catastrophe. Bill knows full well that this has the potential to severely damage his site’s reputation; a fact made all the worse by the disheartening knowledge that this whole fiasco was entirely preventable. If he’d made the necessary preparations, he could have weathered the storm and enjoyed his newfound popularity – even if it was only temporary.
So, the next question is clear: how can you prevent this from happening to you? How can you prepare yourself for traffic spikes, while adequately predicting your average required capacity in the process?
Web hosting affiliate programs are a method of revenue generation that allows webmasters to easily generate income from their site.
Even successful sites have trouble generating revenue. Advertising is the most popular monetization strategy, but unless a site generates a large amount of traffic, advertising revenues can be pitiful.
Subscription is an option, and it’s proven successful for some solo bloggers; Shawn Blanc and Andrew Sullivan are good examples. If a site has a committed community, is home to a popular writer, or provides information of considerable value to a niche audience, then subscription may be a better option than advertising, but it isn’t a model that will work for the majority of sites.
If you’re starting a business that isn’t in the tech space, you probably don’t have a lot of experience with web hosting. In recent years the industry has fragmented, with many new product categories coming to market — a process driven by the advent of the cloud.
There are true cloud solutions, cloud washing (presenting traditional hosting products as if they were cloud products), various sorts of virtualized servers, and of course good old-fashioned shared hosting and dedicated servers.
If you don’t have a lot of experience with web hosting, it can be tricky to navigate the strengths and weaknesses of various products, separate the genuine benefits from the marketing hype, and make a choice that is right for your business.
If any of you have ever ventured into the basement of a small or medium business over the past couple of decades, you’ll have no doubt noticed a creaky old piece of equipment lurking down there, faithfully — or quite often, not faithfully — managing telephony for the business.
That machine is a PBX, and it’s about time we sent the PBX the way of the telex and the rotary telephone: “Thanks for your service, but you’re expensive to buy, a pain to maintain, and we have something that works much better.”
Anyone who has ever used Skype knows what VoIP is. Instead of sending phone calls directly over the lines, conversations are converted into data packets and transmitted over the Internet. The result is a much more flexible, economical, and reliable way to manage a business’s telephony infrastructure.
Google Webmaster Tools Introduces Security Issues Feature, Provides Expanded Malware And Spam Information
It’s a nightmare scenario for webmasters. You wake up one morning to an inbox full of worried users complaining that Google Chrome is showing them a big red malware warning when they try to visit your site or an email from Google itself via Webmaster Tools letting you know that they’ve identified malicious code on some of your pages.
Obviously, that can be bad for a site’s reputation, and it can seriously impact traffic as Google stops sending search users. But, the biggest concern is ridding the site of malware and removing the vulnerability that allowed hackers to place malicious code on the site in the first place. I’ve heard many stories of frantic webmasters trying to clean out their site, only to be told again and again that they have failed to remove the malware.
DropBox is an enormously useful service that has all sorts of different uses for businesses and individuals alike. In principle, what it does is fairly simple, syncing one or more folders to a cloud service and from there to as many devices as users choose to connect. That ability, coupled with the ubiquity of mobile devices and the need to share data between numerous different people has made DropBox one of the cornerstones of the cloud computing revolution.
However, not everyone is satisfied with the idea of sharing their data with a third-party service and relying on that service to keep it safe. While DropBox is relatively secure, that’s often not good enough for businesses who have to adhere to regulatory requirements, who want to keep their data private, or who just dislike the idea of putting all their eggs in one basket under the control of someone else.
BitTorrent Sync is a new service from the minds that created the BitTorrent file sharing protocol that implements a way to sync folders without the need for a third-party service. One drawback of BitTorrent Sync is that, although it is capable of syncing data between multiple devices, without the cloud component, if those devices are lost, stolen, or simply turned off, there’s no way to continue to sync the data. However, if you set up BitTorrent sync on a Virtual Private Server, you get the best of both worlds: a private, always-on syncing service over which you have complete control.
The technology works similarly to the BitTorrent peer-to-peer protocol, with each of the connected devices acting as both server and client to efficiently transfer data. One of the most important aspects of the Sync protocol is its security. BitTorrent Sync encrypts all transfers between devices with an AES cipher and a secure 256-bit key. BitTorrent Sync works very well on a Linux VPS and has client software for Linux, Windows, Android, and iOS.
In addition to straightforward syncing of data between multiple devices, the service also has some handy additional features for security conscious users, including read-only access where data will sync to a device, but changes on that device will not affect other devices, and one-time secrets, which provide single use access to a folder.
If you want to set up your own DropBox replacement, you’ll need a Linux Virtual Private Server with the appropriate version of BitTorrent Sync installed. There’s a great tutorial for setting up Sync with Linux, and you can find full instructions for using the service on the BitTorrent Sync site.
For most modern businesses, data is either their product or an essential component in the design, creation, sale, and marketing of their products. Catastrophic data loss almost always means lost revenue. Every system administrator, IT technician, and executive knows that maintaining regular backups is strictly necessary for ensuring business continuity.
A recent study found that data loss accounted for $400 million in lost revenues annually and, troublingly, showed that much of that data loss was preventable. A survey conducted by Carbonite revealed that half of small businesses are hit by data loss, with inadequate backups being the most frequently cited cause. Furthermore, many of the businesses that suffer irretrievable data loss are immediately put out of business, and a significant proportion fail within two years. Having adequate and well-tested backups is a crucial part of business continuity planning.
However, backups come in various forms. Traditionally, backups have been made daily, weekly, or even monthly. In the event of a hardware failure, data in the period between backups can be lost, and frequently that means work that was done in that period is rendered worthless. Lost data equates to lost man-hours and lost opportunities for revenue.
There is an alternative to atomic periodic backups: continuous data protection. Unlike with traditional backups, continuous data protection (CDP) strategies use an approach that results in much more finely grained backups.
CDP works by incrementally capturing changes to data as they are made, rather than gathering the sum of those changes and creating a copy of the original data. Capturing just the deltas results in numerous benefits, including the ability to roll back to a previous state, lower bandwidth requirements, and more efficient use of backup storage — there’s no need to replicate 1 GB if only one byte has changed on the disk.
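The delta idea can be sketched in a few lines. This toy version diffs two same-length byte strings; real CDP systems work quite differently, journaling block-level writes as they happen, but the storage saving is the same: record only what changed.

```python
def capture_delta(old: bytes, new: bytes):
    """Record only the changed byte positions (toy: assumes same-length
    versions; real CDP journals block-level writes as they occur)."""
    return [(i, b) for i, (a, b) in enumerate(zip(old, new)) if a != b]

def apply_delta(old: bytes, delta) -> bytes:
    """Reconstruct the new version from the old version plus the delta."""
    buf = bytearray(old)
    for i, b in delta:
        buf[i] = b
    return bytes(buf)

old = b"account balance: 100"
new = b"account balance: 900"
d = capture_delta(old, new)   # one changed byte captured, not a full copy
print(len(d))                 # 1
```

Storing a chain of such deltas is what lets CDP roll a disk back to any point in time, rather than only to the moment of the last full backup.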
Continuous data protection is often used with MySQL to prevent data loss. CDP for MySQL is an effective method for preventing data loss caused by hardware failure, and it ensures that there is minimal business interruption even in the case of a catastrophic failure.
CDP is not a replacement for atomic backups and should not be relied on as a business’s sole backup method. There should always be multiple backups of important data, including on-site and off-site backups in concert with continuous data protection, but CDP provides an additional level of protection and the assurance that data loss can be limited to a very small period of time.
Future Hosting’s Future Protect automated backup solution uses continuous data protection technology to ensure that clients’ data is always available and up-to-date in the case of a node failure.
Choosing a hosting plan is one of the most important decisions that a new business can make. It can have an impact on both short- and long-term online strategy, investment, scalability, and agility. Those looking for new hosting have a number of fundamental plan types to choose from: shared hosting, virtual private servers, dedicated servers, and cloud hosting. Each has specific benefits and limitations, so it’s important to make the right choice.
Small and medium businesses that don’t expect the sort of site load that would require dedicated servers often choose between shared hosting and a virtual private server for their initial site development and launch. Both provide an excellent hosting platform in specific circumstances, and choosing between them is not always easy, so we’d like to help you to make the right decision.
Advantages Of Shared Hosting
Suppose there was a single strand of perfect fiber optic cable that stretched all the way around the world and ended up back where it started. How long would it take a signal to circumnavigate the world through the cable and end up back at its point of origin?
Light is very fast. In a vacuum it travels at about 186,282 miles per second. In fiber optic cable, because of the refractive index of the glass (roughly 1.5), it travels more slowly: approximately 124,188 miles per second. The Earth’s circumference is 24,901 miles, so in our world-spanning cable, light would complete its journey in about 0.2 seconds. In a perfect world, you could transmit a signal to the other side of the world and back again in a fifth of a second. Unfortunately, reality does not match the theory: on the real Internet, a signal has to pass through many obstacles before it gets to where it’s going and back again.
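The arithmetic above can be checked with a few lines of Python (the refractive index of 1.5 is a typical value for silica fiber, assumed here):

```python
# Back-of-the-envelope check of the round-trip latency figures above.
SPEED_OF_LIGHT_VACUUM_MPS = 186_282   # miles per second, in a vacuum
REFRACTIVE_INDEX_OF_FIBER = 1.5       # typical for silica glass fiber
EARTH_CIRCUMFERENCE_MILES = 24_901

speed_in_fiber = SPEED_OF_LIGHT_VACUUM_MPS / REFRACTIVE_INDEX_OF_FIBER
round_trip_seconds = EARTH_CIRCUMFERENCE_MILES / speed_in_fiber

print(f"Speed in fiber: {speed_in_fiber:,.0f} mi/s")  # → 124,188 mi/s
print(f"Round trip:     {round_trip_seconds:.3f} s")  # → 0.201 s
```

That 0.2 seconds is the physical floor; routing detours, switching, and protocol overhead push real-world round-trip times well above it.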
Future Hosting Partners with InterWorx to Bring Powerful, Unique Web Control Panel Features to Clients
Future Hosting (futurehosting.com), a web services provider renowned for its virtual and dedicated hosting services, has announced a partnership with InterWorx (interworx.com), developer of the InterWorx web hosting control panel. Through this alliance, Future Hosting adds a unique new option: virtual, hybrid, and dedicated servers pre-loaded and licensed with the Internet’s most popular web-based solution for instant server clustering.
With the immediate ability to license InterWorx, Future Hosting’s administrators, resellers, and clients gain access to an extensive array of features that automate otherwise difficult and tedious tasks. At the forefront of this feature set is the software’s unique ability to scale out high-performance server clusters and to manage servers via powerful web-based, command-line, and API interfaces.
“As the Internet has evolved, hosting services from a single server is no longer good enough for many,” said Vik Patel, Future Hosting president & CEO. “InterWorx was an obvious choice, as it provides an affordable solution that allows successful ventures to grow to their full potential, free of complicated maintenance and otherwise unnecessary downtime.”
“We’re very excited that Future Hosting has begun offering InterWorx to their clients,” said Paul Oehler, CTO of InterWorx, LLC. “Our exclusive scalability options, coupled with our intuitive user interface, will certainly add significant value to Future Hosting’s offerings.”
The InterWorx Control Panel is available effective immediately with all Future Hosting server products. For InterWorx server hosting options at Future Hosting, visit http://www.futurehosting.com.
About Future Hosting
Founded in 2001, Future Hosting is a privately held leading Internet solutions provider specializing in managed hosting, including Dedicated Servers, Virtual Private Servers, and Hybrid Virtual Private Servers. The company has built a strong reputation for its high-quality service, innovative pricing models, and 3-hour Service Level Agreement. Future Hosting is based in Novi, Mich. For more information, visit http://www.futurehosting.com.
About InterWorx, LLC
InterWorx, LLC is a Pittsburgh, PA-based software development company celebrated for authoring the InterWorx Control Panel. The InterWorx Control Panel is a powerful web-based solution for managing web hosting servers, equipped with unique features such as load-balanced server clustering, dense real-time reporting, maintenance of system services, customer account management, and much more. For more details, and a demo of what the InterWorx Control Panel can do for your organization, visit http://www.interworx.com.
Future Hosting Grows Global Footprint with Hybrid Servers in Chicago, Washington, D.C.
NOVI, Mich. (July 26, 2010) – Future Hosting, an Internet solutions provider serving SMBs and enterprises internationally and developer of Future Engineer™, today introduced Hybrid Servers in Chicago and Washington, D.C., further expanding the company’s global footprint. Hybrid Servers are also available in London and Dallas, allowing clients to now choose from four international cities.
Hybrid Servers from Future Hosting are unique, high-end Virtual Private Servers that deliver full management, high availability, premium bandwidth, and access to either the cPanel or Plesk hosting control panel. Unlike standard Virtual Private Servers, Hybrid Servers offer an average of one CPU core per server, along with high-performance RAM, disk storage, and network connectivity.
“Offering truly global solutions for our clients remains a strategic goal for Future Hosting. In the past two years alone, we have more than quadrupled our international presence, and created the ability to offer full-service hosting solutions for companies based in the United States and abroad. By adding Hybrid Servers to two new major American cities, we are fulfilling our commitment to continuously build geographic diversity in our services and open the door to even further expansion in the months ahead. This is one more step in our march to globalization,” said Future Hosting Chief Strategy Officer Stephen Kowalski.
Hybrid Servers from Future Hosting are automatically backed up by the company’s next-generation Future Protect™ data protection service, which delivers the ultimate in protection against data loss. Starting Hybrid Server plans offer 80GB of disk space, 2GB of RAM, and 2000GB of premium bandwidth. All Hybrid Servers include SLA uptime of at least 99.9%.
In early July, Future Hosting added complete DNS management functionality to the Future Hosting Management Portal (FHMP), which provides dedicated server clients with complete control of their servers. FHMP is slated to eventually incorporate Hybrid Servers, further enhancing user control.
Future Hosting is the developer of Future Engineer™, a technical support automation system designed to automate time-consuming server configuration and repair tasks on Virtual Private Servers (VPS). Future Engineer™ allows clients to receive a greater level of technical support than is available when technicians spend their time performing routine or basic maintenance tasks. With Future Engineer™, technicians are free to spend their time offering a level of technical support that is unrivaled and would otherwise be unaffordable.
For more information about Future Hosting, please visit http://www.futurehosting.com.
For more information about Hybrid Servers from Future Hosting, please visit http://www.futurehosting.com/managed-hybrid-servers.
About Future Hosting, LLC
Founded in 2001, Future Hosting is a privately held leading Internet solutions provider specializing in managed hosting, including Dedicated Servers, Virtual Private Servers, and Hybrid Virtual Private Servers. The company has built a strong reputation for its high-quality service, innovative pricing models, and value-added features. Future Hosting is based in Novi, Mich.