If you’re reading this, you already know enough about web application development to be cautious when selecting your database. The better you name the specific goal of your application, the more likely you’ll be to make the smart choice. Read on to learn more about database types and the strengths and weaknesses of each.
If you’re planning to get into PHP web development, you can save yourself a lot of time (not to mention more than a few headaches) by working with a development framework. Designed to reduce development overhead and dial down complexity in the development process, a framework includes a series of tools, pre-built components, and code samples that you can use to eliminate a lot of the redundant legwork when it comes to web development.
When I talk to business owners about server hosting, we often talk about which content management system is best for their business. That discussion is often framed in terms of performance — they ask me which content management system is fastest.
Matt Mullenweg recently published We Call It Gutenberg for a Reason, an article that discussed, among other things, the future of WordPress. WordPress is the most popular content management system: there’s a big gap between the number of WordPress sites and the number of sites based on its nearest competitor. But Mullenweg thinks WordPress has room to grow. One of the figures he quoted to support that claim caught me by surprise.
That problem is only compounded by the growth of the Internet of Things. New devices are flooding onto the market at a downright alarming rate. Never mind computers, smartphones, and routers. Now, we’re seeing smart televisions, refrigerators, thermostats, and even coffee makers.
If you use Apache Struts, make sure that it has been updated to the most recent version.
Anyone who has worked as a system administrator or network engineer will understand where I’m coming from with that question. As business owners, online publishers, and web service providers, we often focus on the process of creating backups and designing disaster recovery plans. But far less attention is paid to regularly checking that those processes are working as intended.
The text in the book is justified, with each edge of the text block making a straight line. The web page is set ragged right — it lines up against the left margin, but not the right. Why is the text of most books justified, and the text of most web pages not?
I don’t think I’ve met a single person who was a fan of RSA Tokens. They’re cumbersome to use, and easily lost. They seem like they need to be reset at least once a week, and inevitably end up taking a massive chunk out of a security budget that honestly can’t handle the expense.
Worst of all, they’re incredibly outdated – everyone these days already carries a smartphone to work. Nobody wants to be saddled with another device. Especially not something so clunky and aggravating.
WordPress is hands-down the most popular content management system in the world. But it remains just the tip of the CMS iceberg. There are hundreds of alternatives, each designed with a different focus and philosophy. Without further ado, let’s take a look at five content management systems I think are suitable WordPress alternatives for blogs and more complex websites.
Out of the box, WordPress provides an intuitive and elegant editing and publishing experience, but part of what makes WordPress such a popular content management system is its vast ecosystem of plugins. WordPress Core provides all the necessary tools to publish content, but no single project can be all things to all bloggers.
There are thousands of plugins — both free and paid — to extend WordPress in any way you can imagine. The breadth and depth of the plugin ecosystem is astonishing, but it also poses something of a problem for new users — without testing hundreds of plugins, how can they know which are right for their site?
Honestly, it’s sometimes hard not to miss the days when administrators had complete control over all the software and hardware within their organization. It made keeping things secure so much easier. There was no worrying about a bungling employee accidentally forwarding corporate secrets to a competitor, or someone making off with business-critical data.
No worrying about shadow IT, and the fact that people who know almost nothing about enterprise security are expected to be responsible for keeping your business’s information safe.
A couple of weeks ago I was talking to a marketer who was excited about the results of her most recent round of A/B testing. She’d hypothesized that a small tweak to the copy on a landing page of her client’s site would increase conversions, so she implemented a split test and waited for the results to roll in.
A/B testing is a useful way to gather evidence about the effectiveness of web design decisions, and I was interested to see her results. Or at least, I was until she said that her idea was so effective that she ended the test after three days and implemented the changes she’d suggested across the site.
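Part of the problem with stopping a test early is that an apparently impressive difference can easily fail a basic significance check. As a rough illustration — the numbers and the method here are my own, not the marketer’s — a standard two-proportion z-test looks like this:

```javascript
// A minimal two-proportion z-test sketch. This is one common way to check
// whether an observed difference in conversion rates is statistically
// meaningful; the visitor and conversion counts below are hypothetical.
function zScore(convA, totalA, convB, totalB) {
  const pA = convA / totalA;            // control conversion rate
  const pB = convB / totalB;            // variant conversion rate
  const pPool = (convA + convB) / (totalA + totalB); // pooled rate
  const se = Math.sqrt(pPool * (1 - pPool) * (1 / totalA + 1 / totalB));
  return (pB - pA) / se;                // standardized difference
}

// After three days the variant looks 30% better (6.5% vs 5.0%)...
console.log(zScore(50, 1000, 65, 1000).toFixed(2)); // "1.44"
// ...but |z| below ~1.96 means the result isn't significant at the usual
// 95% level — a concrete reason not to call the test early.
```

Run for the full planned sample size, the same underlying difference may well reach significance — or vanish entirely, which is exactly what early stopping hides.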
Back when Ghost was the hot new thing in the content management system world after an incredibly successful Kickstarter campaign, many of us wondered whether it would take the place of WordPress as the preeminent blogging platform. As it turns out, that didn’t happen. Instead, from a great start Ghost steadily developed into an excellent blog engine with a loyal following.
This month, after almost four years of development, Ghost reached a new milestone with the release of Ghost 1.0. That version number is slightly misleading because Ghost has been fully functional for several years, but arbitrary though it may be, the release presents an opportunity to take another look at Ghost and see where it is headed in the future.
All users of Node.js on virtual private servers or dedicated servers should update their Node installation to the most recent version, which fixes a critical remote Denial of Service attack. In other Node news, the NPM project released npx — a new tool that makes life easier on Node users — as part of the update to NPM 5.2.
The security issue in Node, originally reported by Google Project Zero researcher Jann Horn, was caused by a hash flooding vulnerability that could allow malicious actors to inflict significant disruption on users of vulnerable Node versions. All maintained release lines were affected, including 4.x, 6.x, and 8.x. The 7.x line and older unmaintained versions are also vulnerable but will not be patched; if you’re running an unmaintained version of Node, move to a maintained version as soon as possible, because your system will otherwise remain insecure.
Graceful degradation involves designing with the assumption that most users will have most functionality available to them, while providing alternatives for those who don’t. The typical graceful degradation approach uses the <noscript> element to offer equivalent functionality to users whose browsers don’t run JavaScript.
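A minimal sketch of that approach (the markup here is a hypothetical example, not taken from any particular site): a script-enhanced search box degrades to a plain form that the server can process.

```html
<!-- Enhanced experience: a live search box driven by JavaScript -->
<div id="live-search">
  <input type="text" id="query" placeholder="Search as you type…">
</div>

<!-- Fallback: rendered only when scripts don't run. A plain GET form
     provides the equivalent functionality via a server round-trip. -->
<noscript>
  <form action="/search" method="get">
    <input type="text" name="q" placeholder="Search…">
    <button type="submit">Search</button>
  </form>
</noscript>
```

The key point is that the `<noscript>` content is not a “JavaScript required” apology but a genuine, working alternative.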
There’s no doubt that the web would not be where it is today without companies like Facebook, Google, Yahoo!, and even AOL. Corporations have driven growth, investment, and innovation on the web, and social media networks have made the web a part of everyone’s lives. There are some who regret the influence that big companies have had in shaping the net. I’m not one of them: the internet would be a far poorer place without the innovation of Google and Facebook and the many companies that went before and didn’t make it to 2017.
A couple of years ago, Buzzfeed released the results of the New York Times’ comprehensive survey of the state of its web presence. Overall traffic for the website was solid, but what’s most interesting is the numbers for the NYT homepage, which had plummeted over a couple of years. The NYT front page saw less than half the traffic it did in 2011, dropping from 160 million visitors to 80 million.
The combination of strong overall traffic and a massive decline in home page traffic led some to announce the demise of the home page. I think that’s a bit premature for most types of site, but it’s a strong indication that usage patterns on the web are not what they used to be. Back in the day, news readers would either enter the URL of their favorite news source in the browser or search for the site in Google. Their journey would almost always begin at the home page, from which they would navigate to the articles or sections that caught their interest.
People make mistakes. Even developers aren’t infallible. With that in mind, there’s a good chance yours may have missed a few things in the testing phase for your latest app.
Let me stop you before you try to say that isn’t possible. See, not every software bug is going to be obvious, and not every issue will be caught by a few rounds of testing. There are things you can miss – and the first step toward catching them is being aware of that.
Is it wrong to download an image made available under a Creative Commons licence and put it on a t-shirt or poster to sell? I have photographer friends who absolutely insist that it is wrong. Why should someone else profit from their hard work? They don’t really mind if the occasional blogger uses their work in posts. But when it comes to a company taking those images and using them to make money, it’s a different story.
But, if images are released under a Creative Commons licence that permits commercial use, the company using the image to turn a tidy profit is legally in the clear: they owe nothing to the photographer.
JSON Feed is a new feed format from Brent Simmons and Manton Reece — developers who are probably familiar to those of you who follow the Apple world. JSON Feed is intended to do the same job as RSS and Atom: allow feed readers and other applications to access an easily parsed feed of items, including blog posts, social media posts, and podcasts.
You might be asking yourself why we need yet another feed format. RSS and Atom get the job done. There’s no clamor from users for a “new and improved” feed format. Nevertheless, I think JSON Feed is welcome, largely because it is just JSON. In recent years, JSON has become the “default” data serialization format of the web. Most developers — and all web developers — are familiar with JSON. Every language worth using has a library for reading and writing JSON. Modern APIs are JSON-based, and many legacy XML-based APIs are being rewritten for JSON. JSON is conceptually simple: it’s easy to write JSON by hand and to write code that interacts with JSON data.
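To see how little ceremony is involved, here’s a hypothetical minimal feed — field names follow the jsonfeed.org version 1 spec — and the one-line “parse” step:

```javascript
// A minimal, hypothetical JSON Feed document. Top-level "version",
// "title", "home_page_url", and "items" are all fields defined by the
// JSON Feed v1 spec; the URLs and content are placeholders.
const rawFeed = `{
  "version": "https://jsonfeed.org/version/1",
  "title": "Example Blog",
  "home_page_url": "https://example.org/",
  "items": [
    { "id": "1", "url": "https://example.org/first", "content_text": "Hello, world." }
  ]
}`;

// Because the format is plain JSON, parsing is one line — no XML parser,
// no namespaces, no entity-encoding headaches.
const feed = JSON.parse(rawFeed);

console.log(feed.title);                 // "Example Blog"
console.log(feed.items.length);          // 1
console.log(feed.items[0].content_text); // "Hello, world."
```

Compare that with walking an RSS or Atom DOM tree, and the appeal to working developers is obvious.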
For many years, I was a hardcore Linux user. Naturally, any servers I interacted with ran Linux, but so did my desktop. I was a fluent terminal user in Linux and often found it quicker to drop to the command line than to try to accomplish tasks in the GUI.
A couple of years ago I switched my main machine from a beat-up old Dell running Ubuntu to a shiny new MacBook. I wasn’t disillusioned with Linux, but as I became busier I found less time to spend creating the perfect Linux desktop experience, and I wanted something that just worked. macOS, formerly OS X, is Unix-based and has many of the same terminal apps as Linux, but the environment isn’t quite the same. Often the installed versions of tools are even older than they are on an LTS Ubuntu installation or CentOS, and, of course, Linux package managers aren’t available on a Mac.
Application development has come a long way over the past decade or so. We’ve gone from an era where apps were largely controlled by IT – an era where most everything was done offline – to one of constant connectivity. Software as a service is one of the greatest, most disruptive software developments in history.
And if you’re involved in development yourself, it’s one that could either make or break your firm.
In the days of yore, everything was sized in pixels because that was the only choice we had. Things have moved on. We can now choose between pixels, ems, and the recently introduced rems (and a bunch of others that aren’t often used in web design). The differences between them are considerable, and each has its benefits and shortcomings. In this article, we’re going to take a look at what each unit does and when it’s appropriate to use it.
(NB I’m sizing some elements in px that I would normally use percentages to size for the sake of clarity in my examples.)
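As a rough sketch of the differences — assuming the common 16px browser default for the root font size, and class names invented for the example:

```css
/* Most browsers default the root font size to 16px. */
html { font-size: 16px; }

/* em is relative to the CURRENT element's font size:
   this padding computes to 1.5 × 20px = 30px. */
.card {
  font-size: 20px;
  padding: 1.5em;
  border: 1px solid #ccc; /* px is absolute — handy for hairline borders */
}

/* rem is relative to the ROOT font size, regardless of nesting:
   1.25rem = 1.25 × 16px = 20px, even inside .card. */
.card h2 {
  font-size: 1.25rem;
}
```

The nesting behavior is the practical difference: ems compound as elements nest, while rems stay anchored to the root, which is why rems are often the safer default for font sizing.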
This February, New York’s Metropolitan Museum of Art released 375,000 digital works under a Creative Commons licence that allows anyone, including bloggers, to use the images without paying for them. In theory, it’s a great resource, and it joins similar resources available from museums around the world. In practice, I rarely use images from these sources in my articles because searching through lots of different image databases would be a significant investment of time.
Google AMP pages will start showing a page’s canonical URL in the AMP header.
Google’s Accelerated Mobile Pages project has proven remarkably successful, achieving adoption rates that must surprise even Google. The lynchpin of AMP’s popularity is its effectiveness. If a site owner wants to offer a fast mobile browsing experience, AMP is the easiest way. But performance, although important, isn’t everything. There are trade-offs to using AMP, and some publishers find them unacceptable — especially if their site already offers reasonable mobile performance.
There’s been a war raging in the business world for several years now, one between security and convenience. Unfortunately, security has lost. Thanks to mobility and cloud computing, end users are more empowered than ever. What that means from an IT perspective is that if they don’t want to use a solution, they won’t.
It also means that if you want to keep your business secure, you need to work with your users, rather than against them. You need to make the solutions your business mandates easy for them to use. Because if you don’t, then it doesn’t matter how secure your systems are.
Good morning, ladies and gentlemen. Do you know where your data is right now? Maybe you should.
Security firm Kryptowire recently made a troubling discovery about a large number of prepaid ZTE and Huawei phones. Turns out, these devices contained an unwelcome addition – a security backdoor that sent its users’ data, including text messages, to servers in China every 72 hours. Yikes, right?
The Internet is a great way to connect with your customers, and to spread the word about your organization. That isn’t to say it’s easy, of course. Running any business website – let alone a successful one – is a challenge in and of itself.
There’s a lot that can go wrong, and a lot of mistakes you can potentially make. Today, we’re going to go over three of the most critical. Coincidentally, they’re also the ones I see made the most frequently.
Almost since the web was created, HTTPS has been a vital protection for users of eCommerce stores and other sites that deal with financial data or data that falls under specific regulatory protections like HIPAA. But for many small businesses – and not-so-small businesses that should know better – HTTPS adoption has been slow.
Over the last few years, adoption slowly crept upwards as site owners responded to growing concerns about online security from ordinary users. Most of the people reading this article have long understood the necessity of encryption for sensitive data, but for ordinary users, it’s been less of an issue. Without demand from users or industry regulation, site owners had little incentive to invest in SSL certificates, especially given the complexity involved.
Facebook seems like the ideal advertising platform. Facebook has more information about its users than almost any other platform. And almost everyone a company might want to advertise to will have a Facebook account.
People who don’t write a lot think that writing — the act of creating sentences and paragraphs — is the toughest part of being a blogger. In reality, the mechanics of writing are not all that difficult. It takes some practice, but most people can get the hang of structuring and writing a blog article without much trouble.
The really hard part of blogging is finding something to blog about. When you sit down to blog, you are faced with the task of reducing an infinity of potential topics to one topic. That topic has to be relevant, interesting, and it has to fit within the format of a blog. The journey from infinite choices to writable topic is often painful. A desire to avoid taking that journey is a major cause of writer’s block and procrastination.
It’s no secret that the Internet is always evolving. What was considered cutting edge 5 years ago is now antiquated at best. That’s why web developers are best served to get plugged into the latest trends and best practices. Following is a look at what’s on deck for 2017:
Speed And Performance Matter
If you sell more than a dozen products on your eCommerce store, developing original content for each product page can be a big investment. That’s why many retailers simply copy the manufacturer’s product descriptions. Although it’s a cheap and easy way to get copy onto product pages, it’s not a great choice for branding, SEO, and sales.
What is it that distinguishes your eCommerce store from its competitors, many of whom may sell the same products, and some of whom — including Amazon — have marketing budgets that a smaller store can’t hope to match? Often, it’s branding: the way a store presents itself to its customers. A store’s brand — communicated through design, attitude, style, and content — is its most powerful weapon in the battle against the competition.
Although web typography has come on in leaps and bounds since the introduction of web fonts, it’s still in the Stone Age compared to print typography. At the recent ATypI conference in Warsaw, a group of industry heavyweights announced OpenType Variable Fonts, a new specification that could substantially enhance the flexibility of typefaces on the web while reducing the bandwidth needed to download them.
Print typographers and type designers have tools to extend, compress, shorten, lengthen, thicken, and otherwise modify typefaces. The web has none of that flexibility. When type designers complete their design, they’re forced to “flatten” typefaces so they’re available in the limited set of weights and variations that browser type-rendering engines can understand.
Take a look at an article on your business’s blog and ask yourself this question: how does this article help to convert visitors into customers?
For many businesses, the answer is not at all. Their blog is used to bring people to the site, but does nothing to lead them towards buying or subscribing. Visitors stumble across an article, read it, and move on with their lives.
That’s a wasted opportunity. Your business’s blog is a key component of your inbound marketing strategy, but a popular blog that doesn’t convert is almost worthless to the business. It may help increase mind-share for the brand, but it has the potential to do so much more.
We often think of web apps and native apps as completely different things. For full coverage, we need to create a web app, a native Android app, a native iOS app, and perhaps native apps for other platforms too.
Progressive web applications are an attempt to consolidate development on the web platform, while bringing many of the features we associate with native apps to the web. Progressive web apps are intended to work reasonably well on all devices, to be installable, and, most importantly, to offer at least some of their full functionality while the user is offline.
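Installability, for instance, is typically declared through a web app manifest — a small JSON file linked from the page. A minimal example might look like this (member names follow the W3C Web App Manifest spec; the app name, colors, and icon path are placeholders):

```json
{
  "name": "Example Progressive Web App",
  "short_name": "Example",
  "start_url": "/",
  "display": "standalone",
  "background_color": "#ffffff",
  "theme_color": "#2196f3",
  "icons": [
    { "src": "/icon-192.png", "sizes": "192x192", "type": "image/png" }
  ]
}
```

The offline half of the story is handled separately, by a service worker script the page registers in JavaScript to intercept network requests and serve cached responses.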
There’s an old adage that I’m sure most of you are familiar with: if it ain’t broke, don’t fix it. Unfortunately, that’s become something of a rallying cry for enterprises with extensive legacy infrastructure. These ancient systems are loaded with custom applications, and though they aren’t necessarily “broken” in the customary sense, they don’t necessarily work all that well in a modern context, either.
“Sooner or later, most organizations face a problem with legacy systems,” writes Information Technology expert Michael Issaev. “Even if a legacy system was architected properly years ago, it is very stable and it does its job, the language it was developed in might no longer be commonly used, and you might experience problems finding talent to support the system. If your legacy system services external clients, you might face a problem with having to replicate the system for each new client. At some point, the number of instances will become unmanageable, and so you would want to implement a new multi-tenant solution.”
The most prominent part of a web page is the area visible when the page first loads. The importance of the “above the fold” (ATF) area influenced web design and conversion rate optimization strategies for many years, but does designing with ATF in mind matter any more? Will users scroll to see content below the fold? Does it make sense to use sliders and carousels to keep content above the fold? Does Google care about the content at the top of web pages?
Newspapers referred to the area of their front page that was immediately visible to readers as “above the fold,” because broadsheet newspapers were folded on newsstands and only half the page was visible. To maximize sales, the above the fold area was used for the most important and attention-grabbing headlines and content. The concept carried over to the web, where “above the fold” referred to the area that appeared in the browser’s window when a page first loaded.
Developers often depend on third-party platforms for distribution of their products and access to the platform’s users. Developers are also often unwilling to stick to the platform’s rules if they think the rules stop them from creating the experiences they want to give users.
Zerif Lite, a WordPress theme by Themeisle with over 300,000 users, was recently suspended from the WordPress Theme Repository for multiple violations of the repository’s rules. All of those users are now unable to update their theme.
The canonical technique for choosing keywords is to research relevant keywords with high search volumes and low competition. Although that’s an effective technique in many scenarios, it doesn’t always produce the highest conversion rates because it ignores user intent.
In many cases, lower volume keywords — those that are searched for less often — generate less traffic but higher conversions. Smart content marketers use high-volume keywords, but are careful to monitor the performance of content with an eye to discovering the long-tail keywords and phrases that actually convert.
Articles about Flash usually start by bemoaning the negative impact it has had on the web. I don’t want to do that because, in its day, Flash wasn’t so bad. If you remember the early web, with its blinking headers and tiny images, you’ll remember how impressive Flash seemed as a technology. The jump from plain-old website to interactive games was awesome. I know plenty of developers who love Flash, and who built a career on it. And I know many web users who have spent thousands of happy hours playing games and watching videos powered by Flash.
Many business owners aren’t clear on the difference between a web developer and a web designer. They know they need someone to create a website for them. But, without understanding the roles of web professionals, business owners have trouble hiring the right person and communicating their needs effectively. From the perspective of web professionals, clients who don’t understand what to expect can be hard work.
The divisions we’re about to discuss are somewhat arbitrary. Web professionals are capable of wearing different hats. Some designers are happy to offer front-end development services, for example. Nevertheless, an understanding of what the terms web designer and web developer mean will help business owners get the site they want with less stress, confusion, and expense.
Google’s Accelerated Mobile Pages project, which aims to improve the performance of the mobile web, is set to become available for a wider range of search results in the coming months. Currently, mobile searchers are able to access AMPed pages via the “Top Stories” section of Google’s search results. Google is currently testing a wider roll-out, which should see more AMP pages included in regular search results.
There are two options for web hosting clients who need more resources than a shared hosting account can offer: a Dedicated Server or a Virtual Private Server (VPS). Let’s start by taking a look at how Dedicated Servers and Virtual Private Servers are similar, before investigating the differences that will help you choose.
What is a Server?
Both Dedicated Servers, often called bare metal servers, and Virtual Private Servers offer a full server environment. A server is essentially a computer like your laptop, except that servers are specialized for tasks like hosting websites or web applications. Both Dedicated and Virtual Private Servers have a set of resources and an operating system. Both allow a user to install software of their choice.
You can think of these hosting options as self-contained hosting environments, in contrast to shared hosting, where lots of hosting accounts are served from one computer with a single operating system, each with a share of the resources of that computer. As you might expect, Dedicated Servers and VPSes typically have more resources — RAM, storage, bandwidth — available to them than shared hosting accounts. They can support more websites, and websites with more visitors.
Differences Between VPS and Bare Metal Servers
Our data center is full of racks of computers. Each of these machines contains higher-quality (less prone to failure) and more powerful processors, memory, and storage than your computer at home. A Dedicated Server is one of these computers, which is why it’s also called a bare metal server. An operating system runs on the “bare metal” and your services and applications run on the operating system. Dedicated Servers can be as powerful as it’s possible for a single computer to be.
To create virtual servers, we take one of these very powerful Dedicated Servers and install an operating system and a piece of software called a hypervisor. Hypervisors are complex, but in a nutshell they simulate a server with software. Each simulated — or virtual — server has its own operating system and, from the perspective of the user, is identical to any other server.
Because the Dedicated Servers that run the virtual servers are so powerful, each virtual server has access to a considerable chunk of resources. A VPS can’t be as powerful as the most powerful Dedicated Server, but it can be more than powerful enough to run relatively high-traffic websites and applications, with the added benefit that VPSes are less expensive because they use the resources of a Dedicated Server more efficiently.
How to Choose Between a VPS and Dedicated Server
To choose between a Dedicated Server and a Virtual Private Server, you should think about the resources your application needs. If you want all the power that a bare metal server can deliver — to run a high-traffic eCommerce store or a publishing operation that depends on the lowest possible latencies — then choose a Dedicated Server.
If your needs are more modest but still substantial — you want to host several sites or stores, or a single popular site, for example — then a Virtual Private Server might be right for you.
Over the last couple of years, startup and business website design settled around a few recognizable themes: large hero images with contrasting text, a vertical layout with large icons and images and concise text, navigation in a banner across the top which may or may not be sticky, and so on. We all recognize this design — Airbnb might be considered the paradigmatic example.
Some in the web design community have a problem with this sort of design. It lacks imagination, it’s not radical, and for a designer who loves a challenge, implementing such a site is frankly boring.
In the days of yore, everything that appeared on a web page was stored on a server or generated by a process running on a server. Web browsers received and rendered HTML pages that were more-or-less complete in themselves. From a search engine crawler’s perspective, that was all good, because it could just take the HTML and index the content.
If you’re an ordinary web user, you might wonder why so many websites are intent on annoying you with modal popups and interstitial ads. The answer is simple: they work. A signup box in a modal popup will collect more email addresses than a sign-up box sitting in a sidebar. Interstitial ads — those full-page adverts on sites like Forbes — get more clicks than banner ads.
They work, but they also create a terrible user experience — particularly on mobile. When a user clicks on a link, they want the content, not a “unique opportunity to sign up” for yet another self-serving newsletter. Even though I understand why websites use these attention-grabbing techniques, as a writer I avoid linking to sites that use them because I don’t want to annoy the people who read this blog.
One of the most off-putting aspects of building a software-as-a-service platform is the yak shaving. If you’re not familiar with the concept, yak shaving is a term coined by developer Carlin Vieri. To shave yaks is to spend time on tasks that are seemingly pointless but that contribute to a greater goal. Let’s say you have a great idea for an application and you think you’re a good enough coder to build it. You build a modest prototype, and it works great — time to start marketing and attracting users! Unfortunately, it’s actually time to start shaving yaks. Before you let users anywhere near your application, you need a way to charge them and to manage their accounts. That means developing a payment system and an authentication system — neither of which are trivial tasks.
System UI fonts were once considered the safe and boring choice by designers, but in the years since webfonts solved the font availability problem, system UI fonts have improved enormously.
Anyone who has been involved in web design for more than a few years remembers the tyranny of web-safe fonts. Designers were forced to choose from a limited pool of — usually terrible — fonts that could be relied on to be present on everyone’s computer, which at that point meant Windows or Mac. Many of these “safe” fonts weren’t optimized for screen display, and all of them were so common it was impossible to do anything interesting with web typography. But even way back when, choosing the system UI font — the font used by the operating system itself — was frowned upon.
A side project is a great way to learn something new and expand your horizons. It may be a first step towards a new business or career. Even if you already run a business or have a career you love, a successful side project can strengthen your reputation as a proactive and creative maker. Side projects have been hailed as the new resume.
Unsplash, a hugely popular free image service, started out as a side project by design community Crew. The company makes no money from Unsplash directly, but its core business receives millions of visits from potential users every month because of Unsplash. Side project marketing is being called the best thing since content marketing — you don’t just write something useful, you build something useful.
Zend Framework is a powerful PHP framework for building complex web applications in PHP. It’s an open source framework sponsored by Zend Technologies, a leading light of the PHP world which is also responsible for numerous other PHP-related products, including Zend Studio, and is a prime mover in the development of the Zend engine, which underlies PHP itself.
Famed for the comprehensive range of components it provides, the Zend Framework gives developers the tools they need to quickly build complex web applications of any size.
It won’t be news to readers of this blog that Distributed Denial of Service attacks are a growing problem. This July, a European media company was the victim of an attack that peaked at 363 Gbps. The volume of the attack is par for the course these days, but it is interesting to note that the attackers used several vectors to amplify the attack, including DNSSEC.
For those who aren’t familiar, here’s how a typical reflected amplified DDoS attack works. Even the most well-equipped of attackers don’t have access to the amount of bandwidth we commonly see deployed in DDoS attacks. To achieve such huge volumes of data, they need to amplify their bandwidth. There are many ways to do this, but a typical approach is to use open DNS servers.
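The power of this technique comes down to simple arithmetic: a small spoofed query elicits a much larger response, which the open DNS server "reflects" at the victim. The sketch below illustrates that arithmetic; the byte sizes are rough, hypothetical figures for a DNSSEC-signed response, not measurements of any particular server.

```python
# Illustrative arithmetic only: the byte sizes below are rough,
# hypothetical figures, not measurements of any specific server.

def amplification_factor(query_bytes, response_bytes):
    """Ratio of the reflected response size to the attacker's spoofed query."""
    return response_bytes / query_bytes

def reflected_volume_gbps(attacker_gbps, factor):
    """Traffic volume hitting the victim for a given attacker uplink."""
    return attacker_gbps * factor

# A small DNS "ANY" query can be ~60 bytes, while a DNSSEC-signed
# response can exceed 3,000 bytes -- roughly 50x amplification.
factor = amplification_factor(60, 3000)
print(f"amplification: {factor:.0f}x")
print(f"1 Gbps of spoofed queries -> {reflected_volume_gbps(1, factor):.0f} Gbps at the victim")
```

With numbers like these, an attacker controlling only modest bandwidth can generate the hundreds of gigabits per second we see in headline attacks.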
Twitter is far from the largest social media network, but it gets a lot of attention relative to the size of its user base because of who those users are. The vast majority of online content writers, journalists, trend-setters, and social media influencers are Twitter users. Twitter presents an opportunity for brands to increase their engagement with influencers, and hopefully increase awareness of the brand.
Engage is a new app from Twitter that helps do just that. It’s being touted as an app for celebrities, but unlike Facebook Mentions, Engage is available to everyone (with an iPhone, for the moment), and it includes useful tools that small and medium business social media teams, and anyone else who relies on online exposure, can leverage.
In the modern hosting world, there are three basic choices if you want access to a complete server environment: virtual private servers, cloud servers, and dedicated servers. Each has benefits and potential drawbacks, and it’s important that application developers try to see through the marketing hype to the real-world capabilities of each platform.
It might seem as though the default choice these days is a cloud server — but, while the cloud does offer clear advantages in many scenarios, it’s often not the optimal choice. I’d like to take a look at a few of the reasons many application developers choose dedicated hardware instead of opting for a cloud server.
New web hosting clients are often confused by the way the domain name system and the domain registration system work. More specifically, they’re confused that there is a difference between registering a domain name and actually linking it up with their site via a DNS hosting service.
I’d like to take a look at the three services that work together to ensure that when a user puts a web address into their browser, the appropriate web site appears. Those three services are web hosting, the domain registration system, and DNS, the Domain Name System.
Server administrators will, at some point, have to edit text files on their server. Most Linux application configurations are handled via text files, and making configuration changes is done by changing the content of those files. The naive way to edit files on your server would be to FTP into the server, download the file to your local machine, edit it there, and then upload it to the server, overwriting the original. If you edit a file once in a blue moon, that’s just about justifiable, but if you regularly edit files, it pays to familiarize yourself with the command-line editing tools you’ll find on any Linux server.
Last year, the Federal Communications Commission adopted Open Internet rules that are intended to protect the web from practices like double-charging for bandwidth use and throttling bandwidth for users who decline to pay an extra toll.
A number of organizations — most prominently ISPs — oppose these net neutrality rules, and have launched a series of court cases designed to see the FCC’s decision overturned. In one of these cases, a Court of Appeals has upheld the FCC’s ruling, and thereby the principle of an open internet. It’s likely the ISPs will take the fight all the way to the Supreme Court, but the recent ruling is at least an indication that net neutrality has legal staying power.
Apple recently released the third update for Safari Technology Preview, a browser designed exclusively with web developers in mind. The release mostly contained a ton of bugfixes – nothing really noteworthy from a feature standpoint. Still, it’s great that they have a regular update cycle.
Only problem is, it’s not really regular compared to the competition.
Most medium and small business websites are hosted on the Linux operating system. Linux is a (usually) free platform composed of open source software — including the Linux kernel and GNU tools — created by thousands of development projects over many years. In the modern web hosting world, building hosting plans on a Linux distribution like CentOS is almost the default choice.
But that doesn’t mean there aren’t other options, including Microsoft Windows. Windows is better known as a desktop operating system, but Microsoft also develops an excellent suite of tools that include a server version of their operating system and a web server.
Your site’s files are stored on a server in a data center. When a browser requests a page, it connects to the web server — usually via lots of other machines like routers and switches — which will send the browser the necessary files. If the browser is close to the server, it won’t take long for the files to traverse the internet. If the browser is on a computer on the other side of the world — or even just the other side of the country — it might take a relatively long time. The browser has to send requests to the server, and the server has to send data to the browser, a process repeated many times for every page request.
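The cost of distance is easy to estimate with back-of-the-envelope math: if each request/response pair costs at least one round trip, total network time grows linearly with round-trip time. The numbers below are hypothetical examples, not measurements.

```python
def page_load_network_time(rtt_ms, round_trips):
    """Rough lower bound on network time for a page load:
    each request/response pair costs at least one round trip."""
    return rtt_ms * round_trips

# Hypothetical figures: a browser near the server vs. one across
# the world, assuming ~20 sequential round trips for a typical page.
near = page_load_network_time(10, 20)    # 10 ms RTT to a nearby server
far = page_load_network_time(150, 20)    # 150 ms RTT to a distant server
print(f"nearby server: {near} ms of network latency")
print(f"distant server: {far} ms of network latency")
```

The same page that feels instant from across town can spend several seconds on the wire from the other side of the world, which is exactly the problem content delivery networks exist to solve.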
As a content management system, WordPress seems all-conquering. It is by far the most popular CMS on the web, with 26 percent of CMS-based sites using WordPress. That seems like an impressive number, but the remaining 74 percent leaves plenty of room for competition.
As web development technology advances, new content management systems are developed, so it pays to keep an eye on the state of the CMS market. That’s especially true if you’re a developer who builds sites for clients. WordPress is a phenomenon, but it’s not suitable for all scenarios.
According to The Register’s Trevor Pott, Windows 10 is a resounding “MEH” from a sysadmin perspective. To be fair, Pott likely wrote his blog before Microsoft began aggressively pushing the operating system onto its users, resorting to everything from incessant nagging to outright deceit. If you think Microsoft’s behaving like a purveyor of malware or adware, you aren’t alone in that.
“It appears as if Microsoft designed the Windows 10 upgrade mechanisms in a way that makes it very complicated for users to block the upgrade offer for good on machines running previous versions of Windows,” writes Martin Brinkmann of Ghacks. “This persistence is similar to how malware evolves constantly to avoid detection or come back after it has been removed from operating systems.”
If your business sells something, there’s a strong chance that the people you want to sell to have a Facebook account. As of the beginning of 2016, Facebook’s active user count was over 1.6 billion people. That’s a staggering number of users from every demographic and almost every location on Earth. For some people, Facebook is the internet.
Facebook offers a diversified set of promotional and advertising products. In this article, I’d like to focus on three of the tools Facebook offers for getting content and advertising in front of a larger audience.
GitHub has introduced new pricing policies that are great news for some developers, and very bad news for others, particularly web development agencies and projects that have a lot of collaborators.
GitHub has always been free to developers who are happy to use public repositories, a policy that has seen GitHub become the de facto standard for open source version control. If you needed a private repository, you paid a relatively small monthly subscription, for which you got several private repositories. If you wanted more private repositories, you paid a little more.
The open source WebKit project has announced that it intends to replace vendor prefixes for experimental CSS properties with a runtime flag. Developers rely on vendor prefixes to experiment with — and often to use in production — CSS features that have yet to be finalized.
Vendor prefixes were introduced because the web platform needs both stability and the ability to evolve. Standards bodies are constantly working on new CSS features, and those features need to be tested by browser developers and web developers. As the standards develop, different browsers implement them at different times with different interpretations of the specification. Prefixes allow browser developers to implement experimental features without exposing web developers to a constantly changing set of standards.
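In practice, shipping a prefixed feature means writing the same declaration several times, prefixed forms first and the standard form last so it wins once the browser supports it. Tools automate this; the toy sketch below mimics the idea. The property-to-prefix map is illustrative only, not an authoritative record of real browser support.

```python
# A toy sketch of what prefixing tools automate. The property list
# below is illustrative, not an authoritative map of browser support.
VENDOR_PREFIXES = {
    "transform": ["-webkit-", "-moz-", "-ms-"],
    "user-select": ["-webkit-", "-moz-", "-ms-"],
}

def prefixed_declarations(prop, value):
    """Emit prefixed declarations first and the unprefixed one last,
    so the standard property takes over once the browser supports it."""
    prefixes = VENDOR_PREFIXES.get(prop, [])
    return [f"{p}{prop}: {value};" for p in prefixes] + [f"{prop}: {value};"]

for line in prefixed_declarations("transform", "rotate(45deg)"):
    print(line)
```

The trouble WebKit is reacting to is visible right in this sketch: the prefixed declarations end up in production stylesheets and linger long after the standard property ships.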
I’ve written about URL shorteners on this blog before. I’m not a fan. While I recognize that short URLs are great for branding and sharing, they’re also fragile and prone to link rot. A recent study from Martin Georgiev and Vitaly Shmatikov reveals yet another reason to be careful with short URLs, especially when they’re used as a quick and easy way to share information from web applications and the cloud. Short URLs can create security and privacy issues.
The vast majority of web hosting companies base their hosting plans on CentOS, which is essentially a free clone of Red Hat’s hugely successful Red Hat Enterprise Linux. Web hosts choose CentOS because it provides a stable and secure platform. CentOS is a conservative distribution, with major releases happening only once every few years.
CentOS is a stable foundation for web and application hosting, and much of that stability is the result of the glacially slow — or sensibly cautious, depending on your perspective — rate at which new software versions are incorporated into the distribution’s repositories.
Life is not easy for smaller eCommerce retailers. The eCommerce market is dominated by giant platforms with global reach, buying superpowers, and bottomless product catalogues. Product manufacturers often retail their products through Amazon or eBay, making it difficult for smaller retailers to turn a profit by buying from those manufacturers and selling with a reasonable markup. Small eCommerce retailers have promotional budgets dwarfed by those of eCommerce behemoths. They have fewer promotional opportunities, can’t use loss leaders as widely as big retailers, and have fewer chances for successful cross-selling and upselling.
For many small and medium business website owners, ranking locally is more important than achieving a good ranking in the general SERPs. The majority of searches with transactional intent are local and mobile. You’re losing custom if your business doesn’t appear in a prominent position in the search engine results when a potential customer picks up their smartphone and enters a query that relates to your business.
In this article, I’m going to assume that your site’s search optimization is in good condition: that you understand the value of high-quality content, that the pages of your site are optimized for relevant keywords, and that the site itself is optimized for performance and crawlability. If you’ve got all your ducks in a row, but your site still doesn’t make a good showing in local results, here are three things you can do to improve local visibility.
Even for developers with a modicum of PHP experience, developing for WordPress can be tricky. GenerateWP is a useful tool that makes it easier to create snippets of PHP code that can be added to WordPress plugins, to the functions.php file, and anywhere else that you might want to make a tweak to add functionality to your WordPress site.
GenerateWP is a web app — users choose the type of functionality they want to create code for, fill in a few options, and the app will spit out a code snippet that can be pasted into a plugin or theme file. Among the snippets that can be generated are code for custom post types, menus, and shortcodes, but it’s the newly added ability to generate code that uses WordPress hooks that interests me the most.
Earlier this month, Apple announced the introduction of the Safari Technology Preview, a bleeding-edge version of Apple’s browser that will give developers an opportunity to check out technology to be included in future releases of the browser.
Safari is a hugely popular web browser, with many millions of users across Apple’s OS X and iOS platforms. Safari is based on the WebKit layout engine, an open source software project that originated when Apple forked the KHTML project. Until recently, WebKit also underpinned Google’s Chrome browser, but the search giant forked the project and now uses its own WebKit-derived layout engine, Blink.
A couple of weeks ago, we wrote about why static site generators aren’t the next big thing. One of my reasons for thinking so is that the majority of web publishers want an integrated content management and publishing system, which is exactly what a CMS like WordPress or Joomla! provides. But I also mentioned that some publishers see the virtue in completely separating the content management system from the front-end.
If you ask five different Linux developers which distribution is best for server management, you’re liable to receive five different answers. Because it’s open-source, Linux has one of the most diverse development ecosystems of any operating system on the market. Anyone with the necessary skills and time can code their own distribution (and many do).
From the perspective of a sysadmin, that means that your choices are effectively limitless when it comes to choosing a hosting distro.
That isn’t to say you can just pick any old distribution and spin up your server, of course (that’s something I wouldn’t really recommend). It just means you’ve got plenty of options. On the one hand, that’s pretty great; more choices means you’re that much more likely to find something that’s a perfect fit for you.
Web applications are becoming increasingly complex, and as HTML is asked to accomplish user interface miracles it was never designed for, the language that structures the web’s documents is showing its rough edges. HTML itself has a limited number of native UI elements like tables. Creating new ones often involves copy-pasting huge chunks of code or using complex frameworks, the guts of which most developers have little understanding of.
In the mid-nineties, when the web was young, Sun introduced a technology that brought some of the power of desktop applications into the browser. We all remember waiting for Java applets to load so that we could play a game or join a chat. We also all remember the horrendous security record of that technology — a legacy we’re still dealing with today.
The advent of the web was a mixed blessing for publishers. On the one hand, the web gives publishers access to a larger audience than they could have dreamed of a couple of decades ago. On the other hand, the deluge of content and the expectation that content will be free has practically destroyed the economy on which publishers rely. The result: publishers desperately searching for revenue opportunities and throwing every potential money-making strategy at consumers who are becoming increasingly tired of the bloated slow-to-load pages they’re expected to pay to download.
Content is key to successful marketing on the web. The web is a content-based network, and although traditional advertising has been hugely successful on the web, it’s content marketing that has proven itself to be the most web-like way of building an audience, generating engagement, and increasing reach.
However, that doesn’t mean all content publishing for marketing purposes is content marketing. If we exclude advertising, social media marketing, and email, the latter two of which it could be argued aren’t really “web”, we’re left with two broad swaths of content strategy applicable to online marketing: content marketing and native advertising.
It’s widely acknowledged that offering HTTPS connections on sites of all different types is a good thing for security and privacy. Encrypted connections prevent eavesdropping, man-in-the-middle attacks, and the altering of data traveling over the connection. However, owners of some types of site — although they may acknowledge the theoretical benefit — think the negatives outweigh the positives. They worry about the cost and complexity of implementing SSL / TLS, the difficulty of managing certificates, and I’ve quite often heard site owners complaining about the potential performance impact of establishing SSL / TLS connections.
As the web became the most successful publishing and reading platform in history, it brought with it a catastrophic decline in the richness and sophistication of one of the pillars of content design. Typography was laid low. Designers had a mere handful of web safe typefaces to choose from. The vast and rich resources available to pre-web typographers were not available to their modern cousins.
Towards the end of last year, Mathias Biilmann Christensen published an excellent article in Smashing Magazine entitled Why Static Website Generators Are The Next Big Thing. It’s well worth reading, and I agree with most of the points Christensen makes. I’m a fan of static site generators and I’ve written about them many times on this very blog, but — being in a contrarian mood — I want to argue against the basic premise. I’m not so sure static site generators are the next big thing.
In a recent issue of Smashing Magazine’s Web Development Reading List, Anselm Hannemann laments the ignorance of web development fundamentals in developers applying for front-end dev positions. According to Hannemann, most of the applicants didn’t have a clear idea what a clearfix is used for, what ARIA roles are, or the basics of HTML and CSS. They did, however, know high-level frameworks like React and Angular.
As the size of web pages trends ever upward, images are a leading culprit. Modern web sites are stuffed with large full bleed images. Image sliders the width of browser windows are de rigueur. If your blog post doesn’t have a massive featured image, it looks unfinished.
I’m all in favor of the way web designers and bloggers are exploiting the visual capabilities of the web, but I’m not at all happy about having to download multi-megabyte images on my iPhone because they haven’t been properly optimized.
To state the obvious: WordPress is hugely popular. In fact, it’s so popular that it’s become the default choice for bloggers and small businesses launching a website. WordPress’ popularity is well-deserved — WordPress pulls off the tricky feat of being both relatively easy to use and enormously powerful. Its huge ecosystem of plugins and themes means that, with little effort, we can build websites that can do almost anything we might want them to. And, if we can’t, there are thousands of WordPress professionals who can. In short, WordPress is a great content management system.
It’s no secret that IPv4 addresses are in short supply. The number of connected devices has exploded beyond anything the creators of the protocol imagined. IPv6 is not supported widely enough that we can do without IPv4 addresses, which means that web hosting companies have to be careful how many IP addresses they give out to clients.
SNI or Server Name Indication helps us to preserve the stock of IPv4 addresses for cases where they’re really needed.
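The trick behind SNI is that the client names the host it wants inside the TLS handshake itself, so a server on a single IP address can pick the right certificate before encryption begins. The sketch below models that dispatch step; the hostnames and certificate labels are invented for illustration.

```python
# A simplified model of what SNI enables: one IP address serving
# many HTTPS sites. The client sends the hostname in the handshake,
# and the server selects the matching certificate. Hostnames and
# certificate labels here are made up for illustration.
CERTS_BY_HOSTNAME = {
    "shop.example.com": "cert-for-shop",
    "blog.example.com": "cert-for-blog",
}

def select_certificate(sni_hostname, default="cert-default"):
    """Choose a certificate based on the SNI hostname the client sent."""
    return CERTS_BY_HOSTNAME.get(sni_hostname, default)

print(select_certificate("blog.example.com"))     # cert-for-blog
print(select_certificate("unknown.example.com"))  # cert-default
```

Before SNI, that selection was impossible — the server had to commit to one certificate per IP address, which meant one HTTPS site per IPv4 address.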
Most startups are hungry for the oxygen of publicity. The penetration of social media, and especially of Facebook and Twitter, has reduced the reliance of business on traditional news media and bloggers, but a positive write-up on a popular site or blog can turn a trickle of social media referrals into an avalanche of new users.
The media still matters, which is why businesses should make it as easy as possible for writers to find the information they need to build a story.
Static site generators have taken off in a big way over the last couple of years. They’ve moved from the realm of the uber-developer to the almost mainstream, and — even for an SSG geek like me — it’s become difficult to keep track of all the new SSGs arriving on GitHub; there are thousands of them. Most SSGs tackle a basic problem in more or less the same way, albeit with tweaks to the process, a different functional emphasis, or a different programming language. Most aren’t worth a blog article, but Lektor, a new Python static site generator from Armin Ronacher — creator of the Flask web framework and many other things — solves one of the big problems with static site generators: they’re almost unusable for the average writer or editor.
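That "basic problem" every SSG solves is the same: take source content, pour it into templates, and write out plain HTML files. Reduced to its essence, it looks something like the sketch below — the template and page data are invented, and real generators like Lektor or Jekyll layer Markdown parsing, asset pipelines, and much more on top.

```python
# A static site generator reduced to its essence: combine content
# with a template and emit HTML. Template and pages are invented
# examples, not any real generator's format.
TEMPLATE = "<html><head><title>{title}</title></head><body>{body}</body></html>"

def render_page(title, body):
    """Pour one page's content into the site template."""
    return TEMPLATE.format(title=title, body=body)

def build_site(pages):
    """pages: {output_filename: (title, body)} -> {filename: html}"""
    return {name: render_page(title, body) for name, (title, body) in pages.items()}

site = build_site({
    "index.html": ("Home", "<p>Welcome.</p>"),
    "about.html": ("About", "<p>A side project.</p>"),
})
print(site["index.html"])
```

Everything happens at build time, which is why the output can be served from dumb, fast, cheap hosting — and also why non-technical editors, who have no build step on their machines, have historically been left out.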
I’ve written branded content for the web for many years. When I first started, the process looked something like this: develop a list of keywords based on the specific business; research those keywords in the Google Keyword Tool or similar, looking for high-value keyword combinations; create content with as many of those keyword variations as possible within the text. Ideally, the content would be good, but the quality of the writing and information was far less important than the presence and distribution of keywords. Once that was done, we’d move onto the next collection of keywords.
If you’re new to the world of small-business websites, you might assume that the space is full of well-meaning honest professionals. For the most part, you’d be right. The vast majority of web developers, designers, search engine specialists, and web hosts are trustworthy, but there are bad apples that small business owners should be wary of.
Consider this scenario — one that I’ve encountered many times. A small business owner decides to launch a new website to publicize her business. She doesn’t have a clue how web design or web hosting works, so she hires a “web master” to take care of the whole thing. He registers a domain for her business. He builds a good-looking WordPress site with a premium theme and the copy our business owner supplies.
It’s been said that software is the most complicated thing human beings have ever created. I’m not sure if an operating system is more complex than the (software designed) processor on which it runs, but I do know that modern operating systems are so complex that no single human could hope to hold every part in their mind. The same is true of other software and software ecosystems.
WordPress is not as complex as an operating system like Linux, but it lies at the center of an ecosystem that comprises thousands of plugins and themes, each of which adds code to a WordPress installation. That code communicates with WordPress via well-understood APIs, but the ecosystem of plugins, themes, and WordPress Core creates a combinatorial explosion of possible configurations. And, although developers try to foresee as many of the potential issues as possible, it’s all but impossible to guarantee that no combination will cause an error that will prevent a site from functioning properly.
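The "combinatorial explosion" is literal: even if each plugin can only be active or inactive, the number of possible configurations doubles with every plugin installed. A quick sketch of the arithmetic:

```python
def plugin_configurations(n_plugins):
    """Each plugin is either active or inactive, so n plugins yield
    2**n possible configurations -- before even counting the different
    versions of each plugin, theme, and WordPress Core in the wild."""
    return 2 ** n_plugins

# Even a modest site with 20 plugins has over a million
# active/inactive combinations; no developer can test them all.
print(plugin_configurations(20))  # 1048576
```

And that understates the problem, since every plugin also exists in multiple versions against multiple versions of Core — which is why conflicts slip through even careful testing.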
Many bloggers and small business site owners opt for shared hosting when they first create a new website. Shared hosting tends to be inexpensive, and it often requires little in the way of technical experience to get a site up and running. But once a site generates more traffic, and once the site owners become confident managing their site, it’s time to consider moving to a more capable form of hosting.
First, I want to take a look at why shared hosting is not a good option for moderate to high traffic sites, and then I’ll take a look at what the alternatives are.
Most small business owners aren’t all that interested in the war between flat-design fans and lovers of skeuomorphism, or in the author’s favorite web framework. Instead, they want to know about the technological trends that have the potential to impact the bottom-line performance of their site.
To that end, I’m going to look at three key developments in the web technology world that were bubbling under the radar in 2015, but that are likely to make a real difference to small business owners in 2016.
Over the last few years grid-based design has taken off in a big way in the web design world. There are hundreds of different CSS and SASS frameworks to choose from, many of which are functionally indistinguishable from each other, and a small portion of which tackle the problem in novel ways. Most grids are concerned with splitting pages into columns and rows, into which elements are organized. Grids of this sort are great, but they tackle a different problem to the one we’re looking at today; they’re more concerned with layout than vertical rhythm.
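The arithmetic behind vertical rhythm is simple: pick one baseline unit, and make every text block's line height (and spacing) a whole multiple of it, so lines of different sizes still land on the same invisible grid. A sketch of that calculation, with an arbitrary example baseline:

```python
# The arithmetic behind vertical rhythm: every line height is a
# whole multiple of one baseline unit. The 24px baseline and the
# font sizes below are arbitrary examples.
BASELINE = 24  # px: the rhythm unit, typically the body line-height

def rhythm_line_height(font_size_px, baseline=BASELINE):
    """Smallest multiple of the baseline that fits the font size."""
    multiples = -(-font_size_px // baseline)  # ceiling division
    return multiples * baseline

for size in (16, 21, 32):
    print(f"{size}px text -> line-height: {rhythm_line_height(size)}px")
```

Body text and small headings share one 24px line, larger headings take two, and everything stays aligned to the same baseline grid down the page.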
The web relies on any number of technologies, from the servers that host websites to the routers that direct endless streams of information around global networks, but if I had to name one core technology specific to the web that gives it its unique character and capability, it would be the humble hyperlink. Hyperlinks are the sine qua non of hypertext, and hypertext documents constitute the bulk of the web. The use of “hyper” seems outmoded in 2016, but the technology it denotes and the idea behind it have had an immense impact on our society.
I’m a big fan of the Gulp task runner. I can’t imagine doing any serious web development work without Gulp in the background doing all the jobs that used to take time away from what I really want to focus on — coding and design.
Over the past few months, I’ve tried out dozens of Gulp plugins. Some haven’t found a regular place in my development workflows. Some have a specific role according to the project I am working on. And some have become so essential that I install them in every project.
Twenty-five percent of the world’s websites are built on WordPress, and a majority of the others aren’t based on a content management system at all. It’s no surprise that WordPress gets by far the most attention and seems almost to have become the default option for many projects. The success of WordPress is to be admired, but it’s far from the only content management system out there. In this article, I’d like to take a look at Craft, a PHP-based content management system that differs in some key ways from WordPress, and is, for some use cases, a superior option.
If you were a web designer in the mid-to-late 2000s, you’ll remember the problems inherent in creating a site with great typography. Suffice to say that since the advent of web fonts and web font hosting services, we’ve had it good. Every web designer and developer has a choice of tens of thousands of typefaces — many of them free.
Fontdeck was one of the first and most popular of the font hosting services. It gathered together the best fonts from a diverse range of foundries and made them available for a price within reach of most projects. Sadly, Fontdeck recently announced that it will not allow the creation of new accounts from 1st December 2015, and that it will close altogether on 1st December 2016.
PHP developers have any number of options when choosing a development environment. From command-line editors like Vim, via richly featured code editors like Sublime Text or Atom, to IDEs like the Zend IDE and PhpStorm. It’s the latter I’d like to take a look at in this article. PhpStorm is a full-featured IDE that has become increasingly popular over the last couple of years, largely because developers have been looking for an alternative to the sluggish performance of the most popular PHP IDEs. Earlier this month, PhpStorm hit version 10, and with it came a boatload of new features to complement its existing feature-set.
Responsive design is one of the best things that ever happened to web design. With responsive designs we can use one codebase to support everything from small inexpensive Android phones to 5K Retina iMacs. Not only is that great for mobile web users, who get usable sites across all their devices (assuming responsive design is implemented properly). It’s also great for developers and the people who pay them — no more mobile-specific sites.
Responsive design makes the web more usable, more flexible, and more accessible for users, and it makes building websites that support all devices more efficient, more cost-effective, and more enjoyable for developers.
Last month, President Barack Obama met with Chinese President Xi Jinping to discuss the strained relations between their two countries – primarily the prevalence of digital espionage.
As you may recall, the two nations have a long history of being at one another’s throat in the digital realm. Neither is entirely blameless in this, of course – while China has made a habit of targeting American businesses with aggressive hacking campaigns, digital espionage in the States is nothing to sneeze at, either. It’s hoped by everyone involved that the pact signed between Obama and Xi will serve to warm relations between the two countries, and make aggressive digital attacks a thing of the past.
Ever since social media marketing became a thing, its practitioners have cast about for a measurement that reflects the value of what they do. Follower counts (along with shares) seem an obvious candidate.
A follower represents someone who has expressed an interest in a brand and its content. The more people who are interested in what a brand shares, the more chances to convert leads into buyers. Follower counts also have a snowball effect. As an account’s follower or friend count increases, so does the likelihood that content will be shared, which will attract even more followers. And they have a reputational effect; a brand with 100,000 followers is clearly more awesome than a brand with 10,000 followers, and we can look with pity on the businesses with sub-1000 follower counts — clearly they won’t be around for long.
That’s an exaggeration. Calypso is a web app (and a Mac app) that does many of the same things as the existing WordPress admin interface, but it doesn’t replace that interface; at least not on self-hosted sites.
I’m a keen amateur photographer. My workflow looks something like this: go to a nice place and take lots of photos, come home, import them into Lightroom, fiddle about with the sliders for a few of the images, export a couple that look okay, upload them to Facebook, and then forget about the rest. Ideally I’d like to upload compelling photo-essays of my images, but that seems like too much hard work, so I’ve never bothered. Exposé, a new tool from developer Jack Qiao, might just change that.
CSS has numerous limitations. You can build any website it’s possible to imagine in CSS (with the addition of various other technologies), but CSS doesn’t make life easy for developers. It’s too easy to make mistakes, to build massively complex style sheets that are a nightmare to maintain, and to end up with a mess that no designer can properly get their head around.
Tools like SASS were created to mitigate some of the problems with CSS. They add features like mixins, variables, and mathematical notation that make it easier to write elegant and maintainable stylesheets which can then be processed into valid CSS browsers can understand.
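The core idea is a compile step: the preprocessor substitutes variables and expands mixins before the browser ever sees the stylesheet. The toy sketch below models just the variable-substitution part; the `$name` syntax echoes SASS, but the rules and variable names are invented examples, not real SASS semantics.

```python
# A toy model of one thing a preprocessor like SASS does: substitute
# named variables into the stylesheet at compile time. The rules and
# variable names below are invented examples.
def compile_styles(source, variables):
    """Replace $name tokens with their values, longest names first,
    so $brand-dark is not clobbered by a partial match on $brand."""
    for name in sorted(variables, key=len, reverse=True):
        source = source.replace(f"${name}", variables[name])
    return source

source = "a { color: $brand; } a:hover { color: $brand-dark; }"
css = compile_styles(source, {"brand": "#3366cc", "brand-dark": "#224488"})
print(css)
```

Change the brand color in one place and every rule that uses it updates on the next compile — that single convenience is a large part of why preprocessors caught on.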
One of the reasons I’m fond of static site generators like Jekyll is that they allow developers to use version control systems, such as Git. The development files and the site itself are plain text, so it’s easy to keep a complete record of any changes and to use development workflows like feature branches — not to mention the benefits to collaboration.
Static sites have a limited user-base and they aren’t suitable for non-technical users or complex publishing workflows. That’s where content management systems like WordPress take over. WordPress is easy to use for non-technical publishers and companies, but WordPress makes it difficult to use version control systems.
WordPress and most other popular content management systems store content in a relational database like MySQL. While there are advantages to using a database, it adds a layer of complexity that isn’t necessary for many sites. An alternative is to simply store content, template, and configuration files on the server’s filesystem. Content management systems that take this approach are known as flat-file CMSes. Over the last couple of years, a number of flat-file CMSes have entered the market, most notably Statamic, which now has competition from Grav, a new flat-file CMS from RocketTheme.
Small businesses are usually started by people who are passionate about doing one thing well. Whether it’s making the best cupcakes, taking awesome photos, or building a service that empowers users, we start businesses because we see a market for something that we can do well — perhaps better than anyone else.
Once it’s time to move from idea to execution, small business owners have a problem. It’s not enough to be great at what you do, you also have to let potential customers know you’re great. We turn to web hosting and web design, to social media, and to marketing to get the word out and build a customer base. But there’s the rub: being able to bake delicious cupcakes won’t make you a good marketer.
SSL certificates underpin online security and privacy. Using an up-to-date version of SSL/TLS, they are a practically undefeatable mechanism for ensuring the privacy of data transferred between servers and web browsers. But SSL certificates have another job to do: they are used to verify the identity of domain owners, which needs more than math. It needs, among other components, a group of organizations to validate the identity of domain owners and create certificates — the Certificate Authorities.
Technically, you don’t need a framework. You can easily handle the ins and outs of development on your own. You can create your own libraries, download independent modules, and tweak free-flowing code however you see fit.
The question is, why would you really want to?
“A framework is not absolutely necessary: it’s ‘just’ one of the tools that is available to help you develop better and faster,” reads a post on the Symfony Blog. “Better, because a framework provides you with the certainty that you are developing an application that is in full compliance with business rules, is structured, and both maintainable and upgradeable. And faster, because it allows developers to save time by reusing generic modules so they can focus on other areas.”
There are over seven million legally blind people in the US, including tens of thousands of legally blind children. A significant proportion use screen readers to access the Internet. Screen readers present online content through an audio interface, and they’re an essential line of communication and education for many who would otherwise be unable to use the Web at all.
Screen readers are excellent at translating the visual medium of the web into verbal content, but poor web design practices can degrade the experience of visually impaired users. For the most part, adhering to established web design standards will ensure that your site works well with screen readers, but a handful of best practices can make a big difference for screen reader users.
First things first, why 4.0.0? Most users of Node are using a version that hasn’t yet hit 1.0. I don’t want to get far into the weeds on this issue, but the nutshell explanation is that last year Node was forked into a couple of competing versions: Node.js, which was overseen by Joyent, the project’s longtime corporate steward, and io.js, which was a community project. The fork occurred because the community was unhappy with the pace of development.
The question of whether to invest in the development of a native application has vexed publishers over the last few years. I’ll lay my cards on the table: I’m in favor of the web, but in this article I’d like to take a look at both the pros and the cons of each choice — web or native?
What Are The Benefits Of A Native Application?
The most obvious benefit of native applications is their access to device features that aren’t available to web applications and sites. For publishers, most of those features (the camera and sensors) are irrelevant, but one advantage in particular is important: native apps are faster than the web.
I’m a fan of opinionated design. The most innovative designs tend to be the result of smart designers solving problems in novel ways. The solution often comes at a problem from an angle that hasn’t been tried before. Sometimes it works, sometimes it doesn’t, but that’s true of everything innovative.
I tend to look down on design entirely driven by the numbers. Design dictated by metrics and analytics may be effective, but it can lack daring, flair, innovation, and soul.
A couple of weeks ago, I was talking to a writer friend of mine. It wasn’t long before the conversation turned to tools of the trade. Both of us write for the web, but the tools we use are quite different. I’m a plain text and Markdown fan; she prefers to write in Microsoft Word.
I found her use of a word processor surprising. I haven’t touched a word processor like Word or Pages for years. Almost everything on this blog was written in Markdown in a text editor. Word processors are overly complex, packed with features I don’t need, and — worst of all — they put too much focus on formatting text instead of creating content. The only possible reason I can think of to use a word processor to write web content is if it will be sent to a non-technical user who insists on delivery in Word’s proprietary file format.
In the early years of software development, applications were complex monoliths with logic and UI inextricably entangled. In the late 70s, the designers of the Smalltalk programming language developed a different way to architect applications — one that separates the “business” logic of software from the elements that deliver the interface to the user and accept their instructions.
That principle of software architecture — MVC, or Model-View-Controller — later became the design pattern for the vast majority of web applications and the frameworks on which they are based. PHP frameworks like CodeIgniter, CakePHP, and Laravel are designed to make building MVC-style applications as easy as possible.
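As a rough sketch of those roles (plain JavaScript for brevity rather than PHP, with invented names; frameworks like Laravel express the same separation with classes and template files):

```javascript
// Model: owns the data and the business rules (here: no blank items).
const model = {
  items: [],
  add(item) {
    const trimmed = item.trim();
    if (trimmed) this.items.push(trimmed);
  },
};

// View: turns model state into output; knows nothing about the rules.
const view = {
  render(items) {
    return items.map((text, i) => `${i + 1}. ${text}`).join("\n");
  },
};

// Controller: accepts user input, updates the model, asks the view to render.
const controller = {
  handleInput(text) {
    model.add(text);
    return view.render(model.items);
  },
};

controller.handleInput("write the post");
const page = controller.handleInput("edit the post");
// page: "1. write the post\n2. edit the post"
```

Because each piece knows as little as possible about the others, you can swap the view (HTML instead of plain text, say) or the model's storage without touching the rest.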
Since it was released four years ago, Bootstrap has taken the web design world by storm. So much so that a bare Bootstrap site has come to be a byword for quick and easy — or lazy — design.
Bootstrap’s popularity stems from its easy-to-use grid, but in the modern web design space, you can’t throw a stick without hitting a grid framework. Its popularity endures because of its plethora of extra features, from sane CSS defaults to user interface elements that make it relatively straightforward to build a range of common site layouts.
There are two major groups of performance optimizations that can be implemented on websites (if we set aside network optimizations). The first reduces the weight of a page — compressing HTML files and optimizing images are examples. The second doesn’t change the amount of data that the site sends to browsers. Instead it involves taking control of when elements on the page load to improve users’ perceptions of load times.
Lazy loading is an example of optimizing perceived performance. Modern sites tend to be image heavy — designers favor large full-bleed images that inflate the size of their pages. For long pages, that means a significant amount of image data has to be downloaded before the page becomes usable.
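In the browser, lazy loading is usually wired up with an IntersectionObserver or a scroll handler, but the decision at its core is just a comparison. A minimal sketch in JavaScript (the function name and the 200px margin are illustrative, not taken from any particular library):

```javascript
// The heart of a lazy loader: an image is loaded only once its top
// edge comes within `margin` pixels of the visible viewport. Real
// loaders attach this test to scroll events or, more commonly today,
// delegate it to an IntersectionObserver with a rootMargin.
function shouldLoad(imageTop, scrollY, viewportHeight, margin = 200) {
  return imageTop < scrollY + viewportHeight + margin;
}

shouldLoad(500, 0, 800);    // true:  already on screen
shouldLoad(1200, 0, 800);   // false: more than 200px below the fold
shouldLoad(1200, 300, 800); // true:  scrolled close enough to preload
```

Until the test passes, the page serves a tiny placeholder instead of the full image, which is what keeps long, image-heavy pages usable while they load.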
In the early days of the web, everyone used HTML tables for layouts. At the time, it made sense. Print media has always made heavy use of grid layouts. The HTML table element was a straightforward way to bring some of the power of print media’s grids to web design. As we all know now, using tables for page layout isn’t a good idea, and any designer who has been in the business long enough will remember tables nested within tables nested within tables, all of it muddled up with the page’s content.
When CSS came along, we learned to separate design and content. Unfortunately, CSS’ layout tools were not great — they relied on low-level fiddling with nested divs, absolute positioning, and floats. The box model is not intuitive, and many a designer has lost hair trying to figure out why their divs refused to appear in the right place.
Although the number of people installing ad-blockers on their desktop machines has steadily increased over the last few years, publishers could at least rely on their ads being seen by users of iOS devices. As mobile became the favored platform for web use, it helped offset at least a little of the impact of desktop ad-blocking software. That’s about to change with the introduction of Content Blockers for iOS.
I’m an admirer of Ev Williams. When he founded Blogger, it was an innovative platform — one of the first to bring accessible online publication to non-technical users. I use Twitter — also co-founded by Williams — dozens of times a day. His most recent project, Medium, continues his commitment to make publishing online as easy and elegant an experience as possible.
I’ve banged the better typography drum numerous times on this blog. Typography is the beating heart of web design, and without decent typography, even the best designs will provide a poor user experience. After all, reading is at the core of what we do on the web, and typography is all about improving the reading experience.
In previous articles, I’ve talked about simple ways we can make typography better. In this article, I’m going to look at some of the tools you can integrate into design workflows that make it easier to achieve better typography.
For an experienced WordPress user, managing a WordPress site is a doddle. Most can install a site and a selection of plugins, tweak the default settings to their liking, and load up a theme in no time at all. If you’ve tangled with other content management systems, WordPress is a breath of fresh air. Its interface is logical, clean, and intuitive.
On the other hand, if you’re used to writing in a word processor and the most complex content management system you’ve used is Facebook, you’ll have a different opinion of WordPress’ user friendliness.
As a PHP developer, it can be difficult to decide which framework to use – there is, after all, a rather extensive list of them. Today, we’re going to help you sort through that daunting selection. We’ll be going over some of the most popular PHP development frameworks on the market and taking a look at their strengths and weaknesses.
Once you’ve learned a little about what each framework can do, you’ll be better equipped to determine which is the best option. Now, at this point, there’s one thing worth mentioning – although no two PHP frameworks are created equal, and every framework has certain projects it’s better suited for, the majority can be used for whatever projects you desire. Which you choose is largely a matter of how you code and what you want to accomplish.
This year will be the twentieth anniversary of PHP’s first release. Back in 1995, its creator, Rasmus Lerdorf, could have had no conception of the impact it would have. What was originally called Personal Home Page Tools is now the foundation of much of the modern web. WordPress alone accounts for 24% of all websites. Publishers, eCommerce merchants, and businesses the world over rely on PHP every day. Many thousands of developers make their living from PHP. All of which means that the imminent release of a new major version of PHP is a big deal.
PHP 7, which is currently in beta and is expected to have its final release at the end of this year, brings some significant changes. But first, why PHP 7? The current stable version of PHP is 5.6. There will be no public release of PHP 6, which briefly existed as a development project, but was abandoned before it reached completion. In order to avoid confusion with the aborted version, the new release will skip 6 and go straight to 7.
The number of freelance businesses has increased enormously over the last few years: developers, writers, photographers, designers, customer service operatives, personal assistants, eCommerce retailers — the list of jobs amenable to the freelance business model is long and growing.
Many freelancers find work through sites like oDesk, but once they’ve established a reputation, much of it will come from referrals — do a great job for a client and they’ll refer someone else.
Not so long ago, you could count the number of generic top-level domains on your fingers and toes. The domain names of most sites ended in “.com,” “.net,” “.org” or one of a few other familiar domains. Added to that were the several dozen country-code top-level domains, most of which are localized to a geographic region by search engines; sites with those domains would rank higher in their specific region, all things being equal. And then there were the country-code domains which were treated like ordinary generic domains in search: “.tv,” “.me,” and “.bz” are in this category.
The way these domains were treated in search was well understood, and, in spite of the usual flimflam from the less honorable parts of the SEO industry, few site owners were likely to buy or avoid a domain name with the misconception that it would harm or benefit their site’s performance in the SERPs.
We all know that passwords aren’t a good method of authentication. Complex passwords are hard to remember, simple passwords are next to worthless. And yet, most web developers who log in to production and development servers use SSH with a password.
The dangers are obvious: for even fairly long passwords, which most people don’t use, a brute force attack against an SSH server can prove effective. Passwords are bad at protecting the files that constitute your website, not to mention any sensitive data that might be stored in your databases.
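To put rough numbers on that, compare the search space of a password with the rate at which an attacker can guess. The guess rates below are assumptions for illustration, not measurements:

```javascript
// Back-of-envelope brute-force arithmetic. Assumed rates: ~1e3
// guesses/second for a throttled online attack against an SSH server,
// ~1e10/second for offline cracking hardware.
function searchSpace(alphabetSize, length) {
  let n = 1;
  for (let i = 0; i < length; i++) n *= alphabetSize;
  return n;
}

function yearsToExhaust(space, guessesPerSecond) {
  return space / guessesPerSecond / (60 * 60 * 24 * 365);
}

// An 8-character mixed-case alphanumeric password:
const passwords = searchSpace(62, 8); // 218,340,105,584,896 possibilities
const online = yearsToExhaust(passwords, 1e3);   // thousands of years
const offline = yearsToExhaust(passwords, 1e10); // a fraction of a day
// An Ed25519 key, by contrast, has on the order of 2^252 possibilities,
// a space no conceivable attacker can search.
```

The point of the sketch: a password that survives online throttling still falls in hours to offline hardware, while key-based authentication removes guessing as a strategy altogether.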
Web fonts opened up a new world of typographic possibilities for web designers. No longer did we have to rely on a handful of “web safe” fonts. We could choose from thousands of fonts offered by services like Google Fonts and Typekit. We could experiment with fonts to our hearts’ content, finding the perfect combinations without having to spend big money on licenses. In short, web fonts are awesome.
However, web font services are not without their limitations, one of the most pernicious of which is the dreaded Flash Of Unstyled Text (FOUT). We’ve all experienced this: we load up a web page, and for just a second, we see an ugly fallback font, which is quickly — or not so quickly — replaced by the designers’ chosen font.
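Scripts that mitigate FOUT generally race the font download against a time budget and toggle a class depending on which finishes first. A sketch of that logic, with invented names; in a browser the font promise would come from the CSS Font Loading API (document.fonts.load):

```javascript
// Resolve with "fallback" once the time budget runs out.
function fallbackAfter(ms) {
  return new Promise((resolve) => setTimeout(() => resolve("fallback"), ms));
}

// Race the web font's load against the budget. "custom" means add the
// class that switches to the web font; "fallback" means keep the
// system font so text stays readable. The 3-second default is an
// illustrative choice, not a standard.
async function chooseFont(fontLoaded, budgetMs = 3000) {
  return Promise.race([
    fontLoaded.then(() => "custom"),
    fallbackAfter(budgetMs),
  ]);
}
```

The key property is that text is never invisible: the fallback font renders immediately, and the custom font is only swapped in if it arrives within the budget.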
In this article, I’d like to have a look at what Node.js is for, its major benefits, and why you might want to think about adding yet another tool to your web development kit.
Most readers of this article will have set up SSL/TLS encryption for a website at some point in their career. It goes with the territory for system administrators and site owners. But for the average website owner, the process is fraught with difficulty and opportunities to make a mistake. Let’s Encrypt — which will become available to the public next month — is a new way of adding domain-validated SSL certificates to a site that aims to make it easy for everyone.
It’s reasonable to ask: does everyone really need SSL encryption? It’s obvious that eCommerce sites and sites that handle sensitive information need a way to protect data that travels between server and browser from snoopers. The case for the average blog is somewhat less clear, but with the advent of ISPs that choose to inject their own advertising into blogs, the proliferation of content management systems that require authentication to post, and the eagerness of certain organizations to track what people are reading on the web, there’s a strong argument that all sites should be protected.
Over the last couple of years, web design has homogenized around a small set of tropes — one-page sites, full-bleed images, parallax scrolling, and what’s come to be known as scrolljacking. I’ve loved many of the sites that adopted this set of design trends, but scrolljacking has never been on my list of things to admire about a site’s design.
I’m not going to single out any particular site, but I’m sure you’ve all experienced a site with scrolljacking. You click on a link and are taken to a site that looks beautiful. You scroll down to see more, only to find that your mouse or trackpad is malfunctioning. The page jerks about or scrolls in slow motion. It doesn’t stop moving when you stop scrolling. Sometimes you scroll and everything disappears from the screen; you have to keep scrolling until content reappears. And sometimes, instead of scrolling top to bottom, the whole page lurches sideways!
Material Design Lite is a web framework more suitable for blogs, news sites, and business sites than its app-focused big brother.
Front-end developers have no shortage of frameworks to choose from. Bootstrap is hugely popular, as is Foundation, and sometimes it seems as if every front-end developer with the coding chops has a swing at building their own. But, there’s always room for one more, especially when it comes with the imprimatur of a company like Google.
Google was never known for its design abilities. It was often seen as an engineering-centered company more focused on developing cloud services than maintaining a coherent design language across its web products. All that changed with Material Design, Google’s card-based user interface paradigm which is seen right across the company’s product range from Google Plus to recent versions of Android.
A couple of weeks ago, I wrote about some of the best tools for generating topic ideas for content marketing and blogging. In this article, I’d like to go a little further and talk about how I maintain a writing schedule that produces many thousands of words each month. In the average month I’ll write dozens of blog articles, several press releases, and numerous other pieces of content.
A couple of weeks ago, Facebook published a post on their blog about using author tags on Facebook. It was taken up by many social media and SEO sites as something new and exciting, but, as Joost de Valk points out, Facebook authorship markup has been around for quite some time. Nevertheless, it’s worth taking a new look at what authorship is and the potential benefits to bloggers, especially in the light of Facebook’s Instant Articles (the release of which probably prompted the authorship post in the first place).
I’m sure all of you remember the excitement that surrounded the introduction of Google Authorship and the associated authorship markup. For a while, bloggers who properly set up authorship and linked it with their Google Plus profile were rewarded with a byline and photo in the search engine results. Sadly, that experiment died as Google realized that the resources for maintaining Google Authorship weren’t worth the benefits (the benefits to Google anyway — some suspect Google Plus itself may be heading the same way in time).
The most important employers and money generators in the international economy are small and medium businesses. In terms of economic impact, the small, sustainable manufacturers and service providers who sell to limited markets make a big difference. 98% of US businesses have fewer than 20 employees. The Apples, Googles, and Ubers of the world make the headlines, but the real economic powerhouses are the thousands of little businesses that no one but their customers has ever heard of.
If you follow the startup scene at all, you’ll have noticed that hardly anyone is aiming at small and sustainable. They want the next Google or Facebook. They want to ride the next unicorn to an astronomical payout. But most companies will not be the next Google or Facebook. There’s an inherent contradiction here: most companies will never be huge, and yet thousands of startups every year choose a business model that can only work at scale: free to the user.
If you’re a follower of this blog, you’ll know that I’m a fan of static site generators like Jekyll. Creating a basic Jekyll site is very simple: Jekyll will do the heavy lifting for you. But there’s more to a modern website than a basic scaffold of folders and files. Most depend on a number of external libraries, frameworks, and other tools.
The process of beginning a new project often involves heading to the sites of tools like Bootstrap, Foundation, and jQuery to grab the most recent version, unzipping them and dropping them where you want them in your project folder. That doesn’t seem especially onerous, and it isn’t, but it can be a drag when you do it for every new project.
A user interface pattern can be thought of as a good solution to a common problem that designers face. A good designer will have a toolbox of patterns they can deploy when the need arises. An obvious example of a UI pattern is the linking of a web page’s logo to the home page. This is a standard pattern, has been in use for many years, and users have come to expect it.
The opposite of a pattern is an anti-pattern. Anti-patterns are common solutions to problems, but unlike patterns, they are bad solutions: they address the immediate problem while undermining the greater goals of UI design.
I live in a provincial town with a thriving small business economy. The place is full of solopreneurs and small-scale startups in various niches. Most sell online and they all understand the value of social media and web advertising. They’re smart, connected, and many are digital natives. Very few of them put any effort into content marketing. Time and again, I’ve asked small business owners about their content marketing strategy; the response is usually either a blank look or a sheepish admission that they’re doing no content marketing at all.
I think that’s a mistake, and I’d like to take a swing at explaining to small business owners everywhere what content marketing is and what it can do for their business.
I write dozens of blog articles every month. Writing and editing is the easy part — I’ve done it for long enough that it’s almost second nature. The hard part is finding something to write about. Every day, I have to come up with fresh and interesting topics.
Sometimes, my topics come from a combination of domain knowledge and inspiration. I know about the area I want to write about, and with a bit of thought, I can come up with something interesting to say about it. But more often than not, topics are a result of trawling through a huge number of news articles, RSS feeds, and social media sites looking for something fresh and interesting.
Let’s assume for simplicity’s sake that websites are made up of just HTML and CSS. The HTML contains the content and document structure; the CSS controls the layout.
What happens if you open a document in your browser without a CSS stylesheet?
If you have no knowledge of web design, you might assume that the result would be a completely unstyled document, but that’s not the case. Every browser has a default stylesheet that it applies to web pages. Even HTML without styling will take on a basic structure because of these default stylesheets.
That’s great, but it introduces a problem. Each browser has a slightly different default stylesheet, which makes the “unstyled” HTML appear differently. Because of how CSS inheritance works, it can also make it difficult to predict how a styled page will appear. If a designer doesn’t explicitly override a style in the default sheet, it will still be applied.
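A toy model of that last point, sketched in JavaScript with made-up property values: the effective style behaves like the browser's default sheet with the author's declarations layered on top, so anything left unset falls through to a browser-specific default.

```javascript
// Made-up values, but the mechanics are real: the effective style is
// the browser's default sheet with the author's declarations layered
// on top. Whatever the author leaves unset keeps the browser default,
// and that default differs between browsers.
const browserDefaults = { margin: "8px", fontSize: "16px", display: "block" };
const authorStyles = { fontSize: "18px" };

const effective = { ...browserDefaults, ...authorStyles };
// effective.fontSize is "18px" (overridden), but effective.margin is
// still "8px", or whatever this particular browser happens to ship.
// A reset or normalize stylesheet works by making every such property
// explicit, so no browser default can leak through.
```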
We’ve talked about some of the leading static site generators several times on this blog. I’m a big fan of the way they provide a framework for people to learn about the technology underlying the web, without making them do all the hard work of managing a site with no CMS at all.
They offer a happy medium: once a static site generator and its site are configured, adding new content can be as simple as dropping a text file in a directory. It’s often simpler than publishing to a full-fledged CMS like WordPress, with all its bells and whistles. But in creating the site, bloggers and site owners have to learn about the underlying technology of HTML and CSS and develop an intimate understanding of how their site is created.
Last year, the HTTP/2 protocol was finalized. Support has already been implemented in Firefox and Chrome. Work is underway to integrate the protocol into web servers like Apache. It’s expected that in 2015, adoption of HTTP/2 will grow. It’s likely that adoption will proceed slowly, but it’s in the interest of web designers and developers to know what to expect.
Although most of the work towards HTTP/2 will be carried out at a lower level of the stack than web designers need to worry about — within web servers and browsers — it will have an impact on how web pages are put together. In fact, web design and front-end development habits now viewed as best practices are likely to harm performance under HTTP/2.
All software has flaws. Sometimes those flaws lead to security vulnerabilities that put users at risk. Security researchers work to find those vulnerabilities. Responsible researchers report vulnerabilities to developers and give them time to release a fix — and if they don’t, the researcher will release their findings to the public. This is an established procedure. The question is: how should a developer react?
Companies who are told that their products are vulnerable sometimes do not respond well. In some ways, that’s understandable.
I read a lot of business blogs. After a while, it becomes somewhat monotonous. Most businesses have invested at least a little time to make their blogs easy on the eye, or more likely, a little money to buy a theme that does the same thing. Typography has improved overall. Every business blogger knows that at a bare minimum, they should throw in an image or two. But for most business blogs, that’s as far as it goes. Every post looks more or less the same. It doesn’t have to be that way.
Art direction — in this context — is the process of creating a unique look for each blog post. In addition to images, art direction can involve using customized header fonts, body fonts, pull quotes, layouts, colors, and even jQuery effects on individual posts. The idea is to create a cohesive experience in which design is closely related to content, to make each article something unique rather than yet another post in a long list of identical posts.
You decide to create a new WordPress site. You buy hosting, install WordPress on it and select a theme. The theme almost looks how you want it to, but not quite. You make a few tweaks to the CSS: changes to the typeface and font size, a new color scheme, and perhaps some adjustments to column widths and borders. When everything looks just so, you start publishing content. A couple of weeks go by and you notice that your theme needs updating. After running the update, you’re horrified to discover that all your hard work has been undone. Your site looks exactly the same as it did when you first installed it.
This is, unfortunately, a common story and it’s why, when you edit a WordPress theme, you should always use a child theme.
A vulnerability has been discovered in a cryptographic algorithm used by tens of thousands of web servers to create secure TLS connections with browsers. As I write, almost nine percent of the web servers in the Alexa Top 1 million sites are vulnerable, as are a huge number of mail servers.
The best way to mitigate the risk of attack is to ensure that you’re running the latest version of your browser. If you’re a server administrator, you should ensure that your server does not support export cipher suites and upgrade OpenSSL and other TLS libraries to their most recent version.
In a post with the unnecessarily inflammatory title of Why Free WordPress Plugins Are Bad For Everyone, WordPress plugin developer Vova Feldman makes the point that free plugins are unsustainable and pollute the WordPress ecosystem with unmaintained and shoddy plugins that aren’t of much use to anyone. He claims that the freemium model, where developers make a limited set of features available for free and charge for premium features, is a healthier model because it allows developers to make a sustainable living, which incentivizes them to maintain their plugins.
WordPress users have three basic choices when it comes to picking a theme. They can download a free theme. They can buy a ready-made premium theme. Or they can pay a developer to create a unique theme for their site. Each of these has advantages and disadvantages, but most WordPress pundits will tell bloggers and businesses that they should go with a premium theme — they’re relatively inexpensive, they have more features than free themes, and you’re likely to get better support. All of that’s true, but premium themes are not without their problems. Most of the time, WordPress users don’t need the features premium themes offer, and although users of free themes aren’t entitled to support, there’s a WordPress community out there that is usually more than willing to come to the aid of less experienced WordPress users.
Last month, a new and critical vulnerability was uncovered in the GNU C library (glibc). Dubbed “GHOST,” the flaw was a buffer overflow in gethostbyname(), one of the most common function calls in Linux. Exploiting it allowed attackers to gain remote control of just about any Linux machine, executing malicious code at their leisure.
“This bug can be triggered both locally and remotely via all the gethostbyname*() functions,” explained Qualys, the company responsible for uncovering the vulnerability. “Applications have access to the DNS resolver primarily through the gethostbyname(*) set of functions. These functions convert a hostname into an IP address.”
It’s not all that surprising that web development’s become such a thriving industry. After all, more and more of our lives are spent on the Internet; a presence on the web is becoming ever more essential for success in enterprise. What that means is that there’s no shortage of work for an aspiring developer – provided they have the necessary experience.
And there’s the caveat. Gaining that experience can be extremely difficult, especially if you’ve no idea where to start. That’s where we come in.
In today’s piece, we’re going to offer some advice to first-timers to help get them on the right track. By the time you’re done, you should have a general idea of how to get your foot in the door as a developer. More importantly, you’ll know whether or not web development’s the right career for you.
This January, security researchers at Check Point discovered a set of vulnerabilities in Magento that could potentially allow a malicious actor to execute arbitrary PHP code on eCommerce sites, allowing an attacker to create a new admin account or to steal sensitive information, along with any number of other actions harmful to both eCommerce retailers and their customers.
Check Point disclosed the vulnerability to the Magento team, who quickly issued a patch. The patch has been available for more than two months. Last week, in accordance with the doctrine of full disclosure, Check Point released comprehensive details of the vulnerability, explaining how it was discovered, the code flaws that made it possible, and how it can be exploited.
The WordPress team has been getting better at hitting their release schedules of late, and that’s great news for WordPress users who have been anticipating the arrival of WordPress 4.2 this April. Named after jazz pianist Bud Powell, WordPress 4.2 follows hot on the heels of WordPress 4.1.2, which was a security release that you should install right away even if you don’t immediately plan to upgrade to 4.2.
So, what do WordPress users have to look forward to in 4.2? There are no huge new features — the WordPress REST API won’t be ready for a while yet — but we do get some iterative improvements and enhancements to existing features that should please most WordPress bloggers.
Not long ago, Matt Mullenweg, the original developer of WordPress and CEO of Automattic, took to Quora to argue against those who were claiming that WordPress is not a secure content management system. A couple of days later, to his likely embarrassment, a serious vulnerability was discovered in potentially dozens of plugins. And, to make matters worse, it was the fault of WordPress’ documentation.
Vulnerable plugins included such favorites as Jetpack, All In One SEO, Gravity Forms, and WPTouch.
In an impressive show of cooperation, Joost de Valk, who originally reported the vulnerability and who is the creator of the WordPress SEO plugin, which was vulnerable; the Sucuri WordPress security company; the WordPress security team; and numerous other plugin developers worked behind the scenes to implement fixes on as many plugins as possible.
Take a look at a blog or magazine site that you consider well designed. It’s almost certainly an elegant fusion of well-placed images and other graphical elements, intelligently chosen colors, and empty space. But when most non-designers are asked to articulate why they think a page is well-designed, there is one area that gets little attention. And it’s the one area that makes the most difference: typography.
It’s impossible to look at a web page with poor typography and think that it is beautiful, graceful, or elegant. Bad typography is ugly, but — more importantly — it also makes it hard for readers to read.
Freelancing is more popular than it has ever been. The number of people creating and managing their own micro-businesses has grown sharply since the financial crisis of a few years ago and shows no sign of waning as the economy recovers. A recent survey showed that there were more than 53 million freelance workers in the US alone.
It’s hardly surprising: the autonomy accorded to freelancers is unlikely to be matched by more traditional employment — even if freelancers work just as long hours. The Internet has made it possible for freelancers to be “present” wherever they are needed.
Footnotes are curious things. They occupy a no-man’s-land outside of the main flow of an article, but they carry content that can substantially alter the meaning.
I have mixed feelings about footnotes. The editor in me wants to say that if something is unimportant enough to be relegated to a footnote, then it probably shouldn’t be included at all. The writer in me wants to keep all those asides, qualifications, and expansions in the main text. Footnotes are the compromise. The writer gets their supplementary material, and the editor gets their tight, concise paragraphs.
In a post on the WPMUDEV blog called Upscale Your WordPress Post Pagination: Your Readers Will Thank You, Chris Knowles discusses how WordPress users can chop longer posts into smaller chunks. The technical details of the post are fine — except for the risky suggestions that users edit plugin code — but very little thought is given to the basic question of why site owners and bloggers would want to paginate their posts. It’s just assumed that pagination of long posts is a good thing for users, and I think the evidence for that is lacking. In fact, there’s quite a lot of evidence to the contrary.
There’s a new blogging tool that’s been making the rounds for the last few years, and it’s got people incredibly excited. Some are even going so far as to say it’s going to be the death of WordPress. Whoa, wait, what?
Now, most of you have probably assumed that this is just more hype – and rightly so. It seems like every time a new product of any kind breaks into the market, people start buzzing and babbling about how it’s going to dethrone the current king of the court. And guess what?
That never happens.
It wasn’t so long ago that Laravel was the new kid on the block – an irrelevant upstart of a framework that couldn’t possibly hope to challenge the then-titans: Symfony, CakePHP, and CodeIgniter. Oh, how the times have changed. When 2013 rolled around, Laravel began almost immediately to experience a period of explosive – some might even say meteoric – growth.
Now, it’s right up there with the other most popular frameworks – some even believe that, as 2015 moves forward, it might overtake them in both user interest and adoption.
But how did we get to this point? What was it that made Laravel grow from a relatively unimportant framework coded by a single developer into one of the most widely-used PHP programming tools in the world?
Never mind Ghost, WhatsApp’s broken privacy, or all the bugs surfacing in Windows 8. One of the most enduring security threats on the modern web is something known as CryptoPHP. This nasty little piece of work installs a backdoor onto content management systems by way of an infected theme or plugin; these addons are usually pirated copies of the real ones.
An attacker can then use a connected platform to gain administrative access to the compromised site. This allows them to do … pretty much anything, actually. Worse still, this is one of the most versatile security exploits we’ve seen in a while – it can self-update, makes use of strong encryption, has an application infrastructure that rivals some businesses, and includes a number of backup mechanisms that make it a distressingly insidious presence on the web.
On the surface, PHP development isn’t really all that different from any other technical profession in the world. It’s a given that there are certain rules you’ll have to follow; certain best practices you must adhere to. And really, you’ve no reason not to follow them – doing so will make your life as a developer significantly easier, and make you better at your job besides.
That’s what we’re here to talk about today, folks. We’re going to go over some of the best pieces of advice ever given about PHP development – best practices that any developer worth their salt should definitely have in place. Take a look, and see which you use (and which you probably need to implement).
Hey there, folks! Today, we’re going to be addressing a pretty niche topic. We’re going to be talking about PHP game development.
These days, gaming is more popular than it’s ever been. Mobile and browser-based games are taking the world by storm, and the games industry itself is a multi-billion dollar juggernaut that could soon even rival Hollywood in size and cultural influence. With that in mind, it’s not surprising that more and more programmers are starting to find their way into gaming – and consequently into game development.
Now, before we move on, there’s one thing worth mentioning at this point. Chances are pretty good that if you want to code something more than a browser game, you’re going to need to use a game engine like Unity or Unreal Engine 4. The reason for this is that PHP is primarily intended for server-side scripting; it’s not meant for client-side games.
A vulnerability that could potentially allow an attacker to execute SQL commands on WordPress sites has been discovered in the popular Yoast SEO plugin. An update to fix the exploit has been pushed to WordPress sites that have automatic updates turned on, but if you’re still using an older version of the plugin, you should update immediately. Versions older than 1.5 are not vulnerable, but that’s seriously out of date, and if you can update to the newest version, you should. Oddly, this plugin uses different version numbers for its free and premium offerings; we’re using the free plugin version numbers in this post. If you’re a premium user, take a look at Yoast’s post on the topic.
Yoast’s SEO plugin is one of the most popular plugins in the WordPress repository and is installed on many millions of WordPress sites.
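For readers unfamiliar with the bug class, an SQL injection happens when user input is spliced directly into query text. A minimal sketch (in Python with SQLite, rather than the plugin's actual PHP/MySQL stack) of the vulnerable pattern and its parameterized fix:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER, title TEXT)")
conn.execute("INSERT INTO posts VALUES (1, 'Hello')")

def find_post_unsafe(title):
    # Vulnerable: attacker-controlled input is spliced into the SQL text,
    # so a crafted value can rewrite the query itself.
    return conn.execute(
        "SELECT id FROM posts WHERE title = '%s'" % title
    ).fetchall()

def find_post_safe(title):
    # Safe: the driver binds the value separately from the SQL,
    # so it can never alter the query's structure.
    return conn.execute(
        "SELECT id FROM posts WHERE title = ?", (title,)
    ).fetchall()

payload = "' OR '1'='1"
```

Fed that payload, the unsafe version matches every row in the table, while the parameterized version correctly matches none.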
In my “Six Things You Need To Know If You’re Starting A WordPress Blog” post from a couple of months ago, one of the non-essential but beneficial improvements I suggested was to install a caching plugin like W3 Total Cache. Caching helps make a WordPress site faster, but there’s a way to make it even faster still — you can turn it into a completely static site.
If you’re a follower of this blog, you’ll know that we’ve talked about static sites often, usually in the context of static site generators like Jekyll. I’m a fan of static site generators, because I like to encourage people to ditch the content management systems that put a layer between them and the technology they’re using. But, static site generators aren’t for everyone, and for anything more than a personal blog or portfolio site, they’re lacking in much needed functionality. A major reason we use content management systems in the first place is to make it easier to handle more complex publishing workflows, and static site generators aren’t great for that.
Mark Nottingham, the chair of the HTTP working group overseeing the development of HTTP/2, has announced that the standard is complete and headed to the RFC editor for tweaks before publication as a standard.
HTTP is one of the fundamental technologies underlying the web, and it hasn’t seen a comprehensive upgrade since 1999, when HTTP 1.1 was introduced.
The web has changed enormously since 1999, but the technology that makes the web possible changes achingly slowly. While the tools used by developers and designers for creating sites are constantly moving forward, developers have still had to contend with a protocol that was created back when dial-up was all the rage and blinking text was considered a bold design choice.
Product pages are a perennial source of headaches for eCommerce retailers, especially mid-sized retailers that have large catalogues but not the manpower to create a unique experience on each page. Creating compelling product pages can be a significant investment of both time and money, but I’d suggest that it’s more than worthwhile in many cases because the result will be more search traffic and increased conversion rates.
In this article, I’m going to take a look at a number of strategies that eCommerce retailers can use to improve the quality of their product pages and, hopefully, their sales.
The link is one of the foundational technologies of the modern world. The ability to connect a web page to multiple other pages was the genius innovation that made the web possible and made web pages a medium in their own right, rather than being simply a digital representation of the physical page. But links are limited. The web page as the basic unit of differentiation for web content is somewhat clumsy — why a page? Each page can contain any number of levels of organization that we might want to link to.
Of course, it is possible to link to so-called fragments within a web page, allowing us to navigate to a specific section, but even that is somewhat limiting — ideally, we’d be able to link directly to a specific paragraph or sentence or to any arbitrary chunk of text.
Branded links are cool. It’s great to be able to get rid of ugly long links replete with strings of tracking and affiliate codes and replace them with short links that contain the name of your company or site. It looks good on social media, and it helps present a coherent brand image to web users. But, however cool they are, short links suck, and they suck because they break online transparency and user experience in some key ways.
Firstly, they’re mostly unnecessary. Back in the day, if you wanted to share a link on Twitter, the link counted towards your character limit. If your link was long, the rest of the text in your tweet had to be very short. That’s no longer the case. Twitter uses its own link shortening technology, and however long the link you paste into Twitter, it’ll only take up 22 characters of your Tweet (it’ll also take up 22 characters if the original URL was shorter than that — all links on Twitter are “shortened”). One of the benefits of relying on Twitter’s own shortener is that it displays the beginning of the original link, rather than a shortened version.
In general, the ability to track users with cookies has been a good thing for the web. Tracking within sites allows us to maintain state, tying together a user’s page loads and data into a coherent session — without that, eCommerce as we know it and most other web services would be impossible. Tracking across sites powers the targeted advertising that drives much of the online economy.
Tracking with cookies, however distasteful it may seem to some privacy advocates, gives the user a large measure of control. They can delete cookies, choose not to be tracked (with varying levels of compliance from sites and browsers), and they can use an Incognito or Privacy mode that cuts sites off from cookies altogether.
Hey there, folks. Today’s piece is going to be a sort of primer on SSL Certificates. See, most of you probably already understand how important it is that you encrypt your communications. What a lot of you may not know is what actually goes into selecting the right certificate for your business.
That’s where we come in. We’re going to go over all the stuff you need to take into account when you’re choosing an SSL Certificate for your site. Best be sure you don’t ignore them – if you simply blunder out and buy the first certificate you come across, you’ll regret it.
It’s uncontroversial that sites handling sensitive data like credit card numbers should implement HTTPS to protect that data from snoopers. It’s also best practice to encrypt connections for sites that allow users to log in — not only is their data protected as it travels from the site to the browser and back again, but so is the authentication cookie that maintains their session.
But it’s becoming increasingly common for security experts and online service providers to recommend that all sites be encrypted with SSL. Google gives secure sites a bump in the SERPs, and its Chrome browser may soon give users a visual warning if sites aren’t encrypted. That doesn’t just apply to the classes of sites for which encrypted content is now the norm, but to read-only sites with no sensitive user data and no logged-in users.
Virtual private servers are one of the most flexible and cost effective ways of acquiring a powerful server without breaking the bank. They’re much more powerful and versatile than shared hosting, and they’re much less expensive than dedicated servers.
You might think I’m biased (and I am), but I believe that web hosting is something that almost everyone can find a use for, and that virtual private servers are the best option for most. An always-connected, remote Linux server environment is an incredibly flexible tool that can serve many different purposes. In this article, I’d like to take a look at three ways a virtual private server could be useful to you this year.
Every year thousands of people make a New Year’s resolution to take up blogging, and they turn to WordPress as the content management system on which to build their site. It’s a great decision on both counts. Blogging can be a very rewarding pastime. And for those new to building a website, there’s no better foundation than WordPress.
A long time ago I was one of the people that resolved to blog as a New Year’s resolution. I muddled through, but there were numerous stumbling points, and I wish someone had taken the time to give me a bit of advice before I started. I’m now experienced with WordPress, and I’d like to use this post to tell newbie bloggers some of the things I wish I had known when I started. This isn’t a comprehensive guide to using WordPress, there are many of those available already on the web, but it does contain some pearls of wisdom that I learned the hard way.
You may have recently seen a story in which it was reported that airline wireless internet provider Gogo was issuing SSL certificates for domains owned by Google. There was a small storm of controversy around the story, because, in theory, issuing such certificates could allow the bandwidth provider to see content flowing over its network that the user assumed to be encrypted.
There are many different ways of thinking about web performance. In an earlier article, we looked at time-to-first-byte, and offered a number of suggestions for improving the time it takes for a site to get its initial data to the browser. But web developers need a more expansive conceptual toolbox than something as simple as time-to-first-byte, which is one component of a fast site, but far from the only important consideration. In this article, I’d like to talk about a more overarching way of thinking about performance optimization: the performance budget.
Performance budgets set a hard limit on some metric that impacts a user’s experience of a site. There are many combinations of metrics that can be used, and I’ll link to some resources that go into that in detail at the end of this article, but for simplicity’s sake, I’ll stick to page weight here.
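To make that concrete, a page-weight budget can be enforced with a few lines of script in a build pipeline. The asset names and sizes below are invented for illustration:

```python
# Enforce a page-weight budget: total the size of every asset the
# page loads and compare against a hard limit. Sizes are hypothetical.
BUDGET_BYTES = 500 * 1024  # a 500 KB page-weight budget

assets = {
    "index.html": 18_000,
    "app.css": 42_000,
    "app.js": 230_000,
    "hero.jpg": 160_000,
}

def within_budget(asset_sizes, budget):
    return sum(asset_sizes.values()) <= budget

print(within_budget(assets, BUDGET_BYTES))
```

The value of the hard limit is that it forces a trade-off: a new hero image that blows the budget means something else has to shrink, rather than the page quietly getting heavier.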
In my (admittedly anecdotal) experience, when I ask companies about their content marketing efforts, in seven out of ten cases, they answer that publishing blog articles on their site is all they do. The other three out of ten also do guest blogging: publishing more or less the same sort of content on other people’s sites.
A big part of my living comes from writing blog articles, so I wouldn’t knock it, but concentrating on one type of content on one publishing channel is shortsighted. There are many other formats for written content, and many other types of content beyond the written word. A more inclusive range of content marketing can help companies build a broader audience.
One of the interesting (and some might say depressing) details to come out of the recent collapse of The New Republic was the comment of new editor Guy Vidra that he thought “the magazine was boring and that he couldn’t bring himself to read past the first 500 words of an article.” The battle between a century-old cultural institution and Silicon Valley’s “disruptive” mentality aside, what struck me about this was Vidra’s apparent distaste for longform content.
If you look at the most successful publishing ventures on the Internet at the moment, stalwarts like the New York Times aside, we see Buzzfeed and Upworthy: outlets that focus on short content that generates clicks and shares. On the other hand, there’s a strong movement to promote longform content on sites like Longform and Medium. And, as there is for every movement espousing a niche interest, there’s a reaction against the rising popularity of longform. The result is acres of content — from the longform to the tweet — being published about the correct length of content and the benefits of writing a lot of it, to which this (short) article is about to add.
A web robot, commonly referred to as a bot, is any non-human web user. They are usually scripts or computer programs that access web pages for a variety of different reasons. The most familiar bot is Googlebot, which accesses and analyses web pages for inclusion in Google’s search index. Most site owners want Googlebot to come calling, but there are also many bots with less friendly intent, including those that visit a site and attempt to hack it or scrape the content for use on other sites.
In fact, a significant majority of web traffic is generated by bots rather than humans. Towards the end of 2013, Incapsula estimated that about 61.5 percent of all web traffic was bots.
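The polite bots, Googlebot included, announce themselves and obey robots.txt. As a sketch, Python's stdlib `robotparser` shows how a hypothetical policy that welcomes Googlebot and turns everyone else away would be evaluated; bear in mind robots.txt only restrains bots that choose to obey it:

```python
from urllib import robotparser

# A hypothetical robots.txt: Googlebot may crawl everything, while
# all other user agents are asked to stay out entirely.
rules = """\
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "http://example.com/post"))
print(rp.can_fetch("ContentScraper", "http://example.com/post"))
```

The scrapers and attack bots that make up much of that 61.5 percent simply ignore the file, which is why rate limiting and firewalls are needed as well.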
WordPress has always been an excellent platform for both amateur and professional photographers to display their work. It comes equipped with great media management tools, especially since the improvements that have been made over the last couple of releases. It’s free, and it has an abundance of themes appropriate for the elegant display of images. And for professionals, there are no end of plugins that can be used to make WordPress the ideal web presence for a photography business.
In this article, I’m going to leave aside the business aspects of photography and concentrate on plugins that beginning to intermediate photographers can use to turn WordPress into a great tool for showcasing their work.
That’s the idea behind Eager, a tool from some ex-HubSpot employees. I’ll explain with an example. As I’ve said before on this blog, I’m a web dev dilettante; I like to play around with building sites, but I’m not an expert by any means. When an article about better link underlining popped up in my Twitter stream, I immediately wanted to implement it somewhere. I spent a few minutes studying the explanation of the technique and the CSS involved, and then I spent a few more minutes studying it, and then I gave up because I didn’t understand any of it.
In the grand scheme of things, the world wide web is a young technology — less than a quarter of a century old — but it’s had an enormous impact on the way we live our lives and communicate with each other. Yet very few of us really understand the technology that forms the world’s dominant platform for cultural expression, business, and communication. Most are vaguely aware of what HTML and CSS are, but very few would be able to knock together even the simplest web page. Even fewer understand what it takes to manage the Linux servers on which the Internet largely runs. I think that’s a shame for three reasons.
Firstly, although I’m not of the view that everyone needs to be a developer, I do think some practical knowledge of coding and web development is useful in any number of ways for almost everyone. Our world is built on software and our communities are built on the web. Building a web site is an excellent way to develop skills that turn internet users into full online citizens who can contribute. Without the ability to understand what lies beneath the surface of the sites they interact with, web users have limited control over their online experience.
Although there’s mounting evidence that page speed doesn’t have as much of an effect on PageRank as was initially believed, that doesn’t mean it’s not extremely important to your success in the online arena. A slow site can still have an adverse effect on your traffic, as frustrated readers decide to take their business elsewhere. It goes without saying, then, that if you’ve got a slow website, you want to do everything you can to address the problem.
We’ll lend you a hand with that. In today’s piece, we’re going to go over five of the most common causes of website slowdown. More importantly, we’ll detail how you can deal with each one.
Let’s have a chat about frameworks. Although certainly not the best choice for every development project, when used in the proper context, a framework can be incredibly powerful. By using one of the many frameworks available on the web, a savvy developer can significantly reduce development time while simultaneously creating compliant, structured, and easily-maintained apps.
There are a ton of different frameworks available to you as a developer, regardless of the language you’re programming in. Today, we’re going to focus on one in particular: Yii. Designed for the development of high-performance, Web 2.0 applications, the open-source Yii Framework has been around since 2008.
When the web was first developed, it was designed to meet the needs of academic and scientific institutions like CERN, for which Tim Berners-Lee worked when he laid the web’s foundations. There was no concept of secure communications baked into the protocols that underlay the web. The web and the world have changed since the early 90s, and the need for secure encrypted connections is clear.
We have a reasonable technological solution for providing encrypted connections between clients (often browsers) and servers. SSL (more properly known as TLS) and HTTPS are conceptually sound, even if the implementation sometimes leaves a little to be desired. But although SSL works, it is a million miles away from being user friendly: even technically adept people have trouble implementing SSL on their domains, which is why many don’t bother.
If you’re planning on using Linux to run your server, then you’ve an impressive array of different distributions to choose from. In truth, there are so many distros that it can get a touch overwhelming. Faced with such variety, how can you determine which choice is the correct one?
It starts with knowledge. Each Linux distribution is best suited for a particular brand of computing – and each distro has its own set of strengths and weaknesses. Understanding those benefits and shortcomings is the key to making an informed decision.
We’ll help you along. Today, we’ll be focusing on CentOS.
Some time ago we looked at alternatives to using Disqus on WordPress. Disqus is an excellent comment system and it’s easy to install on WordPress and any other site, but I thought it was worth the effort to take a look at what else is out there. In recent days, since Disqus announced that they were to begin serving adverts in comment threads, the topic has come to mind once again. There’s nothing wrong with advertising, of course, but it puts a fine point on the way that Disqus, a free service, generates revenue; namely, by using the data it collects from publishers’ sites to target advertising. My data, because I write the posts, but also that of the people who comment, so the adverts can be better targeted.
I’m not paranoid about exchanging my private data for services: I’m an enthusiastic user of Google’s services and I understand the payoffs involved. What I’m slightly more hesitant to do is to make that decision on behalf of readers who make comments on blogs I write for. As you can see we use Disqus comments on this blog, but without the advertising turned on.
Tracking is the holy grail of the online advertising industry. Randomly throwing advertising at users has a very low success rate. The better advertisers can predict what a user will be interested in, the more likely they are to serve advertising that gets more clicks that convert to more sales. To target advertising, networks need to develop profiles of users, and the most common way to do that is with cookies. A cookie is placed in the user’s browser containing a unique identifying number, and whenever a browser visits a site that belongs to the advertising network, code on the page looks at the cookie. In this way, advertising networks can track users across the web — and if those users are logged in to a service like Google or Facebook, the tracking can be all the more accurate, because they can associate it with much richer data.
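Mechanically, that identifier is just a random value in a Set-Cookie header. A small sketch using Python's stdlib, where the cookie name and network domain are invented for illustration:

```python
import uuid
from http import cookies

# On first contact, the ad network mints a random ID and sets it as
# a cookie scoped to its own domain; every later request to a page
# that embeds the network's code sends the same ID back, letting the
# network tie those visits together into one profile.
def set_tracking_cookie():
    jar = cookies.SimpleCookie()
    jar["ad_uid"] = uuid.uuid4().hex  # the unique identifying number
    jar["ad_uid"]["domain"] = ".ad-network.example"
    return jar.output(header="Set-Cookie:")

def read_tracking_cookie(header):
    jar = cookies.SimpleCookie()
    jar.load(header.partition(":")[2].strip())
    return jar["ad_uid"].value
```

Because the cookie belongs to the network's domain rather than the site the user is reading, it rides along with every request to any page that carries the network's ad code.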
If you’re a regular reader of this site, the question in the title may seem a little basic, but for millions of non-technical people with a desire to blog or create a site, choosing a content management system from the plethora of available options is a serious business. So, in this article, we’re going to have a look, in basic terms, at what WordPress is.
I have mixed feelings about blog comments. On the one hand, they can help build a community and the best comment threads are full of insightful responses from thoughtful readers. On the other hand, they’re often a spam-laden, rant-ridden reflection of the worst of the web. I’ve often mulled the idea of removing them altogether. Many bloggers feel that their blogs are their space, and others are free to comment on social media or their own site if they have something to say. I come down on the side of comments; their virtues outweigh their potential harm, but if you’re going to go with comments, you have to be aware of the potential risks and the work involved.
To anyone working in the web design and development fields, it probably seems like old news, but it’s worth taking a moment to acknowledge that HTML5 has been officially recognized as a standard by the W3C, the body in charge of web standards (for the most part).
Of course, HTML5 has been in active use for years, but the modern version of HTML now has the imprimatur of the organization overseen by the web’s inventor, Tim Berners-Lee.
“Today we think nothing of watching video and audio natively in the browser, and nothing of running a browser on a phone,” said the W3C Director. “We expect to be able to share photos, shop, read the news, and look up information anywhere, on any device. Though they remain invisible to most users, HTML5 and the Open Web Platform are driving these growing user expectations.”
In what is becoming a worryingly frequent occurrence, a vulnerability has been reported in the SSL protocol used to encrypt connections between web clients and servers. The good news in this case is that the vulnerability occurs in a relatively ancient version of SSL (so old that it was still called SSL, and not the more modern TLS). The bad news is that the way SSL is implemented on modern browsers and other clients means that the ancient protocol is sometimes still used.
Cutely named Poodle (Padding Oracle On Downgraded Legacy Encryption) and officially named CVE-2014-3566, the vulnerability has the potential to allow an attacker to read plaintext versions of data that should be encrypted.
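The standard mitigation is to stop accepting SSLv3 at all, and most TLS libraries let you say so explicitly. A sketch with Python's ssl module:

```python
import ssl

# Refuse the protocol downgrade POODLE depends on by setting a floor
# on the negotiated version: nothing older than TLS 1.2 is accepted.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.minimum_version = ssl.TLSVersion.TLSv1_2
```

Servers need the equivalent setting on their side too — the attack works by tricking both ends into falling back to the old protocol, so either end refusing SSLv3 closes the hole for that connection.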
Although Apache is an immensely powerful – and capable – web server, it’s not perfect. There isn’t a platform on earth that runs flawlessly all the time. With that in mind, you’re fairly likely to eventually run into a problem or two with your Apache installation. Whether those issues are minor road-blocks or website-crippling catastrophes is entirely up to you.
Knowing what to expect can make all the difference here. That’s why today, we’re going to go over some of the most common issues you’ll encounter with Apache – and how you can deal with each one. Armed with this knowledge, you should be perfectly capable of keeping things running smoothly.
Apache has an undeserved reputation for poor performance. While it’s true that Apache isn’t quite as resource efficient as servers like Nginx that implement an event-based rather than process-based architecture, with a little bit of tweaking and the right approach, Apache can be as fast as, if not faster than, Nginx.
Let’s have a look at four things webmasters can do to improve the performance of their web server. There’s a lot more to be said about performance optimization than can fit into a four-item list, but the tips I’ll discuss here are a good start.
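By way of a taste, several of the knobs involved live right in the Apache configuration. The values below are illustrative placeholders, not recommendations for your workload:

```apache
# Illustrative values only; tune against your own traffic and RAM.
# Keep-alive lets one connection serve several requests, but a short
# timeout stops idle clients from hogging workers.
KeepAlive On
KeepAliveTimeout 3
MaxKeepAliveRequests 200

# Cap concurrency so Apache never forks itself into swap.
<IfModule mpm_event_module>
    MaxRequestWorkers 150
</IfModule>
```

The right numbers depend entirely on available memory and typical request patterns, which is exactly the sort of thing the tips below will help you reason about.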
Today, we’re going to talk about a growing – and incredibly important – tool in the world of web hosting: the content delivery network. You’ve probably heard the term, at least in passing – and wondered if your website could use one. Today, we’re here to help you decide.
First things first, let’s talk about exactly what a CDN does.
A CDN works by caching your content at several points of presence spread out across a global network. When a user accesses something on your site, the CDN taps into whichever point of presence is closest to them. This has several effects.
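The selection logic itself can be sketched in a few lines. The PoP names and latency figures here are invented, and real CDNs use DNS or anycast routing rather than a lookup table, but the principle is the same:

```python
# Toy PoP selection: serve each user from the point of presence with
# the lowest measured latency (milliseconds, hypothetical figures).
latency_ms = {
    "us-east": {"alice": 12, "bob": 95},
    "eu-west": {"alice": 80, "bob": 9},
}

def closest_pop(user):
    return min(latency_ms, key=lambda pop: latency_ms[pop][user])
```

Serving from the nearest point of presence cuts round-trip time, spreads load off the origin server, and keeps a single regional outage from taking the whole site down.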
Tracking is the holy grail of online marketers and businesses that rely on accurate information about their users. The motivation for tracking is hardly ever as suspicious as some privacy advocates would have us believe. Companies use the information to provide better services. Nevertheless, users should be able to decide for themselves whether to allow their online activity to be tracked. That many decide to install tracking blockers and deny the use of third-party cookies is evidence that there’s a proportion of Internet users that dislike the idea of being tracked.
ETags are a method used by some site owners to circumvent user choice where tracking is concerned, and they are an interesting illustration of how tracking works.
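In its legitimate caching role, an ETag round-trip works roughly like this (a sketch, not any particular server's implementation); the same echo-it-back behaviour is what makes the header abusable as a persistent identifier that survives cookie deletion:

```python
import hashlib

# Server side: derive an ETag from the response body. If the client
# echoes it back in If-None-Match, answer 304 and skip the body.
def make_etag(body):
    return '"%s"' % hashlib.sha1(body).hexdigest()

def respond(body, if_none_match=None):
    etag = make_etag(body)
    if if_none_match == etag:
        return 304, b""   # client's cached copy is still good
    return 200, body      # first visit, or the resource changed
```

A tracker abuses this by handing each browser a unique "ETag" instead of a content hash; the browser then dutifully identifies itself on every revisit, no cookie required.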
Business Continuance Protection helps companies avoid one of the little-considered consequences of a DDoS attack: a huge bandwidth bill.
Denial of service attacks are the bane of modern site owners and web service companies. It seems like every month we hear a story about the biggest ever DDoS attack disrupting some service or other. DDoS attacks are especially pernicious because they can be so difficult to mitigate. Most hacking attacks exploit vulnerabilities in software that can be fixed if they’re discovered. But denial of service attacks turn the fundamental technologies of the web into a weapon to be used against online businesses.
Click-baiting works if your goal is simply to drive traffic, but businesses that don’t survive on page views and advertising impressions should avoid click-baiting if they’re to create the right impression.
As a writer of online content, I pay attention to the tactics that my fellow writers and publishers use to attract traffic. In recent months, I’ve seen an alarming rise in the number of business sites that engage in click-baiting. The virtues of click-baiting for a certain type of publisher may be debatable, but for businesses that rely on consumer trust, click-baiting can be a big mistake.
It was a great idea in theory: some writers can be relied on to publish authoritative content on particular topics. But the implementation was flawed, and so Google Authorship goes the way of all Google programs that fail to demonstrate their value to the search giant.
In what will not be much of a surprise to those who have been following Google Authorship, the system for linking content to an author via a Google Plus profile has been retired. The writing had been on the wall for Google Authorship since last month, when it was announced that the rich authorship snippets Google had been including in the SERPs for the past couple of years would be slimmed down with the removal of the byline headshots that had been Authorship’s major draw.
There are lots of things we can do to make our sites faster: caching, image optimization, compression, and, of course, low-latency web hosting. They’ll all have an impact on load times, but often, taking a scattergun approach to performance optimization isn’t the most effective method. To get the best performance from a site, we need to think about what needs to load to present a usable site as quickly as possible and then load that first.
I’m by no means a web developer, but I do like to play around with web technologies. I was recently looking into the new(ish) CSS Flexbox layout methods and came across a problem that plagues developers all the time. Support for Flexbox is spotty: most of the major browsers support it, but a handful of mobile browsers do not, and neither do some versions of IE and Safari. If I were to use it on a site, I’d have to use browser prefixes to support IE10, for example.
It’s a pain to have to write browser-specific styles using prefixes, and if you want to implement experimental CSS, you end up doing it a lot, which can make your CSS messy and difficult to maintain. And, let’s face it, it’s just not fun.
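Tools like Autoprefixer exist to automate exactly this chore. As a toy illustration of the idea (the prefix list below is illustrative, not an accurate browser-support table), a helper can expand one declaration into vendor-prefixed copies, with the standard property last so it wins in supporting browsers:

```python
def prefixed(prop, value, prefixes=("-webkit-", "-moz-", "-ms-")):
    """Expand one CSS declaration into vendor-prefixed variants."""
    decls = [f"{p}{prop}: {value};" for p in prefixes]
    decls.append(f"{prop}: {value};")  # unprefixed form last
    return "\n".join(decls)

print(prefixed("flex-direction", "column"))
```

A real prefixer consults browser-support data and knows the special cases (IE10’s Flexbox values, for instance, use different names entirely), but the payoff is the same: you write the standard property once and let tooling generate the rest.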
Aside from choosing your host, selecting your control panel is one of the most important decisions you’ll make as a client. Whichever control panel you ultimately settle on, it’ll have a direct impact on both cost of ownership and functionality. Save yourself the headache, and make the right choice right at the start.
At this point, some of you are probably leaning towards cPanel. I don’t blame you. It’s the current industry leader, backed by a passionate and thriving community and equipped with a comprehensive set of features. It’s reliable, it’s durable, and it’s powerful; it’s not hard to see why it’s got the biggest market share.
Gulp is a powerful web development task runner and build tool. We take a look at what Gulp can do and why you should be using it in your web dev workflows.
If you’ve been around as long as I have, you’ll remember when building a website involved a text editor, HTML, and possibly some CSS if you were feeling adventurous. Times have changed. Web development, even for simple sites, often involves complex workflows with many different moving parts.
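To make the “task runner” idea concrete without pulling in Gulp itself, here’s a minimal Python sketch of the core pattern Gulp embodies: named tasks, declared dependencies, and a runner that executes each task once, prerequisites first. The task names are invented for illustration.

```python
TASKS = {}

def task(name, deps=()):
    """Decorator registering a named task and the tasks it depends on."""
    def register(fn):
        TASKS[name] = (deps, fn)
        return fn
    return register

def run(name, done=None):
    """Run a task after its dependencies, skipping anything already run."""
    done = set() if done is None else done
    if name in done:
        return done
    deps, fn = TASKS[name]
    for dep in deps:
        run(dep, done)
    fn()
    done.add(name)
    return done

log = []

@task("styles")
def styles(): log.append("styles")      # e.g. compile Sass, autoprefix

@task("scripts")
def scripts(): log.append("scripts")    # e.g. bundle and minify JS

@task("build", deps=("styles", "scripts"))
def build(): log.append("build")

run("build")
print(log)  # ['styles', 'scripts', 'build']
```

Gulp adds streams, file watching, and a huge plugin ecosystem on top, but this dependency-ordered execution is the heart of what any build tool does for you.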
Attracting people to your website is only half of the problem. Getting them to stay is equally important. Sites that generate revenue from advertising are motivated to keep people clicking through content for as long as possible. The more content they read, the more advertising impressions they are exposed to and the greater the likelihood that they’ll click on one.
There are various techniques for keeping people on a site, but one of the most widely used is related content. A reader lands on an article they find valuable, reads it, and when they’ve finished is presented with a list of related content. Many content management systems have plugins for generating related content lists — YARPP for WordPress is one of my favorites.
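The simplest version of that related-content logic is tag overlap. Plugins like YARPP use richer scoring across titles, bodies, and taxonomy, but a minimal sketch (the post data below is made up) ranks other posts by how many tags they share with the one being read:

```python
# Hypothetical posts: title -> set of tags.
POSTS = {
    "apache-tuning":  {"apache", "performance", "hosting"},
    "cdn-basics":     {"performance", "cdn", "hosting"},
    "etag-tracking":  {"http", "privacy"},
    "gulp-workflows": {"javascript", "tooling"},
}

def related(current, limit=2):
    """Return up to `limit` posts sharing the most tags with `current`."""
    tags = POSTS[current]
    scored = [(len(tags & other_tags), title)
              for title, other_tags in POSTS.items()
              if title != current]
    scored.sort(key=lambda s: (-s[0], s[1]))  # best overlap first
    return [title for score, title in scored[:limit] if score > 0]

print(related("apache-tuning"))  # ['cdn-basics']
```

Even this crude heuristic beats showing nothing: a reader who just finished a performance article is far more likely to click another performance article than a random one.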
A few years back, Apple made the highly publicized decision to switch from MySQL to PostgreSQL. Many of their reasons for the switch are still valid today. Even though MySQL is still the most widely used open-source database solution, that doesn’t mean PostgreSQL isn’t every bit as powerful. Quite the contrary: PostgreSQL may have a lower profile, but it has many dedicated supporters who consider it a superior database solution.
There’s no easy way to say it: online advertising is in trouble. An epidemic has been sweeping the web for the past several years; a widespread trend that’s costing webmasters all over the world a small fortune. I am speaking, of course, about ad-blocking software.
Users have grown frustrated with advertisers, and I certainly cannot blame them. Emboldened by lax regulations, two-bit advertisers have been loading the Internet with some of the most invasive, resource-intensive, poorly-designed advertisements I’ve ever seen. Sometimes, these ads are disguised as download buttons or Captcha boxes. Sometimes they play obnoxious sounds or slow the user’s browser to a crawl.
In the early days of the web, everything was coded by hand. HTML, and later CSS, were written by developers and uploaded to servers. There were no content management systems. In some ways, that made things simpler and it certainly made things faster from the perspective of users. Static sites without databases and complex server-side code to execute are blazingly fast to serve and have very low resource requirements.