What is Website Performance Monitoring?

Everyone knows what it’s like to arrive at a website only to have to wait, ever so patiently, for the site’s content to load.  For ordinary users, it can be annoying.  But for businesses, it can be disastrous.  Web performance monitoring tools like AlertSite can drastically improve the efficiency of your WPM and WPO strategy, and are often necessary to gain the upper hand in today’s competitive ecommerce market.  The fiscal significance of monitoring website performance has been driven home in recent years by repeated studies demonstrating that even barely noticeable sluggishness in site response times—including delays lasting just a fraction of a second—can lead to increased bounce rates and decreased sales on major ecommerce sites.  Add to this picture the rise of smartphones and mobile-optimized websites, which have enabled people to communicate, read, watch, and shop online while busy and on the move, and the sheer speed of the Web has never been more important.

There are two main types of Web performance monitoring: Synthetic Monitoring and Real User Monitoring.  These days, a variety of automated software applications are available that can continuously monitor a website, alert webmasters or system administrators to potential problems, and identify specific errors that may be inhibiting a site’s optimal performance.  But these issues weren’t always so clear.

A Brief History of Web Optimization

Since the early days of the Web, making sites and their content load as quickly as possible has been crucial for increasing visitor retention and engagement.  Whether it was simple HTML text, massive 640x480-pixel BMP images, looped MIDI audio, tiny embedded AVI videos, e-shopping cart JavaScript, or pages packed full of tacky, blinking animated GIFs, every Web visitor learned within their first few sessions that the amount of content on a page largely determined how fast it would finish loading.  Pages containing plain text loaded the fastest; pages containing audio, video, or high-resolution images took a lot longer.  Nobody had to explain this, because everyone experienced it, constantly.  And everyone was—and, to this day, continues to be—frequently frustrated by it too.


How's Your Site's Performance?

Start a FREE 30-day trial of AlertSite


If only to keep their visitors happy, conscientious webmasters of yore realized that they had to do something about their less-than-impressive page-load and response times.  Cleaning up and optimizing lines of code, ensuring that the most important page elements loaded first, conducting various A/B tests, and upgrading server hardware to meet site visitors’ demands—taking advantage, always, of the latest telecommunications infrastructure and modem baud rates—all became relevant considerations in the development and maintenance of any website.  Although the source of problems or site slowness wasn’t always immediately evident or easy to figure out, webmasters interested in improving their visitor retention rates could spend some of their spare time making tweaks to likely trouble spots until site performance seemed to improve.


But it wasn’t until 2004, as high-speed internet connections and ecommerce websites (often with elaborately overdone Flash-animated intros) came to dominate the digital marketplace, that the relationship between high-speed websites and business revenue became obvious enough for a few performance engineers, such as Google’s Ilya Grigorik and Yahoo’s (and now Google’s) Steve Souders, to begin actively evangelizing the topic’s importance.  By early 2007—the year of the iPhone—Souders was speaking and blogging about what he termed “Web performance,” publishing a book called High Performance Web Sites later that year and, in 2008, cofounding Velocity, a popular conference series devoted to the subject.  In May 2010, Souders wrote:

[The] convergence of awareness, even urgency, on the business side and growing expertise in the tech community around web performance marks the beginning of a new industry that I’m calling “WPO”—Web Performance Optimization. WPO is similar to SEO in that optimizing web performance drives more traffic to your website.  But WPO doesn’t stop there….  WPO also improves the user experience, increases revenue, and reduces operating costs.

Web performance optimization, or WPO, now returns nearly a million Google search results, and its significance is only growing.  Somewhat ironically, as websites begin to load almost instantly and the margin of difference between one site’s speed and another’s shrinks to mere milliseconds, those 1/1000th-of-a-second differences start to make all the difference between who profits and who loses online.  As the noted tech-industry venture capitalist Fred Wilson said back in February 2010:

First and foremost, we believe that speed is more than a feature.  Speed is the most important feature.  If your application is slow, people won’t use it… There is real empirical evidence that substantiates the fact that speed is more than a feature.  It’s a requirement.

More than anything else, it is the eye-opening empirical studies (see below) referred to by Wilson that finally drove home the necessity for Web developers, user-experience experts, and backend engineers to make WPO one of their highest priorities.  But with the rise of interest in WPO came a growing need for companies to better detect, analyze, measure, and monitor Web performance issues in order to establish clear metrics and determine what, exactly, needed to be optimized.  And thus, over the last few years, a variety of Web performance monitoring tools and practices have sprung into being to simplify the lives of webmasters everywhere.


Facts and Stats:

  • 73% of mobile internet users say that they’ve encountered a website that was too slow to load.
  • 51% of mobile internet users say that they’ve encountered a website that crashed, froze, or returned an error.
  • 38% of mobile internet users say that they’ve encountered a website that wasn’t available.
  • 47% of consumers expect a web page to load in 2 seconds or less.
  • 40% of people abandon a website that takes more than 3 seconds to load.
  • A 1 second delay in page response can result in a 7% reduction in conversions.
  • If an e-commerce site is making $100,000 per day, a 1 second page delay could potentially cost you $2.5 million in lost sales every year (a 7% drop in conversions on $100,000 per day is $7,000 a day, or roughly $2.5 million over 365 days).

Source: Neil Patel

Nowadays, website performance can act as a key differentiator for digital experience, brand perception, and ecommerce success.  Monitoring websites for availability, performance, and functionality is essential to ensuring that each user experience is exceptional and works as envisioned in the web design.

Understanding Web Performance Monitoring


Web Performance Monitoring includes a few different components:

  1. Website Monitoring
  2. Web Application Monitoring
  3. API Monitoring (in many cases)

Oftentimes, websites include web applications that drive user experience, information retrieval and display, and more.  Understanding how both of these perform, along with any APIs you may rely on for web functionality, is essential to understanding the web experience you’re delivering to each user.
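To make the API piece concrete, here is a minimal TypeScript sketch of an API health check.  The endpoint URL, expected payload shape, and latency budget are illustrative assumptions, not any particular tool’s API; the sketch simply asks the three questions every monitor keeps asking: is the API up, is it correct, and is it fast?

    // Minimal API health check; URL, payload shape, and budget are hypothetical.
    const API_URL = "https://api.example.com/health";
    const LATENCY_BUDGET_MS = 2000;

    async function checkApi(): Promise<void> {
      const start = Date.now();
      const response = await fetch(API_URL);              // availability
      const latencyMs = Date.now() - start;

      if (!response.ok) {
        throw new Error(`Availability: HTTP ${response.status}`);
      }
      const body = await response.json();                 // functionality
      if (body.status !== "ok") {                         // assumed payload shape
        throw new Error(`Functionality: unexpected payload ${JSON.stringify(body)}`);
      }
      if (latencyMs > LATENCY_BUDGET_MS) {                // performance
        throw new Error(`Performance: ${latencyMs}ms over ${LATENCY_BUDGET_MS}ms budget`);
      }
      console.log(`API healthy in ${latencyMs}ms`);
    }

    checkApi().catch((err) => console.error("ALERT:", err.message));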

Monitoring website performance and web apps requires an understanding of how a user would interact with a site or application, and identifying the ‘business-critical’ components of those interactions, so you can set up a monitor to ensure everything is up and running, fast, and operating as expected.  Typically, there are two layers of web performance monitoring: proactive monitoring and reactive monitoring.

Real User Monitoring, or RUM, is known as reactive or passive monitoring: the systems or tools in place simply observe real user interactions with your website in the background and track how the website or web app responds to them.  This is a good way to understand what actions users are taking or attempting to take, as well as some interesting information about how users react to your content.
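As a concrete illustration, here is a minimal browser-side RUM sketch in TypeScript, assuming a hypothetical collection endpoint.  It passively records the page-load timing a real visitor just experienced and beacons it home without ever interrupting them:

    // Passive instrumentation: measure what just happened to a real user.
    // The collector URL is a hypothetical placeholder.
    const COLLECTOR_URL = "https://rum.example.com/collect";

    window.addEventListener("load", () => {
      // Wait one tick so loadEventEnd has been recorded.
      setTimeout(() => {
        const [nav] = performance.getEntriesByType(
          "navigation"
        ) as PerformanceNavigationTiming[];
        if (!nav) return;

        const sample = {
          page: location.pathname,
          ttfbMs: nav.responseStart - nav.requestStart, // time to first byte
          domReadyMs: nav.domContentLoadedEventEnd,     // DOM parsed and usable
          loadMs: nav.loadEventEnd,                     // full page-load time
        };

        // sendBeacon is fire-and-forget and never blocks the user.
        navigator.sendBeacon(COLLECTOR_URL, JSON.stringify(sample));
      }, 0);
    });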

Synthetic monitoring is known as active or proactive monitoring, as these tools emulate a real user and interact with your website or application on a schedule, so that if something does malfunction, you’re able to find and fix the problem before users encounter it.
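The TypeScript sketch below shows the core idea: probe a simplified ‘business-critical’ journey on a fixed schedule, measure each step, and flag failures before a real user hits them.  The URLs, steps, and interval are invented for illustration; a commercial tool like AlertSite runs full scripted browser transactions from many locations, which this sketch does not attempt to replicate.

    // Active probing: emulate a user journey on a schedule and flag failures.
    // URLs, steps, and interval are hypothetical.
    const STEPS = [
      { name: "home page", url: "https://www.example.com/" },
      { name: "search", url: "https://www.example.com/search?q=shoes" },
    ];
    const CHECK_INTERVAL_MS = 60_000; // probe once a minute

    async function runCheck(): Promise<void> {
      for (const step of STEPS) {
        const start = Date.now();
        try {
          const res = await fetch(step.url);
          const elapsedMs = Date.now() - start;
          if (!res.ok) {
            console.error(`ALERT: ${step.name} returned HTTP ${res.status}`);
            return; // later steps depend on this one
          }
          console.log(`${step.name}: ${elapsedMs}ms`);
        } catch {
          console.error(`ALERT: ${step.name} is unreachable`);
          return;
        }
      }
    }

    runCheck();
    setInterval(runCheck, CHECK_INTERVAL_MS);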

Getting Started

How can you craft a strategy to empower your company to monitor continuously?

  1. Identify Roadblocks
  2. Identify Key Transactions and User Journeys
  3. Standardize Metrics and Share Data 
  4. Reuse Assets – Leverage API designs and test scripts to set up new monitors
  5. Reduce Risk – Monitor in Test
  6. Get Full Visibility – Monitor internally and externally

Identifying tools that empower you to understand what is happening both within your internal systems and from an external, end-user perspective is vital to ensuring you have full visibility into the digital experience that you’re delivering.

Consequences of Ignoring Web Performance

Everyone remembers Twitter’s infamous fail whale, and more than a few of us have suffered the loss of Gmail during sudden unexpected downtimes. When thousands of visitors try to access a site simultaneously, the overwhelming load issues can bring otherwise speedy sites to a standstill or crash them completely, dashing the hopes of online retailers during, say, an otherwise promising Cyber Monday.

These are extreme examples of site performance issues—easy for all to see, understand, and react to. Plus, they relate more to sites not being accessible at all, as opposed to more common sluggishness and download problems.  But what happens when your site’s performance issues tend more toward the opposite extreme, almost not seeming like “issues” at all—consisting of delays that are subtle and difficult to detect, affecting users almost subliminally?

In 2006, Amazon’s Greg Linden reported that during A/B testing, their team found that even a 100-millisecond latency in page-load time decreased sales by 1%.  That same year, Marissa Mayer, then a VP at Google, reported that just a 500ms delay in Google’s search-result response time resulted in a 20% drop in traffic and revenue.  (If just half a second dropped Google’s ad income by 20%, imagine the damage that a three-second delay could do.)

And that was just the beginning.  In 2008, tech research firm Aberdeen found that a one-second delay in response time resulted in 7% fewer online conversions, 11% fewer page views, and a 16% decrease in customer satisfaction.  In 2009, Akamai, in partnership with Forrester Research, discovered that the average online shopper expects pages to load in two seconds or less (down from the four seconds that used to placate shoppers in 2006).  After a three-second lag, they learned, up to 40% of potential customers would abandon a site.  Also that year, the ecommerce experts at Bizrate and Shopzilla reported the completion of a 16-month engineering overhaul of their entire platform, which improved site uptime and dropped average full-page download time to just 1.2 seconds, down from 6-9 seconds before the overhaul began.  The changes delivered a 5% to 12% lift in top-line revenue, depending on traffic source, and clearly thrilled the company.

By 2012, Web users’ impatience had reached unprecedented levels, according to The New York Times, with people opting to visit a site less often if it is slower than a close competitor by only 250ms. In other words, if you can blink your eyes before a site loads, it’s not worth visiting.  Mobile sites and apps still have considerable leeway, the Times reported, given that smartphone connection speeds are so variable, but users are getting more impatient on that front, too.

Considering these studies and others like them, the financial implications of site performance for every company hoping to conduct business online have never been more obvious.  And it goes for all sites, too, whether one is hoping to make money or not.  Website performance can be optimized for a huge variety of factors, including revenue, customer acquisition, visitor retention, user satisfaction, site uptime, site load capacity and traffic, time on site, and basic clicks.  Fortunately, various web performance monitoring tools can keep an automated eye on all of those criteria and more.

Many modern Web performance monitoring applications are cloud- and browser-based, enabling software engineers to check on their site’s performance from anywhere, at any time.  A common feature of these programs is continuous, 24/7 site monitoring, which can send automatic email or SMS alerts whenever key indicators—such as a delay in the site’s response time or sudden traffic spikes—veer outside of pre-established optimal levels.  Most importantly for the sake of WPO, however, some of these tools also perform detailed analyses and diagnostics of websites on the level of user experience (UX) itself—determining where users tend to click the most, the order in which page elements are first displayed to a visitor’s eyes, and how a given site renders within different Web browsers.  With performance monitoring software providing this level of insight, developers can delve into a site armed with crucial information to help them improve their code, streamline the user experience, and accelerate page-load times to be as fast as possible.
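The alerting half of that loop can be sketched in a few lines of TypeScript.  The metric names, thresholds, and notification hook below are all assumptions standing in for a real email or SMS integration; the point is only the shape of the logic: compare each measurement against pre-established levels and notify someone the moment a metric veers outside them.

    // Threshold-based alerting; thresholds and the notify hook are hypothetical.
    interface Thresholds {
      warnMs: number;
      criticalMs: number;
    }

    const RESPONSE_TIME: Thresholds = { warnMs: 2000, criticalMs: 5000 };

    // Stand-in for an email/SMS integration; a real service would page on-call.
    function notify(message: string): void {
      console.log(`[alert] ${message}`);
    }

    function evaluate(metric: string, valueMs: number, t: Thresholds): void {
      if (valueMs >= t.criticalMs) {
        notify(`CRITICAL: ${metric} at ${valueMs}ms (limit ${t.criticalMs}ms)`);
      } else if (valueMs >= t.warnMs) {
        notify(`WARNING: ${metric} at ${valueMs}ms (limit ${t.warnMs}ms)`);
      }
    }

    evaluate("home page response time", 2600, RESPONSE_TIME); // emits a WARNING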

Of course, at a moment in history when websites are competing to load faster than we can blink, sheer speed as measured by software or research studies doesn’t count for everything.  The real question is, and will continue to be: How fast is fast enough?  

The Future: Digital Experience Optimization and Monitoring


It should always be remembered that monitoring Web performance is for the sake of human beings, not machines. The ease, enjoyment, and perception of your site’s users—not computer metrics—are everything.

With this in mind, one of the key distinctions in Web performance monitoring and optimization is between response time and page-load time.  Technically speaking, both of these terms have a wide range of meanings, but we’ll just say here that within the context of a specific website, response time refers to how long it takes for individual elements on a page to appear to a user, while page-load time refers to downloading and displaying the entire page itself (including all HTML documents, scripts, stylesheets, and embedded objects).  And this distinction is important because, as every developer knows, what your end-users don’t know won’t hurt them.  If some of a webpage’s elements don’t load, but it doesn’t noticeably detract from a user’s experience, then do those elements really matter? As long as the most essential and useful elements of a site display fast enough to satisfy a human user, then the overall page-load time might even be irrelevant.
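That distinction is directly measurable in the browser.  The TypeScript sketch below, written against the standard Performance APIs, contrasts First Contentful Paint, a rough proxy for when a user first sees something useful, with the load event, which marks full page-load time.  The gap between the two is content your users may never miss.

    // Contrasting perceived readiness with full page load, in the browser.
    // When the first text or image is painted:
    new PerformanceObserver((list) => {
      for (const entry of list.getEntries()) {
        if (entry.name === "first-contentful-paint") {
          console.log(`First contentful paint: ${entry.startTime.toFixed(0)}ms`);
        }
      }
    }).observe({ type: "paint", buffered: true });

    // When every stylesheet, script, and image has finished loading:
    window.addEventListener("load", () => {
      setTimeout(() => {
        const [nav] = performance.getEntriesByType(
          "navigation"
        ) as PerformanceNavigationTiming[];
        console.log(`Full page load: ${nav.loadEventEnd.toFixed(0)}ms`);
      }, 0);
    });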

This seemingly obvious idea—human-centric Web optimization—was perhaps the dominant theme at the 2013 Velocity Conference in New York City, and there is little doubt that as websites and mobile apps get more and more complex, getting clearer about the end-user’s actual perception of a site (versus the technical reality as determined by software) is going to be the defining characteristic of the future of both WPM and WPO practices. As Mike Petrovich of SNAP Interactive observed in his Velocity presentation, “A page is ‘done’ when it’s user-ready—when the essential elements of a page are ready for the user to use.”

This doesn’t mean that performance monitoring tools—and optimizing Web performance based on what those tools reveal—aren’t still essential.  It just means that user experience needs to shift to center stage in the performance equation.  We can even write it out as WPM + WPO = UX, and sum it up with a question that may guide web performance practices for the foreseeable future:

How do we accurately monitor and measure the experience we are delivering to a user in order to optimize it for them?
