RUM vs Synthetic Monitoring
Understanding the performance of your website is vitally important for many reasons. And, odds are, someone in your organization is performing some form of monitoring on your site today. But is that important performance information being shared and used effectively by all parties that should care about it?
Website monitoring solutions fall into two categories: synthetic and real user monitoring (RUM). Synthetic monitoring uses canned scripts to simulate user activity on the site. The scripts run from one or several cloud locations at predefined intervals, collecting performance data along the way.
Real user monitoring, as its name implies, collects performance data from real users as they visit the site. Additional information is also collected, such as where the user is visiting from, the browser type and version, and the connection speed.
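In practice, RUM data is typically gathered in the browser via the W3C Navigation Timing API and beaconed back to a collection endpoint. A minimal sketch of how a few such metrics might be derived (the `buildRumBeacon` function and field names are illustrative, not any specific product's API):

```typescript
// Derive a handful of RUM metrics from a navigation-timing-shaped record.
// The shape mirrors a subset of PerformanceNavigationTiming; in a browser you
// would pass performance.getEntriesByType("navigation")[0].
interface NavTiming {
  requestStart: number;             // ms since navigation start
  responseStart: number;
  domContentLoadedEventEnd: number;
  loadEventEnd: number;
}

interface RumBeacon {
  ttfbMs: number;               // time to first byte
  domContentLoadedMs: number;   // DOMContentLoaded completed
  pageLoadMs: number;           // load event completed
  userAgent: string;            // browser type/version info
}

function buildRumBeacon(nav: NavTiming, userAgent: string): RumBeacon {
  return {
    ttfbMs: nav.responseStart - nav.requestStart,
    domContentLoadedMs: nav.domContentLoadedEventEnd,
    pageLoadMs: nav.loadEventEnd,
    userAgent,
  };
}

// In a browser, the payload would then be sent to the collector, e.g. with
// navigator.sendBeacon(collectorUrl, JSON.stringify(beacon)).
```

Because the measurement runs in the visitor's own browser, every beacon arrives already tagged with the user's real network, device, and browser conditions, which is exactly what synthetic tests cannot capture.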
Both synthetic and RUM tools provide invaluable insight into site performance, but neither is a magic bullet. In fact, there is an appropriate use for both on the same site.
The table below summarizes the differences between synthetic and RUM and shows the types of information each provides at different stages of the application lifecycle.
| Capability | Synthetic | Details | RUM | Details |
|---|---|---|---|---|
| **Development** | | | | |
| Controlled environment | y | | n | |
| Understand 3rd parties – waterfall | y | | y | |
| Full page weight – resource sizes | y | | y | |
| Performance budgets | y | | y | |
| Content checking – screenshot on error | y | | n | |
| Film strips | y | | n | |
| **Test** | | | | |
| Competitive benchmarking | y | | n | can only execute on a site you control |
| Baselines | y | | y | |
| Test all paths (including conversion) | n | limited number of paths based on script | y | see all paths of real users |
| Understand browser performance | n | limited to a few browsers | y | data available for all real user browsers/versions |
| Understand traffic patterns | n | | y | |
| Understand performance at various percentiles | n | | y | report on 10th, 25th, 50th, 75th, 95th percentiles |
| Compare releases | y | | y | |
| Build accurate load tests | n | | y | understand production peaks to replicate during load testing |
| **Production** | | | | |
| 24/7 monitoring | y | | y | |
| Availability % | y | | n | |
| Alerting | y | | y | |
| Capture all performance situations | n | | y | |
| Understand performance of actual users | n | | y | |
| Capture complete demographics and impact on performance (location, browser, device type, OS, network, etc.) | n | | y | use filters to isolate the desired user population |
| Understand peak concurrent users/page views | n | | y | important data for load testing |
| Understand performance of 3rd parties in real time | n | | y | |
| Understand impact of performance on business goals | n | | y | |
| Understand the opportunity cost/benefit of performance | n | | y | how much more revenue will I gain/lose with a faster/slower site? |
| Identify the pages with the greatest impact on business goals/conversion | n | | y | |
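The percentile reporting mentioned in the table falls directly out of the raw RUM samples. A minimal sketch using the nearest-rank method (the `percentile` function and sample data are ours, for illustration):

```typescript
// Nearest-rank percentile over a set of RUM page-load samples (milliseconds).
function percentile(samples: number[], p: number): number {
  const sorted = [...samples].sort((a, b) => a - b);
  // Nearest-rank: the value at the ceiling of p% of the sample count.
  const rank = Math.ceil((p / 100) * sorted.length);
  return sorted[Math.max(rank - 1, 0)];
}

// Hypothetical load times collected from real users over some window.
const loadTimesMs = [120, 250, 310, 480, 520, 760, 900, 1200, 1500, 3100];

// The percentiles a RUM tool would typically report.
const report = [10, 25, 50, 75, 95].map(
  (p) => `${p}th percentile: ${percentile(loadTimesMs, p)} ms`
);
```

Percentiles matter here because RUM distributions are long-tailed: the median can look healthy while the 95th percentile reveals the painful experiences that averages hide.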
There is no single magic bullet for performance testing. RUM numbers represent a broad spectrum of user experiences as measured from the field. Synthetic numbers represent a snapshot in time as measured from a clean room environment. It’s important to understand the strengths and weaknesses of each tool type and apply the proper tool to the situation.