I remember working on a subscription-based website ("Software as a Service", or "SaaSy") designed to help salespeople put new deals together. It had been a few weeks since we had cleaned up the database; the display tables on the web forms were getting bigger and bigger, while the history tab was starting to get cluttered ... just like a live production environment. As we added test data every day, parts of the software started to render just a little bit slower.
I was talking about this with a developer and the response was basically "Oh, that's just the browser. We can't do much there".
I think every tester has heard that line at least once. If you haven't, you are more fortunate than most, and your day is probably coming.
Let's take a look at a few ways we can get a better handle on performance in the user interface.
The few times developers weren't interested in what I was saying about performance, it was partly because of the way I presented the information. They could have done a better job of listening and investigating, sure, but I can't control their behavior. One thing I can control, and improve, is how I explain the situation.
"Slow" is an opinion; "page rendering jumped from 100 ms to 5 seconds" is hard data. Profiling -- using tools to see everything that is loaded and how long it takes -- is how we move from subjective impressions to hard data. Using profiling tools, you can see that the images loading every time you hit refresh in the browser are larger than you had imagined, or that there is a lot of data coming down the pipe that may not be needed at the moment. Even more important, you can collect data and tell a story like:
"I navigated to the campaign setup screen and it took 20 seconds to load the first time around. In developer tools I can see that there are 10 MB of images loading, and another 10 MB of data. The previous build only took 5 seconds to load the first time."
This story gives the listener two important things -- data about what you think might be causing the slowdown, and context for how bad the slowness is compared to how things were before.
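The numbers in a report like that can be pulled straight from the browser's Resource Timing data. Here is a minimal sketch of the idea; `summarizeResources` is a hypothetical helper of my own, and in a browser console you would feed it `performance.getEntriesByType('resource')`, whose entries carry the same `transferSize` and `duration` fields used below.

```javascript
// Boil Resource Timing entries down to the hard numbers a report needs:
// request count, total payload, and the single slowest resource.
function summarizeResources(entries) {
  const totalBytes = entries.reduce((sum, e) => sum + (e.transferSize || 0), 0);
  const slowest = entries.reduce((a, b) => (a.duration >= b.duration ? a : b));
  return {
    requests: entries.length,
    totalMB: (totalBytes / (1024 * 1024)).toFixed(1),
    slowestName: slowest.name,
    slowestMs: Math.round(slowest.duration),
  };
}
```

Run something like this against the previous build and the current one, and you have the benchmark comparison the story above leans on.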
There are a couple of simple ways I like to collect this data. You could use a stopwatch, but that doesn't tell us whether the problem is the delay between connections, time on the server, or the front end. The YSlow browser plugin is a nice place to start getting performance-related information. I like this tool because of its simplicity; you don't need a technical or performance testing background to begin collecting useful info. Using this tool is just a single button click. After the profiler runs, you get a list of grades for different aspects of the page along with recommendations. (So much for "we can't do much there.") You can even set YSlow to run every time a page loads.
If you want to go deeper, open up Firebug or the developer tools in Google Chrome. The magic here is in the Network and Timeline tabs. There you'll find every file that is loaded into the browser for your page and exactly how long each one takes.
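If you'd rather script that view than eyeball it, the same Network tab information is exposed to the page itself. The sketch below groups loaded resources roughly the way the Network tab's filter bar does; `weighByType` is a hypothetical helper, and in a browser you would pass it `performance.getEntriesByType('resource')`.

```javascript
// Total the transfer size of every loaded resource, grouped by what
// triggered it (img, script, css, xmlhttprequest, ...). A scriptable
// version of the totals the Network tab shows.
function weighByType(entries) {
  const totals = {};
  for (const e of entries) {
    const type = e.initiatorType || 'other';
    totals[type] = (totals[type] || 0) + (e.transferSize || 0);
  }
  return totals;
}
```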
Take Your Time to Gather Information
Sometimes, performance problems will sneak up over time and surprise you. There is a popular sentiment that developers should avoid optimization until it is absolutely necessary. I understand this -- we need to make and sell software, not build an ivory tower. But that attitude ignores what will become sneaky performance problems, and eventually they will need attention. There are a few common things I have noticed that cause browsers to creep slower over time. Being aware of them turns browser performance into something more like a design choice than a boring optimization task. They are also changes to watch out for when testing.
Images can be very data heavy while adding little to what the user actually sees. It's common, for example, to load a large image and then shrink it to fit a small space. When it comes to images, less is more, yet image-heavy software is increasingly popular. The last product like this I worked on was a marketing platform. There was a list of ad campaigns, and the user got to the campaign details by clicking an image. Eventually, that leaves the user on a page that could have 20 or more high-resolution images. We learned about the performance implications by adding as many campaigns (and images) to that page as we could and running a tool like YSlow.
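That "big image shrunk into a small space" pattern is easy to check for in code. Here is a minimal sketch, assuming a hypothetical `oversizedImages` helper; in a browser you would build the list from `document.querySelectorAll('img')`, whose elements expose the same `naturalWidth` (downloaded pixels) and `width` (rendered pixels) properties.

```javascript
// Flag images whose downloaded width is at least `factor` times the
// width they are actually displayed at -- wasted bytes on the wire.
function oversizedImages(images, factor = 2) {
  return images
    .filter((img) => img.naturalWidth >= img.width * factor)
    .map((img) => img.src);
}
```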
There is more to the data on a webpage than the text blurbs and labels. Everything is data, and the more of it you have, the longer a page will take to load.
We know that everything is data, and I learned the hard way that page controls sometimes carry more of it than expected. I once worked on a product that was used for documenting anesthesiology cases. Each case showed up as a row on a page along with a few buttons and labels. During one release, we added a drop list to each row as a way to set status. It seemed like a fairly simple change at the time. On release day, we used a larger data set and noticed that things came to a crawl at around 50 rows. The drop list and its data on each row added a large amount of data to the page load that we hadn't anticipated.
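The arithmetic behind that surprise is worth making explicit: anything repeated per row multiplies by the row count. A back-of-envelope sketch, with made-up numbers for illustration:

```javascript
// Page weight grows linearly with rows when every row carries its own
// copy of a control's markup and data.
function pageWeightKB(baseKB, perRowKB, rows) {
  return baseKB + perRowKB * rows;
}

// A drop list costing 4 KB per row is invisible at 5 rows, but by
// itself adds 200 KB to a 100 KB page at the 50-row mark.
```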
Seeing Front-End Performance Problems Before They Exist
One question I see testers ask often is "When should testers get involved in a project?" The generic answer is "as soon as possible"; we can add value before the software even exists by asking questions and testing ideas. I don't have any problem with that answer, but let's get more detail on why.
Agile projects run fast. There is a short planning meeting where the team -- programmers, testers, and product manager -- talks about what needs to be built over the next two or three weeks. This meeting is usually light on details. It is also where you can start asking meaningful questions that will get the team to dig deeper into what they are making. When I see a page being designed that could hold a lot of data, I like to ask what the upper limits on that data are. Exactly how many rows should be displayed? If we don't know, how will we handle larger data sets?
The role of the tester isn't design, and while I could ask about things like lazy loading and paging, that might be solving the wrong problem. Asking questions, on the other hand, can surface questions the team didn't know it had.
Talking About Performance
Sometimes, when I talk about problems I've noticed or things I want other people to think about, I just mention them in passing (this is a thing; some call it MiPing). There are a couple of benefits to this casual conversation: it is a quick way to plant seeds of ideas you can grow later, and it primes people for a deeper conversation. There is a drawback, though; sometimes these passing conversations are so brief that no one can do anything with the small amount of information.
The times I tried to mention performance problems in a hallway conversation usually didn't go well. Bugs and software problems need context to make sense and be compelling, and performance problems are no different in that regard. The type of context these problems need is different from your run-of-the-mill software bug, though. Performance problems need data -- file sizes, load times, data sizes, and benchmark comparisons -- to make sense. It is hard to convey all of that in a hallway conversation.
There is a spectrum in bug reporting that ranges from very formal documentation in tracking systems with carefully designed workflows and daily triage meetings, all the way to developer and tester pairing where the report is a conversation and problems are fixed right away. I find that slowness in the user interface is best reported somewhere in the middle. Not for the purpose of ceremony or process, but so all the important parts of the problem make it to the people who need them.
If you're going to have that hallway conversation, you might want to collect some data first. The phrase "Hey, this page feels a little slow" is much better followed by "...and here is some data" than by silence and a blank stare from a programmer who feels their time has just been stolen.
Performance testing can feel like a challenge to get into. There are lots of technical tools to learn, new concepts to think about like SQL indexing and memory caches, and the math involved in benchmarking. Starting in the user interface is more approachable, and it is also a great way to add value to your team.
Do you have any UI performance testing tricks? I'd love to hear about them.