We all know the phrase, “throw the baby out with the bathwater”. Unfortunately, marketers are often presented with enticing opportunities to do just that. One classic example is the monolithic “website redesign project”. The purpose of a site redesign should only be to move the business ahead; a lateral, or even regressive, redesign can be a costly mistake.
Websites can be made of thousands or even millions of words, images, and lines of code. Tracking these details and their effectiveness requires some discipline.
The challenge is, beyond intuition, how do we really know if a redesign is a step forward? With other stakeholders, how can we collectively decide and agree on a direction? Is there an objective method of measuring something so subjective?
One of the practices we’ll use to verify changes before they are implemented is to keep a list of “User Tests”. User tests are a simple method of gathering and measuring the requirements of a successful project.
User Tests are simple statements that, at a minimum, should contain an “Agent”, a “Goal”, and a “Location”. A good test should be answerable with a simple checkmark indicating “Pass” or “Fail”, requiring no additional notes, explanations, or qualifiers.
A visitor should be able to search on all pages.
GoogleBot should find a unique title on all pages.
A returning buyer should be able to view order details in my account.
By keeping a complete spreadsheet of User Tests, the savvy marketer can keep tabs on all changes before and after implementation, thereby ensuring a redesign does not leave some important capability behind.
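If a spreadsheet ever feels too loose, the same idea can be kept as structured records. Here is a minimal sketch in Python; the field names and example data are illustrative, not a prescribed format:

```python
from dataclasses import dataclass

@dataclass
class UserTest:
    agent: str      # who is acting: a visitor, GoogleBot, a returning buyer...
    goal: str       # what they should be able to do
    location: str   # where on the site the test applies
    passed: bool    # a simple checkmark: Pass or Fail, no qualifiers

tests = [
    UserTest("Visitor", "search the site", "all pages", True),
    UserTest("GoogleBot", "find a unique title", "all pages", True),
    UserTest("Returning buyer", "view order details", "my account", False),
]

# Failing tests become the to-do list for the next round of changes.
failing = [t for t in tests if not t.passed]
for t in failing:
    print(f"FAIL: {t.agent} should be able to {t.goal} in {t.location}")
```

The point of the three-field structure is that every test stays readable as a sentence, so anyone on staff can evaluate it without explanation.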
There is no need to make your user tests complex. In fact, the simpler, the better. Simple user tests are faster for you (or your intern) to evaluate, and easier to explain. At Alpine, we feel that a test should be understandable by any staff member and by our clients.
Some people want to add other columns for “scope”, “risk”, or “owner”. That’s fine, but keep each test simple to explain and evaluate.
Define All Agents
Be sure you know who the agents are in your user tests. A visitor is different from a returning visitor and will have different goals when visiting your site. Googlebot, Bingbot, and the Facebook crawler all have different data mining needs and preferences.
Don’t forget your back-office agents: content authors, product managers, shipping staff, and managers can all benefit from specialized user tests.
If each type of agent in your tests is defined and profiled, your user tests will be more personalized and therefore more accurate.
Start With The Best Case (“Sunny Day” Tests)
When all is going well, product is shipping on time, customer service talk times are down, public relations are favorable… these are the sunny days when goals are met quickly and response routing works perfectly. Plan for these.
Even if the rest of the system lacks some efficiency, keep these tests in sight for the day when it all works.
Think About the Worst Case (“Rainy Day” Tests)
Starting with the most likely things that can go wrong, think about how your website can help support your business. This is the land of the “what if” scenarios.
What if the phone system goes down? What if the purchaser does not get your confirming email? What if that color is not available? You know your business. You’ll know all about the rainy days.
Write your worst case scenarios down, then plan workarounds with one or more user tests.
OK to Fail
Not all tests must pass initially. If all your tests passed right now, you would be reading this from a beach somewhere, because you would have nothing left to work on.
Write some tests that reflect your direction and desires. That way, later on, your redesign plans can include some user tests that are currently failing.
What If It’s Not Broken?
We all tend to get a little complacent when things are going great, so these are the times that tempt us to stop changing. If everyone in the boardroom loves that you are getting a 3% conversion rate, why change anything? Why fix something that’s not broken?
The problem is, someone will get the bright idea that someone else is getting 4%, 5%, or even 30% conversions, and suddenly you are behind.
So, what if you took 1% of your overall traffic and tested 2 new kooky ideas every month?
If any one of those kooky ideas results in a one-point improvement… you know the drill. After all, a jump from a 3% conversion rate to 4% represents a 33% increase in sales.
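The arithmetic behind that claim is worth spelling out. Holding traffic constant (the visitor count below is purely illustrative), a one-point lift on a 3% baseline is a large relative gain:

```python
# Worked example: relative sales increase from a conversion-rate lift,
# with traffic held constant. The traffic figure is an assumption.
visitors = 10_000           # illustrative monthly traffic
baseline_rate = 0.03        # 3% conversion
improved_rate = 0.04        # 4% conversion

baseline_sales = visitors * baseline_rate   # 300 sales
improved_sales = visitors * improved_rate   # 400 sales

relative_increase = (improved_sales - baseline_sales) / baseline_sales
print(f"{relative_increase:.1%}")  # one percentage point = 33.3% more sales
```

Note the distinction this makes plain: a one percentage-point change in conversion rate is a roughly one-third change in sales, because the increase is measured against the small baseline.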
Share Your Tests
If you have colleagues running other websites, collaborate and help each other improve your user tests. Often, another perspective adds value.
Feel free to work with customers, with people who know your work well, and with people who don’t.
Share Your Results
Create a sense of accountability with your staff and your stakeholders. Help them understand the gravity of their choices, possibly risking existing results for a new idea.
Don’t wait for the next monolithic “website redesign project” to build a list of user tests. Start now with your existing site. Track the successes and failures, and plan for enhancements right away.
Can one or two of those enhancements fit into this month’s budget for maintenance?