Putting Web Site Quality and Accessibility into Context - Define and Implement Testing

To ensure a Web site meets its targets, we need to define and implement tests that provide evidence as to whether those targets have been met. If we don’t, we are simply hoping for the best. In my experience, this tends to be the norm, and the survey results suggest it is also the case for many organizations.

Testing Prior to Launch

We already know that usability is important — the most important of all the quality factors according to the survey results. So it stands to reason that this would be the most important aspect of a Web site to be tested before it goes live. Reason, it seems, is lacking. Only 15% of respondents do a lot of usability testing prior to launch, and 13% do none at all (see Graph 2). The results are similar for security: 16% do a lot of security testing, and 17% do none at all.

Let’s take a step back for a moment. Usability is something that more than half of the respondents claim must be present for a Web site to be of high quality, yet less than one-fifth of respondents bother to actually check whether their site is usable. In my experience, most clients will only do informal testing: they will check the site themselves, get colleagues and friends to check it, and maybe have family members check it to get an impartial point of view. The result is ad hoc. For example, on a recent project for a pathology organization, the feedback I got was, “We think the site would look and read better if the text was justified.” My gut feeling was that they were wrong. I did some quick research (about one hour of reading results from a Google search on “text justified readability”). What I found is that views on the matter vary, but the balance of opinion favors left-justified text for Web pages. I sent a few links to the client, and the request went away. Had I simply followed orders, the client would have a site that was actually worse from a usability perspective. This is what happens when there is a lack of proper usability testing: what “we” might think is a good thing may actually do more harm than good.

Let’s put this in a different context. Imagine that 50% of chefs say they must wash their hands after going to the toilet before touching any food (personally I’d hope it was 100%!), but when those chefs are asked if they actually wash their hands, 12% say they never do, and only 14% say that they do all the time. Having learned this information, how comfortable would you feel next time you went to a restaurant? OK, this is a bit of an extreme example, but the situation is the same. We may say that something is important, but once again, only a small percentage — less than 25% — actually bothers to test for whatever it is that we claim is important.

Does that mean that as an industry we don’t really care? Or that we are all so good that we don’t need to check? According to the survey results, we do need to check, given that none of the quality factors is present on all the Web sites surveyed. I’m sure most people working in the Web industry would like to consider themselves professionals. I like to think of myself that way, but I don’t have the proof to back it up. Over a period of 12 years, I have been involved in only a handful of projects that have gone through formal usability testing, and even fewer that have been through a formal security audit. Just thinking about this makes me wonder how I’ve managed to get away with it for so long. In my case, it’s often because the client does not value testing or, more accurately, doesn’t want to pay for it. But, as a professional, it’s my job to convince the client of its value whenever what we are testing for directly relates to a target the site has to achieve.

Testing Post Launch

Web sites rarely remain static. Very few sites these days are what you could call “electronic brochures,” that is, static Web sites that are launched and never change. Most Web sites evolve over time in a number of ways: content is added, edited, and removed; features are added, modified, or removed; new designs are applied. Given this, you would expect that if a site is tested prior to launch to ensure it meets certain standards, it would be retested after each change, or at least at regular intervals, to ensure it still meets those standards. In my experience, this doesn’t happen. The usual scenario is that a problem surfaces and reactive measures are taken. The most common example I find is a site that was tested prior to launch to ensure it worked in a number of browsers. At a later date, someone surfing the site with one of those browsers finds something doesn’t work and alerts the site administrator, who then takes action to rectify the error. It’s a reactive approach, but at least action is taken; sometimes the error never gets fixed at all. In an ideal world, there would be a plan in place to retest a Web site. When we asked whether sites had been retested since launch, 43% of respondents said no retesting had been done (see Graph 4).

Graph 4 - Since launch, what have you retested?
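There is nothing in the survey about how retesting is carried out, but it doesn’t have to be a big undertaking. As a minimal sketch (the example.com addresses, the page list, and the time budget below are my assumptions, not part of the survey), a short script can re-fetch a handful of key pages after each change, or on a schedule, and flag anything broken or slow:

```python
"""Minimal recurring smoke test for a site's key pages. A sketch only:
the URLs and the time budget are placeholder assumptions."""
import sys
import time
import urllib.error
import urllib.request

# Hypothetical pages that must keep working between releases.
KEY_PAGES = [
    "https://www.example.com/",
    "https://www.example.com/contact",
    "https://www.example.com/products",
]

SLOW_SECONDS = 2.0  # assumed acceptable response time per page


def check(url):
    """Return a problem description, or None if the page looks healthy."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=10) as response:
            response.read()
    except urllib.error.HTTPError as err:
        return "HTTP %d" % err.code
    except urllib.error.URLError as err:
        return "unreachable (%s)" % err.reason
    elapsed = time.monotonic() - start
    if elapsed > SLOW_SECONDS:
        return "slow (%.1fs)" % elapsed
    return None


if __name__ == "__main__":
    failed = False
    for url in KEY_PAGES:
        problem = check(url)
        if problem:
            print("FAIL %s: %s" % (url, problem))
            failed = True
    sys.exit(1 if failed else 0)
```

Run from a scheduled job after every deployment, even a check this small means a broken key page is noticed by the site’s owners before a visitor has to report it.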

In summary, not enough testing is done prior to launch, and even less is done after launch. You could argue, then, that testing isn’t that important, but that would be wrong. When there is a problem with a Web site (images are missing, or the site is responding slowly or not at all), I get urgent calls from clients wanting the problem fixed immediately, yet those same clients repeatedly fail to put the effort in up front or to take preventive measures once the site is launched.
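The same preventive idea extends to the two complaints just mentioned. As another hedged sketch (the page address is again a placeholder, and because it uses only Python’s standard library it sees only the <img> tags in the delivered HTML, not images added by scripts), this script times the page response and reports any image whose source no longer resolves:

```python
"""Sketch of a preventive check for missing images and slow responses.
The page address is a placeholder; stdlib only, so only <img> tags
present in the delivered HTML are checked."""
import time
import urllib.error
import urllib.parse
import urllib.request
from html.parser import HTMLParser

PAGE = "https://www.example.com/"  # hypothetical page to monitor
SLOW_SECONDS = 2.0                 # assumed "too slow" threshold


class ImageCollector(HTMLParser):
    """Collect the src attribute of every <img> tag on the page."""

    def __init__(self):
        super().__init__()
        self.sources = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src")
            if src:
                self.sources.append(src)


start = time.monotonic()
with urllib.request.urlopen(PAGE, timeout=10) as response:
    html = response.read().decode("utf-8", errors="replace")
elapsed = time.monotonic() - start
if elapsed > SLOW_SECONDS:
    print("WARNING: %s took %.1fs to respond" % (PAGE, elapsed))

collector = ImageCollector()
collector.feed(html)
for src in collector.sources:
    image_url = urllib.parse.urljoin(PAGE, src)  # resolve relative paths
    try:
        with urllib.request.urlopen(image_url, timeout=10):
            pass
    except (urllib.error.URLError, ValueError):
        print("MISSING IMAGE: %s" % image_url)
```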

Maintenance Plan

A car needs regular maintenance to ensure it continues to run well. A Web site is no different. Because Web sites change over time, it’s important to have a plan in place to maintain the site, so that changes are not just reactive but considered and planned, ensuring the Web site continues to meet the targets set (assuming there are targets in the first place). Fortunately, the survey shows this is something that is actually being considered. Although only 34% of respondents have a maintenance plan in place, another 46% are either reevaluating an existing plan or working on one (see Graph 13). The remaining 20% have no plan at all, but overall, 80% either have something in place or are working on a plan. This is the most positive response in the survey, and it shows that as an industry we recognize not only that Web sites are dynamic but also that we need plans in place to keep them working well.
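What goes into such a plan will differ from site to site, but at its simplest it is a list of recurring tasks and the interval at which each falls due. To illustrate (the tasks and intervals below are my own assumptions, not survey findings), even that minimal form is enough for a script to report what is overdue:

```python
"""A maintenance plan reduced to its simplest form: recurring tasks,
each with an interval and the date it was last done. All entries
below are illustrative assumptions."""
from datetime import date, timedelta

today = date.today()

# (task, repeat interval, date last done) -- substitute your own.
PLAN = [
    ("Retest key pages in supported browsers", timedelta(days=30), today - timedelta(days=45)),
    ("Review content for accuracy",            timedelta(days=90), today - timedelta(days=30)),
    ("Check links and images",                 timedelta(days=7),  today - timedelta(days=10)),
    ("Review server logs for errors",          timedelta(days=14), today - timedelta(days=3)),
]

for task, interval, last_done in PLAN:
    due = last_done + interval
    status = "DUE" if due <= today else "ok until %s" % due.isoformat()
    print("%-42s %s" % (task, status))
```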