
In May, births, deaths and marriages content from the Department of Internal Affairs was published on the site. It's a development we're very excited about, and like all new parts of the site we want to be sure we get it right, so in April we undertook some user testing.

The user testing marked a new phase of our testing approach: rather than doing big tests occasionally, we want to move to testing small and often, which we feel is a better fit with our iterative approach. So for this round of testing we did click-tests. Click-tests are short and can be done remotely, so you can move a lot of people through them and get some good numbers on what's working and what's not. We sent word that we were looking for participants through internal networks, Twitter and a UX (user experience) participant recruitment company, and the results were interesting.

What we learned

We learned that a lot of people couldn’t tell the difference between historical and non-historical records. The definitions are finicky and not immediately clear to the average reader. We also learned that our navigation was working well on the whole. For example, we thought people might have trouble starting at a lower-level contacts page and working their way up, but this was easy for most and needs only minor design tweaks.

More interestingly, however, we found evidence of what we've come to call 'the Wellington bias', which we'd long assumed existed. This bias is based on the assumption that people in Wellington know how government works and understand associated jargon because they or someone they know works in government. Wellington has 18,637 public servants and many more providing services from the fringes. Wellington's population is also highly educated relative to other parts of New Zealand, making it less representative of the overall population (as illustrated in this great infographic by colleagues from Statistics NZ).

We were under time pressure for this test, so we relied quite a bit on our own networks for recruitment, which generated a geographical bias in survey results as our networks stem from Wellington. This group's results were substantially different to those we sourced through the UX participant recruitment company, whose panel was geographically representative and had high school as their highest level of education (a really rough proxy for literacy levels).

We know that literacy is an important (and often overlooked) issue: four out of five New Zealanders do not have a highly effective level of literacy. And the Ministry of Education has found that "the majority of Māori, Pacific Islands people and those from other ethnic minority groups are functioning below the level of competence in literacy required to effectively meet the demands of everyday life". (See previous blog posts about low literacy and government, eg Part I: Literacy and government websites – to the data! and Part II: Literacy and government websites – to the data!)

The participants recruited through the UX company took longer to complete the tests and had significantly lower success rates for tasks: they averaged 54%, compared to 66% for those we recruited through Twitter and 73% for those who work in government. The median time taken by those recruited through the UX company was 4.36 minutes, compared to a median of 3.32 minutes for the other participants.

Want to help?

Although we cannot make inferences about user ability or behaviour from these stats, we find them striking nonetheless. And they're a reminder to us of the importance of user testing with a broad range of people. In the coming months, as we move to testing small and often, we'll be looking to build a remote testing community who are happy to do similar short click-tests for us. We'll be putting significant effort into making sure it's representative, especially geographically representative. If you'd like to be involved, please let us know.
