Question: How do you want to find Open Educational Resources?
As mentioned in my last post, our expert workshop included user testing of several OER sites, including JORUM, Merlot, Xpert and Connexions. To this list must be added Google, whose presence hovered constantly over the workshop.
Participants were asked to find teaching resources on qualitative interviewing and to evaluate each site's user interface in general terms. We divided into two groups for the session, and all of the sites reviewed were new to the participants in my group.
Speed of Judgement
This sort of user testing always leaves two impressions: first, how valuable it is to see a familiar website through someone else’s eyes, and second, how difficult web design is given the speed with which users make decisions and judgements online. I had expected to run into time difficulties getting through all the websites. In practice, judgements were made within a few minutes at most. The value of having experts review the sites lay in the quality of the general conversation the tasks engendered. The strongest theme to emerge from the user testing was frustration. The most frequently cited sources were:
- Results judged not to be relevant
- Not enough information provided about resources before selection
- Resources were slow to load (particularly JORUM)
- Navigation and presentation of results caused confusion
Using Google to Search
The session took an unexpected turn towards the end when one of the participants suggested comparing results with a Google search. All agreed the Google search produced the best results. What was interesting about this consensus was the speed and certainty of the judgement, made without moving away from the results page; further questioning showed that within this incredibly short amount of time a number of complex evaluations had been made. The success of the Google search was described in terms of:
- Relevance of results
- Clarity of results – knowing what a resource is from its title and text
- Transparency of authorship (through academic URLs)
- Ability to distinguish file format (such as PDFs, PowerPoint)
Just how relevant the results actually are is a moot point; Google’s ability to present information on the results page with search terms highlighted (rather than the first few sentences or an abstract) may make results seem more relevant than they really are. What matters, however, is that Google provides enough information for people to quickly establish personal relevance, which in turn leads to trusting the search.
The trust invested in Google in our user testing meant that irrelevant results (such as commercial ones) were simply not noticed. People recognise that it is a general-purpose search engine and seem to have adopted strategies for focusing only on relevant results. Without this trust and familiarity, equivalent irrelevant results in OER searches made participants mistrust the site. This mistrust was compounded by the initial expectation that the OER sites, with their academic focus, should be more relevant than Google. It must also be added that the proportion of ‘irrelevant’ results was much higher on the OER sites – participants had difficulty identifying many resources on interviewing.
What the user testing demonstrated was the pervasiveness of Google in determining how people make sense of searching online. Even though not part of the design, both groups automatically compared all the sites to Google; my group simply took it further in doing a comparison search.
Google has invested a huge amount of money in search, and no OER search is ever going to compete. But perhaps understanding how Google generates trust could help in exploring alternative ways of developing trust amongst OER site users. The awareness of how people interact with resources, gained through user testing, is also a good reality check for everyone involved in promoting the use of OERs.