Note: The full version of Felicity’s Master’s Paper that covers this project in more detail will be available in the Carolina Digital Repository in April 2022.
Introduction
One facet of accessibility that is rarely at the forefront of conversations about web accessibility is word choice. Word choice is addressed by Web Content Accessibility Guidelines Success Criterion 3.1.5, which requires that text written above a lower-level reading ability have a supplemental or alternative version available at that level. Because 3.1.5 is a Level AAA criterion, it is not a federal requirement, but readability for all users is something the University Libraries strives for.
Writing in common terms instead of industry-specific language boosts readability by providing a simpler reading experience, regardless of a user’s exposure to industry-specific terminology. This improves the experience for everyone, including and especially people with varying education levels and those with cognitive disabilities. Several previous research projects have identified the crucial role word choice plays in a library website’s architecture, especially for browsing tasks.
Regardless of education level, the University Libraries’ patrons may be native speakers of any of the world’s languages, even though English competency is required for admission or employment at the school. Jargon is expected in academic environments, but its use in prominent library resources can hinder usability for patrons less familiar with library-related terminology. This project aimed to identify which library-specific terms for commonly used research tools are confusing for patrons of UNC-Chapel Hill’s University Libraries.
Methods
This project used first-click testing because it immerses a user in the page’s interface with a specific task: the user’s first click on the page is recorded to measure their ‘success’ in finding an item. The first-click tests examined the clarity of three links in the Research Tools box on the homepage of the University Libraries’ website: “E-Research by Discipline,” “Articles+,” and “E-journals.” Limiting testing to these three excludes terms with very little chance of being changed on the website, like “Catalog” or “Google Scholar,” which are used widely outside of UNC-Chapel Hill.
Table 1 below documents the tasks associated with the control terms and their common-language alternative options.
Table 1: Control and Variable Options with Accompanying Tasks
| Control Option | Variable Option with ‘Plain Language’ Alternative | Task Given |
| --- | --- | --- |
| E-Research by Discipline | Browse Article Collections by Subject | You came to library.unc.edu looking for articles in linguistics. You see this Research Tools box. Which link do you click to proceed? |
| Articles+ | Search UNC’s Online Articles | You came to library.unc.edu looking for articles accessible to UNC affiliates. You see this Research Tools box. Which link do you click to proceed? |
| E-journals | Find Journals by Title | You came to library.unc.edu looking for the academic journal titled Communications in Theoretical Physics. You see this Research Tools box. Which link do you click to proceed? |
Results
All 507 participants were recruited through a pop-up link on the website’s homepage. Most were undergraduate students, graduate/doctoral students, or UNC-Chapel Hill faculty. Participants first answered demographic questions and were then randomly assigned to one of the first-click tests using the prompts listed in Table 1 above.
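The random-assignment step can be sketched in a few lines. The term pairs below come from Table 1; the function name, the decision to randomize wording within each test, and the data structures are illustrative assumptions, not the survey platform’s actual implementation.

```python
import random

# The three first-click tests, each pairing a control term with its
# plain-language alternative (terms from Table 1).
TESTS = [
    ("E-Research by Discipline", "Browse Article Collections by Subject"),
    ("Articles+", "Search UNC's Online Articles"),
    ("E-journals", "Find Journals by Title"),
]

def assign_participant(rng=random):
    """Hypothetical helper: randomly pick one of the three tests,
    then randomly pick the control or plain-language wording."""
    control, variable = rng.choice(TESTS)
    return rng.choice([control, variable])
```

Randomizing both the test and the wording keeps any one participant from seeing both versions of the same term, which would bias their clicks.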
Very few participants reported being non-native speakers of English, and even fewer reported having an English proficiency level below that of a native speaker. Therefore, there was insufficient data to report their performance separately from that of native English speakers.
We measured ‘success’ through two metrics: success rate (the percentage of clicks on the correct menu item) and time taken to respond. The results show that participants preferred the control term “E-Research by Discipline” over “Browse Article Collections by Subject,” which took participants notably longer to answer and drew more incorrect clicks. Conversely, participants preferred “Find Journals by Title” over the control, “E-journals,” in both response times and success rates. Finally, no clear preference emerged between “Articles+” and “Search UNC’s Online Articles”; this pair should be the subject of future testing.
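The two metrics above can be computed directly from click logs. A minimal sketch, assuming each trial records the option clicked and the seconds taken to click (the function name and data shape are assumptions, not the study’s actual format):

```python
from statistics import median

def summarize(trials, correct_option):
    """Return (success_rate, median_seconds) for one first-click test.

    trials: list of (clicked_option, seconds_to_click) tuples.
    """
    if not trials:
        return 0.0, 0.0
    hits = sum(1 for option, _ in trials if option == correct_option)
    times = [seconds for _, seconds in trials]
    return hits / len(trials), median(times)

# Hypothetical log: one correct click, one stray click on 'Catalog'.
rate, t = summarize([("E-journals", 4.2), ("Catalog", 9.1)], "E-journals")
# rate == 0.5, t == 6.65
```

Median response time is used here rather than the mean because a few participants who wander the page before clicking can skew an average badly.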
All three of the research tools tested saw a large proportion of clicks going to the ‘Catalog’ option on the menu. This indicates that there is also some confusion about what is in UNC-Chapel Hill’s Catalog. Therefore, future studies may want to investigate UNC-Chapel Hill affiliates’ perception of the kinds of items listed in the catalog and which research tasks it is helpful for.