
Web Accessibility:
Exploring the Disconnect Between Increased Awareness and Persistent Website Error

by 


Interim Head of Circulation, University of Memphis
 M.S. Student,  Information Science, University of Tennessee, Knoxville


Abstract

Providing universal access to online information is a worthy ideal for academic institutions. This paper surveys the laws and guidelines that have been enacted to ensure web accessibility for users with disabilities. The author then reviews the literature pertaining to web accessibility and conducts an investigation of the 22 schools listed in the Directory of ALA-Accredited Master’s Programs in Library and Information Studies (LIS) as offering a 100% online program of study. Contrary to Schmetzke (2001), this study finds that the websites of LIS programs fare better than those of their campus library counterparts. The overall levels of web accessibility remain very low, however. The paper concludes that although researchers are raising the general level of awareness of issues concerning web accessibility, many website designers are not taking the subsequent steps to remedy existing errors.

Introduction

In the United States, many programs have been enacted over the past fifty years to help ensure equal access to opportunity for individuals with disabilities. New technologies bring about new challenges in this regard. One of those challenges is providing equal access to online information. As purveyors of information, academic libraries are especially affected by such challenges.

Definitions

It is worthwhile to begin by defining a few of the terms that will be used throughout this paper.

Usability

Jakob Nielsen defines usability as “a quality attribute that assesses how easy user interfaces are to use. The word ‘usability’ also refers to methods for improving ease-of-use during the design process” (2012). Further, Nielsen notes the following:

Usability is defined by 5 quality components:

  • Learnability: How easy is it for users to accomplish basic tasks the first time they encounter the design?
  • Efficiency: Once users have learned the design, how quickly can they perform tasks?
  • Memorability: When users return to the design after a period of not using it, how easily can they reestablish proficiency?
  • Errors: How many errors do users make, how severe are these errors, and how easily can they recover from the errors?
  • Satisfaction: How pleasant is it to use the design? (Nielsen, 2012)

Disability

Disability is a term that is difficult to define, since it takes on different meanings in differing contexts. As Matthew Brault points out, the term disability is used in various ways among medical, social, and demographic models (Brault, 2012). The US Census Bureau defines disability in terms of the following communicative, physical, and mental domains:

People who have disability in the communicative domain reported one or more of the following:

1. Was blind or had difficulty seeing.
2. Was deaf or had difficulty hearing.
3. Had difficulty having their speech understood.

People who have disability in the mental domain reported one or more of the following:

1. Had a learning disability, an intellectual disability, developmental disability or Alzheimer’s disease, senility, or dementia.
2. Had some other mental or emotional condition that seriously interfered with everyday activities.

People who have disability in the physical domain reported one or more of the following:

1. Used a wheelchair, cane, crutches, or walker.
2. Had difficulty walking a quarter of a mile, climbing a flight of stairs, lifting something as heavy as a 10-pound bag of groceries, grasping objects, or getting in or out of bed.
3. Listed arthritis or rheumatism, back or spine problem, broken bone or fracture, cancer, cerebral palsy, diabetes, epilepsy, head or spinal cord injury, heart trouble or atherosclerosis, hernia or rupture, high blood pressure, kidney problems, lung or respiratory problem, missing limbs, paralysis, stiffness or deformity of limbs, stomach/digestive problems, stroke, thyroid problem, or tumor/cyst/growth as a condition contributing to a reported activity limitation. (Brault, 2012, p. 2)

Web accessibility

There has been a great deal of legislation and attention directed toward the general notion of leveling the playing field for individuals with disabilities through measures of accessibility. Specifically, this paper discusses web accessibility. According to the Web Accessibility Initiative (WAI), a project of the World Wide Web Consortium (W3C), “Web accessibility means that people with disabilities can use the Web. More specifically, Web accessibility means that people with disabilities can perceive, understand, navigate, and interact with the Web, and that they can contribute to the Web” (WAI, 2005a). Further, the WAI helps create an international set of technical standards known as Web Content Accessibility Guidelines (WCAG) (WAI, 2012).

Statistics

According to the US Census Bureau, in 2010 approximately 18.7 percent of the overall civilian population in the United States (56.7 million people) reported having some kind of disability (Brault, 2012, p. 4). The categories of disability are numerous, but those most relevant to the discussion of academic libraries are listed in the table below: movement, visual, hearing, hand grasping, and those relating to cognitive impairment (Brinck, Gergle, & Wood, 2002, p. 45).

Table 1

U.S. Population 15 Years and Older Affected by Disabilities

Type of Disability   Number of People Affected (Millions)   Percent of Population 15 Years and Older

Any                  51.5                                   21.3
Walking              23.9                                    9.9
Sight                 8.0                                    3.3
Hearing               7.5                                    3.1
Grasping              6.7                                    2.8
Learning              3.9                                    1.6

(US Census Bureau: http://www.census.gov/prod/2012pubs/p70-131.pdf)

These estimates are low, however, when compared to the results reported by the Pew Internet & Family Life Project from the same year. This survey indicates that approximately 27 percent of the US population lives with a disability that inhibits daily functioning (Fox & Boyles, 2012). This same report also indicates that people with disabilities use the internet at far lower rates than those without disabilities (54 percent vs. 81 percent, respectively), suggesting that persons with disabilities encounter barriers to using the internet.

Jacobs (2008) points out that we are now in an age when it is especially important to consider individuals with disabilities while providing access to resources over the internet (pp. 84-85). This stems from the fact that public transportation is now more accessible to those with disabilities and that an abundance of a library’s resources are available online. Therefore, more and more disabled people will have the opportunity to frequent libraries and the online information they contain.

Institutional Response

Lawmakers have already taken many steps to ensure that persons with disabilities are not discriminated against under the law. The Architectural Barriers Act (1968) requires that buildings built or remodeled with federal funds have to be accessible to those with physical disabilities. The Rehabilitation Act (1973) protects people with disabilities from being discriminated against in the workplace if the institution receives federal funding. A 1986 amendment to the Rehabilitation Act of 1973, otherwise known as Section 508, expanded the breadth of the act to incorporate information technologies. A newer version of Section 508 (1998) makes this incorporation of information technologies enforceable and “establishes a complaint procedure and reporting requirements, which further strengthen the law” (US Department of Education, n.d.).

Brinck, Gergle, and Wood (2002) provided the following guidelines to offer disabled users maximum access to online materials:

  • Avoid using color to make meaningful distinctions between items because of the prevalence of color blindness, especially red-green color blindness (8% of men and .5% of women in Europe and North America are color-blind).
  • Use high contrast and highly legible fonts to help those with even minor visual impairments. Allow the user to control fonts and font sizes for optimal reading.
  • Make sure all graphics and other multimedia elements have text equivalents so that people who are blind can hear descriptions of them with a screen reader.
  • Don’t rely on spatial relationships to make the text sensible. For instance, don’t refer to ‘the column to the right’ or ‘the button below.’
  • Avoid uses of DHTML or Java, such as rollovers and nonstandard pop-up menus, that make it difficult for screen readers to interpret where the buttons are or what text is displayed.
  • Avoid using small graphics as buttons and make sure small buttons are spaced well. Young children and people with arthritis often have difficulty targeting small regions and may hit the wrong button if buttons are too close together.
  • Avoid requiring typing when selecting a button or link will do. Avoid requiring the user to switch frequently between clicking and typing.
  • When you are using audio or video, provide closed captions or other text equivalents of the audio for the hearing impaired.
  • For the cognitively impaired, minimize the need to remember items between screens. Use simple, direct, concrete language. Expose the document structure as much as possible. (p. 47)
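Several of these guidelines, text equivalents for graphics in particular, lend themselves to mechanical checking. As a minimal sketch (a hypothetical illustration, not a tool used in any of the studies cited here), Python's standard-library HTML parser can flag images that lack a text equivalent:

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Flag <img> tags that carry no alt attribute at all.

    An empty alt="" is deliberately left alone: it is the standard way
    to mark a purely decorative image for screen readers.
    """
    def __init__(self):
        super().__init__()
        self.missing_alt = []  # src values of images with no alt attribute

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            if "alt" not in attr_map:
                self.missing_alt.append(attr_map.get("src", "(no src)"))

checker = AltTextChecker()
checker.feed('<img src="logo.png" alt="University logo">'
             '<img src="campus.jpg">')
print(checker.missing_alt)  # ['campus.jpg']
```

A check this simple cannot judge whether the alt text is meaningful, which is precisely the gap between accessibility and usability that the literature reviewed below highlights.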

Four years later, the US Department of Health and Human Services (2006) released Research-Based Web Design & Usability Guidelines to help designers create websites that maximize use for all parties. Chapter three is dedicated to Web accessibility and provides a list of guidelines which are ranked in order of relative importance.

The Web Accessibility Initiative (WAI), a project of the World Wide Web Consortium (W3C), was established in 1997 and remains one of the most influential organizations in establishing standards for web access. In their own words, “The Web Accessibility Initiative (WAI) develops strategies, guidelines, and resources to help make the Web accessible to people with disabilities” (WAI, 2011).

The American Library Association (ALA) has also played an important role in establishing equity toward people with disabilities. On January 16, 2001, the ALA released the following policy statement:

The American Library Association recognizes that people with disabilities are a large and neglected minority in the community and are severely underrepresented in the library profession…. ALA, through its divisions, offices and units and through collaborations with outside associations and agencies is dedicated to eradicating inequities and improving attitudes toward and services and opportunities for people with disabilities. (ALA, 2001)

Although this policy statement was supposed to have ensured that information professionals are doing their best to maximize web accessibility through universal design, the literature review that follows suggests otherwise.

Review of Literature

Many researchers have already conducted web usability tests in academic libraries. For example, Prasse and Connaway devote a portion of their chapter on usability testing to examples of usability testing in academic libraries (2008, pp. 233-244). There the authors outline the entire process of creating and conducting a usability test and present a number of case studies. However, throughout the chapter there is never a discussion of conducting usability testing to gauge web accessibility for disabled patrons. This type of omission occurs frequently in the literature (see, for example, Dickstein & Mills, 2000; Marill, 2001; Chisman, Diller, & Walbridge, 1999; Hammill, 2005). Where measures of accessibility and compliance with ADA guidelines are concerned, many studies have focused on libraries providing physical access (Scheimann, 1994; Schmetzke, 2001; Carpenter, 1996; Lomax & Wiler, 2005).

Research that places a more significant focus on web accessibility does exist, however. Over the past decade, scholars have been compiling a modest amount of research on incorporating web accessibility into usability testing at libraries. In the fall of 2000, the University of Colorado at Colorado Springs (UCCS) conducted the first reported web usability study that incorporated disabled users into its testing (Byerley, 2001; Byerley & Chambers, 2001). The researchers were able to solicit information from three blind students in a pre-test interview. The disabled students revealed the following problems associated with website design:

  • The graphical interface of the Web is the greatest obstacle.
  • Web sites use too many images without descriptive text.
  • Screen readers often jumble the information presented in tables, because they tend to read across the columns in a table.
  • Irrelevant links at the tops of Web pages force the visually impaired to sort through unnecessary material (Byerley & Chambers, 2001).

This last observation lends itself well to what Preece dubs the golden rule: “minimize the total amount of information by presenting only what is necessary to the user” (1993, p. 70).

The blind students also identified certain aspects of Web design that they felt were advantageous:

  • Image links are easier to manage if they are arranged in a list.
  • Drop down boxes are preferred to long lists of links.
  • Web sites that offer alternative text only versions are useful as long as the information presented is equivalent to the graphical versions (Byerley & Chambers, 2001, p. 309).

In the end, it was determined that the table layout that the UCCS web designers were using made navigation difficult for the visually impaired. In response, the designers vowed to move away from tables whenever they could or to make the information presented in the tables read logically via a screen reader (Byerley & Chambers, 2001, p. 310).

Schmetzke (2001) evaluated the top 24 library schools of 1999, as ranked by US News & World Report, to determine rates of web access success and failure. In this paper, the author operates under the assumption that “a library school's Web design reflects awareness about accessibility issues among its faculty” and that “one can gauge what prospective librarians are likely to learn about accessible design, and to which extent they are prone to implement accessible design principles once they have entered the profession” (p. 41). The study used Bobby 3.1.1, a web-based tool created by the Center for Applied Special Technology to evaluate web pages for their accessibility to people with disabilities (CAST, 2012). Bobby 3.1.1 analyzed the web accessibility of the campus library website and the library school website of each of the 24 library schools. The two major findings from this study are as follows:

  • The average Web site accessibility of the campus library sites was relatively high: 59 percent.
  • The average Web site accessibility of the library school sites was relatively low: 23 percent.

In response to the second point, Schmetzke writes “One can thus further assume that schools of library and information science are unlikely to teach principles of accessible Web design” (p. 44).

Byerley and Chambers continued to focus their research on providing access to individuals with visual impairments in their article “Accessibility and Usability of Web-based Library Databases for Non-visual Users” (2002). More specifically, the article focused on providing access to two web-based abstracting and indexing services at the University of Colorado at Colorado Springs (UCCS): OCLC FirstSearch and Gale Group InfoTrac. The results of this study were promising: there was a high degree of accessibility found for OCLC FirstSearch and Gale Group InfoTrac and, according to questionnaires, both companies were actively pursuing ways to make their products more user-friendly for disabled persons. However, despite the high levels of accessibility, instances of usability problems were still found by incorporating human participants in the study. For example, the alternate tags for the Combining Search Terms screen in OCLC FirstSearch simply read “AND,” “OR,” and “NOT.” Therefore, while alternate text was provided, it failed to convey that the patron was being presented with the Boolean operators for the page. In a similar vein, a link on a sidebar in Gale InfoTrac read “Email or Retrieval,” while the alternate tag read “Local jump to print, email and retrieval.” In other words, the blind user was receiving misleading and non-equivalent information. This study carefully underscores the fact that “accessibility does not necessarily equate with usability” (p. 177).

In 2005, Stewart, Narenda, and Schmetzke embarked on a large-scale evaluation of one library database from each of the 37 vendor interfaces to which the authors had access. The focus of the analysis was three-fold: compliance with accessibility standards, functional usability (a pass/fail evaluation of whether the participant completed his or her task) for those using assistive technologies, and the format of the documents from the full-text databases. The major findings were as follows:

  • Approximately one third of the interfaces could not be navigated without instruction.
  • Eleven percent of the interfaces allowed the user to skip repetitive navigation components.
  • Forty percent of the interfaces did not use ALT-tags meaningfully or consistently.
  • Twenty-nine percent of the databases were unusable when scripts were disabled (Stewart et al., 2005).

While libraries do not play a direct role in creating vendor interfaces, the authors argue that the information in their article puts libraries in a better position to challenge vendors with questions regarding a product’s functionality.

In their article, "Online Databases and the Research Experience for University Students with Print Disabilities," Dermody and Majekodunmi (2011) also considered the effect that a document’s format can have. In one of their surveys, a participant responded, "'my biggest fear is finding the perfect PDF document for a research project but not having it be in an OCR format, meaning that I cannot use a screen reader to read it'" (Dermody & Majekodunmi, 2011, pp. 154-155).

Brophy and Craven (2007) present an overview of the usability and accessibility research that has been conducted in the United Kingdom near the beginning of the new millennium. They find a general failure of websites in the UK to comply with usability and accessibility standards. For example, citing City University (2004), they relate that only 42 percent of the English museum, library, and archives websites in the study met basic levels of accessibility. Further, blind persons found it impossible to complete 33 percent of their tasks and “22 percent of the problems experienced by the user panel were not identified by the automated testing of WCAG 1.0 checkpoints” (Brophy & Craven, 2007, p. 965). The study concluded by noting that, although the accessibility failure rates may be high, there is some solace in the fact that there is a growing awareness of web accessibility issues.

Finally, Youngblood and Mackiewicz (2012) investigate the usability and accessibility of municipal websites in Alabama and find substantial problems in both areas. The authors remark that the high number of errors underscores the need for basic usability and accessibility testing in the development stages of the websites (p. 587). While Youngblood and Mackiewicz do not specifically touch upon academic libraries, their findings continue to highlight the disconnect between website design, usability, and accessibility.

Methods

Over a decade has elapsed since Schmetzke (2001) revealed the low rates of accessibility of library and library school websites. In the intervening years, libraries and library schools have placed an ever-increasing amount of information online for students. Course listings, descriptions, and syllabi are all online. Online catalogs are often the sole means of searching. Librarians energetically post events to a website’s newsfeed. Students are often required to access ebooks, ejournals, and directories online. In short, everything either already is or soon will be available online. It is therefore necessary to revisit these websites to ascertain how accessible they currently are to patrons with disabilities. To this end, I conducted a study modeled after Schmetzke's (2001) that investigated the number of accessibility errors present on the websites of the 22 schools listed in the Directory of ALA-Accredited Master’s Programs in Library and Information Studies (ALA, 2013) as offering a 100% online program of study. In theory, online programs should increase a disabled person’s ability to attend, provided that the websites conform to web accessibility guidelines.

The evaluation tool Bobby is no longer readily available to the public, but the Web Accessibility Initiative (WAI) provides an extensive list of tools that can be used to evaluate the accessibility of a website (see http://www.w3.org/WAI/RC/tools/complete for a complete list). I selected the automated web accessibility evaluation tool WAVE 4.0 for my analysis. WAVE is a product created by WebAIM that identifies errors and possible errors with easy-to-see, simply labeled icons. WebAIM provides a complete table of the icons used in WAVE, a description of the error that each icon represents, and the recommended action to reconcile each error at http://wave.webaim.org/icons. In the present study I focus solely on the icons labeled “Errors,” the presence of which will “almost certainly cause accessibility issues” (WebAIM, 2013). A single error deems a site inaccessible. Table 2 presents WAVE’s breakdown of the 22 aforementioned schools: for each university, the first column of figures pertains to the campus library’s website and the second to the Library and Information Studies (LIS) program’s website. As the table shows, only a few select universities are completely free from web accessibility errors.

Table 2

Web Accessibility Errors at Universities with ALA-Accredited, 100% Online Programs

University                                                     Library Site Errors   LIS Site Errors

Clarion University of Pennsylvania                                      3                   0
Drexel University                                                      12                   7
Florida State University                                               11                  12
Kent State University                                                  13                   4
Louisiana State University                                             30                   0
North Carolina Central University                                       5                   2
Rutgers, The State University of New Jersey                             4                  18
San Jose State University                                               1                   5
Southern Connecticut State University (Conditional)                    11                   0
Texas Woman's University                                               10                   4
The University of Southern Mississippi                                  7                   2
University at Buffalo, State Univ. of New York (Conditional)           24                   6
University of Alabama                                                  23                   0
University of Kentucky                                                 36                  12
University of Maryland                                                  2                   5
University of Puerto Rico                                              27                  20
University of South Carolina                                            5                   8
University of Tennessee                                                 7                   5
University of Washington                                                0                   6
University of Wisconsin – Milwaukee                                     3                   1
Valdosta State University (Conditional)                                 8                   1
Wayne State University                                                 13                   3

Average Number of Errors                                             11.6                 5.5

Note: The WAVE test was last conducted on these sites on February 21, 2013.

Results

Table 2 presents a preliminary overview of the accessibility errors currently present on the websites of the 22 schools listed in the Directory of ALA-Accredited Master’s Programs in Library and Information Studies (LIS) as offering a 100% online program of study. It is difficult to make direct comparisons between the figures in this study and Schmetzke's study (2001), since the latter used the evaluation tool Bobby, which measures the extent to which a website’s pages are accessible. WAVE, on the other hand, indicates whether a website is accessible or not. According to WAVE, 73 percent (16 of 22) of the LIS websites report fewer errors than their campus library counterparts. The average number of errors also serves to highlight the comparative advantage that the LIS group holds over the campus library group: LIS websites incur an average of 5.5 errors, while the campus library websites incur an average of 11.6 errors. However, the overall number of websites with zero accessibility errors is very low. WAVE considers only one campus library website free from accessibility error (the University of Washington), while four of the LIS websites are error free (Clarion University, Louisiana State University, Southern Connecticut State University, and the University of Alabama). In other words, only 4.5 percent of campus library websites and 18.2 percent of LIS websites are fully accessible to people with disabilities.
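The summary figures in this paragraph follow directly from the Table 2 counts. As a quick arithmetic check (the lists below simply transcribe Table 2 in order):

```python
# WAVE error counts transcribed from Table 2, in table order:
# campus library site vs. LIS program site for each of the 22 schools.
library_errors = [3, 12, 11, 13, 30, 5, 4, 1, 11, 10, 7,
                  24, 23, 36, 2, 27, 5, 7, 0, 3, 8, 13]
lis_errors = [0, 7, 12, 4, 0, 2, 18, 5, 0, 4, 2,
              6, 0, 12, 5, 20, 8, 5, 6, 1, 1, 3]

# Average errors per site in each group.
avg_library = sum(library_errors) / len(library_errors)
avg_lis = sum(lis_errors) / len(lis_errors)
print(round(avg_library, 1), round(avg_lis, 1))  # 11.6 5.5

# A site counts as "fully accessible" here only if WAVE reports zero errors.
pct_library_clean = 100 * library_errors.count(0) / 22
pct_lis_clean = 100 * lis_errors.count(0) / 22
print(round(pct_library_clean, 1), round(pct_lis_clean, 1))  # 4.5 18.2
```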

Discussion

While the number of websites that are fully accessible to people with disabilities is very low, it appears that the tide might be turning for LIS websites. The success rate of the LIS websites is four times higher than the campus library websites. The average number of errors per LIS website is half that of the campus library websites.

The success rate could be higher still, since many of the errors are simple fixes; other studies show that this is frequently the case (Flowers, Bray, & Algozzine, 1999; Youngblood & Mackiewicz, 2012). For example, the North Carolina Central University School of LIS website designer needs only to add two form labels to reduce the number of accessibility errors to zero. The same type of error holds for the Valdosta State University LIS website. The University of Southern Mississippi LIS webmaster needs only to add text within two links describing the function of the intended target in order to have a completely accessible website.
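Both error types named here, form controls without labels and links without descriptive text, can be detected with a straightforward parse. The sketch below is a hypothetical illustration using Python's standard-library parser, not WAVE itself; a real checker would, for instance, also accept a link whose only content is an image with alt text:

```python
from html.parser import HTMLParser

class SimpleErrorChecker(HTMLParser):
    """Count two easily fixed WAVE-style errors: unlabeled text inputs
    and links that contain no text for a screen reader to announce."""
    def __init__(self):
        super().__init__()
        self.labeled_ids = set()   # ids referenced by <label for="...">
        self.input_ids = []        # ids of text inputs encountered
        self.in_link = False
        self.link_has_text = False
        self.empty_links = 0

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "label" and "for" in a:
            self.labeled_ids.add(a["for"])
        elif tag == "input" and a.get("type", "text") == "text":
            self.input_ids.append(a.get("id"))
        elif tag == "a":
            self.in_link = True
            self.link_has_text = False

    def handle_data(self, data):
        if self.in_link and data.strip():
            self.link_has_text = True

    def handle_endtag(self, tag):
        if tag == "a":
            if not self.link_has_text:
                self.empty_links += 1
            self.in_link = False

    def error_count(self):
        unlabeled = sum(1 for i in self.input_ids
                        if i not in self.labeled_ids)
        return unlabeled + self.empty_links

checker = SimpleErrorChecker()
checker.feed('<label for="q">Search</label><input type="text" id="q">'
             '<input type="text" id="email">'
             '<a href="/help"><img src="help.png"></a>')
print(checker.error_count())  # 2: one unlabeled input, one empty link
```

The remedies are as small as the study suggests: a `<label for="email">` element and a few words of link text would bring this page's error count to zero.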

As the literature review demonstrates, the use of human participants in an accessibility study is extremely important. Human involvement can provide the final say in any situation of doubt about or disagreement with the automated tool. Therefore, human participants with disabilities will hopefully be incorporated into any subsequent research based on this study.

Conclusion

The literature and the current study reveal a persistent pattern of failure by information professionals to provide equitable access to online information. There has been a general increase in awareness of web accessibility issues over the past thirty years, as evidenced by the creation of the Web Accessibility Initiative in 1997, two amendments to the Rehabilitation Act of 1973, and the guidelines released by the US Department of Health and Human Services in 2006. This increase in awareness comes at an important juncture: public transportation is now more accessible to those with disabilities, and online resources are more pervasive than ever (Jacobs, 2008). More and more, people with disabilities will have the opportunity to frequent libraries and the online information they contain. While the percentage of the population affected by disability may not be increasing, the amount of information its members can access could be diminishing if web accessibility is not adequately taken into consideration. Information professionals have attempted to pursue equity by releasing statements and conducting research. The research conducted thus far, however, suggests that this increase in awareness has not necessarily translated into better web accessibility for individuals with disabilities.

I conclude with a word on further research. Some of the research that has already been conducted notes that many web accessibility errors are relatively simple to reconcile: adding labels and alternate-text descriptions, for example (Youngblood & Mackiewicz, 2012). One study claims that 83 percent of the accessibility errors found are easy to correct (Flowers, Bray, & Algozzine, 1999). Yet the errors persist. It is therefore necessary to delve deeper into the reasons these errors go uncorrected. It might be important to ascertain how often campuses redesign their websites and how often they upload new content. It is also important to find out who the website designers are and whether they are tied to the curricula at Library and Information Studies programs in any way. It would be equally important to inquire whether recommendations are ever solicited from persons with disabilities or from those who represent them.

References

American Library Association. (2001, January). Library services for people with disabilities policy. Retrieved from http://www.ala.org/ascla/asclaissues/libraryservices.

American Library Association. (2013). Directory of ALA-accredited master’s programs in library and information studies. Retrieved from http://www.ala.org/accreditedprograms/directory

Architectural Barriers Act of 1968, as amended, 42 U.S.C. §§ 4151 et seq.

Bell, L., & Peters, T. (2005). Digital library services for all. American Libraries, 36(8), 46-49.

Brault, M. W. (2012, July). Americans with disabilities: 2010. Retrieved from http://www.census.gov/prod/2012pubs/p70-131.pdf

Brinck, T., Gergle, D., & Wood, S. D. (2002). Designing web sites that work: Usability for the web. San Francisco: Morgan Kaufmann Publishers.

Brophy, P., & Craven, J. (2007). Web accessibility. Library Trends, 55(4), 950-972.

Byerley, S. L. (2001). Usability testing and students with visual disabilities: Building electronic curb cuts into a library Web site. Colorado Libraries, 27(3), 22-24.

Byerley, S. L., & Chambers, M. (2001). Usability testing and students with disabilities: Achieving universal access on a library web site. In Association of College and Research Libraries & H. A. Thompson, (Eds.), Crossing the divide: Proceedings of the Tenth National Conference of the Association of College and Research Libraries, March 15-18, 2001, Denver, Colo. Chicago: Association of College and Research Libraries.

Byerley, S. L., & Chambers, M. (2002). Accessibility and usability of Web-based library databases for non-visual users. Library Hi Tech, 20(2), 169-178.

Carpenter, S. A. (1996). The Americans with disabilities act: Accommodation in Ohio. College & Research Libraries, 57, 555-66.

CAST. (2012). Bobby. Retrieved from http://www.cast.org/learningtools/Bobby/index.html

Chisman, J., Diller, K., & Walbridge, S. (1999). Usability testing: A case study. College and Research Libraries, 60(6), 552-569.

Craven, J. (2000). Electronic access for all: Awareness in creating accessible Web sites for the university library. Disability and Information Systems in Higher Education (DISinHE).

Dermody, K., & Majekodunmi, N. (2011). Online databases and the research experience for university students with print disabilities. Library Hi Tech, 29(1), 149-160.

Dickstein, R., & Mills, V. (2000). Usability testing at the University of Arizona Library: How to let the users in on the design. Information Technology and Libraries, 19(3), 144–150.

Flowers, C. P., Bray, M., & Algozzine, R. (1999). Accessibility of special education program home pages. Journal of Special Education Technology, 14(2), 21-26.

Fox, S., & Boyles, J. (2012, August). Disability in the digital age. Retrieved from http://www.pewinternet.org/Presentations/2012/Aug/Disability-in-the-Digital-Age.aspx

Hammill, S. (2005). Usability testing at Florida International University libraries: What we learned. In J. Caswell, P. G. Haschak, & D. Sherman (Eds.), New challenges facing academic librarians today: Electronic journals, archival digitization, document delivery, etc. (pp. 239-250). Lewiston, N.Y: Edwin Mellen Press.

Jacobs, M. (2008). Electronic resources librarianship and management of digital information: Emerging professional roles. Binghamton, NY: Haworth Information Press.

Kirkpatrick, C. H. (2003). Getting two for the price of one: Accessibility and usability. Computers in Libraries, 23(1), 26-29.

Lilly, E. B., & Van Fleet, C. (1999). Wired but not connected: Accessibility of academic library home pages. The Reference Librarian, 67/68, 5-28.

Lomax, E., & Wiler, L. L. (2005). The Americans with disabilities act: Compliance and academic libraries in the southeastern United States. In J. Caswell, P. G. Haschak, & D. Sherman (Eds.), New challenges facing academic librarians today: Electronic journals, archival digitization, document delivery, etc. (pp. 169-188). Lewiston, N.Y: Edwin Mellen Press.

Marill, J. L. (2001). Designing a usable health information portal: The MedlinePlus experience from the National Library of Medicine. In N. Campbell (Ed.), Usability assessment of library-related Websites: Methods and case studies (Library and Information Technology Association Guide, no. 7) (pp. 100-108). Chicago: American Library Association.

Nielsen, J. (1994). Usability engineering. San Francisco: Morgan Kaufmann Publishers.

Nielsen, J. (2012, January 4). Usability 101: Introduction to usability. Retrieved from http://www.useit.com/alertbox/20030825.html

Prasse, M. J., & Connaway, L. S. (2008). Usability testing: Method and research. In M. L. Radford & P. Snelson (Eds.), Academic library research: Perspectives and current trends (pp. 214-252). Chicago: Association of College and Research Libraries.

Preece, J., et al. (1993). A guide to usability: Human factors in computing. Wokingham, England: Addison-Wesley.

Rehabilitation Act of 1973, Pub. L. No. 93-112, 87 Stat. 355 (codified as amended in scattered sections of 15 U.S.C., 20 U.S.C., 29 U.S.C., 36 U.S.C., 41 U.S.C., and 42 U.S.C.).

Scheimann, A. (1994). ADA compliance: What are we doing? Master’s research paper, Kent State University. ERIC, ED 376855.

Schmetzke, A. (2001). Web accessibility at university libraries and library schools. Library Hi Tech, 19(1), 35-49.

Section 508 of the Rehabilitation Act (29 U.S.C. 794d), as amended by the Workforce Investment Act of 1998 (P.L. 105-220), August 7, 1998.

Stewart, R., Narenda, V., & Schmetzke, A. (2005). Accessibility and usability of online library databases. Library Hi Tech, 23(2), 265-286.

United States Department of Education. (n.d.). Q & A: Title IV – Rehabilitation Act Amendments of 1998. Retrieved from http://www.justice.gov/crt/508/archive/deptofed.html

United States Department of Health and Human Services. (2006). Research based web design and usability guidelines. Washington, D.C: Health and Human Services Dept.

WebAIM. (2013). Index of WAVE icons. Retrieved from http://wave.webaim.org/icons

Web Accessibility Initiative. (2005a). Introduction to web accessibility. Retrieved from http://www.w3.org/WAI/intro/accessibility.php

Web Accessibility Initiative. (2005b). Selecting web accessibility evaluation tools. Retrieved from http://www.w3.org/WAI/eval/selectingtools.html

Web Accessibility Initiative. (2011, March 11). Getting started with web accessibility. Retrieved from http://www.w3.org/WAI/gettingstarted/Overview.html

Web Accessibility Initiative. (2012, October). Web content accessibility guidelines (WCAG) overview. Retrieved from http://www.w3.org/WAI/intro/wcag.php

Youngblood, N., & Mackiewicz, J. (2012). A usability analysis of municipal government website home pages in Alabama. Government Information Quarterly, 29(4), 582-588.

This work is licensed under a Creative Commons Attribution-NonCommercial license.

Copyright © 2011 Tennessee Library Association. All Rights Reserved.