Assessing Information Literacy among Undergraduates: A Discussion of the Literature and the University of California-Berkeley Assessment Experience

College & Research Libraries, Jan 2001

Although national standards for information literacy have been developed and approved by the Association of College and Research Libraries, little is known about the extent to which undergraduates meet these or earlier sets of standards. Since 1994, the Teaching Library at the University of California-Berkeley has conducted an ongoing Survey of Information Literacy Competencies in selected academic departments to measure the “lower-order” information literacy skills of graduating seniors. The most fundamental conclusion that can be drawn from this survey is that students think they know more about accessing information and conducting library research than they are able to demonstrate when put to the test. The University of California-Berkeley library experience is consistent with earlier study findings that students continue to be confused by the elementary conventions for organizing and accessing information.


Assessing Information Literacy among Undergraduates: A Discussion of the Literature and the University of California-Berkeley Assessment Experience

Patricia Davitt Maughan is the User Research Coordinator in the Teaching Library at the University of California-Berkeley.

Having crossed the threshold into the twenty-first century, where do academic libraries stand with respect to assessing information literacy among undergraduates? Much has been written on the concept of information literacy during the past twenty-five years. Recommendations and standards for information literacy have been developed and updated nationally by a variety of professional organizations. Yet, little is known about the extent to which undergraduates meet these standards. Moreover, few librarian authors have written on the topic.

Interestingly, the term information literacy arose from neither the sphere of librarianship nor the halls of higher education but, instead, was first used by Paul Zurkowski, president of the Information Industry Association (IIA). In a 1974 proposal to the National Commission on Libraries and Information Science, Zurkowski suggested a national goal of achieving information literacy within the following decade. He described information-literate individuals as people trained in the application of information resources to their work, who have learned techniques and skills for utilizing the wide range of information tools as well as primary sources in molding information-solutions to their problems.1 In a subsequent iteration of the concept, the IIA dropped its reference to information resources being used exclusively in the workplace and broadened its definition to encompass the use of information tools in fashioning solutions to all sorts of problems.

In an article written in 1994, Shirley J. Behrens described additional changes and refinements to the definition of information literacy occurring in the library community throughout the 1970s and 1980s.2 She perceived a paradigm shift among academic librarians by the second half of the 1980s, from one of library literacy to one of information literacy. Previously, Lawrence J. McCrank had observed that sometimes the term is used as a replacement term for older terms that have simply become passé, like library skills, library use, or bibliographic instruction; at other times usage implies that this new concept embraces all others as an expansion; the idea is becoming all-encompassing as it matures.3 By the end of the decade, Behrens had found that many user education programs were being replaced by those aiming to achieve information literacy.4

In 1987, ALA President Margaret Chisholm appointed the ALA Presidential Committee on Information Literacy and charged it with defining information literacy and determining its importance to student performance, lifelong learning, and active citizenship.5 The committee's final report, issued in January 1989, characterized the information-literate person as someone who is able to recognize when information is needed, knows what information is needed to address a given problem or issue, and, beyond that, has the ability to locate, evaluate, and use effectively the needed information. Information-literate people were said to be those who have learned how to learn because they know how knowledge is organized, how to find information, and how to use information. In other words, they are people prepared for lifelong learning.6
The committee noted that to produce an information-literate citizenry, schools and colleges would need to integrate the concept of information literacy into their learning programs.7 This underscored the necessity of a new model of learning, one based on information resources and one that is active and integrated, not passive and fragmented. The committee suggested that this new model for learning would further develop critical thinking skills on the part of students and prepare them for a lifetime of learning.

Earlier American education reform reports, including one in 1983 by the U.S. National Commission on Excellence in Education entitled A Nation at Risk: The Imperative for Educational Reform, largely ignored the role of libraries in the educational process.8 However, the 1987 report, College: The Undergraduate Experience in America, popularly known as the Boyer Report, gave considerable thought to the role of libraries in addressing the challenges faced by institutions of higher learning.9 In its initial release, the Boyer Report noted: "The quality of a college is measured by the resources for learning on the campus and the extent to which students become independent, self-directed learners. And yet we found that today, about one out of every four undergraduates spends no time in the library during a normal week, and sixty-five percent use the library four hours or less each week. The gap between the classroom and the library, reported on almost half a century ago, still exists today."10

In their 1989 book, Information Literacy: Revolution in the Library, Patricia Senn Breivik and E. Gordon Gee emphasized the critical importance of partnerships within colleges and universities in graduating students qualified to be called information literate. The authors mentioned, in particular, the importance of partnerships between the library and classroom instructors, the library and university administrators, and the library and the business community.11 With these cornerstones of the information literacy movement in mind, in 1993-1994, the library at the University of California-Berkeley began to reorganize the structure and services of the Moffitt Undergraduate Library with the goal of graduating seniors from the university who met the criteria for information literacy.

Teaching Library at the University of California-Berkeley

The Teaching Library at the University of California-Berkeley was created in 1993 to bridge the gap between the classroom and the library's information resources. Its mission is to ensure that all graduates of the university are thoroughly familiar with the information resources and tools in their respective fields of study, trained in their effective use, and, beyond that, prepared to conduct a search for information resources in any field of inquiry. Placing special emphasis on the humanities and social sciences, library staff design and offer customized in-class presentations aimed at teaching resources and strategies appropriate to particular courses, drop-in workshops on the library's online catalogs and Internet-based resources, and faculty seminars emphasizing electronic tools and resources. Staff at the library have identified and targeted a number of lower-division courses having a library research component that serve as feeder courses through which many students pass to fulfill their general requirements.
Moreover, the staff encourage the instructors of these courses to use the course-integrated instructional services offered through the library. In addition, the staff have identified and contacted faculty, graduate student instructors, and other instructors for a number of key upper-division courses that require library research as part of the course work.

At its inception, the staff, in addition to the position of head of the library, comprised three full-time-equivalent program coordinators, a half-time coordinator of user research, and the staff of the Media Resources Center and the Library Graphics Office. The position descriptions for program coordinators and the user research coordinator were new. Candidates to fill the program coordinator positions emerged from existing library staff; they were charged with teaching faculty and students how to use a wide range of information resources. Representing the main instructional arm of the library, program coordinators are expected to collaborate with one another, with selectors throughout the library, with faculty, and with campus student support units to integrate library and information literacy instruction into appropriate points in the university's undergraduate curriculum.

The terms information skills and library instruction involve a range of lower-order competencies, including skills such as using a variety of search systems to retrieve information in various formats, locating information within the library, and differentiating between primary and secondary sources. In contrast, the term information literacy instruction, in addition to the lower-order skills, includes higher-order abilities such as assessing search results for quality and relevance; evaluating the reliability, validity, authority, and timeliness of retrieved information; and applying new information to the planning and creation of scholarly and professional projects and products. Although the library recognizes the importance of both lower- and higher-order skills and abilities, the staff initially chose to focus on those areas most readily taught and measured within the library world: the lower-order library instruction skills.

Program coordinators provide course-integrated instruction primarily in the humanities and social sciences. They publicize the library's instructional program, which, in addition to course-integrated instruction, includes catalog instruction and research workshops, faculty seminars, and, in 1993-1994, the Term Paper Advisory Service. All of the program coordinators also serve on reference and information service desks but are not responsible for collection development.

The position of half-time user research coordinator also was filled from existing library staff. The user research coordinator designs programs to help the library better understand its users and assists in focusing library public services directly on known user needs. This was particularly important at the time the Teaching Library was created, when the overall library's budget and staffing were shrinking as the range of information resources was expanding. The user research coordinator's role includes identifying user needs and describing, in depth, the levels of information literacy and computer competency possessed by library users.
Measuring Information Competencies as a Means of Marketing the Teaching Library Program

Why measure student information literacy competencies in the first place? There are many important reasons for doing so, including: to establish a baseline of student skills around which an information literacy program might be built; to assess the effectiveness of particular library instruction sessions or approaches to instruction; to determine the impact of library instruction programs on student information literacy skills and academic success; and to generate data with which to communicate with faculty.

In the spring of 1994, the library administration asked the user research coordinator to undertake an empirical study of the information literacy skills of graduating UC-Berkeley seniors in the departments of political science and sociology. This was part of a larger plan to market the instructional services of the newly formed Teaching Library to campus academic departments. The library hoped to determine the extent to which current graduates possessed a core set of competencies defined by staff as being needed by capable library researchers and information-literate individuals. If the library determined that graduating seniors did not possess this core set of competencies, it felt this could serve as a compelling argument for promoting a program of ongoing, systematic undergraduate instruction in the identification and use of library and information resources. Moreover, it might serve as an argument in persuading faculty to rely on the services of the library when designing their courses. The political science and sociology departments were selected by library administration as starting points for this marketing effort.

In User Education in Libraries, Nancy J. Fjallbrant and Ian Malley described evaluation as being concerned with the collection and analysis of information about "the input [and] the variables affecting the educational process, and the end product or output."12 Because no systematic educational process was in place at the Teaching Library in the 1993-1994 academic year (the year of its inception), the user research coordinator focused, instead, on assessment. Fjallbrant and Malley wrote: "Assessment is concerned with the specific achievement of the individual student with regard to pre-specified goals."13

In a 1980 article in Library Trends, Carla J. Stoffle and Judith M. Pryor wrote: "A competency-based program is conceived and planned based on the skills the exit-level student should possess. Competencies are identified with reference to specific roles stated in terms of what the student should know and be able to do."14 They continued: "A competency-based program has three major components: competency identification, criteria level, and assessment. Instruction is also a significant component, but is normally implemented after the three major components. Instruction evolves readily from them and is designed to facilitate the development of the required skills."15 Such skills or competencies relate to the accomplishment of specific tasks. In contrast, information literacy is a far more comprehensive concept, encompassing abilities such as critical thinking, synthesis, communication, and research methodologies. Although information competencies are easier to assess, the assessment of information literacy outcomes, by contrast, must be a shared responsibility between librarians and faculty.
In 1989, Mary M. Nofsinger wrote: "Articulation of library use/research skills for students is a relatively obscure topic in the professional literature; most instructors simply assume that college students know how to use a library." However, she concluded: "most students enter higher education virtually without any inkling of how to use a library."16 In line with this, Lynn Cameron at James Madison University observed: "The library must clearly define its instructional goals and objectives before it can assess whether they have been achieved."17

Still in its infancy, the Teaching Library set out to do just that. In early 1994, its staff met for a series of discussions to define a core set of competencies that they believed exit-level students at the university should possess. In doing so, the staff referred to both the Maryland Library Association's 1991 "Model Statement of Objectives for Bibliographic Instruction" and the 1987 "Model Statement of Objectives for Academic Bibliographic Instruction" prepared by the ACRL's Bibliographic Instruction Section (ACRL/BIS) and approved by the ACRL Board of Directors and the ALA Standards Committee.18,19 Following the library meetings, the user research coordinator drafted two discussion documents for use in promoting the library's instructional program.

The first document, "The Teaching Library: Gateway to Information Literacy," borrowed heavily from the work of Dennis Isbell and Carol Hammond, specifically, the document entitled "Information Literacy Competencies for Students" at Arizona State University West. It provides a succinct working definition of information literacy and a list of conceptual competencies around which the library's instructional program was being built. This document was designed to be shared with department chairs, faculty, and teaching assistants. The library's goals mirror those of Arizona State University West Library's initiative: to revise and improve library instruction to make it more relevant, market the program to growing numbers of new faculty, and promote the inclusion of an information literacy component in the curriculum development plans of each academic unit.20

The second discussion document, "Minimum Library Skills for Cal Graduates," focused on competencies described in ACRL/BIS's general objectives three, "How information sources are intellectually accessed by users," and five, "How information sources are physically organized and accessed," outlined in the "Model Statement of Objectives for Academic Bibliographic Instruction."21 With the library's statements of core competencies in place, the user research coordinator went about developing an instrument to measure whether graduating seniors at UC-Berkeley possessed these core competencies.

In 1993, Donald Barclay, coordinator of instruction at New Mexico State University Library, called attention to the recent emphasis on outcomes assessment in higher education, describing interest on the part of directors, deans, university presidents, and even state legislators in what college students are learning.22 The user research coordinator set out to answer this question with respect to basic student skills in conducting library research, as the first step toward achieving the overarching goal of attaining information literacy. Of course, outcomes assessment combines the measurement of basic skills or competencies with the assessment of higher-order abilities such as evaluation and critical thinking.
Stoffle defined competency-based education as "an educational approach which structures learning around competencies defined as fundamental for successful performance."23 According to her, the most comprehensive program of competency-based learning in 1980 had been located at the University of Wisconsin-Parkside. Wisconsin's goal for the library skills portion of its collegiate skills program was to develop in students "the ability to use the appropriate resources and services of a university library to identify, select and locate materials, both print and non-print, on a variety of subjects."24 The library staff initially chose this same approach, focusing on the fundamentals of information competence, those most basic to accessing information resources and upon which the higher-order information literacy skills of analysis, synthesis, and evaluation could be built.

But how could the library measure these fundamentals of information competence? In 1991, Arlene Greer, Les Weston, and Mary Alm observed: "A search of the literature reveals that most library questionnaires geared to an academic population principally address issues of user satisfaction." Librarians, they said, were faced with the difficult task of designing a survey instrument that would measure student competencies objectively.25 Jill Coupe, too, commented on the lack of a good survey of library skills and suggested that this may well be the reason why librarians have done so little in the way of measuring users' basic library skills.26

In developing a survey questionnaire to measure the information literacy levels of graduating UC-Berkeley seniors, the library's user research coordinator relied heavily on the assessment work of Jill Coupe at Johns Hopkins University, which she reported on in her article, "Undergraduate Library Skills: Two Surveys at Johns Hopkins University," and on the Wisconsin Association of Academic Librarians Education and Library Use Committee's Test of Minimum Library Use Skills, developed nearly a decade earlier.27

In the spring of 1994, the Teaching Library developed a self-administered mail questionnaire consisting of thirty-six multiple-choice questions. The first three questions were designed to collect information about the respondents themselves; the remaining questions were designed to test the respondents' mastery of basic library research skills and knowledge of the UC-Berkeley library system. The questionnaire was pretested on selected groups of undergraduates. A revision of the survey was mailed for the first time in the spring of 1994 to all graduating seniors in the political science and sociology departments. It was administered a second time to all graduating seniors in the history, history of art, and philosophy departments in the spring of 1995. A third survey was conducted in the spring of 1999, again involving all graduating seniors in history, political science, and sociology. The results of those surveys follow.

Results of the Information Literacy Competencies Surveys

Of the three occasions on which the Information Literacy Survey has been administered thus far, the first cycle resulted in the highest overall return rates. In the spring of 1994, 260 surveys were mailed to graduating seniors in the political science department, of which 185 were completed and returned to the library (a 71% return rate). One hundred and twenty-five surveys were mailed to sociology graduating seniors, of which seventy were returned (a 56% return rate).
Table 1 reflects the overall return rates from the 1994, 1995, and 1999 surveys. It remains a mystery why the return rates have dropped over time, as the methodology has remained consistent throughout all three of the survey administrations. This methodology included an initial mailing, accompanied by two follow-up mailings to nonrespondents. In all three administrations of the survey, the reward for returning completed surveys also remained the same: a $10 gift certificate that could be applied toward the rental of the respondent's cap and gown. Teaching Library Head Ellen Meltzer has observed that the income of UC-Berkeley students' families rose during this time, lessening the appeal of the incentive coupon.

The questionnaire first asked respondents to rate their library knowledge and skills on a four-point scale ranging from "Excellent" to "Pretty poor." Over the five-year span of the study, in all but one group (the 1999 graduating sociology seniors), over half of the respondents (and in some cases as high as 70% to 77% of the respondents) self-assessed their skills as either "Excellent" or "Pretty good." In no case during the five-year span of the study did more than 14 percent of the graduating seniors studied self-rank their skills as "Pretty poor."

The library's user research coordinator compared students' self-assessments of competency with their actual scores on the questions designed to measure their library and information research skills. In the latter case, anywhere from 35.5 percent to 81 percent of the respondents actually received poor or failing scores (defined as a score of 65% or lower) on the survey questions. Clearly, those graduating seniors surveyed held a higher opinion of their library research skills than they were able to demonstrate by their test scores.

In reporting on her evaluation of the Library Education Program at Ohio State University, Virginia Tiefel wrote: "students generally fail to realize the substantial differences between school/public and academic libraries and therefore overestimate the extent of their knowledge of the latter."28 In a survey conducted at the University of Northern Colorado, Greer, Weston, and Alm tested the hypothesis that both skill and confidence levels increase among college students as a result of cumulative exposure to the library. They found that self-assessed excellent or good library skills are markedly higher for seniors than for freshmen, yet they found "no dramatic trend of higher proficiency" from freshmen to seniors in the test categories.29 In her study of library skills among undergraduates at Johns Hopkins University, Coupe found that juniors and seniors were more likely than freshmen to rate their skills as "Excellent" or "Pretty good," but in contrast to the UC-Berkeley results, she found a significant relationship between students' opinions of their library skills and their actual scores. It should be noted, however, that significantly higher percentages of the Johns Hopkins students rated their library skills as "Pretty bad" or "Terrible" (nearly 40%) than did the UC-Berkeley students.

The most extreme example of misperceived competency among UC-Berkeley students was the 1995 graduating class in the history of art. None of these students self-assessed their skills as poor, and yet 77 percent received poor or failing scores on the survey's skill questions.
A similar phenomenon was discovered among 1999 respondents in political science and sociology. In the first case, only 4 percent rated their skills as "Pretty poor," and yet 71 percent received poor or failing scores. In the latter case, only 7 percent of sociology respondents rated their skills as "Pretty poor," yet 81 percent scored 65 percent or lower on the skills test.

Median overall test scores among the respondents ranged from a low of 54 percent (spring 1999 political science and sociology seniors) to a high of 73 percent (1995 philosophy seniors) over the three surveys. In only three cases, that of history and philosophy students from the 1995 survey and 1999 history seniors, was the median score over 65 percent. In the remaining five groups studied, the median information literacy competency score for graduating seniors was a failing score.

In the course of analyzing the survey results, five basic library skills were identified so as to compare test results among the subgroups surveyed. These were the ability to: (1) read a call number correctly, (2) identify subject headings in a library catalog record, (3) identify a reference to a book, (4) identify references to journal articles, and (5) interpret location information in a catalog serial record. The results of this analysis are reflected in table 4. In only one case, that of arranging library call numbers in order, were 66 percent or more of the respondents able to demonstrate this basic library skill consistently. In the other basic skills areas, the percentage of students who were able to demonstrate the basic skills being tested ranged from 21 percent to 100 percent.

In her study of library skills among undergraduates at Johns Hopkins University, Coupe had similarly disappointing findings. Although she measured different skills in particular, she found that (1) less than half of the juniors and seniors studied were able to identify what Library of Congress (LC) call numbers were, (2) only 40 percent knew not to search the online catalog to identify journal articles, (3) less than 35 percent could distinguish between a citation to a book and one to a journal article, and (4) only about one quarter of the juniors and seniors knew that the library catalog relied on the use of standardized LC subject headings.

When assessing senior English majors at James Madison University, Cameron found that over half did not know how to use the MLA International Bibliography, arguably one of the key resources in that field. She found that psychology majors scored poorly on questions relating to the American Psychological Association's Thesaurus of Psychological Index Terms. A selective review of the UC-Berkeley survey results revealed similarly disappointing findings.

In the 1994 survey of political science seniors, 78 percent were unable to identify the best source in the library for locating congressional publications; 66 percent could not identify what the Public Affairs Information Service is; and 60 percent were unable to identify what the Statistical Abstract of the United States is. However, 65 percent of graduating seniors in political science did correctly identify the appropriate resource for locating an introductory article on Marxism.
In 1995, 89 percent of history seniors were unable to identify what America: History and Life is; 56 percent failed to describe Current Contents; and 47 percent were unable to identify what Readers' Guide to Periodical Literature is. Fully 92 percent of the 1995 graduating seniors in the history of art could not identify what Readers' Guide is, and just less than half (46%) were unable to describe what Current Contents is and does. Among the 1999 sociology respondents, 69 percent could not identify what Sociofile is. It could be argued that with the current widespread use of electronic indexing and abstracting databases, the need to identify and describe tools such as the Readers' Guide is no longer as important to students as it was when the survey began in 1993-1994. However, students were equally uninformed about the newer electronic resources, such as Current Contents and Sociofile.

Other skill areas that were problematic for the UC-Berkeley graduating seniors surveyed included the ability to: (1) identify the catalog information needed to locate a physical item in the library, (2) find circulation information in a local catalog record, (3) identify the elements needed in a bibliographic citation, (4) limit search results, (5) recognize which indexes are searchable in the local catalog, and (6) determine when to consult a print versus an electronic indexing source.

In a 1992 article reporting the results of a study of general library skills among undergraduates at Indiana University-South Bend, Brian R. Schuck found that, on average, the students correctly answered only 34 percent of the multiple-choice questions pertaining to catalog use. He concluded that the results of the general library skills exercises administered to undergraduates revealed "a fair amount of confusion in understanding some rather elementary conventions for organizing information"; a fair number of study participants encountered problems with such sources/systems as the Library of Congress Subject Headings List, the library's own Periodical Holdings List, and the Library of Congress call number system. "Reference/instruction librarians cannot assume that students know how to identify a citation in even so well known a source as the Readers' Guide."30

Among the departments surveyed more than once at UC-Berkeley, history was the only one where a majority of students' basic library skills improved between the first and the second surveys (see table 5). Interestingly, history is one of the departments on campus where the library's most intensive library instruction efforts have taken place over the past five years. However, it should be underscored that the purpose of the library surveys was not to measure the effectiveness of its instructional program but, rather, to measure the basic library/information literacy competencies or skills of graduating seniors in selected departments. In their 1992 article on performance evaluation, Richard Feinberg and Christine King noted that "only a handful of articles has appeared that describe attempts at measuring program effectiveness."31

Overall, the UC-Berkeley surveys seem to indicate that the basic library skill areas where students appear to experience difficulties consistently, and where greater attention must be focused in library instruction sessions, include the ability to: identify and use subject headings; correctly identify references to books; and decipher the location of serials using the information contained in the library catalog's serial records.
Stoffle and Pryor wrote that among the benefits of competency-based learning programs is "the increased potential for structuring of high-quality, relevant library learning experiences."32 Over time, instructors in the Teaching Library have used the information from the 1994-1999 surveys and the results of the library's program of pre- and post-testing student research skills within selected course-integrated instructional sessions to identify those skill sets where more intensive instruction and hands-on practice are needed.

Lastly, the surveys included a question on the number of research papers students were required to produce over the course of their undergraduate years (see table 6). Anywhere from 50 percent to 92 percent of undergraduates surveyed were required to write six or more research papers within that time frame. From 21 percent to 46 percent were required to write eleven or more research papers as undergraduates. Possession of even the most basic information literacy skills could well have a profound effect on student success in researching and producing these required papers. Conversely, the absence of such skills could have compromised the quality of their work profoundly and added to the time and effort required to produce their papers.

Recent Developments in the Information Literacy Movement

Just as things elsewhere in the world at large and in the profession of librarianship seem to be changing and evolving on a continuous basis, so, too, has the notion of information literacy. A number of developments relating to information literacy have occurred since the UC-Berkeley Teaching Library first introduced its Information Literacy Surveys in the spring of 1994.

In 1996-1997, the California Academic and Research Libraries (CARL) Task Force to Recommend Information Literacy Standards to WASC (the Western Association of Schools and Colleges) drafted a statement of principles for information literacy criteria. It also revised the existing WASC standards, which included recommendations on the establishment of institutional information literacy assessment plans. The task force recommended that these plans include: (1) a description of expected learning outcomes; (2) an articulation of performance indicators for measuring specific information competencies; (3) a description of the process and methods for collecting data; and (4) a statement of how assessment results are incorporated into information literacy program planning and improvement.33 The task force's draft concepts are now being incorporated more fully into the proposed WASC Integrated Standards for Accreditation of Senior Colleges and Universities in California, Hawaii, and Guam. Currently, WASC is undergoing a rigorous self-evaluation and change in accrediting processes, developing new models of self-study, and refocusing accreditation on issues of educational effectiveness. The CARL draft Information Literacy Standards have proven instrumental to this effort. In December 1999, CARL past-president Carl Bengston appointed a CARL WASC Accreditation Standards Task Force. This task force has been actively engaged in reviewing and commenting on WASC's Proposed Capacity Standards and Educational Effectiveness Standards, as well as accompanying documents issued by the WASC Senior College Commission.
A Web site created by Esther Grassian and Susan E. Clark entitled "Information Literacy Sites," maintained on the ALA Web server, provides links to a dizzying array of megasites, national and local guidelines and reports, programs, tutorials, discussion groups, articles, and organizations and associations interested in information literacy.34 The National Forum on Information Literacy maintains its own Web site with links to, among other things, college and university information literacy programs and other information literacy Web sites.35

In March 1998, the ALA updated the Final Report of its Presidential Committee on Information Literacy.36 More recently, at the 1999 ALA Annual Conference in New Orleans, the ACRL/BIS brought together librarians and educational technologists from across the United States at Think Tank III to present papers on critical information literacy issues. Among the papers presented was one entitled "Justify Our Love: Information Literacy, Student Learning, and the Role of Assessment in Higher Education," by Anne Scrivener Agee and Craig Gibson, which examined issues relating to the measurement and assessment of information literacy and signaled a rekindling of interest in the area of information literacy assessment.37

In January 2000, the ACRL approved the revised draft of the Information Literacy Competency Standards for Higher Education. This document includes five broad standards and twenty-two performance indicators, and its authors recommended: "In addition to assessing all students' basic information literacy skills, faculty and librarians should also work together to develop assessment instruments and strategies in the context of particular disciplines." Further, they strongly suggested that assessment methods appropriate to the thinking skills associated with each outcome be identified as an integral part of the institutional implementation plan.38 The standards also were endorsed enthusiastically by the board of directors of the American Association for Higher Education (AAHE) in early May 2000. The Middle States Commission on Higher Education has distributed copies of the ACRL standards as resources to their outcomes assessment evaluators assigned to visiting teams for 2000-2001. Moreover, copies were provided to faculty and administrators attending the commission's Outcomes Assessment Conference in March of this year. These latter developments mark an important step in broadening the higher education constituencies into which information literacy standards and assessment planning now may be reintroduced and explored more substantively.

These and other developments in the information literacy movement, which have taken place during the six years since the Teaching Library's Information Literacy Survey was first introduced, undoubtedly will influence the library's thinking about information literacy assessment in the future and how its assessment program grows and evolves. Recent developments are causing the staff to rethink its basic assessment approach. With the assessment experience gained over the past six years, the library now has a baseline program from which future assessment initiatives can and will develop.
Conclusion

The recently adopted Information Literacy Competency Standards for Higher Education, issued by the ACRL Task Force on Information Literacy Competency Standards, describes five standards for information literacy, twenty-two performance indicators, and a number of outcomes that emanate from these indicators. The outcomes include skills such as determining the availability of needed information; defining an overall plan to acquire information; assessing the quantity, quality, and relevance of search results; evaluating the reliability, validity, accuracy, authority, timeliness, and bias of the information retrieved; recognizing the cultural and other contexts in which information is created; and understanding the impact of these contexts when interpreting the information.

The task force distinguishes between higher-order thinking skills and lower-order thinking skills. The outcomes just described can be said to fall within the higher-order skills. The information literacy skills measured by the Teaching Library's surveys of graduating seniors fall squarely within the sphere of lower-order skills.

The most fundamental conclusion that can be drawn from the University of California-Berkeley Teaching Library surveys is that students think they know more about accessing information and conducting library research than they are able to demonstrate when put to the test. Sadly, in five of the eight groups studied between 1994 and 1999, the median score for graduating seniors was a failing score. As indicated in other studies of student library research skills, the UC-Berkeley experience confirms that students continue to be confused by the elementary conventions for organizing and accessing information.

Why is this so? There are many possible reasons, including the fact that the state of California ranks close to the bottom nationally on funding for school libraries. In 1994, the entire state had only 850 school librarians.39 Seven out of eight schools in the state have less than half-time professional library staffing; and although the national ratio of library media specialists to students is 1:882, in California the ratio is 1:5342.40 With this very evident lack of support for school libraries within California, where the majority of UC-Berkeley students reside, is it any wonder that students arrive at the university without information literacy skills?

The Information Literacy Surveys focused primarily on the most fundamental and easiest-to-measure information competencies, described as lower-order thinking skills, considered basic to accessing information resources. It is upon these skills that the higher-order information literacy skills of analysis, synthesis, and evaluation are built.

The ACRL task force recommends that librarians, faculty, and others work collaboratively to develop assessment strategies and instruments. Further, it suggests that this activity can be useful in planning systematic and comprehensive information literacy programs. Finally, it strongly suggests that assessment methods appropriate to the skills associated with each of the outcomes described in the report be identified as an integral part of every institution's implementation plan. These are praiseworthy, yet ambitious, goals that reach well beyond the walls of the library and will require vigorous participation on the part of faculty to achieve.
They will call for far more comprehensive programs of assessment to be carried out, again, not just by librarians, but by faculty and other academic personnel on the campus as well.

On the UC-Berkeley campus, initial efforts to assess the information literacy skills of undergraduates have been conducted on a periodic and special-project basis by a half-time user research coordinator currently employed in the Teaching Library. If the campus and the library decide to adhere to the recommendations of the ACRL Task Force on Information Literacy Competency Standards, a far more extensive program, involving greater numbers and a far wider range of assessors, will be required. Patricia Iannuzzi, chair of the task force, wrote: "Information literacy incorporates conceptual, technical, and critical thinking skills. Information literacy is much more than library instruction, and requires an institutional involvement that extends far beyond the library."41 Faculty, in addition to library staff, will be required to take ownership for the development of authentic and course-embedded methods of assessment, including the creation of course assignments that require students to demonstrate mastery of the higher-order abilities outlined and described in the ACRL Information Literacy Competency Standards for Higher Education.

Iannuzzi wrote: "Information literacy assessment within the library includes measures that can be conducted by the library independently because it has control over the process and can generate and analyze data."42 She nonetheless concludes: "if we want to ensure that those skills are applied within other courses, that there is meaningful transfer to other learning environments, and that ultimately the quality of the students' work is improved, the assessment methodology [must move] beyond library control into collaborative efforts with the teaching faculty."43 Developing this thought even further, Iannuzzi wrote: "Strategies for campuswide assessment of information literacy extend far beyond coordination between the reference librarian and the individual faculty members, and beyond the library instruction coordinator talking to department chairs. Strategies at this level require a library culture for information literacy strong enough to influence a campus culture, and this begins with the senior administrators at our libraries and on our campuses."44 Clearly, greater institutional commitment will be necessary to achieve these noteworthy goals.

The results of the UC-Berkeley Teaching Library's Information Literacy Assessment Surveys may or may not represent a microcosm of what is happening elsewhere in the nation with respect to undergraduate information literacy skills. To better understand student information literacy skills nationwide, more systematic and widespread assessment will need to be conducted, and the results of these efforts will need to be shared, from library to library and from institution to institution. So, too, the results of the individual library's efforts in information literacy assessment will need to be shared with respective campus faculty and administration if information literacy is ever to become truly a part of each college or university's institutional assessment program.



Patricia Davitt Maughan. Assessing Information Literacy among Undergraduates: A Discussion of the Literature and the University of California-Berkeley Assessment Experience, College & Research Libraries, 2001, 71-85, DOI: 10.5860/crl.62.1.71