2016 Conference Reports

NASIG Newsletter, Sep 2016

Conference Sessions

The Canadian Linked Data Initiative
Embracing Evolving Technical Services Horizons
Classifying Librarians
E-books & Open Access Textbooks
Migrating to a Next-Generation LMS
Zine Acquisition and Cataloging
Embracing Undergraduate Research
Evidence-Based Acquisitions
An Example of Librarian/Publisher Collaboration
Incorporating Streaming Video into Workflows
Knowledgebase at the Center of the Universe
Managing Content in EBSCO Discovery Services
The E-Resource Librarian's Role in Library Service Platform Migrations
Open Access in the World of Scholarly Journals
A Reconsideration of an Academic Library's Current OpenURL Link Resolver Service
Defining and Refining the Role of Tech Services in New Resource Rollouts
Show Me the Value!

Vision Session
T. Scott Plutchak, University of Alabama at Birmingham, USA
Reported by: Susan Wishnetsky

Vision presenter T. Scott Plutchak began by recounting his past and present work experiences – library director, editor of the Journal of the Medical Library Association, member of the Scholarly Publishing Roundtable which informed the U.S. government's Open Access policy, and, currently, director of digital data curation strategies at the University of Alabama at Birmingham – which have taken him outside the library and into collaborations with different sectors of the "scholarly communication ecosystem." These experiences have led him to view publishers and other stakeholders not as adversaries, but as partners who are willing to offer their expertise to find the best ways to innovate and improve the discovery and dissemination of information.

Plutchak recommended the recently published Making Institutional Repositories Work, with a foreword written by Clifford Lynch, executive director of the Coalition for Networked Information. In the foreword, Lynch recalls his own 2003 paper "Institutional Repositories: Essential Infrastructure for Scholarship in the Digital Age," which envisioned institutional repositories as nurturing innovation and providing homes for new forms of scholarly information previously unavailable to researchers. Lynch's early vision stood in contrast to the view presented a year earlier in "The Case for Institutional Repositories: A SPARC Position Paper," by SPARC senior consultant Raym Crow, which envisioned institutional repositories as mechanisms to move traditional scholarly publishing into academia, to compete with traditional publishers, and to support a transition to Open Access publishing. Both Lynch and Crow also saw the institutional repository as a mechanism to highlight an institution's research activities.

In recent years, however, research information management systems such as Vivo, Symplectic Elements, and Elsevier's Pure have emerged, along with tools such as ORCID identifiers and Altmetric. (ORCID identifiers combined with Altmetric are capable of identifying faculty authors and pulling in metadata from their published works. The tools have analytic capabilities to provide a complete picture of faculty output, including information on grants and teaching as well as publications, and they offer collaborative tools to bring researchers together.) Plutchak maintained that such systems eliminate the need for the institutional repository to function as a showcase for an institution's research output.
Research information management systems cannot provide access to content restricted by license; however, in some cases, institutional repository managers are able to provide access to some version of their faculty's published works through the institutional repository. Plutchak warned that posting additional versions of articles available elsewhere brings its own problems. For example, if the institutional repository's version has not undergone peer review, it may not be pointing patrons to the best, most authoritative version of the content. If the article submitted to the repository is later corrected or retracted, it is unlikely that the version in the repository will contain those updates. While today's repositories house many of the types of unpublished material Lynch had in mind – theses, dissertations, multimedia formats, syllabi and other teaching material, and research data – there is still a widespread focus on obtaining versions of peer-reviewed articles, with some libraries imposing mandates on their faculty to deposit some version of their publications. Due to the National Institutes of Health's (NIH) public access policy, federally funded medical research is now being made openly available; other federally funded research may soon follow. (The European Union is developing similar policies.) Since PubMed Central and other well-curated repositories are hosting this research, Plutchak wondered why institutional repositories duplicate their effort by hosting additional versions of the same content. As for Crow's vision of moving the functions of traditional publishing into academia, Plutchak acknowledged the work of the Library Publishing Coalition and its members in that area, but concluded that we mostly remain dependent upon traditional publishers.

Plutchak wrapped up by supporting the use of research information management systems to manage faculty metadata and promote institutional research, and calling for greater attention to the often neglected issues of interoperability among institutional repositories and the creation of a network of repositories. He urged a reduction in duplication of traditionally published content in institutional repositories and an effort to point patrons to an article's version of record (or the closest version to it that is available). Plutchak concluded that the focus for institutional repository managers should be on making available more material that falls outside of traditional publishing.

When asked what existing group might create the network of repositories he mentioned during his presentation, Plutchak pointed to the publisher group Clearinghouse for the Open Research of the United States (CHORUS) (http://www.chorusaccess.org/) and the academic group SHARE (http://www.share-research.org/) as organizations already working along those lines. An audience member suggested that the "publish or perish" standard for faculty was leading to the rise of predatory publishers. Plutchak agreed, and said that the Open Scholarship Initiative (http://osinitiative.org/) was planning to reach out to university administrators to discuss reforming the process of promotion and tenure. In addressing a question on how much time libraries should spend creating metadata, Plutchak acknowledged that there are always more things that need doing than time or energy to do them, and advised focusing on areas where the most can be accomplished with the greatest ease, avoiding areas that may cause roadblocks and frustration.
One audience member mentioned smaller, less sophisticated journal publishers whose content tends to move around and sometimes disappear, and wondered if institutional repositories might play a role in preserving that material. Plutchak recommended that such publishers might be directed to other established repositories that specialize in preservation, but agreed that a library could take on such a role if it made a commitment to "adopt" the journal and take responsibility for it. Another audience member indicated that many faculty members are depositing material in ResearchGate, and ignoring the library's repository. Plutchak admitted that despite ResearchGate's faults, many researchers like the "social networking" features that library repositories cannot provide, and suggested that we need to reconsider the role of our library repositories in the information ecosystem. Several audience members asked about including undergraduate projects; Plutchak responded that giving citations and DOIs to these works provided a tremendous service to students. One commenter noted that there was little discussion of preservation in Making Institutional Repositories Work, and wondered if it had been overlooked. Plutchak opined that real long-term preservation was very tough and probably should not be the focus of a single, stand-alone institutional repository.

The Power of Open
Heather Joseph, Executive Director, Scholarly Publishing and Academic Resources Coalition (SPARC)
Reported by: Rachel Miles

Heather Joseph spent fifteen years as a publishing executive in both commercial and not-for-profit organizations before serving as SPARC's Executive Director. SPARC leads efforts in the U.S. and worldwide to create and maintain Open Access policies and practices. Access to information, data, research, and educational resources has never been more promising; yet, much of this crucial information is still concealed from the general public and the researchers most in need of using it due to publisher pricing, restrictive licenses, and prohibitions on reuse.

Joseph opened the session with the current state of the Open Access (OA) movement, and in particular, the Budapest Open Access Initiative (BOAI), which has worked for the past decade to "provide the public with unrestricted, free access to scholarly research" (http://www.budapestopenaccessinitiative.org/read). The original BOAI declaration asserts that "an old tradition and a new technology have converged to make possible an unprecedented public good" (http://www.budapestopenaccessinitiative.org/read). While the OA movement has certainly made great progress in the fourteen years since the BOAI declaration was written, there are still many complex barriers to overcome. Today, as in the past, scholars share their research and creative works without the expectation of compensation in order to build upon existing knowledge and to enhance their research skills and professional development. The concept of "open" removes the barriers to access by allowing everyone — the research community as well as the general public — to immediately and freely access and reuse content. Joseph described scholarship as an ecosystem of sharing. As library budgets have shrunk or remained stagnant, journal prices have increased.
The traditional publishing model is no longer sustainable, and some stakeholders, including faculty, students, policy makers, funders, individual publishers, and members of the public, believe that scholarship deserves a model that allows for the greatest return on investment. Joseph highlighted several examples of opening up research to all, with one remarkable instance standing out among the rest: between 1988 and 2012, researchers with the Human Genome Project decided that all data and new information produced would be "freely available online within 24 hours of discovery" (http://sparcopen.org/impact-story/human-genome-project/). The project generated $956 billion in economic output with more than $293 billion in personal income through wages and benefits. Economics aside, the project also led to a number of scientific breakthroughs and helped develop new DNA screening tests and diagnostic tools "capable of quickly identifying diseases and infections" (http://sparcopen.org/impact-story/human-genome-project/).

Despite inspirational success stories, there is still a long road ahead for the Open Access movement and its advocates. Joseph described SPARC's involvement in the OA movement as "too close" and "in the trenches," which often leads to difficulty in recognizing the greater implications of Open Access; this simple awareness led SPARC to first assess the OA landscape and then develop strategies based on their assessment. When SPARC was established in 2002, there was a great deal of "stumbling around in the dark" before learning how to navigate the landscape of the OA movement. Overall, SPARC deduced that there are four themes that need to be addressed in order to move forward:

1. The Open Access landscape is much greater and more complex than we realized. Open Access applies not just to scholarly journals, but to data, software, educational resources, and more.
2. SPARC must now define its end goals in order to communicate to stakeholders the impact of defaulting to "open" in research and education.
3. SPARC's goals must not advocate for "open" for "open's" sake. SPARC must address what "open" achieves.
4. SPARC intends to help start a movement that will reward "open" in meaningful ways.

Recently, an opportunity arose to assist SPARC in promoting its newest initiatives. In October 2015, Vice President Joe Biden announced a plan to lead a "moonshot" to cure cancer (http://www.sciencemag.org/news/2016/01/what-vice-president-biden-s-moonshot-may-mean-cancer-research). The effort intends to accelerate progress on cancer treatments and to find strategies to remove the barriers that prevent researchers from making progress. SPARC, with Joseph at its helm, has determined that certain obstructions prevent the progress of the OA movement. While the task ahead appears daunting, the overwhelmingly positive responses to the OA movement from past initiatives have propelled the advancement of research forward. Joseph asserted that the time has now come to break through the obstacles that continue to stall progress in science and the arts by changing the conversation from talking about "open" for the sake of "open" to helping stakeholders understand the consequences of a world in which publishers control the majority of access to scholarly and educational content. Librarians have made, and will continue to make, a far-reaching impact on the scholarly community and the general public.
Budapest Open Access Initiative. Read the Budapest Open Access Initiative. http://www.budapestopenaccessinitiative.org/read

Kaiser, Jocelyn. 2016. "What Vice President Biden's moonshot may mean for cancer research." Science (January). http://www.sciencemag.org/news/2016/01/what-vice-president-biden-s-moonshot-may-mean-cancer-research

SPARC. From ideas to industries: Human genome project. http://sparcopen.org/impact-story/human-genome-project/

Conference Sessions

The Canadian Linked Data Initiative: Charting a Path to a Linked Data Future
Marlene van Ballegooie, University of Toronto Libraries
Juliya Borie, University of Toronto Libraries
Andrew Senior, McGill University
Reported by: Susan Wishnetsky

Marlene van Ballegooie began the presentation with some background on the Canadian Linked Data Initiative. In the fall of 2011, the Library of Congress announced its Bibliographic Framework Initiative, which would eventually replace the MARC format. Just over a year later the BIBFRAME model for bibliographic description was introduced. When Library of Congress catalogers began testing BIBFRAME for a wide variety of formats and languages in August 2015, the coming changes became real and urgent. In the U.S., a transition team was already being formed. Linked Data for Production (LD4P), a collaboration of five universities (Columbia, Cornell, Harvard, Princeton, and Stanford) with the Library of Congress, was formed to reinvent the production of metadata, to work with standards organizations to establish common protocols and procedures, to test and expand the BIBFRAME ontology, and finally to transition library systems to the linked data model. A related project, BIBFLOW, was established to analyze existing workflows in library systems and find ways of migrating them to the new model.

The major research universities in Canada have a long history of collaboration on many projects, including sharing a single library platform. Via one of their regular teleconferences, the five largest research libraries in Canada (University of Toronto, University of British Columbia, McGill University, Université de Montréal, and University of Alberta) formed their own joint initiative to develop a path toward linked data. In September 2015, they held a daylong meeting with LD4P members and other experts at the annual Access Conference in Toronto, which resulted in an agreement to cooperate, a communication plan, the development of initial working groups, and the inclusion in the initiative of three additional libraries national in scope (Bibliothèque et Archives nationales du Québec, Canadiana.org, and Library and Archives Canada). More working groups and relationships between them were later established, which Juliya Borie presented in the form of a linked-data cloud diagram. There is also a steering and planning committee consisting of associate university librarians and working group chairs, which meets via a monthly conference call; it is intended to provide vision, enthusiasm, and leadership to the members of all the working groups. A shared web space was quickly established for documentation. Research Council, a national funding body. The Education and Training Working Group is collecting resources and preparing to train others by educating themselves. They have participated in online training and made several presentations on linked data to staff and senior management.
The Digital Projects Working Group has identified possibilities for collaborative projects around linked data, including student publications, historical postcards, and a collection to celebrate the 150th anniversary of Canada in 2017. The French Language Working Group will assist with translation of documentation and try to identify the needs of the French-speaking community for authorities and identifiers. The Identifiers Working Group is tackling the enhancement of legacy data with URIs and other linked data elements, and exploring how linked data tools such as OpenRefine, MARCEdit BibNext, Catmandu, Karma, and RIMMF (RDA in Many Metadata Formats) can be used in metadata production. The BIBFRAME Editor Working Group is testing and examining tools as they become available (e.g., BIBFRAME Editor from the Library of Congress and BIBFRAME Scribe from Zepheira). The IT Working Group was only recently formed, to enable the integration of linked data into digital repositories and provide programming expertise. The User Experience Working Group, of course, is planned for the future.

Andrew Senior concluded by listing the challenges ahead: the "big picture" challenges of funding, coordination, and reaching multicultural and multilingual institutions over the wide expanse of Canada, as well as the individual challenges of incorporating new workflows and making the "mental shift" to new ways of thinking. Future challenges will involve migrations, working with vendors to ensure interoperability of systems, and finding "meaningful" ways to connect library data to the web. Senior recommends small steps and patience, combined with a culture of learning and an atmosphere of optimism.

Charting a Course toward Embracing Evolving Technical Services Horizons
Nadine Ellero, Auburn University
Reported by: Kelli Getz

Nadine Ellero, head of Technical Services at Auburn University, began her tenure by analyzing current processes in technical services. She quickly noticed that the department faced many challenges, including creating more efficient ways to serve users; pruning and maintaining print resources; and maintaining print and electronic workflows. As the department's leader, Ellero had to make the environment safe for staff to provide honest feedback. She met with each staff member to learn their "pain points." She felt that this was an important step because it fostered an environment of honesty and respect. It became clear that the experienced staff had been overlooked for some time, and they felt uncertain managing electronic materials and dealing with the increasing complexity of the work. In addition, she faced the challenge of blending new staff into the department. In bringing the disparate groups together, she focused on seeking the truth, doing the right thing, and promoting respect through frequent communications. It took nearly a year to gain staff trust, but she eventually did see results of her hard work.

She felt responsible for creating a new culture of servant leadership based on growth, caring, and communication. The first step in implementing the new culture was to focus on personal growth. Personal growth would allow staff to better embrace change. Ellero found that she often became a counselor for staff on their personal growth journeys. Additionally, Ellero sought to instill and emulate a learning and productive environment that invites expression of thoughts and ideas, especially those unknown or unpopular.
Her staff has become a group of individuals who value and work on the art of listening, who reflect and share to effectively solve problems, and who create new products and services by seeking truth, by promoting respect, and by helping each other. Ellero emphasized the importance of frequent communication. Her next project is to work on holding effective large group meetings to solicit more meaningful feedback. She makes it a point to touch base with each staff member as often as possible as part of her communication strategy. Ellero feels that it is time well spent due to the professional growth demonstrated by her staff over the past year.

Ellero cautioned against potential pitfalls, such as experiencing burnout. She experienced burnout because most days she was unable to get her own work completed due to spending so much time working with staff. The burnout went unchecked and eventually caused her physical ailments. Also, both Ellero and her staff had to learn that it was impossible to control everything and that mistakes were going to happen. Ellero chose to accept the mistakes as learning opportunities. Additionally, she had to accept the inevitable conflicts that she would encounter. Overall, Ellero transitioned reluctant, experienced staff into more open-minded individuals by building up their self-esteem and empowering them to make decisions.

Classifying Librarians: Cataloger, Taxonomist, Metadatician?
Beverly Geckle, Middle Tennessee State University
David Nelson, Middle Tennessee State University
Reported by: Marsha Seamans

Beverly Geckle and David Nelson reviewed approximately 300 job ads from 2013 to 2016 that had "cataloging" or "metadata" in their title or job description. They deconstructed the job ads and analyzed the use of the terms "cataloging" and "metadata" in order to identify trends within the profession. They did not examine organizational structures of the institutions for whom the jobs were posted. The analysis identified fifty-four unique job titles, including ones which contain some form of "cataloger/cataloging," "metadata," "metadata and cataloging," "metadata and [something else]," as well as many where the terms were just part of the job description.

Besides the proliferation of job titles, a number of general observations emerged. Job ads for cataloging and metadata services included a high, perhaps unrealistic, set of expectations that blend cataloging and computer programming expertise. The length of the job ads has increased, along with the list of desired personal qualities in the job description. Finally, the term "metadata" was ambiguously defined in job postings. The qualifications in job ads often included knowledge of or experience with both cataloging and metadata standards, as well as programming skills and software knowledge. Additionally, the ads usually required previous experience. These trends raise a number of questions and concerns: Is expertise being sacrificed for doing more? How does one demonstrate experience? How do we train future librarians if experience is a requirement? Finally, as we look at the direction in which libraries are headed, will we start seeing job ads for linked data librarians? Some of the personal qualities that appeared in job ads included: innovative, creative, energetic, self-motivated, collaborative, forward-thinking, knowledgeable, service-oriented, dynamic, flexible, and detail-oriented.
The use of these evaluative adjectives raises the questions of how such qualities are presented by candidates and how they are judged by those doing the hiring. Despite the proliferation of the term "metadata" in job ads, the definition remained elusive, and the presenters wondered if the term "cataloging" is now deemed archaic and "metadata" more current. Metadata is typically defined as data about data, but job qualifications typically reference knowledge of content standards such as Library of Congress Subject Headings, Library of Congress Classification, Dewey Decimal Classification, the Art & Architecture Thesaurus, the Union List of Artist Names, and the Thesaurus of Geographic Names. Metadata often refers to schemata rather than content standards. The presenters argued that what is really needed is "data value creators using metadata standards." Catalogers might be thought of as taxonomists rather than metadata librarians, with taxonomy being defined as the science of classifying things. As expected, the deconstructed job ads identified a number of trends in the profession and raised important questions, and the presentation engaged the audience in a lively discussion about those trends. The presenters concluded by suggesting participants read Heather Hedden's The Accidental Taxonomist.

E-books for the Classroom & Open Access Textbooks: Two Ways to Help Students Save Money on Textbooks
Jason Boczar, University of South Florida
Laura Pascual, University of South Florida
Reported by: Nancy Hampton

Jason Boczar and Laura Pascual work in the University of South Florida (USF) Library. Boczar is the digital scholarship and publishing librarian. Pascual is the electronic resources librarian and manages the university's "E-books for the Classroom" program. Their presentation focused on three main topics: the need for textbook affordability programs; initiatives USF is taking in this area; and how two programs were implemented (E-books for the Classroom and Open Access Textbooks).

The Government Accountability Office determined that textbook prices increased 82% between 2002 and 2012. At USF over half of all students receive financial aid packages that include Pell grants, scholarship aid, and federal student loans. When surveyed, over half of respondents admitted to foregoing the purchase of textbooks due to cost, despite the fact that this decision could negatively impact their grades. On October 8, 2015, the Affordable College Textbook Act was introduced in the United States Senate. This Act would direct the Department of Education to make competitive grants available to institutions of higher education to support pilot programs that expand the use of open textbooks. In response to the need for affordable textbooks, Boczar and Pascual created the Textbook Affordability Project. They determined that librarians, with their knowledge of instructional materials and their experience with publisher licenses, are well suited to advise faculty and students on e-books, reserves, Open Access, and the most affordable textbook options. The USF Library developed a website with information about the most affordable textbooks, e-books for the classroom, course reserves, and Open Access textbooks. Their website recommends that faculty request DRM-free e-books so that students can access supplemental readings as well as required readings online. Open Access textbooks are encouraged because faculty at USF can control the content of the textbook as well as its cost.
In addition, Open Access textbooks can incorporate interactive materials such as videos and maps, and they can be hosted on the university's institutional repository. In order to increase faculty participation, USF librarians worked with the Provost's Office to promote the creation and use of Open Access materials. Boczar and Pascual described challenges they experienced while assisting with the creation of Open Access textbooks, including that different Open Access platforms use different formats. For example, they noted that one platform may use the iBook format and another may simply use PDF. When PDF is used, a separate PDF should be created for each chapter rather than for the whole book, so that patrons can download or print only the chapters that they want. The library team also needs to locate peer reviewers, provide copy editing, and host the content on the university's institutional repository. Peer reviewers need to be given ample time to review the materials once they receive them. Faculty authors will need to be compensated for their time. Librarians will need to gather all copyright permissions as early as possible. The presenters noted that getting these permissions can be time consuming. Once a new Open Access textbook has been created, the library should inform all faculty about the new resource even if it is not within their discipline. Once they see how Open Access works, they will want to create material of their own.

Embracing Changing Technology and New Technical Services Workflows in Migrating to a Next-Generation Library Management System
Kay Johnson, Radford University
Jessica Ireland, Radford University
Reported by: Martha Hood

In 2015, Radford University decided to migrate to OCLC's WorldShare Management Services (WMS). Kay Johnson and Jessica Ireland shared their experiences with the migration process and their analysis of the workflow within the Collection and Technical Services (CaTS) Department at McConnell Library. One of the first instrumental decisions was to evaluate what data would migrate and what would not, along with assessing what data in records would need to be cleaned up before migration. WMS migrated bibliographic and item records, along with patron and circulation information, reserves, and holdings records, as expected. The knowledgebase, acquisitions and electronic resource management system (ERMS) data, check-in records, and authority records would not migrate, and careful planning was needed to manage them accordingly. Attendees learned how Radford University's librarians dealt with the difficult challenge of accurately reflecting thousands of local holdings records for their serials in OCLC while retaining critical data in check-in notes, such as routing information, coverage, and other important details, during the migration process.

Next, the speakers shared how they mastered setting up the knowledgebase, aptly named Collection Manager. One huge challenge was the inability to batch import data into OCLC's knowledgebase (this would create custom collections that would not be updated automatically by OCLC). Therefore, the librarians decided to individually update collections and titles, a huge undertaking, but one that was needed in order for the collections to be automatically updated by OCLC.
Although this process was not the most streamlined, they loved the ease of turning on collections and individual titles in WMS, along with the ability to access links between the knowledgebase and financial data in the acquisitions module. There is, however, improvement needed in accurate linking to streaming video and music collections, more timely removal of titles from various collections, and better refined searching. When switching library management systems, it is important not only to carefully plan out the details of what data to move and when, but also to train staff in the new system. The library took on this challenge by having weekly meetings and utilizing many training videos, in addition to collaborating and networking with various other universities to better learn from their experiences. They also examined their workflow, proposed changes, and hosted question and answer sessions with their staff. One particular idea which alleviated apprehension among staff was having a special CaTS Retreat. This was an opportunity to go through the NASIG Core Competencies, conduct PEST (political, economic, socio-cultural, and technological) and SWOT (strength, weakness, opportunity, threat) analyses, and review the position descriptions of various staff.

Post-migration projects naturally developed during the migration process. Primary focus was given to verifying the accuracy of serials titles and local holdings records, and to simplifying journal location fields. Another post-migration project involved creating order records and updating historical payment information in the new system. Overall, the Radford University librarians were pleased with the relatively smooth process of migration. They unified and carefully planned in a very limited timeframe, and most impressively had less than one percent of their records fail to match up with OCLC's bibliographic records! Best of all, they were pleased that OCLC's WMS and knowledgebase operate on all browsers and electronic devices. In the future, they will continue to review possible changes to positions and workflows; submit enhancement requests as needed; populate a license manager; and develop procedures for their department.

Embracing the Zines: Zine Acquisition and Cataloging at the Vassar College Library
Heidy Berthaud, Vassar College
Reported by: Scott McFadden

Zines are self-published works, created by individuals or groups, usually sold or distributed directly by their creators. They represent voices and narratives often absent from traditional publishing. The library of Vassar College, a private, four-year liberal arts college with a diverse and socially active student body, maintains a zine collection. The collection consists of 182 cataloged zines, with others waiting to be cataloged. Cataloging these materials began in 2014, and the zines were made available to the public in the catalog in the fall of 2015. While zines can cover a wide variety of subject matter, Vassar collects mostly ones pertaining to political issues. The concept of ethical zine collection is central to Vassar's collection development policy. Most zines are not done for profit, and zine creators, a.k.a. "zinesters," spend their own money to produce zines. Thus, a policy of ethical zine collection suggests the library should purchase the zine directly from the creator whenever possible, which helps the zinester defray costs.
When direct purchase from the creator is not feasible, a second choice is to purchase from a zine distributor, a.k.a. "distro." It is also considered ethical to give the zine creator the right of refusal, as some creators intend their zines for a particular specialized audience and prefer that they not be more widely available to the public at large. In practice, Vassar has found that most zinesters are happy to be included in the collection, and the library has received many thank you notes from creators. Unlike traditionally published materials, zines require much more active searching on the part of the acquisitions librarian. Sources such as Twitter, Etsy, and Tumblr are good ways to find zines. As mentioned above, online distros are also good sources of zine content. While the zine creator does not typically receive as much money for a zine purchased through a distro as for one purchased directly, they do still receive some remuneration.

Cataloging zines can create a number of challenges, since many zines deliberately decline to follow the paradigms of traditional publishing. For example, in many cases, common elements such as dates of publication or places of publication are simply not present. It may even be difficult to discern the intended title of the zine. Zines are deliberately radical and unconventional. For this reason, local practices will play a large part in a library's cataloging of zines. Identifying a zine's author can also be challenging, as many authors employ pseudonyms, and in some cases have reason to prefer the anonymity this provides. In many libraries which collect zines, the MARC name qualification $c (Zine author) has begun to be used in name authority records. For example, a zine might be entered under the heading Rachel $c (Zine author). Vassar maintains a file of known zinesters, with their names and preferred pronouns. Another development among libraries which catalog zines has been the creation of a metadata standard called xZINECOREx. Based on Dublin Core, xZINECOREx offers metadata elements important to zine publishing, including subject matter, genre, content notes, freedoms and restrictions on distribution, provenance, and trigger warnings.

Because of the unconventional nature of zines, Library of Congress Subject Headings (LCSH) are often not a good fit for the subject matter of these publications. Vassar's zine collection is heavily focused on diversity, and LCSH is often at odds with the terms that zinesters use to describe themselves. Those outside the traditional gender binary, as well as genderless people, are not well represented by the terms of LCSH. Vassar attempts to use language that is inclusive and that reflects the usage of the community being described. Thus, local subject headings are created when necessary. For example, the term "transsexual" is controversial within the zine community, and so it is used as a subject heading only when it actually appears in the zine being cataloged. In addition, Vassar has established local subject headings for terms such as "white privilege" and "non-binary gender," even though such terms are not included in LCSH. This policy is an attempt to be true to the resource being cataloged, rather than being true to the cataloging code. In cases where subject headings seem inadequate, the cataloger may also rely heavily on summary notes, which attempt to include as many keywords as possible that might be searched for by researchers.
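The local conventions described above (a $c qualifier on the name heading, locally defined subject terms, and keyword-rich summary notes) can be pictured as a simple record structure. The sketch below is only an illustration of how such a zine record might be assembled and displayed in a MARC-like mnemonic style; the title, date note, and local 690 headings are invented examples rather than Vassar's actual records, and a production workflow would more likely use a dedicated MARC library.

```python
# Illustrative sketch of a zine record reflecting the local practices described
# in the session. All values are hypothetical examples, not Vassar's data.

# Each field is (tag, indicators, list of (subfield code, value) pairs).
zine_record = [
    ("100", "1 ", [("a", "Rachel,"), ("c", "(Zine author)")]),       # pseudonymous creator with local $c qualifier
    ("245", "10", [("a", "Example perzine on identity and community.")]),
    ("264", " 1", [("c", "[date of publication not identified]")]),   # publication data often missing from zines
    ("520", "  ", [("a", "Personal essays on gender identity and zine culture; "
                         "keyword-rich summary compensates for the limited fit of LCSH.")]),
    ("690", " 4", [("a", "Non-binary gender.")]),                     # local subject heading, not in LCSH
    ("690", " 4", [("a", "White privilege.")]),                       # local subject heading, not in LCSH
]

def format_field(tag, indicators, subfields):
    """Render one field as a mnemonic-style line, e.g. =100 1 $aRachel,$c(Zine author)."""
    body = "".join(f"${code}{value}" for code, value in subfields)
    return f"={tag} {indicators}{body}"

for tag, indicators, subfields in zine_record:
    print(format_field(tag, indicators, subfields))
```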
The session concluded with an activity for the audience that illustrated the challenges in cataloging zines. The audience members were shown examples of zines that posed particular cataloging difficulties.

Embracing Undergraduate Research: Creating the Arsenal
Melissa Johnson, Augusta University
Kim Mears, Augusta University
Reported by: Maria Aghazarian

Melissa Johnson and Kim Mears presented, in NASIG's first Skype session, on how Augusta University's libraries were involved in the creation of a new Open Access undergraduate research journal, Arsenal. They presented a detailed report of the journal's creation, including empowering interested students, creating a journal identity that meshed with the university's identity, and discussing challenges and future plans. Johnson and Mears began with some context for the university, which was recently formed through the consolidation of two public Georgia universities. The university highly values undergraduate research and has two research programs in place, including the Center for Undergraduate Research and Scholarship (CURS). Excited to share their research, students formed an organization called On the Shoulders of Giants (OSG) and approached CURS with the idea of starting a journal. The importance of the journal was evident: publishing allows students to see the value of their research by making it publicly available, establishing students as creators of knowledge as well as consumers.

A major success of the journal was the ability to show CURS that costs could be kept to a minimum. The institutional repository was chosen as the journal's home due to supported web hosting, archiving, and platform stability without extra cost or staffing. Article submissions were handled through a Wufoo form, as the university was already a subscriber. A LibGuide was created for the journal's homepage to give students more control over the look and feel of the site. OSG's student organization budget funded CrossRef fees so DOIs could be assigned to published articles. Finding an appropriate name was challenging. The students originally wanted to name it after OSG, but their advisors recommended coming up with a title that would connect more closely to the university's identity. This would encourage faculty and student buy-in, and showcase the journal as a part of the university's research identity. In the 1800s, the Summerville campus was an arsenal, so the name "Arsenal" had significance.

The editorial board is composed of faculty members, librarians, and OSG student members, providing a great opportunity for librarians to teach students about copyright and Open Access. While they had support, students were primarily responsible for the core decisions of the journal, such as aims and scope, metadata infrastructure, and the peer review model. One of the most important decisions was to create a faculty mentor consent form. This form required student authors to seek guidance from a faculty member who would oversee ethical and legal aspects of the research, including institutional review board (IRB) approval. An unexpected challenge for the Arsenal was apprehension from CURS faculty. Some faculty members were hesitant to encourage students to submit to the journal because they wanted to ensure that the articles produced were credible scholarly products. Sustainability is an ongoing challenge, especially considering the rate of faculty turnover since the consolidation.
Future plans for the journal include applying for Directory of Open Access Journals (DOAJ) inclusion, creating subscription notifications when new issues are published, continuing to increase faculty buy-in, marketing the journal, and indexing the journal articles.

Exploring the Evidence in Evidence-Based Acquisitions
Stephanie J. Spratt, University of Colorado Colorado Springs
Reported by: Derek Wilmott

Stephanie Spratt shared the University of Colorado (CU) Libraries' experience with two different demand-driven acquisition platforms. She and her colleagues at the University of Colorado campuses (Colorado Springs, Boulder, and Denver) had the opportunity to compare the Alexander Street Press evidence-based acquisition (EBA) model with Kanopy's patron-driven acquisition (PDA) model for streaming video. The CU Libraries began their comparisons with usage statistics. Issues that arose included the types of usage statistics available; interpretation of the gathered usage statistics; and other data provided in the usage reports. A second comparison focused on assessments of the EBA and PDA models and workflow comparisons to other resources or models.

Spratt first pointed to differences and similarities between the EBA and PDA models through the lens of the Alexander Street Press and Kanopy platforms. In the case of Alexander Street Press EBA, there is an upfront monetary commitment with the cost known at the program's start. Selections are mediated, as the collection development librarian decides which titles to purchase at the end of the contracted time. Kanopy's PDA, on the other hand, has quarterly invoices for videos accessed, and a less flexible spending option that requires a deposit account for libraries. Video selection is not mediated and relies on patrons to trigger purchases. According to Spratt, the licenses for streaming videos in Kanopy have a default term of one or three years. The library can decide which subject areas and producers they wish to activate, and they enjoy full public performance rights. Both Alexander Street Press and Kanopy provide the following: free MARC records for discovery; accessibility features; library management system (LMS) integration; and flexible clip and playlist construction by patrons.

The CU Libraries examined the setup, maintenance, and assessment process for both platforms. The initial setup for the Alexander Street Press EBA program needed an up-front decision as to where to place access points into the platform. It was noted that with the Kanopy PDA platform, selecting subject area or producer collections took more time than activating the entire catalog. MARC records for both platforms required deduplication efforts in the libraries' LMS. This meant that when one institution purchased a title, the other libraries needed to suppress the title from displaying for the rest of the consortium. The Alexander Street Press EBA program required constant monitoring by staff to track usage and make purchasing decisions for the consortium by the program's end. One concern was the possibility that individual title selection could cause double payment if a subject collection was purchased at a later date. The Kanopy PDA platform does not require staff to monitor usage for triggering a video licensing event. However, staff did spend more time managing quarterly invoices and tracking the deposit account, if that option was selected. Finally, Kanopy licenses needed to be reviewed for renewal before the expiration of the program.
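To make the mediated EBA selection step described above concrete, here is a minimal sketch of how a selector might rank titles by usage and choose purchases that fit within the committed spend at the end of a program. The titles, prices, and usage counts are invented for illustration, and this is not the CU Libraries' actual workflow; in practice the vendor's usage report would supply the input data.

```python
# Minimal sketch: end-of-term EBA title selection by usage within a fixed commitment.
# All titles, prices, and usage counts below are hypothetical.

eba_commitment = 5000.00  # up-front spend committed at the program's start

# (title, list price, recorded uses) -- in practice taken from the vendor's usage report
usage_report = [
    ("Documentary A", 350.00, 412),
    ("Lecture Series B", 500.00, 57),
    ("Feature Film C", 150.00, 9),
    ("Training Video D", 250.00, 230),
]

# Rank by cost per use (lower is better), then spend the commitment down the list.
ranked = sorted(usage_report, key=lambda t: t[1] / max(t[2], 1))

selected, remaining = [], eba_commitment
for title, price, uses in ranked:
    if price <= remaining:
        selected.append(title)
        remaining -= price

print("Titles to purchase:", selected)
print(f"Unspent commitment: ${remaining:.2f}")
```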
The last part of the presentation focused on what the CU Libraries learned, pointing out the best features of both programs, and describing the next steps that they decided to take. Spratt advocated for the need to actively promote the programs. Cost is definitely a factor in deciding which platform to use. Setting up the platforms required two months, which they felt was excessive. There was also a need to manage faculty expectations; Spratt gave the example that the University of Colorado Colorado Springs no longer had access to the PBS streaming videos, which disappointed some faculty.

The Alexander Street Press EBA model is best suited for libraries with available space in their budgets for perpetual-access streaming video. It has extensive program offerings, and patrons can provide input on which subject areas have need for streaming video. The Kanopy PDA model is best suited for libraries with limited budgets. The model is also suited for libraries that value access over ownership and/or prefer requests for streaming videos in specific subject areas. The CU Libraries decided to replace their Alexander Street Press EBA platform with individual Academic Video Online: Premium (AVON) subscriptions and to continue with the Kanopy PDA platform for another year. Their next steps will include devising a license management workflow and electronic resource management (ERM) tracking. There were a few questions that centered on workflow issues, and a comment that maintaining two different platforms seemed like a lot of work. Spratt acknowledged the sentiment and noted that the CU Libraries were not prepared for how challenging usage data collection would be. Finally, Spratt described the workflow process for introducing MARC records first into the catalog and then adding them to the discovery layer.

The Future of Information Literacy in the Library: An Example of Librarian/Publisher Collaboration
Rebecca Donlan, Florida Gulf Coast University
Stacy V. Sieck, Taylor and Francis
Reported by: Stephanie Spratt

Taylor & Francis (T&F) is putting more focus on content and services to aid in information literacy (IL) instruction. To demonstrate this, Stacy Sieck of T&F partnered with Rebecca Donlan of Florida Gulf Coast University (FGCU) in a collaborative project to update and rebrand the library's IL instruction efforts. They co-presented a poster session, "Stop, Collaborate and Listen," at the 2015 Charleston Conference and presented an informational session at NASIG. Librarians at FGCU are academic faculty and have established relationships with other campus faculty through liaison and committee work. Using the ACRL Framework for Information Literacy for Higher Education and focusing on undergraduate research, the FGCU librarians did their best to provide quality IL instruction. Finding that just-in-time instruction was more beneficial than just-in-case instruction, the librarians shared their findings with faculty. The librarians and the writing center faculty collaborated to propose improvements to the curriculum that resulted in a partnership and a requirement for all students to participate in IL activities throughout their programs. The FGCUScholars: Think, Write, Discover program was developed to improve IL instruction. The library and writing center faculty created a rubric incorporating critical thinking and IL components that identified benchmarks for students to meet throughout their college careers, including a capstone project intended to be met by graduation.
However, current students had difficulty meeting the benchmarks and milestones indicated on the rubric. The goal of the current project is to overhaul the IL instruction program to improve the results of incoming students as they progress toward graduation. T&F is collaborating with the FGCUScholars program to develop a literacy toolkit using webinars, instructional materials, a website, and in-person workshops. This toolkit will be designed to help students achieve the benchmarks defined in the FGCU rubric. T&F was interested in developing an IL program after holding a forum with librarians in March 2015. During that forum, T&F discovered that IL instruction is a shifting and challenging responsibility for librarians. The launch of the updated IL instruction program is planned for fall 2016. In order to be successful, the collaborators noted that faculty buy-in is essential, timing is important, and marketing will need to be used to build interest. Additional components of the new IL instruction program include partnering with FGCU's undergraduate research journal and getting student work into FGCU's institutional repository. They plan to assess the program after five years.

Juggling a New Format with Existing Tools: Incorporating Streaming Video into Technical Services Workflows
Jennifer Leffler, University of Northern Colorado
Reported by: John Kimbrough

"Dealing with streaming video can feel like you're juggling fire," warned Jennifer Leffler at the start of her presentation. Format complexities, copyright questions, authentication issues, and user expectations are just some of the difficulties posed by streaming video. Leffler outlined the existing workflows for streaming videos at the University of Northern Colorado (UNC), and then described some of the challenges encountered by UNC staff in cataloging videos and making them accessible.

Within UNC's Technical Services Department, streaming video orders are initially entered into the ILS, and then passed to one of the two technical services managers (Leffler and her colleague Jessica Hayden). The managers handle licensing and copyright, seeking permission to stream the video at UNC. Amenable copyright holders and/or vendors provide access to streaming videos in a variety of ways. Some simply grant permission for UNC to locally host and stream the video, either from an existing DVD or a file sent by the vendor. In these cases, technical services staff obtain a DVD copy and arrange to host the file on a local video server maintained at UNC. A second way to provide access is by linking to the video via a vendor's website, YouTube, or Vimeo. Leffler related the case of one copyright holder that granted permission, then sent 100 user/password keys to a password-protected Vimeo video, leaving technical services staff the task of distributing and managing keys. Once access to the video is obtained, the order is paid and the video is cataloged. Many streaming video permissions are only granted for a finite period, such as one year or three years. To track expiration dates, UNC makes entries for streaming videos in their ERM, and uses existing ERM workflows to generate reminders when videos are up for renewal.

Leffler posed several questions about streaming video processing for audience discussion, using some of the issues that had arisen at UNC while developing workflows:

- Are multi-year video leases treated as monographs or serials? (UNC treats them as monographs.)
- If the library acquires a title in both streaming video and DVD, are these formats cataloged together or separately? (UNC catalogs them separately.)
- Should libraries track streaming video usage, and if so, how much of a video has to be watched to "count" as usage? (Some legitimate uses could be quite brief, such as scene studies in a theater class.)

Providing discovery of and access to streaming video is an ongoing challenge. At UNC, all videos are cataloged, either with vendor-supplied MARC records or original cataloging. UNC inserts local descriptors into streaming video records (e.g., "sv" prepended to the call number to help identify streaming videos). Although UNC's discovery layer tool can ingest MARC records, the process strips away some of the format-specific information, making it difficult for users to find videos. In addition, some knowledgebase vendors have worked directly with video providers to ensure their entire inventory is available in discovery tools, posing difficulties for libraries who only subscribe to a selection of the provider's content. Much like a novice juggler, managing streaming video can initially feel like an exercise in dropping balls. However, according to Leffler, things do get better with practice. The days when we can juggle streaming videos with aplomb and ease may be far off, but sharing ideas helps make progress towards that goal.

Knowledgebase at the Center of the Universe
Kristen Wilson, North Carolina State University
Reported by: Sanjeet Mann

Conventional wisdom has long held that bibliographic records are the most important resource for describing library collections, and that the catalogs that contain them are the preeminent library system, central to all workflows. However, much as the Copernican revolution transformed views on the natural world and social order by demonstrating that the Earth orbited the Sun, so too is the prominence of electronic resources leading to a paradigm shift in the way we think about library systems. Kristen Wilson, Associate Head of Acquisitions and Discovery at North Carolina State University Libraries, has distilled this new thinking into a forthcoming Library Technology Reports issue. At this session she shared her research with NASIG, explaining why knowledgebases have supplanted the catalog as the crucial library system undergirding patron discovery and staff workflows. She also surveyed the current state of knowledgebases and reported on efforts to make them even more collaborative and global in scope.

Wilson defines a knowledgebase as "structured data describing the institutional collection and how to access it." Knowledgebases combine descriptive metadata about an information resource (such as the title or a publication date range) with acquisitions information (such as the package in which it was sold or the library's subscription entitlement). Knowledgebases exceed the capabilities of the traditional catalog by blending global data true for all libraries with local data specific to a given institution. Wilson offered an example by comparing a knowledgebase record for Serials Review to the corresponding bibliographic record, which lacks information about previous providers, perpetual holdings, and alternative access through aggregators. Because they are aware of resources in a global and local context, knowledgebases serve as an "identity broker" that orchestrates the proper function of other library systems. The centrality of knowledgebases makes their maintenance and design all the more important.
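As a rough illustration of the global-plus-local blend Wilson describes, a single knowledgebase entry can be imagined as a small structure pairing shared descriptive fields with institution-specific acquisitions data. The sketch below is only schematic: the field names are loosely modeled on the KBART recommended practice mentioned below, all values are invented, and it is not Wilson's data model or any vendor's actual schema.

```python
# Schematic sketch of a knowledgebase entry: global descriptive data shared by all
# libraries plus local data specific to one institution. Field names are loosely
# modeled on KBART; every value is invented for illustration.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class GlobalTitleData:
    publication_title: str
    online_identifier: str              # e.g., an eISSN (placeholder value below)
    date_first_issue_online: str
    title_url: str
    package_name: str

@dataclass
class LocalHoldingsData:
    subscribed: bool
    perpetual_access_from: Optional[str]         # portion owned outright, if any
    previous_providers: list = field(default_factory=list)
    aggregator_access: list = field(default_factory=list)
    fund_code: Optional[str] = None              # acquisitions detail a catalog record would not carry

@dataclass
class KnowledgebaseEntry:
    global_data: GlobalTitleData
    local_data: LocalHoldingsData

entry = KnowledgebaseEntry(
    GlobalTitleData(
        publication_title="Serials Review",
        online_identifier="0000-0000",           # placeholder, not the real identifier
        date_first_issue_online="1975-01-01",
        title_url="https://example.com/journal/serials-review",
        package_name="Example Publisher Journals Complete",
    ),
    LocalHoldingsData(
        subscribed=True,
        perpetual_access_from="2005",
        previous_providers=["Previous Platform Co."],
        aggregator_access=["General Aggregator Database"],
        fund_code="SER-EJNL",
    ),
)
status = "subscribed" if entry.local_data.subscribed else "not subscribed"
print(entry.global_data.publication_title, "-", status)
```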
Wilson reviewed the metadata supply chain connecting content providers (who create and sell metadata), knowledgebase vendors (who normalize metadata), and libraries (who display and help troubleshoot metadata). In practice, these roles are blurred; the proliferation of competing knowledgebases leads to duplicated effort for content providers and libraries alike; and erroneous titles, holdings, and identifiers trigger frequent linking errors. Fortunately, widespread adoption of the NISO KBART recommended practice is helping to make knowledgebases more accurate. By examining case studies of how various proprietary vendors and open source initiatives are developing their knowledgebases, Wilson was able to identify trends in knowledgebase design. Knowledgebases are expanding to include more kinds of information content and track changes in content over time; they are leveraging APIs to make themselves interoperable with many other systems; they encompass both central management and support for library-specific holdings; and they are opening themselves up to allow customers to collaboratively contribute and edit the metadata. For example, the KB+, BACON, and ERDB-JP knowledgebases all originated in consortia and contain highly curated metadata, with provisions for partners to correct any errors they find.

Wilson closed with the observation that knowledgebase metadata seems to naturally lend itself to being maintained at multiple levels. For example, there could be global data on publishers, packages, and standard license terms; national or consortia-level data on shared packages and licenses; and local data on institution-specific holdings, pricing, and negotiated license terms. Doing so would move these systems toward the infinitely flexible, all-encompassing, and "self-sustaining" global knowledgebase envisioned by Ross Singer.

Managing Content in EBSCO Discovery Services: Action Guide for Surviving and Thriving
Regina Koury, Idaho State University Library
Charissa Brammer, Idaho State University
Reported by: Emily Ray

Regina Koury, from Idaho State University, spoke about her experiences with EBSCO Discovery Services (EDS). (Her presentation partner, Charissa Brammer, could not attend the conference.) Koury began by outlining the size of Idaho State University (14,371 students and thirty-nine faculty and staff in the library) and the transitions of her department's name from Technical Services to Content Management to Resource Discovery Services. Most of the session addressed her library's experience with EDS and specific issues they resolved.

E-book records from their Voyager catalog were not loading into EDS, although records from EBSCO collections could be loaded. However, EBSCO collections' records either displayed no concurrent user information, or the concurrent user information appeared so low at the bottom of the page that patrons did not notice it. Working with EBSCO support, they set up filters to prevent loading records into EDS when the 856 field contained "Netlibrary," the 049 contained "N$T," and the 938 contained "ebsco." With the filters in place, the library's catalog records for EBSCO e-books loaded into EDS. This process took about two weeks.
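As a rough sketch of the kind of record filtering described above, the snippet below reads a file of MARC bibliographic records and keeps only those that do not match the exclusion criteria (856 containing "Netlibrary," 049 containing "N$T," or 938 containing "ebsco"). It assumes the pymarc library and a hypothetical input file name; the actual filtering at Idaho State was configured with EBSCO support on the EDS side, so this is only an illustration of the logic, not their implementation.

```python
# Sketch: filter a MARC export so that records matching the EDS exclusion rules
# (856 with "Netlibrary", 049 with "N$T", 938 with "ebsco") are not loaded.
# Assumes the pymarc library and a hypothetical input file name.
from pymarc import MARCReader, MARCWriter

EXCLUDE_RULES = {          # tag -> substring that marks a record for exclusion
    "856": "netlibrary",
    "049": "n$t",
    "938": "ebsco",
}

def should_exclude(record):
    """Return True if any listed field contains its exclusion substring (case-insensitive)."""
    for tag, needle in EXCLUDE_RULES.items():
        for marc_field in record.get_fields(tag):
            if needle in marc_field.value().lower():
                return True
    return False

with open("catalog_export.mrc", "rb") as src, open("eds_load.mrc", "wb") as dst:
    writer = MARCWriter(dst)
    for record in MARCReader(src):
        if record is not None and not should_exclude(record):
            writer.write(record)
    writer.close()
```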
There were also some issues with a few databases. For example, widgets for Ovid and Natural Medicine did not appear in EDS, so they decided to load MARC records for these resources into EDS. They considered a similar process for Clinical Key, but the content is now available in EDS. Following a request from public services librarians, videos were removed from their EDS indexing and were no longer visible to patrons. In addition to outlining the issues and workflows to resolve them, Koury discussed attitudes toward EDS and discovery tools in general among public services and technical services librarians; since implementation of EDS, library staff are more in favor of discovery tools. Koury listed ways to receive information from EBSCO, including the EDS content newsletter, the EDS partner listserv, the EDS blog, and the EDS wiki (which requires a login). For customer service, she was happy with the engineering team, but there have lately been some issues with general support. She was optimistic, however, and hoped that her recent issues were due to changing roles and would improve. She reported that her institution prefers EBSCO's LinkSource and EDS over SFX and Primo. In answering questions, Koury detailed how the Library uses a Google Form ticketing system that is sent to several individual emails for troubleshooting. They have not yet started weeding e-books from their catalog or from EDS; Koury noted that content must be deleted in three places to remove it fully from EDS. For Open Access content, they loaded Project Gutenberg titles, but there were so many updates that they deactivated this collection. For the Directory of Open Access Journals (DOAJ), there have been some problems, but they are retaining those journal titles in EDS.

Master of "Complex and Ambiguous Phenomena": The Electronic Resource Librarian's Role in Library Service Platform Migrations
Conor Cote, Montana Tech of the University of Montana
Kirsten Ostegaard, Montana State University
Reported by: Sanjeet Mann

When Conor Cote and Kirsten Ostegaard polled the audience at the beginning of their NASIG session, nearly everyone in the room was either contemplating a library service platform (LSP) migration or had recently completed one, and many were migrating as part of a consortium. System migrations are disruptive for any single library; one audience member likened the experience to changing the wing of an airplane while flying it. Libraries that choose to migrate as a consortium face added complexity, and typically their electronic resource librarians (ERLs) are caught in the middle. At this session, Cote and Ostegaard used the NASIG Core Competencies of Electronic Resources Librarianship to explain how their consortial migration has affected their work, and facilitated discussion with audience members on the communication, project management, and time management strategies needed to achieve a successful migration. TRAILS, a diverse consortium of Montana academic, special, and tribal college libraries, includes Montana Tech (a 2,500 FTE engineering and science campus in Butte, part of the University of Montana, where Cote works as electronic resource librarian) and Montana State University (a 15,000 FTE land-grant university in Bozeman, where Ostegaard is electronic resources and discovery librarian). The consortium recently chose Alma as its new LSP, concluding contract negotiations in May 2016 and committing all members to undertake a migration before their existing ILS contracts expired.
To manage the migration, the consortium set up three types of project teams: "functional teams" composed of experts from various libraries in five areas such as "discovery" or "e-resources"; a "core team" containing the leaders of each functional team (and a few others); and primary contacts from each library in TRAILS (usually the director). Teams used Basecamp to manage key documents and communicated via email and recorded webinars. Cote used OneDrive for Business to share documents and archived key emails in a shared OneNote notebook. He also served as Montana Tech's primary liaison with Ex Libris, with responsibility for submitting support tickets on behalf of all departments in the library. Cote and Ostegaard both cited time management as a challenge; they negotiated reduced workloads and wrapped up competing projects in order to focus on the migration. Audience members from other consortia undertaking LSP migrations reported similar experiences. Research literature shows that LSP migrations require buy-in from every department in a library; consortial migrations also require trusted relationships between institutions and the leveraging of shared experience and resources. E-resource and systems librarians are disproportionately affected; one recent study estimated that they fielded a quarter of the problems that arose during a migration. Ostegaard and Cote examined how each of the seven Core Competencies can help an ERL participate in a system migration:

1. Life Cycle. Tracking resources throughout their life cycle gives the ERL enough familiarity with library operations to serve as a bridge between departments, or between the library and the system vendor.
2. Technology. The ERL's technical knowledge is necessary to orchestrate hardware and software changes, train staff, and communicate with external stakeholders.
3. Communication. Once begun, an LSP migration moves with surprising speed. The ERL must keep up with changes and communicate in multiple directions: "up" to management (especially regarding potential problems), "down" to all staff, and "across" to teammates.
4. Research and Assessment. Migrations test the ERL's analytical skills by offering plenty of problems to solve. Audience members shared that the learning curve remains steep for the first year after going live.
5. Supervision and Management. ERLs involved in a systems migration may find themselves influencing and managing people over whom they have little formal responsibility. Cote remarked on the need to share a sense of urgency with project teams, while setting realistic deadlines that give them sufficient time to respond. Ostegaard commented on the need to translate policies and redesign workflows to suit the new system.
6. Trends and Professional Development. LSPs have a rapid development cycle and continue to add new functionality even as staff are being trained on the system. ERLs can use release notes, listservs, and peer advice to keep up with the changes.
7. Personal Qualities. Cote and Ostegaard highlighted emotional intelligence as a key skill for ERLs involved in a migration. "Leading with respect," empathizing with anxious staff, and establishing guiding principles for how the migration will benefit end users can help ward off the phenomenon of "emotional hijacking" that might otherwise foment staff resistance.
Libraries in the midst of an LSP migration may be tempted to liken the experience to navigating an obstacle-ridden skijoring course, as one audience member did when Ostegaard included a slide on this popular Montana pastime (in which a person on skis is pulled by a horse). However arduous the process, Cote and Ostegaard concluded that ERLs are well positioned to help pull their libraries through, as long as they act with respect, stay goal oriented, and communicate transparently.

Open Access in the World of Scholarly Journals: Creation and Discovery
Sandra Cowan, University of Lethbridge
Chris Bulock, California State University Northridge
Reported by: Shona Toma

Sandra Cowan and Chris Bulock brought together the issues faced when advocating for the creation of Open Access (OA) content and the discovery and access issues posed by OA content in hybrid journals. First, Cowan summarized the current status of OA content. She presented stark figures demonstrating that the current subscription model is unsustainable for libraries: the increasing costs of commercially published journals are damaging monograph budgets and even impacting the ability to hire new staff. Cowan described how Canadian institutions are seeking to overcome this crisis. Assessing which journals are absolutely critical has served as useful leverage in negotiations, particularly in breaking down "big deal" journal publication packages. She asserted, however, that the best solution is to diminish the power that commercial publishers have over libraries. Cowan gave a very useful overview of OA policies and initiatives in Canada, including the University of Lethbridge's Journal Incubator (http://www.journalincubator.org/), and discussed the obstacles and incentives for OA publishing. She called on librarians to lead by example, to advocate for positive OA publishing and policies, and to demonstrate the many benefits of OA to our academic colleagues. Bulock spoke more specifically about hybrid journals and the many reasons why they are problematic. A hybrid journal gets funding in two ways: it charges subscription fees and also offers authors the option to pay to make their articles OA. Bulock identified reasons why hybrid journals are a popular choice: publishing in one satisfies many OA mandates while retaining the "prestige" element required for promotion and tenure, since these remain subscription journals. For the library, hybrid journals are a particular challenge to integrate with OpenURL link resolvers and discovery layers. Bulock explained that within a hybrid journal, it is difficult to determine which content is accessible to the library; if the library doesn't index the Open Access articles, the user is probably getting better results by searching Google. NISO Access and License Indicators offer an article-level indicator in the metadata; however, Bulock revealed that many publishers of hybrid journals are not using them, or are not implementing them correctly. There is a high volume of research published in hybrid journals, particularly in the UK, and this content therefore needs to be accurately indexed. Bulock concluded with suggestions for librarians faced with this challenge: discuss the issue with your discovery and content providers, and advocate for the proper use of the NISO indicators.
Remain in Safe Mode or Embark on a New Horizon? A Reconsideration of an Academic Library's Current OpenURL Link Resolver Service
Rachel Erb, Colorado State University Libraries
Reported by: Sanjeet Mann

After nearly thirteen years running Ex Libris SFX link resolver software, Colorado State University (CSU) Libraries decided in early 2015 that it was time for a change. Within the department, organizational restructuring and staff reductions had combined to leave the electronic resource management librarian, Rachel Erb, with only one staff member to assist with e-resource management, even as Erb's role shifted away from troubleshooting and knowledgebase management toward licensing and vendor negotiations. Outside the department, the vendor marketplace for link resolvers had changed considerably, and the CSU library system was looking to integrate operations across its three campuses. Conditions were ripe for change; however, as Erb shared in this NASIG session, the process led her and her colleagues in a direction they could not have predicted. The search began in March 2015, when library deans created a committee to identify the pros and cons of alternative link resolvers, gather price quotes, recommend the best system, and propose workflow recommendations and an implementation timeline. Erb chaired the committee, which also included representatives from library systems, academic computing, and a subject librarian. They had only six months to complete their work, so they tracked milestones using Only Office project management software. After brainstorming a list of ideal features, the team drew up a short list of four OpenURL providers (including Ex Libris) and compiled a forty-five question Request for Information (RFI). Vendors were asked to comment on their capacity to provide training and technical support, compliance with industry standards, MARC record and usage reporting functionality, customizability of the public interface, product development goals, and overall cost. Erb sent the RFI to vendor contacts and answered countless follow-up questions. Vendor responses took over three months to arrive and were tracked in a spreadsheet. Three of the four vendors looked promising, so the team scheduled them to give ninety-minute product demonstrations and invited the whole library. A brief three-question survey collected feedback from library staff who attended the demos. These meetings helped the project team identify a preferred finalist. At this point, the unexpected happened: library leadership revisited the work of two dormant task forces that had been researching next-generation ILS and discovery services, and decided to migrate to Ex Libris Alma and Primo. Erb had served on both task forces and recognized that the Ex Libris products would meet those needs; however, the decision also obliged the e-resource department to stay with SFX as the link resolver of the future. The migration project expanded to include other library departments, now that it was an ILS migration instead of an OpenURL migration. The core project team began holding twice-weekly meetings, produced monthly reports for library management, and convened monthly meetings for all library staff. Documents were shared through OneDrive and project materials distributed through Basecamp. Implementation proceeded in three stages, beginning with a planning and data cleanup phase scheduled to last through July 2016. Staff scoured the Ex Libris documentation for ideas when they realized that ERM and order records in the existing Innovative Millennium system could not be easily imported into Alma. They converted records to XML where possible, and developed a creative workaround involving Create Lists and spreadsheets to address records that could not be converted.
They are also working with campus IT staff to replace the library's expiring MetaLib subscription with an easier way for patrons to access subscription databases. Ex Libris staff will take the lead in the second implementation phase, scheduled to occur before December 2016. This phase includes configuration of system options, the actual transfer of data to Alma, and going live with the new systems. The entire year of 2017 has been dedicated to post-implementation work, which will likely entail extensive troubleshooting, data cleanup, and further system configuration. While CSU Libraries' e-resource department is still using the same system under which it began its investigation, the outcome can hardly be considered a regression to "safe mode." Researching OpenURL systems taught Erb and her colleagues a great deal about the systems marketplace and helped them develop a holistic approach to library systems integration. Since changes in any one system ripple across other systems, Erb recommended that libraries interested in replacing their OpenURL resolver instead broaden their view and reconsider their entire ILS. Erb closed by encouraging audience members contemplating the new horizons offered by a replacement ILS to "expect the unexpected" and stay nimble throughout their journey.

Shaping Expectations: Defining and Refining the Role of Technical Services in New Resource Rollouts
Jeff Mortimore, Georgia Southern University
Debra Skinner, Georgia Southern University
Reported by: Linda Smith Griffin

Mortimore and Skinner presented on how the technical services department at their library has taken an active, front-facing role in improving public communication strategies and promoting new and existing resource rollouts to the library and university community. The presenters noted that prior to the creation of the "New Resource Rollouts Protocol," the library's messaging was inconsistent and contributed to a series of internal problems between technical and public services, and external issues between the library and patrons. They also noted that technical services is well suited to lead communication activities because communication begins at the point of acquisition and setup. Knowledge of and familiarity with resources enable technical services librarians to provide consistent messaging for liaison librarians; in turn, faculty will be better positioned to promote the new resources and increase student buy-in and use. Attendees were given copies of the protocol, which contained a detailed communications timeline, and of a rollout template that highlighted the entire messaging process. The protocol is conducted in three stages and requires coordination between technical and public services. The first stage, Trial and Adoption, is the beta period during which most configuration work is conducted to ensure that the resource is functional. This occurs two weeks prior to the first go-live announcement; it is during this trial period that the resource is activated and can be discovered before the actual go-live date. The second stage, Go-Live Announcement and Go-Live Two-Week Notice, is the actual launching of the product. Final testing is completed and support materials are created for the resource. Liaisons are notified that the new or existing resource will be promoted to the public in two weeks; at this stage the focus is on giving liaisons time to become familiar with the resource prior to promoting it to the public.
Specifically, liaisons are given time to train, test, submit corrections, and request additional support. A week before the product is launched, several documents are drafted, including the external FAQ post; a faculty read-copy of talking points in language liaisons can use to communicate about the resources with faculty; the blog announcement; and the faculty announcement regarding liaison training. The internal FAQ is also finalized and released, and a go-live reminder is sent to liaisons. Stage three focuses on the public release and includes an official go-live date. This stage includes revision of the internal FAQ post; finalization and release of the external FAQ, faculty read-copy, blog announcement, and faculty announcement; and the beginning of liaison training. Public promotion and support begin, and liaisons and the promotion committee take over. At the conclusion of the session, the presenters shared the impact, lessons learned, and future directions. They noted that the new resource rollouts protocol has improved the relationship between technical and public services and is contributing to a unified customer experience that clearly shows technical services is public service. Next steps include looking at cancellations (rollbacks), publicizing FAQs, increasing public services' support autonomy, and expanding assessment. Since the protocol's implementation, there have been thirty-three new resource rollouts. The success of this technical services initiative has merit for the University System of Georgia Libraries.

Show Me the Value!
Matthew Harrington, North Carolina State University
Reported by: John Kimbrough

What is your serial ROI? In recent years many librarians have asked, or been asked, to measure return on investment (ROI) for their serial subscription purchases. Consortial arrangements introduce additional complexity for ROI assessment, as both journal package costs and ROI data may be spread across multiple libraries. For the past few years, Matthew Harrington has developed and maintained a Microsoft Access database to measure ROI for the Triangle Research Libraries Network (TRLN), a consortium of four libraries including North Carolina State University (NCSU). Harrington chose to work in Microsoft Access for its easily understood graphical user interface and its ability to handle multi-dimensional data (e.g., from multiple libraries, in multiple years, and/or drawing from multiple sources). The goal was to produce a tool that would show metrics for a given journal package. Collections librarians and other users could define their own standard of value (e.g., a certain cost per use) and use the ROI database for queries such as: Does a package meet this standard? How has the package performed in the past? Would we get a better score with a different mix of titles? The ROI database includes a variety of data: title prices, package costs, usage data, bibliographic metadata, coverage dates, and impact factors. Working with multiple libraries and multiple branches makes data collection especially challenging. Harrington used a combination of linking ISSN (ISSN-L) (http://www.issn.org/understanding-the-issn/assignment-rules/the-issn-l-for-publications-on-multiple-media/), institution, and year to uniquely identify data, but "linking data is never a straightforward process," he noted.
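To make the composite key and the cost-per-use queries concrete, here is a minimal sketch using SQLite rather than Access. Harrington's actual database design is not public, so the table name, column names, threshold, and rows below are all illustrative assumptions.

```python
# Hypothetical ROI-style schema keyed by (ISSN-L, institution, year), plus a
# query for titles in a package whose cost per use exceeds a local threshold.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE subscriptions (
    issn_l      TEXT,
    institution TEXT,
    year        INTEGER,
    package     TEXT,
    cost        REAL,
    uses        INTEGER,
    PRIMARY KEY (issn_l, institution, year)
);
""")
conn.executemany(
    "INSERT INTO subscriptions VALUES (?, ?, ?, ?, ?, ?)",
    [   # illustrative rows only
        ("1234-5678", "NCSU", 2015, "Wiley", 950.00, 412),
        ("1234-5678", "Duke", 2015, "Wiley", 890.00, 57),
        ("2345-6789", "NCSU", 2015, "Wiley", 1200.00, 3),
    ],
)

threshold = 25.0   # a locally defined standard of value (cost per use)
rows = conn.execute("""
    SELECT issn_l, institution, year,
           cost * 1.0 / MAX(uses, 1) AS cost_per_use
    FROM subscriptions
    WHERE package = 'Wiley' AND cost * 1.0 / MAX(uses, 1) > ?
    ORDER BY cost_per_use DESC
""", (threshold,)).fetchall()
print(rows)
```

Because institution and year are part of the key, the same data can be grouped or filtered to produce the consortium-wide, member-level, and title-level views described below.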
TRLN currently uses the ROI database for two packages: Springer and Wiley. Springer is a "true shared collection" in TRLN, with a single package and cost shared among consortium members. Wiley holdings are more complex; each TRLN member has its own set of Wiley journals, often a combination of a Wiley package and individual subscriptions. These different journal title mixes made for 1,500 titles and 24,000 subscriptions over the six years of available data. Harrington used a demonstration version of the ROI database to show several possible views of a package's data. A "TRLN view" displays consortium-wide pricing, savings over list price, total usage, cost per use, and titles falling outside the Wiley collection package. Each member can display its own annual data at the branch level, and institutions can compare data with other members through views such as overlap analysis, cost per title, or cost per use. Individual titles can also be selected and their subscription information displayed, along with impact factor and usage. The database also includes subject-level views of cost data based on LC class. Librarians can set limits, such as a minimum number of uses per year or a maximum cost per use, and the database will display the number of journals that meet the limit. In the future Harrington hopes to automate additional features, such as data integrity checks (e.g., titles with no list price) and easier ingestion of annually produced data, such as COUNTER reports. During the Q&A session, current and former members of the TRLN collections committee, the primary user group of Harrington's ROI database, noted the tool had been very helpful for evaluating journals and determining the savings of package deals over individual subscriptions.

To Lead to Learning, Not to Madness: E-Books & E-Serials at the Library of Congress
Dr. Theron Westervelt, Library of Congress
Reported by: Jamie Carlstone

Dr. Theron Westervelt, a supervisor at the Library of Congress (LC), discussed the implementation of a system for e-book and e-journal deposit at LC. Westervelt's presentation addressed the challenges and benefits of electronic deposit for both e-journals and e-books, and focused on how LC uses its established relationships with publishers to broaden collections to include digital files. The challenges are particularly great at LC, where the mission is to create a rich and diverse collection for the American people. LC has done this successfully in the past with print; however, there is nothing in the mission statement that says, "Forget the digital stuff." Collecting intellectual content is key, regardless of format. In 2004, there were about 150,000 e-books in LC's collection; by 2013, there were over 900,000, and the e-book and e-journal collections continue to grow each year. In 2004, over 15% of the serials that began that year had an online version. By 2013, this had increased to 40%, and nearly 30% of serials were available online only. To ensure the deposit of online resources, LC took advantage of processes that were already in place for print acquisition, using copyright mandatory deposit for electronic serials and the Cataloging in Publication (CIP) program for e-books. Essentially, LC is taking the same relationships that were there for building print collections and applying them to build electronic collections. Mandatory deposit requires anyone who publishes or widely distributes a creative work in the United States to send two best copies to the Library of Congress. This has been an integral way LC has built its collections since the late nineteenth century.
In the late 1980s, as creative output began to appear in digital form, an exemption to the deposit law was written for nonprint materials; at the time, the future of online publishing was uncertain. In February 2010, LC made an exception to that exemption, beginning with e-serials: mandatory deposit must now be made for e-journals that are published online only. LC is in the process of extending that exception to e-books and digital sound recordings, and hopes to have it written into the regulation by the end of the year; by the end of 2017, LC will receive books and music that are only digitally distributed. The Cataloging in Publication (CIP) program at LC is used to deposit e-books. This program has been in place for four years and is an agreement between LC and publishers: publishers send LC galley copies, LC does as much cataloging as possible, and publishers then use the cataloging metadata for publication. LC decided to create the metadata and take advantage of the already existing CIP relationships to build the e-book collection. Publishers were very interested, and nearly two hundred signed up for the program; about 4,000 e-books have been acquired this way. One of the main challenges of digital deposit is file formats. LC has received eighty-seven different file extension types, which presents many challenges for file management across the digital life cycle. LC invested in Signiant Media Exchange, which handles file uploads and metadata and provides a landing space on the Library's side of the workflow. LC also uses Delivery Management Services, which handles digital files as if they were print material, making the acquisitions workflow easier. LC also developed recommended format statements, because it has to consider the digital life cycle and the potential future costs of managing obsolete formats. The program will expand in the future to include foreign publishers. LC is still in the early days of this process and is still figuring out how to navigate its many challenges. However, these challenges are faced by everybody: libraries, authors, and publishers; and everybody has a common interest in ensuring there is a model that allows for the creation, distribution, preservation of, and access to creative work.

Using Course Syllabi to Develop Collections and Assess Library Service Integration
Ria Lukes, Indiana University Kokomo
Angie Thorpe, Indiana University Kokomo
Reported by: Melanie J. Church

Ria Lukes and Angie Thorpe began their presentation by stating that it was based on practical research intended to make them better at the job of collection development, and by noting that they are not part of a larger collection development team. They were already using course lists and degree requirements, faculty and student requests, their own judgment, and gaps within the collection to perform collection development, but they wanted a more precise method for assessing the gaps. They decided to approach this by examining course syllabi to assess the gaps in library holdings of required and recommended resources. At Indiana University Kokomo, faculty are required to submit their syllabi to departmental secretaries, which made it possible for Lukes and Thorpe to collect a significant number of them at one time.
After standardizing the resource lists gleaned from the syllabi and assessing the data, Lukes and Thorpe found that books were the most commonly mentioned resource type, but databases, media, periodicals, and legal cases also appeared frequently enough to warrant assessment. Assessment included looking at library holdings and usage. In determining whether or not the library provided access to the books listed on the syllabi, one factor that needed to be accounted for was the library's policy of not purchasing textbooks. Because many of the books listed on syllabi are textbooks, the high percentage of titles to which the library did not provide access (87%) is not as problematic as it would be if the library collected textbooks. The range of media listed on syllabi, which included PBS videos, YouTube, C-Span, and Rotten Tomatoes, also made the share the library did not provide access to fairly high (79%). In analyzing usage, Lukes and Thorpe noted that print journals, e-journals, and e-books mentioned on syllabi did not have significantly higher usage than other titles in the same formats. In addition to the resources listed, Lukes and Thorpe noted a number of surprising things in some of the syllabi. Specifically, none of the faculty sent students to any streaming video available from the library. They also found outdated language prohibiting the use of "Internet" resources. Some of the faculty's suggestions for how to do research in Google were troubling, and the library was infrequently mentioned as a place for research; more often it was described as a place to get a laptop, a place to study, or the location of tutoring and other services. Lukes and Thorpe also learned of a twenty-five-page, research-intensive paper assignment that was not reflected in any of their reference transactions. Based on their analysis, Lukes and Thorpe have made plans to improve collections and services. They intend to do outreach to individual faculty, use known assignments to develop library courses to embed in the learning management system, and identify underutilized online resources in order to decide whether to cancel or promote them. Their final thoughts were largely about project planning: they advised attendees to invite buy-in before beginning, to prioritize, and to define who is leading the project along with its goals and boundaries. They also encouraged people considering this type of project not to lose track of their dream goals.
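At its core, the holdings gap analysis Lukes and Thorpe describe is a set comparison between syllabus-cited resources and library holdings. The sketch below shows one way such a check might be scripted; the file layout, column names, and identifiers are hypothetical and do not represent their actual workflow.

```python
# Compare a standardized list of syllabus-cited resources against a set of held
# identifiers and report the share not held, per resource type. Assumes a CSV with
# "identifier" (ISBN/ISSN/URL) and "resource_type" columns; both are hypothetical.
import csv
from collections import defaultdict

def coverage_by_type(syllabus_csv, held_identifiers):
    cited = defaultdict(set)
    with open(syllabus_csv, newline="") as fh:
        for row in csv.DictReader(fh):
            cited[row["resource_type"]].add(row["identifier"].strip())
    report = {}
    for rtype, ids in cited.items():
        missing = ids - held_identifiers
        report[rtype] = (len(missing), len(ids), round(100 * len(missing) / len(ids)))
    return report  # e.g., {"book": (missing, cited, percent_not_held), ...}

# held = set of identifiers exported from the catalog or knowledgebase
# print(coverage_by_type("syllabus_resources.csv", held))
```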
Reported by: Shannon Regan

Xiaoyan Song's presentation detailed the challenges of managing e-books and e-book packages. Using the metaphor of Legos, the talk started by looking at how different systems, workflows, and individuals contribute to building a dependable process for the acquisition, access, and management of e-books. By reviewing the existing e-resources acquisition workflow and the systems used to manage it, the team at NC State identified needs that were not met by the current process. Song described how their approach of using a knowledgebase, traditional ILS, discovery system, and ERMS left gaps in their ability to manage licenses, title lists, administrative information, requests from collection management, and access. NC State implemented the following new tools to address many of these gaps:

- CORAL, to manage e-book acquisition workflows
- An internal wiki site (an e-resource hub) to capture all administrative information about e-book packages
- An e-book reconciliation database, built in MS Access, to provide title list support

Song ended the talk with some suggestions for those looking to improve the management of e-books: evaluate existing systems for what they can and cannot do, focus on the needs that are not met, and explore other solutions to address those needs.
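As an illustration of the kind of title list support a reconciliation database provides, here is a minimal sketch that compares a vendor-supplied title list against the titles activated in the knowledgebase. The file and column names are hypothetical; this is not NC State's actual Access design.

```python
# Report titles entitled under the package but not activated for discovery,
# and titles activated but not actually entitled. Column "eisbn" is hypothetical.
import csv

def load_identifiers(path, column):
    with open(path, newline="") as fh:
        return {row[column].strip() for row in csv.DictReader(fh) if row[column].strip()}

vendor = load_identifiers("vendor_title_list.csv", "eisbn")       # entitled titles
activated = load_identifiers("kb_activated_titles.csv", "eisbn")  # live in discovery

print("Entitled but not activated:", sorted(vendor - activated))
print("Activated but not entitled:", sorted(activated - vendor))
```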

