2015 Conference Evaluation Report

NASIG Newsletter, Sep 2015

Submitted by the 2015 Evaluation and Assessment Committee: Bridget Euliano (chair), Derek Marshall (vice-chair), Melody Dale, Michael Fernandez, Kathryn Johns-Masten, Jane Smith, and Kathryn Wesley

231 surveys were submitted by the 380 conference attendees. Survey respondents could enter a name and email address for a chance to win a $50 gift card; Nancy Bennett from Carroll University was the winner. Below is a summary of the survey results.

Conference Rating

Respondents were asked to give ratings on a Likert scale of 1 to 5, with 5 being the highest. The overall rating of the 2015 conference was 4.28, a bit lower than in previous years.

[Chart: overall conference rating by year, 2012-2015.]

Facilities and Local Arrangements

Geographic Location

The 2015 rating was 4.3, a slight decline from the 2014 location of Fort Worth, which rated a 4.42. However, this year's rating was higher than Buffalo's rating of 3.72 and Nashville's rating of 3.89 in 2013 and 2012, respectively.

Fifty-nine comments about local arrangements and facilities were entered on the survey, mentioning a variety of issues. Meeting room space appeared to be a large factor, with several attendees noting that the rooms were either too small or too large for particular sessions. Several respondents also mentioned that the conference was not in Washington, D.C. proper and that there was an overall lack of easy access to tourist destinations. There were many compliments on the food and hotel service; however, a few comments concerned the proper labeling of food for those with allergies.

[Chart: local arrangements ratings by category (geographic location, meeting rooms, hotel rooms, meals, breaks, social events).]

The conference website received a weighted average rating of 4.18. The conference blog was rated less highly at 3.77; many commenters noted they did not take advantage of the blog.

NASIG-SSP Joint Meeting

Prior to the Opening Session, the 2015 NASIG conference featured a special joint meeting between NASIG and SSP (Society for Scholarly Publishing), consisting of three keynote sessions and two other sessions. The joint meeting was well received by NASIG members in attendance: eighty-one percent of respondents said they benefited from attending, and seventy-one percent said they would like to see more joint meetings with other organizations in the future.

Post-Conferences

Eighty-seven percent of respondents noted they did not attend a post-conference.

Vision Sessions

Three vision sessions were part of the 2015 conference. The average overall ratings for the three sessions ranged from 3.89 to 4.10. Dorothea Salo's presentation style was not to everyone's liking, but many praised her talk on user privacy as one that made them really think about an important topic. The comments on Stephen Rhind-Tutt's session expressed passion about open access issues, and many respondents appreciated the questions and discussion his open access views generated. Some commenters felt that Anne Kenney's talk on electronic journal preservation should have been a strategy session rather than a vision session.

Other Sessions

NASIG offered thirty-one concurrent sessions during the 30th annual conference. Twenty-four of those (77%) received an overall rating of 4.0 or higher. The number of sessions offered was lower than at last year's conference in Fort Worth. Most comments were positive or offered specific, constructive criticism of an individual session.
Feedback will be shared with presenters upon request.

2015 marked the third year of the Great Ideas Showcase, formerly called poster sessions. While only four participants were featured in 2014, there were seven in 2015. The overall rating for the Great Ideas Showcase was 3.72. The showcase did not generate many evaluation comments, though some commenters felt it should not have been held at the same time as the snapshot sessions.

The 30th conference was the second year to offer snapshot sessions, "designed for 5-7 minute talks in which projects, workflows, or ideas are presented." There were six sessions, two of which were rated 4.0 or higher. Due to an oversight by the Evaluation & Assessment Committee, there was no comment box for the snapshot sessions.

The survey also asked respondents to rate and comment on ideas for future programming. Comments were entered with general and specific ideas for various types of sessions, and a detailed summary of this feedback will be submitted to the board.

Events

The First Timer's/Mentoring Reception received a rating of 4.37, and an overwhelming 93% of respondents would like to see this event continue. Comments submitted about the event were overwhelmingly positive, praising the mentors and the networking opportunities.

The Business Meeting received a rating of 4.0; however, the comments were varied, and low attendance was noted.

The Vendor Expo received a rating of 3.68, with the majority of survey respondents (88%) wanting to see it continue. Most of the negative feedback concerned the space being too small for the event.

Respondent Demographics [1]

[Chart: respondent demographics by library type; Academic Libraries account for 72%, with the remainder split among Specialized Libraries, Government Libraries, Vendors and Publishers, and Other.]

As in previous surveys, academic library employees continue to represent the largest group of respondents at 72%. This is marginally lower than the 75% share held by academic libraries for the 2014 conference.

Respondents were asked to "describe your work" using as many of the twenty-four given choices as necessary (including "Other"). 2015 marks the second year that "electronic resources librarian" garnered the highest number of responses (113). Serials Librarian (96), Acquisitions Librarian (79), Catalog/Metadata Librarian (63), and Collection Development Librarian (51) rounded out the top five responses.

When asked about the number of years of serials-related experience, "More than 20 years" received the most responses at 72.

[Chart: years of serials-related experience (less than 1 year, 1-3 years, 4-6 years, 7-10 years, 11-20 years, more than 20 years).]

Forty percent of respondents noted they have attended one to five past conferences.

[Chart: past conferences attended (0, 1-5, 6-10, 11-15, 16-20, more than 20).]

[1] To ease the reading of the demographic chart, several categories offered on the survey were condensed:
- Academic Libraries contains: College Library, Community College Library, University Library
- Vendors and Publishers contains: Automated Systems Vendor, Binder, Book Vendor, Database Provider, Publisher, Subscription Vendor or Agency
- Specialized Libraries contains: Law Library, Medical Library, Special or Corporate Library
- Government Libraries contains: Government, National, or State Library
- Other contains: Public Library, Student, Other
Several other categories were available but were not selected by any survey respondent.

