2018 Poster Abstracts

Poster Session 1

Facilities & Spaces

1. Using Prototypes to Inform Library Planning

Holt Zaugg (Harold B. Lee Library, Brigham Young University)

Several factors need to be taken into consideration when planning changes to spaces within a library. Some of these include current and future patron needs. Some of the planning can be mapped onto the personas of library patrons or the use patterns libraries have identified, but other factors are more difficult to ascertain. In these cases it is helpful to prototype the change to examine its seen and unseen consequences. The prototype gives planners a glimpse of how the final change will affect library use.

The purpose of this presentation is to discuss several principles of how to use prototypes in library settings. It specifically references two examples where prototypes were used to help in the planning process for changes to the library. The first prototype is the redesign of an individual study desk, and the second is the combination of the library’s main Circulation Desk with a neighboring Help Desk. The presentation discusses how each prototype was developed and then used to collect data to inform the final design decisions. It will help inform other libraries that wish to use a prototype to inform the design portion of their planning process.


2. If You Install It, Will They Use It? Group Study Spaces and Students’ Use of Library-Installed Technology

Terry Brandsma (UNC Greensboro)

Effectively creating and equipping group study spaces is critical for our students’ academic success, and managing technology costs is critical to our operations. Providing technology where it is not used is ineffective and wasteful. To guide our decision making on how best to configure additional group study spaces in the library, and what technology to provide within them, a brief observational study was conducted in early 2018. We needed to determine which group study space configurations were the most used by students, and whether they were using the library-provided technology within those spaces. The new group study spaces were designed and equipped with these results in mind, and the observational study was repeated to see if we made the right choices. The results of both observational studies will be presented.


3. Navigating the Stacks: How Commuter Students Use the Library

Melissa Bauer (Kent State University at Stark)

The poster presentation discusses the library space assessment of a large regional campus library. A mixed-methods research approach is used to understand commuter students’ experiences, preferences, and needs in an academic library. The commuter student experience is underrepresented in the literature, which often focuses on residential campuses. These students’ unique perspectives are necessary to gain knowledge of the current and emerging needs of this community. Data are presented from hourly floor sweeps, observations, and photo diary interviews. Initial findings along with practical implications for change will be presented.


4. If You Build It, They May Come—If It’s Not Too Far Away: A Mixed Methods Approach to Improving Graduate Study Space

Camille Andrews, Selena Bryant, Tobi Hines and Sara Wright (Cornell University)

What do graduate students want in a library space? To answer this question at our library, we used a variety of assessment methods to help us dig deeper into the study habits and space preferences of our graduate student population. Guided by the results from a recent university-wide graduate student survey, we used a mixed-methods approach, including surveys, interviews, journey mapping techniques, and furniture demos/testing, to identify overarching themes and key characteristics of an ideal study environment. With that information in hand, and working with an interior designer, we are renovating our current graduate study area into a space that, we hope, will better accommodate their needs. This poster will showcase our assessment methods and results, as well as how we plan to address our students’ needs and concerns through newly selected furniture and an improved space layout.


5. “Something for Everyone”: A Multi-Method Space Assessment of the Cabot Science Library

Tim Gallati and Kris Markman (Harvard University)

In the fall of 2017, staff at the User Research Center at Harvard Library conducted a space assessment of the newly renovated Cabot Science Library, which had reopened in April 2017, in order to capture feedback from both new and returning students. The overall goal of the assessment was to understand how and why students were using the new space and to evaluate the usability of a few key areas in the library.

Methods: We conducted a multi-method investigation that included on-the-spot surveys, room observations, and wayfinding tasks and interviews using wearable eye tracking. All data were collected in October 2017. Approximately 400 on-the-spot surveys were distributed to people using the library over the course of seven days, including weekends, at multiple time periods. A sealed ballot box was stationed at the exit of the library for survey return. During the seven survey distribution times, a project team member also conducted observations of a video conference room, one of the new spaces on the first floor of the library.

In order to understand key wayfinding issues in the new space, we used the Tobii Pro Glasses 2 wearable eye tracker to record participants during a set of wayfinding tasks. Participants (n=8) completed four tasks followed by a debrief interview conducted by the project lead. Finally, the project lead interviewed one of the security guards who worked at the library entrance.

Results and Implications: Overall, 241 surveys were returned; they indicated that 89% of library users were undergraduates. Most respondents reported visiting Cabot at least once per week, and 40% said they visited almost every day. Respondents typically had spent one hour or less (41%) or one to three hours (43%) in the library. The most common activity was working individually (72%) on an assignment or homework (67%), using their own laptops (69%). Most chose Cabot over another location because it was convenient (58%). The video conference room observation found that on average there were 2.5 people in the space, frequently using laptops or mobile devices, but that the large LED display and the video conference camera in the room were used very infrequently. For wayfinding, most participants struggled with at least two of the tasks, and visited an average of 1.4 to 2.9 floors to complete tasks that were all on the same floor as the tasks’ starting point. Taken as a whole, the assessment revealed that users were generally very positive about the new space; however, signage was lacking, and users struggled to understand the functions of some key areas of the space. Recommendations included moving staff to the first floor during peak hours and reviewing signage opportunities.


6. User-Centered Space Assessment in a Small Academic Library

Matthew Moore (University of Texas at Austin)

Daniel Pshock (University of Houston Libraries)

The University of Houston Music Library is a small branch library primarily serving music and performing arts students. This paper presents the methodologies and results of a multipronged space assessment project that put users at the center. The purpose of the project is to audit current signs in the library, conduct assessments to evaluate space use and user needs, and design specific changes to signs, space, and internal workflows based on those assessments.

The Music Library is heavily used. There are only 600 students and 80 faculty in the Moores School of Music, where the library is located; however, the Music Library accounts for a disproportionate 9% of University of Houston Libraries checkouts on a campus of 50,000 students. The Music Library receives 75,000 visits a year. This project analyzes how these visitors use a limited space. Though narrowly focused, the project will have a wide impact that will hopefully stretch beyond the Music Library: the methods used could be applied to the much larger main library on campus.

Project methods included surveys, library sweeps, user interviews, and sign auditing. Assessment is ongoing and will likely continue into early summer 2018. Multiple approaches have been taken to allow the team to create a robust picture of how patrons use the library, what they find important, and which spaces are popular. So far, assessment has identified printers as the most valued library service and the area around them as the busiest in the library. We have also discovered pain points in finding items in the stacks and with noise policies.

After assessment is complete, the team will coordinate service design changes in the library, including updated sign templates and policies and updating workflows for creating signs and planning spaces.

The paper should provide a toolbox of methodologies that can be applied to space assessment in a range of library settings. We hope that our eventual design changes demonstrate the possibilities of user-centered space assessment. The library is better equipped to provide service to users when it shifts its focus away from anecdotal evidence towards systematic, user-centered methods.


7. Library of the Immediate Future: Using Ongoing Occupancy Counts to Promptly Identify Improvements Needed in Library Services

Gavin Paul and Ana Torres (Dibner Library, NYU Division of Libraries)

This poster describes the implementation of a continuing assessment plan initiated in spring 2016 at Bern Dibner Library, NYU Tandon School of Engineering. Occupancy counts are taken in public spaces several times a day, every day, providing up-to-date information on space utilization. This study reports on short-term and long-term trends and on the effectiveness of library services as inferred from the data. An observational method was utilized. Using a Qualtrics form and detailed floor plans, staff members, mostly student employees, recorded occupancy numbers four to five times a day throughout the library. Public areas in the library were categorized into zones—silent study, conversational study, and collaborative study—and the available number of seats and types of furniture were noted. The resulting data allowed for analysis of occupancy in the context of space, time, patron behavior, services, and employee performance. The collected data was instrumental in measuring several key elements, revealing a consistent pattern in the number of library visitors throughout the day, week, and semester. The data can be used as a multipurpose assessment tool, even providing insights into areas where the connection is not immediately apparent.
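
To illustrate the kind of analysis such sweep data supports, here is a minimal sketch in Python/pandas, assuming a CSV export of the Qualtrics form with hypothetical column names (timestamp, zone, occupants) and an assumed seat inventory per zone; this is illustrative, not the library's actual pipeline.

```python
# A sketch of summarizing occupancy sweeps by zone and hour of day.
# Column names and the seat inventory below are assumptions for illustration.
import pandas as pd

counts = pd.read_csv("occupancy_counts.csv",
                     parse_dates=["timestamp"])  # one row per zone per sweep

# Average occupancy by zone and hour of day, to surface daily usage patterns.
counts["hour"] = counts["timestamp"].dt.hour
by_zone_hour = (counts.groupby(["zone", "hour"])["occupants"]
                      .mean()
                      .unstack("hour"))

# Seat-utilization rate per zone, given a (hypothetical) seat count per zone.
seats = {"silent": 120, "conversational": 80, "collaborative": 60}
counts["utilization"] = counts["occupants"] / counts["zone"].map(seats)

print(by_zone_hour.round(1))
print(counts.groupby("zone")["utilization"].mean().round(2))
```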

With the advancement and proliferation of technology, usage of physical space is a growing concern in academic libraries. Print collections are now available online, diminishing the need for shelf space. College students now interact differently with space, and different technologies require different spaces. This is not a new concept, but the rapid evolution in these areas is driving the need to rethink library space. Usage of library space is not a constant but a changing variable that depends on users’ needs and behaviors, and it requires perennial review. The observational process described here is cost-effective, reproducible, and easily implemented and maintained. For assessment, one can use it alone or in conjunction with other tools.

Data collected by this method is pertinent to discussions of the nature and direction of the library, revealing trends that support long-term and short-term decisions and facilitating micro-changes to services. This information enabled the librarians to identify the best times in a semester to roll out services such as workshops and reference activities, and to optimize library hours. It was also used to strategically relocate furniture and improve seating layouts in open areas and in study rooms. It aids in assessing the effectiveness of changes such as the addition of outlets and the redesign of the snack lounge and data lab. It can also be helpful in the management and deployment of student workers.


Outreach & Services

8. Surf, Sand, and Sun: Gathering Library Feedback from Users through a Beach-Themed Event

Kymberly Goodson (UC San Diego Library)

The UC San Diego (UCSD) Library offers a variety of student-focused, de-stress events throughout the year to help students withstand the rigors of long hours of study in the library and the demanding nature of UCSD’s 11-week academic quarter. While each event offers relaxing or stimulating activities and snacks to the student attendees, opportunities for sharing targeted feedback with the library are also provided. One such event is the Spring Beach Party, held in April, which aligns with the campus’ location on the shores of the Pacific Ocean. At the event, attendees receive free lemonade and iced tea, beach-themed snacks, and information about library services. They can also play with kinetic sand, compose poetry with beach-themed magnetic words (also in celebration of National Poetry Month in April), enjoy beach-themed coloring sheets, and more.

To enable feedback gathering at the 2018 Spring Beach Party, 13 large, colorful posters were created showing a variety of beach-related images, including surfing, friends, tunes, grub, relaxing, bonfires, treasures, and more. Each poster was accompanied by a comment card with 3 questions associated with the theme, along with 2 related tips about library services or offerings. For example, the “crowds” poster asked attendees about difficulties they experience finding a seat in the library during different times of the term, finding an available outlet in the library, and about the locations where available outlets are hardest to locate. Tips on the card alerted attendees to the additional computers and study seats in a smaller, lesser-known library building, and about a recently-launched app with a live map of how busy library spaces are at any given time.

Attendees were encouraged to complete all or some of the comment cards, exchanging them with a staff member at the event for an equal number of raffle tickets. Drawings for prizes were held throughout the event, though one did not need to be present to win. Prizes included packages of kinetic sand, campus gift cards, snacks, and library-imprinted items.

Approximately 50 students completed at least some comment cards at the event. To gather additional feedback, the same posters and cards will be hung in the library in the second part of the spring term, with additional prize drawings to be given out before Finals Week. During the summer of 2018, all feedback from this initiative will be compiled and evaluated, so that actionable recommendations can be presented.

The poster will show how, while not necessarily statistically representative, this low-cost, easy-to-implement approach can garner valuable, immediate, and actionable feedback directly from users in a fun and unique way that students find compelling. Such an event, or a similar feedback initiative without a corresponding event, can easily be adapted to suit the needs, staffing, and budgets of a variety of other libraries. The poster will visually share details of the event, display the feedback results, and outline the actions taken as a result of what was revealed in the student feedback. Images will include the event itself, as well as the posters and comment cards used.


9. From Results to Practice: Application of Assessment Findings Informing the Direction of a New Position

Steve Borrelli and Mark Mattson (Penn State University Libraries)

In response to the foundational component of the university’s strategic plan, “Enhancing Global Engagement,” the Penn State University Libraries created the position of Global Programs Librarian. This poster illustrates collaboration between the Global Programs Librarian and the Library Assessment Department in pulling together relevant data points from recent assessments into a report to inform practice relative to constituent needs. To provide insights into the needs of international students, a key constituency for the Global Programs Librarian, the Department conducted statistical analyses comparing international and domestic student results from a recent administration of the Ithaka S+R Undergraduate Survey, connecting them with findings from a multi-campus focus group investigation into international students’ sense of belonging in library facilities. This collaboration provided direction for the Global Programs Librarian’s initial efforts, including the development of metrics relating to training for library personnel on cultural differences and sensitivities, and to multicultural-focused exhibits and programming, for use in assessing the reach and impact of the Librarian’s activities. This process demonstrates how examining existing assessment results through a new lens enhances the ROI of investigations.


10. Assessing an Academic Library’s Marketing Strategies to Promote Subject Specialists: A Mixed Methods Approach

Stacey Ewing, April Hines, and Hélène Huet (University of Florida)

The purpose of this study is to analyze the effectiveness of an academic library’s public relations (PR) and marketing strategies designed to promote library subject specialists to undergraduate students.

One of the core issues librarians face is that many students do not know what a librarian does, nor how librarians can help them. To remedy this problem, a group of librarians at an R1 institution collaborated with students from a PR class to devise marketing strategies to better reach undergraduates and promote the libraries’ subject specialists. Following this collaboration, the group implemented some of the recommended marketing strategies and assessed whether these methods would resonate with students and/or change their perceptions of librarians.

For the purpose of breadth and depth of understanding and corroboration, the researchers used a mixed-methods approach, believing that combining elements of qualitative and quantitative research through distributed surveys and focus groups would help them better understand the effectiveness of library marketing and PR strategies to promote subject specialists (Tashakkori & Teddlie, 2010).

More than 300 students completed surveys, answering questions regarding their perceptions and use of subject specialists before and after the implementation of various marketing strategies. The project team also convened three focus groups of 3 to 8 participants each. Students were shown three two-minute videos featuring profile interviews with some of the Humanities and Social Sciences librarians. Team members coded transcripts and evaluated content for trends and themes.

The results from this study will inform future library marketing strategies. Early findings show that students are more likely to pay attention to library marketing content shared by their instructors than to content they encounter through social media; they explained that they are already bombarded by images and news from their friends. Another early discovery is that students wish they had known about library subject specialists sooner. While the students had a general idea of what librarians do (such as help with research), they had a harder time understanding how that applied to them personally. Additionally, the students found the videos engaging and seemed pleasantly surprised that librarians came across as friendly and helpful. Said one participant: “[Librarians] are people just like we are!” Now that the team has received a variety of suggestions through the focus groups, the next objective is to find the best way to disseminate the videos so that they benefit the most students. With a plan to launch the videos in Fall 2018, this research group will follow up with additional surveys and focus groups with students enrolled in targeted academic programs.

The value of this project is that by assessing library marketing and PR strategies to promote subject specialists, we are able to inform library marketing best practices. While prior marketing assessment data has focused on attendance numbers and social media metrics, few studies have looked at the effectiveness of marketing strategies in changing perceptions and awareness of librarians themselves. This poster will provide an overview of the assessment methods, present the qualitative and quantitative data collected, discuss analysis and takeaways, and share the planned “next steps” in this long-term research project.

Tashakkori, A., & Teddlie, C. (2010). SAGE handbook of mixed methods in social & behavioral research. Los Angeles, CA: SAGE Publications.


11. Restructuring Reference Services Using Assessment Data

Brandon West (State University of New York at Geneseo)

Instruction librarians at a public university were collecting quantitative data about student use of reference services, which was in decline. The librarians were not certain what to do with the data beyond reporting statistics.

Purpose: The librarians needed to find a new approach for understanding and applying reference assessment data in order to restructure reference services to meet evolving student needs.

Approach: The librarians began working with campus partners to correlate student use of the library with elements of student success, such as GPA and retention, based on Nackerud, Fransen, Peterson, and Mastel’s (2013) methodology. Using this data, the librarians developed exploratory questions to see what could be learned by examining the assessment data and its correlations with student success. With a better idea of what the assessment data meant, they generated a strategy for transitioning reference services from a traditional reference desk model to a triage model over the span of four semesters.
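
As a hedged sketch of the kind of correlation analysis this approach entails (library-use flags joined to student outcomes, after Nackerud et al.), the following Python fragment uses a hypothetical per-student table; the column names and choice of tests are illustrative assumptions, not the librarians' actual procedure.

```python
# Sketch: correlating a binary library-use flag with GPA and retention.
# The file and columns (used_reference, gpa, retained) are hypothetical.
import pandas as pd
from scipy import stats

students = pd.read_csv("students.csv")  # one row per student

# Point-biserial correlation: binary reference use vs. continuous GPA.
r, p = stats.pointbiserialr(students["used_reference"], students["gpa"])
print(f"reference use vs. GPA: r={r:.3f}, p={p:.4f}")

# Chi-square test of independence: reference use vs. one-year retention.
table = pd.crosstab(students["used_reference"], students["retained"])
chi2, p, dof, _ = stats.chi2_contingency(table)
print(f"reference use vs. retention: chi2={chi2:.2f}, p={p:.4f}")
```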

Findings: The changes made to reference services resulted in a 200% increase in the number of students scheduling in-depth research consultations. They also decreased the number of hours librarians spent providing reference services.

Implications: This project allowed the librarians to engage with assessment data in a meaningful way, helped them identify gaps in their assessment methods, and has encouraged them to explore qualitative assessment measures.

Nackerud, S., Fransen, J., Peterson, K., & Mastel, K. (2013). Analyzing demographics: Assessing library use across the institution. portal: Libraries and the Academy, 13(2), 131–145.


12. A New Approach to Outreach Assessment: Evaluation of ROI for Out-of-Class Student Programs

Ariana Santiago, Emily Vinson, and Mea Warren (University of Houston Libraries)

The University of Houston Libraries’ 2017–2021 Strategic Plan includes a focus on positioning UH Libraries as a campus leader in student success initiatives. In support of this goal, a team was assembled and tasked with assessing the return on investment for the Libraries’ involvement in out-of-class programs that enrich the student experience. This poster will describe the project goals, methodology, findings, and recommendations.

The goals of this project were to identify out-of-class programs that the library sponsors or has significant participation in, assess their purpose and impact, and make recommendations for how to prioritize and allocate resources moving forward. Team members defined the scope of the project, collected and categorized data on student-focused programs across branches and departments, and holistically assessed their purpose and impact to determine the return on investment and make recommendations.

The presenters will share practical strategies for facilitating a cross-departmental team of librarians and staff on an assessment project with a short timeline. The completion of this project led to the creation of new workflows and communication strategies in order to enact the project recommendations, enabling this assessment project to be sustainable and affect future practices.

This poster will present a unique project within the broader landscape of academic library outreach assessment. Much of the literature focuses on assessing outreach efforts at the individual program level—for example, how to gather data to assess the impact of a specific event. This project focused instead on assessing the Libraries’ outreach programming from a holistic perspective. This is a significant endeavor as this type of assessment had not previously been undertaken at UH Libraries and is not often represented in the broader literature.

The return on investment project was completed in 2017 and the recommendations are now being operationalized. The results of this project have implications for outreach assessment and libraries’ impact on student success through engagement with their out-of-class experience.


13. Assessment of Visual Arts Projects Produced by Library Student Employees

Karen Reed (Middle Tennessee State University)

University libraries employ students to perform a variety of entry-level tasks. With planning and commitment from library personnel, this employment can provide additional educational opportunities for students through mentoring, reinforcement of coursework, and even support of career goals post-graduation.

This case study will describe the research-in-progress at one large public university library employing students to complete visual arts projects in support of library marketing and teaching initiatives. The Curriculum Materials Center (CMC) at Middle Tennessee State University’s Walker Library is a partner to the university’s College of Education through its holdings of K–12 teaching materials. Students majoring in Art, and particularly in the university’s specialized Art Education major, have been welcomed as part-time employees of the library unit. For the past three years, these student employees have been tasked with the sole responsibility of producing large art installations which correspond to state K–12 teaching standards. Displayed at the CMC’s entrance, the art installations have served as an eye-catching marketing tool of the library unit in raising awareness of the area’s holdings; these displays also have instructional value to the CMC’s target patrons as a practical demonstration of state teaching standards.

Although the visual arts project has enjoyed a successful run, the CMC director sought to formalize the student mentoring aspects of the program through the use of an assessment tool. This new direction posed a difficult question: how does one assess creative works? With no formal background in visual arts assessment, the CMC director looked to the professional literature. The use of a portfolio, as a means to both instruct and assess students, emerged as a workable solution.

Student employees creating art murals for the CMC are now following a systematic design process incorporating the portfolio assessment. This poster presentation will showcase the portfolio’s implementation and emerging results, as well as specific criteria of the portfolio itself. Student portfolios include multiple photos of the resulting art mural as well as a completed design rubric emphasizing student planning and self-reflection. The portfolio is therefore an assessment with an intended goal of student development through librarian mentoring and instruction.

In all, it is hoped that this renewed effort at a beloved library initiative will yield enhanced benefits to our creative student employees, as well as greater collaboration with our university’s art department. This poster presentation seeks to demonstrate the role of portfolio assessment in specialized student library employment, as a means to mentor and instruct students in support of their long-term career aspirations.


14. New Directions in Outreach Assessment: Defining the Value and Impact of Library Outreach

Amanda Hornby and Emilie Vrbancic (University of Washington Libraries)

Libraries are increasingly engaged in outreach to their user groups, but assessing outreach and demonstrating its value to the broader community remains a challenge for many of us. This poster details how the Odegaard Library’s Undergraduate Student Success (USS) team at the University of Washington (UW) sought to fill this gap by creating a strategic, sustainable and scalable program to assess their outreach activities. The assessment framework is flexible enough to be adapted to any size, format or level of outreach. While the framework requires staff time and resources, ongoing project management, and the collection and analysis of data, it is highly replicable in a variety of institutional contexts. This poster will describe the USS team’s outreach and assessment planning processes, the online outreach assessment toolkit, the qualitative and quantitative assessment methods employed, the results of our assessment program, and how this work demonstrates the value and impact of library outreach within and outside of the library.

The Outreach Assessment Toolkit was designed to help set outreach goals and outcomes and choose appropriate assessment methods for each outreach activity. A mixed-methods approach to assessment allows our team to document student engagement and feedback, and use data to thoroughly examine what we’re doing well and what we can improve upon in our outreach efforts. The assessment methods utilized in the toolkit include:

  • Ethnographic observation
  • Post-event surveys
  • Social media analysis
  • Capturing comments
  • Photo documentation
  • Staff reflection
  • Headcounts

This approach is designed to demonstrate the value of each unique outreach experience, using qualitative data to richly describe the impact of the experience to students and campus partners.

Drawing from two years of assessment data and analysis, the poster details how the assessment program has strengthened relationships with UW Libraries administration and campus partners. We will discuss how this assessment program has shaped the work of the USS team, especially through the use of reflective practice. Using both reflection and longitudinal data, our outreach assessment program has become a flexible practice that has helped us become more reflective librarians, think about library outreach in innovative ways, and deepen our campus partnerships.

This poster will be of value to participants who want to create and assess a program of outreach for any user group. The poster session will have an interactive component that will engage participants in thinking about ways to incorporate the online outreach assessment toolkit at their own institutions. We seek to foster a community of practice among poster session participants (and beyond) where the toolkit is shared, remixed and adapted to meet the needs of a variety of library teams and library sizes. By sharing this practical model of outreach assessment, we seek to fill a gap in library service assessment and empower colleagues to demonstrate the value of library outreach activities.


15. How Do You Measure Fun and Relaxation? Assessing Academic Library Programming Using Traditional and Non-Traditional Forms of Assessment

Randa Morgan (Louisiana State University)

Purpose: The purpose of this poster is to discuss traditional and nontraditional forms of assessment as they relate to academic library programming. This poster will describe the various ways the Programming Committee at Louisiana State University has assessed programs and events during exam weeks, and how we have used both traditional and nontraditional assessment to measure a program’s success. This poster will also look at some of the ways in which assessment can be used to justify the cost of library programming.

Design, Methodology, and Approach: The Programming Committee has utilized various forms of assessment since its creation. These have ranged from more traditional assessment methods, such as surveys and counting participation numbers, to more nontraditional assessment using qualitative questions asked during events and “assessment on the fly,” where students utilize comment boards. We have tried both private and public feedback. We have also brought in technology to count students at events and track room usage.

Findings: The committee found that in the beginning, when we needed support from the administration and had to justify our proposed spending, traditional feedback was best. We counted student participants at events and used surveys to determine what students wanted, later changing our survey to ask more open-ended questions. However, as the committee has gathered support from the administration, more nontraditional feedback has worked just as well. We found that once we had “proved ourselves” to the administration, the need for traditional assessment wasn’t as strong.

Practical Implications/Value: All of our varied assessments have shown that students are interested in and excited about library programming. We also use assessments to help determine our return on investment and to make future decisions about programs and events. Assessment is important to justify expenses for events and to facilitate administrative buy-in and support. It even helps us justify building on existing programs in the library. As our committee matures, we will continue to use different means of assessment. Library programs bring students into the library, help build and support community, and create student buy-in. Assessment can help support all of these.


16. Revising and Reviving Reference Statistics: An Online Collection System Pilot at the University of Manitoba

Carol Cooke, Janet Rothney, and Sherri Vokey (University of Manitoba)

Purpose: The selection and collection of statistics are topics of regular discussion at the Neil John Maclean Health Sciences Library. Statistics have traditionally been gathered by library assistants at the Client Service Desk and by librarians in their offices via paper “tick sheets”, collated in spreadsheets and reported to various audiences. This project was designed to test the use of Springshare’s RefAnalytics to gather reference statistics in an effort to a) move away from a paper-based model, and b) realign our statistics with profession-wide standards and definitions to best suit the data collection initiatives we participate in (CARL, ARL, AAHSL, AFMC). We hope to streamline statistics collection at individual and unit levels, stop collecting outmoded data points such as directional transactions or item use statistics gathered elsewhere, and increase the uptake of statistics tracking by librarians.

Design: A review of standards and definitions from the profession was undertaken and compared with current practices. A RefAnalytics template was developed to include statistical categories relevant to the unit and to reports required by external bodies. Documentation describing each category on the new template was made available to all participants in the pilot. Project meetings were held with the library assistants and librarians prior to the three-month pilot period and near the end of the first month of the pilot. These meetings helped to clarify definitions of statistics categories and expectations for collection, and to ensure that no major gaps were missed. Report deliverables were tested with the first full month of data. A draft report of the pilot will be presented to participants for feedback before the final report is submitted. It is our hope that the pilot phase will inform us on the utility of the template design, as well as the feasibility of expanding this model across the UM library system.

Findings: RefAnalytics has worked well for unit needs at the end of the pilot phase. We have decided to continue using the template, which allows us to remove several steps in statistics collection. The new process clarified our approach to statistics, and has identified opportunities for deeper analysis in areas such as the type of support needed by affiliated colleges and patron types. Pilot data will also be consulted for planning unit-level training opportunities for the next academic year. One area to explore more fully is the feasibility of implementing RefAnalytics across multiple units within our library system.

Practical Implications: RefAnalytics is a simple, customizable tool for basic reference statistics collection. Moving from paper to an online collection system was a fairly seamless process and has increased librarians’ statistics tracking. Future work will involve year-over-year analysis of trends in the uptake and capture of reference transactions pre- and post-implementation of RefAnalytics. Taking the time to reevaluate the statistics gathered at each reference point—what data we are required to gather and what additional data may be useful—has been a fruitful exercise that will continue to shape our library in the near future.


17. Simplifying the Process: Using Technology to Integrate Event & Program Assessment from Start to Finish

Amanda McLellan and Heather White (East Carolina University)

Event assessment shouldn’t be painful or mysterious, so how can we leverage technology to demystify and simplify program and event assessment? This poster illustrates how to utilize custom technology solutions to ease the workflow between several departments and integrate the assessment of events from the planning phase onward, for a more practical and sustainable result. The result of our collaboration is a tool for effective, sustainable, and practical event and program assessment in the library.

This poster targets libraries interested in reflecting on their own event and program assessment and in how an open-source application may help streamline their processes. Previously, it was hard to paint a picture of the assessment landscape before, during, and after an event, and our Events & Programs Coordinator had to juggle event coordination across multiple systems. However, in conjunction with library application developers, new solutions were built, allowing us to consider a culture of assessment from start to finish while streamlining the event management process.

Using our academic library as an example, the poster presents key results that emerged from the integration of this application within our normal workflow. Our organization hosts over 120 events each calendar year. This poster will touch on the background leading to the creation of our Event Assessment Tool, but focuses on illustrating the streamlined process for integrating assessment into event planning. Through testing and usability work, we have modified the original application to function more like a ticketing system, helping delegate responsibilities across departments. The use of this tool allowed us to enact change at our institution by simplifying the assessment process while ensuring each event and programmatic effort was tied to our mission and strategic goals. The resulting application is an open-source product that your institution can download and modify for your own event assessment workflow.


18. “Let’s Never Do This Again”: How a Large-Scale Project Led Us to Rethink Chat Reference Assessment at University of Washington Libraries

Jackie Belanger, Alyssa Deutschler, and Lauren Ray (University of Washington Libraries)

Michael Mungin (University of Washington Bothell/Cascadia College Campus Library)

In 2015–2016, the University of Washington Libraries launched an ambitious assessment of our well-trafficked chat reference service. We performed a content analysis on over 3,500 patron transcripts in order to identify who was using the service and variations in question type and complexity. Using a modified READ (Reference Effort Assessment Data) Scale, we explored the level of effort expended on chat reference questions, as well as patterns of use over time and across subject areas. The assessment team gleaned many valuable insights into our users and the types of questions they posed on chat. However, we also learned some difficult lessons about the feasibility of coding such a large number of chat transcripts and about the need to integrate assessment at more regular, frequent intervals.
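
Once transcripts are coded, summarizing effort levels over time and by subject is straightforward. Here is a minimal Python sketch, assuming the coded transcripts live in a table with one row per chat and hypothetical columns for the assigned READ level, subject area, and timestamp; it is not the UW team's actual analysis code.

```python
# Sketch: tallying READ Scale codes from coded chat transcripts.
# File name and columns (read_level, subject, started_at) are assumptions.
import pandas as pd

chats = pd.read_csv("coded_chat_transcripts.csv", parse_dates=["started_at"])

# Distribution of effort levels (READ 1 = least effort ... 6 = most).
print(chats["read_level"].value_counts().sort_index())

# Patterns over time and by subject area: mean effort per month and subject.
chats["month"] = chats["started_at"].dt.to_period("M")
effort = chats.pivot_table(index="month", columns="subject",
                           values="read_level", aggfunc="mean")
print(effort.round(2))
```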

Our initial assessment project revealed that chat transcripts provide a rich—and previously untapped—source of data on how patrons interact with the library’s myriad tools and services. Following the project’s conclusion, members of the chat assessment group met with various committees and units across the UW Libraries to explain how chat data might inform their services. As a result, we’ve conducted several smaller-scale projects to address specific questions or problems posed by our colleagues. The result has been a more sustainable process that allows us to focus assessment efforts in a flexible and timely manner.

This poster will detail how the chat assessment group at UW Libraries has moved from large-scale projects to smaller, targeted, iterative assessments focused on specific service areas or questions. It will discuss the structure and findings of our large assessment project, which applied the READ Scale to an academic quarter’s worth of chat transcripts, as well as the smaller chat assessment initiatives that have followed, including efforts to document user difficulties with requests in the new ILS at UW Libraries, a project showing how UW patrons use Interlibrary Loan services, and a recent exploratory project interviewing chat patrons about their expectations of online reference service. In addition to specific findings and methodologies, presenters will share lessons learned from each project and best practices for undertaking sustainable chat reference assessment.


Teaching & Learning

19. Transformational Tuning: Building Competencies and Context into Assignments

Lori Albrizio and Greg Lindeblom (Broward College)

At the heart of a successful General Education Program lies the ability to create interesting, engaging assignments that accurately measure a competency. Tuning is a class-level process that brings learning competencies explicitly into learning activities and assignments. The tuning process brings real-world understanding to the purpose of general education and can help students connect more with the curriculum in meaningful ways.

This poster explains a pilot study at Broward College that began in Spring 2017, in which library and discipline faculty worked collaboratively to tune a macroeconomics general education course to information literacy (IL) competencies. IL is one of the six core general education competencies at Broward College. The purpose of this quasi-experimental study was to determine whether students who received intensive library research instruction and supports through the tuning process would be more proficient in the IL criteria measured.

Results showed that the tuning initiative helped more students exceed or demonstrate IL competency on the two comparative IL criteria measured: using information to support an argument or to solve a problem, and citing sources. In addition, students have shown continuous improvement in all four IL criteria measured over three consecutive semesters. The pilot study has also helped the researchers identify and remediate specific areas of weakness for students with respect to the IL competencies.


20. Transforming Interdepartmental Information Literacy Instruction: Mapping the ACRL Framework and Nursing Professional Standards onto an Assessment Rubric

Katelyn Angell (Long Island University, Brooklyn Campus)

[Institution name omitted] is a medium-sized urban university with robust undergraduate and graduate programs in nursing. Nursing students are avid library users, as their field of study necessitates a large amount of complex health sciences research. In order to graduate, all undergraduate nursing majors must pass a writing-intensive course, End of Life Care, situated within the nursing department. The final project for this class is a six-page PICO research paper. PICO (Patient/Intervention/Comparison/Outcome) assignments are an integral part of the nursing curriculum and require students to create a solid clinical question and use evidence-based research to answer it.

Given the prevalence of nursing students, the ubiquity of PICO assignments, and the specialized information literacy skills needed to successfully complete this assignment, the investigators invited members of the nursing faculty to collaborate on a rubric designed to assess these students’ information literacy skills. Inspired by the ACRL Framework for Information Literacy for Higher Education, the investigators identified one knowledge practice from each of the six frames that best fit the parameters of the PICO assignment. In order to more closely involve nursing faculty in the project and to better help students evaluate their skill level as information-literate professionals, the investigators selected six key competencies from the American Nurses Association (ANA) professional standards and mapped them to the six ACRL frames.

Next, the investigators developed three different achievement levels (beginning, developing, and exemplary) to correspond to each of the six ACRL frames and ANA competencies measured in the PICO paper. Two nursing professors examined the rubric and offered feedback and minor edits.

The investigators used the rubric to separately evaluate fifty student assignments. Once all artifacts were graded, the intraclass correlation coefficient was used to determine inter-rater reliability. Statistical analysis showed some disagreement regarding mastery of information literacy competencies, but additional standardization of the rubric should yield a promising assessment tool. Nursing faculty can use this information as they see fit to enhance the information literacy skills of their students. This project can easily translate to a broad range of academic disciplines for librarians wanting to collaborate with teaching faculty on similar initiatives.
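
For readers replicating the reliability step, here is a minimal Python sketch of an intraclass correlation computation, assuming two raters' rubric scores in long format; the toy data, column names, and use of the pingouin library are illustrative assumptions, not the investigators' code.

```python
# Sketch: inter-rater reliability via the intraclass correlation coefficient.
import pandas as pd
import pingouin as pg

# Hypothetical long-format data: one row per (paper, rater) pair; in the
# study there were fifty papers and two raters.
scores = pd.DataFrame({
    "paper":  [1, 1, 2, 2, 3, 3, 4, 4, 5, 5],
    "rater":  ["A", "B"] * 5,
    "rating": [2, 3, 3, 3, 1, 2, 2, 2, 3, 2],  # 1=beginning ... 3=exemplary
})

# ICC2 (two-way random effects, absolute agreement) is a common choice when
# the same raters score every artifact.
icc = pg.intraclass_corr(data=scores, targets="paper",
                         raters="rater", ratings="rating")
print(icc[["Type", "ICC", "CI95%"]])
```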


21. First Generation Success: Mixed-Methods Information Literacy Skills Assessment for First Year Writing Students

Kathy Anders, Sara DiCaglio, Stephanie Graves, and Sarah LeMire (Texas A&M University)

As universities seek to improve retention and graduation rates, more attention is being paid to populations that are statistically less likely to persist, such as first-generation students. Engaging with a campus-wide initiative targeting first-generation college students, librarians at a research university were awarded a grant to study the information literacy skills of this special population and to develop intervention strategies to help retain students.

Purpose: Partnering with the English department and a campus provisional admission program, librarians developed and taught special sections of the first year composition course, ENGL 104. These sections were designed to seamlessly embed information literacy concepts into the traditional ENGL 104 curriculum and to thoroughly assess the impact of this approach. This study was designed to better understand the information literacy knowledge and skills of first-generation students and to evaluate the impact of embedding information literacy into a course required for their degree plans.

Methodology: This poster will feature the results of the mixed-methods assessment. Assessment methodology included a nationally standardized information literacy test and application of a rubric to assess student research papers. Study participants included members of the first-generation course cohort and a substantial control population, and a comparative analysis will be included in the poster.
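
As a loose sketch of how such a cohort/control comparison might run on the standardized test scores, the following Python fragment uses a two-sample t-test; the file, columns, and choice of test are assumptions for illustration, not the study's actual analysis.

```python
# Sketch: comparing cohort vs. control information literacy test scores.
# Column names (group, score) and group labels are hypothetical.
import pandas as pd
from scipy import stats

scores = pd.read_csv("il_test_scores.csv")  # one row per student
cohort = scores.loc[scores["group"] == "first_gen_cohort", "score"]
control = scores.loc[scores["group"] == "control", "score"]

# Welch's t-test (does not assume equal variances between groups).
t, p = stats.ttest_ind(cohort, control, equal_var=False)
print(f"t={t:.2f}, p={p:.4f}; cohort mean={cohort.mean():.1f}, "
      f"control mean={control.mean():.1f}")
```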

Findings: This poster will share findings from the study, including a clearer understanding of the developmental writing and information literacy skills of first-generation students and the impact of the embedded approach on students’ information literacy skills.

Practical Implications: Attendees will learn how the assessment methodologies complemented each other and provided context to enhance understanding of data. Attendees will also learn how collaborative assessment projects can improve the library’s relationship with strategic campus stakeholders and demonstrate the library’s potential to contribute to key campus initiatives.


22. Assessing First-Year Student Information Literacy Sessions for Lesson Development

Diana Dill, Marian Hampton, Leslie Poljak, and Berenika Webster (University of Pittsburgh)

Purpose: Each fall, the University Library System (ULS) at the University of Pittsburgh provides over 65 library instruction sessions to first-year students enrolled in an academic foundations course. This poster will highlight how project leaders utilized evidence-based curriculum development to assess these sessions. The purpose of this assessment was to influence future lesson design by creating a more relevant, engaging library session.

Approach: Assessment design included collecting rubric data from a sample of prior years’ student library assignments. Additionally, a survey using HEDS (Higher Education Data Sharing Consortium) Research Practices Survey indicators was sent to course instructors to collect their perceptions of the research skills students need and lack.

Findings: Assignment assessment results showed that students were comfortable with locating and comparing information. Survey results noted that the top information literacy skills instructors perceived students to lack involved evaluating information, identifying which resources to search, and citing. These results were used to develop a new pre-class tutorial assignment and an updated class handout, and they shaped the focus of the next year’s lesson.

Practical Implications: The survey and rubric assessment findings were shared with program directors and used to frame the library lesson plan and assignment tutorial for the following year. The new lesson included a shift from searching and locating physical materials to a focus on identifying information need and evaluating a variety of online and library resources.


23. Using Analytics to Advance our Future Instruction Services

Laurie Alexander and Doreen Bradley (University of Michigan)

Purpose: Our university is accelerating its education and innovating strategies to shape the future of learning. We want students to be risk takers, to create and share knowledge, and to be engaged global citizens. Our goal is to teach at critical points in the curriculum, expand our reach to students who may be less prepared, and partner with an increased focus on experiential learning. An ongoing analytics-based assessment of our library’s course-integrated instruction program is intended to assess the effectiveness of these strategies. Without a university-wide information literacy requirement, we seek to make connections where there will be high impact, and we are in dialogue with schools and colleges about the outcomes of these assessment efforts. This poster will share findings from an in-depth analysis of two four-year cohorts of undergraduate students and library course-integrated instruction services.

Design/Methodology: We used two sources of data in this study. First, to collect data on students taught through our course-integrated library instruction programs, library instructors record statistics in our scheduling application for library instruction (SALI), which tracks course and section numbers for each session. Second, we obtained IRB approval to work with student information in the university’s learning analytics architecture (LARC), which provides access to demographic, socioeconomic, and academic data on individual students. We extracted data from both systems and ran various analyses to examine several measures of student success. We also examined the attendance of students of different socioeconomic, demographic, and academic discipline backgrounds in library instruction sessions to assess the integration of our program with various schools, colleges, and campus programs.
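
The record linkage described above amounts to joining instruction-session attendance to student records. Here is a hedged Python sketch of that step under assumed table and column names (the SALI and LARC systems are real, but their export formats here are hypothetical), computing the share of a cohort reached and a breakdown by an equity dimension.

```python
# Sketch: linking instruction sessions (SALI) to student records (LARC).
# File and column names are assumptions for illustration only.
import pandas as pd

sali = pd.read_csv("sali_sessions.csv")   # course, section, term per session
larc = pd.read_csv("larc_students.csv")   # student_id, course, section, term,
                                          # plus demographic/academic fields

# Students enrolled in any course section that received library instruction.
taught = larc.merge(sali, on=["course", "section", "term"], how="inner")
reached = taught["student_id"].nunique()
cohort = larc["student_id"].nunique()
print(f"{reached / cohort:.1%} of the cohort received library instruction")

# Coverage broken out by a (hypothetical) first-generation status flag.
flag = larc.assign(taught=larc["student_id"].isin(taught["student_id"]))
print(flag.groupby("first_gen")["taught"].mean().round(3))
```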

Findings: Highlights include the finding that 71.2% of students receive library instruction during their undergraduate experience; trends in the top 10 courses through which the most students receive library instruction; analyses by socioeconomic status, first-generation status, underrepresented minority groups, gender, and student retention; and analysis by discipline, including engineering, nursing, public health, and others. Findings will also be presented regarding students and instruction with the Special Collections Research Center (rare materials). We also identified gaps: for example, 64% of students not receiving library instruction are male. We will discuss these findings and our next steps in working with this data.

Practical Implications/Value: What we learned through this assessment effort has direct implications for our program planning (demographics, sequencing, curriculum development), resource allocation, and delivery of library instruction. This poster will provide an overview of our assessment, dive into the challenges we faced, outline our methodology, highlight results, and invite attendees to envision such assessments for their own campuses. We will share the outcomes of the discussions this data has prompted about our instruction activities with specific departments as well as other campus service partners. For those who do not have a required information literacy component in their general education requirements, this assessment offers a replicable methodology for understanding current impact and opening discussion of future possibilities.


24. Standardized Assessment of Student Learning: One-Shot Library Instruction for the First Year Writing Program

Ashley Blinstrub (Saginaw Valley State University)

In Fall 2016, all research librarians at Saginaw Valley State University used a common assessment to measure student learning in our introductory one-shot library instruction sessions for the First Year Writing Program. From 2005 to 2016, the librarians of the Melvin J. Zahnow Library taught two library instruction sessions for each ENGL 111 course. For Fall 2016, the First Year Writing Program Coordinator restructured the course curriculum and the instruction librarians redesigned the library instruction portion of the course. Assessing student learning was important in order to continuously improve the library instruction program and to show the value of the sessions to the faculty members and the First Year Writing Program Coordinator. It was also vital to create a systematic assessment plan for this program that could be sustained with a staff of six research librarians.

During this redesign of the library instruction learning outcomes and lesson, the librarians created means of measuring student learning through in-class activities. In this study, librarians examined student responses to the in-class activities and applied rubrics to those responses in order to measure student learning. After the first year, changes were made to the lesson and the study was replicated with a few minor changes. This poster will outline the outcomes of both assessment projects and the importance of continual assessment.

The results of this assessment project will inform librarians about how well students are achieving learning outcomes in a one-shot library instruction session.


25. Expertise without the Expert: Building Academic Confidence through Peer Assisted Study Sessions (PASS)

Kimberly Lace Fama and Christina Sylka (University of British Columbia)

Gen Z is the generation currently embarking on its journey into higher education. These students exhibit learning habits distinct from their predecessors’, and learning how to engage with a generation that desires relevant, solution-oriented relationships with its mentors and peers is crucial.

Peer Assisted Study Sessions (PASS) form part of the supplemental instruction (SI) model. Taking into consideration the educational values of our Gen Z students, alongside institutions’ growing class sizes without a corresponding growth in departmental budgets, peer-led programs can provide an effective, desirable, and affordable option for supporting students academically.

In just three years, PASS has become one of our most sought-after cocurricular support programs, prompting a methodical evaluative study in 2018 to investigate what factors influence students’ motivation to attend and their satisfaction with the program. Our team employed a mixed methodology, gathering data from first-year students, PASS participants, and PASS leaders through focus groups, interviews, and surveys.

Attendees will gain insight into the factors that motivate students to attend cocurricular academic programming and that shape their satisfaction with it. By considering how these factors can be leveraged at one’s own institution, attendees will take away insights that will help them effectively design, revise, and/or market their own programs and services. Additionally, by understanding the factors that contribute to students’ current level of satisfaction with PASS, attendees will be able to make thoughtful alterations and/or improvements to their existing offerings, or to direct resources toward or away from new or current initiatives more effectively.


26. Exploring the Relationship Between Instruction and the Reference Desk

Meaghan Valant (University of Toronto Mississauga Library)

Purpose: The University of Toronto Mississauga Library’s instruction program delivers information literacy (IL) sessions to both undergraduate and graduate students. During IL sessions, Liaison Librarians promote the Library and its services by encouraging students to visit the Reference & Research desk. In order to assess the impact of our promotional efforts, the Library undertook a project to explore the relationship between course-related instruction and visits to the reference desk. This poster reports the methods and results of our study.

Methodology: The data for this project was collected between September 2016 and April 2018 using two methods. The first was an online desk tracker that captured transaction data at the Library’s Reference & Research desk. In addition to the more traditional measures, our tracker logged course codes when visits to the reference desk were related to a course or an assignment.

The second method we used was an in-house database that tracked course codes, course enrollment, and the total number of course-related sessions taught by each librarian. For the purposes of this investigation, we defined course-related instruction as instruction given during class time that may or may not be tied to an assignment.

In order to explore the impact of our promotional efforts during IL sessions, course codes common to both the reference and instruction datasets were identified and organized according to the University’s academic terms. Data analysis comprised crosstabs and frequencies to determine the association between reference desk visits and instruction sessions, as sketched below.
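As an illustration only, the crosstab-and-frequency analysis described above might look like the following Python sketch. The column names and sample values are hypothetical; the abstract does not specify the tools the team used.

```python
import pandas as pd

# Hypothetical merged dataset: one row per course code per term, recording
# whether the course received library instruction and how many reference
# desk visits were logged against its course code.
courses = pd.DataFrame({
    "course_code": ["ENG100", "BIO152", "HIS201", "CHM110", "SOC100", "PSY220"],
    "had_instruction": [True, True, False, False, True, False],
    "desk_visits": [3, 0, 5, 1, 2, 0],
})

# Frequencies: how many courses in each group produced at least one visit?
courses["any_visit"] = courses["desk_visits"] > 0
print(pd.crosstab(courses["had_instruction"], courses["any_visit"], margins=True))

# Share of instructed courses with at least one desk visit during the term.
instructed = courses[courses["had_instruction"]]
print(f"{instructed['any_visit'].mean():.0%} of instructed courses had a desk visit")
```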

Findings: Results indicate that our Reference & Research desk received more visits from students in courses that did not have an instruction session than from students in courses that did. Of the total visits to the reference desk, only 9% were the result of course-related instruction. However, the effect of service promotion during instruction sessions was not entirely inconsequential: fifty-six percent of courses that received instruction produced at least one visit to the reference desk at some point during the semester. Together, these results suggest that promotional activities outside of our instruction program play a significant role in marketing the reference desk.

Practical Implications: Our findings suggest that promotional strategies both in- and outside of course-related instruction have an impact on the number of visits to our Reference & Research desk. However, results from this study are limited because we did not explore promotional activities outside of our instruction program. In order to address this limitation, our Library is in the process of implementing a user survey that will capture referral sources.


27. Affordable Course Materials: Impact on Student Learning

Eric Resnis (Miami University)

As the need for higher education has increased in recent decades, so have the costs associated with it: tuition rose 63% and textbook costs rose 88% between January 2006 and July 2016. In response, institutions and some state governments have made college affordability a priority. Our institution has implemented a robust textbooks-on-reserve program, a coursepack consultation service, an OER adopt/adapt grant program, and an alternative textbook program.

The measures of success for our affordability initiatives focus on decreasing student costs while maintaining the quality of student learning. This poster will focus on the assessment of student learning for our OER adopt/adapt grant program. To ensure that OER use is effective, and to convince skeptical faculty of their value, recipients of the OER Adopt grant are required to assess whether the OER affects student learning and the student experience in the course, either positively or negatively. For this assessment requirement, faculty are encouraged to compare student performance on the same test or assignment when they teach the course using a traditional text and when using the adopted OER. Additionally, they are required to complete a Small Group Instructional Diagnosis (SGID) when they teach the course using the traditional text and again when teaching it with the OER, and to include a reflection on their experience teaching with the OER in their final report.

The Small Group Instructional Diagnosis (SGID) is a common student perception assessment tool developed by Clark and Redmond. SGIDs are often used in higher education to help faculty understand student learning perceptions and preferences so they can enact immediate change based upon the results. The SGID was chosen to assess OER quality because, like a focus group, it allows for follow-up questions, so the facilitator better understands the advantages and challenges of the OER used in the classroom. The “return rate” is also considerably higher than with other tools such as surveys, as the technique is usually completed in the classroom during class time. As a low-stakes method, it enjoys high faculty buy-in. Finally, since the SGID is already a commonly used technique on our campus, most faculty and students understand its value. The SGID includes questions on OER, instructor, and course quality.

This poster will provide an overview of the OER adopt/adapt program, the SGID method, and data from 2.5 years of implementing the assessment. Data will indicate trends in student learning that occur with the implementation of OER. Additionally, a discussion of both faculty and student perceptions on learning when using OER will be included.

https://www.bls.gov/opub/ted/2016/college-tuition-and-fees-increase-63-percent-since-january-2006.htm

Clark, D. J., & Redmond, M. V. (1982). Small group instructional diagnosis: Final report. Retrieved from https://eric.ed.gov/?id=ED217954


28. Assessing the Assessment: Using the One-Shot Assessment to Establish Pedagogical Values

Allen LeBlanc and Brittany O’Neill (Louisiana State University)

Prior to 2017, Louisiana State University’s Research and Instruction Services (RIS) Department had no assessment for information literacy instruction. Librarians met instructional objectives however they saw fit. Student learning outcomes (SLO) were already in place and loosely adhered to the ACRL Framework for Information Literacy for Higher Education, but had not been recently updated to fit changing needs at LSU.

In early 2017, RIS revamped its SLOs to better correlate with the Framework and with the changing university landscape. Through this process, RIS collectively examined and agreed upon core values.

In Summer 2017, RIS created an ad hoc committee to develop student assessments for one-shot instruction sessions. Multiple questions were developed based on each SLO and aligned with common challenges students face in the research lifecycle.

The assessment was intentionally open-ended and adaptable. Librarians could select up to three questions to include from a bank of 34, or opt not to use the assessment at all if it did not fit the lecture. While librarians remained autonomous in how they taught, this provided a way to plan what they taught more intentionally. Librarians were also free to choose how the assessment was delivered: some made the tasks an organic discussion activity within the lesson, while others issued the assessment as a learning check at the end of class.

The first iteration of the assessment ran for two semesters, during which RIS engaged in conversations about its effectiveness and its impact on instruction. Concerns were addressed regarding disparities between language used in the assessment and our instruction, and the language present on the libraries’ website and digital materials. This led to adopting cohesive language throughout the libraries’ instructional materials. Concerns were raised about the need for different assessment questions for certain disciplines, particularly in graduate courses, suggesting a need for reevaluation of the assessment and the principles that the department should emphasize. The implementation of the assessment prompted self-reflection on teaching practices and encouraged collaboration and co-teaching.

The assessment was paused at the end of Spring 2018, after roughly 3,000 completed assessments had been collected. Each librarian completed a self-reflection detailing their thoughts on the efficacy of the assessment and their feedback for future iterations. Ample research exists on the creation and implementation of assessment, but this survey aimed to reflect RIS’ culture of assessment and how the assessment reshaped departmental values. This poster will also detail findings from the assessment data, including the frequency of question usage and SLO alignment, to shed light on which objectives individual librarians gravitated toward. Data from the assessments will undergo content analysis to determine trends in students’ use of terminology and where in the research process students face the most difficulty. This data will also be used to make changes to future assessments and pedagogical principles. Other institutions may benefit from “assessing their assessments,” which can lead to meaningful conversations about an assessment’s impact on shaping values.


29. Exploring the Effectiveness of Visual Literacy and Communication Skills Instruction

Heather Seminelli (United States Military Academy Library)

Purpose: The Association of College and Research Libraries defines visual literacy as “a set of abilities that enables an individual to effectively find, interpret, use, and create images and visual media.” There is an assumption that students who grow up as digital natives will have an inherent aptitude for visual literacy; however, exposure does not equal competency. To help students improve their presentations, the library’s liaison to the Department of Mathematical Sciences worked with MA104: Single Variable Calculus to develop a lesson on visual literacy and communication skills. About 90% of the students at our four-year undergraduate liberal arts college take this class during the spring of their freshman year. This study investigates the effectiveness of this instruction.

Design, Methodology or Approach: The visual literacy and communication skills class was developed to include presentation preparation tips, critical thinking, preattentive visual properties, choosing appropriate visuals, fair use, copyright, and attribution. The library liaison taught this class to all MA104 instructors and to three sections of students; some instructors then used the lesson with their own students, and others did not. From this group, a sample of 116 students from eight sections of MA104 was identified, spanning group 1 (did not receive instruction), group 2 (received instruction from their instructor), and group 3 (received instruction from the library liaison). All presentations were graded using rubrics.
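The abstract does not name the statistical test used; as a hedged illustration, a three-group comparison of rubric scores like the one described might be run in Python as follows. The scores and the choice of a one-way ANOVA are assumptions for this sketch, not details from the study.

```python
from scipy import stats

# Hypothetical rubric scores on one dimension (e.g., use of visual aids)
# for the three groups; real values would come from the graded rubrics.
group1 = [2.0, 2.5, 3.0, 2.0, 2.5]   # no instruction
group2 = [3.5, 3.0, 4.0, 3.5, 3.0]   # instruction from the course instructor
group3 = [3.0, 4.0, 3.5, 3.5, 4.0]   # instruction from the library liaison

# One-way ANOVA across the three groups; a small p-value suggests that
# at least one group mean differs from the others.
f_stat, p_value = stats.f_oneway(group1, group2, group3)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

# Pairwise check of the two instructed groups (instructor vs. liaison),
# mirroring the question of whether the source of instruction matters.
t_stat, p_pair = stats.ttest_ind(group2, group3)
print(f"instructor vs. liaison: t = {t_stat:.2f}, p = {p_pair:.4f}")
```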

Findings: Students who received instruction in visual literacy and communication skills performed at a statistically significantly higher level on citations, use of visual aids, and legibility of text. There was no statistically significant difference in performance between students who learned these skills from their instructor and students who learned them from the library liaison.

Practical Implications or Value: This result suggests that the program can be scaled successfully so that more students can benefit from learning about visual literacy to improve their communication skills.


30. Starting Where Students Are: Assessing What Students Know about Government Information

Kathryn Tallman and Allan Van Hoye (University of Colorado Boulder)

Purpose: Demonstrating value to students has become a primary driver of library assessment; this is particularly true for Government Information librarians, who often face unique challenges to their professional expertise and collections. Much has been said about government information as a rich source for research and scholarship; it is seen by many as inherently valuable. But doesn’t government information also have the potential to engage and inform students in their civic lives beyond the walls of the academy? We believe that Government Information librarians are in a unique position to help students in both their academic and civic lives.

In order to understand how government information provides this value to our students, however, we need to begin by answering some fundamental questions: What do students know about government information? How do they interact with it? Do they even know when they are using it? Intentional teaching and outreach mean nothing if our intentions do not match the needs of students. To begin to discover what students know about government information, the Government Information librarians at the University of Colorado Boulder conducted a small study to establish a baseline for what the undergraduate students in our classes know about government information and how they use it.

Design: The survey was designed specifically to find out whether undergraduate students understand what we mean when we talk about government information. Do they use government information in their academic work? Would they attend workshops designed to incorporate government information into their academic and civic lives? What kind of help do they want from the Government Information library? From what sources do they get their news? We were particularly interested in three questions: how do students define government information, where do students get their news, and how can the Government Information library help? We collected data by handing out fliers, linked to a Google form, before each of our class sessions began. While this survey had a limited reach, it was designed as a practical starting point to inform our outreach, teaching, and further research; it was not designed to support general statements about students in higher education or at the University of Colorado Boulder.

Findings: We had 75 students respond from all undergraduate levels and a variety of academic majors. We discovered that, while many students understand that government information is information produced by the government, others either did not know what government information was, or they thought that it was information about the government. Further, only a few identified government information as a free, public information source. Most students could not identify what kind of information one might get from the government.

We also found that while 67 students said that they did not use government information in their academic work, most stated that the Government Information library could help them in their academic work. Most notably, these students wanted help with their research papers. Sixty-one students responded that they might or would come to workshops outside of class to help them find government resources for their academic work.

Finally, most of the students said that they get information about the world from news outlets and news sites, but many responded that they also get news from social media. Interestingly, only 16 students named specific sources of information.

Implications: While we found that students generally understand the basic nature of government information, many do not seem to know the types of information the government can provide. This might account for how many students did not use government information in their academic work. It means that when we engage in outreach and teaching, we need to clearly define government information and convey its breadth. Examining the answers more closely, there are hints that students do not know where to find government information or how to evaluate it; these are questions to follow up on. Practically, this small study tells us that students are willing to engage with government information but might not know what it is, how to use it, or how to find it. These three themes require additional research, but in the meantime they help us design more appropriate lesson plans that not only help students find and use government information, but are also relevant to students’ specific concerns.


31. Express Yourself: Using Empathy to Create Better Assessments

Erica England and Erin Hvizdak (Washington State University)

For the last two years, librarians from Washington State University Libraries have been invited into a fall-semester pedagogy course new English graduate students must take before teaching composition, where they create their own lesson plans and assignments for the courses they will teach. Librarians visit the class for one day, where they are placed into small groups with the students to workshop their assignments for the upcoming semester. Librarians use their experience assisting students with library research to provide critical insight into the feasibility of these course assignments, especially regarding availability of sources and the developmental stage of the student. What we find is that graduate students, being far removed from their first year in college, forget the difficulties and roadblocks that a student might experience, especially with little or no prior knowledge of the research process.

In the spring of 2018, one of these librarians held a workshop with students in a capstone course. Students were asked to form small groups and map out the research process while discussing their feelings (excitement, fear, etc.) at each point. This led to fruitful discussion amongst group members about differences in how people approach and feel about research. A faculty member present at this workshop suggested that we give this workshop to instructors as well, as it would help them to take a step back from their teaching and remember that each individual experiences anxieties during the research process.

In fall 2018, WSU librarians will hold the workshop during the English department’s professional development course, which includes both experienced and new graduate student instructors. Immediately after completion of the workshop, we will interview participants to better understand how it impacted their upcoming course and assignment design.

This interactive poster will invite conference attendees to take part in a mini-workshop, contributing to a crowd-sourced drawing of the research process and using emojis to describe their feelings about it. The poster will also present the preliminary data gathered from interviews conducted immediately after the workshop to determine its potential impact on attitudes toward student research and on course and assignment design.

The data from this workshop has the potential to demonstrate how librarians can utilize their expertise on the student research process to contribute to the development of a culture of empathy on campus. This empathy culture can contribute to the creation of better assessments of student learning, and aid in better understanding the impact of mental health on student abilities.


32. Defining and Refining Information Literacy Dispositions

April Cunningham (Palomar College)

Michelle Dunaway (Wayne State University)

Richard Hannon (MiraCosta College)

The purpose of this study was to first identify core learning dispositions that are necessary for information literacy development. Then the goal was to determine if the dispositions we identified were sufficiently distinct from one another and if they were being effectively measured by the test we created.

For phase one of this study, we conducted a rhetorical analysis of the Framework for Information Literacy to define the core learning dispositions. For phase two of this study, we conducted an exploratory factor analysis to determine whether and how the test items we created to measure each disposition correlated with one another.

We found that underlying the knowledge practices and dispositions in the Framework there are four core IL dispositions: Productive Persistence, Toleration for Ambiguity, Responsibility to Community, and Mindful Self-Reflection. Exploratory factor analysis suggests that these are four distinct constructs and the analysis led us to retain, revise, or remove test items in order to create a test using scenario-based items to measure IL dispositions.
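As a hedged illustration of the phase-two analysis, an exploratory factor analysis of scenario-based items might be run with the Python factor_analyzer package as below. The file name, item columns, rotation choice, and 0.4 loading cutoff are assumptions for this sketch, not details from the study.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer

# Hypothetical item-response matrix: one row per student, one numeric
# column per scenario-based test item.
responses = pd.read_csv("disposition_items.csv")  # placeholder file name

# Fit a four-factor model with an oblique rotation, since dispositions
# like these are plausibly correlated with one another.
fa = FactorAnalyzer(n_factors=4, rotation="oblimin")
fa.fit(responses)

# Items that load weakly on every factor (below 0.4 here, an arbitrary
# cutoff) are candidates for revision or removal.
loadings = pd.DataFrame(fa.loadings_, index=responses.columns)
weak_items = loadings[loadings.abs().max(axis=1) < 0.4]
print(weak_items)
```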

In addition to the test we have created, the value of this work is in building evidence that suggests Productive Persistence, Toleration for Ambiguity, Responsibility to Community, and Mindful Self-Reflection are a meaningful set of dispositions upon which to base instructional interventions and new assessments. This set of core dispositions offers focus that the Framework does not. It is also easy to find alignment between these core dispositions and locally defined general education learning outcomes as well as national projects like the LEAP Essential Learning Outcomes and Degree Qualifications Profile. For these reasons, the core dispositions can facilitate collaborative instruction and assessment among librarians as well as between librarians and our colleagues in other disciplines.


33. Dancing with the Framework: Letting the Frames Lead Us to Learning Outcomes

April Cunningham (Palomar College)

The purpose of this project was to define learning outcomes and performance indicators based on the Framework for Information Literacy (IL).

Librarians and other educators familiar with IL from across the country participated in an iterative process adapted from the Delphi Method to develop IL outcomes and performance indicators. The process started by identifying undergraduates’ common IL behaviors, beliefs, knowledge, and misunderstandings. These were then organized into themes, which were compared with the knowledge practices and dispositions outlined in the Framework. Finally, the themes were revised, constructs were defined, and the experts gave holistic feedback on the list of constructs and outcomes. This poster will also emphasize guidelines for writing performance indicators for higher-order thinking: indicators must be aligned, observable, unambiguous, and not compound.

Through this iterative process of defining and refining outcomes and indicators, we found that the Framework consisted of four constructs: The Value of Information, Research & Scholarship, Evaluation of Process & Authority, and Strategic Searching. Two outcomes were identified for each of the constructs. Finally, between five and 14 performance indicators were chosen from among the dozens that we wrote for each outcome. Then a test item was written for each indicator.

The outcomes and performance indicators created through this process are now available for reuse, revision, and remixing under a Creative Commons license. The process we used can also be replicated at individual institutions and among institutions that want to collaborate on instruction and assessment.


34. Indicators for Information Literacy Performance in College Students?

Abby Currier and Sara Lowe (IUPUI)

This poster will highlight the results of a yearlong mixed methods research project assessing students’ Information Literacy competencies at a large, public university with a diverse student population. The project is examining first-year and senior student responses to the National Survey of Student Engagement (NSSE) Information Literacy survey (n=630) as well as Information Literacy (IL) rubric scores for their final research products (n=750). The project seeks to determine if there is any correlation between NSSE responses and IL rubric scores. Importantly, the library is collaborating with the Office of Institutional Research for more robust data analysis to explore whether there are any indicators (for example, type of high school attended or first-generation student status) that correlate with Information Literacy performance in first-year students. Paper reading ends in May 2018, with data analysis completed by August 2018. For the institution, study results have the potential to inform not only library IL instruction and scaffolding through subject librarians’ disciplinary curricula but also the university’s general education curriculum. The results of this project will contribute to the larger body of research on college students’ Information Literacy competencies. Although this was a large-scale project, the methodology can be scaled and adapted to inform IL programs at academic libraries of any size, or even by an individual liaison librarian.
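As an illustration only, the correlation question at the heart of the project could be checked with a few lines of Python. The file and column names are hypothetical, and Spearman’s rank correlation is assumed here because survey responses and rubric levels are ordinal; the abstract does not specify which statistic the team will use.

```python
import pandas as pd
from scipy import stats

# Hypothetical merged dataset: one row per student, pairing an NSSE
# information literacy response score with an IL rubric score for the
# student's final research product.
df = pd.read_csv("nsse_il_scores.csv")  # placeholder file name

# Spearman's rho is a reasonable default for ordinal data such as
# survey responses and rubric levels.
rho, p = stats.spearmanr(df["nsse_il_score"], df["rubric_score"])
print(f"Spearman rho = {rho:.2f} (p = {p:.4f})")
```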


35. Critical Information Literacy & Outcomes Assessment: Mutually Supportive, Not Mutually Exclusive

Joshua Hughey and Megan Oakleaf (Syracuse University)

Purpose: Learning outcomes enable effective, engaging, and empowering library instruction in several ways. Learning outcomes help librarians articulate the purpose of an instructional episode, communicate it to students, and tie it to larger library instruction, academic program, or institutional learning goals. Outcomes enable students to decipher instructor intent, overcome barriers in decoding unfamiliar contexts, apply content to a need, problem, or task at hand, and transfer learning to new situations. Outcomes align with empowering instructional design methodologies, including Understanding by Design and Universal Design for Learning, and power assessments that help both students and librarians to leverage metacognition and reflective practice in their ongoing journeys of self-actualization and lifelong learning. Outcomes also enable librarians to capture student development within specific instructional contexts and over time; and, armed with outcomes assessment information, librarians can make informed decisions about how best to teach in the moment and in the future, as well as connect their individual instructional efforts to broader programmatic and institutional initiatives.

In recent years, librarians have integrated critical information literacy and critical pedagogy into their instruction. As they embraced critical approaches, some librarians found learning outcomes and outcomes assessment to be at odds with their instructional content and methods. Some observed that critical pedagogy outcomes are often realized after the student has left an instructional episode or completed their program or education entirely. Critical concepts presented in the ACRL Framework may not be realized until connections are made beyond the classroom. These observations have merit, but do not exclude critical information literacy instruction from the benefits of intentional practice and measurable outcomes. Furthermore, there is little reason for assessment and critical pedagogy to be mutually exclusive. Assessment and critical information literacy have the potential to be mutually supportive in ways that have yet to be fully realized. To enable librarians to envision this mutualistic symbiosis, the authors are investigating cognate fields for critical information literacy and their learning outcomes.

Approach: The authors have selected a purposeful sample of materials related to the teaching of critical information literacy cognate fields, including: critical action, critical reflection, critical consciousness, cultural competence, empowerment, political efficacy, privilege and oppression, and social justice. They will undertake a limited content analysis of instruction materials related to those fields, particularly within a higher education context, and glean a list of learning outcomes that illustrate ways in which critical information literacy and its cognate fields can make productive use of learning outcomes and outcome assessment.

Findings: This analysis is the subject of an independent study undertaken for credit at the iSchool at Syracuse University. Currently, the purposeful sample is being identified and analysis will be completed over the summer months. Therefore, at the time of this proposal, pilot work has been completed, but the findings of the full study are not available.

Value: The results of this study will provide a “proof of concept” for the idea that critical information literacy and learning outcomes (and assessment) are mutually supportive, not mutually exclusive.


36. Integrated Information Literacy Assessment: Implications for Efficiency, Faculty Engagement, and Closing the Loop

Sarah Dahlen (California State University, Monterey Bay)

What happens when assessment of information literacy is combined with that of other intellectual skills: critical thinking, written communication, oral communication, and quantitative reasoning? Efficiencies are created, the breadth of data collected increases, and faculty are engaged. This poster will describe a method of campus-wide assessment that has accomplished all of this and raised the profile of information literacy on campus.

Our campus conducts periodic assessment of information literacy, and our methods have evolved over time to include measures of information literacy in the assessments of each of our other intellectual skills. The objective of our assessment is to gauge the proficiency of our students who are nearing graduation in five intellectual skills: information literacy, critical thinking, written communication, oral communication, and quantitative reasoning. We do this by collecting student capstones from across campus, grouping them by format (papers, oral presentations, projects using quantitative data), and employing faculty assessment scholars to score them using integrated rubrics. Each of the integrated rubrics includes two to three dimensions of information literacy, meaning that information literacy is assessed for the entire sample of student artifacts. The result is a broad collection of information literacy data from multiple departments and assignment types.

Faculty assessment scholars are drawn from a variety of departments across campus and are trained to apply the rubric to student work across disciplines. Reflection plays an important role in our assessment process, as faculty assessment scholars have designated times to reflect on their evolving understanding of information literacy and other skills, ways in which the rubrics might be modified, and strategies for improving teaching and learning in their classes. Because of the integration of information literacy with all of the other intellectual skills, faculty who are drawn to be assessment scholars due to interest in any of the skills are exposed to information literacy concepts and the ways in which they manifest in student work.

Our most recent assessment, conducted in 2017, revealed room for improvement in student information literacy skills: the proportion of students achieving proficiency on the various information literacy dimensions ranged from 49% to 65%. The assessment uncovered specific areas for librarians and other faculty to focus on in “closing the loop” and making improvements to teaching and learning. Equally revealing is the number of assignments in which the dimensions of information literacy were absent and could not be scored; these absences have possible implications for faculty expectations of students at the capstone level. Campus-wide improvements to the teaching and learning of information literacy cannot be effectively enacted by librarians alone, so the involvement of other faculty in the assessment process and in closing the loop is key to the potential of our method for effecting change.


37. Student Self-Reports on Their Library Research Comfort Levels: Using the Data to Improve Library Instruction

Donna Harp Ziegenfuss (University of Utah, J. Willard Marriott Library)

The library instruction literature contends that students rely heavily on Google and other public web-based resources to conduct research (Head and Eisenberg, 2009). However, when students begin a library research assignment, they realize it is not as easy as they thought. This researcher has witnessed such student challenges first hand. In conversation, students also report that library research and learning information literacy skills are overwhelming, and that they do not feel comfortable doing academic research.

Therefore, the purpose of this exploratory mixed methods research study was to collect data in a variety of library sessions to uncover students’ comfort levels with doing research. Over a two-year period, 889 paper pre- and post-surveys were collected and analyzed from 32 courses. Pre-surveys were collected before the library instructional session and post-surveys were collected at the end of the semester. Both surveys contained the same eight quantitative Likert-scale questions (1 being not comfortable; 5 being very comfortable), as well as two open-ended qualitative questions in the pre-survey and one open-ended question in the post-survey. Mean Likert scores for each of the eight survey questions were analyzed in SPSS using t-tests. Pre- and post-survey mean scores for each of the eight questions were compared, and five of the eight questions showed statistically significant differences. In addition, 797 unique comments (399 in the pre-survey and 398 in the post-survey) were coded, categorized, and analyzed using qualitative methods (Strauss and Corbin, 1990). The comments fell into five categories: (1) learning about library research resources, (2) valuing library resources, (3) becoming a more effective/efficient researcher, (4) other library resources, tools, and support, and (5) expressing anxiety and needs.
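For readers who want to reproduce this kind of pre/post comparison outside SPSS, a minimal Python sketch follows. The file and column names are hypothetical, and an independent-samples t-test is assumed because the paper surveys were not matched by student.

```python
import pandas as pd
from scipy import stats

# Hypothetical survey data: one row per response, with a "phase" column
# ("pre" or "post") and eight Likert items q1..q8 scored 1-5.
surveys = pd.read_csv("comfort_surveys.csv")  # placeholder file name
pre = surveys[surveys["phase"] == "pre"]
post = surveys[surveys["phase"] == "post"]

# Compare pre vs. post means for each of the eight questions; an asterisk
# flags differences significant at the 0.05 level.
for q in [f"q{i}" for i in range(1, 9)]:
    t, p = stats.ttest_ind(pre[q].dropna(), post[q].dropna())
    flag = " *" if p < 0.05 else ""
    print(f"{q}: pre={pre[q].mean():.2f} post={post[q].mean():.2f} p={p:.4f}{flag}")
```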

Areas where students reported low research comfort levels, together with the comment categories, were then used to redesign and improve library instructional sessions. The process this study used to uncover student concerns could be applied at other institutions trying to uncover the concerns of their own unique student populations. The themes from the qualitative findings have been used to create a framework for redesigning not only library research sessions but also consultations and workshops for more advanced library research. This approach to instruction, identifying areas for improvement before designing sessions, may lead to a more sustainable and effective way to think about designing library instruction. Sharing the findings with faculty partners to encourage discussion and collaborative problem solving was the most valuable aspect of this research study. This project resulted in faculty requesting more focused library sessions and flipped sessions instead of just general traditional library sessions. Sharing research data has also resulted in new partnership projects.

Head, A. J., & Eisenberg, M. B. (2009, December 1). Lessons learned: How college students seek information in the digital age. Project Information Literacy Progress Report (v. 2). Available at: http://projectinfolit.org/pdfs/PIL_Fall2009_Year1Report_12_2009.pdf

Strauss, A., & Corbin, J. M. (1990). Basics of qualitative research: Grounded theory procedures and techniques. Thousand Oaks, CA, US: Sage Publications, Inc.


38. Librarian Teaching and Learning Assessment by Design! (Thinking)

Deepa Banerjee, Jackie Belanger, Conor Casey, Amanda Hornby, Leslie Hurst, and Anna Nakano-Baker (University of Washington Libraries)

The Assessment Committee of the University of Washington (UW) Libraries Teaching and Learning Group is using a design thinking approach to improve student learning assessment support for librarians. The committee seeks to develop services and resources that help library staff engaged in classroom teaching learn about and integrate student learning assessment strategies into their teaching practice. This poster will highlight key takeaways from the design thinking project, including lessons learned about using the design thinking approach for internal improvements to teaching and learning support, as well as the resources we developed as a result of the process.

The UW Libraries has run three user-focused design thinking projects since 2015, and the recent ACRL publication “Keeping Up With… Design Thinking” points to the growing interest in using design thinking to improve library users’ experiences. There are, however, fewer examples in the LIS literature of using design thinking to improve internal organizational processes and growth. In our current project, the design thinking approach treats our library colleagues as the users of the services and resources offered by the Assessment Committee. Using the design thinking process to surface librarians’ student learning assessment needs not only ensured that we would develop robust and relevant resources to support them, but also supported the ongoing improvement of our instruction program through a better understanding of student learning. The project also aims to build staff awareness of the potential value of the design thinking method in support of internal needs and the needs of students and faculty. At a large, tri-campus institution, it can be challenging to understand the varying needs of librarians engaged in teaching, and the Assessment Committee took this approach in order to experiment with a more agile process for understanding those needs in creative ways. Design thinking’s emphasis on building empathy and seeing the world from the user’s perspective enabled the team to explore in more nuanced ways the barriers and opportunities for librarians interested in conducting student learning assessment.

The Assessment Committee recently completed its initial round of interviews with librarians to surface their student learning assessment questions, experiences, and challenges. Rather than asking about specific sources of support librarians might want, the interviews focused on broader questions about what librarians wish they knew about student learning and their goals for students in their instruction sessions. Based on this input, the committee will prototype services and resources and gather feedback from librarians on those ideas in order to determine which will best meet their needs. We aim to conclude that portion of the process this summer and to plan, and possibly deploy, the student learning assessment support services and resources over the fall.

Attendees of this poster will learn: why and how we undertook the design thinking process; how we identified student learning assessment support needs and the related services we plan to offer librarians; and how each of these can apply in their libraries.


39. Creating Sustainable Research Consultation Assessment Using Multiple Methods

Ashlynn Kogut and Pauline Melgoza (Texas A&M University)

Texas A&M Subject Librarians use a research consultation setting to teach engineering student teams information literacy related to the team’s professional project. Seven subject librarians provide consultations for over 40 teams each semester, and the number of students enrolled in the course is growing. The research consultations with teams have been occurring for over ten years, but no formal assessment has been conducted. The purpose of this study is to determine the best method for librarians to use on an ongoing basis to assess the effectiveness of the research consultations with the teams.

In order to determine which assessment technique would be sustainable in the long-term, we plan to use four different data collection methods over two semesters. We collected data during the spring 2018 semester and will collect data again in fall 2018. The first assessment method is a one-minute paper given to each team after the research consultation to assess the immediate impact of the instruction. Second, after each project deliverable, one to three teams will be interviewed to determine when students have information needs that can be addressed by the library. Third, during the last class session, a questionnaire will be distributed to both students who met with a librarian as well as those who did not meet with a librarian. The questionnaire for students who met with a librarian will focus on what information students used from the consultation and what information students will use in the future. The questionnaire for students who did not meet with a librarian will ask about finding information for the project. Fourth, after the project is completed, we will use focus groups to solicit feedback on the instruction, the library guides, and the timeliness of research consultation.

Our findings will be presented in the form of lessons learned and best practices for each of the four assessment methods: one-minute papers, focus groups, interviews, and questionnaires. So far, we have found that the in-class questionnaire method offers the best opportunity to reach both students who met with a librarian and those who did not, if the instructor continues to provide class time for the assessment. Other lessons learned and best practices include the questions to ask on the one-minute papers, strategies for focus group recruitment, and students’ willingness to meet multiple times with a librarian.

The significance of this study is the comparison between multiple data collection methods to determine the best method for continued assessment of the research consultations. Sustainable assessment requires an understanding of one’s institutional context. Our findings can assist other libraries in balancing time and resources to collect data for long-term research consultation assessment.


40. Shared Outcomes, Shared Practice: Evaluating an Instruction Program with One Assessment Technique

Meghan Wanucha (East Carolina University)

Purpose: When it comes to instruction program assessment, we often define “program” as those common curricular experiences that students encounter during their academic careers, such as first-year seminars, composition courses, or general education requirements. Discipline-specific library instruction and assessment often fall to subject liaison librarians in different library departments, or are housed outside the library altogether. At East Carolina University, the Research and Instructional Services department teaches instruction sessions encompassing this entire range of student experiences, from introductory composition to graduate-level classes. While our composition classes were relatively uniform and straightforward to assess as a “program,” would it be practical to expand our definition and assess student learning throughout this range of instruction with the same assessment technique? This poster will describe our attempt to assess student learning across an instructional program based on shared learning outcomes and a shared assessment instrument.

Design/Methodology: The library instruction team recorded the learning outcomes taught and assessed across all library instruction sessions during fall semester 2017. At the end of the semester, the coordinator of instructional assessment analyzed the data and discovered two outcomes that were taught and assessed most often. These outcomes were used to design a new summative assessment instrument that could be used in any library instruction class and flexible enough to accommodate assignment-specific instruction scenarios. Piloted in spring semester 2018, the instrument was tweaked and is being used again in fall 2018 instruction sessions.

The shared assessment includes closed- and open-ended questions that ask students to demonstrate learning taught in the session. Open-ended questions will be coded for common themes related to dispositions from the ACRL Framework for Information Literacy in order to determine whether our instructional program results in learning at intended beginning, developing, and advanced levels.

Findings: This poster will present the results gleaned from the shared summative assessment, including insights into the teaching practices of individual library instructors, what research experience our students have, and whether our assumptions about student progression align with our teaching program.

Implications: Assessing student learning is about discovering evidence of whether a student has achieved a specific learning outcome. If instruction sessions for a variety of levels and contexts share a similar learning outcome, it may be possible to assess students’ learning in those sessions with a similar tool. This poster presents one library’s attempt to use a shared instrument to assess student learning across an instruction program encompassing introductory composition through upper-division classes.


41. Over the Threshold? Analyzing Students’ Reflections on Their Information Literacy Growth through the Lens of the Standards Versus the Framework

Rachel Hamelers (Muhlenberg College)

Jennifer Jarson (Penn State University-Lehigh Valley)

Many libraries offer information literacy award programs to celebrate their students’ accomplishments in research. Often, these awards recognize students’ achievements as demonstrated through their final products like papers, posters, and presentations and the subject knowledge apparent therein. Our library’s award program instead asks students to reflect on their research processes and information literacy development as they worked toward these final products. These reflections make the process of their research and the steps toward the creation of their final products visible. Their applications offer a unique opportunity to assess students’ information literacy understanding and growth in their information literacy attitudes and practices as detailed in their own words.

Our award application asks students to describe how their research paths and/or processes have enhanced their information literacy skills and knowledge through the lens of one or more core information literacy concepts. During the award’s first four years (2011–2014), we asked students to reflect on their work through the lens of the standards outlined in the Association of College and Research Libraries’ (ACRL) Information Literacy Competency Standards for Higher Education. We revised our award application in 2015 after ACRL introduced the Framework for Information Literacy for Higher Education. The Framework offers a different approach, focusing on six core “threshold concepts” that are “passageways or portals to enlarged understanding or ways of thinking and practicing,” rather than on standards, performance indicators, and learning outcomes. The revised award application asks students to consider their processes and growth through the lens of the new, more theoretical language of the Framework. The shift from the Standards to the Framework has been controversial for some in the field of librarianship. Discussion in the library community has centered particularly on the utility of the Framework’s core concepts in teaching and learning and the comprehensibility and relatability of the Framework language for non-librarian stakeholders such as faculty and students.

This poster presents our examination of how students described their engagement with information and their research processes, as well as how they understood, reflected on, and developed information literacy. Moreover, students’ applications offer us access to their interpretations and use of the language of the Standards and the Framework. Our findings also include analysis of which concepts students chose to write about most frequently and how students translated and applied those concepts. Altogether, these findings inform our practice as information literacy educators by offering insight into students’ perspectives on the information ecosystem, their information literacy development, and their identities as learners and researchers. These findings also contribute to the discussion regarding the utility of the Standards and the Framework as pedagogical tools.


42. One-Minute (ir)Relevance? Using Feminist-Influenced Pedagogy to Assess Evidence of Student Learning through Library Instruction

Rebecca Blunk (College of Southern Nevada)

This poster will demonstrate how the implementation of feminist pedagogy in information literacy (IL) instruction, through the use of the “one-minute paper” (OMP), engages undergraduate students in assessment activities that demonstrate both experiential knowledge and learned comprehension at the College of Southern Nevada (CSN), Las Vegas. OMPs, frequently used as classroom assessment techniques (CATs) in library instruction, are aimed at gauging whether, and to what extent, students grasp IL concepts commonly presented over the course of a one-shot (50–70 minute) session. Typical implementations of the OMP in library instruction invoke a Plus/Delta or “muddiest point” approach, whereby students are prompted to consider their newfound comprehension of information-seeking processes and functions, as well as areas of knowledge deficit in which further clarification is needed. Librarians at CSN have introduced an alternative approach: asking students to name the example or illustration cited in the library session that they most related to. This poster will exhibit how, through this inclusion of feminist pedagogy, collecting students’ perceived personal connections to elements of instruction in the library classroom provides relevant assessment data about conceptual comprehension of IL principles and demonstrated knowledge practices.

Purpose: The purpose of this poster is two-fold, in that it addresses both strategic approaches to learning and teaching practices and issues of social justice, equity, and inclusion in higher education. Using qualitative methods of inquiry such as the OMP both gives students a more inclusive opportunity to demonstrate their engagement with IL concepts in the library classroom and provides a means of collecting student perspectives in order to improve both content and delivery strategies for teaching (Cobus-Kuo & Waller, 2016, p. 40). Additionally, this poster aims to illuminate the need to incorporate feminist pedagogy into teaching practices in order to subvert hegemonic discourses that continue to exist in a higher education system dominated by a patriarchal culture of sexism, racism, and homophobia (Accardi, 2013, p. 24).

Design/Methodology/Approach: The case study to be presented in this poster draws on critical and feminist post-structural concepts of pedagogy that are epistemologically expressed in the interpretivist tradition and emphasizes issues of power through subjectivity, equality, reciprocity, collaboration, non-hierarchical relations, and emancipation (Bryman, 2012; Cohen, Manion, & Morrison, 2017). Aligned with feminist theory that challenges disciplinary structures and power/knowledge associations within the disciplines, the study presented will also consider a poststructuralist contextualization of power that is dispersed, rather than situated in patriarchy or economy (Blackmore, 2013, p.181–186).

The one-shot instruction sessions discussed in this poster were held over the Fall semester and constituted a total cohort of approximately 400 first-year undergraduate students enrolled in 100-level courses at CSN. All academic faculty requesting instruction sessions were prompted to choose from seven prescribed learning outcomes, developed by librarians, prior to the confirmed instruction date so that the teaching librarian could customize lesson plans and create optimal learning activities. Instruction sessions throughout the semester varied in length between 30 and 120 minutes, averaging 60 minutes. In an attempt to further incorporate feminist pedagogical practices, consciousness-raising through deliberate selection of search strings and example keywords was applied, along with student-centered practice and counter-structure modes of teaching that validate the importance of experiential learning, as demonstrated through individual discussion of research/essay/speech topics (Shor, 1987). At the conclusion of each instruction session, regardless of course, section, or assignment prompt, students were given five minutes and a piece of paper and prompted to answer the following question in one minute: “What example or illustration cited in today’s class could you relate to the most?” Papers were collected as students departed the classroom.

Findings: The findings of the study presented in the poster will illustrate how incorporation of feminist pedagogy in library instruction is a valuable, yet underused element of learning assessment with regard to IL comprehension and applied practice. In addition, the findings of the presented study will demonstrate the varied thematic nature in which students relate and connect to learning activities associated with information-seeking behavior.

Practical Applications: The discussion of incorporating feminist pedagogical approaches to library instruction assessment will simultaneously benefit students and instructors participating in higher education, and specifically those attending or teaching at minority-serving institutions (MSIs), where applications of transformative education are necessary to challenge systematic structures of power and privilege. In addition to amplifying the voices of students from marginalized identities, participation in this kind of critical pedagogy will prompt library instructors to make connections with students on a more effective level “for the purpose of achieving the larger goal of liberation and the radical transformation of society” (Beilin, 2016, p. 17).

As a culture of assessment continues to grow in academic libraries, radical forms of inquiry will need to show that they provide a tangible, measurable impact. Because little has been published on feminist pedagogical applications in library instruction, the study presented in this poster will serve to widen the scope of, and demand for, this particular exploration of critical theory in practice. In conclusion, this poster seeks to triangulate critical pedagogy, information literacy comprehension, and library assessment for the purpose of sharing best practices that are divergent, inclusive, and equitable.

Accardi, M. (2013). Feminist pedagogy for library instruction. Sacramento, CA: Library Juice Press.

Beilin, I. (2016). How unplanned events can sharpen the critical focus in information literacy instruction. In N. Pagowsky & K. McElroy (Eds.), Critical library pedagogy: Essays and workbook activities (Vol. 1, pp. 17–24). Chicago, IL: Association of College and Research Libraries.

Blackmore, J. (2013). Within/against: Feminist theory as praxis in higher education research. In M. Tight & J. Huisman (Eds.), Theory and method in higher education research (pp. 175–198). Retrieved from http://bit.ly/2DEAMUL

Bryman, A. (2012). Social research methods (4th ed.). Oxford: Oxford University Press.

Cobus-Kuo, L., & Waller, J. (2016). Teaching information literacy and evidence-based practice in an undergraduate speech-language pathology program: A student reflection. Contemporary Issues in Communication Science & Disorders, 43, 35–49.

Cohen, L., Manion, L., & Morrison, K. (2017). Research methods in education. London: Taylor and Francis.

hooks, b. (1994). Teaching to transgress: Education as the practice of freedom. New York: Routledge.

Shor, I. (1987). Critical teaching and everyday life. Chicago, IL: University of Chicago Press.


43. No Cat Herding Needed: Collaborative Outcomes-Based Assessment in the Library

Robin Ewing (St. Cloud State University)

Librarians at St. Cloud State University (SCSU) developed a three-credit course that combines critical thinking and information literacy. This course, Critical Thinking in Academic Research, is offered as part of the critical thinking goal area of the university’s Liberal Education Program (LEP). In this course, students examine and evaluate critical reasoning in scholarly research, the construction of arguments, and the management of their own academic research. We offer the course in a variety of formats. We’ve had sections paired with English composition courses, sections offered as a part of the Honors Program, and we regularly offer an online section as well.

Like other institutions, St. Cloud State University requires various levels of assessment. At the course level, instructors are expected to assess student learning and make adjustments. All campus programs, curricular and co-curricular, must have an assessment plan that includes a mission statement, student learning outcomes, and a timeline for assessing the outcomes. Additionally, departments with courses in the Liberal Education Program are required to assess those courses according to the student learning outcomes in the LEP.

To assess whether or not the course outcomes are achieved and to identify areas for improvement, we determined the most efficient course of action was to develop a final project that addresses outcomes from the course and program level. Fortunately, we have the environment of trust needed to establish common learning and teaching goals as well as assessment measures. A key component is that each librarian still has the academic freedom to select the pedagogy appropriate for their section of the course. We view assessment as a long-term method for continuous improvement of student learning. While we each use different assignments and activities in our courses, the common final project helps us to measure student learning across the multiple sections of the course.

In this poster, I’ll describe how we developed the student learning outcomes for this course and how they connect to the campus Liberal Education Program and institutional learning outcomes. I will explain how we collaboratively developed the final project through many lively discussions. We are not shy about expressing our opinions on what works best for students. I will also describe the pilot test we completed in the Fall 2017 semester and how we fine-tuned the assessment based on the pilot. Finally, I will share the results of the project to date.



Poster Session 2

Collections

44. Building the Complete (Holistic) Collections Assessment Manual

Madeline Kelly (Western Washington University)

Assessment is integral to building, managing, and justifying library collections. Unfortunately, collections assessment can be a daunting undertaking, and not every library has time for an extensive literature review prior to embarking on an assessment project—let alone time to synthesize, prioritize, and apply what has been read. With that in mind, I am setting out to create a one-stop shop for collections assessment: an in-depth manual that not only guides readers step by step through major assessment methodologies, but also provides concrete guidance on how to contextualize those methodologies within a broader holistic assessment framework. Under contract with ALA Editions, this manual will bridge the divide between big picture and detail—the why and the how-to—in a nuanced and flexible way, delving into both the theory of how to make assessment choices and what those choices can actually be in a variety of contexts.

This poster session is an opportunity for feedback from the assessment community, to ensure that the final manual is as helpful as possible to librarians. The poster focuses especially on the book’s many templates, worksheets, and checklists (which will be published under a CC-BY-NC license). It also focuses on the manual’s structure, which combines the high-level thinking of a more theoretical work with the granular practicality of a cookbook-style assessment guide. Ultimately, the manual is intended to fill a real support need among collections assessment librarians, and this poster is a chance to revise, focus, and strengthen what that support looks like.


45. Visualizing Institutional Repository Value

Sarah Fitzgerald and Zhehan Jiang (University of Alabama)

The purpose of this project was to design visualizations of institutional repository content and usage that attract scholarly interest. Open access is important for making scholarship accessible outside of elite research universities. However, faculty participation in institutional repositories remains low (Kim, 2010). As Tennant et al. (2016) point out, open access has two primary benefits: it increases the impact of scholarly articles, and it allows researchers to mine the scholarly literature. This project allows libraries to market their repositories to scholars by showcasing both of these benefits.

To visualize institutional repository usage, we created an interactive map displaying global downloads of content from the repository. This shows scholars the international reach of making their work open access. It is similar to a map created for Oregon State University (Zhang and Lopez, 2017).

To visualize institutional repository content, we created an interactive word cloud that lets scholars take a text mining approach to seeing differences in dissertation and thesis content across disciplines and years of publication. The map visualization uses Tableau software to display Google Analytics data. The text mining tool uses the R programming language, the Shiny web application package, and the tm text mining package. An important step in creating this tool was identifying stop words to exclude from the word clouds, such as “thesis”, “research”, “dissertation”, “study”, and “data”.
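
The stop-word step lends itself to a short illustration. The authors’ tool is written in R with the tm and Shiny packages; the sketch below is only a Python analogue of the general idea (excluding domain-specific stop words before weighting terms for a word cloud), and its function name and sample text are invented.

    # Python analogue of domain-specific stop-word filtering; the authors'
    # actual implementation uses R's tm package.
    from collections import Counter
    import re

    DOMAIN_STOP_WORDS = {"thesis", "research", "dissertation", "study", "data"}

    def term_frequencies(texts, stop_words=DOMAIN_STOP_WORDS):
        counts = Counter()
        for text in texts:
            for token in re.findall(r"[a-z]+", text.lower()):
                if token not in stop_words and len(token) > 2:
                    counts[token] += 1
        return counts

    weights = term_frequencies(["A study of groundwater data in coastal Alabama"])
    print(weights.most_common(10))  # pass these weights to any word-cloud renderer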

Both visualizations created for this project can be viewed and interacted with at https://ir.ua.edu/. Tableau was a useful tool for building maps to display data patterns, but its interactivity is limited in practice because users are unfamiliar with its navigation controls. The cost of Tableau licenses and of hosting R applications on a Shiny server is another drawback to these approaches to visualizing institutional repository use and content.

Libraries can follow these methods to create visualizations and discovery tools to encourage interest in their institutional repositories and highlight the benefits of making scholarship open access. Though previous research demonstrates that open access articles are downloaded more than toll access articles (Wang, Liu, Mao, & Fang, 2015), there remains a gap in the literature describing strategies to motivate scholars to make their work open access. This project begins to fill that gap.

Kim, J. (2010). Faculty self-archiving: Motivations and barriers. Journal of the American Society for Information Science and Technology, 61(9), 1909–1922. DOI: 10.1002/asi.21336

Tennant, J. P., Waldner, F., Jacques, D. C., Masuzzo, P., Collister, L. B., & Hartgerink, C. H. (2016). The academic, economic and societal impacts of Open Access: An evidence-based review. F1000Research, 5. Retrieved from https://f1000research.com/articles/5-632/v3

Wang, X., Liu, C., Mao, W., & Fang, Z. (2015). The open access advantage considering citation, article usage and social media attention. Scientometrics, 103(2), 555–564.

Zhang, H., & Lopez, C. (2017). An interactive map for showcasing repository impacts. The Code4lib Journal, 36. Retrieved from http://journal.code4lib.org/articles/12349


46. Developing a Method to Examine the Impact of Library Electronic Resources on Research Output for a Medium-sized Academic University in the UAE

Lilly Hoi Sze Ho (Zayed University Library and Learning Commons)

Library collections at Middle Eastern universities have usually prioritized teaching over research resources. Zayed University is now transitioning toward research, and its Library and Learning Commons has just completed year one of a collection assessment aimed at moving from a teaching-based to a research-based collection. Technical Services, the central hub for electronic resource acquisitions and usage statistics, supports this direction in collection development. The purpose of this study is to provide an overview of how the adequacy of electronic resources is measured, and thereby to determine the correlation between the level of resource utilization and the growth of research activities in the university, ensuring the best use of funding. Throughout the study, a systematic approach is developed to address the impact of adequate electronic resources and to demonstrate the Library’s value with respect to research output.

Design/Methodology/Approach: The methodology measures the adequacy of electronic resources and determines the correlation between resource utilization and the growth of research activities. Article publications affiliated with Zayed University from 2009–2016 were collected from Scopus and their sources analyzed. COUNTER usage statistics for three indicators (searches, sessions, and full-text downloads) were collected from the library’s most heavily used electronic resources to define a usage baseline for calculating the correlation between usage and the research activities of campus researchers.
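
The correlation step itself is straightforward; below is a minimal sketch using SciPy, with invented annual figures standing in for the COUNTER and Scopus counts described above.

    # Hypothetical annual totals; the real study pairs COUNTER indicators
    # with Scopus-indexed publication counts for 2009-2016.
    from scipy.stats import pearsonr

    downloads    = [51000, 64000, 70000, 83000, 90000, 104000, 118000, 131000]
    publications = [120, 150, 170, 210, 240, 260, 300, 330]

    r, p = pearsonr(downloads, publications)
    print(f"Pearson r = {r:.2f} (p = {p:.4f})")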

Findings: The results serve as a reference to drive improvement in future collection development. Most importantly, they provide evidence-based information to Library management for strategic planning and a tactful way to present the Library’s impact on research output to senior university administrators.

Practical Implications: The study is an attempt to establish the correlation between the usage of Library electronic resources and its direct impact on research activities in the university.


47. Assessing the Texas Data Repository: Determining What to Measure and How

Christina Chan-Park (Baylor University)

Anna J. Dabrowski (Texas A&M University)

Nerissa Lindsey (Texas A&M International University)

Laura Waugh (Texas State University)

The Texas Data Repository (TDR) was launched in Spring 2017. The TDR is built on the Dataverse software platform and hosted by the Texas Digital Library (TDL)—a consortium of higher education institutions in Texas. Currently, 11 institutions participate in the TDR, and liaisons from these institutions serve on a TDR Steering Committee to provide feedback and guide the direction of the repository service. The Assessment Working Group (AWG) is a subgroup of the Steering Committee and is tasked with evaluating progress of the TDR.

Purpose: In Fall 2017, the AWG began an assessment to identify the needs for reporting on the TDR by addressing the following research questions:

  1. Which usage and descriptive information about the TDR will be most valuable?
  2. What process for gathering and distributing these metrics/information will be most useful?

Approach: The first step in determining the most valuable usage metrics was distributing a survey to all TDR institutional liaisons. The survey was vital in identifying the widely varying resources and needs of the participating institutions as well as the information the liaisons were interested in seeing both institutionally and consortially.

The AWG also decided that the most valuable usage metrics should allow for comparative assessment across repositories globally. To that end, the group is currently aggregating descriptions of metrics recommended by five recent publications which suggest best practices for tracking the impact of research data, including the Make Data Count project’s “Code of practice for research data usage metrics.” These sources will be combined with the results of the survey to determine a prioritized list of metrics.

Findings: In the survey, the following five metrics were most frequently reported as desirable: dataset download counts; unique page visitors; size of datasets (MB); size of collections (MB); and number of files within datasets. Participants were also interested in descriptive information including: institutional researchers using the TDR; the hierarchy of collections and datasets; a list of links to all collections and datasets; and a list of dataset DOIs. Of these desired metrics, only download counts and unique page visitors are included in the usage metrics recommended by the best practices reviewed; the others are descriptive information.

Practical Implications: Currently, the following tools are available for gathering metrics in the TDR: Google Analytics; the Miniverse add-on for Dataverse; and reports generated from Dataverse logs. Each tool provides different measures and information. By first identifying which use metrics and descriptive information will be most valuable—including their definitions and scopes—the AWG will be able to determine the best tool for each metric and thereby answer research question 2. It is possible that not all metrics will be obtainable or accurate using the tools currently available. However, this will guide the AWG to propose improvements for existing tools and possible integration with new tools. Regularly gathering metrics with an understanding of their value will help to assess the TDR and establish it as a quality data repository adhering to current recommended practices and standards.
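
As a concrete instance of matching a metric to a tool, a per-dataset download count can be tallied from an exported log with a few lines of code. This is a hedged sketch only: the file name and column names are hypothetical stand-ins, since the actual layout of a Dataverse log or report export varies.

    import csv
    from collections import Counter

    downloads = Counter()
    with open("dataverse_download_log.csv", newline="") as f:
        # Hypothetical columns: "event" and "dataset_doi"
        for row in csv.DictReader(f):
            if row["event"] == "download":
                downloads[row["dataset_doi"]] += 1

    for doi, count in downloads.most_common(10):
        print(doi, count)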


48. We’re All In This Together. Using Systems Thinking and Data Visualization to Influence the Ordering Habits of Liaisons

Jamie Hazlitt (Loyola Marymount University)

Liaison work is a secondary role for most of the librarians at the William H. Hannon Library at Loyola Marymount University, and although each librarian takes this responsibility seriously, the task of ordering books often gets put off during the busy fall semester. Although the library’s approval plan keeps current materials across all subject areas coming in at a relatively steady stream throughout the year, over 50% of our books still come in through title-by-title liaison selection. Thus, liaison procrastination historically resulted in a deluge of book orders—often triggered by increasingly insistent reminders from the acquisitions and collection development team—at the end of the fiscal year.

In FY2016, the Collection Development Librarian began using visual data on historical and current ordering patterns to engage liaisons with selection activities throughout the entire academic year, hoping both to even out the number of books across disciplines coming into the library month to month and to relieve the annual March bottleneck in acquisitions, collection development, cataloging, and collection management caused by liaisons expending their funds at the last minute.

Attendees engaging with this poster will learn about the tools used to educate and inform liaisons about the often “invisible” work that takes place after they place an order, the system of intermediate deadlines established and data-sharing with liaisons, and the results of this effort across three calendar years. They will leave with ideas about how to connect liaisons at their own institutions with the often invisible work of acquisitions through training, communication, and compelling data-sharing.


49. “Greatest Hits” of the Circulating Collection: Subject-Based Retrospective Analysis of Circulation Data to Inform Local Collection Priorities

Louis Becker (University of Tennessee Knoxville)

Purpose: Improvements in virtual access to every form of information have meant that print monograph usage has declined precipitously in recent years. Nevertheless, most libraries continue to loan, and to purchase, print books. As we increasingly focus on sharing collections to preserve and deliver specialized material, the need to know what patrons are using locally grows in importance. The purpose of this project is to help decision-makers in libraries understand the remaining demand for print monographs and inform decisions for weeding, participation in shared-print and remote storage programs, and future collection development.

In many cases, reports on circulation are presented in terms of LC class or a discipline breakdown derived from call-number classification. (e.g. Rose-Wiles 2013, Hughes 2012, Britten and Webster 1991). The amount of detail presented by the upper levels of LC classification is variable and limited. As discovery services increasingly mediate patrons’ interactions with collections, the value of shelf location in predicting circulation may be declining. This project evaluates subject headings of frequently circulated monographs as detailed indicators of patron interest.

Approach: In the early 1990s, a project at the author’s current institution explored circulation data collected in its first-generation ILS over seven years (1982–1989) to develop the collection in a “demand-based” way. Britten and Webster (1992) sought shared characteristics of the 400 most-circulated titles in the largest LC classes. They examined the subject headings assigned, the age since publication, and, for foreign-literature classes, the language of the item. This analysis pointed out several areas of unexpected patron focus where collections could be improved.

While we no longer have access to the raw data used for this study, the library has collected cumulative circulation counts over more than twenty years. We plan to repeat the 1992 study using this longitudinal data, and also take advantage of detailed circulation data from our current ILS, implemented 3.5 years ago. We will compare the relative frequency of subject, title, and author strings associated with these highly-circulated books in recent discovery service search logs. These comparisons will shed light on whether popular items are being selected for subject applicability or through known-item quests for “classic” titles or recent “bestsellers” of a field.

Findings: The circulation and search log data for this study has been collected through our ILS. Summary data shows that the time to circulate an average volume once has increased from three years in the 1982–1989 data to nearly ten years from 1998 to the present. Since 2015, the typical book might circulate once in twenty-one years, while hundreds of books have circulated ten or more times. Detailed analysis is ongoing and will be completed before the conference.
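
For readers unfamiliar with the metric, “time to circulate an average volume once” is simply the size of the circulating collection divided by annual circulations. A sketch with invented numbers:

    # Hypothetical figures only, to illustrate the metric used above.
    holdings = 2_000_000             # circulating volumes
    circulations_per_year = 95_000   # checkouts per year
    years_to_circulate_once = holdings / circulations_per_year
    print(round(years_to_circulate_once, 1))  # ~21 years per volume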

Value: For librarians considering large-scale retention or remote-storage projects, this study will provide a fresh view of what is actually circulating from a large ‘just-in-case’ collection and will provide valuable information on the connection between subject and monograph use. In addition, we have an unusual opportunity to examine long-term changes in use at a single institution.

Britten, W. A., & Webster, J. D. (1991). Class relationships: circulation data, collection development priorities, and funding for the future. The Bottom Line, 4(1), 8–11. https://doi.org/10.1108/eb025266

Britten, W. A., & Webster, J. D. (1992). Comparing characteristics of highly circulated titles for demand-driven collection development. College & Research Libraries, 53, 239–248. https://crl.acrl.org/index.php/crl/article/view/14715/16161

Hughes, M. (2012). Assessing the collection through use data: an automated collection assessment tool. Collection Management, 37, 110–126. https://doi.org/10.1080/01462679.2012.653777

Rose-Wiles, L. M. (2013). Are print books dead? An investigation of book circulation at a mid-sized academic library. Technical Services Quarterly, 30, 129–152. https://doi.org/10.1080/07317131.2013.759496


50. Using WorldCat API to Identify Unique and Distinctive Materials within Collections

Juleah Swanson and Philip White (University of Colorado Boulder Libraries)

Purpose: “…Libraries want to be known for their distinctive collections, not by some characteristic shared by every other library,” observed Nicolas Barker (2007) in his work documenting the special collections and archives of ARL libraries. A report from Research Libraries UK (2014) also discusses distinctiveness by suggesting that, “sometimes it is the collection itself, not its component parts, which constitutes what is special.”

While libraries may have a sense of what is rare, unique, and distinctive within their special collections, identifying what sets a library apart from its peers within the circulating collection may be undervalued or unknown. Efforts to identify such pockets of distinction have previously involved costly tools (Genoni & Wright, 2011), or labor-intensive work (Barnes, Kelly, and Kerwin, 2010). This research seeks to determine if the WorldCat API can be used as a fast and cost-effective solution to begin to identify pockets of distinctive materials within a library’s collection. For this project, two test cases of the University of Colorado Boulder’s collection were selected for analysis: 1) monographs acquired and added to the circulating collection through donations, and 2) monographs within a specific area of study.

Design/Methodology: Researchers used the WorldCat Search API to assess the distinctiveness of the library’s circulating collection, enabling efficiency and scalability. Items from the library’s collection were selected for analysis, with each item’s OCLC number used as a query parameter. A script written in the Python programming language automated a series of queries against the WorldCat Search API. Each query returned the fifty libraries nearest to the researchers’ institution that held an item, based on its OCLC number. Results were automatically compiled into a CSV file for human review, analysis, and visualization to evaluate the uniqueness of items.
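
The abstract does not reproduce the script, but the query loop could look roughly like the sketch below. It assumes the WorldCat Search API’s library-locations endpoint and a JSON response containing a “library” array; both should be verified against OCLC’s current documentation, and the key and OCLC numbers are placeholders.

    import csv
    import requests

    WSKEY = "YOUR_WSKEY"  # placeholder; keys are issued by OCLC
    BASE = "http://www.worldcat.org/webservices/catalog/content/libraries/{}"

    def nearby_holdings(oclc_number):
        # Assumed endpoint and response shape; check current OCLC docs.
        resp = requests.get(BASE.format(oclc_number),
                            params={"wskey": WSKEY, "format": "json",
                                    "maximumLibraries": 50})
        resp.raise_for_status()
        return len(resp.json().get("library", []))

    with open("holdings.csv", "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["oclc_number", "nearby_holding_libraries"])
        for oclc in ["12345678", "23456789"]:  # hypothetical OCLC numbers
            writer.writerow([oclc, nearby_holdings(oclc)])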

Findings: Ongoing analysis of the results of this study’s test cases leads the authors to conclude that these methods are generalizable and may be applied to a variety of collection assessment scenarios, particularly at a subject-level scale. The methodology does have limitations at the item level, however. Preliminary findings suggest that the researchers’ library possesses both known and newly discovered areas of collection strength by subject and language.

Practical Applications & Value: These new methodological approaches allowed for assessing at a scale—tens of thousands of items at a time—that would be effectively impossible if attempted manually. Furthermore, this study presents a framework for larger-scale analysis of collection uniqueness, and is best used to uncover areas of distinction by subject, language, or other factors. A deeper understanding of collection distinction can better inform how libraries attract and interact with scholars as well as demonstrate value to the university.

Barnes, M., Kelly, R. G., & Kerwin, M. (2010). Lost gems: Identifying rare and unusual monographs in a university’s circulating collection. Library Collections, Acquisitions, and Technical Services, 34(2–3), 57–65.

Cronenwett, P. N., Osborn, K., Streit, S. A., & Barker, N. (2007). Celebrating research: Rare and special collections from the membership of the Association of Research Libraries. Washington, DC: Association of Research Libraries.

Cullingford, A. (2014). Unique and distinctive collections: opportunities for research libraries. C. Peach & M. Mertens (Eds.), Research Libraries UK. London: RLUK. Retrieved May 7, 2018 from: www.rluk.ac.uk/wp-content/uploads/2014/12/RLUK-UDC-Report.pdf

Genoni, P., & Wright, J. (2011). Australia’s national research collection: overlap, uniqueness, and distribution. Australian Academic & Research Libraries, 42(3), 162–178.


51. Assessing the Diversity of the E-Collection of the William H. Hannon Library

Marie R. Kennedy (Loyola Marymount University)

Purpose: The American Library Association’s 1982 statement[1] on Diversity in Collection Development reminds librarians of the professional responsibility “to select and support the access to materials on all subjects that meet, as closely as possible, the needs, interests, and abilities of all persons in the community the library serves. This includes materials that reflect political, economic, religious, social, minority, and sexual issues.” In an effort to ensure that the collection of the William H. Hannon Library at Loyola Marymount University (LMU) in Los Angeles, California, aligns with its institutional vision[2] (including “bridging disciplines” and “representing diverse topics and perspectives”) and meets the research needs of a diverse campus population, a team of library staff has designed a project to assess the library’s electronic collection through the lens of diversity. While some similar studies have been done at larger research institutions (notably that of Ciszek and Young (2010)),[3] this project further interrogates inclusivity in database collections and integrates LMU student learning into the research process. The results of the evaluation will inform the library’s collection strategy and ensure that collections are built that deliberately and positively contribute to an inclusive campus climate.

Methods: Guided by the project team, a select group of library student employees will be working in spring 2018 to use a series of predetermined keyword phrases to search through about two hundred of the LMU library’s databases. The project team will code the keyword phrases into categories of diversity, so that the library can better understand its e-resource collection. Categories like Disability, People of Color, and Gay, Lesbian, Bisexual, and Transgender will be used to determine whether content is well represented in the collection, particularly compared to the vendor-supplied description. In addition to conducting the keyword phrase searches, the student evaluation team will respond to reflective prompts along the way, answering questions like, “Based on the search results, do you consider the database to be ‘diverse’?” and “Would you recommend this database to someone doing research about diversity or inclusion, in your major? Why or why not?”

Findings and Value: The project team is awaiting the completion of the student evaluation, which will determine whether the library collection is diverse or whether there are gaps in the electronic resource collection that can be strategically filled. The process and results of this study will be shared as broadly as possible, along with an outline of subsequent actions to be taken.

[1] https://www.ala.org/Template.cfm?Section=interpretations&Template=/ContentManagement/ContentDisplay.cfm&ContentID=8530

[2] http://library.lmu.edu/aboutthelibrary/libraryvisionmission/

[3] Ciszek, M. P., & Young, C. L. (2010). Diversity collection assessment in large academic libraries. Collection Building, 29(4), 154–161.


52. Assessment of the Value of Ebooks Purchased via Interlibrary Loan at University of Michigan

Hanh Bui (University of Michigan)

This study examines the value of electronic books (e-books) purchased through the University of Michigan’s document delivery e-book acquisition pilot program. The program started in 2014 and has a unique purchasing model that sets it apart from other patron-driven acquisition (PDA) models at the University of Michigan: e-books are purchased by the Document Delivery/Interlibrary Loan Department when patrons request those titles, or at least two chapters from them, via interlibrary loan or document delivery. Specifically, the study analyzes the cost and usage statistics of about 450 e-book titles purchased from January 2014 to the end of July 2017. Statistics on patron demographics (faculty, staff, students), patrons’ fields of study or research, and book discipline will be analyzed together with the usage data using SPSS. The study also compares cost and usage statistics between e-books and their print counterparts. The results and methodology can help libraries better understand how effective and valuable these e-book programs are, and further explore options that best fulfill patrons’ needs.
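
The heart of the cost-and-usage comparison is a cost-per-use calculation. The authors work in SPSS; the sketch below is only a hypothetical Python illustration of that single step, with invented titles and figures.

    # Invented sample records; the study analyzes ~450 purchased titles.
    purchases = [
        {"title": "E-book A", "cost": 120.00, "uses": 48},
        {"title": "E-book B", "cost": 85.00, "uses": 3},
    ]
    for p in purchases:
        p["cost_per_use"] = p["cost"] / max(p["uses"], 1)  # avoid division by zero

    for p in sorted(purchases, key=lambda p: p["cost_per_use"]):
        print(p["title"], round(p["cost_per_use"], 2))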

Methods & Data

53. Selecting Method Combos for Library Assessment: A Critical Review of Triangulation

Giovanna Badia (McGill University)

While the literature is replete with descriptions of how individual services have been evaluated in specific libraries, fewer studies have reported using a combination of methods to assess a particular service or the library as a whole. Similarly, in its 2017 Academic Library Impact report, the Association of College & Research Libraries poses questions about the data triangulation methods that have been employed to show a library’s effect on student achievement. As a new assessment librarian, the author sought to identify which evaluation methods could be successfully paired to collect data that captures as complete a picture as possible of the service(s) being assessed, for the purposes of informing senior decision-makers and revealing service impact. To accomplish this, a search of the published literature was first conducted to find studies that employed triangulation for assessing library services. A critical appraisal of these studies was then completed using the Mixed Methods Appraisal Tool (Pluye et al., 2011) and the Rigor Attribute Model (Patton, 2015; Zelik, Patterson, & Woods, 2007). This poster summarizes the findings of the critical literature review and presents effective method combinations reported for assessing library services, which information professionals will be able to try at their own institutions. In addition, it introduces critical appraisal tools for mixed methods studies and describes practical strategies for triangulating data.

Patton, M. Q. (2015). Qualitative research & evaluation methods: Integrating theory and practice (4th ed.). Thousand Oaks, CA: SAGE Publications.

Pluye, P., Robert, E., Cargo, M., Bartlett, G., O’Cathain, A., Griffiths, F., … Rousseau, M.C. (2011). Proposal: A mixed methods appraisal tool for systematic mixed studies reviews. Retrieved from http://mixedmethodsappraisaltoolpublic.pbworks.com.

Zelik, D. J., Patterson, E. S., & Woods, D. D. (2007, October). Judging sufficiency: How professional intelligence analysts assess analytical rigor. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 51, No. 4, pp. 318–322). Los Angeles, CA: SAGE Publications.


54. Are We Speaking the Same Language? Developing Best Practices for Technology Interfaces and Teaching through User-Focused Assessment

Mary Galvin and Mary Oberlies (University of Oregon)

As librarians we often believe that students are using research tools in specific ways and we design them to meet this assumption. But what if we are wrong? How would that change the way we design our tools and approach our teaching?

In our first phase, we gathered preliminary data from Primo and held “Listening Sessions” with instruction librarians to find out more about how they were using LibGuides and Primo in their classes. We also performed usability studies with undergraduate students to learn more about how they went about performing common tasks using Primo.

Following this work we began analyzing quantitative and qualitative data from LibGuides and Primo, in combination with conducting and analyzing student interviews and focus groups. This mixed-methods assessment is revealing how students currently interact with library tools and how design can lead to frustration and drive students to non-library methods to complete their research. Through our analysis, we are examining how to improve the content and design of LibGuides and Primo to be more user-centered, focusing specifically on clearly defining the purpose of tools and updating terminology to match student expectations. One of the methods employed is an analysis of domain-specific terms used in librarian class plans, combined with usability research. Through this process, we developed a list of commonly used terms that seemed clear to the librarian but were, in fact, ambiguous to undergraduates.

This study seeks to strike a balance between leaning on our domain expertise to “instruct” students in navigating the complex and sometimes disjointed web of library resources, and a sincere interest in learning how they perform their research “in the wild” so we can adapt our best practices to be more compatible with theirs.


55. Dress Library Data to Communicate Value and Impact: Advanced Tableau Charts

Sarah Murphy (The Ohio State University)

Purpose: Since the 2014 Library Assessment Conference, a growing community of library assessment professionals has mastered data visualization using Tableau. As several members of this community are empowered to securely publish interactive data using their campus Tableau Server, interest grows in learning how to craft engaging Tableau dashboards which effectively communicate library value and impact. This paper will introduce four advanced Tableau charts—the waffle chart, the lollipop graph, the bump chart, and the Sankey diagram—and provide detailed instructions on how to construct these charts within the Tableau desktop application.

Design/Methodology/Approach/Findings: Any student of data visualization must learn to quickly analyze the nature of the discrete and continuous data with which they are working and to match that data to an appropriate chart type that will guide and advance the narrative emerging from it. In introducing each advanced Tableau chart, a description of the chart will be provided, along with the types of data for which the chart is best suited and a real-life scenario for utilizing the chart with library data. The Sankey diagram, for example, is useful for depicting the flows between multiple dimensions in a system; library assessment professionals may use it to illustrate the flow of students through library instruction over the course of their academic careers. The waffle chart offers a fun and engaging alternative to a pie chart while depicting proportions of a whole; it may be used to show the proportion of faculty who used a particular library service by academic department. The lollipop chart focuses viewers’ attention on the aggregate continuous values achieved by a category; also known as a barbell plot, it is the result of combining a line chart with a dot plot. The bump chart is particularly useful for benchmarking data, allowing the analyst to quickly depict changes in rank over time. The benefits and limitations of each chart will be briefly addressed, with reference to advocates and critics of each chart type within the data visualization community.
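
Since the paper’s step-by-step instructions are specific to Tableau Desktop, the sketch below is only a Python/matplotlib analogue of one of the four charts, the lollipop, with invented data; it shows the chart’s anatomy (a thin stem capped by a dot) rather than the Tableau build.

    import matplotlib.pyplot as plt
    import numpy as np

    departments = ["History", "Biology", "English", "Physics"]  # hypothetical
    sessions = [42, 35, 28, 12]
    y = np.arange(len(departments))

    fig, ax = plt.subplots()
    ax.hlines(y, 0, sessions, colors="gray")  # the stem
    ax.plot(sessions, y, "o", markersize=8)   # the dot
    ax.set_yticks(y)
    ax.set_yticklabels(departments)
    ax.set_xlabel("Instruction sessions taught")
    plt.tight_layout()
    plt.show()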

Practical Implications/Value: Mastery of advanced Tableau charts may assist interested library assessment professionals with advancing their library’s value and impact narrative. This paper will provide a practical introduction with detailed instructions to allow the library assessment community to enhance its data visualization competencies.


56. Graphs in the Library Assessment Conference Proceedings: A Review of Design, Practices, and Quality

James Cheng, Azhlee Costa, and Starr Hoffman (University of Nevada Las Vegas)

Using graphs to display data is an essential part of communicating research, particularly for practitioners and researchers of library assessment. A well-designed graph affords efficient and accessible interpretation and understanding of data, rather than describing data only in text. As Steven Pinker (1990) writes, “Perhaps pictorial displays are simply pleasing to the eye, but both introspection and experimental evidence suggest that, in fact, graphic formats present information in a way that is easier for people to perceive and reason about.” However, the usage of graphs in the library assessment literature has not been previously investigated.

We investigated the corpus (nearly 450 papers) of the Library Assessment Conference proceedings from 2006–2016 in order to better understand the graph-related practices of those contributing to the field of library assessment. We classified each visual object in each paper into categories such as tables, diagrams, pictures, and data graphs, but focused only on data graphs for further analysis. Data graphs were analyzed and scored, following the method of Chen et al. (2017), on the following dimensions: type of graph (simple, intermediate, or complex); data density index (average amount of information per square centimeter); completeness (e.g., the graph includes a title, x- and y-axis labels, and definitions for all data elements); visual clarity (e.g., absence of improper scales, clutter, redundancy); special features (e.g., stacking, small multiples, number at risk); and dimensionality (e.g., shape, shading, pattern, color). In addition, graphs were analyzed for gender differences (e.g., use of red or pink for women and blue for men, men placed on the left or top of a graph), the correlation between the number of authors and the number of graphs (e.g., whether multi-author articles contain more graphs than single-author articles), and color discrimination (e.g., use of red-green or blue-yellow combinations). Graphs were compared over time to determine whether changes occur in each category.
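
Of these dimensions, the data density index is the most mechanical to compute: the number of data entries divided by the printed area of the graph. A sketch with invented values:

    # Data density index: entries per printed square centimeter.
    def data_density_index(num_data_entries, width_cm, height_cm):
        return num_data_entries / (width_cm * height_cm)

    print(data_density_index(48, 12.0, 8.0))  # hypothetical graph: 0.5 per cm^2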

The results will describe the nature and quality of data graphs produced by practitioners and researchers of library assessment over time through graphs and statistical comparisons. Full results are forthcoming.

The ultimate purpose of a graph is the communication of information, and this research will reveal whether this is being done effectively. The ACRL 2017 Academic Library Impact report determined that the most commonly identified deficit is the communication of the library’s impact to external stakeholders. By describing the overall quality of graphs and determining what has been done well and what needs improving, practitioners and researchers of library assessment may use this research to improve the creation and design of graphs, improve the graphical literacy of their audiences, and identify areas for professional development and future research. More broadly, this may also lead to the development of a profession-wide style guide for graphs.

Pinker, S. (1990). A theory of graph comprehension. In R. Freedle (Ed.), Artificial intelligence and the future of testing (pp. 73–126). Hillsdale, NJ, US: Lawrence Erlbaum Associates, Inc.

Chen, Cooper, McMullen, & Schriger. (2017). Graph quality in top medical journals. Annals of Emergency Medicine, 69(4), 453–461.e5.


57. Data Mining of Citations in Doctoral Dissertations: Tool for Collection Development and Instructional Services

Jose Cuesta and Lee Yen Han (King Abdullah University of Science and Technology)

Usage statistics, such as access and download data, are a widely used tool in a collection development librarian’s toolkit to assess the relevance and usefulness of a library’s collection to its patrons. The use of citation analysis of students’ theses and dissertations adds another dimension to this evidence-based user-centered approach to assessing collection development activities of the library.

In this project, a liaison librarian and a systems specialist teamed up to use a systems approach to analyze the citations in doctoral dissertations from the Biological and Environmental Science and Engineering (BESE) Division of a graduate research institution. Using KNIME, an open-source data-mining tool, we created a workflow to examine citation data, discovering citation patterns of student dissertations across the different programs within the BESE division and patterns of resource usage. These are matched against current library holdings and compared with usage statistics obtained from JUSP.
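
The workflow itself is built in KNIME, so the sketch below is only a Python analogue of its central step: tallying cited journals per program and checking them against holdings. File and column names are invented.

    import csv
    from collections import Counter

    cited = Counter()
    with open("dissertation_citations.csv", newline="") as f:
        # Hypothetical columns: "program" and "cited_journal"
        for row in csv.DictReader(f):
            cited[(row["program"], row["cited_journal"])] += 1

    with open("holdings.csv", newline="") as f:
        held = {row["journal_title"] for row in csv.DictReader(f)}

    for (program, journal), n in cited.most_common(20):
        status = "HELD" if journal in held else "GAP"
        print(f"{program}: {journal} cited {n}x [{status}]")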

Results suggest that the BESE Division is not homogeneous: citation patterns differ across its programs. What is cited, and how, is also valuable information to inform, direct, and focus our collection development and information literacy program.

The use of open-source data-mining software helps to automate the citation analysis process and provides an efficient, replicable framework for analyzing citation data to supplement usage statistics. This would be useful for academic libraries planning similar studies to assess the usefulness of their collections with respect to the research activities of graduate students.


58. Library Chat Analysis: A Navigation Tool

Mark Fienup and HyunSeung Koh (University of Northern Iowa)

Chat features on academic libraries’ websites have become an important communication channel that connects patrons to library resources, services, and spaces. Researchers have previously analyzed chat data using different research methods and from different perspectives (Matteson, Salamon, & Brewster, 2011). Analysis of chat transcripts can provide librarians with rich insights for improving the quality of their resources, services, and spaces.

Even though considerable research into chat-data analysis has been done, high-level tools that help librarians analyze their own chat data are lacking. In practice, it is burdensome for librarians to go beyond simple quantitative analysis (e.g., chat duration, message count, word frequencies) with existing tools, given that chat transcripts are unstructured text data that require time and effort to yield rich insights. This lack of chat-analysis tools hinders librarians from reacting to patrons’ wants and needs in a timely manner. Our aim is to develop tools that help librarians navigate and analyze their own vast collections of chat transcripts efficiently, without needing a background in programming.

In this preliminary research, we report on the processes and results of our evolving chat-analysis tool. We first developed a tool in the Python programming language that extracts topics from about 7,000 chat transactions collected from April 10, 2015 to March 31, 2018 through the LibChat module of Springshare’s LibAnswers. For topic extraction we used Latent Dirichlet Allocation (LDA), a natural language processing technique, via the numpy, scipy, gensim, and nltk Python modules. We found that tuning which words count as topic words versus stop words was an important step in improving the quality of the chat topics identified.
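
A minimal sketch of that gensim pipeline, with two invented transcripts standing in for the 7,000 real ones and a deliberately tiny topic count:

    from gensim import corpora, models
    from nltk.corpus import stopwords  # requires nltk.download("stopwords")

    transcripts = [
        "how do i request a book through interlibrary loan",
        "the printer on the second floor is out of paper",
    ]
    # Tune domain stop words alongside the standard English list.
    stops = set(stopwords.words("english")) | {"library", "hi", "thanks"}
    texts = [[w for w in t.split() if w not in stops] for t in transcripts]

    dictionary = corpora.Dictionary(texts)
    corpus = [dictionary.doc2bow(text) for text in texts]
    lda = models.LdaModel(corpus, num_topics=2, id2word=dictionary, passes=10)
    for topic_id, words in lda.print_topics():
        print(topic_id, words)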

We found that some extracted topics represent their chats more accurately than others. An “interlibrary loan” topic, for example, stands out: it is easy to identify which chats are associated with it. Our tool allows librarians to easily survey the landscape of topics in their chat data, map each topic back to a small set of chats for further qualitative and deeper analysis, and take appropriate actions in a timely manner.

We continue to evolve our chat-analysis tool by improving its topic-extraction accuracy and its usability. Incorporating multiple topic extraction techniques (e.g., non-negative matrix factorization) should help improve accuracy. Usability improvements include applying visualization techniques to chat-analysis results and accepting chat data from other commonly used chat programs (e.g., LibraryH3lp). We believe that librarians at a broad range of institutions would take advantage of a user-friendly, high-level chat-analysis tool developed through this study.

Matteson, M., Salamon, J., & Brewster, L. (2011). A systematic review of research on live chat service. Reference & User Services Quarterly, 51(2), 82–100.


59. How are Students Using Our Guides? A Consortium LibGuides Assessment Project

Melissa Becher (American University)

This poster reports on the development of and findings from a collaborative LibGuides research project. Librarians from the Washington Regional Library Consortium (WRLC), an eight-member consortium in the Washington, D.C. area, wanted to understand how differences in design choices might affect student perception and use of research guides. Consortium member guides were a mix of LibGuides versions 1.0 and 2.0 and presented significant variations in navigation and content, providing a seemingly perfect test bed for discovering best practices. Additionally, the members of the project team were interested in comparing results from a consortium test with recent tests conducted at single institutions.

During Fall 2016, the author led a team consisting of six librarians from four WRLC schools and a staff member from the consortium headquarters in developing testing for the project. The team chose a two-pronged methodology consisting of a usability test and a rubric assessment. The testing would take place at two of the consortium schools, with other members of the team participating in administering the test. Guides were selected in Psychology and Biology, two subjects found in guides across the WRLC schools. The group developed usability tasks and follow-up questions about student perception and use of research guides. Assessment rubric categories addressed guide purpose, contact information, navigation, content, and guide language.

Testing was conducted in Spring 2017 with eleven students split between Georgetown University and American University. The author analyzed the recorded usability test sessions and rubric data during Fall 2017. Results revealed several usability stumbling blocks, including the LibGuides search, matching information needs to guide text, “Get Started” pages, lists of book links that were more prominent than the catalog link, and multiple options for getting help. Assessment rubric results showed that students rated consortium guides best on using clear language and providing contact information, and worst on navigation and content. Only 50 percent of students said that the guides were easy to navigate and contained the content they needed. Students indicated that they were more likely to use a guide to link directly to resources than to start a research project or find out something new in a subject area.

This project stands out among usability tests on guides for being a collaborative consortium project. Drawing samples from the WRLC provided a useful comparison among guide designs and levels of content. The mixed method of usability test and rubric also is a significant contribution. Project results generally support findings from other usability tests with some variations. For example, WRLC students seemed to have less trouble with top navigation than those in other recent studies. Results from the rubric assessment and follow-up questions confirm other findings that guides need clear navigation and simplified content while adding useful information about student perceptions of guides. The findings are applicable to both single libraries and consortia as they try to decide the best way to design guides and present them to students.


60. Make Your Library’s Data Accessible and Usable: Create Live Dashboards with Google Data Studio

Kineret Ben-Knaan and Andrew Darby (University of Miami Libraries)

This poster presents the implementation of a collaborative approach and solution for data gathering, data consolidation, and, most importantly, data accessibility through a live connection to free Google Data Studio dashboards.

Units at the University of Miami Libraries (UML) have long been engaged in data collection activities. Data from instructional sessions, consultations, and all types of user interactions have routinely been gathered. Other assessment measures, including surveys and usability testing, have been conducted with the aim of understanding user needs. However, data collection outside of the Alma integrated library system has been decentralized: library departments have maintained their statistical data locally, without a routine procedure for gathering or sharing it. Every request for cross-departmental data has involved time-consuming tasks like sending numerous emails, compiling the data provided, and reconciling inconsistencies. It was clear that we had not been utilizing our existing data to its full potential, and that a unified solution based on trust and openness was needed.

To meet these needs, the Assessment Librarian and the Head of Web & Application Development were determined to find a shared and straightforward data collection platform, which could automatically connect and present statistics visually from multiple and diverse systems. We were looking for a tool that was familiar, accessible, secure and free.

Our key goal was to facilitate the use of data from isolated data sources, not only to encourage evidence-based decision making, but also to better communicate how UM Libraries’ activities support student learning and faculty research. We wanted to present a panoramic view of all UML activities and initiatives visually in one place.

While most UM librarians were willing to collaborate on data collection and saw the potential benefit of data insights, they were less excited about unifying their data collection definitions and metrics. Library services can no longer be clustered under only a few general metrics like instructional sessions or reference questions; everyone agreed that, in order to demonstrate value, we should include metrics for new services and initiatives, like data management support, scholarly partnerships, and flipped learning classes. We needed a collaborative approach to data collection that respected our stakeholders’ various needs.

Our solution involves moving to shared Google worksheets for data collection, and then connecting these sheets to Google Data Studio, a free tool that allows users to create customized and shareable data visualization dashboards.

With shared Google worksheets tailored for each reporting unit, units can not only work together gathering data in their sheets simultaneously, but also allow the Assessment Librarian real-time access to data across all departments and campuses. Data Studio, in turn, can automatically ingest these data from any source with an API, allowing for real-time dashboards that aggregate information from multiple sources. For instance, statistics from Alma Analytics or Google Analytics can be filtered to a particular department or subject area, and then displayed with other data streams, such as instructional sessions reporting or session feedback survey results. By creating multiple dashboards customized to each unit’s needs and a “bird’s eye view” dashboard for library administrators, all levels of the organization are better informed and can communicate the libraries’ contributions to the broader university community.
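
Because the data-entry side of this pipeline is plain Google Sheets, it is also scriptable. As a hedged illustration (the sheet, tab, and field names below are invented), a unit could append a record programmatically with the gspread library, and Data Studio would pick it up on its next refresh:

    import gspread

    # Authenticate with a Google service account that can edit the sheet.
    gc = gspread.service_account(filename="service-account.json")
    ws = gc.open("Instruction Statistics").worksheet("2018")
    ws.append_row(["2018-04-12", "Chemistry", "one-shot session", 24])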


61. 45 Minutes to Clarity: The KJ Method for Prioritizing UX Design Improvements

Jacline Contrino (Ashford University)

Purpose: The Ashford University Library plans to redesign its website using insights gained from user research and the library team’s ideas for website improvements. For such a complex issue as website redevelopment, it could have taken weeks of conversation to reach a consensus of highest priorities. The purpose of this project was to use the KJ Method (also known as an affinity diagram), a project management tool often used in user experience (UX) design work, to quickly brainstorm ideas and gain team consensus on the biggest priorities in enhancing UX on the library website, all in one 45-minute meeting.

Design: The KJ Method, or affinity diagram, was chosen as the tool to efficiently gain qualitative insights from the library team about what they felt were the most important improvements to make to the library website to enhance UX. After one meeting using the KJ method to brainstorm and prioritize specific ideas, I compiled and shared results and rankings with relevant stakeholders. The team’s ideas were taken into consideration and built into the website redesign strategy.

Methodology/Approach: A library team of about 10 people gathered and were given sticky notes to write on. They were asked to “write down suggestions for how to make the website better. If it’s easier, think of what isn’t working so well.” After writing down ideas, librarians placed the notes on a wall. Then, silently, everyone moved sticky notes into groups, organizing them into themes (e.g., information architecture, navigation), and labeled those groups without any discussion. Still in silence, people then voted on the most important groups by marking “x’s” next to them. Once the voting ended, it was clear what the team prioritized in terms of website enhancements. Results were analyzed and communicated to the team.

Findings: The library team prioritized the following website enhancements:

  • Visually appealing design that spurs positive user emotions
  • Information architecture
  • Search functionality
  • Navigation
  • DIY section
  • Engagement

Specific ideas under each broad category were also identified. I compiled and shared the results with relevant stakeholders. Using the KJ Method was a great success and librarians had fun doing it.

Practical Implications: Given most librarians’ busy schedules and limited time, the KJ Method is an excellent time-saving tool for efficiently brainstorming, organizing, and prioritizing ideas for any given issue in an accurate, objective way. What would normally take several lengthy discussions can be accomplished in one 45-minute meeting. Additionally, it is a very democratic process, as all voices are given equal weight. No time is wasted on endless discussion or debate; concrete concepts are identified resulting in quicker action.


62. The Tragedy of Faculty Frank: Creating Dynamic Assessment Tools to Inspire Holistic Innovation

Sheila Garcia and Denise Leyton (University of Michigan Library)

In the last decade, user experience methods have become a core tenet of library assessment programs, pushing institutions to think more holistically about their user base and to create resources like personas to guide resource-allocation decisions. However, unlike businesses, public and academic libraries are typically not targeting one specific type of user; in fact, these libraries proudly declare that they exist to “serve everyone” in their designated communities. Through this lens, the tragedy of personas like Faculty Frank or Bookworm Betty, who embody an average user from a particular group, becomes apparent: if we focus on meeting the average person’s needs, we leave out the people at the margins.

The goal of our project at the University of Michigan Library is to more deeply represent the community we serve through a series of user-centered resources that build on the knowledge and capacity of our colleagues. These resources are being made with the intent to inspire holistic innovation by deviating from the traditional persona model to present a multi-layered, complex representation of users. By presenting information in a way that goes beyond even what a set of personas can do, our resources will expand our colleagues’ viewpoint around how our users experience campus, sparking new ideas about how they can serve and partner with our community.

This poster will walk through the experience mapping methodology we used to create these resources. This included:

  • Mapping what influences our users
  • Compiling user research data from our library and the library world
  • Conducting interviews
  • Coding data
  • Creating physical resources as well as structured activities for their use

By presenting our research methodology and findings, we hope to incite the critical examination of the use of personas in libraries with the aim of improving our collective approach to this tool.


63. Asking Students about Metadata Without Asking Students about Metadata

Jeanette Norris (Brown University)

Between 2016 and 2018, the Brown University Library began to actively examine methods for assessing metadata practices based on what information graduate students use for describing, finding, and evaluating resources.

This poster presents a combined analysis of data from two studies that used different methodologies to examine what information graduate and medical students find useful. The goal of these projects is to contribute to user-centered qualitative assessment of metadata and cataloging practices. The first project compared students’ descriptions of books with cataloging records. Each participant was given the same three books and asked to write a free-text description of each. The texts were coded for the types of information included. This study sought to better understand how graduate students conceptualize text resources outside of the influences of a library search interface.

The second project was part of a larger usability study for the Brown University Library discovery layer. We asked participants to walk us through their research process. We focused the tasks on how they find and evaluate resources that may be of interest and finally how they manage resources that they decide to save. The transcripts of the videos and the participants’ actions were coded for the types of data presented by the participant and in what context that data was being used.

Combined, these two projects provide information about what metadata is important to this group of users and whether that metadata is being used to find, identify, or evaluate resources. The data has been gathered; the combined analysis is in progress and should be completed by August 2018. This research can begin to inform how cataloging and metadata departments prioritize their work and the design of discovery layer interfaces.


64. Developing a STEM Outreach Plan with Data Viz

Emily Hart (Syracuse University)

As an academic liaison librarian serving a wide range of departments in the STEM fields, I face many competing priorities. My goal in developing strategic planning matrices for STEM outreach is to move away from a reactionary service approach to one that is proactive and evidence based. Mapping the current assessment goals of the university, colleges, and departments I serve, alongside the Libraries’ initiatives to support student and faculty research, information literacy, open access, and scholarly communications endeavors, has been a large but worthwhile undertaking. To pilot the project, data for the College of Engineering and Computer Science was collected in Excel. The complexity of the information and the overlapping priorities could not adequately be represented with Excel tables. Multiple Excel files are therefore being combined in Tableau Public to create visualizations such as matrices, heat maps, and treemaps. Upon completion of the project, detailed methods for data collection and templates for creating visualizations in Tableau Public will be shared openly.
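As a rough sketch of what the consolidation step could look like in Python with pandas (the file names and columns below are hypothetical placeholders, not the actual project data), several departmental Excel files can be combined into a single table that Tableau Public can ingest:

  import glob
  import pandas as pd

  # Combine one Excel file per department into a single tidy table.
  frames = []
  for path in glob.glob("data/*.xlsx"):
      df = pd.read_excel(path)        # assumes one sheet of interest per file
      df["source_file"] = path        # keep provenance for filtering in Tableau
      frames.append(df)

  combined = pd.concat(frames, ignore_index=True)
  # A single CSV can then be loaded into Tableau Public to build the
  # matrices, heat maps, and treemaps described above.
  combined.to_csv("stem_outreach_combined.csv", index=False)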


65. Examining the Connection between Library Ratios and LibQual+ Results

Mike Meth and Jacob Tompkins (Florida State University Libraries)

Purpose: To explore whether a number of different ARL and ACRL library statistics ratios per student FTE are related to LibQUAL+ scores across nine peer institutions. For example, are higher staff-per-student-FTE ratios related to smaller Superiority Gaps between perceived and desired Affect of Service scores on LibQUAL+?

Design, Methodology, or Approach: Step one was to compare ratios across the nine institutions and determine trends in the data. This step has been completed. For example, there appears to be a trend between LibQUAL+ Superiority Scores and Affect of Service Superiority Gap Scores.

Pulling data from ARL Statistics and ACRL Metrics, we calculated ratios of particular interest, including the number of students per librarian, the number of seats available per student, etc. In conjunction, we analyzed the superiority means from the satisfaction reports for AS (Affect of Service), IC (Information Control), and LP (Library as Place) for each individual school. After performing an ANOVA, we determined whether the superiority means were statistically different.
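As a minimal sketch of these calculations (the file layout and column names are invented for illustration and are not the authors’ actual data), the ratios and a one-way ANOVA across schools might look like this in Python:

  import pandas as pd
  from scipy.stats import f_oneway

  # Hypothetical ARL/ACRL-style table: one row per institution.
  stats = pd.read_csv("arl_acrl_stats.csv")
  stats["students_per_librarian"] = stats["student_fte"] / stats["librarians"]
  stats["seats_per_student"] = stats["seats"] / stats["student_fte"]

  # Hypothetical respondent-level LibQUAL+ table with columns:
  # school, as_gap (Affect of Service superiority gap per respondent).
  libqual = pd.read_csv("libqual_gaps.csv")
  groups = [g["as_gap"].dropna() for _, g in libqual.groupby("school")]
  f_stat, p_value = f_oneway(*groups)  # do mean gaps differ across schools?
  print(f_stat, p_value)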

To test for further trends, an IRB application will be completed in order to request individual-level data from LibQUAL+ for approximately nine peer institutions.

Findings: We analyzed trends among the means to assess any patterns between the ratio data set and the superiority means we calculated. After finalizing both data sets, we developed scatter plots and bar graphs to illustrate the general trends we discovered within the data. We noted schools with particularly small superiority means, indicating that perceived satisfaction with their library facilities was almost on par with desired satisfaction, and considered how that compared with the ratios of specific interest. For each of the three categories of student satisfaction in the LibQUAL+ data, we considered general trends within the ratios and whether satisfaction levels followed a trend as well. Categorizing the ARL/ACRL ratios into different groups, we hypothesized their effect on each of the subgroups (AS, IC, and LP). We tested whether the trends within the ratios lined up with the trends in satisfaction from the LibQUAL+ reports.

Practical Implications or Value: Our intention for these analyses was to take quantitative data from ARL/ACRL and identify any connections it may have with qualitative data, such as the student satisfaction superiority means from LibQUAL+. Through this research, we hope to identify reasonable ratios, such as the number of students per library staff member or the number of seats per student, in order to maximize efficiency and student satisfaction in all library facilities.


Organizational Issues

66. Using Tableau to Derive Insight from Data at University Library System, University of Pittsburgh

Anne Koenig and Berenika Webster (University of Pittsburgh)

Purpose: Tableau was adopted by ULS’s Assessment and Quality Assurance Unit last year to streamline presentation and increase the use of data to support informed decision making at ULS. This poster describes selected examples of Tableau visualisations and how they were used by different stakeholders for decision making. It also discusses efficiencies gained by the Unit in producing new visualisations and updates to data.

Design: The assessment of the Tableau integration was conducted by analysing usage statistics and conducting interviews with selected users.

Findings: Adoption of Tableau visualisations helped engage larger groups of colleagues with the data and led to an increase in the use of data for decision making across library units, thanks to improved access, the ability to understand data quickly, and the ability to actively engage in exploring facets of the data. Examples of such engagement include the review process and subsequent changes to approval plans, the identification of measures of success and benchmarks for monitoring progress on our strategic plan, and a review of reference workflows.

Practical Implications or Value: The ability to provide data in formats that are user-friendly, visually attractive, and rich in context can lead to increased use of the data for decision making, thus adding to institutional effectiveness and improved services.


67. Visualizing Success: Tracking and Reporting Progress Toward Library Strategic Goals

Michelle Brannen, Regina Mays, and Manda Sexton (University of Tennessee Knoxville)

While most libraries now engage in strategic planning, systematic tracking of progress toward strategic goals is less common. In order to have any hope of success, maintaining focus and engaging in continuous assessment and reassessment of goals is crucial. One roadblock to doing so, however, is the complexity of most library strategic plans, coupled with the many moving pieces that must be tracked and the various stakeholders who must be kept informed.

In 2016, the UT Libraries developed a new strategic plan with 5 areas of emphasis, goals in each area, and action items for each goal. In 2017, with the new plan in hand, the library formed a group tasked with keeping the organization focused on the strategic plan by tracking and sharing progress across the library. The Strategic Achievement Review (StAR) team has spent over a year exploring a variety of approaches to keep staff connected with the strategic plan, encouraging self-reporting, blogging stories of success, and developing progress indicators for action items.

From experimenting with dashboards to encouraging staff to share their stories, the StAR team has taken multiple approaches to its charge of assessing and communicating progress toward the library strategic plan. With successes and failures under our belts, the StAR team is excited to share insights to help you make your library’s strategic plan a living, meaningful document. This poster will detail the timeline of StAR efforts to date, showcase dashboards and other assessment tools developed, and share strategies for communicating strategic plan progress within the organization.


68. Faculty Perceptions of Librarians and Library Services: Exploring the Impact of Librarian Faculty Status and Beyond

David Murray and Cathy Weng (The College of New Jersey)

Purpose: The Association of College and Research Libraries recommends that librarians with faculty status have the same privileges and responsibilities as other faculty on campus. In addition to promotion and pay equity, tenure is intended to create a culture of respect between teaching faculty and library faculty across campus, provide opportunities to participate in college governance, and grant librarians academic freedom in their research. A recent study by Galbraith et al. (2016) revealed that librarians feel that being on an equal footing with other faculty improves their relationship. It is not clear whether teaching faculty feel the same way. Does the status of librarians affect faculty’s view of librarians and the services they provide?

This study seeks to investigate faculty perceptions of academic librarians in two types of institutions: those unambiguously granting and not granting librarians faculty status. Specifically, the research aims to find differences, if any, in teaching faculty’s perceptions of librarians and the latter’s role in helping with teaching, research and service in contrasting academic settings.

Methodology: The authors surveyed faculty at two large research universities (Indiana University, Bloomington and the University of Pennsylvania, or Penn) and two smaller colleges (Gettysburg College and The College of New Jersey, or TCNJ). One institution from each type of school grants equal faculty status to librarians (Indiana Bloomington and TCNJ), while the other does not (Gettysburg and Penn). Randomly selected faculty at the four institutions were invited to participate in the survey. Approximately 500 faculty responded.

Findings: We found that a noticeably higher percentage of faculty from institutions granting librarians faculty status were pleased with their librarians’ status. Among faculty who were aware of their local librarians’ status (whether that status was faculty or non-faculty), a higher percentage perceived closer, more trusting relationships with librarians than did faculty peers who were not similarly aware. Faculty at institutions granting librarians faculty status recognized the important role librarians played in service (i.e., campus governance). Additionally, faculty overwhelmingly perceived that librarians played an important role in assisting faculty research and teaching. However, a lower percentage of faculty perceived librarians as relevant to their own individual teaching and research. Proactive communication, frequent interaction, and in-depth academic participation by librarians, as respondents suggested in their open-ended responses, can help foster and strengthen the faculty-librarian partnership.

Practical Implications or Value: The current study reveals much-needed information about scholars’ perceptions of librarians and the services librarians provide. It contributes to the ongoing debate about the value of faculty status for librarians and suggests that when librarians are thoroughly integrated into academic life they become empowered to further their institutions’ primary missions: teaching and research.

The study results can be used to promote a better understanding and enhance relations between teaching faculty and librarians, whether librarians have faculty status or not. Results can also help determine what strategies might lead to more active, productive, and positive collaborations.


69. Silos to Stories: Making Assessment More Meaningful for Everyone

Travis Jordan, Kirsten Kinsley, and Emily Mann (Florida State University)

Purpose: After many years of collecting and siloing data, our library administration encouraged staff to consolidate the reporting process to tell a more holistic story of library impact and value. In order to achieve this goal, a collaboration was formed between assessment, marketing and communications, and public services to broaden the existing public services statistics team into a library-wide Storytelling and Statistics (SAS) working group. SAS seeks to be an integrated and inclusive data sharing group that includes all library divisions and units. The advantage of this collaboration is that it will be more integrated with the strategic mission of the library. Another benefit of SAS is that it encourages and empowers staff to make data-driven decisions and to use data to increase the impact of their story by reaching a broader audience (e.g., students, faculty, donors, staff, etc.). By presenting the stories through an omnichannel approach, we will use social media, blogs, videos, the website, etc. to connect gathered statistics with the user experience. While some libraries have adopted an assessment committee approach to achieve this, SAS aims to embed and extend the breadth and depth of assessment activities library-wide.

Design, Methodology, Approach: Moving from an insular public service statistics gathering focus to include all divisions of the library, including marketing and communications, we hope to create a full feedback loop rather than simply gathering data and placing it in a silo. We hope to solve the problem by giving context to the data we collect and by providing access to and integration of data sets. SAS is charged with communicating and discussing statistical trends and analyses within the University Libraries and sharing this information with key stakeholders in a palatable format, while also aligning assessment activities with the strategic priorities of the library and university. SAS will provide advising and training for staff to perform assessment activities and support evidence-based decision-making. To meet these goals we created three working groups:

  • Communication and Marketing
  • LibInsights
  • Assessment Advisory

With this poster we hope to explain the rationale for the formation of these working groups, how we recruited members from all of the library divisions, and whether we were successful in transforming gathered statistics and data into narratives demonstrating library impact.

Findings: We will measure success by looking at consistency in participation from each library division; our ability to conduct a successful data audit; and the implementation of standardization and a culture of assessment throughout the library, measured by a short internal survey. We will also use digital analytics to measure story impact.

Practical Implications or Value: This method of statistics gathering and collaboration could help other organizations make data more relatable and present a more palatable narrative. It also helps to create more stakeholder buy-in for the assessment process and empowers staff to share, through an easier workflow, the impact they have on the institution and library patrons. Ultimately this approach allows clear demonstration and dissemination of library impact that showcases the stories and everyday experiences of its users and stakeholders.


70. Comparing Monetized Key Performance Indicators (KPIs) of Peer Institutions using Tableau Visualizations

Greg Davis (Iowa State University)

The Iowa State University (ISU) Library’s Assessment Plan uses a balanced scorecard framework that includes a strategy map. The ISU Library’s strategy map lists thirteen objectives spread over four perspectives: service, financial, internal operations, and learning/growth. In order to track progress for the strategy map objectives, various measures and metrics, including key performance indicators (KPIs), have been defined, developed, and reported.

Included in the strategy map’s financial perspective, the library has an objective to “Be data driven with material acquisition decisions and when promoting the value of investment in the library”. To support this objective, the library’s assessment department monitors the trends of selected financial indicators compared to ten peer institutions. In support of this work, information from the ARL Statistics data warehouse is used by the assessment team to generate monetized KPIs. The monetized KPIs are ratios that can be expressed in financial terms, such as the ratio of material expenditures to total expenditures, or the ratio of professional staff expenditures to total staff expenditures.
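As an illustration only (the field names below are placeholders, not the actual ARL Statistics data-warehouse variables), the monetized KPIs are simple ratios that could be computed in Python and exported for charting in Tableau:

  import pandas as pd

  # Hypothetical export from the ARL Statistics data warehouse:
  # one row per institution per year.
  arl = pd.read_csv("arl_statistics.csv")
  arl["materials_ratio"] = arl["material_expenditures"] / arl["total_expenditures"]
  arl["prof_staff_ratio"] = (arl["prof_staff_expenditures"]
                             / arl["total_staff_expenditures"])

  # Export the monetized KPIs for charting against the ten peers in Tableau.
  cols = ["institution", "year", "materials_ratio", "prof_staff_ratio"]
  arl[cols].to_csv("monetized_kpis.csv", index=False)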

To assist in data analysis, the monetized KPI information is charted by the assessment team using Tableau data visualization software. The charts are shared with and reviewed by library decision-makers, and can be used to enact change across a wide range of library management and leadership activities, including budget construction/advocacy, fundraising, and library strategic planning.

This poster presentation will contribute to the existing and overarching library assessment body of work by describing a replicable process used by the assessment team for constructing the data visualizations, and through the display of examples of the various Tableau data visualizations that have been developed in support of the ISU Library’s strategy map.

This work is timely and relevant:

  • By using ARL Statistics data, this project makes use of data easily available to ARL academic libraries.
  • With the use of the powerful and feature-rich Tableau data visualization software, the benefits of data visualization can be experienced first-hand.
  • While the Iowa State University Library’s experience with data visualization has been positive, a section of the poster is dedicated to highlighting various “lessons learned”.


71. Tracking Relationships and Measuring Professional Development: A Multipronged Approach to Assessment for Liaison Services

Rebecca Stuhr (University of Pennsylvania)

The University of Pennsylvania Libraries’ newly formed liaison program is redefining and revitalizing liaison services. As we foster a multifaceted outreach to Penn’s communities, we anticipate finding new, unexpected, and highly productive grassroots connections and practices. Measuring the unexpected presents a challenge for assessment design because the outcomes are unknown. Additionally, we are constrained to the use of brief instruments that concentrate on liaison reporting in order to minimize the assessment burden on library patrons. In this poster, I illustrate the multiple assessment approaches we use to identify, document, and encourage innovative liaison approaches and a culture of continuous professional development. Our practices are informed by current scholarship (Bales 2015; Díaz and Mandernach 2017; Kenney 2015).

The liaison program was established to energize the work of its liaisons, raise the level and relevance of service to our diverse scholarly and professional communities, and create a consistent level of liaison service across the libraries. We define liaison broadly, applying it to the entire professional staff. Liaisons self-identify and work in all parts of the Penn Libraries system and serve library colleagues, departments, centers, schools, offices, and the local Philadelphia community. They have subject or curatorial expertise, or special technical or methodological skills.

Our assessment efforts measure the growth of our relationships across campus, track effective information gathering and sharing (leading to improved design and targeting of services), and evaluate our professional development efforts.

In this poster, I will provide a brief description, examples, and an evaluation for each method used. Some methods are well underway, and some are in their early stages. Some are clearly successful, and others are in need of revision. The methods are both qualitative and quantitative:

  • Weekly report—tweet-like accounts highlighting innovative efforts
  • Monthly “One Question” survey—sent out to all professional staff
  • Liaison interviews—four questions to reveal unique contributions
  • Blog—posts devoted to successful collaborations and learning experiences
  • Biennial skills survey—measures knowledge of Penn Libraries’ services and tools
  • Learning Group follow-up survey—determines success and refines process
  • Liaison activity data collection—measures anticipated and actual outcomes and relationship building

Through these methods, we aim to provide a nuanced picture of our program as it develops over time. Ultimately, assessment should inform the liaison group by revealing the innovative and effective work of individual liaisons and the growth and development of liaison staff. A successful liaison program will mean that our liaisons are being contacted as often as they are reaching out. The libraries will be known as a center for engaged and innovative professionals who are responsive and even a step ahead of the needs of their communities.

The act of measurement imparts value to what is being measured. By raising awareness of innovative liaison methods, liaisons are encouraged to replicate successful practices. By measuring staff engagement with professional development, previously unmeasured in any way at the Libraries, we demonstrate the qualities of the staff and encourage an environment of continuous learning.


72. Non-Academic Support Outcomes and Key Performance Indicators at E. H. Butler Library: A New Partnership with Buffalo State’s Office of Institutional Effectiveness

Eugene Harvey (Buffalo State College, State University of New York)

A new partnership between Butler Library and the recently established Office of Institutional Effectiveness at Buffalo State College (SUNY) resulted in a collaborative opportunity aimed toward demonstrating the library’s contributions toward institutional operational outcomes, strategic directives, and campus improvements. As one of its first initiatives, the Office required all campus units to develop an assessment plan highlighting non-academic support outcomes and key performance indicators (KPIs). The assessment librarian then educated, mobilized, and collaborated with heads of library units in actively developing KPIs for their own units, which culminated in the library’s plan for assessing non-academic operations over a period of at least five years. KPIs were developed selectively for several categories: user experience, services, operations & administration, collections, instruction, and employee development.

This poster session will: 1) provide a practical definition of a KPI, 2) showcase examples of library KPIs, and 3) illustrate how they are mapped to institutional endeavors by using a common performance measurement software system. Additionally, the presenter will be able to engage attendees and answer individual questions through brief discussions on topics such as the process for developing the plan, the partnership with the Office, and cultivating a culture of assessment through tactical and sensitive conversations with library personnel.


73. Dashing Assessment: Results of the LLAMA Assessment CoP Needs Assessment of Dashboard Educational Opportunities

Steve Borrelli (Penn State University Libraries)

Cinthya Ippoliti (Oklahoma State University)

Elizabeth Joseph (Ferguson Library)

Michelle Ornat (Chesapeake Public Library)

Purpose: The LLAMA Assessment Community of Practice (CoP) is transitioning from a committee-based structure to a new organizational model using “project teams” or “taskforces” to develop and implement CoP activities. This poster reports on the work of the pilot “project team” charged with developing an action plan for library assessment dashboard educational opportunities. The team developed a survey to help determine broad training needs, both in terms of content and delivery format, and to serve as a guide in recommending a program that would be of most use.

Design, Methodology, Approach: To address the elements of the charge relating to investigating constituent needs for dashboards, scoping professional development opportunities, and identifying preferred methods for providing them, the project team conducted a survey of the LLAMA membership and the broader library community. The survey was conducted using Qualtrics and distributed through the LLAMA listserv as well as national and state library listservs.

Findings: The survey received a total of 302 responses. Approximately 90% of respondents are moderately to extremely interested in training opportunities regarding dashboards. While less than 30% of respondents indicated that their library currently employs dashboards, over 60% indicated that their library has considered developing them. The biggest organizational challenges for developing a dashboard are expertise (39%) and staffing (20%). The format most frequently identified for training is on-demand webinars (37%), followed by “a toolkit that I can refer to any time” (35%).

Practical Implications or Value: Operating in a new organizational context, this poster presents an evidence-based approach to assessing demand for content and delivery mechanisms, to support action planning for future library assessment dashboard educational opportunities.


74. Mind the Gap: Identifying Front-Line Motivations Versus Supervisor Perceptions to Reform Library Customer Service

Kristina Clement (University of Wyoming)

Brianne Dosch (University of Tennessee Knoxville)

Delivering high-quality customer service is often at the forefront of library priorities, and yet it can be a struggle to maintain a consistent level of quality service. This research project, drawing on alternative models of customer service, examines how an understanding of the underlying motivations of front-line library staff, including student workers, can help supervisors develop better working relationships with those whom they supervise in an effort to ultimately improve library customer service. We explore the principles behind corporate administration models and how they could be adapted to improve employee support and patron satisfaction in libraries. The primary administration principles for this research are drawn from John R. DiJulius’s book Secret Service: Hidden Systems That Deliver Unforgettable Customer Service. This poster presents the final phase of the research, examining the data collected from a survey distributed in May–June 2018 as well as the implications for customer service training in libraries. The poster will discuss the findings and recommendations for practical uses of this information. We believe that uncovering the gaps between what front-line staff say motivates them to give quality customer service and what their supervisors believe motivates them could lead to new customer service training methods and models that focus on relationship building rather than traditional task training, as a sustainable way to continually improve library customer service.

This poster is intended to be the third in a three-part series for this longitudinal research study, representing results and practical implications. Part One was presented at the 2017 Charleston Conference as a digital poster outlining the theoretical framework for the study. Part Two will be presented at the 2018 ALA Annual Conference in New Orleans as a physical poster outlining the methodological framework, survey instruments, and preliminary findings.


75. Preparing for RCM (Responsibility Centered Management) Assessment in a Low Information Environment

Paul Beavers and Rachael Clark (Wayne State University Libraries)

Wayne State University (WSU) is embracing a Responsibility Centered Management (RCM) approach to budgeting. In fiscal year 2021, budgeting will shift from the central administration out to the individual schools, colleges, and divisions (units). We have also been told that, though RCM has been adopted by many institutions, the specific approaches are unique to each of them. WSU will be developing an approach that is tailored to WSU while avoiding the pitfalls associated with other RCM plans. The current plan is to distribute the funds from tuition to the units that generate that income. Non-revenue-generating units, like the WSU Libraries, will be funded through WSU’s allocation from the State of Michigan. This allocation is already far smaller than needed and is unlikely to increase significantly. Thus the non-revenue-generating units (NGUs) will be prompted to increase efficiency and will be assessed on that basis.

The Libraries’ Assessment Team is now strategizing to reframe our assessment activities to capture data that will demonstrate value in such a model. This challenge is revitalizing assessment efforts in the WSU Libraries. Up to this point, our assessment efforts have been sporadic. We perform usability testing regularly on our web pages, occasionally administer service quality surveys, and gather data as projects arise. The necessity of regular assessment to justify the Libraries’ allocation within RCM has brought new focus and seriousness to the issues of assessment.

While we are pleased by this revitalization, the necessity of developing plans quickly is daunting, especially given that the specifics of what will be required and rewarded are still being formulated. Assessment planning in such an environment requires both agility and the ability to think in terms of possibilities and contingencies. Clearly, our university’s dedication to student learning will make the Libraries’ impact on academic success one of the factors. Beyond that, we are reviewing RCM implementation and assessment practices in other university libraries. Vague suggestions that the NGUs might be asked to provide “systematic feedback from faculty and other users, …comparative analysis with other universities, and financial analysis” have led us to explore a wide range of other practices. We are expanding our approach to benchmarking against peer institutions, pushing beyond a narrow focus on aggregate and library materials budgets. We are reviewing and developing surveys and other activities with an eye on what feedback will best support our budget requests. We are emphasizing to our colleagues that assessment is no longer something desired merely for internal planning and improvement, but a necessity for the Libraries’ well-being.

Our poster will focus on the demands created by this shift to RCM and the manner in which we are developing a flexible response while precisely what will be demanded of the Libraries is unclear. This agility, this ability to work with contingencies, is a necessity not only for such a change in budgeting, but in many situations in contemporary librarianship. As the pace of change once more increases, we are all working in low information environments.


76. Rating Retention: A Qualitative Assessment of Institutional Commitment to Diversity and Inclusion

Rose Chou (American University)

Ebony Magnus (Southern Alberta Institute of Technology)

Mark Puente (Association of Research Libraries)

Purpose: This poster will describe our use of qualitative analysis methodology to evaluate the presence of diversity-, equity-, and inclusion-promoting themes in the institutional documentation from academic libraries and their parent institutions. We will present results from a literature review on employee retention in academic libraries, with an emphasis on research that gives substantive consideration to retention issues among people from marginalized or traditionally underrepresented populations. We will describe how we used thematic findings of the literature review to develop a rubric for documentary analysis of institutional communication (including strategic plans, mission and vision statements, and other similarly “official” documents).

Methodology: We are currently conducting a literature review on employee retention in academic libraries, with a specific interest in research that discusses the retention of individuals from marginalized groups. The findings from the literature review are being used to inform the development of thematic categories or “retention markers”—activities and attitudes that suggest an awareness or encouragement of an inclusive and equitable organizational climate. We will employ documentary analysis—“the process of using documents as a means of social investigation and involves exploring the records that individuals and organizations produce” (Gibson & Brown, 2009)—to examine corporate communication documents and code for the presence of “retention markers”. We will examine documentation from our own institutions first—a mid-sized applied education institute in Canada and a mid-sized research institution in the United States—followed by documentation from sets of peer institutions.
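As a deliberately simplified sketch in Python (keyword screening is only a crude first pass, not the authors’ rubric-based thematic coding; the marker terms and file layout are hypothetical), candidate documents could be flagged for closer coding like this:

  import pathlib
  from collections import Counter

  # Hypothetical marker terms; the actual rubric is thematic, not keyword-based.
  MARKERS = ["diversity", "equity", "inclusion", "climate", "mentoring", "belonging"]

  for doc in pathlib.Path("institutional_docs").glob("*.txt"):
      text = doc.read_text(encoding="utf-8").lower()
      hits = Counter({m: text.count(m) for m in MARKERS})
      present = [m for m, n in hits.items() if n > 0]
      print(doc.name, "->", present)  # documents to prioritize for rubric coding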

This poster will convey how a qualitative method of analysis can be used to evaluate the degree to which an institution formally acknowledges themes and language that signal important topics which can be related to the retention of employees from marginalized or underrepresented groups.

Findings: The anticipated outcomes of this project are twofold: (1) identifying, through the literature review, themes associated with the retention of academic library professionals from marginalized or traditionally underrepresented populations, and (2) determining the efficacy of documentary analysis as a method for measuring qualitative organizational factors.

Value: Recruitment of individuals from marginalized or underrepresented populations has long been a concern for academic libraries; and for just as long, efforts have been measured by counting the employees who self-identify as non-members of the majority population. As the field of academic librarianship sees a marginal increase in these Human Resource metrics, the challenge becomes one of retention. Numeric measures—such as personnel counts or years of service—are used to report on retention as well as recruitment, but these metrics stop short of providing an explanation of the immeasurable (in the quantitative sense) factors that contribute to an employee’s decision to stay at or leave an organization.


77. Responding to Leadership Priorities and Campus Requirements in Strategic Planning

Elizabeth Brown (Binghamton University Libraries)

Introduction and Purpose: Strategic planning has undergone a change at our institution, migrating from a static, library-centric document to active participation in a university-wide strategic planning tracking system. Migration to this new system began in 2013 and coincided with changes in library leadership between 2013 and 2016: one library dean left, a one-year interim dean was appointed, and then the current dean was appointed. Each leader has influenced the content and scope of the strategic plan and the content in the system. This evolution has allowed the library to refine our strategic plan over time, allowing for better communication among library leadership and improved collaboration with other campus units such as the Office of Institutional Research and Assessment.

Design, Methodology or Approach: Prior to implementation of the new tracking system, our library created static strategic plans focused on daily activities and their scope rather than on specific projects. While objectives for each functional unit were created and refined over time, this static planning approach was not conducive to effectively creating project-based work and also limited the alignment of projects with campus priorities and accreditation standards. It also limited the number of times the plan was consulted and reviewed collectively by library leadership.

Each library dean interpreted how to populate this tracking system differently, and the plan changed significantly over this time. The initial version of the strategic plan was almost a direct transfer of the existing plan into the new format. As a result, many more objectives and measures were created than was feasible for the library to effectively track and show progress on. The second leader was very focused on creating new measures and objectives, with progress shown quantitatively rather than qualitatively. As a result, the library was not able to track projects or activities in a way that best communicated their value. With the current dean, the measures and targets became more project-focused, and fewer ongoing measures remained.

Findings: Migrating the Libraries’ existing strategic planning information to a tracking system allowed us to track the status of projects more efficiently and better communicate the impact of projects on campus initiatives and campus priorities. It also helped the Libraries focus on higher-priority and more visible projects.

With the current leadership there is now a focus on fewer, higher-level projects with more visibility to the campus, rather than tracking increases in daily activities. Project-based work can be highlighted, with ongoing measures and targets used in cases where the information can be tracked longitudinally to show progress. One example of an ongoing measure would be library renovations and space upgrades. This information can now be collected in one place to show progress of spaces over time, and show how our library has responded to student and faculty space needs.


78. Building a Master’s Program in User Experience and Assessment

Dania Bilal, Rachel Fleming-May, Amy Forrester, Regina Mays, and Carol Tenopir (University of Tennessee Knoxville)

Purpose: In the fall of 2016, with funding from the Institute of Museum and Library Services, the University of Tennessee’s ALA-accredited master’s program launched a specialized program to prepare 12 students for careers in User Experience and Assessment. Of the 12, six stated an interest in academic librarianship, while the remaining students planned to enter careers outside traditional library settings. All 12 students completed the program and graduated in May 2018. This poster reports on the structure and preliminary outcomes of the program, as well as plans for program sustainability.

Design, Methodology or Approach: In addition to the standard curriculum required of all M.S.I.S. students, students in the specialized program completed a number of additional requirements:

Curricular:

  • Courses in
    • Research Methods
    • Planning and Assessment
    • Human-Computer Interaction (HCI)
    • Academic Libraries or Special Libraries/Information Centers (depending on interest)
    • Graduate-level Statistics (outside the LIS department)

Co-Curricular Activities:

  • Regular cohort meetings
  • Synchronous and asynchronous training in user experience and assessment tools, methods, and approaches
  • Two semesters of on-site practicum placement
  • A semi-structured research project

At various points during the two-year program, students were asked to provide feedback on different aspects of the program. On-site practicum mentors were also asked to evaluate the program’s potential to contribute to the profession and prepare students to meet specific skill needs.

Findings: Surveys assessing the students’ and on-site supervisors’ experience of the practicum were approved by the University of Tennessee IRB. The poster will share those findings, which indicate that the practicum provided a beneficial opportunity for students to engage with mentors in the workplace and was largely a positive experience for both students and supervisors.

Practical Implications or Value: While the program was bolstered by significant initial funding, its success makes clear that User Experience and Assessment are areas of high demand for libraries and other agencies, and that there is a need for LIS education to support preparation for new professionals. As such, the program’s leadership team used feedback from students, mentors, and faculty to identify and prioritize the most important elements of the program for future students interested in developing expertise in this area.

Sustainability plans include the following:

  • A second, “semi-funded” cohort of students
  • A “Pathway” document to provide advising for students interested in UX and Assessment
  • Commitment from program mentors to host user experience- and assessment-focused students in the future
  • Plans to support User Experience and Assessment through Master’s program curriculum.

In addition to reporting on the design and outcomes of our program, the poster will provide attendees with suggestions for supporting formal and informal assessment- and user experience-related education and professional development for future library professionals.


79. Applying Organizational Assessment: Improving Processes, Communication, and Documentation through Workflow Visualization

Robert Heaton and Liz Woolcott (Utah State University)

Many libraries use workflow mapping to inform reorganization decisions, process improvements, and employee training. This project takes the next step and translates a workflow into different formats for different audiences. Utah State University Libraries’ technical services division developed over 40 maps (i.e., flowcharts) of key workflows in a collaborative and iterative process that spanned five months. In our analysis of these workflows, we found that the workflow for handling faculty requests for streaming video was unusually complex and its success highly dependent on one trained individual. As leaders in two library units that carry out this workflow, we took the workflow map as a starting point and streamlined its steps in concert with the team members involved. We then revised and reformatted the visually presented information to meet specific team member needs. We found that a flowchart (created in LucidChart) gave a high-level overview of all steps in the process, which was valuable for administrators considering process improvements and staffing changes. A Kanban board, created using the free web app Trello, could provide detailed process instructions so that an untrained employee could substitute for the usual specialist without becoming overwhelmed. Workflowy, another free program, offered a third format for storing documentation. With the unique ability to condense the steps of a written process into overview form or expand them to show detailed descriptions, as well as filter tasks by the party responsible, Workflowy was an ideal venue for leaders of different units to review team members’ procedures in the context of the larger workflow. Libraries can address organizational and procedural issues by displaying in creative layouts the information gathered in their assessment projects, efficiently formatting that information to meet the needs of specific audiences.


80. …OK, Now What? Reflections on a Six-Year Assessment Cycle

Carolyn Heine (California Baptist University)

The purpose of this poster is to present findings and reflections from a recently completed six-year assessment cycle (2012–2018) conducted at a university library that employs 9 librarians and staff members and supports just under 11,000 students across all degree levels. The design of the assessment cycle included the creation of Student Learning Outcomes (SLOs) and Library Capacity Strategic Goals (LCSGs), five annual assessment plans and reports, one five-year review of assessment practices and findings, one follow-up report to the Director of Assessment on planned changes to SLOs and LCSGs, and one 18-month follow-up to the Associate Provost for Accreditation, Assessment and Curriculum.

Major challenges included justifying to the Director of Assessment the need to alter typical assessment reporting requirements to better suit the library’s areas of assessment, redesigning assessment templates to be meaningful to the library and still meet requirements for both the university and the university’s regional accrediting body, and conducting assessment with limited staffing. Despite these challenges, the library was able to use assessment findings to justify (and receive) increases to several budget lines and help key stakeholders better understand the needs of our library.

The goal of this poster is to provide library professionals with an example of a long-term assessment plan, highlight strategies to make assessment findings meaningful to the library and administration, and identify potential challenges and solutions during the creation, completion, and follow-up of the assessment cycle. The poster will include visuals of parts of the templates used, and the URL for all assessment items will be made available.


81. Sorting Out Library Programs and Services: A Strategic Service Review

Kelley Martin and Cindy Thompson (University of Missouri Kansas City-University Libraries)

Have you tried to list all the programs and services your library provides your users? Or tried to map those services to library programs? UMKC University Libraries created an activity that clearly showed the interconnectedness of our work, helped forge new connections between teams and departments, informed structural changes in the Libraries, and provided us with comprehensive documentation of our programs and services that will guide internal assessment in the future.

Outcomes

  • Create a definitive list of library services that are mapped to library programs
  • Encourage an understanding of the interconnectedness of library work
  • Build consensus for core library services and inform the prioritization of services
  • Develop a shared language around programs, services and tasks

Methodology: Every library department/team created a work inventory. Each item listed on the inventory was termed an activity. We printed each activity on a card with a form, which each department filled out as a group.

The form included space for:

  • the outcome of the activity
  • whether the activity was a program, service, or task
  • the primary department that performs the activity
  • other departments involved in the activity
  • the priority (for services)

To give everyone a frame of reference and create a common language, we created the following definitions:

  • Program = general area of responsibility/work in the Libraries in support of the mission
  • Service = specific area of responsibility/work in the Libraries in support of a program
  • Task = specific work completed in the Libraries in support of a service

If the activity was a service, groups assigned a priority level using the following definitions:

  • 1—Priority—This service needs additional investment this year (though we might still need to look for opportunities for efficiency)
  • 2—Core service to maintain—This service is core to the Libraries’ function and should be maintained at our current level, though we should always look for opportunities for efficiency
  • 3—Core service to scale back—This service is core, but we are not positioned well to maintain or grow it, and thus we should scale it back and look closely for opportunities for efficiency
  • 4—Not a core service—We need to look closely at this service and ask why we continue to do it, or whether we need to eliminate it altogether

Before completing the forms, the participants were instructed to consider the Libraries’ mission as they made their decisions.
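To make the card data concrete, here is a minimal sketch in Python of how one completed form could be represented as a structured record (the example activity, fields, and departments are invented for illustration, not the actual UMKC inventory):

  from dataclasses import dataclass, field
  from typing import List, Optional

  @dataclass
  class Activity:
      name: str
      kind: str                       # "program", "service", or "task"
      outcome: str
      primary_dept: str
      other_depts: List[str] = field(default_factory=list)
      priority: Optional[int] = None  # 1-4, assigned to services only

  # A hypothetical filled-out card:
  card = Activity(
      name="Course reserves processing",
      kind="service",
      outcome="Students can access assigned readings",
      primary_dept="Access Services",
      other_depts=["Cataloging"],
      priority=2,                     # core service to maintain
  )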

Once participants completed the forms, they:

  • sorted the tasks into specific services
  • assigned each service to a library program
  • created new cards for unlisted programs and services

While the participants were filling out and sorting the cards, members of the project team listened to conversations and noted discussions and insights during the participants’ decision-making processes.

During the next phase, department heads and library leadership took the sorted cards and made additional connections between departments.

The project provided critical information for discovering and exploring the libraries’ internal connections, enabling us to take a fresh look at our structure and priorities.


82. Point-Counterpoint: Library Standards and University Accreditation Can Improve Academic Libraries

Megan Stark and Kate Zoellner (University of Montana)

Academic library scenarios related to the application of library standards and university accreditation in the development of library goals, plans, and reports are presented in a point-counterpoint format to expose the benefits and drawbacks of such adoption and alignment. The poster addresses the conference topics of methods and tools, organizational issues, and value and impact.

Purpose: Library director respondents to a 2010 survey on the ACRL Standards for Libraries in Higher Education “clearly indicated a need to align library standards with regional accreditation standards” (Iannuzzi & Brown, 2010, p. 487), yet the “number of research studies from LIS and higher education journals addressing institutional mission and goals and alignment, which includes accreditation, has decreased” (Connaway, Harvey, Kitzie, & Mikitish, 2017, p. 3). The 2016 Ithaka S+R library survey showed that “Library directors feel increasingly less valued by, involved with, and aligned strategically with their supervisors”, and that they perceive institutional budget allocations as not reflecting recognition of the value of the library (Wolff-Eisenberg, 2017, p. 4).

This poster explores the ways in which standards and accreditation impact academic libraries, focusing largely on the possibilities the ACRL (2018) Standards for Libraries in Higher Education offer academic libraries for improvement. Improvement is presented in terms of achieving desired measurable outcomes and demonstrating value. The influence of aligning library goals, plans, and reports with the standards of regional postsecondary institution accreditors on library improvements is also evaluated.

Method and Approach: Academic library scenarios and opinions related to the application of library standards and university accreditation in the development of library goals, plans, and reports are presented in a point-counterpoint format. Scenarios are based on institutional experiences documented on library and university websites and in published literature and reports. Opinions are based on those held by different stakeholders.

Findings: The point-counterpoint format exposes the challenges facing library administrators in the current complex higher education assessment environment, and at a time when funding for public higher education is decreasing.

Practical Implications or Value: The scenario arguments reveal considerations for librarians to address in their use of standards for benchmarking and peer comparisons as well as in their use of accreditation standards in the development of library goals, plans, and reports. This poster relates to the research question “How do library administrators and staff support accreditation efforts, and are these efforts recognized by the institution?” identified for further study in the ACRL (2017) Academic Library Impact report (p. 4) and responds to the newly revised ACRL (2018) Standards for Libraries in Higher Education.

Association of College & Research Libraries. (2018). Standards for libraries in higher education. Chicago, IL: American Library Association.

Connaway, L. S., Harvey, W., Kitzie, V., & Mikitish, S. (2017). Academic library impact: Improving practice and essential areas to research. Chicago, IL: Association of College & Research Libraries.

Iannuzzi, P., & Brown, J. M. (2010). ACRL’s standards for libraries in higher education: Academic library directors weigh in. C&RL News, 71(9), 486–487.

Wolff-Eisenberg, C. (2017). Ithaka S+R US library survey 2016. New York, NY: Ithaka S+R.