Teaching & Learning
Transparency in tutorials and research guides: about students, with students, for students
Abby Koehler, Whatcom Community College
View Poster (PDF)
Keywords: tutorials, research guides, content creation
Purpose & Goals
Librarians update tutorials and research guides as needed but don’t always know what students’ experiences of those tools are like. A two-year student-driven study reformed our library’s tutorials and guides using the Transparency in Learning and Teaching (TILT) framework.
Design & Methodology
Transparency in Learning and Teaching framework, focus groups, usability testing using authentic assessment, qualitative analysis.
Findings
Nothing about students, made without students, can truly be for students. This research demonstrated new pathways for designing library content based on student and faculty scholarship and undergraduate course-based research experiences.
Action & Impact
We began to programmatically assess student experiences of our tutorials and research guides. We completely redesigned the tutorial site and have a burgeoning method for mixing and reusing content on research guides to meet student needs within courses.
Practical Implications & Value
Our college community has come together through this work, and it has led to deeper partnerships with STEM faculty on research, instruction, and digital repositories. The authentic assessment is a lightweight tool to capture student needs and align library content and instruction with course and college goals.
Continuous Improvement in the Face of Change: Evaluating and Enhancing Library Webinars at an Online University
Amanda Bezet, National University
Tamara Ivins, National University
View Poster (PDF)
Keywords: Online university, virtual library, instruction, webinars
Purpose & Goals
It may seem counterintuitive to launch a new assessment project during a time of massive institutional change, but we wanted to ride the wave of change and innovation sweeping our university and leverage it to establish a culture of continuous assessment and improvement of our library instruction offerings. Our project had two specific goals. First, we wanted to minimize the cognitive load needed to understand the format and purposes of our webinars. In the wake of National University and Northcentral University’s recent merger, our students were bombarded with a vast number of new resources and updated information. It was therefore essential that our library services (including webinars) were intuitive and easy to understand. Our second goal was to ensure an effective return on investment of librarian time by identifying both less-attended and highly popular webinars, given the increased post-merger student population.
Design & Methodology
We conducted a quantitative analysis of webinar attendance data over the previous year. Data collection involved collating and correcting data from two LibCal systems, due to the recent merger of two library systems. Attendance data was then averaged and correlated with audience, time of day, and day of the week, and the results were color-coded to support rapid interpretation. The quantitative analysis was augmented with a qualitative assessment of librarian feedback and published literature to contextualize attendance data, interpret meaning, and extrapolate best practices.
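The abstract describes this analysis at a high level; a minimal sketch of one way to average and color-code attendance, assuming the two LibCal exports have already been merged into a single CSV with hypothetical column names, might look like this in pandas:

```python
# Hedged sketch of the attendance analysis described above, not the
# authors' actual code. Assumes a merged CSV of both LibCal exports
# with hypothetical columns: title, audience, day_of_week, start_hour,
# attendance.
import pandas as pd

df = pd.read_csv("webinar_attendance_merged.csv")

# Average attendance per webinar title, lowest first.
by_title = df.groupby("title")["attendance"].mean().sort_values()

# Average attendance by day of week and start hour.
by_slot = df.pivot_table(index="day_of_week", columns="start_hour",
                         values="attendance", aggfunc="mean")

# Color-code the slot table for rapid interpretation (renders in a notebook).
styled = by_slot.style.background_gradient(cmap="RdYlGn")

# Flag candidates for retirement vs. more frequent offerings.
print("Least attended:\n", by_title.head(5))
print("Most attended:\n", by_title.tail(5))
```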
Findings
From our research, we generated a list of best practices for webinar titles, descriptions, branding, and tags. We also were able to identify several under-attended webinars suitable to be retired from their current live webinar format, as well as some highly popular webinars that should be offered more often. Through the internal assessment process, we also identified our webinars as an opportunity for continuous review and improvement.
Action & Impact
Our first output was a written report of recommendations shared with library leadership. From there, we conducted presentations on our findings to library leadership, the library instruction team, and the campus teams. Impacts included implementing our recommendations and launching a biannual systematic assessment to foster a culture of continuous improvement among our instruction team.
Practical Implications & Value
Attendees will take away practical tips for collecting and analyzing library webinar data and gathering internal expertise to contextualize the data. Attendees will be prepared to implement a similar project at their institution, share findings with internal and external stakeholders, and generate support for continuous assessment and improvement of their library instruction offerings. Institutional change can feel overwhelming, but embracing that change creates the opportunity for impactful and meaningful assessment.
Evaluating One-shot Asynchronous, Online Historical Primary Source Instruction: a Case Study Using Student Feedback
Amanda Roth, UC San Diego Library
Additional Author:
Dominique Turnbow, Instruction Librarian, UC San Diego Library
View Poster (PDF)
Keywords: Teaching and learning, primary source assessment, asynchronous instruction assessment, tutorial assessment, primary sources, rubric design
Purpose & Goals
How can we effectively assess students’ ability to apply the primary source evaluation and use skills taught via an online tutorial, in a large-scale (250+ students) writing program course that does not provide access to student work products?
Design & Methodology
We reviewed data from a writing program course taught in two quarters with a combined enrollment of 574 students. Students completed an online tutorial where they learned skills required to analyze primary sources. The examples used in the tutorial were a letter and an image from the Haitian Revolution. The assessment methodology reflects best practices in rubric design and instructional design assessment, including sample sizes and interrater reliability. The questions students answer about primary sources are grounded in best practices for teaching critical evaluation of primary sources. Research that supports our work includes:

Chan, Z., & Ho, S. (2019). Good and bad practices in rubrics: The perspectives of students and educators. Assessment & Evaluation in Higher Education, 44(4), 533-545. https://doi.org/10.1080/02602938.2018.1522528

Sadler, D. R. (2009). Indeterminacy in the use of preset criteria for assessment and grading. Assessment & Evaluation in Higher Education, 34(2), 159-179. https://doi.org/10.1080/02602930801956059

Johnson, R. L., Penny, J., & Gordon, B. (2000). The relation between score resolution methods and interrater reliability: An empirical study of an analytic scoring rubric. Applied Measurement in Education, 13(2), 121-138. https://doi.org/10.1207/S15324818AME1302_1

Keeping Up With… Primary Source Literacy: https://www.ala.org/acrl/publications/keeping_up_with/primary_source_literacy
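The methodology cites interrater reliability work but does not name the statistic used; for two raters applying an ordinal analytic rubric, weighted Cohen’s kappa is one common choice. A minimal sketch with invented scores:

```python
# Hedged sketch, not the authors' method: weighted Cohen's kappa for
# two raters scoring the same responses on a 4-level rubric category.
# The scores below are made up for illustration.
from sklearn.metrics import cohen_kappa_score

rater_a = [3, 2, 4, 1, 3, 2, 4, 3]
rater_b = [3, 2, 3, 1, 4, 2, 4, 3]

# Quadratic weights penalize large disagreements more than near-misses,
# which suits ordinal rubric levels.
kappa = cohen_kappa_score(rater_a, rater_b, weights="quadratic")
print(f"Weighted kappa: {kappa:.2f}")
```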
Findings
Our data show 50% of students can describe primary sources after completing the tutorial, but many struggle to compare two sources. Most students are able to make inferences and discuss how they would use primary sources for research purposes. We conclude that there are ways we can improve the instructional content and assessment questions.

Describe: Half of the students scored highly on being able to describe a source (i.e., identifying people, objects, and activities; correctly identifying the setting; making observations about the creation; and correctly identifying the audience for the text). Students struggle most with identifying where the primary source originated.

Compare: Less than one third of the students were able to compare the primary sources in the tutorial. Most students were able to compare observations about the two sources, but they lacked the critical thought that higher-scoring responses demonstrated. The students who scored higher in this category were able to articulate observations around a shared theme.

Infer & Reflect: Most students were able to make inferences and reflect on the source; however, they struggled with discussing a source’s bias. This category includes why a source is created, understanding the bias, and asking questions about the source. Students were able to infer the purpose of a source and ask relevant questions about it, but they were unable to discuss bias beyond obvious observations, e.g., “it was written in the first person.” Students who scored higher were able to connect the historical context of the source with the creator.

Use: Most students were able to demonstrate that they knew how to use this source. This category includes observing what information could be provided about the context of the source and how the information could be used to support data, argument, or background (the framework presented in the tutorial).
Action & Impact
Assessment allows instructors to adjust content or delivery where needed to improve student learning. Moving forward, we plan to refine our rubric so that it accurately reflects our intentions for student learning. We also want an assessment process that scales to hundreds of students without requiring hours of librarian labor to review student responses.

Describe: Student responses indicate that students need more instruction on how to identify where a source originates. We plan to include more explicit content in the tutorial to address this.

Compare: We reviewed the learning outcomes and content and realized that there was no specific learning outcome for “compare” and minimal content was presented. We will add a learning outcome and content to address this; specifically, we want to encourage students to focus on shared themes from the text and image rather than simple observations.

Infer & Reflect: We want to improve how we present content around identifying a source’s bias so that students can critically evaluate a creator’s bias within the historical context of the source.

Use: Based on student scores, the tutorial content is solid here; however, many students do not consult the framework we ask them to use when considering how to incorporate a source into their paper. We will draw more attention to this in the next iteration of the tutorial.
Practical Implications & Value
This project tackled the challenge of teaching students how to evaluate and use primary sources via an online tutorial, as well as how to assess qualitative student responses from one-shot instruction for a large population. Summative assessment is particularly challenging for one-shot courses due to the lack of access to student artifacts. Our assessment design provides librarians with a rubric that could be modified to examine student artifacts, draw conclusions about the effectiveness of student learning, and identify ways to improve workshop content.
We Discover and They Learn: Partnering with Undergraduate Business Students on Library Assessment Projects
Carolyn Heine, California Baptist University
Kathryn Meeks, California Baptist University
View Poster (PDF)
Keywords: experiential learning, business students, assessment methods, campus partnerships
Purpose & Goals
The purpose of this poster is to provide examples of two types of library assessment projects completed through undergraduate course partnerships. Libraries have far more potential assessment projects than capacity to complete them. Staff size, time, budget, and other resources determine the quantity, breadth, and depth of assessment queries, and some questions may never be investigated because others take priority. Libraries are also encouraged to involve students throughout the assessment process, but it can be challenging to find students willing to participate in the design and analysis stages. Undergraduate business students learn how to conduct basic research studies and create a product or plan to address a client’s need. Experiential learning (EL) is an evidence-based pedagogical approach to teaching the skills needed to serve future clients. However, undergraduate courses often rely on fabricated or historical case studies for EL, and students have limited access to current, real-life opportunities outside of an internship or directed learning course (Mitchell & Rich, 2021). The CBU library had two assessment projects it could not pursue due to capacity limitations but was able to complete by partnering with undergraduate courses. These assessment projects are 1) a mixed methods investigation of the underutilization of the Interlibrary Loan book borrowing service by CBU students and 2) the creation of an interactive data dashboard on database cost and usage.
Design & Methodology
The first partnership was with an upper-division marketing research course where students learned how to design a mixed methods study, collect and analyze data, and make recommendations to the client. One librarian visited the course to present the issue: Why are so few CBU students using the Interlibrary Loan book borrowing service? The presentation included an explanation of the service, the current usage data, and other relevant information. Students were then split into teams of 4-5, and each team spent several weeks designing and implementing a questionnaire and analyzing the data. A librarian and the course instructor provided feedback on drafts of each team’s survey instrument. After project completion, each team gave a 15-minute presentation on its findings and recommendations to multiple library staff members. All data collection instruments, raw data, write-ups, and presentations were given to the library. The second partnership was with two students in an upper-division business analytics course to design a streamlined data dashboard. We met with our assigned students to present the current data and data entry processes and explain what we were looking for in a data dashboard. After the initial meeting, the students met with one of us every 2-3 weeks to exchange questions and feedback. After the dashboard was completed, the students made a formal presentation. We tested it for user experience, and final adjustments were made.
Findings
The ILL project validated what we suspected but lacked data to support: students were not using the ILL book borrowing service because 1) they did not know the service existed and 2) it takes longer than requesting journal articles. However, the teams unearthed a third reason for low usage: the request process for books is not integrated into our databases the way journal article requests are. Each team also made several recommendations for improving general social media practices and presence. Feedback from each team was collected through course evaluations and, more informally, through class discussions after the presentations. Feedback was very positive. Students enjoyed having a legitimate, current issue to investigate and felt their work was a valuable contribution to the library’s assessment processes. Students felt heard and seen when library staff responded positively to their many recommendations, particularly for the ILL project. The two students who created the data dashboard expressed excitement about creating a tool that would improve workflow, decision-making, and reporting. An unexpected outcome was that both students and faculty indicated a greater appreciation for the library. In particular, the faculty and students involved in the database usage dashboard project all commented that they had no idea how expensive library databases could be and were surprised at the complexities of database usage and cost analysis. We consider both projects a success. The time spent meeting with the faculty and students was far less than we would have spent completing each project internally. Although the ILL project methodologies had significant limitations, the project still gave us enough to confirm our hypotheses while providing unexpected findings. The data dashboard project was more straightforward and easily adopted.
Action & Impact
The ILL project findings and recommendations were used to adjust marketing strategies, and there has been an increase in student usage of the ILL book service. We are currently investigating a way to better integrate an ILL request link for books within relevant databases. Additionally, the library is creating a paid marketing intern position to improve and expand our outreach through social media. The data dashboard for database cost and usage has been adopted by the library and has improved the data entry, analysis, and reporting processes. The data dashboard has also been a useful tool to educate faculty about database usage and cost without revealing potentially sensitive information or overloading faculty with raw data. We are currently working on designing projects for future classes.
Practical Implications & Value
This project serves as a model for how other libraries can partner with campus faculty to improve student learning experiences and expand library assessment practices. Partnering with undergraduate business students benefits students and libraries. Libraries can complete more labor-intensive projects and begin answering those questions often left on the back burner. Students can apply what they’re learning through coursework to meet a real world need and bolster their project portfolios. It is challenging to engage students throughout the assessment process, and this can be a potential stepping stone to drawing student interest in library assessment projects. We hope this project helps expand the profession’s conception of what library assessment looks like and the ways in which we can involve the people our libraries serve.
Curator Definitions and Duties: A Qualitative Case Study
Duane Wilson, Brigham Young University
Additional Authors:
Emily Rodriguez, Brigham Young University Student
Jillian Bunderson, Brigham Young University Student
Isabella Beals, Brigham Young University Student
View Poster (PDF)
Keywords: Qualitative Methods, Case Study, Special Collections, Special Collections Curator
Purpose & goals; What questions does this paper address
The purpose of this study was to understand the position of Special Collections Curator at the BYU Library. Questions had arisen about the role of the curator: how curators perceive their position, how they work with each other, and how they interact with others in the library. This study was designed to answer the following questions: 1) How do special collections curators define their position and associated duties? 2) How do they work together and with others in the library? 3) What qualifications are needed for their positions?
Design & methodology
This study followed a qualitative case study design, using interviews of all curators to obtain the data. Interviews were transcribed, anonymized, and coded. Thematic network analysis was used to develop the themes.
Findings
Curators defined their position as an individual having stewardship over a collection. This stewardship involved developing and maintaining a collection development policy, working with donors, identifying and appraising potential collections to acquire, providing basic processing, identifying items for conservation, and providing outreach. Curators also thought that providing primary source research opportunities to students was critical.
Some curators felt alienated because their duties differed from others’ or because they were excluded from certain important meetings. Some felt that they didn’t have time to complete their required tasks. They also felt they should work more closely with subject librarians to provide outreach. They thought that the BYU curator position was unique and that other institutions primarily had archivists focused on maintaining rather than developing a collection.
Curators considered an MLS critical for their positions, in addition to subject knowledge related to the area they were assigned. Archivist certifications were considered less important, though people skills were considered critical.
Action & Impact
This study helped uncover and describe some of the concerns that curators had with their positions. It also highlighted that curators wanted to work more with others in the library but didn’t want to step on the toes of subject librarians who had primary responsibility for outreach to faculty and students. It helped the administration understand the issues related to overworked curators so that they could be discussed and resolved. After the study, an additional full-time position was allocated to assist the busiest curator.
Practical Implication & Value
This study can serve as a template for qualitative studies at other institutions. It shows how an institution can understand the needs of particular groups, uncover those needs, and address them. This study also helps define the special collections curator role, since very few descriptions are available in the literature. It can provide a point of comparison for other institutions to see the similarities and differences between their curators and BYU’s.
Making a Difference in Healthcare: Impact of Librarian-Mediated Literature Review Services
Heather Martin, Providence Health System
Carrie Grinstead, Providence Saint Joseph Medical Center
Additional Authors:
Danielle Linden, MLIS, AHIP, Medical Librarian, Providence St. Joseph Hospital
View Poster (PDF)
Keywords: Medical Libraries, Hospital Libraries, Medical Librarians, Healthcare, Literature Searching, Reference Services, Program Evaluation
Purpose & Goals
Providence Library Services provides electronic resources and professional library services to employees and affiliated medical providers of a large not-for-profit health system consisting of 52 hospitals and affiliated clinics geographically distributed across 7 Western US states. Library patrons include employees in a wide range of job roles, but predominantly clinical positions such as physician, nurse, or pharmacist. One of the most heavily used services is the literature search service, in which librarians search databases and grey literature for evidence to support patient care delivery. The Providence Library did not have a systematic way to gather feedback from our users regarding our literature search service. While we receive many grateful responses to our literature searches, we need more detailed, consistent information about our impact to inform leadership and improve our services. Previous studies have documented the importance of library services in hospitals and the impact librarians make on quality patient care. As a small team, we need to carefully evaluate the effects of our work, share successes, and gather information that will help us improve. We conducted this IRB-approved quality improvement project to understand whether our professional literature review service impacts patient care and saves clinicians’ time in a large integrated health system.
Design & Methodology
We adapted, with permission, the survey instrument and methods of a recent quality improvement project by Siemensma et al. (2021) in the Journal of Hospital Librarianship.
Our quality improvement project received IRB approval in late 2022. Throughout 2023, we sent the adapted survey to all employees who requested literature searches. When a librarian completed a search and sent results to the requestor by email, they noted the requestor’s email address in a REDCap form. Each Friday we ran a report of all email addresses that had received literature searches two weeks prior. Using a generic email address associated with our private SharePoint site, we sent an email with instructions and the survey link to requestors. Because the purpose and results of each search were different, employees received an email with the survey link each time they requested a new search. We used Excel to compile summary statistics, documenting survey response rate, respondents’ primary roles, impact on practice, perceived quality of the literature review, and time saved.
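As a rough illustration of the weekly reporting step, here is a minimal sketch that selects requestors whose results were delivered two weeks earlier; the CSV export and column names (email, date_sent) are assumptions for illustration, as the project itself ran this report from REDCap:

```python
# Hedged sketch of the two-weeks-prior follow-up report; the file and
# column names are hypothetical, not the authors' actual workflow.
from datetime import date, timedelta
import pandas as pd

df = pd.read_csv("literature_search_log.csv", parse_dates=["date_sent"])

# The Friday report covers deliveries made during the week ending
# two weeks ago.
window_end = date.today() - timedelta(weeks=2)
window_start = window_end - timedelta(days=6)

sent_dates = df["date_sent"].dt.date
due = df[(sent_dates >= window_start) & (sent_dates <= window_end)]

# One invitation per search request, even for repeat requestors.
for email in due["email"]:
    print(f"Send survey link to: {email}")
```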
Findings
Data collection closed in January 2024 after one full year. 1,048 emails were sent to 924 unique clinicians/employees, with a response rate of 22.61%. Respondents came from across all Providence regions, with the top job roles being Nursing Staff (22.89%), Medical Staff (15.14%), Administration (12.68%), and Educator (12.32%). 203 respondents said our search added to their knowledge base, 208 said it modified or confirmed their current clinical practice, 89 said it modified or confirmed current policy, 44 said it improved their productivity, and 38 said it impacted advice given to their patients. Perceived quality of literature search results was overwhelmingly rated as Very High (84.98%) or High Quality (13.3%). Most respondents (59.83%) indicated that they saved 4 hours or more by using the literature search service. Overall, we received very positive feedback from users, and nearly everyone perceived our search quality to be high or very high. Results show that our literature search service saves clinicians’ time and that the information provided impacts both patient care and policy. However, one limitation of our study is the possibility of voluntary response bias, whereby only satisfied users answered the survey. These results are not generalizable outside our organization, but our project builds on a body of literature demonstrating the value of professional library services in medical settings.
Action & Impact
These survey results allow us to carefully evaluate the effects of our work and gather information that will help us improve. Direct feedback from clinicians will be used to inform leadership and make a case for growing our services. We plan to share these results with health system leadership, at nurse research councils across the enterprise, and in the greater professional librarian community to encourage other libraries to evaluate their service impacts.
Practical Implications & Value
Hospital libraries are non-revenue-generating departments and are at risk of closure as the healthcare industry faces financial constraints. A librarian-mediated literature search service can positively impact patient care, save clinician time, and align with health system objectives. Projects like ours demonstrate the tangible value of these services to healthcare leadership and other stakeholders. Additionally, they build on the existing literature adding to the evidence base and strengthening the value proposition of hospital and health system libraries. The methods outlined in this project could also be applied to other special library or information services contexts where librarians are providing literature search services to a professional patron base.
Assessment Matters: Gathering evidence to support student learning outcomes, librarians’ research, and professional development
Louise Lowe, University of Arkansas Little Rock
View Poster (PDF)
Keywords: Evidence based practices, peer observation, SoTL, program assessment, library instruction
Purpose & Goals
Libraries often struggle to demonstrate their impact on student learning outcomes beyond anecdotal evidence. With an increasing value placed on skills-based learning, co-curricular areas, including the library, have extraordinary opportunities to demonstrate impact on student learning and achievement measures.

Background: In 2018, our campus participated in a regional assessment academy to implement a systematic approach to co-curricular assessment. Co-curricular leaders, including a librarian, participated. Over the past five years, assessment practices for library co-curricular (teaching and learning) activities have evolved into an integrated practice that includes multiple forms of assessment evidence from multiple voices. This poster highlights assessment strategies guided by Universal Design for Learning, the Community of Inquiry framework, the Council for the Advancement of Standards in Higher Education (CAS), and the university’s general education core skills requirements.

Questions this poster addresses:

How do self and peer evaluations impact the effectiveness of teaching and learning strategies when used alongside assessment of student learning outcomes? This poster highlights the use of observations that incorporate self, peer, and student perspectives to improve the quality of teaching evaluations.

What assessment evidence generated from self and peer evaluations supports SoTL research? This poster highlights assessment as research evidence, supporting evidence-based practices and the scholarship of teaching and learning for faculty librarians.

How do self and peer evaluations contribute to self-efficacy and ownership of professional development for individual librarians? This poster highlights how developing critical reflective practices strengthens librarians’ responsibility over their work and increases ownership of their own professional development.
Design & Methodology
Librarians complete an assessment summary for each section. First, librarians describe the intended outcomes, summarize assessment results, and provide artifacts. Second, librarians describe what happened and reflect on how engaged students were with the librarian, the content, and each other. Finally, librarians state how they will use the assessment information to improve the session or future sessions.

Currently, librarians must complete at least two self-observations and two peer observations per semester; they must both observe and be observed by other librarians. The current observation form is deliberately simple in order to build initial buy-in. It provides basic yes/no options to ensure session design meets Universal Design for Learning guidelines. An additional three yes/no questions focus on students’ engagement with the instructor, the content, and each other, integrating concepts from the Community of Inquiry framework (i.e., teaching, social, and cognitive presence). The last question is open-ended: librarians write about what they observed in the session, using the pre-meeting and the previous observation questions for context. Librarians are encouraged to share what went well and provide one achievable strategy to improve; recommendations are not required but are encouraged. Librarians are also encouraged to meet before and after an observation to discuss desired outcomes and the feedback.

Multiple measures and multiple perspectives increase the value of assessment evidence that can improve teaching and learning strategies, increase librarians’ ownership of their professional development, and help demonstrate the library’s impact on skills development.

Krishnan, S., Gehrtz, J., Lemons, P. P., Dolan, E. L., Brickman, P., & Andrews, T. C. (2022). Guides to Advance Teaching Evaluation (GATEs): A resource for STEM departments planning robust and equitable evaluation practices. CBE Life Sciences Education, 21(3), ar42. https://doi.org/10.1187/cbe.21-08-0198
Findings
Preliminary findings / emerging themes:
1. Librarians are very critical of their own abilities; observation has improved confidence.
2. Librarians are embracing continuous improvement.
3. Librarians are suggesting ways to improve assessment practices.
4. Librarians are able to use data to support their teaching and learning research.
5. Librarians need training on how to provide feedback.
6. Feedback needs to be defined and normalized.
7. The library is increasingly becoming part of the campus conversation on teaching and learning best practices and co-curricular assessment.
Action & Impact
We plan to make adjustments to the assessment tools based on feedback and lessons learned. We will collaborate on ways to make the tool more effective and the feedback more valid, and we will seek professional development on ways to provide effective feedback.
Practical Implications & Value
This poster will provide strategic ways to grow assessment practices. The method is grounded in theory yet delivered in a practical way because of the flexibility to start where you are, take what you want, and improve as you go. Libraries’ primary learning opportunities are typically one-shot, 50-minute sessions. Relying on multiple assessment methods can help libraries tell a more accurate and comprehensive story to campus stakeholders while providing evidence to support librarians’ scholarship of teaching and learning.
The National Survey of Student Engagement: Information Literacy, Student Engagement, and Building Connections with Institutional Assessment
Maoria Kirker, George Mason University
Britt Foster, University of California, Riverside
Additional Authors:
Nicole A. Branch, Santa Clara University
Craig Gibson, Ohio State University
Carrie Forbes, Southern Oregon University
Merinda Kaye Hensley, University of Illinois Urbana-Champaign
Lalitha Nataraj, California State University San Marcos
Mea Warren, University of Houston
Jessica Hawkes, Nicholls State University
Amy Pajewski, West Chester University of Pennsylvania
View Poster (PDF)
Keywords: Student engagement, Information literacy, Campus assessment partnerships
Purpose & Goals
This poster will present the ongoing research of the ACRL National Survey of Student Engagement (NSSE) Information Literacy Module Review Task Force, which was charged with revising the “Experiences with Information Literacy” topic module. The NSSE measures first-year and senior undergraduates’ engagement and participation in programs and services at four-year colleges and universities. In 2020, the “Experiences with Information Literacy” topic module was placed on hold for review and realignment with the ACRL Framework for Information Literacy for Higher Education. This module offers libraries an opportunity to connect information literacy to student engagement indicators on their campuses. Additionally, it can serve as an outreach tool for librarians with their faculty and campus partners. Revised questions for the topic module were beta tested during the 2023 administration of the NSSE, and the finalized module is an option for the 2024 survey. The Task Force’s work is now focused on the research question, “How can libraries use the assessment results of the NSSE and ‘Experiences with Information Literacy’ module to inform outreach practices and student engagement work?” We hope that this project will help address how libraries can build stronger connections and partnerships with their institutional assessment offices, as they are often the responsible parties for NSSE administration.
Design & Methodology
Assessment and research have driven the work of the ACRL NSSE Task Force thus far. To revise the module, we deployed a survey to, and ran a focus group with, libraries that had run the NSSE “Experiences with Information Literacy” module during its initial iteration. The results of these projects informed the module revision. Beta testing through the national administration, in partnership with administrators from NSSE, occurred in spring 2023, and the results of that testing informed final revisions of the module. At the time of the conference, we will have institutionally de-identified results from the first administration of the revised module. We plan to analyze not only the results of the module questions but also compare them to the NSSE’s engagement indicators across the main survey. These indicators include Higher-Order Learning, Reflective & Integrative Learning, Learning Strategies, Quantitative Reasoning, Collaborative Learning, Discussions with Diverse Others, Student-Faculty Interaction, Effective Teaching Practices, Quality of Interactions, and Supportive Environment. In comparing these indicators with the self-reported data from the IL module, we hope to draw statistical connections between information literacy and student engagement on campuses.
Findings
Beta testing of the module provided the Task Force with preliminary results about the interaction of information literacy with student engagement questions. For example, analysis using Pearson’s correlation on data from previous modules indicated positive correlations with moderate effect sizes for questions connected to the engagement indicators of “Reflective & Integrative Learning” and “Supportive Environment.” We plan to test these correlations and relationships with the results from the revised “Experiences with Information Literacy” module.
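As a sketch of what this kind of comparison can look like, a Pearson’s correlation between respondent-level module scores and an engagement indicator can be computed as below; the data file and column names are hypothetical, not the Task Force’s actual dataset:

```python
# Hedged sketch of the correlation analysis, not the Task Force's code.
# Assumes a de-identified, respondent-level export with hypothetical
# columns il_module_score and reflective_integrative.
import pandas as pd
from scipy.stats import pearsonr

cols = ["il_module_score", "reflective_integrative"]
df = pd.read_csv("nsse_deidentified.csv").dropna(subset=cols)

r, p = pearsonr(df[cols[0]], df[cols[1]])
print(f"r = {r:.2f}, p = {p:.4f}")

# Interpretation conventions vary; one common rule of thumb treats
# |r| around 0.3 as a moderate effect.
```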
Action & Impact
The goal of the Task Force over the next couple of years is to inform the library community about how they can partner with institutional assessment/research offices on their campuses to add the “Experiences with Information Literacy” module to their local administration of the NSSE. This national survey offers libraries an opportunity to plug into institutional assessment efforts that may already be underway on their campuses. By administering this module, libraries can collect assessment data to support the role of information literacy in student engagement. To this end, the Task Force plans to create a toolkit for libraries that run the NSSE, with or without the “Experiences with Information Literacy” module, to help them use this assessment data as an outreach tool with institutional partners such as Student Life or other student success initiatives.
Practical Implications & Value
In addition to educating the library community about the NSSE and its topical options, we hope to inspire libraries to partner with institutional assessment to connect the work of libraries to larger campus-wide initiatives.
What did students learn? Comparing two models of integration in a first-year writing course
Mariya Gyendina, University of Minnesota
Lindsay Matts-Benson, University of Minnesota Libraries
View Poster (PDF)
Keywords: assessment, student feedback, curriculum mapping, writing courses, course integration
Purpose & Goals
The university library has a long-standing partnership with the first-year writing course. In a typical year the course enrolls approximately 2,000 students, with the libraries engaging 96% of them. In the past four years, we have tried many collaboration models, including facilitating curriculum-based information literacy activities, providing assignment feedback, teaching a single-session workshop (one-shot), and consulting with instructors to empower them to deliver the IL instruction themselves. The choice of collaboration model can depend on the libraries’ and the writing course staff’s capacity, logistics, timing, and curriculum buy-in. One persistent internal conversation focuses on the learning outcomes and impact of the libraries’ instruction. To answer this question, we collected end-of-class feedback from first-year writing students in the fall of 2023 and spring of 2024. Specifically, we seek to answer the following questions: How do the students’ “lessons learned” map onto the course and session objectives? What patterns appear in the responses to the questions about what they already knew and what questions remained? Are the results different for course sections with embedded librarians versus one-shot sections?
Design & Methodology
Two different groups of students completed the feedback survey: a smaller section of students who were in sections with embedded librarians (librarians provide feedback on 3 assignments and teach a live class on finding and evaluating sources) and a larger group of students with a single-session information literacy workshop (100-minute class on finding and evaluating sources). At the end of the session the students completed a short survey via QR Code or link asking them about what they already knew on the topic, what they learned, and what questions they still had (a KWL activity).
Findings
While the data collection for the Spring semester is still ongoing, the preliminary results point to limited differences between the sections with different library partnership models and also within the groups. Preliminary data suggests that students recall information related to learning outcomes covered at the beginning or end of the workshop, rather than the content in the middle of the session. Data also suggests that nearly all students are able to identify something new that they learned, but this varies based on their prior experience with information literacy and the research process.
Action & Impact
Based on the findings, we plan to develop a more effective pre-class assignment that assesses prior knowledge and focuses the workshop session. Additionally, we plan to use the data on what students identified as unclear to adjust the workshop curriculum and post-class resources. On the staff side, we plan to use the student feedback data to enhance our teaching practices in a workshop setting, including connecting the student feedback to a peer observation of teaching model with our cohort of instructors.
Practical Implications & Value
Practical implications of this work include reflecting on how we can build consistent feedback channels and contribute to the larger understanding of how course integration models can work.
Assessing Library Instruction: Lessons Learned from a Pilot Evaluation Survey for Non-Credit Instruction in Academic Libraries
Matthew Frizzell, Georgia Tech Library
Marlee Givens, Georgia Tech
View Poster (PDF)
Keywords: Instruction, Evaluation, Teaching and Learning, Student Success, Mixed Methods, Promotion Tools, Qualtrics
Purpose & Goals
Can we create an evaluation survey for course-integrated instruction and drop-in workshops that complements the for-credit course evaluation tools used at the campus level? Many librarians and archivists at this institution teach, but most of this instruction is not for course credit. The few who teach courses for credit benefit from evaluation at the campus level through the Course-Instructor Opinion Survey (CIOS) from the Office of Academic Effectiveness. CIOS results are attached to librarian and archivist promotion dossiers. Our library has piloted a survey for non-credit library instruction. When this project started, changes to our department’s promotion process and integration with promotion review processes at the campus level motivated us to demonstrate instruction effectiveness. The pilot survey is similarly meant to offer meaningful feedback for individual instructors, their supervisors (annual review), the library review committee (promotion and cumulative reviews), library leadership, and our non-tenure-track faculty peers (promotion reviews).
Design & Methodology
The non-credit instruction evaluation survey takes a mixed methods approach using Qualtrics. The survey was designed in two parts: 1) the survey itself, which consists of five Likert-scale questions and one open-ended response field, and 2) a URL generator that embeds metadata, allowing for consistent formatting and easier categorization. In designing this project and creating a methodology, one goal was to develop a consistent vocabulary and approach to evaluating instruction. Part of the design process included creating a shared glossary, which helped determine the scope of instructional types. A working group developed the questions and refined the survey focus, gathered feedback from library faculty colleagues, and launched a pilot program with training.
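To illustrate the second part, a URL generator along these lines can append session metadata to the survey link as query-string parameters, which Qualtrics can capture as embedded data once the fields are declared in the survey flow. The base URL and field names below are hypothetical, not the actual survey’s schema:

```python
# Hedged sketch of a metadata-embedding URL generator, not the
# authors' actual tool. BASE_URL and the parameter names are made up.
from urllib.parse import urlencode

BASE_URL = "https://example.qualtrics.com/jfe/form/SV_XXXXXXXX"

def survey_link(instructor: str, course: str, session_type: str,
                session_date: str) -> str:
    """Build a survey URL carrying session metadata as query params."""
    params = urlencode({
        "Instructor": instructor,
        "Course": course,
        "SessionType": session_type,  # e.g., "workshop" or "course-integrated"
        "Date": session_date,
    })
    return f"{BASE_URL}?{params}"

# Example: a link for one drop-in workshop session.
print(survey_link("Frizzell", "LMC-2400", "workshop", "2024-03-15"))
```

Generating one link per session keeps the metadata consistent across instructors and lets responses be categorized without asking students any identifying questions.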
Findings
After launching the pilot project in spring 2023, we analyzed the first round of results and feedback from the pilot participants. The focus of this initial debrief was to understand the experience of administering the survey and any issues. Following the debrief, we made changes to the tool itself, editing the questions based on instructor feedback and attendee responses. At the time of this proposal, one year into the pilot, we have data from 12 instructors across 85 classes. We have collected over 1,000 responses and over 250 open-ended comments. The poster will show average scores, the estimated response rate, a selection of typical comments, and any differences noted between responses for course-integrated instruction and drop-in library workshops. A finding of the project was that the nature of the short survey did not lend itself to deep or actionable criticism of the instructors or courses. This was purposeful: the survey designers sought to maximize student response rates in an environment where students suffer from survey fatigue. In reviewing the literature during survey design, we found that students are not motivated to complete a long survey (Hoel & Dahl, 2019), and that low response rates can indicate lower validity of the results (Chapman & Joines, 2017). Moreover, librarians and archivists can use other means to solicit more meaningful feedback, such as peer teaching observations or surveying the professors in whose courses we provide instruction.

Chapman, D. D., & Joines, J. A. (2017). Strategies for increasing response rates for online end-of-course evaluations. International Journal of Teaching and Learning in Higher Education, 29(1), 47-60.

Hoel, A., & Dahl, T. I. (2019). Why bother? Student motivation to participate in student evaluations of teaching. Assessment & Evaluation in Higher Education, 44(3), 361-378.
Action & Impact
The Georgia Tech Library now has a tool we can use to show the quality, impact, and value of Library non-credit instruction for both workshops and course-integrated instruction. At the time of this proposal, we have presented the first year’s worth of findings to library faculty and leadership, and general results will be included in the Library’s 2023 impact report. In the coming year or two, we will be able to test whether the survey results are meaningful additions to annual or promotion review dossiers. This will be evaluated by gathering feedback from the librarian or archivist using the tool, their managers, and the library and Institute faculty review committees. Another area of focus will be whether the comments and scores create actionable feedback to improve instruction or lead to follow-up processes such as peer or campus-led teaching observations. We hope that the tool and the resulting critiques will improve instruction and therefore student success.
Practical Implications & Value
Student success is a nationwide concern, and the extent to which improvements to library instruction contribute to better student outcomes is a question we hope to shed light on. A review of the literature indicates ongoing efforts at assessing student learning in library instruction sessions, but fewer examples of student evaluations of library teaching. We hope our poster will add valuable feedback and contribute to the body of research and librarianship. Additionally, there seems to be a trend toward data-based decision-making within libraries, mirroring trends across the academy. Our project offers a potential model for applying student evaluations of teaching to non-credit instruction formats, one that complements existing evaluations of credit-bearing courses.
Information & Data Literacy as a Core Competency for CMU-Q First-Year Students
Reya Saliba, Carnegie Mellon University in Qatar
View Poster (PDF)
Keywords: Information literacy, data literacy, core competencies, first-year students, undergraduate students
Purpose & Goals
First-year students join the university with limited information retrieval, analysis, and synthesis skills. For many years, students relied solely on Google as their main search engine for finding information. This trend was quickly overtaken by the emergence of generative AI, as students embraced the latest tools to save time and effort in collecting and synthesizing information. Such technological developments pose a threat to students’ critical thinking abilities. This poster describes a pilot mini-course, delivered in the fall of 2023, in which Information and Data Literacy was introduced as a core competency. The poster also addresses the challenges and opportunities experienced by the instructors involved in designing, developing, and delivering the course, and shares suggestions to improve the course and enhance student engagement.
Design & Methodology
Using self-assessment, this poster analyzes qualitative data that emerged from a faculty focus group following the delivery of the mini-course.
Findings
The focus group discussion indicates that faculty were satisfied with the level of student participation. This might be linked to several factors: a. the limited number of sessions, as this mini-course runs for a limited time; b. the relatively short sessions, each 50 minutes long; and c. the use of a hybrid teaching model consisting of a brief lecture followed by group work to keep students engaged and attentive. However, faculty highlighted some challenges that need to be addressed in future iterations of the course. For instance, the content needs more practical examples, coupled with activities for practice. Furthermore, the handoffs between the different components of the course and different faculty members did not allow students to build strong connections with the instructors, which might have weakened communication between students and instructors. The transitions between faculty also suffered because we did not contextualize the different sections and explain how they are linked or, in some cases, complement each other. The limited communication might also have dampened students’ motivation: many put little effort into pre-class work and focused only on reaching the passing grade, which was set at 70%.
Action & Impact
There are three main areas that need to be considered to improve this course and ensure successful delivery and a satisfactory level of student engagement:

- Course syllabus, material, and content:
a. The online course page needs to be simplified to avoid endless layers and sublayers of content pages.
b. The course syllabus needs to be reviewed in class in the first session to ensure students clearly understand the expectations in terms of attendance, in-class activities, quizzes and assignment submission, as well as out-of-class work.
c. Emphasize the role of this course as a transition from high school to university by giving students the chance to reflect on their current experience and inviting senior students to share their experiences with new students.

- Course structure, delivery, and assessment:
a. One faculty member needs to be identified as the main instructor, while the other faculty can be brought in as guest instructors.
b. Students should be clearly advised how to contact the main instructor and guest instructors to avoid miscommunication or duplicate communication and to ensure requests are addressed in a timely manner.
c. Quizzes, assignments, and group work activities should be undertaken, completed, and submitted to the instructor during the delivery of their teaching section to avoid overlapping with the deliverables of the next instructor/section.

- Students’ perception and engagement:
a. Ensure students are engaged with the content by revising and condensing the material, showing strong connections between materials, providing detailed instructions, and giving students the chance to reflect on their learning journey.
b. Ensure students are engaged with the instructors by changing the class venue to a more collaborative space.
c. Ensure students are engaged with other students by encouraging teams to meet with instructors to ensure that team work is divided equally and everyone is contributing.
Practical Implications & Value
For first-year students, the university can be an intimidating and overwhelming environment. Navigating their academic journey requires students to develop and continuously improve their critical thinking abilities. The C@CM course is designed to ensure students have the foundational skills for their success. The variety of materials integrated into this course ensures that students develop transferable skills that will serve them well throughout their academic endeavors as well as in their future workplaces. A new and improved version of C@CM is currently being developed for Fall 2024, taking into consideration the CMU values of dedication, impact, collaboration, creativity, empathy and compassion, inclusion, integrity, and sustainability.
Undergraduate Student Perspectives on Asynchronous Library Modules
Rocio Lopez, University of Michigan
Doreen Bradley, University of Michigan Library
Additional Authors:
Rachel Hoster, University of Michigan Library
Karen Reiman-Sendi, University of Michigan Library
View Poster (PDF)
Keywords: online modules, student interviews, information literacy instruction
Purpose & Goals
Building on a previous assessment of the asynchronous use of the library’s digital learning objects by faculty in course-related websites, librarians conducted undergraduate student interviews to learn more about their individual use of online instructional modules. These modules cover topics such as Searching Library Databases, Evaluating Sources, Reading Scholarly Articles, Finding Books, and Academic Integrity. Librarians wanted to understand:
- how effective the modules are for student learning
- what content is helpful
- how effective videos are for student learning
- how students applied the information in this and subsequent courses
- if they have suggestions for other topics for inclusion in an online module.
This assessment effort provided complementary feedback on library-created Canvas modules from the student perspective.
Design & Methodology
Librarians obtained data detailing course websites that imported one or more of the library modules from Canvas Commons during the fall 2023 semester. In November 2023, an email was sent to 36 course instructors who taught 42 courses, asking permission to contact their students for participant recruitment. Over 1,700 undergraduate students received an email invitation to participate in a short interview with librarians, to be scheduled in January 2024 (winter semester). Fourteen interested students indicated their willingness to interview; eight students actually scheduled 30-minute interviews with librarians.
Findings
Although interview participation limits the extent to which findings can be generalized, the following themes were observed:
- Topics of the library’s instructional modules are of interest. Most participants were required or directed to use more than one module (e.g. Searching Library Databases, Finding Books, Reading a Scholarly Article, Academic Integrity).
- Much of the content of these modules served as a “refresher” for about 75% of participants. 25% of students indicated that the content was entirely new to them. Every student reported learning something new, however, even if most of the content was a refresher.
- Most participants found it easy to navigate through the modules. Most believed the content was easy to understand.
- Half of the participants did not view the embedded videos. They provided suggestions for better integration into the module.
- All participants clearly understood the relevance of the module contents to their course content or course assignments.
- Almost all participants reported that the modules helped them understand where to go to locate library sources.
- 75% of participants indicated that they were able to apply what they learned in the module(s) either for other course assignments or for personal use of the library. The remaining 25% reported that they were not able to apply what they learned yet because they had not received other research assignments at the time of these interviews.
Action & Impact
- In our previous interviews with faculty, instructors stressed that students want more videos in online learning modules. This was not the case for the students we interviewed. If videos are embedded, students encouraged us to state learning goals that contextualize why they should watch the video and to indicate how the video content differs from the written content in the module. We are planning updates this summer to incorporate this feedback.
- We will integrate more “signposts” to help students navigate through sections of modules, and in addition to describing particular concepts, we will include more language describing why a particular concept is important.
- Understanding the impact and learning that happens through these modules helps us to tell the story of the library’s impact on student learning and success, and the ability of students to complete their assignments well.
- We may work with our IT services to create a more granular way to track module usage so we can identify how many times individual modules are integrated into which courses.
Practical Implications & Value
- Students find the content of these modules helpful for their immediate coursework and beyond, demonstrating that online modules can result in transferable skills. Therefore, it is very worthwhile to create asynchronous modules for independent use by faculty who do not wish to have a librarian visit their class. It extends the impact and influence of the library.
- It is important to talk to both faculty and students about perceptions related to library instruction; the two groups can differ greatly on certain aspects, such as the role of videos in online asynchronous modules.
- Our modules are on Canvas Commons for use and editing by anyone.
- Interviews are indispensable in gathering detailed feedback from students on their own learning. We will gladly share our interview protocol.
Information Literacy Assessment: Partnering with a Living Learning Community to Assess Library Instruction
Sara McCaslin, Western Kentucky University
Additional Author:
Larry Sean Kinder, Western Kentucky University
View Poster (PDF)
Keywords: Library Instruction Assessment, Living Learning Communities, Outreach, Student Success
View “Information Literacy Assessment” abstract
Purpose & Goals
Assessment is critical in creating, measuring, and maintaining effective information literacy instruction for all students. Kuh and Gonyea (2015) recommended a necessary collaborative relationship between classroom faculty, student affairs, and library faculty to promote “educationally purposeful activities” (p. 373). Smith (2019) discussed assessment culture in academic libraries and used a single-institution case study to demonstrate the importance of highlighting individual efforts related to assessment of information literacy instruction in academic libraries. Establishing strategic partnerships between academic libraries and campus partners can lead to more student involvement both inside and outside of the classroom. Libraries are positioned to provide the collaborations needed to increase institutional involvement that leads to student success and retention. This case study sought to assess the impact of information literacy library instruction on first-year Living Learning Community (LLC) Criminology students at Western Kentucky University and asked the following question: Are the general information literacy class and the Special Collections information literacy class beneficial to Criminology LLC students? Information literacy and library instruction in higher education provide students with foundational research skills to locate, organize, evaluate, and synthesize resources and to communicate findings through writing and other means. Delivery of effective information literacy and library instruction can play “a greater role and responsibility in creating new knowledge, in understanding the contours and the changing dynamics of the world of information, and in using information, data, and scholarship ethically” (ACRL Framework, 2015).
Association of College & Research Libraries. (2015). Framework for information literacy for higher education. Accessed February 13, 2024. www.ala.org/acrl/sites/ala.org.acrl/files/content/issues/infolit/framework1.pdf
Kuh, G., & Gonyea, R. (2015). The role of the academic library in promoting student engagement and learning. College and Research Libraries, 76(3), 359-385.
Smith, M. (2019). Working toward a culture of instructional assessment. Reference Services Review, 47(4), 487–502.
Design & Methodology
As a primary research method, surveys were designed to collect data directly from the subjects. The pre- and post-surveys measured students’ knowledge before and after the information literacy classes, with questions designed to reveal students’ comfort and awareness levels related to services offered through University Libraries. Students enrolled in the Fall 2023 Criminology 101 course, Introduction to Criminal Justice, were the subjects of this case study. Upon institutional IRB approval, students received a message through their CRIM 101 Blackboard course site informing them of the study, which was conducted in collaboration with the course instructor, along with links to the surveys. The surveys were created in Qualtrics and consisted of eight multiple-choice questions for the Helm/Cravens Library pre/post survey and six multiple-choice questions for the Special Collections Library pre/post survey; all questions used a Likert scale ranging from strongly agree to strongly disagree or similar response options. Students had access to the consent form prior to completing the surveys and were asked to read it before proceeding. Each assessment took approximately five to eight minutes to complete in one sitting. Students had two days to complete the pre-surveys and two days to complete the post-surveys after the information literacy sessions; after two days, access to the survey links was closed.
Findings
Overall, the results of this case study showed that students’ comfort, awareness, and knowledge of library services increased from the pre- to post-surveys. For example, question one of the Special Collections survey asked: On a scale of 1-5 (1 is extremely unaware, 5 is extremely aware), how aware are you of the Special Collections Library at WKU? Of the 11 total responses on the pre-survey, four were extremely unaware, two were somewhat unaware, one was neither unaware nor aware, and four were somewhat aware; zero respondents were extremely aware of the Special Collections Library at WKU. In contrast, the post-survey showed that of the 11 total respondents, six were extremely aware, three were somewhat aware, one was neither unaware nor aware, and one was extremely unaware. Comparing the extremely aware and somewhat aware responses across the pre- and post-surveys shows an increase in awareness and knowledge of the Special Collections Library among the Criminology LLC students. Most pre and post responses showed an increase in knowledge gained during the general library and Special Collections information literacy sessions. Some conclusions can be drawn from these responses, including that first-year students are generally unaware of the services and resources available through the library and that interaction via library visits can increase knowledge and awareness. The Criminology LLC students visit the library only once for a general information literacy session and once for a Special Collections information literacy session during the semester. Additional library sessions might produce a deeper understanding of library services and resources, but further partnership building and collaboration would be needed to increase the number of library visits throughout the semester.
Action & Impact
The researchers who conducted this case study are two librarians in the WKU Department of Library Public Services: the Core Curriculum Instruction Librarian and the Humanities/Social Sciences Librarian. They are currently collaborating on an article for submission to an academic, peer-reviewed journal to share their findings with the academic library community, with a submission deadline of July 2024. Beyond publication, the librarians will share these findings with their colleagues to further solidify and encourage library involvement in LLCs each academic year. Campus partners with whom this case study will be shared include Housing and Residence Life as well as Student Enrollment and Success administrators. The institution has relied heavily on the success of LLCs as a recruiting tool for future students, and the data collected from this case study could improve and bolster recruitment efforts in the future. Within the Library, this assessment data will be used to show the impact WKU Libraries has on student success as the Library Dean prepares reaffirmation accreditation documentation for the Southern Association of Colleges and Schools Commission on Colleges (SACSCOC). Such data can assist in the reaffirmation process and demonstrate a dedication to institutional effectiveness through library activities.
Practical Implications & Value
LLCs at WKU have proved to positively impact retention rates. According to the WKU Office of Institutional Research, 94.8% of first-time, first-year LLC students returned for the spring 2023 semester, compared to 90.4% of first-time, first-year students who did not participate in an LLC (WKU News, 2024). Administrators place value on LLCs, and data related to library involvement with LLCs show a commitment to students’ success and persistence. The Department of Library Public Services (DLPS) at Western Kentucky University currently performs several outreach activities for the LLCs on campus each fall semester, and librarian involvement with the Criminology LLC has grown over the past few years to include a common reading and an author visit to campus. This assessment case study will shape future library outreach to LLCs and provide positive impact data to Housing and Residence Life personnel and campus administrators, showing that information literacy can play a role in student success and retention. The researchers hope this case study will contribute to the overall scholarly conversation related to library assessment; specifically, they foresee a positive impact on library involvement in LLCs and outreach to more student success initiatives. One of the researchers, along with other WKU librarians, has already published in this area with the hope of building the literature related to library outreach to LLCs in higher education.
See below:
McCaslin, S., Howell, K., & DeLancey, L. (2022). Library outreach to Living Learning Communities: A case study. College & Research Libraries News, 83(10), 449. https://doi.org/10.5860/crln.83.10.449
Howell, K., McCaslin, S., & DeLancey, L. (2023). Libraries and Living Learning Communities: Exploring strategies for outreach and programming. Journal of Academic Librarianship, 49(3), 102657. https://doi.org/10.1016/j.acalib.2022.102657
WKU News. (2024, February 16). WKU announces significant fall to spring retention gains. Media Relations. https://www.wku.edu/news/articles/index.php?view=article&articleid=11768
Reimagining the Partnership Between Academic Libraries and Tribal Libraries
Darlene Lytle
Keywords: IDEAS, sovereignty, social justice, indigenous, reciprocal relationships, collaboration
View “Reimagining the Partnership” abstract
Purpose & Goals
While attempts have been made to include Indigenous resources in library collections and programming, little has been discussed in the academic library sphere about making Indigenous students feel welcome. Our assessment focuses on how academic libraries can use assessment to combat embedded colonial systems, starting with evaluating the inclusivity of the University of Washington Libraries. We propose adopting the IDEAS framework from the University of Washington Information School to ensure Indigenous students are included and thrive in the academic dialogue. Public institutions like the University of Washington have a responsibility to provide inclusive spaces and resources, as mandated by the Revised Code of Washington, RCW 27.12.285. Establishing reciprocal relationships between academic and tribal libraries is key, as supported by numerous case studies. Our preliminary assessment metric aims to center sovereignty in diversity, equity, and inclusion efforts, focusing on the Indigenous user and their community.
Design & Methodology
Scope: Dr. Sandy Littletree and Cindy Aden’s project at the University of Washington Information School helped establish the project’s geographical focus by identifying Washington Tribal Libraries and professionals willing to collaborate (Duarte & Belarde-Lewis, 2015). We drew inspiration from “Leading by Diversifying Collections: A Guide for Academic Library Leadership” by Bledsoe et al. for our assessment approach, while “The Library and Beyond: Decolonization as a Student/Academic Co-Created Project” guides our aim to empower Indigenous students (Hopkins et al., 2023). Models from the University of Montana’s Tribal College Librarians Institute (TCLI) and Princeton University Library’s Indigenous Collections Working Group (ICWG) inform partnership and ethical stewardship of Indigenous collections within academic library settings (Bledsoe et al., 2023).
Justification: There is a notable absence of discourse on increasing Indigenous students’ sense of belonging in academic library collections (Bucy, 2022). Colonial legacies persist within the library profession, impacting the organization and dissemination of Indigenous knowledge (Agrawal, 2002; Ghaddar & Caidi, 2014). Academic libraries must actively partner with tribal libraries and prioritize outreach efforts to foster reciprocal relationships with Indigenous communities (Duarte & Belarde-Lewis, 2015). Diversifying library content and language is important to the goal of inclusivity (Webster & Doyle, 2008). The University of Washington’s dedication to supporting American Indian and Alaska Native students’ success is evident in various initiatives (UW Libraries, 2020).
Ethical Frameworks and Theoretical Positions: Indigenous research methodologies and theoretical understandings were used to conduct ethical research and develop frameworks (Littletree, Belarde-Lewis & Duarte, 2020; Cajete, 2000; Duarte & Belarde-Lewis, 2015; Haworth & Knight, 2015; Lee, 2019; Wilson, 2008; Loyer, 2018; Doyle, 2013). Guidelines were adapted from the 2020 Guidelines for Ethical Research in Australian Indigenous Studies by AIATSIS, with emphasis on principles of rights, respect, and recognition of Indigenous knowledge (AIATSIS, 2020). This framework fosters mutual understanding and partnership, prioritizing community benefits and responsible data management (AIATSIS, 2020).
Conclusions
Our preliminary research makes clear that Indigenous students do not feel comfortable accessing the library. Effective librarianship centers the user and serves the entire community. While larger institutions in various countries have attempted to right the injustices Indigenous communities have faced, academic libraries have focused solely on how they promote and organize the Indigenous resources they hold. An assessment method that measures academic libraries’ success at building reciprocal relationships with their community’s Indigenous members is key, and we propose that partnering with tribal libraries is a strong opportunity for such collaboration. Our project will research how best to do this and provide a suggested plan of action. We anticipate our method will need to change once we begin working with tribal libraries and better understand their needs in a way only they can know; the best assessment is informed by stakeholder needs. Most of all, we must ask: what do Indigenous stakeholders need from academic libraries to feel comfortable participating in, and therefore enriching, academic knowledge as a whole?
Implications & Value
This work will act as a guide for academic libraries in approaching their relationships and outreach efforts with the Indigenous members of their community beyond the collection. As with any assessment, the primary goal will be to establish where a library is currently successful and where there are growth opportunities. Most importantly, we hope this work will increase the number of Indigenous students who feel welcome in academic libraries.
Collections
Usage Data for the Users
Darlene Lytle
Keywords: usage data, DEI, workflow, collection analysis, critical librarianship
View “Usage Data for the Users” abstract
Purpose & Goals
My purpose is to share what I do as the student assistant to the Assessment Librarian at the University of Washington to maintain and ensure the collection of usage data. My goal is to start a conversation around how we can use this usage data to serve the library’s antiracist and DEI goals. The best way to improve is to learn from each other, and the Assessment Conference seems the best place to do so.
Design & Methodology
The method I employ in my job includes three main tasks. The first is to keep SUSHI credentials up to date in Alma, where, with help from my direct supervisor, we passively collect usage data for librarians to find when needed. Last year I audited the success of each vendor’s usage collection in Alma and went hunting for solutions and missing data where possible. The second task is working with vendors who provide some COUNTER-compliant data but do not offer SUSHI credentials. The third is to collect data that is not COUNTER-compliant, which means it can be a unique data set in various formats. The last two tasks are not passive and involve navigating web pages and dashboards based on internal notes from past years. Unless a librarian makes a special request, the assessment librarian assistant gathers this data once a year for the previous year. Because this data is used to negotiate contracts and make selection decisions, it is gathered each year before most contracts are renegotiated. There is some method and pattern to this work depending on the dashboard (or lack of one) vendors use to share their data. Many vendors with COUNTER-compliant usage have been gaining SUSHI credentials, at which point I add them to Alma. Those vendors that still require manual collection of their data may do so based on a business decision or the nature of the materials they provide. Overall, using this data has proven valuable to selection decisions, which is why we do this work.
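For readers unfamiliar with the harvesting side of this workflow, the sketch below shows what a COUNTER R5 SUSHI request can look like in Python. This is a generic illustration rather than the tooling described above; the base URL is a placeholder, and the credentials stand in for the customer/requestor IDs and optional API key that each vendor issues.

```python
"""Minimal sketch of harvesting a COUNTER R5 Title Report via the SUSHI API.
The /reports/tr path and query parameters follow the COUNTER R5 SUSHI
specification; the base URL and credential values are placeholders."""
import requests

BASE_URL = "https://sushi.example-vendor.com"  # hypothetical vendor endpoint
params = {
    "customer_id": "YOUR_CUSTOMER_ID",    # issued by the vendor
    "requestor_id": "YOUR_REQUESTOR_ID",
    "api_key": "YOUR_API_KEY",            # not all vendors require one
    "begin_date": "2023-01-01",
    "end_date": "2023-12-31",
}

resp = requests.get(f"{BASE_URL}/reports/tr", params=params, timeout=60)
resp.raise_for_status()
report = resp.json()

# Each Report_Item carries a title plus its monthly usage instances.
for item in report.get("Report_Items", []):
    total = sum(
        inst["Count"]
        for perf in item.get("Performance", [])
        for inst in perf.get("Instance", [])
        if inst.get("Metric_Type") == "Total_Item_Requests"
    )
    print(f'{item.get("Title", "unknown")}: {total} total item requests')
```

Alma automates this kind of call once SUSHI credentials are stored, which is why keeping those credentials current is the first task described above.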
Practical Implications & Value
I hope attendees will consider what untapped potential there may be in usage data that could impact DEI efforts in libraries. I also want to share how we have streamlined our data collection, on the chance that this helps others do the same. Finally, I hope to learn from attendees, through their questions and discussion, ways we could improve our own system.
Evaluating the impact of e-resource usage on retention and performance metrics for online-only undergraduates
Anita Hall, University of Louisville Libraries
View Poster (PDF)
Keywords: Student Success, Library Impact, Analytics
View “Evaluating the impact of e-resource usage” abstract
Purpose & Goals
Like many US higher education institutions, the University of Louisville seeks to use a data-driven approach to improving student retention and performance. In support of these initiatives, campus units, including the libraries, are under increasing pressure to demonstrate the impact of our services on these outcomes. For this project, we are comparing outcomes between students who did and did not access library e-resources, in hopes of demonstrating a relationship between e-resource usage and key retention metrics that support our university’s strategic goals.
Design & Methodology
This pilot study combines data from the University of Louisville’s EZProxy logs with student data in our Campus Information System to compare retention and academic performance for students in two of our largest online-only undergraduate programs, Psychology and Nursing. Data from the two systems was matched on unique identifiers, deidentified, and then processed with Microsoft Power BI. As this is a short-term pilot study, we are not expecting to find evidence of statistical significance, only to identify and compare relationships between e-resource usage and outcomes in order to shape future, larger-scale research.
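As an illustration of the matching and deidentification step, here is a minimal Python sketch. This is not the authors’ actual pipeline (which used Power BI), and the file names and field names are assumptions made for the example.

```python
"""Illustrative sketch: match EZProxy log users to student records, replacing
the join key with a salted one-way hash so raw IDs never enter the analysis.
All file and column names below are hypothetical."""
import csv
import hashlib

SALT = "replace-with-a-secret-salt"  # kept separate from the analysis dataset

def pseudonymize(user_id: str) -> str:
    """One-way hash so records can be joined without storing raw IDs."""
    return hashlib.sha256((SALT + user_id).encode("utf-8")).hexdigest()

# Build the set of pseudonymous IDs seen in the EZProxy logs.
ezproxy_users = set()
with open("ezproxy_users.csv", newline="") as f:  # usernames pre-extracted from logs
    for row in csv.DictReader(f):
        ezproxy_users.add(pseudonymize(row["username"]))

# Flag each deidentified student record with e-resource usage.
with open("students.csv", newline="") as fin, \
     open("students_flagged.csv", "w", newline="") as fout:
    writer = csv.DictWriter(
        fout, fieldnames=["pseudo_id", "retained", "gpa", "used_eresources"]
    )
    writer.writeheader()
    for row in csv.DictReader(fin):
        pid = pseudonymize(row["student_id"])
        writer.writerow({
            "pseudo_id": pid,
            "retained": row["retained"],
            "gpa": row["gpa"],
            "used_eresources": pid in ezproxy_users,
        })
```

The resulting flagged file can then be loaded into a tool like Power BI to compare retention and performance across the two groups.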
Action & Impact
We hope to use these findings to advocate for additional library support for analytics work to enhance our understanding of the library’s impact on student success. We also hope that these findings will allow our library administration to advocate for increased resources from the university to support key initiatives. The online nursing degree in particular is a major priority for the university, as our state is suffering from a shortage of nurses, and significant funding is currently being dedicated to enrolling and retaining students in this program. We intend to expand this pilot study into a larger examination of library impact on student success in other key programs at the University of Louisville.
Practical Implications & Value
We envision this work contributing to the growing body of research on library impacts on student success, as well as library analytics research more generally. Others who are interested in embarking on this type of work may find a pilot project of this scale to be approachable and manageable within their organizations.
Assessing a Pilot Program: Student-Selected Popular Reading Books
Danielle Dempsey, Villanova University
View Poster (PDF)
Keywords: Popular reading, leisure reading, recreational reading, academic libraries
View “Assessing a Pilot Program” abstract
Purpose & Goals
Falvey Library at Villanova University launched our Popular Reading Pilot Program in February 2024. As part of our ongoing strategic mission to enrich and encourage the continuous development of our students, we began efforts to identify vendor options and form a volunteer student selection committee in Fall 2023. By engaging student volunteers, we hope to ensure that the library leases titles that will be of interest to their peers. It was important to library staff to engage our student community in the selection of book titles, format, and circulation policies. Because the mission of this project is to serve student wellness and not add any stress, we wanted to ensure that volunteer commitment levels were manageable and circulation times were reasonable. We communicate asynchronously, primarily using a Microsoft Teams page for student selectors, with moderation from library staff. The result was a leased collection of approximately 400 print books from Baker & Taylor, with ongoing student involvement as we maintain the collection and rotate in roughly 15 new titles per month. We plan to assess the pilot over one calendar year before officially committing to Popular Reading as a permanent part of our library. This poster will address the assessment methods used to initially develop the pilot, the methods for refining it over the next year, and the datapoints we will assess when determining whether to adopt the program permanently in February 2025.
Design & Methodology
Datapoints include:
- Circulations, to assess popularity of titles selected
- Number of overdue books, to assess appropriateness of loan term
- Student feedback regarding book format preferences (print, e-book, audiobook, or some combination of the three)
- Student feedback regarding Student Selection Committee engagement preferences (synchronous vs. asynchronous, software used, reasonable time commitment)
- Source of book recommendations (Student Selectors, online suggestion form, physical suggestion box in library)
- Online user engagement with Popular Reading on the library website
- Requests for Fiction vs. Non-Fiction, and requests by genre
We also conducted an environmental scan of academic library literature to see if other universities are engaging students as selectors.
Findings
At the time of the conference, our pilot year will not have concluded (2/5/2024 – 2/4/2025), but we will have some preliminary findings and trends to share. Circulation has been very high in the first month of the pilot, and we are interested to see if this level of engagement is sustainable, or if it tapers based on the length of the pilot or time of the academic year.
Action & Impact
The pilot has already drawn our attention to other issues. For example, we have received many suggestions for books that are already available in Main Stacks or as e-books. As a result, we have updated our Popular Reading landing page and signage and added a physical FAQ list to the Popular Reading shelves explaining where to find other reading material within the library. Our Communications Team is working on a video tutorial for finding popular reading, especially fiction, within Main Stacks. Finally, we now maintain a public list, available in VuFind, of student suggestions that are available in Main Stacks. We look forward to seeing whether this increases the circulation of these books in our Main Stacks collection too.
Practical Implications & Value
We hope to provide guidance to other universities looking to engage volunteer student selectors. This pilot has already included many surprises and taught library staff a lot about our student body – for example, there was a strong preference for leisure reading in print over digital formats. In light of the pressures on university students, especially in these years immediately following COVID-19 lockdowns, we hope that our experience can inform other universities in supporting their student body’s well-being and non-academic interests.
Bringing Data to the Big Deal: Modeling a Path to More Sustainable Commercial Open Access
Esra Çeltek Coşkun, University of Illinois at Urbana-Champaign
View Poster (PDF)
Keywords: Elsevier, ScienceDirect, Agreement, Big-Deal, Cost per Use, COUNTER, Usage Report, Open Access (OA), Authors, Disciplines, Article Processing Cost (APC)
View “Bringing Data to the Big Deal” abstract
Purpose & Goals
In preparation for the 2025 renewal of our Elsevier ScienceDirect agreement, our institution gathered significant data. After careful analysis of usage statistics and our researchers’ publishing output in Elsevier journals, we determined that it was time for a makeover of our library’s multi-year, traditional subscription-based agreement.
Design & Methodology
We generated COUNTER TR_J4 (“Journal Requests by Year of Publication”) reports for all our campuses covering the total duration of our existing multi-year agreement. We requested a report from Elsevier showing various data fields for our researchers’ publications in Elsevier journals over the same period. We downloaded datasets from the Web of Science and Scopus websites to determine the subject categories of the journals. We analyzed all of this data using Tableau to identify patterns in our usage as well as in our researchers’ publications and to create a model for our next agreement.
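For readers who want to reproduce the cost-per-use side of such an analysis, here is a rough pandas sketch of deriving CPU from a TR_J4 export joined to a per-title cost sheet. The analysis described above was done in Tableau; the file names and the cost file’s columns here are invented for illustration.

```python
"""Minimal sketch, assuming a TR_J4 report exported to CSV (COUNTER tabular
reports carry 12 header rows plus a blank row before the column names) and a
separate, hypothetical spreadsheet of per-title subscription costs."""
import pandas as pd

# Skip the 13 rows of COUNTER report header; the column names follow.
usage = pd.read_csv("tr_j4_2023.csv", skiprows=13)

# Sum Total_Item_Requests per journal across all month/YOP rows.
requests = (
    usage[usage["Metric_Type"] == "Total_Item_Requests"]
    .groupby("Title")["Reporting_Period_Total"]
    .sum()
    .rename("total_requests")
)

costs = pd.read_csv("journal_costs.csv")  # assumed columns: Title, Annual_Cost
cpu = costs.set_index("Title").join(requests)
cpu["cost_per_use"] = cpu["Annual_Cost"] / cpu["total_requests"]

# Surface the highest cost-per-use titles for negotiation.
print(cpu.sort_values("cost_per_use", ascending=False).head(10))
```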
Findings
Our analysis identified four areas of potential improvement for the coming license period: (1) cost per use was too high (the way we calculate cost per download differs significantly from the standard method); (2) lack of funding disadvantaged some disciplines in transitioning toward open access publishing; (3) the total number of OA articles in hybrid journals was very low; and (4) OA articles were published mostly in journals with lower impact factors and lower citation counts.
Action & Impact
At the end of our negotiations, we reached an OA agreement that is going to improve all of these areas. The best part is that we accomplished this with less than a 0.5% increase over our current annual increase and a projected average cost savings of 89% per OA article in a hybrid journal. By the end of our contract term, we project more than 3,330 OA articles, compared to 246 OA articles without an OA agreement.
Practical Implications & Value
Although our new agreement has been in effect for only three months, we already see a positive impact in these areas, and our researchers seem to enjoy the makeover.
High Cost eBooks to Support Course Reserves
Kerri Goergen-Doll, Oregon State University
Taylor Ralph, Oregon State University
View Poster (PDF)
Keywords: eBooks, textbook affordability, collections, budgets, ROI
View “High Cost eBooks” abstract
Purpose & Goals
At Oregon State University Libraries & Press (OSULP), we were curious whether expensive ($500-plus) eBooks purchased for course reserves would see a return on investment similar to that of eBooks purchased below that price point. Current policies require purchase approval for course material over $500. We wanted to explore the usage of these high-cost materials to confirm whether our policy needed to be adjusted.
Design & Methodology
Beginning in 2022, we tracked each course reserve request for an eBook purchase over $500, and we have tracked usage of those eBooks during the term the content was intended to support a course. We record basic bibliographic information, date of purchase, cost, class size, course, discipline, number of sessions taught, and whether the course is in-person, hybrid, or through our e-campus. Each month, we collect TR_B1 COUNTER data as it becomes available to determine full-text views and downloads, and usage as it relates to individual chapters, tables of contents, or indexes.
Findings
We can see trends related to when readings are assigned in the class, the amount of the eBook used (one or more chapters), and the cost per use (CPU) of high-priced eBooks versus other course reserve texts purchased in the past eighteen months. This will be compared to the CPU of eBooks in the general collection. In addition, an analysis of the type of license (3-user, DRM-free) and platform will be shared.
Action & Impact
These findings will inform how our purchasing policy for course reserves is updated. We will propose that special approval for eBooks over $500 no longer be required. This will expedite the process for staff processing course reserves and make content available to students faster.
Practical Implications & Value
Institutions have approached textbook affordability in multiple ways. At OSULP, we have committed a portion of our collections budget to support the purchase of print and electronic content for course support. Our pilot to determine the ROI of high-cost eBooks for course reserves could help other institutions consider changes to financial allocations, policy updates, and procedural and decision-making changes. The assessment data we used for this project can easily be collected and analyzed. It also can scaffold into additional assessment work related to eBook use in courses that will inform future work on textbook affordability.
Where Access Meets Advocacy: Evaluating a Small University Library’s Textbook Program
Megan Elizabeth Gregory, University of Washington Tacoma Library
Megan Watson, University of Washington Tacoma Library
View Poster (PDF)
Keywords: Textbooks, course reserves, affordability, OER
View “Where Access Meets Advocacy” abstract
Purpose & Goals
For the last five years, UW Tacoma Library reserves staff have proactively purchased all campus course textbooks that cost $75 or more, as posted on our bookstore’s website each quarter. We conducted this study in order to assess our textbook acquisition activity over this period in relation to usage data. Given collections budget constraints, questions around circulation of materials, and national concerns about textbook affordability, we wanted to find out how best to deploy our decreased funds to support the needs of our unique student population and discover their use patterns, while balancing prospective budget considerations, developing sustainable collections policies, and aligning the program with our library’s mission around student success.
Design & Methodology
Using Alma Analytics, we created a report of print and ebook monograph purchases on UW Tacoma Library’s course reserves fund code between 2018 and 2023. Data was downloaded as a .csv file and imported into Google Sheets for cleaning, analysis, and visualization, with particular attention to trends in purchases by program, quarter, year, and format. In addition, we examined quantitative and qualitative data from a Spring 2023 student survey, which included several questions about textbook and course reserve use, and we reviewed recent literature relevant to this topic. These materials formed the basis for conversations with stakeholders from collections, access services, OER, and leadership to determine how this work might inform our textbook program moving into the 2024-2025 academic year.
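The cleaning and aggregation described above was done in Google Sheets; an equivalent pass in pandas, with invented column names standing in for the actual Alma Analytics export schema, might look like this:

```python
"""Illustrative sketch of the aggregation described above, using pandas
instead of Google Sheets. Column names are assumptions, not the authors'
actual Alma Analytics schema."""
import pandas as pd

df = pd.read_csv("reserves_purchases_2018_2023.csv")

# Derive the year each purchase was made from its purchase date.
df["Year"] = pd.to_datetime(df["Purchase Date"]).dt.year

# Roll purchases up by program, year, and format (e.g., Print vs. Ebook).
summary = (
    df.groupby(["Program", "Year", "Format"])
      .agg(titles=("Title", "count"), spend=("Price", "sum"))
      .reset_index()
)
summary["avg_price"] = (summary["spend"] / summary["titles"]).round(2)

print(summary.head())
```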
Findings
Our preliminary results indicate that library textbook use declined significantly in the time during and following the pandemic, with spending fluctuating over the five years studied. While certain programs, particularly Psychology, saw high spending and high use, total expenditures did not necessarily correlate to use across all disciplines. Print books were purchased in greater numbers for all programs with the exception of Literature, and ebooks were found to be significantly more expensive on an individual title basis. Students indicated that they valued access to freely available textbooks; however, relatively few reported using course reserves. It is unclear which student communities are best served by this program and whether increasingly popular inclusive access models, alternative means of access such as rental programs or online pirating, or reduced awareness following the return of on-campus services were responsible for the reduced usage of textbooks on course reserve.
Action & Impact
In our current context, we are seeking to balance our student-centered mission and values with the reality of continued budget and space reductions. Our textbook purchasing program was established to mitigate the economic burden of expensive course materials and improve student retention. With those goals in mind, however, we find ourselves at a crossroads: do we continue to purchase higher-priced textbooks despite lower use and greater staff workload demands? Do we pivot to prioritizing highly circulating materials in order to potentially serve the greatest number of students? Is our current program merely a band-aid for a larger, systemic issue, and what is our library’s role in contributing to improved student outcomes in the long term? How can the evidence we have gathered be used, now and in the future, to advocate for increased textbook affordability? These are the questions we are wrestling with as we continue to consider our findings and determine the future of the program.
Practical Implications & Value
Our project is a case study examining how one small, urban-serving campus library is approaching the evolving needs of our students in relation to the changing textbook affordability landscape. Acquiring textbooks for course reserves to provide students a low-cost alternative to buying individual copies is a well-established practice across college and university libraries, but the pandemic, the rise of inclusive access and similar auto-purchasing and LMS-integration models, and a continued lack of funding require a rethinking of such programs. We present our library’s approach as one model, particularly applicable to similar institutions, that intentionally puts data into conversation with our strategic priorities around student retention and our shared values of equity and inclusion.
Assessing Journals for Predatory Practices: Lessons from Revamping the Vetted List of Vision Science Journals Instrument
Michelle Schonken, University of Colorado Anschutz Medical Campus
View Poster (PDF)
Keywords: Journals, Instrument, Vision Science, Predatory Publishing Practices
View “Assessing Journals for Predatory Practices” abstract
Purpose & Goals
This poster will describe how librarians from AVSL and the MLA Vision Science Caucus collaborated to develop a journal assessment instrument and algorithm for quality and predatory practices, resulting in the Vetted List of Vision Science Journals on avsl.org. After several years of using the review instrument, membership turnover and changing publishing practices presented challenges to using the criteria and scoring process. These challenges serve as cautionary examples, transitioning into how the group revamped the criteria and instrument to overcome these issues and enhance usability and consistency.
Design & Methodology
In 2022, the group streamlined the reviewing process, identifying what to keep and what to change. A primary reviewer still scores a journal, while two secondary reviewers evaluate it to confirm or dismiss the algorithm’s verdict, which is based on a cut-off score. The cut-off score expedites the process, but librarians make the final judgment. This poster describes the changes made to address criteria difficulties and scoring confusion. The revised scale was tested on a sample of journals to see how they scored in comparison to how they fared on the old one.
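To illustrate the shape of a cut-off-based review, here is a small sketch in Python. The actual AVSL criteria, weights, and threshold are not reproduced here; everything in this example is invented.

```python
"""Hypothetical sketch of cut-off-based scoring; the criteria, point values,
and threshold below are invented for illustration, not the AVSL instrument."""
from dataclasses import dataclass

@dataclass
class JournalReview:
    title: str
    scores: dict  # criterion name -> points awarded by the primary reviewer

CUTOFF = 70  # journals at or above the cut-off are provisionally accepted

def algorithm_verdict(review: JournalReview) -> str:
    total = sum(review.scores.values())
    verdict = "accept" if total >= CUTOFF else "reject"
    # Two secondary reviewers then confirm or dismiss this verdict;
    # librarians, not the score alone, make the final judgment.
    return f"{review.title}: {total} points -> provisional {verdict}"

example = JournalReview(
    title="Journal of Hypothetical Vision Science",
    scores={"peer_review": 20, "indexing": 15, "editorial_board": 18,
            "fees_transparency": 10, "website_quality": 12},
)
print(algorithm_verdict(example))
```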
Action & Impact
After obtaining satisfactory results and implementing the new instrument, the group continued working together to efficiently review journals for addition to or rejection from the AVSL Vetted Journals List, while maintaining its high standards.
Practical Implications & Value
Attendees will learn not only how to assess potentially predatory journals for their own communities, but also the important work that goes into updating and improving any criteria or scale.
Designing subject-responsive decision support tools for print collection review
Sidonie Devarenne, Western Washington University
Mary Wegmann, Western Washington University
View Poster (PDF)
Keywords: Weeding, humanities, STEM, print books, decision support tools
View “Designing subject-responsive decision support tools” abstract
Purpose & Goals
Weeding an academic library is a complex and labor-intensive process and poses challenges for designing a uniform approach that accommodates subject-specific review criteria and workflows. In 2022, subject librarians and Collection Services personnel at Western Washington University developed an ambitious and cyclical plan for reviewing the general collection of roughly 500,000 items every four years, something that had not systematically been done for decades. The Collection Management and Assessment (CMA) department was tasked with designing and facilitating the data-informed collection evaluation process. This poster explores how the CMA department collaborated with subject librarians and other Collections personnel to identify relevant data points to inform withdrawal decisions, methods for collecting and compiling data from multiple sources, and the evolution of decision support tools to respond to disciplinary needs and librarian preferences. Discussions about the relevance of different data points across disciplines and strategies for collection review between subject librarians and Collections personnel also led to increased understanding and alignment of Collection Services and subject librarian goals and priorities for the project.
Practical Implications & Value
This project adds to the research on subject-specific monograph assessment criteria and workflows and introduces strategies for collaborating on data-informed weeding between Collections personnel and subject librarians. The presenters will share their methodologies for collecting and integrating data from disparate systems into decision support tools to assist in making withdrawal recommendations, and the process of iteration based on reviewer feedback and disciplinary needs. They will discuss strategies they employed to collaborate across library departments, and opportunities they identified for reducing subject librarian workload related to the project. Viewers will come away with an understanding of the benefits and drawbacks of how two different decision support tools are used in the Humanities and STEM.
Are These Good Investments? Evaluation Rubric for Open Access Investments
Laura Spears, University of Florida
Additional Authors:
Erin Gallagher, UF George A. Smathers Library
Tara Tobin Cataldo, UF Academic Research Consulting & Services
Suzanne Stapleton, UF Marston Science Library
Perry Collins, National Endowment for the Humanities
View Poster (PDF)
Keywords: Open access, read and publish, rubric, assessment
View “Are These Good Investments?” abstract
Purpose & Goals
Library professionals have long made use of evaluative tools for decision-making around collection development. Traditional tools are adequate for many collection development scenarios, but the increasing variety of acquisitions models, combined with entrenched budgetary challenges, presents the need for new methods and inventive thinking. An R1 academic library’s Open Access Task Force developed a rubric to understand specific resource needs and opportunities in a rigorous and transparent manner. Rubrics can be not only instruments of collection analysis but also informative objects that help guide our values and philosophies in our collections practices. This presentation illustrates the rubric’s details and examples of initiatives evaluated.
Design & Methodology
The evaluation tool includes unique criteria for assessing a publishing initiative’s value, the risks and benefits to institution authors, and factors like acquisition budgets, licensing, author rights, and sustainability. The rubric identifies six categories (cost; sustainability & risk; significance to the institution’s authors; ethical practices; equity & mission; and logistical feasibility, including user privacy) and is used to score each initiative received from publishers of all types. The presentation will include a sample of a completed analysis and provide data from library expenditures that demonstrate the cost/benefit analysis of open access publishing and of read and publish agreements specifically.
Findings
Over three years, nine agreements were evaluated and seven were accepted. Among those accepted, there was a 16.8% increase in open access publications with these seven publishers; additionally, the article processing charges (APCs) defrayed were valued at $479,067 for 229 articles. The renewal cost of these agreements over four years increased by almost $50k, or 12.9%. While these results appear favorable, the larger question of the affordability of the open access future remains.
Action & Impact
This rubric helps the OA Task Force understand the many ways a publishing agreement can support our university community. It takes into consideration the mission of the publishers and the value and diversity of their content, and it also considers the author-rights implications of each initiative. In this way, our rubric creates transparency that facilitates decision-making and offers that rationale to all interested stakeholders. As publishers continue to monetize aspects of open access publishing, this effort can help libraries understand which agreements work for their communities, and it provides evidence to support the increasing need for budget expansion and for advocacy at the levels of government where the academic library’s challenge with open access mandates could be better articulated.
Practical Implications & Value
According to CHORUS, the independent, nonprofit membership organization that tracks open and public access publications, the university’s OA publications have increased eleven-fold, from 735 in 2017 to 8,211 in 2022. Academic libraries continue to be funded without year-to-year increases while for-profit publishing costs continue to rise. The aim of the library’s Open Access Task Force evaluation rubric is to help our acquisitions unit identify beneficial scholarly publishing agreements and balance the needs of the campus community with the many resources offered by the open research ecosystem.
To date, the rubric has been presented to the statewide consortial assessment committee and at two other conferences. However, we are interested in expanding the study to look at other factors in open access publishing, such as author rights and development fees, so feedback on how to include these is solicited at presentations. LAC 2024 is the only assessment community to view this work. The rubric is publicly available on a free, open research platform.
Data, Data Visualization & Data Privacy
Revealing Practices with Patron Privacy: Transparency of (Name of State) Academic Library Websites
Laura Spears, University of Florida
Stephanie A. Jacobs, University of South Florida
Kirsten M. Kinsley, Florida State University
View Poster (PDF)
Keywords: Privacy, User Policies, Confidentiality, Analytics, Library Ethics
View “Revealing Practices with Patron Privacy” abstract
Purpose & Goals
Academic libraries increasingly face challenges from the growing collection of user analytics. Digital technology has led to a proliferation of automated data gathering based on user behavior, especially as use of electronic resources comes to dominate the circulation realm. In addition, higher education institutions have increased their use of user monitoring, and librarians face mounting pressure to demonstrate impact on student success using library transactional data married with student academic metrics. With so much user information ubiquitously collected and used, in what ways must the role of the library evolve to ensure its historical adherence to user confidentiality? To determine how academic libraries use their websites to present information describing the collection, use, and disposal of users’ personal information, we looked at privacy policies posted on library websites. This poster presentation describes the findings of 2023 research in which 70 statewide academic libraries’ websites were reviewed for the presence and content of user privacy policies. The study team examined web-published privacy policies to determine how, and with what content, libraries make users aware of the personal information collected from them as they use library resources and services. Building on work by Valentine and Barron (2022), who reviewed privacy policies of U.S. Association of Research Libraries (ARL) member libraries, the study team opted to review a southeastern state’s public and private universities and colleges to explore how their academic libraries present user privacy information.
Design & Methodology
This review was designed to gather information on privacy policies at public and private not-for-profit two-year and four-year institutions in the state of Florida. The study team employed a content analysis of 70 academic library websites, creating an inventory based on a literature review of privacy concerns and focusing on the presence of a library-specific privacy policy. Metrics collected included institutional profile data, the presence of a library privacy policy, the accessibility of the policy page, and the content of the policy, such as authorizing laws/ethics, user data collected, collection practices, and policy limitations, among other items. The study team conducted reliability testing (.844) and completed the data collection between November 2022 and February 2023. The analysis was limited to the 15 websites that maintained a dedicated privacy policy page.
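The poster does not name the statistic behind the .844 reliability figure; for two coders rating categorical website elements, one common choice is Cohen’s kappa, sketched here with invented codings purely to show the calculation.

```python
"""Hypothetical example of intercoder reliability via Cohen's kappa; the
codings below are invented (1 = element present on a site, 0 = absent)."""
from sklearn.metrics import cohen_kappa_score

coder_a = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
coder_b = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]

print(f"Cohen's kappa: {cohen_kappa_score(coder_a, coder_b):.3f}")
```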
Findings
Key findings included that only 15 of the 70 institutions maintain a dedicated privacy policy page as part of the library website. Of the 55 academic libraries without a dedicated library privacy policy page, 36 did not link to an institutional page with student privacy language. Of the 15 institutions with dedicated library privacy policy pages, only six libraries actually named the language a policy; other descriptors included statement, principles, patron rights, and protocol. Key elements included in the statements were materials, website, and computer use policies; however, fewer than half of the policies described student personal data collected during reference transactions, space use, or space surveillance. None of the policies mentioned data collected during instruction sessions, although libraries commonly collect this for institutional reporting and service improvement. Seven of the policies mentioned third-party vendor activity, but only one library provided a direct link to a third-party vendor website. Libraries mentioned collection of personally identifiable information (PII), but most did not describe their data management practices.
Action & Impact
While the sample of separate library policy pages is small, the information gathered was interesting and provides a good framing for future research. It is concerning that these policies rarely mention learning analytics; the subject is left largely untouched, which is ironic as learning analytics becomes a more integral practice of institutional assessment. While students may have had learning analytics collected by their K-12 schools, it is important to inform and empower students as they enter adulthood and attain full agency. The results of this study are being used by the statewide library consortium’s assessment steering committee to further examine the libraries without privacy policies, as a means of developing a best practices document to provide to the members of the college and university governing council.
Practical Implications & Value
This study demonstrates that while libraries reference a variety of association privacy ethics (American Library Association, International Federation of Library Associations, Digital Library Federation), the low number of libraries that detail PII practices is noteworthy: such policies serve the need for institutional compliance rather than inform the patron. Additionally, a library is a dynamic ecosystem of third-party vendors of which patrons are rarely aware, including library management systems, database and publisher content, archival software, and Google Analytics, among others, all of which have their own privacy policies that library patrons likely overlook. This study shows that academic libraries have room to grow their roles in creating greater awareness and knowledge of user privacy, and that their primary portal for student usage, the library website, should robustly describe practices and policies and include opportunities for users to access more information about the “twilight of anonymity” (Ron Steslow, Politicology podcast, 21 Feb. 2024). This presentation is based on results published in Mann, E. Z., Jacobs, S. A., Kinsley, K. M., & Spears, L. I. (2023). Tracking transparency: An exploratory review of Florida academic library privacy policies. Information and Learning Sciences, 124(9/10), 285–305.
Website Analytics Platform Migration: A Case Study
Amy Swackhamer, University of Maryland Libraries
View Poster (PDF)
Keywords: websites, analytics, data privacy, Matomo
View “Website Analytics Platform Migration” abstract
Purpose & Goals
This is a case study reviewing the University of Maryland Libraries’ 2023 migration from Google Analytics to the privacy-oriented website analytics platform, Matomo. It covers the impetus for the project, the steps taken to research the Libraries’ technology and data requirements for analytics on the websites we manage, the search for available analytics platform options, the inputs used for testing and evaluation to make a platform recommendation, and the implementation of a new analytics platform.
Design & Methodology
The research process began with a broad, informal literature review of materials about website analytics privacy in libraries to understand the issues and trends specific to libraries, whose needs and values vary significantly from more prevalent commercial approaches to website analytics. Looking at generalizable website use metrics, surveying analytics users in our Libraries, and reading blog posts, platform reviews, podcasts, and other online information outside of academic research were also instrumental in identifying the analytics data important for tracking our website performance, finding appropriate platform options and comparing their advertised features, and creating testing criteria to assess and compare their performance on some of our websites during a trial period. Internal surveys and platform demonstrations, along with discussions with colleagues, informed my understanding of our organization’s analytics needs and the interface preferences of our internal users.
Findings
During this project, I developed a much better understanding of libraries’ concerns and challenges regarding website analytics platform use, as well as the current landscape and trends in website analytics platforms. I found that some of my initial ideas about criteria were misplaced once I researched and tested them further, uncovering a few refinements I would make if I were to conduct another analytics tool assessment. We also ran across a few unforeseen challenges that might be informative for anyone undertaking a similar project. Evaluating potential platforms using our own criteria resulted in evidence-based support for using a platform with an annual subscription cost. After the research, testing, and user feedback were complete, a clear leader emerged based on our Libraries’ preferences and needs, so in our case the process was a useful way to explore and winnow our options. After completing this project, I feel that moving from an analytics platform that uses websites as vectors to gather and monetize data about our visitors to one that respects privacy, while still providing the website usage data needed to assess our websites, is both achievable for many libraries and much more consistent with the values of the library profession.
Action & Impact
The research stages of this project culminated in recommending and implementing a new website analytics platform, so we have already taken action on the conclusions. By the time of the 2024 Library Assessment Conference, our new analytics platform will have been in use for over a year, and a presentation would incorporate an overview of any noteworthy aspects of our experience using the new tool, Matomo.
Practical Implications & Value
Website analytics are a ubiquitous tool used in the assessment of library websites. Sharing the work undertaken at UMD can increase awareness of user data privacy concerns with the most commonly used website analytics tool and alternative options to measure website use. For community members already familiar with concerns within the profession about data harvesting through library websites, providing more coverage that highlights this issue can increase the prominence of data privacy as an important consideration for libraries, even if analytics are only one piece of the puzzle. Outlining the process our Libraries used for evaluating website analytics platform options provides one model for stages in this process, information to gather, factors to consider, and resources needed for undertaking a similar effort. Libraries have different data needs, budgets, types and sizes of web properties, and technical expertise, but this case study describes an approach for reviewing analytics platforms that libraries in different situations could customize for their own specifications. People tracking analytics trends estimate Google Analytics to be used on about 85% of sites that monitor analytics. Due to its market dominance, many of the available analytics learning resources and integration tools only address Google Analytics, but much of that content is aimed at marketers in organizations without a commitment to user privacy or intellectual freedom. This case study will demonstrate that privacy-respecting analytics alternatives are becoming increasingly viable, as legislation like the European Union’s General Data Protection Regulation and the California Consumer Privacy Act incentivize the creation of tools that protect user privacy.
Using Visitor Counts for Decision Making
Cherie Turner, University of Houston Libraries
View Poster (PDF)
Keywords: User privacy, Library spaces, Decision making
View “Using Visitor Counts for Decision Making” abstract
Purpose & Goals
The University of Houston Libraries began using a card access turnstile system for entry to the MD Anderson Library in Fall of 2021. This system provides us with significantly more detailed information about library visitors than we have previously had, which has raised significant privacy concerns. This poster will discuss how we have balanced privacy concerns and utility, and how we have paired this data with other data sources (like circulation data) to inform decisions related to library hours, staffing of service points, safety policies, and policies about entry into the library.
Design & Methodology
This project relies on combining transaction-level entry data from the access gates with entry data from a security desk and, in some cases, transaction-level circulation data. To protect user privacy as much as possible, our primary methods involve aggregated counts and averages. We also categorize by the academic calendar and the day of the week, and we incorporate campus partners’ concerns and policies into the analysis.
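A minimal sketch of this kind of privacy-protective aggregation appears below, assuming a hypothetical CSV export of timestamped, de-identified gate entries and a locally maintained academic-calendar lookup. The file and column names are illustrative, not the Libraries’ actual data model.

```python
# Minimal sketch of the privacy-protective aggregation described above,
# assuming a hypothetical CSV of timestamped gate entries (one row per tap)
# with no identifiers retained. Column names are illustrative.
import pandas as pd

entries = pd.read_csv("gate_entries.csv", parse_dates=["entry_time"])

# Derive the grouping fields used for analysis.
entries["date"] = entries["entry_time"].dt.date
entries["day_of_week"] = entries["entry_time"].dt.day_name()

# Hypothetical lookup mapping each date to an academic-calendar period
# (e.g., "semester", "finals", "break").
calendar = pd.read_csv("academic_calendar.csv", parse_dates=["date"])
calendar["date"] = calendar["date"].dt.date
entries = entries.merge(calendar, on="date", how="left")

# Aggregate: daily totals, then average daily entries by calendar period
# and day of week. Only these aggregates are shared beyond the analyst.
daily = entries.groupby(["date", "period", "day_of_week"]).size().rename("entries")
avg_daily = daily.groupby(["period", "day_of_week"]).mean().round(1)
print(avg_daily)
```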
Findings
Having access to transaction-level entry data for our library is extremely useful, but it complicates many decisions made in the Libraries, including security decisions, concerns about user perceptions of the Libraries, and the need to establish and revisit policies on both access to the Libraries and the maintenance of the data. We have almost certainly not found all of the ways this change requires us to change our practice.
Action & Impact
We have used this information to inform library hours and to understand staffing needs at service points. We have also used it to surface areas of concern, both within the Libraries and with campus partners such as facilities, the campus police department, student affairs, and university leadership. We are also beginning to apply what we have learned to outreach.
Practical Implications & Value
The main value to the community is in furthering conversations about how libraries can balance practical concerns with user privacy.
Communicating Assessment Project Results on Student Preferences: The Stickers Tell the Story on Fenwick Library
Jasmine Spitler, Elon University
Emily Nilson, George Mason University Libraries
View Poster (PDF)
Keywords: User experience, micro-assessment, student preferences, data visualization, student stakeholders
View “Communicating Assessment Project Results” abstract
Purpose & Goals
The poster addresses the issue of how to communicate micro-assessment results to student library users in an eye-catching manner.
Design & Methodology
In Fall 2022, the Assessment & Planning team at George Mason University Libraries conducted micro-assessments, which are “narrowly-focused, short assessment tools that can be quickly designed, implemented, analyzed, and used to make changes to library services” (Taylor, McMullin, Hackman, & Buller, 2016). These micro-assessments took place in the lobby of Fenwick Library, the library on the main campus, where posters hung on the windows with different kinds of questions for student library users to answer, using stickers! Different sticker colors corresponded to different categories of students: undergraduate/resident, undergraduate/commuter, graduate/resident, and graduate/commuter. All questions concerned student preferences for various library spaces and resources, including: What activities do you use the library for? Does Noise Zone impact your choice of library space? What is your Noise Zone preference? Are you interested in storage? What is your storage reservation length preference? What furniture do you use in the library? What library spaces do you use? Do you prefer natural light? There was also space for open comments. During data collection, several students asked the team how the results would be communicated to the public, so in a collaborative effort, the Assessment Librarian and the Database Integrity & Analysis Specialist created an infographic using the GNU Image Manipulation Program, the Alma Data Visualization Tool, and Inkscape. Many of the icons used in the infographic were created in-house, as was the floor-map visualization used to indicate student preferences among locations in the main library. The infographic was printed as a poster and displayed in the Fenwick lobby, and images were created for the Libraries’ Instagram account and the TV screens in Fenwick Library. The infographic also featured a QR code so that anyone interested could read the full report in the institutional repository.
Findings
Students were more interested in seeing the results of the study than the original research team expected. This created an opportunity for the Assessment Librarian and the Database Integrity & Analysis Specialist to collaborate on a data visualization representing the study’s findings. Throughout the process, however, scope creep manifested in unexpected ways; for example, printing was delayed by a color-profile mismatch between the open-source image software and the campus printers. The project also grew beyond its original scope, which led to a better end product but required heavy staff resources and made it difficult to meet the intended time frame.
Action & Impact
This was the first time at George Mason University Libraries that an assessment study’s results were communicated back to students, and the first time the Assessment team produced such an extensive data visualization. Findings were also presented to Administrative Services for review.
Practical Implications & Value
Communicating the results of an assessment project contributes to the overall body of library assessment work because it treats student users as the primary stakeholders for library services. Too often, assessment projects are completed and reported only to administrative services, without communicating the findings back to students. By creating this kind of feedback loop, we can look forward to students who are more involved, and more invested, in participating in assessment projects, because they know their feedback truly matters and that they will see an end result.
From Hands-On to Hands-Free: It’s Time to Automate Your Data Displays
Jennifer Moon-Chung, University of Pittsburgh
Berenika Webster, University of Pittsburgh
Additional Authors:
Rob Behary, Duquesne University
Anne Koenig, University of Pittsburgh
View Poster (PDF)
Keywords: Workflow Efficiency, Automated Reporting, Data Visualization
View “From Hands-On to Hands-Free” abstract
Purpose & Goals
This poster introduces approaches to automating regular dashboard updates, driven by the need to improve workflows that have traditionally been performed manually. Our objective was to refine the mechanism that updates the Assessment unit’s essential dashboards showcasing our collections and services, and to develop a workflow that efficiently manages the influx of ad hoc report requests. Through automation, we sought to shift librarians’ focus from labor-intensive tasks to more meaningful data analysis, thereby optimizing efficiency and deepening the insights gained.
Design & Methodology
An investigation into automating report updates focused on two primary tools: Microsoft Power Automate and Alteryx. To explore how the individual features of each tool might serve differing needs, dataset sizes, and resources, the project was conducted at two institutions: Duquesne University and the University of Pittsburgh. At Duquesne University, Power Automate was deployed to automate updates of hourly head counts. This was crucial for a mid-sized institution like Duquesne, which needed to monitor library space usage for staffing and operating hours. Data was originally collected via a LibInsight web form, with student workers entering time and user numbers by floor; the library analytics team would then download and publish this data monthly via Power BI. To address specific needs, such as monitoring usage trends during finals, the frequency of data updates was increased to once daily. The new automation approach transformed this process by parsing data directly from LibInsight into SharePoint spreadsheets, automating data cleaning, and then refreshing Power BI. This fully automated workflow significantly reduced the need for active effort and streamlined the update process, operating seamlessly without manual intervention.
The University of Pittsburgh, by contrast, encompasses several regional libraries and centers and was interested in exploring a greater diversity of tools because of challenges stemming from uneven user familiarity with the available software. This led to the exploration of new tools, including Alteryx, which had recently been introduced at the institution and proved promising for updating dashboard data. Our initial trials included parsing data from Alma Analytics via API, cleaning the data, and automatically updating Tableau datasets. Despite the considerable effort required for the initial setup, this approach has greatly streamlined data processing and analysis.
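The sketch below illustrates, in schematic form, the kind of pipeline described for the University of Pittsburgh: pulling a report from the Alma Analytics REST API, tidying it with pandas, and writing a file that a scheduled Tableau extract refresh consumes. The report path, API key, and cleaning steps are hypothetical, and the rowset XML layout varies by report; this is a sketch of the general pattern, not the authors’ production workflow (which used Alteryx).

```python
# Schematic sketch: fetch an Alma Analytics report, tidy it, and write a
# CSV that a scheduled Tableau extract refresh reads. Report path and API
# key are hypothetical placeholders.
import xml.etree.ElementTree as ET

import pandas as pd
import requests

API_BASE = "https://api-na.hosted.exlibrisgroup.com/almaws/v1/analytics/reports"
ROWSET_NS = "urn:schemas-microsoft-com:xml-analysis:rowset"

params = {
    "path": "/shared/Our University/Reports/Circulation Summary",  # hypothetical
    "limit": 1000,
    "apikey": "YOUR_API_KEY",  # placeholder
}

resp = requests.get(API_BASE, params=params, timeout=60)
resp.raise_for_status()

# Alma Analytics returns an OBI rowset: each <Row> element holds the cell
# values as child elements (<Column0>, <Column1>, ...).
root = ET.fromstring(resp.content)
rows = [[col.text for col in row] for row in root.iter(f"{{{ROWSET_NS}}}Row")]

df = pd.DataFrame(rows).dropna(how="all")  # basic cleaning; real workflows do more

# Tableau is pointed at this file; the scheduled extract refresh picks up changes.
df.to_csv("circulation_summary.csv", index=False)
```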
Findings
Exploring these automation tools has not only illuminated their potential to enhance operational efficiency but also changed how we understand and handle data. Developing the automated workflows prompted a thoughtful reassessment of our daily data management practices, encompassing activities such as recording, collecting, parsing, manipulating, and disseminating data. Automating these routine tasks positions librarians to devote more time to data analysis, thereby fostering more informed decision-making and facilitating strategic advances.
Action & Impact
This project is an ongoing endeavor to refine and enhance the automation workflows within library systems. While we have garnered some insightful findings and reflections, the nature of this project is evolutionary, with continuous improvements and adaptations based on emerging data and feedback.
Practical Implications & Value
Libraries across the spectrum employ comparable systems, tools, and methodologies for data sharing and management, so disseminating practical use cases of automation workflows is highly beneficial. Sharing them can significantly aid those engaged in data-related work who aim to reduce time spent on routine operations and expand their analytical efforts. The specific workflows the authors used for this project will also be made available, fostering a culture of efficiency and innovation within the library community. By adopting and adapting these proven workflows, the library assessment community is better positioned to concentrate on strategic analysis and decision-making.
Visualizing Library Impact: Creating Accessible Dashboards of Organization-Wide Data
Josie Cotton, Case Western Reserve University
View Poster (PDF)
Keywords: Tableau, data visualization, accessibility, data-driven decision making
View “Visualizing Library Impact” abstract
Purpose & Goals
We created dashboards of the core functions of our library, including physical collections and spaces, user services, electronic resources, and outreach and engagement. Our dashboards provide insight into how the library is being used and by whom, as well as into our overall value to and impact on the university. Furthermore, our dashboards promote a culture of data-driven decision making, so that stakeholders can use them to make choices that better serve users.
Design & Methodology
We created these dashboards with Tableau and used a wide variety of sources, such as data from the overhead counter, room reservations, Google Analytics, circulation, transactions, and e-resources usage. Additionally, to ensure that our dashboards are accessible to all users in our campus community, we applied best practices in accessibility, considering color contrast, text readability, and the styling of interactive elements.
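One way to operationalize the color-contrast check mentioned above is the WCAG 2.x contrast-ratio formula, sketched below in Python. The hex values are illustrative, not the dashboards’ actual palette.

```python
# Minimal sketch: checking palette colors against the WCAG 2.x contrast-ratio
# formula (AA requires at least 4.5:1 for normal text). The hex values below
# are illustrative placeholders.
def relative_luminance(hex_color: str) -> float:
    """WCAG relative luminance of an sRGB hex color like '#1f77b4'."""
    rgb = [int(hex_color.lstrip("#")[i:i + 2], 16) / 255 for i in (0, 2, 4)]
    linear = [c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
              for c in rgb]
    return 0.2126 * linear[0] + 0.7152 * linear[1] + 0.0722 * linear[2]

def contrast_ratio(fg: str, bg: str) -> float:
    """Contrast ratio between two colors, lighter luminance on top."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

for text, background in [("#1f77b4", "#ffffff"), ("#444444", "#f0f0f0")]:
    ratio = contrast_ratio(text, background)
    verdict = "passes" if ratio >= 4.5 else "fails"
    print(f"{text} on {background}: {ratio:.2f}:1 ({verdict} WCAG AA for normal text)")
```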
Findings
This project demonstrates how patrons use library resources, services, systems, and collections. It puts our organizational data in one place and makes it accessible to stakeholders.
Action & Impact
We are sharing these dashboards with relevant stakeholders, who can see how their core services are used. This allows stakeholders to make changes to better serve users. For example, in tracking the areas of the building and furniture that receive higher usage, we can make informed decisions about space layout and about purchasing additional furniture.
Practical Implications & Value
Literature on using data visualization for data-driven decision making tends to focus on one area of librarianship, such as collection management, e-resources, or access services (see Lewellen & Plum, 2016; Mishra, 2023; and Rose, 2017). In contrast, this project considers library data at an organizational level. The project is further distinguished by its emphasis on designing accessible dashboards.
References
Lewellen, R., & Plum, T. (2016). Assessment of e-resources usage at University of Massachusetts Amherst: A MINES for Libraries study using Tableau for visualization and analysis. Research Library Issues, 288, 5–20.
Mishra, S. (2023). Use of information visualization techniques for collection management in libraries: A conceptual review. Library Philosophy and Practice, 7842.
Rose, K. (2017). Data on demand: A model to support the routine use of quantitative data for decision-making in access services. Journal of Access Services, 14(4), 171–187.
Step up your style: How to improve your library’s data analytics by implementing a style guide
Krystal Wyatt-Baxter, University of Texas at Austin
View Poster (PDF)
Keywords: data visualization, analytics, Tableau
View “Step up your style” abstract
Purpose & Goals
This poster will explain how the process of writing and implementing a data visualization style guide has cascaded into programmatic improvements in my library’s approach to data analytics. Over the past decade, my organization has gone from one-off experiments with Tableau to a full analytics program with multiple visualization designers, data manipulators, and audiences. I found myself with a cluttered suite of visualizations that represented my team’s growing Tableau skills over the years rather than a cohesive set of informative tools. Through designing, writing, and implementing a Tableau style guide, I made other incremental improvements to my organization’s data analytics approach, including a documented dashboard design process, a data quality assurance checklist, and a standardized approach to publishing and retiring visualizations. I hope to provide guidance and reassurance to librarians who may be at the beginning of their data visualization journeys by sharing lessons I have learned throughout the process.
Design & Methodology
To create a Tableau style guide, I gathered examples from news organizations, institutional research offices, and businesses, and I researched best practices in data storytelling, visual design, and accessibility. I designed templates with set fonts, sizes, and color schemes for creating new dashboards, and I wrote an accompanying document detailing instructions and best practices. My team is now retrofitting all of our existing dashboards to the new style guide and template, improving the usability and accessibility of our visualizations.
Findings
We have found that implementing the style guide is leading to improvements in our entire approach to data analytics, from design to accessibility to data quality.
Action & Impact
I plan to continue implementing the style guide until we have a stylistically cohesive set of dashboards that are more trustworthy and usable than what we previously offered.
Practical Implications & Value
I hope to help others starting out in data visualization learn from my mistakes and take the time to set standards and styles at the outset. This poster will contribute one approach to ensuring that we make our library data maximally impactful for our colleagues.
Develop a Library-wide User Research Hub
SuHui Ho, University of California, San Diego
Keywords: user research hub, user research repository, UX
View “Develop a Library-wide User Research Hub” abstract
Purpose & Goals
The poster discusses University of California San Diego Library’s process in developing a Library-wide User Research Hub using UX design principles. Across our library, many programs and units collect data about library users’ interactions with library resources and services, preferences, and more. There was no mechanism for consistent and broad sharing of these rich datasets across programs, which resulted in duplication of effort and missed opportunities for collaboration or data-driven decision-making. Our project aimed to develop a user-friendly mechanism for collecting and sharing user research.
Design & Methodology
We used the UX design process, including stakeholder interviews, persona and user story development, competitor research and landscape review, platform selection, and prototyping and evaluation, to pinpoint user needs and select a tool to serve as a repository for in-house user research studies and data.
Findings
Because the research hub may serve many audiences, identifying a primary user and developing the hub for that user is important. Developing a taxonomy and templates provides guidance and governance for staff contributing their studies to the hub. Tool selection should focus on database features, including tagging that allows similar research to be identified.
Practical Implications & Value
The project will allow library staff to find and view related research from programs across the library, and it provides a structure to ensure the ongoing growth and organization of the Hub, with guidance on what information does and does not fall within scope and special attention paid to user privacy.
Advanced Analytics for DSpace
Federico Verlicchi, 4Science
View “Advanced Analytics for DSpace” abstract
Explore how 4Science’s innovative Analytics & Reporting and Content & Usage Statistics add-on modules enhance data insights for DSpace repositories. This poster showcases advanced statistical tools that go beyond basic DSpace usage metrics, offering tailored insights into your repository’s performance and impact.
Discover how these powerful add-on modules can help you make data-driven decisions, demonstrate your institution’s research value, and optimize your repository strategy. From customizable dashboards to advanced visualization options, see how enhanced statistics can transform your understanding of user engagement and content utilization. Stop by our poster to explore the future of your DSpace!
Methods
Library Users’ Perceptions Pre- and Post-Pandemic: Comparing 2017 and 2023 LibQUAL+ Survey Results
Andrea Schuba, University of Maryland
Amy Swackhamer, University of Maryland Libraries
Enhancing data analysis methods for evaluating research statistics using generative AI chatbots as a novice data analyst
Criss Guy, University of North Carolina at Chapel Hill
Beginning with the End in Mind: Monday.com as a Project Management Tool in the Library
Erin Cheever, UL Research Institutes
What Good Did We Do? Five Years of National Data from Project Outcome for Academic Libraries
Gena Parsons-Diamond, Association of College & Research Libraries
Partners in Kind: Collaborating with Generative AI to Complete a Compassion Audit of Student Programming and Services at the University Library
Janice Grover, University of Wyoming
Jessica Rardin, University of Wyoming Libraries
Assessing Large Post-It User Comments for Understanding and Decision-Making: A Secondary Assessment Method
Kris Johnson, Montana State University Library
Measuring Information Services Outcomes over Time: A Longitudinal View of the MISO Survey at the University of Tennessee
Louis Becker, University of Tennessee, Knoxville Libraries
Where’s the Search Box?: Usability Testing Research Guides Created in LibGuides and Adobe Express
Piper Cumbo, Auburn University
Abigail Higgins, Auburn University
Strategy & Assessment
Sharing the Love (of Assessment)
Sephra Byrne, University of North Texas Libraries
Lidia Arvisu, University of North Texas
Leveraging Assessment to Foster Frontline Voices
Elena Carrillo, University of Illinois Chicago
Jung Mi Scoulas, University of Illinois Chicago
One Form to Rule Them All? A Journey to Improve Reporting
Elizabeth Cope, University of Tennessee, Knoxville
Jennifer Mezick, University of Tennessee, Knoxville
Diversity Statement Rubric Framework
Elizabeth Dill, California State University, Long Beach
Aisha Johnson, Georgia Tech Library
Evaluating How Library Employees Apply Their EDI Training
Jeffery Loo, UC San Diego Library
Erik T. Mitchell, UC San Diego Library
Student voices at the core of assessment: one academic library’s approach
Jerry Limberg, California State University San Marcos
Empowering Middle Managers for Strategic Plan Implementation & Assessment
Rebecca Greer, University of California, Santa Barbara
Dave Kujan, University of California, Santa Barbara Library
A Year at a Glance: Implementing Manageable, Scalable Annual Reviews
Rio Picollo, University Canada West
ACRL Proficiencies for Assessment in Academic Libraries: Prioritizing for Proliferation in the Profession
Stephanie Crespo-Méndez, Syracuse University
Adapting Institutionally Mandated Assessment at the University of North Texas
Whitney Johnson-Freeman, University of North Texas
Library Internal Reviews: A Journey to Describe Value and Assist Decision-Making in a Culture of Assessment
Holt Zaugg, BYU
Access, Services & Spaces
Conducting Library Audits
Holt Zaugg, BYU
Engaging library patrons in space planning: creative methods for gathering feedback
Beth Filar Williams, Oregon State University Libraries and Press
Rachel Burgess, Oregon State University
Stable Links, Broken Picture: Reframing the Best Practices of Library Guide Design Based on Guide Usage Data
Brittany Norwood, Princeton University
“Where Can I Study?” A Mixed-Methods Approach to Evaluating Library Study Spaces Webpages
Sarah DeVille-Holly, Texas Tech University
Kimberly Vardeman, Texas Tech University
Enhancing User Experience through Comprehensive Updates of University Libraries Website
Young Joo Jeon, University of South Carolina
From Data to Action: Custom Dashboard Solutions for Virtual Chat Reference Excellence
Guinsly Mondesir, University of Toronto
The Evolution and Implementation of a Libraries Website Usability Study
Jen Mayer, University Libraries, University of Northern Colorado
Exploring the impact of a library makerspace on student self-efficacy and engagement
Jennifer Church-Duran, University of Arizona Libraries
Insights from a Study of Pre-Peri-Post-Pandemic Library Visit Data
Jordan Packer, Columbia University
Nisa Bakkalbasi, Columbia University
Computational Analysis of Chat Transcripts to Inform Services and Guide Student Success
Regina Beard, Florida Gulf Coast University
Rachel Tait-Ripperdan, Florida Gulf Coast University Library
It’s Worth the Time: Conducting a DEI-Inflective Library Space Audit
Susanna Cowan, University of Connecticut
Beyond the Numbers: Assessing Self-Checkout Through a Lens of Inclusivity
Tobi Hines, Cornell University Library
Andrew Horbal, Cornell University Library
Student Success & Retention
Exploring Undergraduates’ Engagement: Understanding Students’ Use and Non-Use of the Academic Library
Jung Mi Scoulas, University of Illinois Chicago
Sandra De Groote, University of Illinois Chicago
Redefining student success: Exploring perspectives and analyzing definitions
Jung Mi Scoulas, University of Illinois Chicago
Articulating Value: Assessing the Reach and Impact of Open House for First-year Undergraduate Students
Leigh Tinik, Penn State University
Steve Borrelli, Penn State University Libraries
Exploring Predictors of Undergraduate Students’ Perceptions of Belonging at a Public Research University
Linda Naru, University of Illinois Chicago
Jung Mi Scoulas, University of Illinois Chicago
Exploring Collaborative Service Models: An Analysis of Peer-Assisted Learning Services in a Learning Commons Environment
Meredith Knoff, Indiana University Bloomington
Leanne Nay, Indiana University Libraries
Evaluating the Experiences of Hispanic & Latine Students in an Academic Library
Rachel Olsen, UNC Greensboro
Empowering First-Generation Graduate Students (FGGS) in Academic Research: A User-Centered Design Approach
Sabrina Lin, University of Washington Libraries
Assessing the Value and Impact of Library Student Employment: New Perspectives from the California State University
Scott Walter, San Diego State University
Cyril Oberlander, California State Polytechnic University, Humboldt
