Wednesday, November 6
9:00 a.m.–12:30 p.m. | Pre-conference Workshops
View descriptions on the Workshops page
Accessible Data Communication and Visualization
Negeen Aghassibake, University of Washington
Maggie Faber, University of Washington
Practicing Participation: Tools, Techniques, and Process for Participatory Design in Library Assessment
Scott Young, Montana State University
1:30 p.m.–5:00 p.m. | Pre-conference Workshops
Communicating and Using Assessment Data for Change and Impact
Becky Croxton, Colorado State University Libraries
Megan Oakleaf, Syracuse University
What Makes It a Library? Approaches to Library Space Assessment
Valrie Minson, University of Florida Libraries
Laura Spears, University of Florida Libraries
Exploring Holistic Impacts of Whiteness on Collection Building Practices
Arthur Aguilera, University of Colorado Boulder
Amanda Rybin Koob, University of Colorado Boulder
Thursday, November 7
8:45 a.m.–10:00 a.m. | Opening Session & Keynote: Ciji Heiser
Equity-Centered Assessment: A Call to Action for Libraries
10:30 a.m.–12:30 p.m. | Concurrent Session 1
Paper Session 1: Teaching & Learning (Galleria North)
A collaborative assessment approach: Evaluation of an advanced searching workshop
Kim Bates, University of Alberta
Megan Kennedy, University of Alberta
Draft Paper (PDF)
View Slides (PDF)
Keywords: collaborative assessment, systematic review, advanced searching, online learning
Purpose & Goals
The Sperber Health Sciences Librarians have regularly offered an intensive workshop on systematic review searching skills over the years. The workshop has taken several forms in terms of length and mode of delivery; at the time of this project, it was delivered virtually as four 1-hour synchronous sessions offered on consecutive days in a single week. To accompany the synchronous workshop, asynchronous modules covering the same content were also available to learners. The workshop teaches researchers, usually graduate students from various health sciences faculties, how to design, execute, and report a robust, comprehensive search strategy appropriate for publication in a systematic or scoping review. In spite of offering the workshop on a near-monthly basis, reaching approximately 900 learners per year, and having the online asynchronous modules, librarians still conducted a large number of one-on-one consultations with researchers completing systematic review searches. These consultations often covered nearly identical content to the workshop, making them repetitive and time-consuming. They also seemed unproductive, since there was not always sufficient time to design the search strategy for the review during a 1-hour appointment, leading to multiple consultations for a single project. The goals of this assessment project were three-fold:
1. Determine how well students were learning the foundational skills required to conduct the initial steps of a systematic search as part of a larger systematic or scoping review
2. Determine the delivery preferences of learners
3. Understand the knowledge gap between the theory of advanced searching and the application of this information to complete a systematic search
Design & Methodology
This was a collaborative project of the recently formed Assessment and Insight Team (AIT) and the Health Sciences Librarians at the University of Alberta Library. The project was led by two members of the AIT, who worked with two of the Sperber librarians to create an assessment strategy. The project took seven months in total to implement and complete. A literature review was conducted to determine the best approaches for assessing learning while keeping the workload manageable, and therefore the project feasible. The assessment strategy was survey based and consisted of three parts:
1. A post-test style survey was used to assess the learning of researchers who had taken the workshop in the three months prior to beginning this assessment project (August–October 2022). This survey included “test your knowledge” style questions as well as questions about demographics and mode of delivery preferences.
2. In-class problem-based learning style quizzes were delivered at the end of the daily workshop sessions, and data was collected to determine which concepts learners were struggling with and which were clear. These quizzes took approximately 5 minutes for learners to complete and were promoted as a good way for learners to test their own learning.
3. A follow-up survey was distributed to attendees (January–March 2023). The survey asked questions about demographics, mode of delivery preference, and “confidence” with skills taught at the workshop. These confidence questions were adapted from the Fresno Test and the “researcher-readiness” assessment literature.
Surveys were distributed using Qualtrics survey software. Daily quizzes were implemented using Google Forms. Additional assessment had been completed for the in-person version of the workshop that existed in previous years; however, that assessment was limited mostly to “satisfaction” with the workshop rather than whether attendees learned core skills.
Findings
From the daily quizzes, we found that responses demonstrated a fairly strong understanding of the concepts. There were two questions with a notably high number of incorrect answers: one related to the inclusion of subject headings (MeSH) in the search strategy, and the other about selecting the most appropriate review methodology for a research question. This was not unexpected, as Sperber Librarians frequently discussed both of these concepts during one-on-one research consultations. From the surveys, we found that most attendees were graduate students, post-docs, or research associates/staff from various Health Sciences faculties. Learners preferred virtual synchronous instruction as the mode of delivery. Additionally, shorter (1-hour) sessions spread over a single week were noted to be preferable, as these were easier to fit into learners’ schedules. Many respondents mentioned that they preferred to have learning materials (such as slides and handouts) made available ahead of time and that the workshop be recorded. Interestingly, when asked if they had reviewed the asynchronous modules available to accompany the workshop, nearly half said they had not reviewed them and the other half said they did not know the modules existed. The majority of survey respondents responded positively to questions about confidence with skills taught during the workshop.
Action & Impact
Based on the data collected for this project, the following recommendations have been implemented:
- Virtual synchronous workshops were maintained as the mode of delivery, but reduced from four days to three.
- Content was realigned; extraneous content on the systematic review method, rather than the search, was removed.
- Slides and handouts are sent to attendees ahead of the sessions. One session series per term is recorded, and these recordings are made available to learners after attending.
Further, the lack of knowledge or use of the asynchronous modules has led to a complete revitalization of this content. Many of the skills taught at the workshop benefit from supporting materials that provide additional explanation or demonstration. There are also many links to various handbooks, reporting guidelines, and other documents that support systematic searching that could be made more readily accessible if included in an asynchronous learning module. Additionally, an asynchronous module can be used by learners who are unable to attend the workshop for scheduling reasons, and by librarians as a teaching tool during their one-on-one research consultations. Because of the importance of the content in these modules, the Health Sciences Librarians decided to revamp the asynchronous modules to make sure the content is up to date and more accessible. To complete this work, the Health Sciences Librarians have worked with an early-career librarian who is part of the “Library Resident” initiative at the University of Alberta.
Practical Implications & Value
This is an example of a collaborative assessment project between a functional “assessment team” and a specialized library unit. In this case, members of the AIT provided insight on best practices and options for data collection, survey design, and data analysis, while health librarians provided detailed knowledge about the various aspects of the workshop, including content, past and present methods of delivery, and potential limitations for implementing an assessment strategy. Both contributions were necessary to develop an effective and meaningful assessment strategy. Data from this assessment project has been used to make evidence-based changes to the workshop and make it as useful as possible for both learners and librarians. Further, the data from this project has been used to initiate additional collaborative projects within the library.
An assessment project of research consultations
Megan Kennedy, University of Alberta
Lucinda Johnston, University of Alberta Libraries
Draft Paper (PDF)
View Slides (PDF)
Keywords: Research consultations, surveys, focus groups, evidence-informed decision-making
Purpose & Goals
In Spring 2022, the University of Alberta Library (UAL) Assessment and Insight Team (AIT) inventoried the services and activities of the various units of UAL in order to identify gaps in assessment activities and to determine where AIT might provide assistance. We discovered that assessment practices varied significantly for the teaching and consulting activities of UAL’s librarian community. As teaching and consultation form a substantial portion of the work for much of UAL’s librarian community, the AIT planned to develop logic models for teaching and consultation (building on the teaching logic model exemplar in the Canadian Association of Research Libraries’ Library Impact Framework, released in 2021). However, the leads for this project (the authors) soon realized that there was a conflation of practices and ideas related to consultation and instruction. They undertook an assessment of current consultation practices, with the goal of delineating the two functions and also exploring the needs of the librarian community regarding the assessment of consultations. At this stage, we were not focused on the impact of consultations from the user perspective. Our assessment activity was intended to be informal in nature and initially for the purpose of developing a logic model, which would include assessment tools for the UAL librarian community. However, based on the results from the overall assessment project, the goals of the project evolved. In the end, our question changed from “How do we conduct and assess consultation?” to “How do we support consultation practices?”
Design & Methodology
Our literature review at the start of this project found that Canadian academic libraries generally had few assessment methods for research consultations, and that there are fewer formal assessments of one-on-one consultations than of group instruction. The literature we did find focused on measuring the impact of research consultations from the user perspective, which was not our focus. There was consensus that research consultations are important to support student learning, but there was little to no literature on how to conduct effective research consultations. To learn more about current assessment and consultation practices at UAL, we developed a survey that asked broadly about both. We used Qualtrics to host and distribute the anonymous survey. It was sent out in July 2022 via a Google Group email to all UAL staff, librarians, and archivists, and remained open for one month. Respondents were asked to provide their email address if they wished to participate in a follow-up focus group to take place in fall 2022. The survey had 25 responses overall, a 15% response rate. However, although we sent it to all library staff to capture all consultation activity, the responses we received made it evident that most respondents were from the librarian community, which would indicate a much higher response rate from the main target audience (~37%). Of the respondents, 14 volunteered to participate in a focus group to further discuss their consultation practices and assessment needs. These volunteers were split into two groups to facilitate greater conversational flow. The focus groups were facilitated by the authors using a hybrid meeting format, since many volunteers were still working from home at this time. The focus groups consisted of engagement and discussion-type questions.
Findings
Our survey showed that consultations are highly variable in terms of purpose, audience, and process, though some generalities did emerge. The most common reasons for consultations were related to coursework or research. Although most respondents described a similar approach to their consultations, practices varied depending on the consultation purpose or audience. Regarding assessment, many respondents felt that they lacked the time to meaningfully assess their consultations and that user feedback surveys distributed after a consultation could be a burden for patrons to complete. However, respondents stressed that knowing they were being effective during their consultations was important. Self-evaluation was identified as the most common form of assessment, but many respondents felt they lacked the time to thoughtfully review the results and implement changes. After reviewing the survey data, our question changed from “How do we conduct and assess consultation?” to “How do we support consultation practices?”, which guided our focus group discussions. At the focus groups, participants identified several reasons for attending: supporting sustainable and effective consultations, skills development, and knowledge sharing. Participants noted differences between consultation and teaching, reinforcing that consultation practices are highly variable and individual. They also recognized that their experience of a successful consultation might differ from their patrons’, and that overall it was important for everyone to have a good experience. Concerns about the efficacy of consultations continued to come forward, as well as about the sustainability and quality of this service. However, it was stressed that consultations should not be evaluated simply for the sake of it. Given that extra time is always in short supply, any evaluation should be meaningful and generate positive changes in librarians’ skill sets and the student experience.
Action & Impact
Based on our findings, we developed a consultation toolkit that contains a mix of supportive and self-assessment resources, and includes multiple options to suit the needs of highly variable and individual consultation practices. There are three sections within the toolkit: 1) general best practices for conducting consultations; 2) training resources; 3) strategies for improving consultation practices. The third section is broken down into five sub-sections:
- Recognizing Student Learning and Cognitive Load
- Asking Effective Questions
- Conducting Effective Self-assessment
- Collecting User Feedback
- Turning Feedback into Actions (for Improved Service Delivery)
Use of the toolkit is optional, and its assessment tools are intended for personal reflection and improvement of consultation effectiveness. The data generated by these tools is for librarians’ personal use and professional development only, and is not intended to inform librarian performance reviews or system-wide practices. However, librarians may choose to report on their evaluation/reflection activities and any related professional growth, and/or identify professional development needs. In addition to improved consultation practices, we hope that this toolkit will help to foster a greater culture of assessment, which is still developing at our institution. It was also suggested that a community of practice (CoP) dedicated to consultations could be established. This recommendation has yet to be implemented due to ongoing discussions about which library unit should be responsible for developing and maintaining the consultation CoP.
Practical Implications & Value
This is an example of a multifaceted assessment project that engaged a large portion of the professional librarian community at a single institution. This assessment was unique in that we were assessing a library service—research consultations—from the perspective of the population delivering the service (in this case, primarily librarians) rather than the users of the service. There was initial concern that librarians might feel defensive about being “assessed” on their practices, and that there could be a lack of participation in this project. However, effectively framing the assessment project as a supportive endeavour, rather than as “peer judgement” of librarians’ delivery of research consultations, demonstrates how a project like this can be a success. This assessment project evolved organically and responsively to the data we collected. We believe this is an important element to consider when planning an assessment project as direction can and should change as the project evolves. This project has led to evidence-informed decision-making to support the development of consultation practices. Further, the data collected has led to the development of a tangible, practical resource that can support librarians as they continue to deliver the important service of research consultations.
Faculty Opinion of Subject Librarians
Duane Wilson, Brigham Young University
Emily Rodriguez, Brigham Young University (Student)
Draft Paper (PDF)
View Slides (PDF)
Keywords: Subject Librarian, Quantitative Methods, Survey
Purpose & goals
This study was a follow-up to a qualitative study of faculty members’ opinions of subject librarians. The purpose of the study was to determine faculty members’ opinions of the subject librarians they work with. It sought to understand faculty members’ opinions on which subject librarian duties and qualifications were most important, and whether the findings from the qualitative study were generalizable to a larger group of faculty members on campus.
Design & methodology
The primary author developed survey questions by reviewing surveys in the literature. The supervisor of the subject librarians selected the questions that they thought were most important and these questions were refined to focus on issues pertinent to the findings of the qualitative study. Because of campus restrictions, the sampling followed a quasi-experimental design. Subject librarians were asked to provide lists of faculty members they worked with frequently, occasionally, and rarely. These faculty members were asked to take the survey.
The data was analyzed using Excel, Tableau, and SPSS. To test for statistically significant differences across demographic characteristics, an ANOVA was run comparing responses between groups. In addition to the overall report, each subject librarian was provided with an individualized report with aggregate findings and anonymized comments from their faculty members.
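The paper names Excel, Tableau, and SPSS as the analysis tools; purely as a hedged illustration for readers who script their analyses, a one-way ANOVA like the one described could be run in Python along these lines (the file and column names are hypothetical):

```python
# Minimal sketch of a one-way ANOVA across demographic groups, assuming a
# hypothetical survey export. The authors used SPSS; this is an illustration.
import pandas as pd
from scipy.stats import f_oneway

df = pd.read_csv("faculty_survey.csv")  # hypothetical export of survey responses

# Compare a rating (e.g., importance of a librarian duty) across a demographic
# characteristic such as faculty rank.
groups = [g["importance_score"].dropna() for _, g in df.groupby("faculty_rank")]

f_stat, p_value = f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # small p suggests group differences
```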
Findings
Faculty members were familiar with subject librarians and happy with their performance. Helping students find resources and maintaining collections were the duties most important to faculty members. Faculty members contacted subject librarians infrequently, even for the duties they listed as important. There was no strong correlation between the importance of subject librarian duties and the frequency with which faculty members contacted subject librarians about those duties.
Faculty members thought that library knowledge, communication skills, and people skills were the most important subject librarian characteristics. University status, library degree, and subject degree were considered less important. Email was the preferred contact method for faculty members. The findings of this study largely mirrored the findings from the qualitative study.
Action & Impact
This study has helped subject librarians understand which priorities matter most to faculty members. As a result, they were able to adjust their focus to working with faculty on the things that are most important to them. It has also helped settle a debate in the library, where some subject librarians thought that a doctorate was an important qualification for a position; this study made it clear that faculty members did not care what degree librarians have. The study will help with future position creation and hiring decisions, since it identifies the most important characteristics that subject librarians should have.
Practical Implication & Value
This study helps support and update the ongoing discussion of faculty members’ opinions of subject librarians. It shows subject librarians that they need to focus more on faculty members’ terminology and agenda if they want to work with them successfully. The survey can be used as a template by other institutions to evaluate their own faculty members’ opinions and priorities related to subject librarians. It can also be used to inform position and hiring decisions at other institutions.
Using analytics and qualitative methods to improve and sustain online tutorials and research guides
Samantha Harlow, UNC Greensboro
Joshua Olsen, University of North Carolina at Greensboro
Additional Author:
Rachel Olsen, UNC Greensboro (rcsander@uncg.edu)
View Slides (PDF)
Keywords: Research tutorials, LibGuides, asynchronous assessment, mixed method
Purpose & Goals
Academic libraries produce many online learning objects, but with this abundance of materials comes the need for robust assessment. Librarians from the University of North Carolina Greensboro (UNCG) create, edit, and market a suite of research tutorials, as well as many Springshare LibGuides (research guides). The research tutorials, ULTRA (University Libraries Tutorials for Research Assistance), have become a popular research resource, and LibGuides are integrated in the learning management system (LMS) as well as the library website, where they are highly used. There are now hundreds of course and subject LibGuides, as well as dozens of research modules in use by UNCG and global patrons. This paper’s goal is to discuss asynchronous assessment strategies for LibGuides and ULTRA, how these virtual resources have changed in response to feedback, and what future directions we should take to improve on and sustain our success. This paper’s research question is: “How can academic librarians best perform assessment using analytics and qualitative methods to improve and sustain online tutorials and research guides?”
Design & Methodology
Assessment can highlight the effectiveness of online tutorials and research guides by looking at patron research needs. Some libraries assess their tutorials and LibGuides by looking at a variety of data, including analytics and patron surveys; Blummer (2007) evaluated academic library tutorials using this kind of mixed-methods study. Implementing tutorials through a learning management system (LMS) and within a course allows librarians and instructors to measure whether or not information literacy learning outcomes are met (Fontane, 2017; Henrich & Attebury, 2012). Being able to compare tutorial types, such as video versus interactive HTML5-based web pages, can help determine what method academic libraries should take when creating research resources (Lantz et al., 2017; Stonebraker, 2015). Assessing whether patrons gained knowledge after taking tutorials is useful when designing research resources, whether through post-tests or usability studies (Fontane, 2017; Held & Gil-Trejo, 2016; Lindsay et al., 2006). For this online tutorial and LibGuide assessment project, a mixed-methods approach was used. Data was taken from online analytic tools, as well as from a feedback form that users can fill out toward the end of each tutorial. We also conducted a diversity, equity, inclusion, and accessibility (DEIA) audit of each tutorial. Through these evaluations we hope to ensure accessibility, respect for diversity and inclusivity, and transparent content that encourages continual learning. These DEIA audits were conducted by Library and Information Science (LIS) graduate interns and were completed using rubrics to score each tutorial. Lastly, students are being surveyed about the ULTRA tutorials’ relevance to assignments and research practices in their courses. Citations and more literature on asynchronous online assessment: https://go.uncg.edu/s9otjg
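The paper does not detail how the rubric scores from the DEIA audits were tallied; purely as a sketch of one plausible aggregation (the file layout, column names, and cutoff below are invented):

```python
# Hypothetical aggregation of DEIA audit rubric scores, one row per
# tutorial per rubric criterion as scored by the LIS graduate interns.
import pandas as pd

scores = pd.read_csv("deia_audit_scores.csv")  # columns: tutorial, criterion, score

# Average score per tutorial across criteria, lowest first.
summary = scores.groupby("tutorial")["score"].mean().sort_values()

# Flag tutorials below an illustrative cutoff (3 on a 1-5 rubric) for revision.
needs_revision = summary[summary < 3]
print(needs_revision)
```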
Findings
Having a variety of assessment data and methodologies to improve online tutorials and LibGuides has produced many different findings. So far we have found:
- Issues with the learning management system research modules, reported through the form on the ULTRA tutorials, with professors using older versions of the materials. This has helped us think through our communication strategies with instructors to better integrate the modules within Canvas.
- Patterns in access to LibGuides through Canvas and the library website, which help librarians understand how students are connecting to materials, as well as navigation issues within LibGuides.
- Diversity and inclusion concerns within ULTRA to best represent our students.
- Survey results showing problems with link maintenance, usability issues, and ideas for improvement for when the modules are migrated to a new system.
Action & Impact
In terms of LibGuides assessment, this data is helpful when reviewing the structure and consistency of LibGuides, as well as how students are finding guides through the website or the LMS Canvas. UNCG University Libraries will be migrating to a new online tutorial system in 2025 due to issues with server space; these tutorials and modules will now run through Springshare LibWizard. This has created an ideal opening for us to make updates and improvements to our materials based on feedback and our previous experiences. The DEIA audit and the survey allow librarians to have a sustainable assessment method in place for improving and reviewing content for the research modules. Ultimately, these modules can become more relevant to UNCG-specific research assignments and student needs.
Practical Implications & Value
Librarians are creating more asynchronous online content, services, and teaching sessions than ever before in higher education, demanding increasing retrospective analysis of what instructional content exists, what works, what does not, and what projects should come next. The ability to tell the story of how libraries meet the needs of learners (including online and distance students) is essential as we continue to navigate through this dynamic and transformative era of academic librarianship. This paper will provide a blueprint for asynchronous assessment strategies for all academic librarians that can be adapted based on each team’s institution and needs.
Paper Session 2: Collections Assessment (Galleria South)
Combining Circulation and Citation Metrics to Assess an Approval Profile
John Russell, The Pennsylvania State University
Andrei Mihailovic, Marquette University
Draft Paper (PDF)
View Slides (PDF)
Keywords: collections, approval plans, text mining, citation analysis
Purpose & Goals
Marquette University’s Raynor Library completed a large-scale weeding project in 2023 to accommodate a renovation that significantly reduced available collection space. With shrinking stacks and more weeding on the horizon, an opportunity arose to refine the approval plan with the goal of reducing the ongoing print footprint of the collection. This assessment project targeted various profiles of the approval plan, which is how the library receives most of its print, to identify areas that can be switched from automatic print shipments (or autoshipments) to slips.
Design & Methodology
The approach for this project was a two-stage review of approval profiles, using circulation data at one stage and citation data at the other. GreenGlass data was used to pull circulation rates for every autoship profile of the approval plan, and a heat map of the data established a baseline for performance evaluation, using the principle of relative efficiency. For example, 39% of the entire collection has never circulated overall, so if only 22% of a smaller area of the collection has never circulated, that area is deemed a strong performer. For the second stage, an R-based text mining program was used to simplify the gathering of citation information. A list of print titles sent on approval over an eight-year span was run against a corpus of recently published, discipline-appropriate faculty and graduate student publications from Marquette’s institutional repository. Citation counts were then pulled and tallied after data cleanup and the elimination of false positives. The citation counts and circulation rates generated by this process were used to compare subdisciplines within an autoship profile, along with the size and cost of each subdiscipline’s annual autoshipments. For example, since approval profiles are based on Library of Congress Classification (LCC) ranges, when the researchers looked at the JCs, they investigated autoshipped print titles acquired over the life of the GOBI approval plan in each subdiscipline of the JCs, comparing JC1-50 (Theories of the state) to JC571-605 (Rights of the individual), and every subdiscipline in between. The data for each range was scrutinized for citation counts and circulation rates, and problem areas were identified using both sets of metrics. The results were further scrutinized for limitations, including a survey of faculty and department pages to see whether research on subjects relevant to the profiles under consideration had not been uploaded to the repository.
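The authors describe their matcher only as an R-based text mining program. The core step, searching each approval title against the text of repository publications and tallying matches, might look like the following rough sketch (here in Python rather than R, with invented file and column names; real use would need the same false-positive review the paper describes):

```python
# Rough Python analogue of the R-based matching step: count how many
# institutional repository publications cite each autoshipped title.
import re
import pandas as pd

titles = pd.read_csv("approval_titles.csv")    # autoshipped print titles, 8-year span
corpus = pd.read_csv("ir_publications.csv")    # repository publications with reference text

def count_citing_pubs(title: str) -> int:
    """Count publications whose reference text mentions the title (case-insensitive)."""
    pattern = re.compile(re.escape(title), re.IGNORECASE)
    return int(corpus["references_text"].fillna("").str.contains(pattern).sum())

titles["citation_count"] = titles["title"].apply(count_citing_pubs)
# Short or generic titles over-match, so false positives still have to be
# weeded out by hand before the counts are tallied.
titles.to_csv("citation_counts_raw.csv", index=False)
```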
Findings
The researchers initially targeted the print-preferred profile that delivered the highest volume of titles, the Bs (Philosophy). They identified several subdisciplines with low circulation and citation rates. A low circulation rate was gauged as 50% or more of a collection area having zero checkouts, and a low citation rate was gauged as one or fewer unique cited titles per area, or subdiscipline. Due to the much smaller amount of citation data, however, performance was mostly gauged by the relative counts in each subdiscipline, as no baseline metric like the 39% figure for circulation was established for citation counts. The researchers also assessed e-preferred areas, as Raynor Library still receives a large volume of print titles that meet subject-based criteria but are not published electronically in its e-preferred approval profiles. Print titles in these e-preferred areas had much lower citation rates than their print-preferred counterparts.
Action & Impact
The best approval areas to target for revision in this assessment project were profiles with low circulation and citation counts, and a high volume of print autoshipments, since revising these would have the greatest impact on the collection footprint. Additionally, given the results for e-preferred areas, one sustainable measure for the collection footprint may be to slip all print in these e-preferred areas, while maintaining print autoshipments in the few high-performing subdisciplines. All of these findings will be shared with the appropriate subject specialists to generate a conversation about possible approval profile revisions.
Practical Implications & Value
This assessment project’s main impact is the use of a process that lowers the largest barrier to citation analysis: time. Citation counting for bibliometric analysis has traditionally been carried out manually, and this R-based text mining tool allows for relatively efficient, high-volume citation analysis. While researchers in this context used it to assess print autoshipments for collection space purposes, one can imagine this approach being used in several other collection assessment contexts where citation data could augment other forms of usage data to help inform decision-making processes.
Our Future is in the Past — The Predictive Power of Consortial ILL Transaction Data
Steve Smith, University of Massachusetts Boston
View Slides (PDF)
Keywords: controlled digital lending (CDL), consortial analytics, Interlibrary loan usage & analysis, collection assessment, shared print programs/collective collections
Purpose & Goals
The primary goal of the Boston Library Consortium’s (BLC) Controlled Digital Lending (CDL) Data Analysis project was to identify a set of print book titles that could pre-seed a digital repository, so that interlibrary CDL transactions would not require scan-on-demand workflows. Rather, it would allow the project to focus on other aspects of the delivery of the files (library workflows, user experience, etc.). As such, we sought to identify titles having a high probability of being requested by BLC member libraries going forward. More broadly, this project resulted in a template and methodology for consortia and other non-integrated library networks for working with non-standardized (aka “dirty”) data, and for de-duping and enriching such data from multiple sources in order to gain insight into the shared use of collective holdings among these institutions.
Design & Methodology
An unusual challenge was integrating usage data from 13 BLC member libraries. We opted for a methodology using non-standardized ILL transaction data for the consortium, which required integrating data elements drawn from additional sources. Gathering and cleaning the data was a multistage process. As our goal was to produce a list of potential titles for a corpus, not delineate full bibliographic data for every title that was lent, there were several instances where we simply worked with “good enough” data rather than seeking the complete dataset. Even accepting that the data would be imperfect, it still required several rounds of cleanup during multiple stages of the process. We focused on titles that had been requested and filled between BLC libraries via RapidR during the last five years. Because libraries limited operations during the pandemic, data from March 2020 forward showed dramatically fewer transactions than the period of January 2018-March 2020, so we focused on titles that were requested and filled by BLC libraries in both 2018 and 2019. That dataset was then enriched using the OCLC API to return more complete bibliographic data, including year of publication, publisher, and LC class. We cleaned up the year of publication and used a list of publishers from Gobi to standardize publisher names and to mark whether each was a trade or university publisher. This dataset was then matched against 2021 and 2022 (to May 19) RapidR transactions to identify books that were requested in at least three of the past five years: that is, in both 2018 and 2019, and in either 2021 or 2022. This resulted in a corpus of 588 titles. Finally, we shared the OCLC numbers of those 588 titles with Gobi, who then provided information on e-book availability.
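As a minimal sketch of the selection rule just described (filled in both 2018 and 2019, and again in 2021 or 2022), assuming a cleaned, OCLC-enriched transaction file with invented column names:

```python
# Sketch of the corpus-selection rule: titles requested and filled in both
# 2018 and 2019, and in either 2021 or 2022. Column names are hypothetical.
import pandas as pd

tx = pd.read_csv("rapidr_transactions.csv")  # cleaned, de-duped ILL transactions
tx["year"] = pd.to_datetime(tx["request_date"]).dt.year

# Collect the set of years in which each title (by OCLC number) was filled.
years_by_title = tx.groupby("oclc_number")["year"].agg(set)

def qualifies(years: set) -> bool:
    return {2018, 2019} <= years and bool(years & {2021, 2022})

corpus = years_by_title[years_by_title.apply(qualifies)].index
print(f"{len(corpus)} titles qualify")  # the authors' run yielded 588
```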
Findings
When we received the complete annual RapidR data at the end of calendar year 2022, we decided to test the utility of the corpus by seeing how many titles were requested after the data cut-off date (May 19, 2022). Of the 588 titles, 129 were requested between May 20 and December 31, 2022, representing 22% of the corpus. Many of the titles had multiple requests, with the number of transactions in this seven-and-a-half-month period totaling 173 (more than 5 per week). As a comparison, we pulled a random sample of 615 titles from the 2018-2019 data: only 48 of these titles (8%) were requested between May 20 and December 31, 2022, demonstrating that our corpus performed significantly better than sheer chance. This was corroborated by running a similar comparison using the 2023 RapidR transaction data: 38% of the corpus had at least one loan during that period. While deemed a success, the project did raise questions that are nearly impossible to answer from the data in hand. For example, is it possible to establish when requests for different editions are meaningful? Does a patron have a rationale for requesting a particular version of Orwell’s 1984, or would they be satisfied with any copy of the text? Slightly easier, but still difficult to tease out, is establishing unique requests for a title: one patron submission can generate multiple requests if it is rejected by several lenders before being fulfilled, or several individuals may have submitted requests for a single title almost simultaneously. Telling one situation from the other, while perhaps possible, is far from straightforward.
Action & Impact
As the Boston Library Consortium’s CDL program enters its pilot phase, our project has confirmed the existence of a body of print titles distributed across the BLC membership that is routinely being physically shipped between campuses to meet patron demands. This is a base-level confirmation that CDL is a worthy venture for the Consortium. More significantly, it is an indication that maintaining a shared digital repository will be a positive component of the project, as it will reduce the need for scanning at the point of request, which would require more elaborate technologies and workflows, increased staffing, and longer delivery times. It also establishes that ebook availability of print holdings—with about half of the routinely requested corpus titles having a library-licensable ebook version—is an issue we should take under consideration. It has surfaced the need for member libraries to pursue whole-ebook ILL rights collectively and more systematically, as well as the need to incorporate functionality to route print requests to ebook alternatives (including purchase-on-demand options) as part of the development of any CDL systems.
Practical Implications & Value
Consortia and shared-print programs play an increasingly strategic role for libraries and their approaches to maintaining circulating collections. There is a growing need for tools and methodologies that support the analysis and assessment of these “collective collections,” whether among a set of institutions or by a single member of a group. The methodology we have developed can be extended to probe other questions and aid in collective decision-making. For example, the ebook availability data could be used to identify priorities for consortial purchasing. Or the subject matter of transacted titles could illuminate gaps in collecting and serve as a basis for adjusting acquisition policies in order to achieve more diverse holdings across a consortium; while this could be done using our dataset based on LC classification, such a project could also further enhance the dataset with subject headings to improve granularity. ILL data is limited and inconsistent, but it is also capable of providing a unique window on how collective collections are shared among the members of a consortium. Our methodology is proof that it can be a viable option for informal or spontaneous peer or assessment groups.
Using themes in area studies collection descriptions to assess strength
Ambra Gagliardi, University of Utah
View Slides (PDF)
Keywords: Area Studies, Collection Assessment, Thematic Analysis, Academic Libraries
Purpose & Goals
Research questions:
- What central themes and patterns are found in area studies collection descriptions from Title VI NRC proposal narratives?
- What can these themes and patterns tell us about how area studies collections develop?
- How can these themes and patterns help institutions assess the strength of their area studies collections and, thus, their impact on teaching, learning, and research?
Design & Methodology
For decades, the U.S. Department of Education has funded area studies centers and programs in higher education institutions nationwide via its Title VI grant programs. Beyond supporting interdisciplinary and language instruction/research, these programs have helped build remarkable area studies collections in academic libraries. The Department’s National Resource Center (NRC) grant program, in particular, has substantially impacted area studies collections. This presentation will share findings from a qualitative thematic analysis (Braun & Clarke) of text and image data from the “strength of the library” sections of NRC grant narratives, focusing on the East Asian/Pan-Asian category. These sections address the eligibility requirement for each institution to “maintain specialized library collections.” In these sections, institutions describe collection statistics, unique holdings, subject and language foci, cooperative agreements, budget allocations, acquisition initiatives, and staffing expertise. By exploring themes and patterns within these descriptions, we can learn how each institution defines its area studies collections, how collections develop over time, and how institutions assess the strength of their collections. This analysis aims to support library professionals charged with developing and managing area studies collections, which require unique selection and acquisition approaches. We can learn from each other and, hopefully, work more collaboratively to grow and preserve these truly interdisciplinary collections.
Conclusions
Libraries have traditionally used collection counts (e.g., the number of volumes by language) to assess a collection’s size (and implied strength). However, traditional collection counts are less impactful when determining the strength of an area studies collection. Initial findings suggest that detailed descriptions of subject coverage provide better insight into collection strengths. Themes that address this finding include:
- Interdisciplinary nature: East Asian/Pan-Asian collections are interdisciplinary. Materials span humanities, social science, science, and technology disciplines. This broad coverage makes it challenging for libraries to separate collection counts, budget allocations, and personnel specifically for Asian Studies from other disciplines.
- Descriptive subject coverage: East Asian/Pan-Asian library collection descriptions include details beyond statistical counts and budget allocations. These descriptive highlights stress the significance and strength of library collection holdings in a particular subject, language, people, or region. Collections housed beyond the library and managed by campus or community organizations are also highlighted.
Additional findings suggest that sharing materials and expertise between institutions is essential for area studies collections. Themes that address this finding include:
- Collective sharing: East Asian/Pan-Asian collections and expertise are shared among institutions locally, regionally, nationally, and internationally. Library networking and collaboration are documented through consortia memberships, resource sharing, outreach, cooperative acquisition programs, and open access.
Continued research is needed to examine themes and patterns across multiple Title VI NRC world region categories (e.g., Africa, Middle East, Latin America). This would help make the findings more generalizable. Furthermore, continued research into established systems for assessing collection strength (e.g., Library of Congress’s conspectus levels: https://www.loc.gov/acq/devpol/cpc.html) will help identify a useful system for all institutions to follow when determining the strength of their area studies collections for purposes such as Title VI NRC grant proposals.
Implications & Value
Understanding how academic libraries describe their area studies collections provides valuable insight into the multifaceted, interdisciplinary world of foreign language collection development. Many institutions nationwide receive U.S. Department of Education Title VI NRC grants. The unique nature of international/area studies collection development, lean library budgets, and limited staffing make supporting these centers challenging. A wealth of information has yet to be explored in the grant narratives that fund area studies centers. The themes and patterns within the grant narrative data can provide perspective and ideas for library professionals responsible for area studies collections. The use of reflexive thematic analysis as the qualitative method for this research will also be addressed in the presentation, which adds an extra layer of interest for librarians.
Learning objectives:
- At the end of this session, attendees will be able to discuss the relationship between Title VI NRC grant programs and area studies collections in academic libraries.
- At the end of this session, attendees will be able to identify at least three themes or patterns in East Asian/Pan-Asian collections from Title VI NRC grant-funded institutions.
- At the end of this session, attendees will experience a use case of reflexive thematic analysis as a qualitative method in collection development assessment.
The ReShare Paradigm — What can we achieve?
Karin Gilje, University of Pennsylvania
View Slides (PDF)
Keywords: Resource Sharing, Visualization, Data Management, Collaboration
Purpose & Goals
The Partnership for Academic Library Collaboration & Innovation (PALCI) and the Ivy Plus Library Confederation (IPLC) have moved to the ReShare platform by IndexData (in Fall 2021 and Winter 2022, respectively). Through close collaboration, IndexData has provided the University of Pennsylvania access to the data for import into our MetriDoc data warehouse, where deeper analyses can be performed through tools such as Microsoft Power BI. Rather than relying on simple counting statistics, we can evaluate the uniqueness of our collections and the efficacy of sharing within consortia. The new system challenges the assumptions stakeholders bring to calculating essential metrics, but the transparent nature of the data provides more opportunities to examine data in new and unique ways.
Design & Methodology
The ReShare interlibrary loan platform relies on a centralized PostgreSQL data store hosted by IndexData. IndexData has made the underlying code base open source, which increases the transparency of how resource sharing works. The Penn Libraries host MetriDoc, a Ruby on Rails user interface backed by a PostgreSQL database. IndexData provides a selection of its data tables to MetriDoc, giving Access Services direct access to the data. Dashboards were created within the user interface for quick lookups of key datapoints, and further analyses can be performed by our Resource Sharing colleagues through an interactive query system. Additionally, tools such as Power BI or Tableau can connect to the PostgreSQL database directly to create fully interactive data visualizations and reports.
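As an illustration of the kind of direct lookup this arrangement makes possible, a short Python sketch follows; the connection string, table name, and state value are hypothetical stand-ins, not the actual ReShare or MetriDoc schema:

```python
# Hypothetical direct query against a MetriDoc-style PostgreSQL store.
# Table and column names are invented for illustration.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://user:pass@metridoc-host/metridoc")  # placeholder DSN

monthly = pd.read_sql(
    """
    SELECT date_trunc('month', date_created) AS month,
           COUNT(*) AS filled_requests
    FROM patron_requests
    WHERE state = 'REQ_FILLED'
    GROUP BY 1
    ORDER BY 1
    """,
    engine,
)
print(monthly.tail())  # the same figures a Power BI or Tableau dashboard would surface
```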
Findings
As with any migration, there have been issues, but the collaboration between the Penn Libraries and the vendor, IndexData, has improved both our and their understanding of the data. Through our close interactions, we have developed a complete data model to track loans through the system. The data model is rich in collection, operational, and management information. We can use the data to analyze which parts of our collections are in high demand at other institutions and where our patrons may need to rely on our partners’ collections. The two consortia contract with shipping vendors to ensure items have minimal transit time, and the exit and entrance scan information helps us evaluate shipping performance. ReShare data also provides detailed information on the rota, or ordered list, of institutions asked for an item. This rota is developed for each item based on the current share of an institution’s resource sharing load. This information has never been available before and has opened new avenues for understanding the depth and breadth of the joint collections.
Action & Impact
We will continue to maintain and improve the MetriDoc service. Although the main part of this project is complete, we will be developing a series of workshops for our stakeholders to help them understand and leverage the data. We will also be sharing the research questions with the vendor, IndexData, to help them understand how the data is being utilized and continue our collaborative effort to provide answers to new questions.
Practical Implications & Value
Although the University of Pennsylvania hosts and maintains MetriDoc, it was initially developed as a tool for the IPLC. Completing the migration to the IndexData ReShare platform fulfills our contract with the IPLC BorrowDirect resource sharing program. However, we have gone beyond simply providing data to also providing in situ dashboards and querying capability. In addition, we have developed an excellent cooperative relationship with the vendor, IndexData.
Paper Session 3: Communication
Building Things to Break: Ambiguity as an Interface for Library Assessment
Joshua Herter, University of Winnipeg
Gabrielle Prefontaine, University of Winnipeg
View Slides (PDF)
Keywords: Data, Reporting, Annual Report, Strategy, Planning
Purpose & Goals: Modern library assessment programmes can be tricky endeavours. Annual reports, strategic plans, evidence-based frameworks; these are all products that – in our experience – offer a mixed return on investment when we have little control over the context in which they will be used. What numbers should be selected? Do they tell a good story? Will anyone read them? Will they be misinterpreted? This paper presentation will use a case study of library assessment at a mid-sized Canadian institution — namely, our journey in creating an annual report template — to position annual reporting as a flexible interface for meaningful engagement rather than a transactional measure of outcomes.
Design & Methodology: We will draw on our experiences to examine how traditional assessment methods have not always met our reporting needs and situate this story within contemporary issues in higher education strategic planning literature (notably Cooper, 2022; Borgman & Brand, 2022; Madsen & Hurst, 2018). Our conversation will offer dual perspectives from both a Library Dean facing decisions about what to report, and an Academic Librarian faced with operationalizing those decisions. This discussion will be centred around the latest iteration of our reporting strategy and attempts to develop an annual report for our library. Key to our approach will be the conversations, fears, crises, and humbling moments that facilitated our assessment journey, and how our conversations about it matter more than any product itself.
Conclusions: The purpose of having data on hand is not to compel others; it’s to help us define what we’re doing. Universities have problems to solve, and the library solves a lot of them – visibly and invisibly. Libraries facilitate numerous interactions between faculty, students, staff and researchers each day, the absence of which would have major impacts on the business of universities. Annual reporting can and should be more than sharing numbers, but the ambiguity inherent in many of our stories makes this a challenge. Rather than fight or ignore this friction, we use it as an interface that lets us have the conversations we need to meet the moment.
Implications & Value: Discourse about data is what leads to action, not the data itself. We believe assessment is a conversation and depends greatly on introspection, criticism and questioning. In sharing our journey, we hope to encourage others to name their feelings of discomfort or dissatisfaction with data generating activities and give this friction a voice. As a node for reaction, criticism, and conversation between administrators and librarians, our approach has opened doors to building more impactful, kinetic and meaningful ways to tell our campus story.
KnitBI: Stitching Together Library Data with Power BI
Cairo Sanders, University of Victoria
View Slides (PDF)
Keywords: Power BI, data visualization, data hub, change management, centralized dashboard, data management
Purpose & Goals
This presentation offers a case study in the development of a centralized data platform at the University of Victoria Libraries using Power BI, and insight into how other libraries can overcome their own data-management and change-management challenges toward better data that communicates our work and impact.
Design & Methodology
Assessment at the University of Victoria Libraries has historically been distributed across units, resulting in data about our operations, services, and users being collected, structured, and managed with an unruly number of tools. This mess of data led to miscommunication, duplicated efforts, and inefficient workflows, which ultimately did not allow us to leverage our data in meaningful ways. This is an all-too-common story: academic libraries share a “data all over the place” problem. A 2016 inter-institutional study by Hurst et al. found a general need for more centralized tools. Institutions that have succeeded in developing their own centralized business intelligence tools have found this central source led to insights into cost efficiencies, customer satisfaction, collection evaluations, and more (Zucca, 2010). Given the cost and complexity of developing and maintaining a custom data platform, we searched for out-of-the-box solutions to our data challenge. After testing reporting tools such as D3 and Tableau, we eventually selected and implemented Microsoft Power BI in 2022. Seeking to minimize disruption of established data collection practices, one reason we selected Power BI was its ability to extract existing data from a variety of storage locations and formats (Excel spreadsheets, APIs, SQL databases, SharePoint folders, etc.) into a single location with a unified user experience. This has also allowed us to make the switch one data source and subject at a time. The technical element was only half the battle; discovering how to implement change across library units without causing undue disruption was the longer process. This included inventorying our current data assets, tools, and storage locations across units; consulting with units to understand business needs; participating in the data analysis community across campus; partnering with user experience experts; and providing training to build up data literacy across the organization.
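Power BI performs this consolidation with its own connectors; purely to illustrate the “many sources, one model” pattern in code (every path, URL, and column name below is invented), a Python sketch might look like:

```python
# Illustration only: combining spreadsheet, API, and SQL sources into one
# reporting table, as Power BI's connectors do. All sources are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

gate_counts = pd.read_excel("gate_counts.xlsx")                   # unit-managed spreadsheet
chat_stats = pd.read_json("https://example.org/chat/stats.json")  # service API export
circ = pd.read_sql("SELECT month, count FROM circulation_summary",
                   create_engine("postgresql://user:pass@host/library"))

# Normalize each source to a shared shape (assumes each exposes month/count).
frames = []
for name, df in [("gates", gate_counts), ("chat", chat_stats), ("circulation", circ)]:
    df = df.rename(columns=str.lower)
    frames.append(df[["month", "count"]].assign(source=name))

combined = pd.concat(frames, ignore_index=True)  # one model for all reporting
```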
Conclusions
In practice, centralizing our data holdings has streamlined our routine statistical reporting (e.g., to CARL, ACRL, etc.) so that we can now focus our efforts on areas of greater impact. The tools we’ve developed have allowed us to harness our data in new ways and ask new questions to demonstrate our impact: for example, our recent work to analyze the library’s investment in Open Access and its partnerships in faculty research grants, areas that go beyond the traditional annual report and speak to the greater mandate of our institutions. However, with centralized assessment, this work can only be achieved through close partnerships with the data holders and users. We’ve learned that developing tools and visualizations that serve both the creator and end user requires: 1) understanding the data literacy level of your audience, 2) creating an overarching user experience, and 3) developing training in parallel to launching new tools. While securing buy-in to this new approach can prove challenging, our approach has been to start small by working with colleagues who have clearly articulated questions in mind and who understand the value of their data. Demonstrating successful collaborations in this way builds the trust and understanding across the organization on which further work can build.
Implications & Value
As higher education institutions face greater fiscal pressures, their libraries must respond by embracing a culture of data-driven decision making, which requires greater data literacy for all library staff. We’ve heard these talking points about this seemingly universal challenge before — so how can we move beyond platitudes and into action? Central data dashboards, in practice, can improve statistical reporting, decision making, and the communication of library impact. Our work with Power BI demonstrates that it is possible for libraries to move to a unified source of data without developing and maintaining custom applications or disrupting routines. All this to say, this is no small task: it requires investing in technical and data skills for staff, and, above all, someone with these skills who can develop tools, lead professional development, and liaise with subject experts.
Hurst, M., Madsen, C., Wilson, F., Smith, M., & Garrity, W.F. (2016). All Your Data Displayed in One Place: Preliminary Research and Planning for a Library Assessment Dashboard and Toolkit [Paper presentation]. Proceedings of the 2016 Library Assessment Conference, Arlington, Virginia. https://www.libraryassessment.org/wp-content/uploads/bm~doc/91-wilson-2016.pdf
Zucca, J. (2010). Data Farms or a Field of Dreams? [Paper presentation]. Proceedings of the 2010 Library Assessment Conference, Baltimore, Maryland. https://citeseerx.ist.psu.edu/document?repid=rep1&type=pdf&doi=5a9aaa7ff659187b90f29f48fa025774e720fbb2
Navigating the forest with tree diagrams to visualize library research support
Sarah Murphy, The Ohio State University
Draft Paper (PDF)
View Slides (PDF)
Keywords: Data visualization, organizational performance, open data
View “Navigating the forest” abstract
Purpose and goals
Libraries use data visualization to display bibliographic and library collection data, often selecting simple bar charts and line graphs to represent this data. This study uses the fan-chord diagram — one of several visualizations from the tree family — to examine the research interests and journal needs of faculty, post-doctoral scholars, graduate students, and research scientists at The Ohio State University. Tree visualizations effectively illustrate branching knowledge and can display the inter-relationships between grant funding, journal usage, research publications, and more. Filtered by local taxonomies, the fan-chord diagrams created for this project show, for various disciplines, a misalignment between the top journals researchers choose to publish their results in and the journal titles they reference. This misalignment is also present for journal titles citing the original published research. Used in tandem with more common visualizations, tree diagrams provide libraries with a more holistic view of journal use, helping to answer the simple question “to what extent does our collection meet our authors’ research needs?” (Morris 2022).
Design, methodology, or approach
A list of NIH (National Institutes of Health) funded projects by Ohio State University researchers was downloaded from NIH RePORTER, filtering for FY2010 to FY2022. Associated research publications published between 2018 and 2022 were then downloaded from the same database. Lists of referenced publications and citing publications were next generated for each research publication using the sample Python script provided for the NIH iCite database API. A second Python script was then written to gather the journal titles, journal abbreviations, authors, author affiliations, publication years, and major medical subject headings (MeSH) for each PMID on the research publications, referenced publications, and citing publications lists. A third script normalized author affiliations by assigning Scopus affiliation identifiers. Unique starting points, or nodes, were then assigned to journal titles on the research publication lists, and end points were assigned to journals on the referenced publications and citing publications lists using Tableau Prep. All lists were then related in the Tableau data source window, and directory-level data for local authors was added to facilitate meaningful filtering. Finally, to create curved lines between starting and ending points on the fan-chord diagrams, a simple Excel file with 100 points was added to the data model.
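To show what the reference/citation harvesting step can look like, here is a hedged Python sketch against the NIH iCite API. The endpoint is real, but the response field names (“references”, “cited_by”) and the example PMIDs are assumptions for illustration; this is a sketch of the approach, not the author’s actual script.

```python
# Hedged sketch: fetch reference and citation links for a batch of PMIDs
# from the NIH iCite API. The endpoint is real; the "references" and
# "cited_by" field names are assumptions based on iCite's documented
# output, and the PMIDs below are placeholders.
import requests

ICITE_URL = "https://icite.od.nih.gov/api/pubs"

def fetch_links(pmids):
    """Return {pmid: {"references": [...], "cited_by": [...]}}."""
    resp = requests.get(
        ICITE_URL,
        params={"pmids": ",".join(str(p) for p in pmids)},
        timeout=60,
    )
    resp.raise_for_status()
    return {
        rec["pmid"]: {
            "references": rec.get("references") or [],
            "cited_by": rec.get("cited_by") or [],
        }
        for rec in resp.json().get("data", [])
    }

# Usage: run over the research-publication PMIDs in manageable batches.
links = fetch_links([31452104, 32511222])  # placeholder PMIDs
```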
Findings
The resulting visualizations provide a colorful, interactive profile of journal usage for NIH-funded researchers in each academic unit at The Ohio State University. Filters allow users to choose an academic unit to display, along with up to 50 branches showing the top journals Ohio State researchers use to reference papers or the top journals citing Ohio State research. The top MeSH terms assigned to the referenced papers and the citing papers display in word clouds, and text appears under the department name summarizing the amount of NIH funding awarded to researchers affiliated with the unit, the number of NIH projects, the number of research publications associated with the funding, and the number of journals publishing the Ohio State-authored research. A bar chart shows the top journals publishing NIH-funded research authored by researchers in the selected unit, and lollipop charts show the top journals referenced by these researchers and citing this research. When filtered, the dashboards clearly indicate that, for various disciplines, the top journals publishing Ohio State researchers’ results are not in alignment with the journal titles Ohio State researchers reference or the journal titles citing this research.
Action & Impact
Present-day data visualization is grounded in mathematics, and the base formulas used to construct curves for fan-chord diagrams and other tree family visualizations may be applied to other, more advanced charts. Code libraries are available to create these visuals in Python and R, but these tools require time and effort to master, plus additional skills and knowledge to embed interactivity. Instructions documenting the steps and calculations required to build tree and chord diagrams in Tableau are freely available, and once built, these advanced visualizations can be combined with other visualizations to present a more coherent picture. Using raw data to develop local Tableau dashboards allows academic libraries to readily share and filter data using meaningful local taxonomies.
Ongoing requests from researchers for instructions outlining how to create tree family diagrams in Tableau, R, Python, and other tools indicate that interest in this type of visualization continues to grow. Learning the math required to construct the visualization for this project has helped the author teach others how to build it.
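To make the curve math concrete, the sketch below evaluates a quadratic Bézier curve over a 100-row densification table, the same arithmetic a Tableau calculated field performs against the 100-point file described in the methodology. The control-point formula is an illustrative assumption, not the paper’s exact calculation.

```python
# Sketch of the densification technique: a 100-row "points" table is
# joined to each start/end pair, and each row's parameter t evaluates a
# quadratic Bezier curve B(t) = (1-t)^2*P0 + 2(1-t)*t*P1 + t^2*P2.
# The control point P1 (midpoint pulled toward the origin) is an
# illustrative choice, not the paper's exact formula.
import numpy as np
import pandas as pd

def bezier_path(x0, y0, x2, y2, n=100, pull=0.5):
    """Return n points along a curved line from (x0, y0) to (x2, y2)."""
    t = np.linspace(0.0, 1.0, n)  # plays the role of the 100-point file
    x1 = (x0 + x2) / 2 * pull     # control point P1, pulled toward origin
    y1 = (y0 + y2) / 2 * pull
    x = (1 - t) ** 2 * x0 + 2 * (1 - t) * t * x1 + t ** 2 * x2
    y = (1 - t) ** 2 * y0 + 2 * (1 - t) * t * y1 + t ** 2 * y2
    return pd.DataFrame({"t": t, "x": x, "y": y})

# One curved branch from a publishing-journal node to a cited-journal node.
path = bezier_path(0.0, 0.0, 1.0, 0.4)
```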
Practical implications or value
Using alternative approaches to present data, academic librarians can better answer, at the discipline level, whether their collections meet the needs of campus researchers. Tree family visualizations offer a mechanism to display relationships and information flows. When used in tandem with other visualizations, tree diagrams enhance the presentation of assessment results, inspiring conversation and data-informed action.
Reflective storytelling in research and evaluation before, during, and after COVID-19
Martha Kyrillidou, QualityMetrics LLC
View Slides (PDF)
Keywords: COVID, LSTA Evaluations, Surveys, Focus Groups, Design studies, Indirect Cost Rate studies, Strategy, Improvement, Support Staff, Diversity, Equity, Inclusion
View “Reflective storytelling in research and evaluation” abstract
Purpose & Goals
The goal of this paper is to capture the challenges and the fun – philosophical and pragmatic – caused by the pandemic (COVID-19) disruption to the research and assessment projects that libraries and the author engaged in. The pandemic accelerated a number of trends, influenced how we do our work, and hastened the transformation taking place in libraries and in what our users want from them. Trends such as greater reliance on electronic resources, and open questions about the value of the physical infrastructure (physical spaces and artifacts, including books) and the trends affecting their utilization, are still with us. Disruption changed the way we assess and adjust our research methods, and this paper captures this unique moment in the history of libraries and library assessment using a reflective storytelling approach and examples of specific projects the author engaged with before, during, and after the COVID-19 pandemic.
Design & Methodology
Using autoreflective storytelling, the author describes a variety of efforts ranging from space planning projects to financial analysis to in-house and virtual surveys to statewide evaluations. We describe how we engaged in understanding user expectations around the renovation and establishment of student success centers (physical infrastructure, in relation to new buildings and renovations), and how this work took place while travel was restricted. We also discuss how we supported Library Services and Technology Act (LSTA) evaluations, as well as sponsored research support projects in research libraries and projects that seek to transform library operations and strategies. Throughout, the paper reflects on the audience, the intentionality of each project, and the positionality of the evaluator and/or researcher.
Findings
Managing and adjusting expectations was a constant during the pandemic. While some projects could extend their timelines and adopt more remote methods, others had to meet national-level deadlines with zero flexibility in target completion dates. In this paper we capture these nuances and highlight the urgency felt by the audience, the consumer of the end result of the consulting projects, evaluations, reports, and findings. We also focus on the intentions of the different projects and how these intentions shift under different conditions. For example, we discuss how LSTA evaluations were conducted before and after the pandemic; how indirect cost studies were disrupted and restarted; how space planning took place using remote design thinking approaches; and how strategies become transformative under different conditions, possibly for different purposes than originally initiated. How responsive do our research and evaluation methods need to be during times of crisis, and what can we do to continue to build interest and engagement with positive library outcomes, adjust activities that are not as impactful, and build stronger library organizations? A third group of findings highlights the positionality of the evaluator and how their own conception of disruption or normalcy affects the ways we approach projects. In that respect, the paper follows an autoethnographic reflective approach, discussing the author’s perception of the various projects and lessons learned, opportunities realized and opportunities missed, and articulating healthy, realistic, and resilient personal outcomes from these experiences.
Action & Impact
We showcase the choices we made in project implementations over the last few years, and the lack of choice in certain instances given external pressures. The impact of this paper is the recognition that we lived through a 21st-century global disaster and are still in recovery mode. The paper articulates the attitudes, thoughts, and actions that are positive during times of disruption and considers how we can augment their impact even in the depths of a global disaster, so that we can be more resilient and build organizations that also demonstrate resilience and sustainability.
Practical Implications & Value
This paper helps us improve, in a very practical way, our storytelling about library achievements, benchmarking, and transformations. It is an attempt to center the communities and libraries themselves, as well as people’s achievements and transformations enacted through and enabled by libraries. Overall, it reminds us that keeping in mind the audience, the intentionality of the evaluation or research project, and the positionality of the evaluator when designing user feedback methods is important, and can result in finding fun and engaging ways of exploring, understanding, sharing, and co-creating.
Learning Lab 1: Wonders of Data
Wonders of Data: Planning, Assessment, and Storytelling
Brooke Doyle, OCLC Research
Lynn Silipigni Connaway, PhD, OCLC Research
View Slides (PDF)
Handouts:
View “Wonders of Data” learning outcomes
Learning Outcomes
Apply the New Model Library framework, along with information about the challenges that 29 international library leaders faced and the changes in strategic goals they made during the pandemic.
Identify two important changes their institution has made in library services, collaboration, and engagement during the past 3 years.
Consider how assessment data can be used to tell an impactful story about their library’s strengths and opportunities.
Align their library’s programs and offerings with both the library’s and the institution’s strategic goals.
Learning Lab 2: Cultural Competence and Social Justice in Assessment
Cultural Competence and Social Justice in Assessment: Using the New Proficiencies to Initiate Learning Plans for You and Your Colleagues
Becky Croxton, Colorado State University Libraries
Megan Oakleaf, Syracuse University
Jung Mi Scoulas, University of Illinois Chicago
View Slides (PDF)
View “Cultural Competence and Social Justice in Assessment” learning outcomes
Learning Outcomes
Participants will be able to identify their learning gaps and opportunities for growth based on the Proficiencies for Assessment in Academic Libraries.
Participants will be able to create an assessment proficiencies learning plan template that can be customized and used for themselves or employed to plan training for colleagues.
2:25 p.m.–4:25 p.m. | Concurrent Session 2
Paper Session 4: Environmental Sustainability
Assessing Environmental Sustainability in Canadian University Libraries’ Strategic Plans
Paige Roman, McMaster University
Nicole Doro, McMaster University
Draft Paper (PDF)
View Slides (PDF)
Keywords: academic libraries, environmental sustainability, strategic planning, sustainability assessment
View “Assessing Environmental Sustainability” abstract
Purpose & Goals
Floods, famine, fires: fallout from the planet’s increasing temperature makes the climate crisis a reality that affects everyone (IPCC 2023) – libraries and librarians included. Academic libraries contribute to carbon emissions and waste through managing infrastructure, energy and water use, purchasing materials and resources, printing, and so on. While we are facing a dire situation, there is still plenty of room for action. Strategic plans are designed to provide direction and measurable goals, which are essential to systematically furthering sustainability on campus. By exploring the strategic plans of Canadian university libraries, our study provides an analysis of current strategic priorities and language that can be used to inform future strategic planning sustainability initiatives. The goal of this study is to explore the extent to which environmental sustainability is present in the strategic plans of Canadian university libraries, and to analyze how it is being included, when it is included at all. A review of the literature found no institution-level analysis of the presence of environmental sustainability (ES) in academic library strategic planning. Furthermore, much of the scholarly discourse on environmental sustainability in libraries is limited to surveys of participant perceptions or to communicating programming ideas for academic libraries. This study aims to synthesize and communicate what is currently being done at the strategic planning level of academic libraries in Canada.
Design & Methodology
University libraries were selected based on Canadian Research Knowledge Network (CRKN) institutional membership to ensure coverage from across Canada and to retrieve consistent full-time equivalent (FTE) data. To be included, the member had to be a university with an FTE exceeding 1,000 and have a publicly available strategic plan that expired no earlier than 2023. Out of 63 universities with an FTE exceeding 1,000, 30 had strategic plans meeting our criteria. These 30 libraries underwent screening for mentions of environmental sustainability, resulting in 20 being assessed using conceptual content analysis. Conceptual content analysis was used to analyze textual data by identifying key concepts and tracking their frequency of use. Analysis occurred at the word/word-sense level, being as specific as possible while leaving room for ambiguous uses of “sustainability.” While we did have a predetermined set of categories to code for, adopting a flexible coding methodology allowed for possible variations in words or word senses as coding progressed. Our nine coded words were: sustainab*, UN SDGs (United Nations Sustainable Development Goals), ecology, climate change/climate crisis, environment*, ecosystem, eco-friendly, green, and stewardship. Coding was performed for frequency, guided by established rules; for instance, terms like “sustainable” were coded under “sustainability,” while mentions of the UN SDGs were noted separately. To account for French strategic plans, the closest possible translation was used to allow comparison across languages. Surrounding words were recorded in a content box to keep track of associated uses. Additionally, we coded for the location of concepts: words found in the mission, value, or purpose sections of strategic plans counted as “guiding principle,” while concepts found in a goal or sub-goal counted towards “goals.” Both researchers coded by hand independently, and results were compared to determine consistency.
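As an illustration of the frequency-coding step (the authors coded by hand; this is not their instrument), a minimal sketch in Python might look like the following, with regular-expression patterns approximating the nine coded words:

```python
# Minimal sketch of word-level conceptual content analysis: count how
# often each coded concept appears in one strategic plan's text. The
# regular expressions approximate the paper's nine codes ("sustainab*"
# is a wildcard stem); this is an illustration, not the authors' tool.
import re
from collections import Counter

CODES = {
    "sustainab*": r"\bsustainab\w*",
    "UN SDGs": r"\bSDGs?\b|\bsustainable development goals\b",
    "ecology": r"\becolog\w*",
    "climate change/crisis": r"\bclimate (?:change|crisis)\b",
    "environment*": r"\benvironment\w*",
    "ecosystem": r"\becosystems?\b",
    "eco-friendly": r"\beco-friendly\b",
    "green": r"\bgreen\b",
    "stewardship": r"\bstewardship\b",
}

def code_frequencies(plan_text: str) -> Counter:
    """Tally occurrences of each coded concept in the plan text."""
    return Counter(
        {code: len(re.findall(pattern, plan_text, flags=re.IGNORECASE))
         for code, pattern in CODES.items()}
    )

counts = code_frequencies(
    "Our green, sustainable campus will act on the climate crisis ..."
)
```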
Findings
Of the 20 strategic plans analyzed, 12 (60%) had ES as a goal unto itself; 14 (70%) featured ES as a guiding principle; and only 3 (15%) integrated ES throughout the strategic planning document in multiple goals. There were 6 (30%) instances where ES was both a goal unto itself and a guiding principle of the strategic plan, and 2 (10%) instances where ES was a goal unto itself, integrated into parts of other goals, and a guiding principle of the strategic plan. Fourteen (70%) of the strategic plans mentioned coded words 1–4 times throughout the document, while 6 (30%) mentioned coded words anywhere from 5 to 15 times. The strategic plans with the highest volume of coded words (14 and 15 mentions) were the same documents where ES was a goal unto itself, integrated into parts of other goals, and a guiding principle; these strategic plans were the most explicit about sustainability. The strategic plans with fewer mentions of coded words (1–4) were also vaguer in the senses in which they used them. Seven (35%) strategic plans spoke exclusively about ES, while another 7 (35%) included a mix of environmental and broad sustainability language; 6 (30%) plans mentioned the coded words only in the broad sense. The categories “sustainab*,” “ecology,” “climate/climate change,” and “environment*” had the most associated words. The terms “crisis” (3) and “action” (9) were the most frequently used associated words and were exclusively paired with “ecology” and “climate/climate change.” The codes “sustainab*” and “environment*” had the widest array of associated words. The most frequently used code was “sustainab*” with 45 instances, reflecting both broad and environment-specific usages; the next two most frequently coded concepts were “environment*” with 18 instances and “climate” with 16.
Action & Impact
McMaster University Library (MUL), where we work, is one of the university libraries that has sustainability as a goal unto itself. After reviewing the possibilities and gathering inspiration from other strategic plans, we will advocate for sustainability (amongst other intersecting values such as equity, Indigeneity, and access) to be not only articulated as a goal unto itself but also better integrated throughout, and positioned as a guiding principle in, the crafting of strategic initiatives, so that this work is done not in a silo but across units and departments of the library in a holistic manner. MUL is entering a new round of strategic initiatives planning, and leadership has asked for our input as part of the Library Sustainability Committee regarding how we can integrate intersectional environmental sustainability in our plans; we plan to draw upon this work as evidence to support our advocacy. More broadly, we seek to expand the community of academic libraries doing work related to environmental sustainability—not only in the everyday work of academic librarians, but also as a more prominent thread of scholarly conversation in academic librarianship. We hope these findings will encourage academic libraries to consider implementing environmental sustainability in future iterations of their strategic plans, as the climate crisis is an intersectional issue that requires our collective efforts to protect against the worst possible outcomes.
Practical Implications & Value
During our literature review, we found very few publications documenting the assessment of sustainability (environmental or otherwise) in strategic plans (notable exceptions were Jones & Wong 2016, Missingham 2021, and Triblehorn 2023). We hope that environmental sustainability will be considered a priority in strategic planning for all libraries going forward, and formally documenting the ways in which this is currently being done may help to add to the scholarly discourse on the topic and guide the proliferation of its inclusion. One possible future direction of research stemming from this project would be evaluating annual reports from the included university libraries to measure the impact of the strategic plans that did have an environmentally sustainable focus. The findings of such a follow-up may help guide best practices for drafting strategic plans with environmental sustainability, as there may be some correlation between the degree to which sustainability was a guiding principle, a goal unto itself, or interwoven through the document, and the associated outcomes. Another possible avenue for exploration would be to compare the university libraries’ strategic plans to their broader institutional strategic plans to analyze any thematic throughlines or influence related to sustainability. A separate direction for future research would be to analyze the original dataset (i.e., the current strategic plan documents of Canadian university libraries) for other themes, such as mentions of decolonization and equity. To quote panelist Sarah Triblehorn at the 2024 ARL President’s Institute: “Policy isn’t the goal; policy exists to keep us accountable.” While planning is not technically direct action, strategic plans guide efforts on an institutional scale and make meaningful direct action possible, which is essential if we are to have hope in the face of the climate crisis.
(The Problem with) UN SDGs as a Measure of Sustainability in Academic Libraries, and an Exploration of Alternatives
Nicole Doro, McMaster University
Draft Paper (PDF)
View Slides (PDF)
Keywords: United Nations Sustainable Development Goals, academic libraries, sustainable assessment
View “(The Problem with) UN SDGs” abstract
Year over year, we continue to observe the impacts of the climate crisis worsen, from floods, fires (lest we forget the persistent air quality advisories from wildfire smoke across Canada and the United States in summer 2023), and storms, to climate refugees and climate anxiety. As Dr. Kimberly Nicholas (2021) so succinctly frames the climate crisis: “It’s warming. It’s us. We’re sure. It’s bad. We can fix it.” In order to fix it, we all have a role to play – especially academic libraries. All jobs are climate jobs: academic library professionals need to respond to the call to action by reflecting on their professional scope of influence to determine the ways in which we can have an impact and push our institutions towards meaningful change. In 2015, the United Nations adopted the 2030 Agenda for Sustainable Development, which centres 17 goals meant to simultaneously recognize and inspire action on targets related to health, education, inequality, economic growth, and climate change. These United Nations Sustainable Development Goals (UN SDGs) categorize and rank progress within and between countries based on the 17 goals. This paper seeks to address the question of how to assess sustainability in academic libraries. While the UN SDGs are among the most prominent frameworks for thinking about sustainability assessment in higher education, their inadequacy should give pause and prompt consideration of alternative measures of sustainability in academic libraries. I aim to point toward alternative forms of sustainability assessment that may be more productively deployed in the context of academic libraries and that address the gaps found in the UN SDGs, such as Indigenous methodologies, the Sustainability Tracking, Assessment & Rating System (STARS), the Sustainable Libraries Initiative (SLI), or true cost assessment.
Panel 1: Library Assessment in Canada
Library Assessment in Canada: A Cross Country Checkup
Mary-Jo Romaniuk, University of Calgary
Julie Morin, Canadian Association of Research Libraries
Justine Wheeler, University of Calgary
Shahira Khair, University of Victoria
Vivian Lewis, McMaster University Libraries
View Slides (PDF)
View “Library Assessment in Canada” description
Learning outcomes:
An exploration of current and future trends in library assessment, this session brings together Canadian academic library perspectives to examine our individual and collective capacity to succeed in evaluating and communicating library impact. Panelists will provide an update on several projects with national and international impact, including managing and re-imagining our core library statistics program; returning to user satisfaction surveys; and exploring innovative ways for measuring and describing impact.
Key topics:
Library assessment, trends, capacity, professional development.
Plan for attendee engagement:
Audience members will have the chance to react to and ask questions about project updates. Online polling tools will be used throughout the discussion portion to gather audience input on challenges and opportunities for building library capacity in their own institutions. Time will be reserved for audience reactions to discussions and for them to pose their own questions to panelists.
Panel 2: Academic Library Managers
Academic Library Managers: A Conversation about the Gaps and Opportunities for Professional Development
Kathleen Bell, George Mason University
Dani Brecher Cook, UC San Diego
Maoria Kirker, George Mason University
Julie Adamo, Middlebury College
Kim Nayyer, Cornell Law School
View Slides (PDF)
View “Academic Library Managers” description
Learning Outcomes
1. Attendees will identify gaps in professional development opportunities for academic library managers. 2. Attendees will describe potential opportunities for professional development for academic library managers. 3. Attendees will critique the state of management preparation in libraries.
Key Topics
Why did you become a manager and how well were you prepared for that role? How do library schools prepare (academic) library managers? What do you see as the most challenging gap in professional development for library managers? What assessment practices can be integrated into the field to better evaluate library managers? What unexplored methods might we consider when assessing library managers?
Plan for attendee engagement
The session will use Mentimeter to engage the audience throughout the session. At the start of the session, we will use the word cloud feature to gain the audience’s perspective on the skills needed by academic library managers in order to be successful. Throughout the panel discussions, we will use polling before new questions to gain the audience perspectives on a topic. For example, if we ask panelists to reflect on the gaps in their formal education related to managerial training, we will have a poll related to the formal learning audience members had in this area. Finally, we will leave at least 15-20 minutes for audience questions and answers.
Panel 3: Demonstrating Library Impact on Faculty Research
Demonstrating Library Impact on Faculty Research
Lisa Hinchliffe
Consuella Askew, Rutgers University Libraries
Hilary Craiglow, Attain Partners
Laura Gariepy, Virginia Commonwealth University
Martha Kyrillidou, QualityMetrics LLC
Scott Walter, San Diego State University
View “Demonstrating Library Impact on Faculty Research” description
Learning Outcomes
Session attendees will: Grapple with the complexities of documenting library impact on faculty research and scholarly activities. Understand the opportunities and limitations inherent in the Library Cost Study approach to demonstrating library impact. Identify additional strategies that might complement the approach taken in the Library Cost Study.
Key Topics
Demonstrating Library Impact—What has the library field accomplished in this area? Considering the typical tripartite mission of higher education (teaching, research, service; patient care is also captured where and when applicable), how mature are library efforts in addressing each of the areas? What have been the challenges in assessing impact on faculty research and scholarly activities? Library Cost Study—What is the Library Cost Study and how did it come to be? As a longstanding approach to documenting library use, what are the limitations and challenges libraries face with implementing it? How could it be updated to reflect contemporary library services and evaluation methodologies? Other Strategies—What other examples are there of documenting library impact on faculty research and scholarly activity? How can libraries partner with other units (e.g., Office for Research, Institutional Research, etc.) to obtain data that will help strengthen their demonstration of impact on faculty research? What opportunities do you see to enhance efforts in this area?
Plan for attendee engagement
Before the dialogue among panelists begins, attendees will be asked to reflect on their experiences with demonstrating the impact of the library on faculty research and scholarly activities. This activity allows for “settling in” and gaining focus on the topic at hand. After the initial panelist discussion, participants will be prompted to identify at least one take-away from the session thus far and at least one question they have. This pause technique ensures that everyone has the opportunity to formulate a question, not just those who are often first to the microphone. Attendees will then ask questions and Lisa will moderate with the panel. At least 15 minutes will be allocated to this dialogue to ensure high levels of participant engagement. To close the session, attendees will be asked to identify one “next step” for themselves with respect to documenting library impact on faculty research.
Panel 4: Ethics, Evaluation, AI, and Storytelling
Ethics, Evaluation, AI, and Storytelling: Data Lenses on Library Assessment
Ashley Sands, Institute of Museum and Library Services (IMLS)
Lisa Hinchliffe
Kate McDowell
Scott Young, Montana State University
Kyle Jones
View Slides (PDF)
View “Ethics, Evaluation, AI, and Storytelling” description
Learning Outcomes
Attendees will benefit both in theory and in practice. Attendees will come away with discrete toolkits, techniques, case studies, and curricula that are openly available and can be used to conduct assessment activities in contexts across libraries and archives. Additionally, attendees will join thoughtful panel conversations weighing ethical considerations across the development of assessment design. The audience will participate in nuanced discussions around the ethical uses of AI/ML, differences between “library assessment” and “library learning analytics” in terms of data collection, and the implications for privacy and ethics across these projects.
Key Topics
Panelists will be encouraged to delve into potential gray areas of library assessment and discuss the data collection and analytical methods that have enabled them to ethically provide data-based evidence to their stakeholders while balancing staff and patron privacy in line with the ALA Code of Ethics. In practice, how do these grant awardees balance, for example, “We provide the highest level of service to all library users” with “We protect each library user’s right to privacy and confidentiality with respect to information sought or received and resources consulted, borrowed, acquired or transmitted”? Although it had its own track at the recent LAC conference, we will also closely examine how diversity, equity, and inclusion can be prioritized and respected in library data collection and use.
Plan for attendee engagement
In addition to time set aside for Q&A, the session will make extensive use of Menti (https://www.menti.com/) or a similar live-polling tool. The session will begin with a couple of Menti demographic polls, enabling the panelists to better understand the level of expertise and experience of the participants in the room. Prior to each round of questions for the panelists, the attendees will be polled with a provoking question asking them to gauge their initial perspectives. This is intended to get the audience thinking about nuance and potential applications, not just theory, of the questions the panelists are about to address. Just prior to the Q&A, a Menti round will ask attendees to reflect on and share their top priorities related to at least one of the areas in the presentation. This is intended to enable participation from everyone in the room; even those who generally won’t raise their hand during a conference session will hopefully feel comfortable posting their question confidentially in writing for the panel to consider and address. At the close of the session, a final quantitative Menti poll will ask attendees whether they have learned about new tools, whether their perspectives on data and privacy have shifted as a result of the session, and whether they anticipate using these new tools or knowledge at their home institutions.
Common Read Discussion 1: Discussion of—The Data Detective
Discussion of: The Data Detective—Ten Easy Rules to Make Sense of Statistics
Anne Koenig, University of Pittsburgh
Jennifer Moon-Chung, University of Pittsburgh
View Slides (PDF)
Citation
Harford, T. (2022). The data detective: Ten easy rules to make sense of statistics. Riverhead Books.
View “Discussion of: The Data Detective” description
Learning Outcomes
After the session, attendees will be able to:
- Recognize the ways bias can enter into our statistics
- Evaluate the reliability of statistics
- Plan strategies to improve their analytical skills and presentations
Relevance to the assessment community
The themes presented in the book are useful to anyone involved in data collection, analysis, visualization, and interpretation, novice and expert alike, because they stress critical thinking and review easy methods to evaluate and improve what we do as assessment professionals. The stories and examples presented in the book are relatable, and some may even be familiar, but the author encourages us to take another look through a different lens. The book is approachable and engaging without being overly mathematical or technical, and it provides much food for thoughtful discussion and reflection. Furthermore, the book serves as an important reminder for assessment professionals to persist in learning and to actively combat data bias, which is essential for avoiding misleading data interpretations. This is particularly relevant given that datasets with identical summary statistics can look very different when visualized. The book advocates for thorough analysis and careful communication, highlighting its broader message.
Common Read Discussion 2: A Conversation on Critical Assessment—Where Are We Now?
A Conversation on Critical Assessment: Where are we now?
Maggie Faber, University of Washington Libraries
Jackie Belanger, University of Washington Libraries
View Slides (PDF)
https://www.inthelibrarywiththeleadpipe.org/2018/towards-critical-assessment-practice/
View “Common Read Discussion 2: A Conversation on Critical Assessment” description
Instructions to Participants
Please read the article at inthelibrarywiththeleadpipe.org/2018/towards-critical-assessment-practice and reflect on any ways your practices incorporate critical assessment principles or ideas.
Learning Outcomes
Our aim is to invite attendees to engage reflectively and critically in a nuanced discussion about the nature of power, bias, and positionality in library assessment work and to share experiences and lessons learned from putting these ideas into practice. Following this conversation, participants will be able to:
- Examine underlying assumptions and power structures in current assessment practices
- Reflect on the assessment practitioners’ own positionality and institutional context and the ways in which that shapes our work throughout the assessment cycle
- Explore other disciplines and alternative methodologies in order to critically consider ways of engaging user communities in assessment work
- Build community with other critical assessment practitioners within the wider assessment field
Relevance to the assessment community
In 2018, the authors presented the ideas from the proposed article in a Library Assessment Conference paper designed to highlight questions related to power structures, equity, and social justice in library assessment. We revisited these ideas in 2020, focusing on the practical challenges of implementation and holding that in tension with the desire for “quick fixes” for systemic issues. Throughout our presentations, we posed a number of questions to the audience with which we have grappled, including:
- How do our own identities, institutional positions, and perspectives shape our work?
- What is the purpose of the assessment, who decides what to assess, and who benefits from the work?
- Are there elements of our institutional contexts (e.g., an emphasis on a culture of accountability) that create tension with the values we try to bring to our work? How might a more critical approach transform these approaches to assessment?
- What are the histories and contexts of the methods we choose, and how do these shape our work? How can we take account of the histories and inequities of qualitative methods such as ethnography, even while these methods are often posited as an antidote to an overemphasis on quantitative assessment?
- What is considered “evidence” and who decides?
- Are we working in ways that enable power sharing and engagement with user communities at all stages of the process, from question formulation and data analysis, to decision-making?
We continue to grapple with these topics and questions, and we think they still warrant a deeper engagement from the library assessment community. While we have had the opportunity to facilitate discussions and activities on critical assessment at other conferences, the format change to LAC 2020 meant that we have not yet had the opportunity to engage with our broader assessment peers in a similarly open discussion. In the intervening time, Critical Assessment has taken on a life of its own, with an American Library Association profile on “Keeping up with…Critical Assessment” and important expansions and reframing, such as “Moving from Critical Assessment to Assessment as Care.” While we have seen a general shift to equity-informed assessment and recent revisions to the ACRL Assessment Proficiencies, which now include a greater focus on equity and social justice, the core question remains the same: how do we translate this into practice in our day-to-day work? We propose returning to these original questions from the perspective of practitioners who have been attempting to translate our inquiry and reflection into changes to our assessment activities, program, and engagement with institutional colleagues. We also hope to learn from participants who have adopted, adapted, and made these ideas their own. By revisiting this article at LAC, we hope to exchange challenges and lessons learned over the six years since we began this conversation.
Learning Lab 3: How A Value-Based Prioritization System Created Demonstrable Equity Work
How a value-based prioritization system created demonstrable equity work for Multnomah County Library’s projects and initiatives
Elizabeth O’Neill
Annie Lewis, Multnomah County Library
View Slides (PDF)
View “How A Value-Based Prioritization System Created Demonstrable Equity Work” description
Learning Outcomes
Understand the value of a systemwide team to lead project management and evaluation. Understand the development of a library-wide prioritization tool based on organizational values. Gain an understanding of a prioritization process with definitions, scores, weights, and a rater reliability process to ensure consistent scoring with executive teams. Learn how racial equity was centered in a library’s resource allocation in staffing, effort, and project prioritization. Determine how to apply a similar equity-driven prioritization tool to guide efforts in your organization.
Friday, November 8
8:45 a.m.–10:45 a.m. | Concurrent Session 3
Paper Session 5: Collaboration & Community Building
Consortium Complications: Scholars Portal’s Service Assessment Framework and the Evaluation of the Accessible Content ePortal
Sabina Pagotto, Scholars Portal | Ontario Council of University Libraries
Draft Paper (PDF)
View Slides (PDF)
Keywords: service assessment, consortium, multi-method, accessibility
View “Consortium Complications” abstract
Purpose & Goals
Assessing a service provided by a library consortium adds an extra layer of complication to any assessment project. Scholars Portal, the service arm of the Ontario Council of University Libraries, developed a framework to guide assessment projects for its services. This paper describes the first implementation of this framework. It was used to identify strengths, opportunities, and next steps for the alternate text format service, the Accessible Content ePortal (ACE). The decision to choose ACE as the first candidate for the assessment framework resulted from recent changes to the service and evolving accessibility legislation and standards.
Design & Methodology
The assessment framework builds on a previous Library Assessment Conference presentation from Scholars Portal, “Evaluating from Arm’s Length: Assessing Services Provided by a Library Consortium” (Pagotto, Barrett, and Pereyaslavska, 2016), which recommends taking a comprehensive view of consortial services, including stakeholder and community perspectives, and using both qualitative and quantitative methods. Within this framework, three methodologies were selected to assess the ACE service: a comparative analysis of similar services, a quality assurance review, and a survey of staff stakeholders.
Findings
This assessment project has enabled us to articulate the value of the ACE service, understand its place in the Canadian and International alternate format service landscape, and identify specific steps that would improve the quality of the service and reduce barriers to its usage. As the first project created within the Scholars Portal service assessment framework, it has also been an important vehicle to test that framework and its methodologies.
Action & Impact
Recommendations about enhancements that can be developed for the ACE service are currently being reviewed by the consortial governance bodies. Specific, actionable recommendations about the most effective ways to implement or manage the service locally have also been distributed to the local ACE coordinators at each consortial member library. Internally, Scholars Portal is working to revise the service assessment framework based on what we have learned from this first project. The framework, along with the findings of this assessment, will be shared publicly with our community.
Practical Implications & Value
Services provided at the consortial level are complex, involving a variety of different stakeholders and different institutional priorities across member libraries. The ACE service assessment project responds to both shared and individual library requirements to provide accessibility services to users, faculty and staff. By looking at both local and consortial needs, this project helps to validate the new service assessment framework. For Scholars Portal, introducing the service assessment framework represents a step forward in our plan to periodically assess our services comprehensively, transparently, and appropriately.
Local Taxonomies Supporting Campus Analysts: Integrating Research Impact and Data Visualization Services
Sarah Murphy, The Ohio State University
Additional Author:
Sheila Craft-Morgan, The Ohio State University
Draft Paper (PDF)
View Slides (PDF)
Keywords: Organizational Performance, Data Visualization, Research Impact, Institutional Research, Open Data
View “Local Taxonomies Supporting Campus Analysts” abstract
Purpose & Goals
Various campus entities use bibliographic data to 1) analyze and benchmark the institution’s position in relation to other universities, 2) understand collaborative networks, and 3) explore interdisciplinary connections. This data, however, is frequently not used in coordinated ways. Units independently harvest, repackage, and report it using their own procedures, definitions, and practices, leading to inconsistent results. Academic librarians are uniquely positioned to address this issue. As trainers and metadata experts, they already educate campus constituents to locate, gather, utilize, understand, and apply bibliographic tools and measures to assess research impact, both ethically and effectively. But stretched campus institutional research and research analysis units still invest a tremendous amount of time and human capital matching data collected from various public and proprietary sources to local discipline groupings, research programs, and other analysis units. Academic libraries can help these groups save time by using their data skills in concert with existing campus infrastructure to help facilitate consistent, timely reporting. This study asks how academic libraries can engage campus analysts and use existing taxonomies with unique identifiers to develop tools that save campus planners time and improve reporting consistency across units. The study offers new ways for envisioning the application of library assessment by showing how integrated library research impact and data visualization services can successfully position the library as a partner for campus organizational performance improvement while simultaneously helping researchers make scholarly work discoverable and administrators understand the breadth, scope, and success of the university’s various research programs.
Design & Methodology
Aware of an ongoing campus need to link funding data to publications, patents, and clinical trials, and to filter this data using local taxonomies, the authors developed a list of use cases to share with campus planners to surface mutual goals and guide the development of tools. Because the possibilities of this type of work are nearly infinite, the authors believed a prototype was needed to inspire discussion and focus the linking of bibliographic data to local systems. They initially developed a series of Tableau dashboards that integrate NIH RePORTER, iCite, and PubMed data. These dashboards not only raise awareness of the existence and richness of the NIH data but also show how to meaningfully model and visualize both grant and bibliographic data, and how to filter this data using local constructs. The prototype includes federal funding data for grants awarded to the university between 2018 and 2022, lists of research publications associated with this data from NIH RePORTER, corresponding reference and citation lists along with RCR values from iCite, and bibliographic information for the reference and citation lists from PubMed. To allow meaningful filtering, the data is linked to local systems using a simple join table that matches local identification numbers for each author with the research publication PMID and the Scopus author ID. Since most campus researchers do not have the requisite skills to harvest and clean author and location data from the messy affiliation field in PubMed, the join table helps campus planners normalize faculty names, as well as university college, department, and division assignments. They can then filter data using local directory information, including faculty ranks, discipline groupings, research programs, buildings, and more.
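A simplified sketch of this join-table pattern follows; all column names and values are hypothetical placeholders, since the paper does not publish the table’s schema.

```python
# Sketch of the join-table pattern: a small table maps each local author
# ID to a publication PMID and a Scopus author ID, letting campus
# directory attributes filter harvested bibliographic data. All column
# names and values are hypothetical placeholders.
import pandas as pd

# Harvested from NIH RePORTER / PubMed (illustrative rows).
pubs = pd.DataFrame(
    {"pmid": [101, 102], "journal": ["J Clin Invest", "PLOS ONE"]}
)

# The join table: local author ID <-> PMID <-> Scopus author ID.
join_table = pd.DataFrame({
    "pmid": [101, 102],
    "local_author_id": ["osu0001", "osu0002"],
    "scopus_author_id": ["57190000001", "57190000002"],
})

# Campus directory with local taxonomies (ranks, departments, programs).
directory = pd.DataFrame({
    "local_author_id": ["osu0001", "osu0002"],
    "department": ["Biomedical Informatics", "Public Health"],
    "rank": ["Associate Professor", "Assistant Professor"],
})

# Link publications to local constructs, then filter with local terms.
linked = pubs.merge(join_table, on="pmid").merge(directory, on="local_author_id")
public_health = linked[linked["department"] == "Public Health"]
```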
Conclusions
The prototype is now complete, and the authors are using it, along with the use cases, to engage various constituents across campus. The dashboards include embedded instructions for using the views and document the sources from which the data was assembled. A data dictionary is also available to facilitate a shared understanding of how each field is defined. When ready, the authors plan to mount the join table to Tableau Server and begin educating campus partners on how to apply this table to their own projects to consistently filter data using meaningful local taxonomies. We anticipate that these meetings will raise additional questions about how research impact and data visualization services can broadly support organizational performance measurement. We will also work with partners to develop future tables linking local data to other bibliographic data sources to facilitate benchmarking and comparisons.
Implications & Value
As research impact and data visualization services evolve, librarians in these roles have an opportunity to work with campus partners to support the university’s organizational performance measurement. This study builds on the library’s existing reputation for providing quality bibliographic research and instruction by creating new services that leverage librarian expertise to save researchers time and improve the quality of their analyses. It also provides an example of creating a cost-effective data source by harvesting public data and combining it with local datasets, an approach that can be replicated by other institutions.
What’s on your website?: Findings from a Nationwide Inventory of Basic Needs Services on Library Websites
Sindy Lopez, Ithaka S+R
Sage J. Love, Ithaka S+R
Additional Author:
Melissa Blankstein
View Slides (PDF)
Keywords: Academic Libraries, Public Libraries, Collaborations, Basic Needs, Community Colleges
View “What’s on your website?” abstract
Purpose & Goals
With funding from the ECMC Foundation, Ithaka S+R launched the Maximizing Public-Academic Library Partnerships research initiative, examining opportunities for collaboration between academic and public libraries and how they can best support basic needs and holistic student success. The overall goal of the project is to provide practical recommendations and tools that build and strengthen connections that can improve basic needs resources and services for both students and the broader community. This paper discusses the first phase of the project, specifically how libraries advertise basic needs resources and services on their websites and our approach to systematic data collection across varying sources of web-based information. We leveraged a qualitative inventory methodology developed by Ithaka S+R that employs a systematic approach to searching library websites using a defined set of search categories. By exploring how basic needs information is advertised, we aim to understand how libraries can partner to build capacity to serve students and communities facing basic needs challenges.
Design & Methodology
We conducted a qualitative inventory analysis of 100 randomly selected academic institutions and their corresponding public library systems, using Qualtrics to document the presence or absence of specific variables related to basic needs resources and services. We also captured information on whether current library partnerships existed and whether basic needs information was listed in LibGuides and calendars of events. The basic needs categories examined included technology, housing, food, mental health, physical health, financial aid, transportation, and childcare services. Information was also captured on resources and services focused on particular subpopulations (e.g., veterans, immigrants and refugees, formerly incarcerated people). Overall, this inventory aimed to understand the landscape of basic needs resources and services within academic and public library settings and whether formalized library partnerships exist to provide these types of services to address both student and community needs.
Findings
Findings from this inventory include insights on the advertisement of basic needs services and collaborations at public and academic libraries. Our analysis revealed differences in how information was communicated across library types, geographical locations, and other institutional markers. We also observed that public libraries, driven by their public missions, tend to offer a more comprehensive suite of basic needs services, while academic libraries focus on student-centric areas such as technology, career assistance, and mental health support. Although formal collaborations between different library sectors appear to be limited, we anticipate identifying areas of development that can enrich collaboration between libraries, particularly around joint access and programming. We also anticipate uncovering more about informal partnerships between public and academic libraries in later project phases.
Action & Impact
We intend to publish a publicly available report outlining our findings on the current information gaps that exist in advertising basic needs services to students and communities. By highlighting areas where information may be lacking on websites, this research can enable librarians and library staff to better organize information based on the specific needs of their patrons. Additionally, we will leverage the insights from our inventory analysis to support other project phases, including identifying suitable candidates for case studies and conducting state policy analysis. The larger goal of this project is to use these initial findings to inform discussions between public and academic libraries on how to foster formal collaborations and capacity building to serve the basic needs of students and their communities. We hope to identify strategies employed by libraries in advertising basic needs services to create a knowledge sharing network where librarians and library staff can learn from each other and adapt these practices to their own institutions.
Practical Implications & Value
Community members can act on this research by reviewing the findings to pinpoint potential information gaps within their own libraries and partnerships. Specifically, this study highlights areas where libraries might fall short in communicating the basic needs services available. This can prompt librarians and staff to consider whether all their services are adequately represented on their websites. This can also help shift dissemination strategies that ensure relevant information reaches students and patrons alike. This research also makes a contribution to the overall body of work in library assessment by introducing new methods for content analysis, particularly in manual data collection and qualitative data capture from websites. This systematic approach to website analysis offers a framework for qualitative assessments in library settings when there are no other means to gather specific and dispersed data.
Paper Session 6: Spaces
Building Equitable User Research Methodologies: Lessons and Guidance from the Installation of a Sensory Space
Harini Kannan, New York University Libraries
Lauren Kehoe, New York University
View Slides (PDF)
Keywords: library assessment, user research, academic libraries, sensory space, neurodiversity, student success, inclusion, accessibility
View “Building Equitable User Research Methodologies” abstract
Purpose & Goals
A sensory space is a comfortable and welcoming area within any larger space that provides a calming and soothing environment for those with a variety of sensory input needs. This space allows users to center themselves in a way that is not often afforded to them. In partnership with our university’s student accessibility center, our libraries applied for and received grant funding to install a sensory space on the first floor of our main campus library. In this paper, which will be published as a chapter in a forthcoming book from ACRL, we detail the multistage user research project that became the foundation from which the sensory space was designed and developed. In particular, our paper addresses the questions: What do neurodiverse students with sensory processing characteristics need from a sensory space, and why? How do we develop a user research project informed by frameworks of critical disability studies? How do we challenge the tokenization and exploitation of user research participants from marginalized backgrounds? And how do we meaningfully incorporate our research findings such that they have material impact?
Design & Methodology
To answer these questions, that is, to build an accessible and ethical user research project, we first drew upon multiple theoretical frameworks. These included the social model of disability, born of critical disability studies; the framework of “nothing about us without us,” from the disability civil rights movement; and Mia Mingus’s theories of collective access from within the disability justice movement. This theoretical grounding informed the precise research questions we developed, our strategies for participant recruitment, and the values with which we wanted to lead our interactions with students. Our research methodologies were informed not only by the aforementioned frameworks but also by Universal Design for Learning (UDL) principles: multiple means of engagement, multiple means of representation, and multiple means of action and expression. Embodying these principles led us to design research methodologies that enabled verbal and non-verbal communication, synchronous and asynchronous participation, community or individual interactions, and so forth. Primarily, we utilized short and long surveys, Zoom interviews, facilitated focus groups, and asynchronous Google Drive submissions based on a facilitation guide. While our methodologies yielded a large amount of data, we used manual qualitative analysis methods such as coding by hand and thematic analysis to synthesize the various outputs of data collected.
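For readers unfamiliar with the mechanics of hand coding, the minimal sketch below shows one way coded excerpts can be tallied into broader reported categories. The excerpts, codes, and data structure are hypothetical illustrations only, not the authors’ actual workflow; only the theme labels are drawn from the categories reported in the findings.

```python
# Minimal sketch of tallying hand-assigned codes into broader themes.
# The excerpts and codes are hypothetical; only the theme labels come
# from the paper's reported categories. This illustrates the simple
# bookkeeping step of thematic analysis, not the authors' workflow.
from collections import Counter

# (participant excerpt, analyst-assigned code) pairs from hand coding
coded_excerpts = [
    ("I need somewhere dim and quiet", "low lighting"),
    ("headphones help me reset", "sensory aids"),
    ("I want to know who else uses the room", "community"),
    ("soft chairs I can move around", "modular furniture"),
]

# analyst-defined mapping of codes to larger categories
theme_of = {
    "low lighting": "comfort and safety",
    "sensory aids": "sensory management and aids",
    "community": "community building",
    "modular furniture": "furniture needs & spatial orientation",
}

theme_counts = Counter(theme_of[code] for _, code in coded_excerpts)
for theme, n in theme_counts.most_common():
    print(f"{theme}: {n} coded excerpt(s)")
```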
Findings
Our findings fell into the following large categories: expectations for any sensory space; inclusion, belonging, & visibility; comfort and safety; community building; preferences and triggers as they relate to the five senses; sensory management and aids; furniture needs & spatial orientation; health privacy; and rules & community guidelines. Within each category lie major themes and insights that give us clear guidance on the material and social design of the sensory space. From the findings, material design considerations include: variations of furniture to enable modular body positions; demarcation of the space to allow for both focused studying and relaxation; building out tranquility toolkits; clear guidelines on noise, visual, and olfactory stimulation in the space; and more. Social design considerations, based on the findings, include: community guidelines for addressing conflict; events and programming to build collective identity; and addressing issues related to health information disclosure. These are just a handful of the many findings that emerged from our research.
Action & Impact
Our findings directly impacted the next steps of the sensory space design, installation, policy design, and outreach. With space design, we used our findings to ensure that furniture was arranged to allow for varied seating options and ease of mobility when navigating a smaller space. In terms of installation, the current space we’re working with is geared toward a sensory relaxation environment, but students voiced the need for spaces for sensory stimulation and experimentation, especially by themselves. We’ve decided to renovate two individual study rooms as sensory spaces, equipped for students to customize their sensory experience. The findings reaffirmed the need for sensory kit lending for on-demand use, to allow for expanded sensory soothing beyond designated spaces. We will build out digital and physical sensory kits, and work with in-person service providers to develop workflows for loaning out the latter in the library. For policy design, we leveraged our findings to convince our campus partners to expand access to the space to students identifying as neurodiverse broadly, not just students who identified as autistic, for whom the sensory space was originally conceptualized. This is a direct result of our role in challenging the medical model of disability, which limits access to resources. More broadly, we can now more accurately advocate for the needs of neurodiverse students with campus partners and push for the installation of sensory spaces outside of the libraries. In terms of guidelines, we created a set of community guidelines to set expectations for the space, provide guidance on navigating conflict, and more. These guidelines have been sent back to students for additional input, solidifying a community-based feedback process. These are some of the many direct and material applications of our research, highlighting the various ways to implement findings depending on budget, staff capacity, and time.
Practical Implications & Value
Most commonly, user research is used to design responsive technology and digital products, and this shapes much of the collective knowledge and practice in libraries. However, we strongly believe that user research must be shaped and applied beyond its conventional uses in the field, given that the library is a physical, social, and political space in addition to a digital one. While this user research project was applied to the installation of a physical sensory space, we believe our research design process, methodologies, and execution can provide guidance for user research practitioners who are looking to be in community with neurodiverse students. Our work falls in line with the burgeoning field of Critical UX studies within Library and Information Sciences, which challenges the politically neutral assumptions made within user experience and assessment practices. For UX practitioners looking to develop their critical lenses, our paper presentation will share honest and transparent lessons learned from an inclusive approach to user research.
If You Rebuild It, Will They Still Come? Evidence-Based Library Space Planning, Post-COVID
Cynthia Kane, Emporia State University
Alexander Mosakowski, Emporia State University
Draft Paper (PDF)
View Slides (PDF)
Keywords: COVID-19, academic library assessment, student perceptions, library space utilization, mixed-methods
View “If You Rebuild It, Will They Still Come?” abstract
Purpose & Goals
This paper addresses the issue of college students’ utilization of an academic library building space for individual and group study and related activities after the impact of COVID-19. In the 2018–19 academic year, elements of ethnographic research were employed by librarians at Emporia State University, Emporia, Kansas, to learn more about the use of the William Allen White Library building in terms of its physical space and to obtain quantitative and qualitative evidence for future library building projects. The COVID-19 pandemic in early 2020 negatively affected the numbers of patrons using the physical library building for several years, and it has only been in the 2023–24 academic year that the WAW Library has seen a gate-count increase in patrons entering and exiting the building. In addition, the Kansas Board of Regents adopted in June 2021 a policy framework for its universities, including Emporia State University, to report annually on campus building space utilization efficiencies. As a result, the initial 2018–19 study was replicated in 2023–24 by librarians in order to identify possible changes over the last five years in students’ library space preferences for research, recreation, and other aspects of academic life.
Design & Methodology
The 2023–24 project employed a mixed-methods approach that replicated much of the original research design while adding one element. The study received ESU Institutional Review Board approval in September 2023. In Summer 2023, the co-researchers conducted a literature review of published library and information science journal articles about library building space research following the COVID-19 pandemic. In Fall 2023, an online survey was provided to ESU students to establish their familiarity with and use of the library’s four floors, decks, and the 24/7 Library Learning Commons. The survey contained the same questions as the online survey in the first study. There were 324 respondents to the second survey, 76 more (an increase of roughly 31%) than the 248 respondents to the first study. Individual interviews and focus groups of students were conducted in Fall 2023 and Spring 2024 to gain insight into those students’ use or non-use of the library building. The new element in the 2023–24 study was a face-to-face focus group that participated in an adapted design charrette of library floor models to offer collaborative suggestions for rearrangements of furniture, tables, and other movable items. In February 2024, questions on whiteboards on the four floors and in the Library Learning Commons were rotated weekly, asking students about furniture, lighting, noise levels, and other factors related to their decisions to use those spaces. Finally, unobtrusive observations were conducted in Spring 2024 in two-week intervals at assigned times on the floors and in the Learning Commons. Floor plans were employed to map patrons’ preferred gathering spots in these areas. The Learning Commons and 1st floor, both open to ESU students as 24/7 spaces, were also checked at midnight during the unobtrusive observations to see how those spaces were used by students after the rest of the library building was closed.
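As one illustration of how unobtrusive observation counts of this kind can be aggregated, the sketch below tallies hypothetical patron counts by location and time slot. The locations, slots, and numbers are invented; the study itself mapped gathering spots on floor plans.

```python
# Hedged sketch: aggregating unobtrusive observation counts by location
# and time slot to see where and when patrons gather. All locations,
# time slots, and counts below are hypothetical illustrations.
from collections import defaultdict

observations = [  # (location, time slot, patrons counted)
    ("Learning Commons", "midnight", 14),
    ("1st floor", "midnight", 6),
    ("2nd floor", "4 p.m.", 22),
    ("Learning Commons", "4 p.m.", 31),
]

totals = defaultdict(int)
for location, slot, count in observations:
    totals[(location, slot)] += count

for (location, slot), n in sorted(totals.items()):
    print(f"{location} @ {slot}: {n} patrons observed")
```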
Findings
Preliminary qualitative and quantitative data indicate that ESU students perceive the library building as a quiet and relaxing place for study and research. Students frequently indicated that they value the physical surroundings of the floors, which reinforce their focus on classwork, in contrast to the distractions of their apartments or residence hall rooms. The survey and charrette activity revealed that students prefer movable furniture, such as small tables and chairs, that they can rearrange to create their own customized study spaces. Input from students who live locally in Emporia but do not use the library building showed that these students tend to stay where they are once their face-to-face classes are finished and they have returned to their homes. A significant finding, similar to the 2018–19 study, is that student activity in the library building increases in the late afternoons and early evenings, but this activity is not necessarily correlated with the need for research assistance. Instead, students are seeking a place for individual and group work, and the 24/7 areas of the Learning Commons and the 1st floor remain popular as a result.
Action & Impact
The co-researchers will compare all results of the 2023–24 project to data from the 2018–19 project in order to evaluate longitudinal changes. Next, they will collaborate with the librarians and the Dean of the ESU Libraries and Archives to determine priorities for updates and rearrangements in furniture, lighting, and other easily customizable items on the library floors and in the Learning Commons. The Dean will also share the study’s findings and conclusions with administrators such as the Provost and Vice President of Academic Affairs, the Vice President of Student Life, the Vice President for Enrollment Services, and the Director of Facility Planning for long-term consideration of library building remodeling.
Practical Implications & Value
In a time of expanded competition among colleges and universities for residential student enrollment, the academic library is also challenged to demonstrate its value to, and impact upon, student enrollment, recruitment, and retention. This research, especially since it encompasses quantitative and qualitative evidence from students pre- and post-COVID, contributes to the library community’s greater understanding of student attitudes and beliefs about physical library building spaces. In addition, the Kansas Board of Regents’ policy framework concerning campus building space utilization efficiencies is not unique to this state; both public and private universities are increasingly asked to justify uses and expenses in physical building maintenance. Library assessment research that draws upon students’ input can do much to illustrate to administrators the positive influence of the academic library upon student success.
Library Space Summit: Our vehicle to a participatory approach, expanding facility renovation planning from one campus to twenty
Steve Borrelli, Penn State University Libraries
Robin Tate, Penn State University Libraries
View Slides (PDF)
Keywords: Facility master planning, space, renovation, participatory
View “Library Space Summit” abstract
Purpose & Goals
As our 10-year Master Facilities Plan for the University Park Campus Libraries had run its course, the Space Steering Committee was tasked with developing a list of prioritized library facility enhancements across Penn State’s 20 physical campuses to leverage when working with library donors. This paper describes a participatory approach that engaged all library personnel across the 20 campuses through a survey, a full-day Space Summit for middle managers, and a guided discussion with senior administrators, culminating in an updated list of library facility enhancement priorities expanded to all 20 campuses.
Design & Methodology
The Space Summit was designed to engage middle managers from over 20 campuses and 60 departments and branches across the university libraries in developing proposals for public space facility enhancements across library locations. The effort began with a library-wide survey inviting all library personnel to share ideas for public space enhancements. The resulting 154 submissions were used to develop approximately 80 location-specific worksheets summarizing the enhancement ideas submitted. On the day of the Summit, participants engaged in individual and group activities throughout a full day of programming: middle managers first worked individually to develop public space proposals informed by the worksheets summarizing submissions from personnel at the locations they represented, then worked in small groups to refine their ideas into big, bold public space facility enhancement proposals. Following the Summit, the Space Steering Committee conducted a prioritization exercise, guided by principles derived from the survey responses, to identify proposals to put forward for administrative review. Senior administrators were then led through a guided discussion to identify the final list of facility enhancements to leverage with donors when opportunities arise.
Findings
The 154 survey responses were used by middle managers to develop approximately 100 proposals for public space refinements across 20 campuses. The Space Steering Committee’s prioritization exercise, which applied guiding principles for facility enhancements informed by the library-wide survey, identified 17 proposals to put forward for senior administrator review. The guided discussion with senior administrators resulted in a refined list of 13 public space facility enhancement proposals and identified additional desired enhancements to pursue regardless of donor support.
Action & Impact
The list of facility enhancements has been adopted for use by senior administrators, and additional enhancement ideas are in hand to capitalize on should opportunities arise.
Practical Implications & Value
Opportunities for donor-funded facility enhancements are increasingly rare, and being prepared when they arise is good practice. Senior administrators are now armed with a list of over a dozen facility enhancement proposals, developed by the personnel working in those spaces, for consideration when donors express interest.
Shall We Start at the North Entrance? Library Space and an Inclusivity Mindset
Susanna Cowan, University of Connecticut
Draft Paper (PDF)
View Slides (PDF)
Keywords: DEI, space assessment, inclusivity, practice
View “Shall We Start at the North Entrance?” abstract
Purpose & Goals
This paper arises from the design and implementation of a DEI-inflected space audit, although that audit is not the focus of this presentation (I will submit it separately as a poster proposal). The process of designing that audit generated the question: how do libraries, and academic libraries in particular, incorporate principles of diversity, equity, and inclusion (DEI) into library space work at its most fundamental level? There has been a surge in assessment work focused on asking whether library spaces meet the needs of patrons when issues of inclusivity are centered. Of particular significance, I think, is the way that asking how inclusive libraries are has expanded what it means for a library to be accessible. In addition to approaching spaces with checklists aimed at ensuring physical access, libraries are now asking questions about less easily measured but equally important criteria such as welcoming-ness and representation (see, for example, Elteto, Jackson, and Lim, 2008, “Is the Library a ‘Welcoming Space’?”). The question propelling this paper is not how we design an assessment to answer questions about how inclusive library spaces are or are not. The question I hope to explore is how we re-understand what it means to caretake, renovate, and design library spaces from a DEI-grounded mindset. “Library as place” as a construct focuses on the library as experienced by its users, but academic libraries are both “library as place” (the construct) and complex facilities requiring constant evaluation and maintenance. What remains unanswered, I think, is to what degree our day-to-day “library space work” can be grounded in inclusive practices and mindsets. And how would we go about doing that? Where would we start? How can we move from inclusive space as a project to space inclusivity as a practice?
Design & Methodology
Although this conceptual paper will not use any one assessment project as its focus, it might be thought of as a paper that explores how we can establish theoretically grounded but practice-focused frameworks that move inclusion-focused space work from checklists and design/redesign projects to “all the time” mindsets. The reason I am proposing this as a conceptual paper and not a “what we did” one is in part that I want to address the many challenges in this work. Reframing design, remodeling, planning, maintenance, and other space work in terms of an imperative to uphold DEI principles in everything we do requires that we let go of mindsets regarding library spaces that we have long understood as neutral and unimplicated by DEI concerns. As colleagues have expressed to me, in space work we often use compliance as our measure for accessibility. But compliance alone (with, for example, ADA statutes) restricts us by setting the bar at legal requirements, as if laws have ever been fully inclusive or future-facing. It is easy (or at least simple) to grab a ruler and measure the distance between one’s stacks. It is several orders of magnitude more difficult to start asking: Does this go far enough? Who does this not help? What do physical measures miss completely? Are we missing the forest for the trees (or the trees for the forest)? In the 20 minutes I have, I will present an overview of how DEI has changed how we assess library spaces, but I will also demonstrate how our space “toolkits” haven’t yet caught up to our newer understandings of how space impacts our users. Finally, I will offer a place to begin (maybe): a reflective space audit.
Conclusions
I am still developing my thoughts on this as I continue to research this and related topics. So far, my conclusions amount to this: it is difficult to find “space rubrics” for libraries that go beyond ADA-focused criteria. Although there are certainly space-focused assessment projects and design work that have centered DEI, I have not found much that addresses how inclusive practices move beyond special project work (a particular design/redesign project) to re-ground library space and facilities practices. I am especially interested in how we begin to evaluate our spaces in terms of affective criteria. Research is telling us that the facilities many of us work in are in fundamental ways tied to traditions of higher education rooted in exclusion. Where do we start unpacking this in relation to the spaces we have and our practices around them? Measuring stakeholder perceptions will be essential, but how should we first reflect on our spaces and our relationship to them before we start asking stakeholders questions?
Implications & Value
Libraries are constantly undertaking design/redesign projects, furniture upgrades, service-point remodeling, remediation, and other space-focused work. Increasingly, I imagine that such work will intersect with new and reinvigorated commitments to have inclusivity reflected not only in our new and existing spaces but also in the way we evaluate, maintain, and safeguard them. We have clear rubrics for measuring how well our spaces accommodate physical disability, and to some degree non-physical needs. But we need to begin considering how “library space” is both a matter of row and door widths and an idea that encompasses how well we invite people through our doors and make them feel welcome while they are there. Thinking in those terms will require us to equip ourselves with information about our spaces and how we conceptualize them more holistically. This paper will not say “here’s the toolkit,” but I hope it offers “consider this, and here’s a possible starting place.”
Learning Lab 4: Powering Up Your Data Work with Excel’s PowerQuery
Powering up your data work with Excel’s PowerQuery
Angela Zoss, Duke University Libraries
View Slides (PDF)
View “Powering Up Your Data Work with Excel’s PowerQuery” description
Learning Outcomes
- Articulate the benefits of reproducible data work
- Find PowerQuery in modern versions of Excel
- Connect to a data file in PowerQuery
- Understand different types of PowerQuery data transformations
- Build a query with a multi-step data transformation sequence
- Push transformed data back to Excel for summaries and visualizations
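PowerQuery records each transformation as an explicit, replayable step in its M language. Since M is specific to Excel, the sketch below illustrates the same reproducibility idea in Python/pandas as an analogy; the file name and column names are hypothetical, and this is not lab material.

```python
# Analogy for the lab's core idea: a multi-step transformation sequence
# that can be re-run against a refreshed source file, written in
# Python/pandas rather than PowerQuery's M language. The file and
# column names ("gate_counts.csv", "branch", "date", "visits") are
# hypothetical.
import pandas as pd

def transform(path: str) -> pd.DataFrame:
    """Apply the same ordered steps every time the source is refreshed."""
    df = pd.read_csv(path)             # step 1: connect to the data file
    df = df.rename(columns=str.strip)  # step 2: clean header whitespace
    df = df.dropna(subset=["branch"])  # step 3: drop rows missing a key field
    df["month"] = pd.to_datetime(df["date"]).dt.to_period("M")  # step 4: derive month
    # step 5: summarize for downstream charts and pivot tables
    return df.groupby(["branch", "month"], as_index=False)["visits"].sum()

summary = transform("gate_counts.csv")  # re-running reproduces the result
print(summary.head())
```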
Learning Lab 5: Empowering Libraries
Empowering Libraries: Assessment Tools for Examining Campus Engagement and Library Use for Student-Centric Success
Sandra De Groote, University of Illinois Chicago
Jung Mi Scoulas, University of Illinois Chicago
View Slides (PDF)
View “Empowering Libraries: Assessment Tools for Examining Campus Engagement” description
Learning Outcomes
- Participants will identify the essential components of assessment tools designed to explore student academic engagement, library use, self-regulation, and students’ own definition of academic success.
- Participants will be able to adapt assessment tools to the needs of their local institution.
- Participants will articulate the comprehensive process of utilizing assessment tools, including identifying potential barriers to obtaining students’ demographics both within and outside of the library.
- Participants will depart with a toolkit of practical applications tailored to their institution’s specific needs, ready for immediate implementation to enhance their understanding of student experiences and support academic success.
Panel 5: Educating for Assessment
Educating for Assessment
Kara Malenfant, Dominican University
Kawanna Bright, East Carolina University
Lisa Hinchliffe, University of Illinois Urbana-Champaign
Starr Hoffman, University of Nevada, Las Vegas
Megan Oakleaf, Syracuse University
Devin Savage, Illinois Institute of Technology
Stephanie Crespo-Méndez, Syracuse University
View Slides (PDF)
View “Educating for Assessment” description
Learning Outcomes
Session attendees will be able to:
- Identify effective practices in education and training for library assessment practice in a variety of settings.
- Describe the challenges of identifying key competencies and training library practitioners on a full range of evaluation methods and strategies.
- Recognize the need for a multi-faceted approach to meet the education and training needs in the library field.
Key Topics
- Content: What do library assessment professionals need to know? How is this changing and evolving over time? What are related areas of practice, e.g., UX, and how do we make the linkages without overwhelming learners?
- Method: What can be taught in an iSchool course? What is best learned on the job and through continuing education? What is the role of informal community-of-practice groups, e.g., local and regional meet-ups?
- Assessment: How can we evaluate the outcomes of education and training experiences? How do we ensure that teachers in this area are also progressing in their own knowledge and skills?
Plan for attendee engagement
Before the dialogue among panelists begins, attendees will be asked to reflect on their experiences with learning to do assessment and respond to a series of prompts to share those experiences. Then, after the initial panelist discussion, attendees will have an opportunity to ask questions, which Kara will moderate with the panel. To close the session, attendees will be asked to reflect on how they will apply what was discussed.
Panel 6: UX and Assessment
UX and Assessment: Bold Statements & Engaging Conversation on the Role of UX in Library Assessment
Kris Johnson, Montana State University Library
Beth Filar Williams, Oregon State University Libraries and Press
Lindsay Ozburn, Utah State University
Krystal Wyatt-Baxter, University of Texas at Austin
Amy Bailey, Grand Valley State University Libraries
View Slides (PDF)
View “UX and Assessment” description
Key Topics:
The moderator will introduce the topic, then each panelist will introduce themselves and share a bold statement related to “the value of UX in the assessment world.” The ensuing moderated panel discussion will weave in audience participation using digital response software. Panelists will discuss how and where UX fits into assessment at their institutions, what UX maturity looks like in their organizations, and examples of UX making a positive impact. Because UX and assessment work can be tiring and challenging at times, panelists and audience members will also consider what has helped them remain resilient in pursuing it.
Learning Objectives
- Attendees will hear perspectives on how UX fits into the larger assessment conversation that may challenge their current perspective or understanding.
- Attendees will gain a better understanding of the value of a UX mindset across different types of academic libraries.
- Attendees will learn approaches for applying a UX lens within their own libraries.
11:00 a.m.–12:00 p.m. | Concurrent Session 4
Paper Session 7: AI & Assessment
AI and Library Assessment: A world of learning and opportunities
Martha Kyrillidou, QualityMetrics LLC
Additional Author:
Leo Lo, University of New Mexico
Draft Paper (PDF)
View Slides (PDF)
Keywords: AI, Assessment, Learning, Professional Development, academic libraries, public libraries, state library agencies
View “AI and Library Assessment” abstract
Purpose & Goals
Artificial Intelligence (AI) refers to the development of computer systems and software that can perform tasks that would typically require human intelligence. These tasks may include problem-solving, learning, understanding natural language, recognizing patterns, perception, and decision-making; all are critical skills and abilities for library assessment. This paper will discuss and frame the issues around the use of AI for library assessment purposes and the types of knowledge and training that are needed for assessment librarians to utilize AI tools responsibly and ethically. Our collective knowledge of AI tools in the library ecosystem is a step behind the people who develop these tools and a step ahead of the rest of our society. Based on our personal experiences, we have seen that AI tools are used in libraries to facilitate and augment our information retrieval skills and to educate our patrons on the responsible use of these tools. Little systematic work has taken place so far to understand how library assessment operations can support these activities. Over the past year, our research has been deeply immersed in understanding the multifaceted implications of AI for libraries. This includes an analysis of AI tool utilization within academic libraries, evaluating AI literacy levels, and determining the requisite training and comfort levels necessary for effective implementation. Our investigations are now extending toward evaluating AI’s integration within public libraries and state library agencies, signaling a broader applicability of our findings.
Design & Methodology
Our methodology is fundamentally exploratory, designed to dissect and understand the nuances of AI application in a library setting. We leverage a rich tapestry of data, literature, and firsthand professional experiences to construct a comprehensive overview of current practices and potential future directions. One of the co-authors, Leo Lo, brings a unique perspective through his development of courses on prompt engineering for Sage Campus and the Medical Library Association (MLA). His work, focusing on the nuanced crafting of prompts to elicit desired responses from AI models, adds a critical dimension to our analysis. Furthermore, Lo’s extensive presentations across various forums on AI’s societal impacts enrich our discourse, providing a grounded understanding of the broader implications of our work. In this paper, we draw upon these experiences and insights to frame critical inquiries about the integration and impact of AI tools in library assessments. We aim to engage our audience in a dynamic, collaborative discussion on key questions such as:
- Strategies for preparing libraries for AI integration and the specific role of assessment librarians in this process.
- Identifying the training needs for library staff and assessment librarians to competently navigate AI technologies.
- Examining the incorporation of AI tools within information literacy instruction and the systematic sharing of assessment outcomes related to AI tool usage within libraries.
- Evaluating library literacy programs through the lens of AI application.
- Establishing a supportive and proactive community of practice for AI in libraries.
- Defining the contribution of assessment to enhancing media literacy in the age of AI.
Through this multifaceted approach, our paper seeks to stimulate a forward-looking dialogue on shaping a future where AI tools are harnessed responsibly and innovatively to advance library services and assessment practices.
Conclusions
Our research and analysis have led us to identify a pronounced necessity for enhanced training focused on Artificial Intelligence (AI) within the library sector. This need extends even more broadly to society at large, emphasizing the urgent requirement for comprehensive education on AI issues. Our conviction is that library assessment occupies a pivotal position in this educational landscape. It has the potential to significantly contribute to ensuring that engagement activities centered on AI awareness and exploration not only meet but exceed learning outcomes and objectives. Through such targeted assessment, libraries can amplify their role as pivotal educational and informational resources, delivering value to their communities in ways that are both measurable and profoundly impactful. Moreover, this investigation has prompted us to reflect on several additional questions that could shape future research directions and practical applications within the field:
- How can library assessment methodologies be adapted or expanded to more effectively measure the impact of AI literacy programs within communities?
- What best practices can be established for libraries to navigate the challenges and opportunities presented by AI, ensuring ethical use and accessibility for all patrons?
- In what ways can libraries serve as community hubs for AI education, fostering a culture of informed and critical engagement with AI technologies?
- How can the success of AI-related engagement activities be quantitatively and qualitatively assessed to demonstrate tangible benefits to library stakeholders and communities?
These questions underscore the ongoing journey of discovery and adaptation as libraries confront the realities of an AI-integrated world. Our work suggests that by embracing their educational and assessment roles, libraries can lead the way in cultivating a society that is not only proficient in AI but also critical and ethical in its application.
Implications & Value
Our ambition is for this work to serve as a catalyst for ongoing dialogue and collaboration within the library community and beyond. We foresee regular gatherings, be they virtual or in-person, where professionals from diverse backgrounds convene to exchange insights on the integration and utilization of AI tools within library services. These discussions will highlight innovative practices and critically evaluate learning outcomes resulting from AI engagements, with a keen focus on ethical considerations and the appropriate use of AI technologies in both professional settings and broader societal contexts. By fostering a proactive discourse on AI’s role in library assessment and its broader implications, this work aims to contribute significantly to the evolving landscape of library science and information management. It seeks to bridge gaps in understanding and practice, enabling library professionals to navigate the complexities of AI with greater confidence and ethical rigor. In essence, our work strives to lay the groundwork for a more informed, ethical, and collaborative approach to leveraging AI in libraries and to inform library assessment practices.
Beyond OERs: Using an environmental scan process and an AI assistant to evaluate the current OER and Open Practice landscape
Donna Harp Ziegenfuss, University of Utah
Draft Paper (PDF)
View Slides (PDF)
Keywords: OERs, open pedagogy, open education, content analysis methodology, environmental scan process, AI data analysis
View “Beyond OERs” abstract
Purpose & Goals
During these uncertain times in higher education, college degree affordability and decreasing student enrollment are issues surfacing on many college campuses. Open Educational Resources (OERs) and no-cost textbooks are touted as a possible student success strategy, and librarians have become primary players in this solution. In addition, COVID-19 exposed gaps in traditional library practices, and post-COVID change continues the pivot toward digital resources and online learning. Although many OER best practices and examples can be found to help librarians find and promote OERs, Open Practice (OP), or using OERs in teaching, is less studied. At an institution with no formal OER or OP initiatives, and no librarian or staff member officially dedicated to OERs and/or OP, this research was designed to collect data that can be used to jump-start decision-making and planning for new open education initiatives. The researcher is a librarian who also teaches graduate courses in qualitative research methods and uses OERs and OP in her own teaching, but she is not an expert in this area. A 2023 sabbatical provided research opportunities to investigate the opportunities, benefits, and challenges of OERs and OP. It also afforded time to explore new AI assistant features embedded in Atlas.ti, a qualitative data analysis tool taught in her classes. The research questions are:
- What are the OER and associated open education practice trends being discussed, reported on, and utilized both inside and outside the library? How do these trends align with the researcher’s library and institutional priorities?
- Based on evidence collected in an environmental scan of the open education landscape, what could an open education initiative at an R1 public university look like?
- How could the ChatGPT-based AI tools embedded in computer-assisted qualitative data analysis software (CAQDAS), like Atlas.ti, be used to conduct an environmental scan?
Design & Methodology
This paper presents the results of a 7-step environmental scan process (Wilburn, Vanderpool, et al., 2016) conducted during a 2023 librarian sabbatical to explore, document, and analyze OER, open practice, and the broader open education landscape. Environmental scanning is an analysis tool used to inform decision-making for designing new policy, program planning, and initiative development. The researcher, not an open education expert, used the scanning process to systematically learn about OERs and open education, as well as to learn how to use Atlas.ti, an AI-assisted qualitative research analysis tool. Data were collected from open and available sources: (1) transcripts of open education webinars, events, and interviews; (2) abstracts of open education scholarly articles; (3) reports and newsletters; and (4) open education list-servs, notes, blogs, and websites. In addition, evidence and statistics on open education state initiatives and national peer institutions were gathered. Names of open education organizations, OER experts, and OER textbook titles were also curated as part of the environmental scan process. A Content Analysis (CA) methodology (White and Marsh, 2006) was then used to analyze the data to identify trends and patterns. Data were collected, cleaned, and then imported into Atlas.ti software for analysis. Two auto-coding methods were used to code and summarize the data: a semi-manual method using word frequencies to identify codes, and the automated Atlas.ti AI-assistant tool. The AI assistant provided the primary coding of the data, and the researcher served as the second coder, triangulating the coding using manual coding strategies. The biggest lesson learned was that the auto-coding process resulted in too many codes and categories (between 500 and 600 codes). Codes were manually merged, purged, and re-organized into new categories before thematic analysis was conducted. Categories were then analyzed for patterns and connections, and five themes were identified.
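As a rough illustration of the semi-manual, word-frequency step described above, the sketch below surfaces high-frequency terms in a small corpus as candidate codes for an analyst to review. The sample documents and stopword list are hypothetical, and nothing here reproduces Atlas.ti’s AI assistant.

```python
# Sketch of the semi-manual, word-frequency approach: surface frequent
# terms as candidate codes for analyst review. Sample texts and the
# stopword list are hypothetical; Atlas.ti's AI assistant is not
# reproduced here.
import re
from collections import Counter

documents = [
    "OER adoption reduces textbook costs and supports student success.",
    "Faculty resistance and time constraints limit open textbook adoption.",
    "Open pedagogy empowers students as creators, improving student success.",
]

stopwords = {"and", "the", "as", "a", "an", "of", "to", "in", "on", "with"}

tokens = (
    word
    for doc in documents
    for word in re.findall(r"[a-z]+", doc.lower())
    if word not in stopwords and len(word) > 2
)

# Top terms become candidate codes; as the paper describes, the
# researcher then merges, purges, and regroups them into categories.
for term, count in Counter(tokens).most_common(10):
    print(f"candidate code: {term!r} (n={count})")
```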
Findings
The large number of codes and categories created through the AI-assisted auto-coding tool was overwhelming, but the process also created an opportunity to be totally immersed in the data in a way not usually possible with more traditional coding practices. The categories yielded a detailed understanding of open education topic themes, and also provided tagged excerpts showing the countries represented in the data, the relationship of openness to data types, details on the impact on student success, and the range of methodologies described in the abstracts. Codes and themes also indicated the value that OER and open practices could contribute to other campus initiatives. However, implementing OER and open education practice is not without challenges; those presented in the data include funding issues, faculty resistance to using OERs that may not be peer-reviewed texts, and the time constraints of finding and adapting or creating OERs. A third level of coding (selective coding) and constant comparative strategies resulted in five themes, which will be used to suggest possible next steps for moving forward with OER and open education initiatives at the researcher’s institution. The five themes identified are:
- Integrating OERs and open practices on a college campus is a team sport
- Building campus-wide awareness, open education capacity, and digital competencies that are customized to university culture and context
- Designing a community of openness to bridge support for different stakeholders
- Impacting student success through empowering students using equitable and inclusive open pedagogy best practices
- Extending beyond the OER and aligning open education work to the library and university strategic planning priorities
Action & Impact
Since there are currently no formal OER or open education initiatives on the researcher’s campus, except for open access publishing, it was important to gather broad data and set guidelines around the planning process for new OER and open education initiatives. Findings from this research will be used as recommendations for discussions and for building a foundation for identifying open education planning strategies. The environmental scan provides a 30,000-foot view of the broader open education topic, drawn not just from what is being published but from what experts are presenting on and discussing in conferences, meetings, and organizational communications. The next step is to design a way to visualize the findings so that an advisory group can use them to make decisions and move open education planning forward. I am using the findings to complete a preliminary sabbatical report for the Library Dean and to make recommendations for the library’s role in a campus-wide open education initiative. I will present my findings and recommendations, with a focus on the administrative aspects of designing and implementing open education, to two academic associate vice presidents who are OER supporters and will become critical campus partners in decision-making, planning, and implementation. An online, self-directed Canvas course already in production to introduce faculty to OERs and open education will need to be restructured, based on the research findings, to meet the needs of all stakeholders, not just faculty. Logic model planning, project mapping, and decision-making by the advisory board will lead to an assessment plan for measuring the success of the open education initiatives and their impact on library stakeholders.
Practical Implications & Value
This research contributes to the library assessment community by presenting the how-tos and lessons learned in using an alternative data collection process, the environmental scan. This scan method could be adapted and used for exploring any library initiative or outreach project. We often think of conducting a needs assessment before planning library programming, but the environmental scan provides a broader view of the situational context and trends around a topic. After an environmental scan, a team or advisory group could use the findings to help create a logic model, a needs assessment, an assessment plan, or a communication plan. This research also provides guidance on using AI tools for qualitative data analysis that will be of interest to the community. The researcher plans to use a Playbook to present information and instructions that people can use to make progress on activities such as information gathering, training, or using tools. Playbooks are used on the researcher’s campus for curating online learning policies and processes, in the school of business, and for international grant work, so there is campus familiarity with the term. In this case, an Open Education Playbook will be a framework for organizing all OER and OP initiative work, policies, training materials, tools, resources, and recommendations for using OERs and open practice in teaching. It will also be a way to create an open education presence, make a case for OERs and open education, and collect and display statistics. A Playbook provides an opportunity to continue building on the open education work; for example, new initiatives like grants or graduate student opportunities can be posted and spotlighted as we move further along the OER journey. SPARC also uses a playbook to present OER state policy.
Paper Session 8: Assessing Library Technologies
Assessment as change management: Facilitating consensus, decision-making, and culture change through a scaffolded approach to ILS review and selection
Lindsay Ozburn, Utah State University
Joseph LaSure, Utah State University
Additional Authors:
Alex Sundt, Utah State University
Liz Woolcott, Utah State University
View Slides (PDF)
Keywords: Integrated library systems, migrations, change management, consensus building, scaffolded assessment
View “Assessment as change management” abstract
Purpose & Goals
Managing large-scale change is difficult for any organization, but it can be particularly challenging for busy and resource-strapped libraries. In our case, the lack of a structured approach to assessing library-wide operations meant that decisions were often driven by anecdotal information and hinged on selective requirements rather than a full accounting of stakeholder needs. As it became clear that a migration might be needed, this project sought both to determine our current Integrated Library System (ILS) needs and to build a more robust framework for ongoing assessment and consensus-making. Importantly, our goal was to develop an assessment strategy that would carry us through not just the review and selection process but could continue post-migration to help support implementation, training, and ongoing system improvements for both staff users and library patrons. This paper will discuss the scaffolded assessment methods and tools that we employed, how they factored into our overall review process, and how they ultimately sparked a culture shift in our ILS review and management processes.
Design & Methodology
The ILS Review Task Force scaffolded a variety of assessments of the ILS over several years, including surveys, focus groups, workflow analyses, and ethnographic-style interviews. In Phase One, all library staff were invited to complete a nine-question Qualtrics survey consisting of both multiple-response and free-response questions; the aim was to identify and garner feedback about the digital tools staff use in the course of their work. In Phase Two, testing consisted of two stages: a patron-facing workshop and workflow-analysis interviews. During the workshop, eighteen staff members created workflows for typical information-seeking tasks using the Libraries’ discovery tools. Participants, broken into three groups, evaluated each process, highlighting potential user difficulties. Before the workshop, participants used worksheets to reflect on patron issues they had witnessed. In Phase Three, staff participants created workflows using Excel, outlining three processes centered on digital resources and three related to analog resources. Participants outlined the actions taken and the ILS module(s) they used; in cases where the ILS was not used, an explanation and the identification of alternative tools were requested. This information was coded and stored in an Airtable database. Finally, the researchers conducted two group interviews during which participants explained their workflows and tool preferences. Interview questions were distributed before the sessions and, to fill any gaps from the sessions themselves, afterward as a questionnaire. The investigation concluded with a feedback survey and analysis of the findings. After these assessments, an RFP committee was formed to take the functional requirements and findings from the analysis and create a detailed list of RFP criteria. Next, standardized usability and functionality tests were created to evaluate system performance and provide a standardized rating for each criterion. This ensured that each system was evaluated fairly and that various library functions and staff needs were weighted equally in the scoring and selection process.
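To make the standardized-rating step concrete, the sketch below scores hypothetical candidate systems against the same criteria and weights each criterion equally with a simple mean. The systems, criteria, and ratings are invented and do not reproduce the task force’s actual instrument.

```python
# Hedged sketch of the standardized rating step: every candidate system
# is scored on the same criteria, and criteria are weighted equally so
# no single library function dominates. Systems, criterion names, and
# ratings are hypothetical illustrations.
from statistics import mean

# rating (1-5) per criterion, gathered from standardized tests
scores = {
    "System A": {"e-resource management": 4, "circulation": 5, "discovery": 3, "reporting": 4},
    "System B": {"e-resource management": 5, "circulation": 3, "discovery": 4, "reporting": 5},
}

for system, ratings in scores.items():
    # equal weighting: a simple mean across all criteria
    print(f"{system}: overall {mean(ratings.values()):.2f} across {len(ratings)} criteria")
```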
Findings
Early findings demonstrated that the library system was both outdated and missing key functions needed to support a modern, e-focused collection, a situation exacerbated by the stark increase in e-resource usage during COVID. The workflow analysis, in particular, highlighted that the emphasis on e-resources dramatically shifted where work was taking place, moving labor from later collection lifecycle events (i.e., shelving, circulating, weeding) to early lifecycle processes such as acquisition and licensing. Critically viewing the possibilities of new systems through the lens of the identified needs helped staff investigate potential new systems with more intentionality. Additionally, staff reporting to the Collections and Discovery department were asked to report on how the workflows they had outlined might be changed or improved by migrating to a new system. This analysis proved critical to changing perspectives on the burden of migration, with almost all staff reporting that a new system would improve their work efficiency. Standardizing the way staff assessed the viability of new systems during the RFP process was an important development because the same tests for each criterion were applied to all systems reviewed. Feedback and results were less mired in individual preference and more focused on performance with realistic, typical tasks undertaken by staff. While the ultimate conclusions depended heavily on the cost analysis, the technical reviews were critical for justifying the system selection to the library.
Action & Impact
The scaffolded approach we took produced many data points for our analysis, but it also exposed numerous problems with our training, communication, and management practices surrounding the ILS. Bringing in ILS vendors to showcase their products afterwards brought these problems into sharper relief, demonstrating what was possible in next-gen systems and generating buy-in and excitement for a potential migration. After initiating the RFP process, we continued this systematic approach during our ILS review and selection, with multiple opportunities for staff to engage and independently test and rate system performance using standard evaluation criteria. This more objective, consensus-driven process helped mitigate potential bias toward certain functions or user groups within the library, and it led to a decision that reflected the broad needs of the library as well as possible. Although ILS migrations can be quite challenging and stressful, if staff had to be put through such pain, it made sense to take advantage of the opportunity to deeply examine our needs and rethink our workflows. Not only did this ease the implementation process, it also helped library staff know what to expect and how to take full advantage of the system, ensuring workflows could not only be adapted to the new ILS but were actually improved and more sustainable as a result. Moreover, our process sought to include more voices and perspectives in our ILS discussions, including end-user perspectives that were often secondary to internal collection management needs. This is beginning to reshape the culture around our technical systems, helping the library as a whole understand the role and impact of library technology and creating a foundation for continued assessment and improvement of library operations more broadly.
Practical Implications & Value
From an organizational decision-making perspective, process scaffolding helps ensure that prior, related assessment data is woven into current and future library evaluations. The scaffolding approach encourages an assessment-minded culture by establishing a baseline of data-informed needs that staff can build upon. The process has helped our library shift decision-making away from feelings-based or anecdotal evidence toward systematically collected assessment data that builds on itself to form a robust body of user-informed evidence. From a methodological perspective, our paper contributes to the limited body of work on assessing the internal and external needs served by an ILS, and it uniquely incorporates voices from across staff levels and specialties rather than relying on the voices of a few to make an impactful decision like selecting and migrating to a new ILS. When it comes time to choose an ILS, we will have justifications that everyone understands, justifications not grounded in one person’s needs or wants or biased toward one group. From a leadership perspective, our methods proved successful in generating substantial, if not total, buy-in for migrating to a new ILS. Consensus building in an academic library is extremely difficult when faced with a wide-reaching, impactful decision. Our paper demonstrates that this assessment methodology was key to building consensus and collective knowledge in place of anecdotal decision-making. It brought employees from across the library together to enhance our users’ experience, a process that everyone was willing to get behind. Leadership brought people in as participants in the entire conversation, not as token representatives, and treating people with respect, as the experts and sources of information that they are, generated positivity and buy-in that will carry them through the pain of the migration process.
Faculty Reversion or Student Persistence? Investigating Differential User Group Trends in Library & IT Services Post-Pandemic
Craig Milberg, Willamette University
David Consiglio, Bryn Mawr College
Additional Authors:
Katherine Furlong, Bucknell University
Alexandrea Glenn, University of Pennsylvania Libraries
Wesley Ng-A-Fook, Barry University
Ellen Yu, Union College
View Slides (PDF)
View “Faculty Reversion or Student Persistence?” abstract
Purpose & Goals
Purpose:
The COVID-19 pandemic fundamentally altered the landscape of education and information access, forcing libraries and IT departments to adapt rapidly. This research aims to understand the lasting impact of these changes on user perceptions and usage of library and IT services.
Goals:
- Identify significant shifts in user importance and satisfaction ratings for library and IT services pre- and post-pandemic based on MISO data. This will reveal which services have gained or lost user significance and satisfaction in the new educational landscape.
- Examine the persistence of pandemic-induced trends. The study will determine if the “massive dislocations” observed during the pandemic, such as increased reliance on digital resources, are temporary blips or represent long-term shifts in user behavior.
- Provide valuable insights for library and IT decision-making. By understanding user priorities and evolving trends, libraries and IT departments can strategically adjust service offerings, resource allocation, and communication strategies to better serve their communities.
- Contribute to understanding the long-term impact of the pandemic on higher education and information services. This research will be valuable for understanding not just changes in specific services, but also broader trends in user expectations and preferences within the post-pandemic academic environment.
Design & Methodology
This presentation investigates changes in benchmarks and trends for library and information technology (IT) services using the Measuring Information Service Outcomes (MISO) Survey. MISO, a comprehensive survey tool established in 2005, gathers data on users’ use of, the importance they assign to, and their satisfaction with library and IT services across various institutions. The study leverages MISO data collected before and after the COVID-19 pandemic to identify significant shifts and emerging trends. The methodology involves a two-phase approach:
- Phase 1: Pre-Pandemic Baseline Analysis
- Researchers will select MISO survey data from a defined period preceding the pandemic (e.g., 2018–2020).
- They will analyze the data to establish pre-pandemic benchmarks for various service categories within libraries and IT departments.
- This analysis will focus on user importance ratings (how crucial users consider specific services) and user satisfaction ratings (how well services meet user needs).
- Phase 2: Post-Pandemic Trend Analysis
- Researchers will examine MISO data collected after the pandemic’s onset (e.g., 2022–2024).
By comparing these data to the pre-pandemic benchmarks, the study will identify significant changes in user importance and satisfaction ratings, revealing emerging trends in service usage and user priorities in the post-pandemic landscape. The analysis will examine statistically significant differences (p < .01) and effect sizes between pre- and post-pandemic data, exploring these changes at the individual service level and identifying macro-level trends spanning many service points.
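For readers who want to see the shape of such a comparison, the sketch below runs a two-sample test and computes an effect size (Cohen’s d) on simulated satisfaction ratings. The data are randomly generated and the choice of Welch’s t-test is an assumption; MISO’s actual datasets and analysis plan are not reproduced here.

```python
# Illustrative pre/post comparison: a two-sample test at p < .01 plus
# Cohen's d for one service's satisfaction ratings. Data are simulated;
# this does not reproduce MISO's analysis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=0)
pre = rng.normal(loc=3.8, scale=0.9, size=400)   # simulated pre-pandemic ratings
post = rng.normal(loc=4.1, scale=0.9, size=400)  # simulated post-pandemic ratings

t, p = stats.ttest_ind(pre, post, equal_var=False)  # Welch's t-test

# Cohen's d using a pooled standard deviation
pooled_sd = np.sqrt((pre.var(ddof=1) + post.var(ddof=1)) / 2)
d = (post.mean() - pre.mean()) / pooled_sd

print(f"t = {t:.2f}, p = {p:.4g}, Cohen's d = {d:.2f}")
print("significant at p < .01" if p < 0.01 else "not significant at p < .01")
```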
Conclusions
This research, utilizing MISO’s rich dataset, presents a unique opportunity to understand the lasting impact of the COVID-19 pandemic on user perceptions and usage of library and IT services. We will be gathering data through spring of 2024, and aim to provide valuable insights from the most recent data. Understanding pandemic-induced shifts in user importance and satisfaction will allow institutions to adapt their services, resource allocation, and communication strategies to better serve their constituents in the evolving educational environment. We will enter this study with the following hypotheses:
Hypothesis 1: Differential Impact Across User Groups. We expect the impact of the pandemic on library and IT service usage to vary across different user groups (e.g., faculty, students).
Sub-hypothesis 1a: Faculty Reversion. Faculty members who adopted new library and IT services during the pandemic may revert to pre-pandemic usage patterns to a greater extent than students.
Sub-hypothesis 1b: Student Persistence (“Stickier” Changes). Students who entered college after adjusting to remote learning in high school may demonstrate more persistent changes in their library and IT service usage post-pandemic than students who experienced traditional in-person learning before college.
Implications & Value
This research has the potential to add valuable data and insights to the ongoing dialogue about the long-term impact of the pandemic on higher education. The findings can inform future research on user behavior, service delivery models, and technology adoption within educational institutions. By understanding the evolving post-pandemic user landscape and its implications, libraries, IT departments, and higher education institutions can adapt their strategies and optimize their services to better serve current and future generations of learners.
Panel 7: Planning Masquerading as Strategy
Planning Masquerading as Strategy: Creating a Future that Doesn’t Exist
Starr Hoffman, University of Nevada, Las Vegas
Kathleen Bell, George Mason University
Steve Borrelli, Penn State University Libraries
Rebecca Greer, University of California, Santa Barbara
Maurini Strub, University of Rochester
View Slides (PDF)
View “Planning Masquerading as Strategy” description
Learning Outcomes
- Participants will develop an awareness of Roger Martin’s insights on strategy.
- Participants will be able to distinguish between strategy vs. planning.
- Participants will be able to apply a critical practice to their strategic planning efforts.
- Participants will deduce how to avoid the pitfalls of strategic planning.
Key Topics
- What is strategy and how does it differ from planning?
- What makes strategy difficult for academic libraries?
- What practical steps can libraries take to ensure their planning truly complements the parent institution’s plan and enhances their strategic choices? Or, if their institution doesn’t have a strategic plan, how can they reasonably scale a plan?
- Which of Martin’s five pitfalls of planning do panelists’ strategic planning efforts most often exemplify? (The five pitfalls: inconsequential, fragmented, internally focused, control-obsessed, extrapolative.)
- How do we translate for-profit frameworks to social sector/public good organizations/institutions?
Panel 8: No “One Size Fits All”
No “One Size Fits All”: An Array of Approaches for Assessing the Library’s Contributions to Student Learning
Becky Croxton, Colorado State University Libraries
Megan Oakleaf, Syracuse University
Jung Mi Scoulas, University of Illinois Chicago
Kenneth Varnum
Shane Nackerud
View Slides (PDF)
View “No ‘One Size Fits All’” description
Learning Outcomes
- Attendees will be able to describe a variety of approaches and strategies for better understanding student learning related to library engagement.
- Attendees will be able to describe collaborations that may be necessary for undertaking these student learning assessment strategies.
- Attendees will be able to identify potential ethical issues that can arise from close examination of student data and/or over-aggregating data to the point that marginalized voices are removed.
- Attendees will be able to formulate an action plan for conversations they may undertake when they return home to their institutions.
Key Topics
The moderator will introduce the panelists and explain the audience participation cards, which will be collected in the second half of the panel and discussed at its end. Key topics include student learning and engagement in connection with the library, library learning analytics, inclusion, collaboration, ethical issues, and privacy considerations.
Questions:
- Tell us a bit about your work to understand student learning and engagement in connection with the library: your key research questions or problems to solve, your core method(s), and your initial (or further) results. What does this tell you about student learning and engagement that you didn’t know (as much) about before?
- Describe how your project/strategy addresses inclusion and getting beyond the averages to understand previously marginalized voices. How are student “voices” heard in your work?
- What kinds of collaboration did you need to undertake to begin/continue/complete the project?
- What ethical issues did you contend with in engaging in the project?
- What actions will result from the information you’ve found so far?
12:00 p.m.–1:30 p.m. | Lunch Session & Keynote: Sonia DeLuca Fernández
Activating Your Leadership to Advance Inclusion
Sonia DeLuca Fernández, Senior Vice Chancellor, Diversity, Equity & Inclusion
University of Colorado Boulder
View Slides (PDF)
View “Activating Your Leadership to Advance Inclusion” description
When “DEI” is being mischaracterized and weaponized, how should we structure a commitment to access and inclusion? In what ways might we better leverage our organizations’ missions to lead transformation? How can we better center the needs of students and other constituents while fielding attacks on our work?
In consideration of a range of influences on the future of higher education, shared equity leadership (SEL; Kezar et al., 2021) can help organize improvements, support collaboration, and provide clarity around how to make advancing diversity, equity, and inclusion everyone’s work. In this keynote Dr. Sonia DeLuca Fernández will introduce SEL and focus on the potentials for new conceptualizations of accountability and how your leadership is critical to re/creating practices that support access and inclusion.
1:50 p.m.–3:50 p.m. | Concurrent Session 5
Paper Session 9: Assessment Tools
Interpretive Description: A practical, rigorous qualitative approach for applied research
Laura Gariepy, Virginia Commonwealth University
View Slides (PDF)
Keywords: qualitative research, interpretive description, applied research
View “Interpretive Description” abstract
Purpose & Goals
What qualitative frameworks allow the pursuit of rich data about practical research questions without requiring researchers to squeeze their study into traditional qualitative frameworks that were developed to create grand theories? This paper introduces interpretive description, a qualitative approach which originated in the field of nursing. It focuses on answering real-world questions in applied disciplines, with rigorous ontological and epistemological underpinnings. The goal of the paper is to familiarize the library research community with this promising approach for practical qualitative research.
Design & Methodology
I will provide a brief overview of qualitative research and major qualitative traditions originally intended to develop conceptual theories such as grounded theory, narrative inquiry, and phenomenology, and why they are often insufficient for research in applied fields. I will then describe the key tenets of interpretive description, originally introduced to the field of nursing by Sally Thorne in order to address this very issue. I will address specific components of interpretive description, including question development, ontology, epistemology, and the use of various data collection and analysis approaches, which are often borrowed from other qualitative traditions with thoughtful justification. From there, I will explain how one study focused on undergraduate students’ attitudes about search data privacy in academic libraries would look if approached through the lens of each aforementioned traditional qualitative approach, and how the nature of the findings would differ based on each. Finally, I will demonstrate how interpretive description provided the right methodological fit to gain practical knowledge in this study, while still ensuring rigor. The focus of this example will be on methods, not on the specific findings of the study, although attendees will be provided additional citations to learn more about the results.
Conclusions
Many traditional qualitative approaches require librarians to force a square peg into a round hole when embarking on a practically focused research study. Interpretive description encourages the thoughtful use of methods from various qualitative traditions to answer specific research questions, which are posed in a way that allows answers to be resituated within the context of the applied field. In other words, interpretive description results in useful, practical findings for librarianship.
Implications & Value
Interpretive description, while utilized in many human services disciplines such as nursing, social work, and education, is essentially absent from library research, with the exception of one or two studies. This methodological approach has tremendous potential to increase the pertinence of qualitative research in librarianship without compromising rigor.
Press Here for More Data: Advanced Tracking with Simple Technology
Sarah Leonard, United States Military Academy at West Point
Draft Paper (PDF)
View Slides (PDF)
Handouts (PDF)
Keywords: Library Data, Data Collection, Patron Interaction, Electronic Equipment, Simplify Reporting
View “Press Here for More Data” abstract
Purpose & Goals
Many public-facing librarians have encountered the time-honored tradition of putting a hash mark to paper after an encounter with a patron at a service point. While many libraries have moved to electronic forms of data gathering, this method has not necessarily been upgraded to meet the latest technology available. Common pitfalls arise from the limited data collected with hash marks: each interaction is counted, but none of the important details about the interactions are captured. At the United States Military Academy (USMA) Library we have implemented Stream Decks, programmable button hardware systems, as tools to collect more detailed and accurate reporting at service points while removing staffing constraints and ensuring the patron experience is not interrupted by reporting activities. With the push of a button, the date, time, location, question type, question method, patron type, and staff member who answered are all recorded automatically. USMA Library uses Springshare’s RefAnalytics software to record this data in an accessible and editable format.
Design & Methodology
By adapting individual streaming technology (programmable button hardware: an LCD button deck that allows specific tasks to be automated and coded in apps or platforms) and pairing it with reference analytics or data software, we are updating the systems for recording detailed transactions in a stable, effective, and informative manner. Working from the parameters of required data reporting and the information needs of USMA Library Administration, a set of buttons has been designed and coded to record valuable data points affiliated with service point transactions. This new method of reporting lets staff access and edit submissions that come through via the programmable button hardware, allowing the date and time of each transaction to be captured accurately. For USMA Library purposes, the editing option in RefAnalytics also allows staff to add transaction duration, difficulty level on the READ Scale, support type offered, and even the major assignment the question was about.
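As an illustration only, here is a hypothetical sketch of the kind of record a single button press might generate. The actual implementation writes to Springshare’s RefAnalytics, whose API is not described in the paper, so this sketch appends to a local CSV instead; all field names and values are illustrative, not the library’s actual configuration.

```python
# Hypothetical sketch: one Stream Deck button press = one timestamped record.
# Logs to a local CSV in place of the RefAnalytics integration described above.
import csv
from datetime import datetime
from pathlib import Path

LOG = Path("service_point_log.csv")
FIELDS = ["timestamp", "location", "question_type", "question_method",
          "patron_type", "staff_member"]

def record_interaction(location, question_type, question_method,
                       patron_type, staff_member):
    """Append one timestamped interaction record at the moment of the press."""
    new_file = not LOG.exists()
    with LOG.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow({
            "timestamp": datetime.now().isoformat(timespec="seconds"),
            "location": location,
            "question_type": question_type,
            "question_method": question_method,
            "patron_type": patron_type,
            "staff_member": staff_member,
        })

# Example: one button could be bound to this exact combination of values.
record_interaction("Desk A", "reference", "in person", "student", "SL")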
Findings
Integrating the programmable button hardware system into our service points required educating staff on the meanings of their selections on the hardware. In effect, every button pushed records multiple data points that are logged in the reference analytics software. These logged events can then be edited by any staff member to add further details and information if necessary. The USMA Library’s initial rollout of the programmable button hardware system was January 2, 2024. Initial findings are visible when comparing data collected in 2023, through the old form, and in 2024, through the Stream Deck system. We have had a limited time frame to collect data thus far, but once staff were educated on the functionality of the system, recorded service point transactions more than tripled and recorded reference interactions more than doubled: from January 2 through February 28 of 2023, USMA Library recorded 607 interactions, 196 of them reference, while the same time frame in 2024 saw 1,877 interactions, 449 of which were reference (increases of roughly 209% and 129%, respectively). There was also a huge increase in the speed of completing a record about a transaction. With our previous method, filling out the information required interacting with a questionnaire-style form, and each submission took an average of three minutes and 52 seconds to complete. With the programmable button hardware, simple interactions are complete after the press of a button, and the background data processing takes approximately five seconds. Recording transactions with the Stream Deck is over 45 times faster and still collects all the necessary data points.
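The percentage increases and the speedup can be checked directly from the counts and timings reported above; a quick arithmetic sketch (not the authors' code):

```python
# Quick check of the changes reported above, from the stated raw figures.
old_total, new_total = 607, 1877   # interactions, Jan 2 - Feb 28, 2023 vs. 2024
old_ref, new_ref = 196, 449        # reference records in the same windows

pct_increase = lambda old, new: 100 * (new - old) / old
print(f"All transactions: +{pct_increase(old_total, new_total):.0f}%")  # ~+209%
print(f"Reference records: +{pct_increase(old_ref, new_ref):.0f}%")     # ~+129%

# Speed of recording: 3 min 52 s per form vs. ~5 s of background processing.
form_seconds, deck_seconds = 3 * 60 + 52, 5
print(f"Speedup: ~{form_seconds / deck_seconds:.0f}x")  # ~46x, i.e., over 45 times
```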
Action & Impact
The records from the programmable button hardware are more accurate because staff are able to submit the record at the time of the interaction, rather than trying to remember the date, time, and details to fill out the reference analytics form later. In the previous system, staff would track interactions mentally, with notes, or with hash marks while at a service point and then enter the data when they returned to their office, creating a discrepancy in the times reported. With programmable button hardware connected to the reference analytics software, the recording of the transaction is completed in real time at the service points. At USMA Library, RefAnalytics contains the data and provides a useful Date/Time Stats tab where daily and hourly breakdowns of the questions received can be found. The data collected in the reference analytics system is essential to expressing value and evaluating the services we provide our users. The Stream Deck system will continue to be refined to fulfill the library’s information needs and will provide insight into changes at service points over time. Additionally, findings will inform many of the decisions the USMA Library makes in the future, including decisions on scheduling at service points and better understanding the needs of our users.
Practical Implications & Value
While budgets are shrinking and libraries are forced to prove their value with real data more than ever before, this new system contributes to the overall body of work in library assessment because it integrates updated technology with increased reporting needs. The greater library assessment community can replicate these efforts and design systems that meet their own data needs in support of their individual missions. For example, this technology could also be extremely useful for student assistants and other support staff, who can be trained to use the system quickly while still providing detailed results. References to any specific commercial product, process, or service, or the use of any trade, firm, or corporation name, are for the information and convenience of the public and do not constitute endorsement, recommendation, or favoring by the Department of Defense, the United States Army, or the U.S. Military Academy.
Quality assessment of Pontificia Universidad Católica Argentina libraries based on UX management model
Soledad Lago, Pontificia Universidad Católica Argentina
Draft Paper (PDF)
View Slides (PDF)
Keywords: library assessment, user experience, library management, innovation, cultural organizational change
View “Quality assessment of Pontificia Universidad Católica Argentina libraries” abstract
Purpose & Goals
In this presentation, I will focus on the changes introduced to the Library System of Pontificia Universidad Católica Argentina (UCA) based on the results of the libraries’ assessment and the methodologies applied to transform its management model into a user-focused one. This presentation will also concentrate on the research into and understanding of users’ needs and on library assessment. Additionally, both the model and the methodology stages will be described briefly, including the tools applied, such as LibQual, co-creative workshops, interviews, observation, photographic records, and focus groups. It is important to mention that our libraries have applied the LibQual methodology every three years since 2014.
Design & Methodology
This new model includes different agile methodologies to develop the libraries’ functions and activities. It promotes the libraries’ organizational and cultural change to transform them into a learning space that not only favors users’ inspiration and creativity, but also empowers their skills. This management model is based on three axes:
- The Agility Axis, where the Design Thinking methodology is applied to put into practice the library model focused on improving user experience.
- The Innovation Axis, which involves changes in physical and virtual spaces as well as the development of an activity plan to spread an innovation culture.
- The Cultural Change Axis, which fosters change management within library teams, in line with the university’s organizational change vision, and which contributes to the integration of the university’s different schools and teams.
Conclusions
The implementation of the model allowed us to understand the desires of our users and provided us with tools to build and develop more solid and flexible work teams focused on innovation and change. Moreover, to support this change, a new organizational chart for the libraries was created, and its functional areas, goals, and scope were defined.
Implications & Value
Such a challenge meant assuming that libraries should be an inspiring environment for students, teachers, and researchers that enhances users’ learning experience. Therefore, the expected impacts at the academic level include better academic performance by students, better performance by teachers in their pedagogical role, and greater visibility of the academic and scientific production of the University’s researchers. As a consequence, the three most important dimensions in our libraries (services, physical space, and collection and access management) have been transformed. This experience may contribute to new ways of organizing libraries and to raising awareness of the significance of assessment for library management.
Regrounding the theory of LibQUAL for mid-century: a qualitative study
Colleen Cook, McGill University
Draft Paper (PDF)
View Slides (PDF)
Keywords: library surveys, LibQUAL+, naturalistic inquiry, grounded theory
View “Regrounding the theory of LibQUAL” abstract
Purpose & Goals
Research question: What is the construct of library service quality from a user perspective for the mid-21st century? How does that theory validate, add to, or alter the original LibQual dimensions of affect of library service, information control, and library as place? Purpose: To evaluate whether the user-based construct of library service quality in LibQual has changed in important ways since the original LibQual research was conducted in the early 2000s and, if so, to suggest ways in which the theory could be added to or altered.
Design & Methodology
The LibQual survey research (Cook and Heath, 2001; Cook, 2001) was a mixed-methods study in two parts: a qualitative theory-proposal phase and a quantitative survey-construction phase. This study revisits the qualitative work of that study and regrounds a construct of library service from a user perspective through semi-structured, on-site interviews with library users (faculty members, graduate and undergraduate students, and provosts) from 8–10 representative libraries, initially in North America. Employing naturalistic inquiry and grounded theory methods, data analysis begins immediately upon collection and is subjected to multiple, rigorous analyses using ATLAS.ti until the data are saturated and a theory emerges.
Cook, C., and F. Heath. (2001). “Users’ perceptions of library service quality: A LibQual+ qualitative study.” Library Trends 49(4): 548–584.
Cook, C. (2001). “A mixed-methods approach to the identification and measurement of academic library service quality constructs: LibQual+.” Dissertation Abstracts International 62(07): 2295A (University Microfilms No. AAT 3020024).
Conclusions
Although the study is continuing and data will be analyzed until fully saturated, preliminary results suggest that the three LibQual dimensions of library service (affect of service, information control, and library as place) will continue largely to construct a user-centric theory of library service into mid-century. However, the important manifestations of the dimensions and the relative weight of each dimension for different user groups may have changed. For example, the importance of library as place for students appears to have strengthened. In the past 20 years, the explosion of digital content and the ease with which it is accessed, the authority of content, the effects of the pandemic (particularly questions around copyright and the use of print content), the emerging impact of AI, the global renaissance of library construction, and the role of the librarian are emerging as influencers in a theory of library service for mid-century.
Implications & Value
It is incumbent upon the community to assess how well society’s large investment in its libraries is managed, and the LibQual+ survey has been an important tool in this endeavor. From 2000 to 2021, under the Association of Research Libraries’ (ARL) management, the LibQual+ survey was used by over 1,300 libraries worldwide: 3,321 institutional surveys from 37 countries, in 20 language translations, garnering 2.9 million survey respondents (Introduction, McGill notebook of LibQual+ results). As a tested, valid, and reliable total-market survey, LibQual has provided a tool that allows libraries to compare their results to those of similar libraries over time. Because psychometric instruments such as LibQual+ may atrophy over time, for LibQual+ to continue to serve as an important tool in the assessment toolkit it is essential to reground its underlying theory.
Paper Session 10: EDI
Please, not another climate survey: Lessons learned from a diversity, equity, inclusion and accessibility focus group study
Jen-chien Yu, University of Illinois at Urbana-Champaign
Additional Authors:
George Gottschalk, Head of Acquisitions and Resource Management, Kansas State University
Lauren Phegley, Research Data Engineer, University of Pennsylvania
JJ Pionke, Adjunct Professor of Information, Syracuse University
Draft Paper (PDF)
View Slides (PDF)
Keywords: climate assessment, inclusion, DEIA, focus group
View “Please, not another climate survey” abstract
Purpose & Goals
This study sought to collect ideas from employees of an academic library at a public land-grant research university regarding desired changes to improve diversity, equity, inclusion, and accessibility (DEIA) in the workplace. The goal of the study was twofold: 1) to allow library employees to generate new ideas for improving DEIA within the library, and 2) to collect ideas on possibilities for actionable change within the library. The subject population consisted of library employees (aged 18 or older) who worked for the University of Illinois Urbana-Champaign during the time of the study. The focus groups included the following employment classifications: Faculty (tenure-system), Civil Service (non-academic staff as defined by the Illinois State Universities Civil Service System/SUCSS), Academic Professional (academic staff who are exempted from SUCSS), and Graduate Student. Recruitment consisted of flyers (posted in staff lounges) and email invitations sent to the Library’s internal LISTSERVs.
Design & Methodology
Participants took part in a 60–90 minute focus group with 5–8 other library employees of the same employment category. The focus groups were facilitated by two research team members; for each session, at least one of the facilitators was from the same employment category as the participants. The facilitators followed a semi-structured focus group guide. To accommodate all participants, regardless of whether they were working on-site or remotely due to the COVID-19 pandemic, the focus groups were conducted virtually or in person in a library conference room, and participants could choose which format they preferred. The virtual focus groups were recorded using the university-licensed video conferencing software (Zoom), and the in-person focus groups were recorded using a digital audio recorder with the university-licensed Zoom as a backup. The digital audio files from the recorded focus groups were then transcribed. Data collected for this study include the recordings, transcripts, and de-identified metadata about the participants (i.e., their employment categories but not their names, genders, or unit names). The data were analyzed using deep reading and open coding.
Findings
Five (5) focus groups were conducted: one (1) in person and four (4) virtual. A total of 18 individuals participated. Civil Service staff and Graduate Student employees participated most, with six (6) and five (5) participants respectively. A major theme that emerged from the study is that, while the library hired a diversity officer and invested in initiatives to address issues related to DEIA, employees still feel that DEIA is not a priority. Participants shared experiences of neglect or inaction when they raised accessibility or retention issues at the library or university. The study concluded with recommendations, drawn from the focus group participants, for each of the DEIA areas. It is worth noting that while this study focused on library employees, the participants wanted to see changes not only for their own health and workplace safety, but also for the health and safety of their colleagues and library patrons.
Action & Impact
The researchers submitted a written report to the library task force to which they belonged and to the library administration. The report’s findings, along with the results from two library climate assessment surveys, contributed to library-wide DEIA efforts. As with many climate assessments in higher education institutions, whether the findings from this study will lead to organization-wide change remains unclear at this time. However, the researchers did apply what we learned from conducting this study in our own subject or functional areas. For example, the Graduate Student participants noted that the New Graduate Student Orientation (for library workers) did not provide adequate instruction regarding when to involve campus police. One researcher brought that feedback directly to the group responsible for the orientation, and the relevant orientation components were revised. There have also been incremental changes in recruitment and professional development across all employment categories. Examples include a renewed interest in providing formal mentoring to library tenure-system faculty, as well as establishing a process and additional funding for advertising open positions with associations of librarians of color.
Practical Implications & Value
The researchers learned valuable lessons from the process and findings from an assessment perspective. First, the researchers obtained human subjects research approval from the Institutional Review Board (IRB) to ensure the protection of participants’ confidentiality and privacy. Second, because the focus groups were conducted in 2022, while indoor masking and social distancing protocols were still in place in the state, the researchers learned how to construct a “COVID-19 Human Subjects Research Safety Plan” as part of the study. While such safety plans might no longer be required, it was helpful to learn about ways to reduce the risk of contagion in a physical research setting. Third, the researchers developed qualitative research training for research team members who were not familiar with the focus group method, and the entire team went through the training together before conducting the focus groups. The process familiarized the research team not only with the research protocol approved by the IRB, but also with the foundational values of a qualitative approach to climate assessment. Last but not least, by conducting thoughtful, listening-focused research on DEIA in the workplace, the researchers positioned themselves as trustworthy and DEIA-allied individuals. Being intentional about our work communicated to our colleagues that we as researchers were serious about our values. Structural change at scale is beyond the scope of assessment to resolve; however, assessment methodologies can be designed to enhance safety and well-being for participants engaged with challenging issues. These experiences underscore that assessment is a crucial part of the process of change, not apart from it.
Show Me the Data: Using Numbers to Drive DEI Progress in Libraries and IT
Katherine Furlong, Bucknell University
David Consiglio, Bryn Mawr College
Additional Authors:
Alexandrea Glenn, University of Pennsylvania
Craig Milberg, Willamette University
Wesley Ng-A-Fook, Barry University
Ellen Yu, Union College
View Slides (PDF)
Keywords: DEI, service assessment, diversity, quantitative survey methodologies
View “Show Me the Data” abstract
Purpose & Goals
Purpose: Many institutions strive toward diversity, equity, and inclusion (DEI) within their libraries and IT departments. However, assessing the effectiveness of specific services in achieving these goals can be challenging. This session aims to showcase the potential of quantitative survey methodologies to address this challenge, focusing on the experiences of students and faculty of color.
Goals:
- Highlight the importance of understanding user experiences across diverse groups: the session will emphasize the need to go beyond assumptions and gather data-driven insights into how students and faculty of color experience library and IT services.
- Demonstrate the utility of quantitative surveys in assessing DEI impact by focusing on these specific research questions:
- Do students of color experience library and IT services differently than their peers?
- Do students of color place different value on our services compared to their peers?
- How effectively do our services contribute to the achievement of academic goals for students of color?
- Do faculty of color experience library and IT services differently than their peers?
- Do faculty of color place different value on our services compared to their peers?
Participants will gain a deeper understanding of:
- How quantitative surveys can be designed and implemented to gather valuable data on user experiences.
- How this data can be analyzed to assess the impact of library and IT services on diverse user groups.
- The role libraries and IT departments can play in advancing institutional DEI goals.
Design & Methodology
This study utilizes data from the Measuring Information Services Outcomes (MISO) Survey to investigate potential differences in user experiences between students and faculty of color and their peers.
Data Source: The analysis draws on responses from over 150 institutions that have participated in the MISO Survey since 2014, resulting in over 170,000 responses (58,000 faculty and 114,000 students). The primary focus is on data collected between 2019 and 2024.
Survey Instrument: MISO surveys cover various service areas, but this study focuses on questions related to:
- Importance: How important users perceive specific services for their academic success.
- Satisfaction: Users’ level of satisfaction with the quality and effectiveness of various services.
- Demographics: Self-reported information on racial/ethnic and other relevant demographic characteristics.
The surveys utilize a four-point Likert scale for both importance and satisfaction ratings.
Data Analysis: Statistical analysis will compare mean scores between groups on each service point, categorized by the user’s racial/ethnic identity. A significance level of alpha = 0.01 will be used to identify statistically significant differences, and effect sizes will be calculated for services with significant differences to assess the magnitude of the disparity. The analysis aims to understand overall trends rather than focus on specific services. Therefore, instead of discussing individual services, it will (see the sketch after this list):
- Identify the number of services with statistically significant differences based on race/ethnicity.
- Examine the direction of differences (higher/lower importance, higher/lower satisfaction) across services with significant disparities.
- Analyze the magnitude of effect sizes in cases of significant differences.
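A minimal sketch of this kind of service-level scan, assuming survey responses sit in a pandas dataframe with one column per service and a boolean group column; the dataframe layout, column names, and choice of Welch's t-test with a pooled-SD Cohen's d are assumptions, not the MISO schema or the authors' exact procedure.

```python
# Hypothetical sketch: count services with significant group differences at
# alpha = 0.01, note the direction of each difference, and compute Cohen's d.
import numpy as np
import pandas as pd
from scipy import stats

def scan_services(df: pd.DataFrame, services: list[str],
                  group_col: str = "student_of_color", alpha: float = 0.01):
    rows = []
    for svc in services:
        a = df.loc[df[group_col], svc].dropna()   # students of color
        b = df.loc[~df[group_col], svc].dropna()  # white peers
        t, p = stats.ttest_ind(a, b, equal_var=False)
        pooled_sd = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
        d = (a.mean() - b.mean()) / pooled_sd
        rows.append({"service": svc, "p": p, "significant": p < alpha,
                     "direction": "higher" if d > 0 else "lower",
                     "effect_size_d": d})
    out = pd.DataFrame(rows)
    print(f"{out['significant'].sum()} of {len(services)} services "
          f"differ significantly at p < {alpha}")
    return out
```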
Findings
Across the board, nearly every library and IT service is viewed as more important by students of color than by white students. In addition, students of color indicate that our services contribute to the achievement of their academic goals more than their white peers indicate. This result is robust across every way the data has been examined, and all indications point to a phenomenon that spans the breadth of higher education: students of color are more likely than their white peers to consider library and IT services important. The high-level view shows that nearly all of the services have the same level of satisfaction for white and non-white students, and these results remain consistent across multiple levels of examination. This means that, at a general level, our students’ satisfaction with our services is not a function of their racial/ethnic identity. The faculty results provide some unexpected insights. Like students, faculty of color found the overwhelming majority of technology and library services to be more important than their white colleagues did. However, faculty of color are less satisfied with library services than their white colleagues are. This difference in satisfaction is limited to library services and spans nearly every library service measured; notably, there are no meaningful differences in satisfaction with technology services.
Action & Impact
This research using MISO data holds significant promise for promoting equity and inclusion within libraries and IT departments. By providing valuable insights and fostering data-driven decision-making, the study can positively impact the experiences of diverse user groups and contribute to building a more equitable and inclusive learning environment for all. As with all surprising findings, more questions are raised than answered by the data. We intend to spark a thoughtful, interactive conversation among session participants by raising the emergent questions and challenging them to seek the answers.
Practical Implications & Value
These are crucial findings. As institutions put more energy into meeting the needs of students of color, it is important to recognize the services library and IT departments provide. We knew our services were important to students and faculty, but students and faculty of color are telling us through these data that these services are even more valuable to them than we previously understood. This suggests that the library and technology services we already provide are critical to reaching the goals of our institutions’ DEI initiatives. By identifying services considered more crucial by faculty and students of color, libraries and IT departments can strategically prioritize resources and develop targeted interventions to bridge any service gaps. Additional research is needed for a deeper understanding of specific user experiences and requirements. However, by acknowledging and addressing the varying importance different user groups currently place on services, institutions can create a more inclusive and supportive environment for all.
Success Beyond the Stacks: Narratives of Former BIPOC Students in Academic Library Employment
Vashalice Kaaba, Florida State University
Additional Authors:
Kirsten Kinsley
Seol Lim
Draft Paper (PDF)
View Slides (PDF)
Keywords: BIPOC Students, Library Employment, Comparative Analysis, Assessment, DEI
View “Success Beyond the Stacks” abstract
Purpose & Goals
This paper, representing the fourth cohort in a series of cohort-based investigations, is dedicated to exploring the experiences and career trajectories of Black, Indigenous, and People of Color (BIPOC) alumni who worked part-time in or closely with the assessment department at Florida State University (FSU) Libraries during their academic tenure. This study investigates ‘How has part-time employment at FSU Libraries impacted the professional development and career paths of BIPOC alumni?’ and its significance in highlighting the often-overlooked narratives of the BIPOC community. Further, it compares the experiences and outcomes of Cohort 4 with Cohorts 1–3, examining any differences and their underlying factors. This secondary inquiry delves into the evolving impact and dynamics of part-time employment across cohorts, aiming for a deeper understanding of professional development trends within the BIPOC community at FSU Libraries. By concentrating on this cohort, the study aims to illuminate the unique experiences, challenges, and achievements of these alumni and to contribute a critical perspective historically underrepresented in library and information science research. This includes examining how their racial and cultural identities may have influenced their work experiences and, subsequently, their professional lives post-graduation. In addition to contributing to a broader understanding of the impact of part-time library employment on student workers, this paper aims to compare and contrast the experiences of the BIPOC cohort with those of previous cohorts. This comparative analysis will enable us to identify any evolving trends or changes in the library work environment over time, particularly concerning diversity and inclusion. Furthermore, the study seeks to inform policy and practice, offering actionable insights for FSU Libraries, and possibly other academic institutions, to enhance the work experience of current and future BIPOC student employees. These insights are crucial for developing strategies that ensure equitable opportunities and support, fostering inclusive and supportive work environments.
Design & Methodology
In the initial study series (cohorts 1–3) examining former student library employees’ experiences, a comprehensive method assessed how library work impacted their careers. Participants, chosen from student workers based on availability and willingness, were invited by the assessment librarian. Recognizing the potential for recruitment bias, measures were taken within the research design to address and mitigate it. Using qualitative research and semi-structured online interviews, the study offered detailed insights into the effects of part-time library employment on career trajectories. This established research design will be instrumental for the fourth cohort, ensuring consistent data comparability across groups. A Memorandum of Understanding (MOU) was employed as a substitute for a traditional informed consent form, following a determination by our university’s human subjects review board that classified the project as “Not Human Research.” This alignment with ethical guidelines was crucial due to the sensitive nature of the experiences shared by the participants. However, because we are changing the protocol to assess a particular population, we intend to reapply for IRB approval in order to meet human subjects ethical guidelines and to generalize our findings. Data collection involved a series of interview questions that spanned a range of topics, including the participants’ university study disciplines, career progression post-employment, current and future aspirations, and the pros and cons of their on-campus work. Focus was placed on their college and work experiences, challenges in balancing work and study, job skills applied in future roles, and desired learning opportunities. For the proposed fourth cohort, which focuses on BIPOC populations, the study will add questions incorporating the unique contexts and experiences of these individuals, including their experiences with mentoring, sponsorship, and networking at FSU Libraries, as well as the challenges and advantages they encountered as BIPOC individuals.
Conclusions
Our upcoming research is poised to explore the professional development and experiences of BIPOC students who are part-time employees. This study focuses on the fourth cohort, aiming to understand their unique experiences in comparison to the first three cohorts. Through this investigation, we anticipate uncovering the evolving dynamics within academic libraries and how they impact these students.
Anticipated Conclusions: Our research anticipates uncovering key insights into the experiences of cohorts 1–3 compared to cohort 4 at FSU Libraries, highlighting differences and similarities that reveal the evolving work environment and its impact on BIPOC students. This understanding will inform the development of policy recommendations for FSU Libraries and similar institutions, focusing on strategies to enhance support and foster inclusivity for BIPOC students and to address their specific challenges and needs.
Expected Research Outcomes: The anticipated outcomes of this study include identifying factors that differentiate the experiences of the fourth cohort from earlier groups. It will investigate how shifts in the library, university policies, and broader societal trends have shaped these experiences. A significant focus will be on the key aspects of BIPOC students’ library experiences, especially support mechanisms, mentorship opportunities, and professional development initiatives. We also seek to explore how our observations at FSU Libraries might align or contrast with trends at other academic institutions, thereby offering broader insights into the support of BIPOC students in academic settings. Finally, this research will open new avenues of inquiry about enhancing diversity and inclusion within academic libraries; this motivation underpins our intent to initiate a subsequent cohort (cohort 5) focusing on women in STEM, aimed at assessing their specific needs and challenges in male-dominated fields and determining strategies for the library administration to enhance their academic experience.
Implications & Value
The potential engagement of the community with this work is multifaceted, extending beyond the confines of academic discourse to practical, policymaking, and community-building spheres. Firstly, at the institutional level, particularly within FSU and similar academic libraries, this conceptual exploration invites a reevaluation of employment practices and support systems for BIPOC student employees. By highlighting theoretical insights into their unique experiences, this work can prompt library administrators and policymakers to adopt more inclusive and supportive strategies, thereby enhancing the work environment for a diverse student body. At a broader level, this study significantly enriches the discourse in library assessment and related areas, introducing a focused examination of diversity and inclusion within library employment. By exploring how racial and cultural identities intersect with student employment experiences and outcomes, it adds a crucial dimension to the existing body of work. This research highlights the potential role of the library in being a leader in supporting diverse and inclusive employment experiences for student workers. Such a contribution is particularly valuable in the current social climate, emphasizing the importance of understanding and supporting diversity in educational and professional settings. Additionally, the theoretical frameworks and hypotheses proposed in this study lay the groundwork for future empirical research. This research can further validate, refine, or challenge our understanding, enriching the body of knowledge in library and information science, and addressing the persistent gap in literature on the intersection of library assessment and diversity and inclusion within library employment. It also opens avenues for interdisciplinary collaboration, drawing insights from fields such as sociology, education, and organizational psychology, to build a more holistic understanding of the BIPOC experience in academic employment. Campus partners like career centers and student affairs departments are crucial in developing BIPOC employment best practices, sparking initiatives and discussions that foster an inclusive, diverse campus culture.
To Benchmark or not to Benchmark: Should We Do Peer Comparisons in Academic Library EDI Assessment?
Kawanna Bright, East Carolina University
Draft Paper (PDF)
View Slides (PDF)
Keywords: assessment, EDI, benchmarking, diversity
View “To Benchmark or not to Benchmark” abstract
Purpose & Goals
Efforts to collect data around and assess equity, diversity, and inclusion (EDI) services, practices, and initiatives in academic libraries have increased in the past five years, offering libraries opportunities to gain a better understanding of the impact of their work in these areas. However, unlike data collected through national and standardized efforts such as the Academic Libraries Survey (ALS), there have been no unified or agreed-upon efforts to collect and share EDI data. Most libraries that engage in EDI assessments enter into those processes individually, and there are no attempts to share the data or findings more broadly. This, of course, limits opportunities for benchmarking or comparison. But even if these data were available more globally, should they be compared and benchmarked like other types of nationally collected data?
Design & Methodology
The author, serving in a consulting role, had an opportunity to explore this question through a recent project undertaken by the Oberlin Group to utilize the author’s DEISAA instrument across multiple member libraries. As part of the planning, the author was asked to consider whether benchmarking and comparing the results of a shared EDI audit process would be beneficial. This conceptual paper takes a reflective look at the decision-making process the author undertook to help the Oberlin Group determine whether a benchmarking aspect would add value to their understanding of EDI efforts, or whether the intersectional and individual variables that impact EDI efforts would inhibit comparison of the data.
Conclusions
Based on this reflective exercise, which also included a review of the literature and experience analyzing multiple separate DEISAA submissions from other libraries, the author determined that a traditional benchmarking approach would not carry the expected benefits, but that there were potential positive outcomes to be gained from a limited benchmarking/comparison exercise. This reflective study raises additional questions about the value of benchmarking academic library data in general, as well as how libraries should approach applying elements of benchmarking specifically to EDI data.
Implications & Value
The implications of this conceptual paper and the findings shared revolve around the need for academic libraries to engage further in conversations around collection of EDI data, how this data is used, and what it means when we choose to compare ourselves across variables that may not be directly comparable.
Paper Session 11: Student Success & Engagement
At the intersection of information literacy and written communication: Assessing students’ source-based writing
Sarah Dahlen, California State University, Monterey Bay
Draft Paper (PDF)
View Slides (PDF)
Keywords: Information literacy, Citation Project, using information from sources, source-based writing, faculty-led assessment
View “At the intersection of information literacy and written communication” abstract
Purpose & Goals
While much library instruction focuses on finding, evaluating, and citing information from sources, another element of information literacy is the ability to effectively use information. In an academic context, students are expected to incorporate information from sources into their papers in order to meet the goals of the assignment. Doing this effectively requires appropriate use of quoting, paraphrasing, and summary, as well as the awareness of the purposes that information from sources can serve in written communication. To get a baseline measurement of how well our students were performing in these areas, we addressed the following assessment question: How are students using information from sources in their papers and what motivates their choices?
Design & Methodology
Our analysis of student work was inspired by some of the methods employed by the Citation Project. Our multi-disciplinary team of six faculty collected student papers from 300-level writing classes in six disciplines and coded them. Each instance of information from a source (n=204) occurring in each paper was coded by two faculty scholars, and any coding discrepancies were resolved through discussion. Cited information was coded as a direct quote, a paraphrase, or a summary, and each of these categories had additional codes related to the nature of its use and its appropriateness. Information regarding the presence of synthesis, the rhetorical purpose of the information, and its location in the paper was also recorded. These codes and categories provided a snapshot of how students are using information from sources at our institution. In order to gauge students’ perspectives on our initial findings and to better understand their motivations, we subsequently conducted focus groups with seven 300-level writing classes from five disciplines (n=100). These focus groups doubled as instructional workshops: we first had students respond to prompts about their practices and attitudes, and then presented our best practices for source-based writing. Coding of the resulting qualitative data was conducted in NVivo by a multi-disciplinary team of six faculty.
Findings
Students were primarily writing from sentences rather than sources. Direct quotes were often used when a paraphrase would have been more appropriate, and for the most part, did not conform to the “quote sandwich” format. Paraphrasing was largely successful, although over a third of paraphrases were patchwritten. Summary was used in 22% of the cases. Indirect citations made up 21% of our sample, which is a high percentage considering that this is a practice that should be seldom used. More concerningly, 71% of the indirect citations did not attribute the idea to the proper author. In most cases (88%), we did not see any evidence of synthesis. There were a number of different ways that cited information was used by students in their papers (background information, evidence, definitions, etc.). Student focus groups uncovered some negativity associated with assignments that require information from outside sources, with 50% of participants anticipating the task to be difficult or time consuming and 28% describing a negative emotional reaction. Participants identified the most challenging aspects as finding sources (60%), evaluating sources (57%), and citing sources (26%). They wished that instructors would provide greater detail in their instruction prompts (45%), along with examples (42%) and additional information related to finding, evaluating, and using sources (31%). Participants were able to identify some purposes that information could serve in their papers, such as supporting claims (42%) and presenting new ideas (23%). Students had a variety of explanations for when they use a direct quote versus a paraphrase versus a summary, but one theme that arose in each of these categories was efficiency. Students clearly value efficiency and often choose the method of source incorporation (direct quote, paraphrase, summary) that they find the most efficient for the task.
Action & Impact
Our findings prompted us to develop materials to support faculty instruction in these areas. Our “Source Guidelines Template” allows faculty to copy/paste appropriate guidelines for finding, evaluating, and using sources into their assignment prompts. We also created a slide deck for faculty as a starting point for discussing these topics with their students. Additionally, we are targeting 300-level writing classes for further library instruction sessions on these topics.
Practical Implications & Value
Through this presentation, the library assessment community will become more aware of assessment methods for evaluating student use of information from sources. Additionally, while our results are not generalizable to other institutions, they may indicate that this is an area librarians should address with their instruction. The use of information from sources is a competency that is infrequently addressed in the library literature, perhaps because it exists at the intersection of information literacy and written communication. Our findings indicate that many of our students are not receiving sufficient instruction in this area, suggesting that it could be a welcome addition to library instruction. Faculty at our institution have been open to (and grateful for) librarian-led instruction related to source-based writing.
Empowering Librarians to Support Students Navigating College with College Fluency
Elmira Jangjou, Ithaka S+R
Melissa Blankstein, Ithaka S+R
View Slides (PDF)
Keywords: College Fluency, College Navigation, Non-Curricular Needs, Community Colleges, Survey
View “Empowering Librarians to Support Students” abstract
Purpose & Goals
As the type of information that students seek has been radically reshaped by social, demographic, economic, and technological changes, librarians find themselves increasingly required to master an expanded array of nonacademic information skills alongside supporting students with academic and research-based inquiries. Even with orientation programs and first-year experiences or courses that introduce how to navigate college structures, students still may not know what programs and assistance are available to them or how to access them, which affects their well-being in addition to their academic success. These barriers to success, which existed prior to but have been exacerbated by the COVID-19 pandemic, are particularly relevant to first-generation and lower socioeconomic status students, groups disproportionately served by community colleges. Our research team has coined a novel term for this increasingly urgent phenomenon: college fluency, the knowledge and abilities that enable students to effectively access, utilize, and advocate for needed college services and resources. This IMLS-funded research initiative explores how leaders, faculty, and staff members both within and outside of the library respond to students’ non-curricular inquiries, how familiar they are with the services provided by other offices, and how they receive feedback and data to determine whether students have received proper support.
Design & Methodology
To gain comprehensive insight into students’ needs regarding college fluency and to explore effective ways libraries can support them, our project employs a mixed-methods approach. We began by conducting three qualitative case studies examining college fluency programs at institutions in different geographic regions, one of which was Sinclair Community College. The case study interviews were designed to investigate the factors influencing the development of college fluency, the experiences of librarians involved in crafting relevant programs, and the criteria used to assess their success and impact. Two analysts collaborated to analyze the interview transcripts, engaging in a comprehensive open-coding process to establish a thematic codebook; they then conducted a thematic analysis in NVivo to report the case study findings. The case study findings offered iterative, on-the-ground insights that informed our subsequent phase: a nationwide survey. We are in the midst of administering this national survey of community college librarians and others in student-facing roles across the college, assessing their own college fluency abilities and delving into their approaches to and perspectives on fostering college fluency among students. The survey instrument also draws on input from our team of project advisors. We tested the instrument via cognitive interviews with six individuals in the survey population to ensure the questions are clearly and consistently understood by respondents in a variety of roles and institutional contexts. Our sample is a random drawing of 5,000 individuals, and we expect 500 to 1,000 respondents by the time the survey closes in early April.
Findings
The case study findings shed light on the goals of college fluency initiatives, such as librarians’ efforts to connect with student spaces, build relationships across the college, and create a welcoming and inclusive environment on campus. Moreover, they delve into multiple challenges that disrupt the success of these programs (e.g., low levels of engagement, impacts of the COVID-19 pandemic, and staff turnover) and critical factors in effectively cultivating a culture of college fluency on campus (e.g., fostering collective responsibility and assessing success). We anticipate concluding our national survey in early April, with findings expected to be publicly available by July 2024. We anticipate sharing the findings from our case studies and survey with attendees to highlight applicable ways in which their own libraries can assess and expand their college fluency capacity. Between the case study and survey findings, we are confident that we will have impactful recommendations for the library community.
Action & Impact
Findings from the case studies and this novel national survey will directly support the development of actionable recommendations and steps for librarians to enhance college fluency initiatives, foster collaboration between departments, and provide professional development opportunities to enhance librarians’ own college fluency skills and knowledge. To do this, the findings of this research initiative will guide a workshop series aimed at strengthening librarians’ skills and knowledge of college fluency, directly benefiting the material well-being of underserved students, and empowering librarians to contextualize the findings within their own communities. Attendees will be among the first recipients of this newly developed workshop and will depart with a list of potential collaborators and actionable next steps for enhancing college fluency initiatives on their campuses.
Practical Implications & Value
We expect the academic library community to be able to turn the project’s actionable findings into practice. Academic librarians will gain insight into assessing students’ non-curricular needs and the role that college fluency may play in enhancing student success and overcoming barriers to persistence. Librarians will be able to build on potential models identified in the project findings to partner on college fluency initiatives and institutional efforts to address students’ holistic needs. Student success can be greatly enhanced through collaborative efforts across departments, particularly between student affairs and the library. These coordinated strategies are especially valuable for students facing challenges navigating complex institutional structures, such as first-generation students, English language learners, and returning adult learners.
Creating a culture of sustainable relational student consultation in support of library service assessment
Karen Munro, University of Victoria
Courtney Lundrigan, University of Victoria Libraries
View Slides (PDF)
Keywords: Student engagement, User-centered library services, Service evaluation, Qualitative data, Student consultations
View “Creating a culture of sustainable relational student consultation” abstract
In Fall 2023, as part of a broader range of assessment and engagement activities, UVic Libraries began strategic consultations with student groups and student leaders. Our goals were to identify students’ needs, desires, and awareness of library offerings; to more systematically and sustainably gather qualitative data about student use of the library; and to build relationships with students in an intentional and accessible way.
One key approach we adopted was a two-question survey strategy. We used the same two high-level questions for multiple consultations with different student groups over the academic year. We asked these questions at both invited and proposed outreach opportunities, including orientation events, academic writing events, and meetings with student governance bodies. Our preferred method was administering the questions via an online survey, allowing students to give us their feedback in their own words while also providing them with time to think fully about the questions. This produced qualitative data that was both rich (i.e., permitting depth and variability) and somewhat comparable (i.e., structured around consistent prompts).
Our initial findings helped to confirm and deepen our understanding of students’ experiences in and awareness of library services. Students tended to volunteer the most information about their experiences with self-serve resources like bookable spaces, digital collections, and equipment loans, and less about their experiences of mediated services like research help. At the same time, we found differentiated perceptions across user groups like international, Indigenous, and graduate students. Students responded positively to our consultations, signaling the success of a relational approach to assessment and engagement. Finally, students expressed a desire for more communication about offerings from the Libraries.
This paper describes concrete actions we have taken in response to what we learned. They include new engagement strategies and service offerings, adjustments to core student services, routinized outreach, and ongoing funding for student positions. It also describes the initial impact of this work, including improved relationships with student leaders and groups, improved relationships with the International Centre for Students, improved social media engagement with students, and improved communication with students.
Key findings from our review of the consultation data include insights into general and differentiated student awareness and uptake of library services and resources; greater understanding into barriers and gaps for students using the library; and greater understanding of how students prefer to receive library communication.
Using Data Parties to Engage Students in the Survey Lifecycle
Angela Zoss, Duke University Libraries
Additional Author:
Joyce Chapman, Duke University Libraries
View Slides (PDF)
Keywords: participatory design, survey analysis, generating recommendations, student engagement
View “Using Data Parties to Engage Students” abstract
Purpose & Goals
The biennial Duke Libraries’ student satisfaction survey was conducted in 2023. In previous survey cycles, assessment staff produced lengthy lists of potential recommendations based on survey data for review and ranking by library staff and administration. This year, we decided to try a new approach to give students a seat at the table, not only in providing feedback to the Libraries, but also in analyzing survey findings, ranking problem areas, and suggesting solutions.
Design & Methodology
We conducted two Data Parties, one for undergraduates and one for graduate students. Participants were offered an incentive (a $25 Amazon or restaurant gift card) as well as snacks during the event. Email invitations were sent to the 437 students who had provided their contact information in the 2023 biennial student survey, agreeing to be contacted about future feedback opportunities with the Libraries. The email provided eight date and time options. Thirty-eight students volunteered, and 14 additional volunteers were garnered through the Duke International Student Center newsletters and library social media accounts, advertising on the library homepage, and tabling outside the library coffee shop with candy. During the Data Parties, a series of data visualizations was posted at five topical stations around a large conference room. Students were split into small groups, and each group was provided with a worksheet to complete as they moved through the stations. Students had roughly ten minutes per station, half of which was spent examining the data individually prior to discussing the data as a group and completing the worksheet. At each station, students were asked to consider the following questions and write down their answers on the worksheet:
- What, if anything, surprises you about the data?
- Do you notice any other patterns?
- What more do you wish you knew or what additional information do you wish you had?
- Given the data, what are the problems or issues that exist for the libraries in this area?
Following the small group work, students came together into a single group with staff moderators. Students generated a list of problems on a whiteboard, which they then ranked with colored post-its as having high, medium, and low impact. Then, they brainstormed solutions to the problems on a second whiteboard.
Findings
The structure of the Data Parties worked well to engage students in discussions about the survey data and to generate high-priority solutions. Students generated new ideas that would likely not have been included in a staff-only recommendations process. A post-event feedback form indicated that students enjoyed talking with peers about the libraries and brainstorming solutions, and they overall found the visualizations useful. One major challenge was recruitment and participation. Despite slots filling up quickly, only half of the graduate student volunteers attended the event. We used that information to increase our recruitment efforts for the undergraduate event, but we expect to need to “oversell” events like this in the future. We also tried to make it easy for students to participate by keeping the event short and avoiding pre-work. Graduate students, however, seemed to want to explore the data in more depth and might be willing to participate in a series of discussions instead of a single event. With a single, two-hour event, students see only a staff-curated view of the data, which prevents them from exploring the data deeply and generating their own insights. We also found it difficult to juggle gathering feedback from both students and other library stakeholders. This method of engaging students in the analysis process had the unintended result of generating suggestions that did not get reviewed by the broader library staff. In the future, it may be better to treat the process as three phases that each need both staff and student feedback: analyzing survey data, brainstorming recommendations, and prioritizing those recommendations.
Action & Impact
We are entering a new strategic plan cycle, and we expect a lot of changes in the libraries over the next few years. Our plan is to reflect on our new priorities and what we have learned from our biennial surveys, then redesign our survey instrument and analysis process. Changes we are considering include: lengthening the cycle to one survey every three years, redesigning the survey to reduce its length and ensure coverage of high-priority topics, expanding our engagement with students during survey analysis, using the Data Party format for staff data exploration events as well, and making sure our recommendations are focused and reflective of data from both the student and staff perspectives.
Practical Implications & Value
Our focus on participatory design was inspired by presentations and workshops at prior LAC conferences, and we hope that others will use and expand on our methods. The methodology described here could be repurposed for a variety of data sources. Events like Data Parties serve many purposes: most importantly, they bring a student perspective to the table, giving students a greater voice in library decision-making around services and providing a new perspective for analyzing data and brainstorming solutions. Additionally, Data Parties are an engagement, advertising, and outreach opportunity. As more libraries begin testing methods for engaging patrons in the design of services, we hope to continue sharing ideas and lessons learned.
4:20 p.m.–5:20 p.m. | Concurrent Session 6
Paper Session 12: Collaboration & Community Building
Building Cross-Campus Collaborations
Maggie Faber, University of Washington
Jackie Belanger, University of Washington Libraries
Additional Authors:
Jillian Morn, University of Washington
Courtney Berger Levinson, University of Washington
Fer Palomares Carranco, University of Washington Libraries
Draft Paper (PDF)
View Slides (PDF)
Keywords: Institutional Research, Campus partnerships, collaboration, building relationships, professional practice
View “Building Cross-Campus Collaborations” abstract
Purpose & Goals
This paper engages with the issue of how library assessment practitioners can build community with institutional partners. In the recently revised ACRL Proficiencies for Assessment in Academic Libraries, librarians are encouraged to engage with campus partners in a variety of ways:
- “Collaborate and partner with individuals or groups such as institutional research; teaching, learning or research centers; information technology units; and other assessment offices.”
- “Advocate for resources, support, and inclusion of the library in institutional assessment initiatives.”
This paper will explore the question: what does this work look like in practice? At many institutions, assessment efforts can be dispersed across different units (and even campuses), making it challenging to forge relationships, share expertise, and avoid duplicative efforts. This paper explores lessons learned at one large, highly decentralized research university through efforts to break down organizational silos in university assessment activities. We will discuss the history and impact of these efforts and show how others could implement similar community building initiatives on their campuses. The efforts the presenters and university partners have used in recent years include: 1) creating a Campus Assessment Working Group; 2) setting up a formal, centralized review process for surveys; and 3) forging connections with adjacent groups on campus to create connections between those working with student data, data visualization tools, institutional research, and assessment activities. The paper will share successful (and not-so-successful) ways to identify organizational barriers to collaboration and develop strategies to navigate these barriers in order to build relationships. Attendees will come away with potential approaches they could apply in their own contexts. The presentation will also engage participants by posing questions about how they connect with colleagues at their institutions (is there a central assessment committee, for example?), and invite conversations about strategies attendees have used for building partnerships outside the library.
Design & Methodology
The paper takes a case-study approach, drawing on the experiences and reflections of multiple authors from different campus units over a six-year period. The paper also draws on a literature review focused on cross-unit collaboration in higher education, as well as feedback from partners gathered throughout the activities described above.
Conclusions
The efforts to build connections and break down organizational barriers between those involved in assessment have resulted in multiple benefits. The development of personal and professional relationships across departments has been integral to the success of this work, and the results include a greater understanding of the broader institutional context for individual departmental work, more effective data sharing, and collaborative skills development. Perhaps even more importantly, these efforts have also helped to surface and center equity-informed assessment practices across the campus. We have learned that using multiple strategies to build community can amplify the effects of these efforts and increase connection among the assessment community, but that sustaining this work over time can be challenging. One key question that we will continue to pose as we do this work is: what are the most effective strategies that are also sustainable for participants and leaders over time?
Implications & Value
The paper is co-authored by librarians and assessment/institutional research professionals from two other campus departments, and we believe hearing the perspectives of institutional partners can be beneficial to the library assessment community in exploring strategies for community building. The paper contributes to the overall body of work by providing concrete strategies for forging strong campus assessment partnerships and addressing important proficiencies for library assessment practitioners.
Partnering with Alumni Donor Board: Leveraging outside expertise to enhance a donor engagement initiative
Steve Borrelli, Penn State University Libraries
Robin Tate, Penn State University Libraries
Additional Author:
Leigh Tinick
View Slides (PDF)
Keywords: Development, board member expertise, assessment, appreciative inquiry, the concept of the knowledgeable stranger
View “Partnering with Alumni Donor Board” abstract
Purpose & Goals
In the Summer of 2021, the University Libraries Development Office contacted Library Assessment for assistance evaluating their new fundraising initiative, Donor Community Meetings. Donor Community Meetings (DCMs), offered monthly beginning in the Fall of 2020, were conceived to bring like-minded donors and prospects together to build community and generate deep and sustained interest. Approximately five to 10 donors attended each meeting, and up to five library employees presented at these meetings. Typically, the meeting was hosted by a Library Development Board (LDB) member, who also participated in preparing speakers for the event. While the initiative slowly gathered momentum, the Development Office and LDB wanted to identify concrete ways to grow these nascent coalitions.
Design & Methodology
Library Assessment, in partnership with Development, organized a series of focus groups with Development Office staff, librarians who had presented at the meetings, LDB members, and selected donors and prospects to investigate 1) what participants like about the DCMs; 2) what kind of content donors and prospects find engaging; 3) what suggestions library stakeholders have for improving DCMs; and 4) how the meetings can be structured to sustain the continued interest of participants in 2022 and beyond. Six 90-minute online focus group sessions were conducted on Zoom in the early fall semester with a total of 30 participants. Participants were asked a series of up to 10 open-ended questions depending on the stakeholder group. Two members of the research team conducted each session, one facilitating and the second observing and taking notes. Questions were sent to participants in advance of each focus group. All sessions were recorded and transcribed by the research team. A verbal consent process was used in all sessions, and participants were informed that the transcriptions and recordings would only be made available to members of the research team. In response to the findings and recommendations resulting from the focus groups, the Development department formed a task force composed of LDB members, Development Office personnel, senior library administrators, and Library Assessment personnel to transform recommendations into implementable actions. In the Fall of 2023, researchers followed up with the Development Office to re-evaluate the Donor Community Meeting initiative. Two development colleagues involved with the task force project initiatives were interviewed. This paper presents the initial study and results and closes the loop by following up, after a year of implemented recommendations, to evaluate the overall success of the collaboration.
Findings
The LDB focus groups generated recommendations for DCM improvements and identified nine areas for intervention and improvement. LDB focus group responses were used to develop implementable action plans to improve DCMs. Task force members then used their expertise and tacit knowledge to create task lists and methods to operationalize and deploy the interventions. As a result, LDB members and the Development Office rebranded DCMs as “Library Discovery Hours.”
Action & Impact
Due to the aftermath of COVID, the Library Discovery Hours (LDH) did not launch until January of 2023. Attendance was robust across the six LDHs presented in 2023. Comparing sessions held before 2023 (with attendance of less than 10 people per session) to those held in 2023, attendance increased by over 900%, well above the donor attendance goal of a 200-300% increase by year-end 2021. The six Library Discovery Hours in 2023 represented changes to practice due to LDB members’ involvement and expertise, improvements in consistent offerings, and improved communications and attendance. Additional recommendations, including taking LDHs “on the road” to highlight the Commonwealth Campus Libraries, will begin in 2024.
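For readers checking the arithmetic, percent increase is (new − old) / old × 100. The per-session attendance figures in the sketch below are illustrative placeholders, since the paper reports only a baseline of “less than 10”:

```python
# Illustrative figures only: the abstract gives a baseline of "less than 10"
# attendees per session and a reported increase of over 900%.
baseline_per_session = 8   # assumed pre-2023 DCM attendance
current_per_session = 85   # assumed 2023 LDH attendance

pct_increase = (current_per_session - baseline_per_session) / baseline_per_session * 100
print(f"Percent increase: {pct_increase:.0f}%")  # ~963% with these assumed figures

# For comparison, the original goal was a 200-300% increase by year-end 2021.
```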
Practical Implications & Value
Transforming “Donor Community Meetings” into the “Library Discovery Hour” would not have been possible without the perspectives of LDB members. Board members are high-achieving individuals who recognize that when operating as members of the LDB, they are participating in a community that provides a valued opportunity to learn with and from other highly successful individuals. The Penn State Libraries recognize that our Board members are partners in success, eager for active involvement and to share more than their time and financial resources to better our libraries. Focusing their energy and enthusiasm on a library problem that they are equipped to address provided an opportunity for each to make contributions aligned with their expertise, further vesting them in the success of the initiative and the libraries. While it will take some time to evaluate the ultimate measure of success, whether participation in LDH leads to increased donations, indicators of the collaboration’s success are many. The rebranding of the Library Discovery Hour, combined with enhanced marketing and communication strategies and improvements to meeting structures and topics, has resulted in vast improvements in registration and attendance, catalyzing advocacy for a dedicated FTE; interest among library personnel in attending LDH events and delivering future programming further illustrates the success of the effort. Post-event evaluations provide positive feedback on audience member experiences.
Open House: Socializing Assessment and User Experience within Your Library
Katherine Ahnberg, Princeton University Library
View Slides (PDF)
View “Open House: Socializing Assessment and User Experience” abstract
Learning Outcomes
The AUX department at Princeton University Library defined the following learning outcomes before we held our open house. The LOs in this paper will support attendees in drafting their own goals and creating basic prototypes of activities and info stations for a similar effort at their home institution:
AUX Open House Learning Objectives
1. Participants should know whom to contact, and how, in order to start engagement with the department.
2. Participants should have fun in order to reduce the anxiety of “assessment” and of engaging with the department.
3. Participants will be able to articulate 3-5 types of assessment or user experience activities in order to identify methods that might be opportunities for assessment in their work.
4. Participants should be able to define assessment in order to understand the purview of the Assessment and User Experience department.
5. Participants should be able to identify the benefits and added value of assessment and user experience in order to engage with the department.
Detailed Outline
In the summer of 2023, the Assessment and User Experience (AUX) department held a fun, informal open house for Library staff to come and go, meet the department’s staff, and learn about this new department at the Princeton University Library. The Open House featured both mediated and unmediated stations related to different aspects of assessment and user experience. There was no agenda and no formal presentation. Instead, we worked to remove barriers to asking questions or proposing ways that our department might work across the large, multi-branch organization that is our system. Examples included creating a library facts trivia wall, repurposing a prize wheel as a randomizer for an on-the-spot shared-results survey, and a hands-on chance to work with in-development AUX data dashboards. We learned a lot about the needs of our colleagues in a few short hours, and we’d love to share our insights into what works. An early success: our three-person team engaged with over 70 employees from all walks of library service and generated invitations to follow up with staff in a low-key way that has continued to expand and extend our support for PUL well past our initial time investment. Participants will gain insight into the hands-on stations AUX staff created to spark conversation, invite immediate feedback, demonstrate upcoming projects, and welcome our colleagues to further connect with department staff as individuals and with library assessment on the whole. This paper will support attendees in creating a similar open house event, offering an interactive environment that will provide tools for identifying key questions, projects, and activities in their role or department that translate to staff in-reach.
Using assessment as a tool for relationship-building: proving need, gaining traction with your strategic goals, and demonstrating a dedication to equity
Michael Harris, Utah State University
Lindsay Ozburn, Utah State University
Additional Authors:
Erin Davis, Utah State University
Kacy Lundstrom, Utah State University
View Slides (PDF)
Keywords: Assessment scaffolding, equity-based assessment, space assessment, holistic assessment
View “Using assessment as a tool for relationship-building” abstract
Purpose & Goals
This paper details the purpose, methods, and results of a six-year phased assessment of collections and space at the Utah State University (USU) Blanding library, a small academic library serving a diverse Indigenous American population in the Mountain West. Despite its merger into the USU Statewide system, the library has seen few updates compared to the main campus library in Logan, the Merrill-Cazier Library. Librarians from Logan, recognizing the equity issues at play, conducted comprehensive assessments to address the library’s collections use, space utilization, and technology concerns. These assessments involved extensive interviews, focus groups, collections analyses, and space assessments. The findings highlighted the need for a holistic overhaul to better facilitate student success and foster stronger connections to the wider USU Libraries network. The Logan campus library lacks administrative purview over the Blanding library but shares non-monetary resources, functioning as a familial library. This lack of administrative purview complicated the implementation of changes, raising questions about decision-making, funding, and project management for the Blanding library. These uncertainties slowed the assessment and planning process, requiring thorough assessments to build trust among stakeholders (USU Logan and USU Statewide Campus librarians, Blanding administrators, and Blanding librarians) in each other’s professional competence and intentions. This paper will detail: the holistic, phased assessment process that occurred over six years, going from no traction to a solid commitment to change (the process included focused ethnographic interviews, focus groups, a collections analysis, and space assessments); the interplay between space and physical collections analysis (and an eventual weeding) in a small library; and the deep stakeholder engagement employed to ensure the assessment process was equitable, sensitive to cultural differences and needs, and built trust.
Design & Methodology
We took a multi-pronged, scaffolded approach to assessing needs at the USU Blanding campus library and building stakeholder trust, including:
- Focus groups with students, faculty, and librarians.
- Focused ethnographic research via multiple site visits to the USU Blanding campus.
- Informal space assessments inside the library, such as sticky note assessments for gathering student feedback.
- Multiple relationship-building meetings with USU Blanding’s administrative team and librarians, asking what they want, what they need, where their thoughts are at, and how (and whether) they feel their current approach is contributing to student success.
- Physical collections analysis.
All of these assessments resulted in multiple reports of recommendations and potential projects, broken down into smaller processes and “staged” for implementation. For example, we continued the scaffolding idea by breaking down a collection weeding process into more manageable pieces for a small team (e.g., first determine whether they want to weed based on reclaiming square footage, reducing the collection by x%, or some other factor). More manageable, staged project processes were less daunting and more feasible for a library of their resource and staff size.
Findings
- When dealing with an interplay between collections and space changes, particularly in a familial institutional relationship without formal oversight, deep stakeholder engagement was absolutely paramount for success.
- Proof of need based on data collected via structured, holistic, and well-designed assessments was key to convincing stakeholders to move forward with change.
- Relationship- and trust-building was absolutely paramount when performing assessments. We didn’t want the stakeholders to feel any amount of angst or threat, which is what can happen when any assessment is implemented.
- Working with, not against; listening, not overstepping; presenting options, not ultimatums; and approaching the process with a high-level goal in mind (to increase student success through spaces and curated collections) helped to ensure our assessment data were used for equitable decision-making.
Action & Impact
Action: Each assessment within this scaffolded process led to the next. For example, stakeholder need evaluations led to ethnographic interviews, which led to physical space assessments, which led to collections assessments. After each assessment, the research team presented its findings and recommendations for moving forward to the Blanding stakeholders, administration specifically, and simultaneously planned the next assessment phase. Each meeting reevaluated or reconfirmed goals, timelines, available resources, and any new stakeholders that needed to be involved. Currently, the team is fleshing out the collection weeding process and building more staged project processes for the Blanding team. This scaffolded assessment process will soon be applied to other USU statewide campuses to determine how the libraries can better integrate with non-residential campuses (without physical libraries) to further student success.
Impact: On its face, our project integrated the library in addressing critically unmet needs at a statewide campus that serves a large underrepresented population. From a process perspective, our project built strong relationships and trust where they were previously strained due to tensions over the campus merger several years past. Where stakeholders were previously very suspicious of us and our intentions, our assessment processes proved both that the Blanding library could address unmet needs on its campus (more study space, social interaction opportunities, access to newer resources, etc.) and that our primary goal was to work with them to advance student success.
Practical Implications & Value
Our community often discusses how to utilize assessment in very tangible ways to demonstrate impact and need. But what about the intangible, humanistic component of assessment? In other words, what impact do our assessments, and our approach to assessment, have on the relationships we must cultivate in order to build trust around our work and analyses? This paper will contribute to discussions about how to approach building trust with assessment processes and an equity mindset in order to increase the overall effectiveness of your efforts.
Paper Session 14: Open Access
A Multi-method Analysis of Faculty Perspectives on Open Access Publishing
Lori-Ann Tschirhart, University of Michigan
Craig Smith, University of Michigan
Additional Authors:
Alexa Pearce, University of Michigan
Yulia Sevryugina, University of Michigan
Nancy Allee, University of Michigan
Draft Paper (PDF)
View Slides (PDF)
Keywords: Open Access, Scholarly Publishing, Faculty Engagement, Faculty Perspectives, Article Processing Charges
View “A Multi-method Analysis of Faculty Perspectives on Open Access” abstract
Purpose & Goals
The purpose was to enhance current understanding of faculty perspectives, needs, and motivations related to open access (OA) publishing, both within the university library system and among colleagues in other campus units. While the library offers long-standing services and expertise to support faculty across many aspects of scholarly communication, the acceleration in OA publishing, as well as the adoption of significant and relevant policy updates and public access requirements by governments and funding agencies, has presented a timely opportunity to review corresponding library service models and levels of investment. As we continue to explore and evaluate the viability of a range of models for open scholarship support and advancement, it is imperative that we center faculty needs and perspectives in our analysis. By surveying and interviewing faculty authors regarding their OA experiences, we aimed to learn how local faculty are impacted by recent changes to the scholarly publishing landscape, including changes to publisher policies, and related institutional efforts to relieve the financial pressures presented by OA Article Processing Charges (APCs). By inquiring about their attitudes regarding OA and the changing publishing landscape, we also aimed to understand more about their overall publishing and access barriers, concerns, and insights. The studies have contributed to our overall efforts to engage our research community in an ongoing conversation related to open research and scholarship. We have designed these engagement efforts to inform our current and future service planning, especially given the potential cost implications of OA publishing agreements, including “transformative agreements,” for library acquisitions budgets.
Design & Methodology
To explore faculty experiences and opinions regarding OA publishing, our paper features two independently planned studies (one used a survey, the other interviews) that together enrich understanding about our institutional OA landscape. The survey was distributed via email during fall and winter 2021–22, and the interviews were conducted one-on-one via Zoom during the summer of 2021. Analysis includes 233 survey responses and 14 interviews. The survey included faculty from all disciplines, ranks, and tracks, and the interviews were with tenure track faculty from two natural sciences departments. The survey was structured around the following themes: 1) awareness of and attitudes about OA publishing 2) OA formats 3) library services 4) editorial experience 5) demographic information. The interviews were structured around themes of: 1) general impressions of OA publishing venues 2) predatory publishing 3) library support 4) decisions that influence publication choices. Overlapping themes in both studies included 1) interest and experience in OA publishing 2) motivations and reservations related to OA publishing 3) OA processing charges 4) the library’s role in the future of OA. Coding schemes for interview and open-ended questions were developed via an inductive process during conversations held within the research team. For coding categories, the agreement between two raters was evaluated against a Cohen’s kappa (κ) benchmark of .70 or greater. In a few cases where κ < .70, the authors revisited the approach to coding for the relevant categories and conducted independent coding again to ensure that interrater reliability was acceptable. All statistical processing was conducted using SPSS.
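A minimal sketch of the interrater check described above, using scikit-learn’s implementation of Cohen’s kappa; the codes and excerpts are hypothetical:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical codes assigned by two raters to the same ten interview excerpts.
rater_a = ["benefit", "barrier", "benefit", "neutral", "barrier",
           "benefit", "barrier", "neutral", "benefit", "barrier"]
rater_b = ["benefit", "barrier", "benefit", "barrier", "barrier",
           "benefit", "barrier", "neutral", "benefit", "neutral"]

kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa: {kappa:.2f}")  # ~0.69 with these toy labels

# Mirror the authors' decision rule: below the .70 benchmark, revisit the
# coding approach and code the category again independently.
if kappa < 0.70:
    print("Below benchmark: revisit coding approach and re-code independently.")
```

With these toy labels the benchmark is narrowly missed, which would trigger the authors’ re-coding step.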
Findings
Both studies established that faculty are knowledgeable about, and generally supportive of, OA publishing with caveats. Participants described perceived benefits of publishing OA, including broadening readership, potential to increase impact, and opportunities for career progression. OA author fees emerged as a primary and substantial barrier. Among survey respondents, the most common motivations for OA publishing included: increasing equity in access to knowledge; increasing international audiences; meeting funder requirements; and getting more citations. Interview respondents cited increased research visibility and potential to boost career progression as main motivations. The most common reservations among both studies’ participants were: steep publishing fees, the view that authors should not have to pay to publish, and concerns about quality. The survey highlighted equity as a top motivating factor in relation to expanded access to knowledge. A more nuanced perspective emerged through analysis of both studies. Interview participants described ways that OA presents challenges to equity. Survey respondents universally indicated that OA interest in their primary fields of research was moderate to substantial and growing. They reported prior positive experiences publishing OA across multiple formats. Participants in both studies indicated a desire to publish OA in the future. The journal article was the most likely format for which respondents reported prior OA experience, with an average reported APC payment of $2,458. Interview participants indicated that APCs are most frequently paid with grant money, followed by discretionary accounts and departmental funds. Interviewees also indicated that the library could support them by negotiating better terms with publishers and prioritizing publishing agreements with trusted scholarly societies. Our faculty recognize many affordances of sharing their work openly, tempered by the high cost of OA publishing. In principle and practice, faculty described dissatisfaction with the APC model. Our collective findings showcase the complex considerations that characterize faculty perspectives on OA.
Action & Impact
Since conducting these studies, our library has incorporated faculty perspectives into internal planning activities and conversations with partner units. Owing to our university’s decentralized environment, we have necessarily socialized our understanding in distributed ways, with connections to relevant school, college, and departmental initiatives. We have shared findings with faculty across disciplines and administrative groups, including the library’s Faculty Council, the Engineering Faculty Library Advisory Committee, and the Research Associate Deans Committee. Engaging around the findings has served as a productive avenue for continuing campus conversations, as we learn more about points of resonance, agreement, and inquiry. The library’s Collection Strategy Steering Team incorporated the concept of an open research and scholarly ecosystem as a pillar of its work for the next several years and charged an Open Ecosystem Subcommittee (OES) with several initiatives, including: 1) building a knowledge base of current OA publishing agreements and 2) developing criteria for evaluating publisher agreements. In addition to the primary charge, the OES has initiated several companion projects to enhance the library’s promotion of OA-relevant content, including the discoverability of waivers and discounts for APCs. We have been more attentive to our messaging and support avenues and have created a dedicated email address for OA publishing questions, which has quickly become a high traffic pathway for consultation and engagement. We have transitioned from pilot phases to longer term agreements for some read and publish agreements. We have leveraged consortial initiatives via the Big Ten Academic Alliance, as well as some smaller institutional agreements with society and commercial publishers. We continue to explore options for agreements with several other publishers with whom our university community frequently publishes. Even with well-established OA publishers, current fee structures continue to be a barrier to sustainability, and increased work toward mutually beneficial cost models is needed.
Practical Implications & Value
This work is timely and relevant to academic libraries as they seek to effectively manage transitions in scholarly publishing toward a more open landscape. This two-study paper highlights faculty perspectives at a major public research university and identifies a series of approaches for advancing OA campus conversations and library initiatives. The study features two approaches, a campus-wide survey and interviews with faculty, that can readily be adapted by other institutions for gathering similar information and informing data-based decision making. As a unit that supports authors at various stages of their careers and from boundary-spanning disciplinary areas, the data we collected will be helpful in highlighting author decision-making processes related to publishing outlets and formats. Knowing more about how authors think and feel about OA publishing options can aid library subject experts as they provide information and referrals to authors in their liaison areas. Our research also highlights that the library’s publishing and OA-related services are unknown to many authors on our campus. This finding points the way toward new and broader communication efforts and also elevates planning on potential new library roles and services. Faculty made substantive recommendations for how librarians can support open access publishing initiatives, including the following areas:
- Negotiating better terms with publishers
- Prioritizing deals with trusted professional societies
- Lobbying for improved funder policies
- Developing publishing and writing workshops
- Supporting OA repositories
- Investing in systematic, rigorous analysis of usage statistics
- Educating and creating knowledgeable academic constituencies
Additionally, knowing more about the OA-related challenges that authors encounter can position the library to advocate at the institutional level for sustainable solutions and can illuminate discussions with publishers as libraries evaluate subscription offers and licensing agreements.
“Transform to Open”: Analysis & Insight for Transformative Agreements at the University of Miami
Kineret Ben Knaan, University of Miami
Additional Author:
Lisa Fish, University of Miami
View Slides (PDF)
Keywords: Transformative Agreements, Read and publish, Open Access, Article Processing Charges (APC), Publishing patterns, Data-driven practices, Contract negotiation
View “Transform to Open: Analysis & Insight for Transformative Agreements” abstract
Transformative Agreements (TAs) are bringing about significant changes in the roles of academic librarians. These agreements, which are often managed by acquisitions and assessment professionals, focus on payment for publishing rather than payment for reading. As costs shift from traditional subscriptions to fees that cover both reading and publishing, librarians need to develop mechanisms and best practices for collecting and evaluating various data points and metrics. These metrics include publishing patterns, cost inflation, usage statistics, funding support, and other relevant data. The insights gained from these analyses are essential for negotiating contracts and tracking the ongoing implementation of new agreement models. The University of Miami (UM) Libraries has entered into several TAs with different scholarly publishers. The primary objective of these agreements is to encourage open-access publishing and to reduce the cost to UM authors of publishing open access. In parallel, UM Libraries has also ensured that these agreements are aligned with our institution’s publishing interests and priorities and that these agreement models are cost-transparent and attained at a sustainable price. In this paper, we present the review, data collection, and analyses conducted by our team to evaluate TA opportunities offered by publishers either directly or through consortia. We aim to discuss the insights from the analyses, the importance of institutional publishing interests, and the weight of each in decision-making when pursuing TA offers. Based on our experience at UM Libraries, this paper aims to achieve two key goals: 1. Demonstrate the value of data-driven practices to evaluate TA offers. These practices include consistent data collection, establishment of key metrics for evaluation, and standardized analysis processes. 2. Demonstrate the significance of institutional publishing interests and their importance in decision-making when pursuing TA offers.
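As one illustration of the kind of data-driven evaluation the paper advocates, the sketch below compares a publisher’s proposed TA fee against estimated current spending (subscription cost plus author-paid APCs) and derives an effective cost per OA article. All figures and the decision heuristic are assumptions for illustration, not UM data:

```python
# All figures are illustrative assumptions, not University of Miami data.
current_subscription_cost = 180_000   # annual read-only subscription
articles_published_per_year = 60      # institution-authored articles with this publisher
avg_apc_paid = 2_500                  # average APC paid out of pocket by authors

proposed_ta_fee = 290_000             # publisher's read-and-publish offer

estimated_current_spend = (current_subscription_cost
                           + articles_published_per_year * avg_apc_paid)
cost_per_oa_article = proposed_ta_fee / articles_published_per_year

print(f"Estimated current spend: ${estimated_current_spend:,}")
print(f"Proposed TA fee:         ${proposed_ta_fee:,}")
print(f"Effective cost per OA article under the TA: ${cost_per_oa_article:,.0f}")

# One possible decision heuristic: pursue the offer only if the TA fee
# does not exceed current combined spend.
if proposed_ta_fee <= estimated_current_spend:
    print("Offer is at or below current combined spend.")
else:
    print("Offer exceeds current combined spend; negotiate or decline.")
```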
6:30 p.m.–8:30 p.m. | Poster Session & Reception
Join us at our Poster Session and Reception, with over 70 posters on a variety of topics—including teaching and learning/student success, collections, spaces, and assessment methods. The session provides an opportunity to engage with poster presenters and connect with each other over delicious food and drinks and an evening of community networking.
Saturday, November 9
9:00 a.m.–11:00 a.m. | Concurrent Session 7
Paper Session 15: Program Evaluation
Adapting the Program Review for the Academic Library
Amy McLay Paterson, Thompson Rivers University
Elizabeth Rennie, Thompson Rivers University
Additional Authors:
Joey de Costa
Erin May
Franklin Sayre
Draft Paper (PDF)
View Slides (PDF)
Keywords: Program evaluation, quality assurance, collaborative self-study
View “Adapting the Program Review” abstract
Purpose & Goals
Academic programs at Thompson Rivers University are evaluated every seven years through an intense, faculty-led program review process involving curriculum mapping, stakeholder surveys, a SOAR analysis, a collaborative self-study, and an external review. The TRU library had previously received external review reports in 1998 and 2007 but had never participated in the full program review process. In 2023–24, however, the library made its first foray into the academic Program Review. It was considered something of an experiment, with acknowledgements by all involved that adaptations to the process would be needed. Unlike other academic programs that mainly support students, the library supports the entire university community. Our goals were the following: 1. To determine what adaptations would be needed to the academic program review process. 2. To holistically conceptualize and frame the library’s services as an academic program and communicate this framing to relevant stakeholders. 3. To determine whether the program review process as it exists is valuable and worthwhile for the library and the university, in terms of quality assurance and continuous improvement.
Design & Methodology
In partnership with the Center for Excellence in Learning and Teaching and the Office of Quality Assurance, we proceeded with the Program Review process, discussing necessary adaptations along the way. 1. Creating learning outcomes: This review process was the first time the library had conceived of itself as an academic program with our own program learning outcomes (PLOs). We decided to base our learning outcomes on the ACRL Framework for Information Literacy. 2. Curriculum mapping: The next step in the regular program review process is to map PLOs to courses in the program. Since the library does not have courses, we had to conceptualize our work areas in terms of how they would meet the learning outcomes. We also realized that, unlike other academic programs, mapping work areas to learning outcomes alone does not describe our work: the library doesn’t just support students, but also infrastructure. To reflect this, we developed a crosswalk to map our programs onto the infrastructure that we support (see the sketch after this list). 3. SOAR analysis: Strengths, Opportunities, Aspirations, and Results. No adaptations were needed in this process, but we were able to collectively identify themes in each of these categories that we had not been able to express before. 4. Surveys: Because the library does not have its own students or alumni, we decided that rather than send out our own surveys it would be more productive to add a few library-related questions to other cohorts’ surveys. Librarians collaboratively developed the questions that we most wanted to ask both students and faculty. 5. Collaborative self-study: Much of the original self-study was based on enrollment rates and student figures, so we needed to adapt to identify and discuss the data that would be most relevant to our unique program.
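A minimal sketch of the crosswalk idea from step 2, with hypothetical work areas and infrastructure; the PLO labels borrow frame names from the ACRL Framework, which the authors used as the basis for their outcomes:

```python
# Hypothetical work areas mapped to program learning outcomes (PLOs)
# and to the infrastructure each area supports.
area_to_plos = {
    "Research Help": ["PLO1: Searching as strategic exploration",
                      "PLO2: Research as inquiry"],
    "Collections": ["PLO3: Information has value"],
    "Systems": ["PLO1: Searching as strategic exploration"],
}

area_to_infrastructure = {
    "Research Help": ["reference platform"],
    "Collections": ["discovery layer", "e-resource licenses"],
    "Systems": ["discovery layer", "proxy server"],
}

# Read the crosswalk in the other direction: which work areas cover each PLO?
plo_coverage: dict[str, list[str]] = {}
for area, plos in area_to_plos.items():
    for plo in plos:
        plo_coverage.setdefault(plo, []).append(area)

for plo, areas in plo_coverage.items():
    print(f"{plo} <- {', '.join(areas)}")
```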
Conclusions
We were pleasantly surprised at how few adaptations were needed to make the academic program review work for the library. Furthermore, the adaptations that we had to make were inventive and generally helped reveal aspects of assessing our work that we have struggled to identify. For example, having survey questions included in other cohorts’ surveys will over time give us a broad overview of feedback from our closest stakeholders. Framing and communicating our services as an academic program essential to the academic mission of the university is often something libraries struggle with, when administrations over-focus on resources to the exclusion of library services and the other work done to steward those resources. Developing our curriculum map and PLOs was a succinct way to summarize and communicate the educational value of our services. The process so far has been highly valuable. It has been a forum for many difficult but productive conversations as a department about determining our values, priorities, and directions. Our newly established PLOs have already been well used to align and direct our work. The self-study was a rare opportunity to reflect deeply on many aspects of our program as a whole. While we focused in this round on adapting the existing review elements, I wonder if there are any elements specific to the library that we would want to add in future program review years.
Implications & Value
We would suggest that more academic libraries engage in these methods, or pieces of them, toward the goals of identifying values and priorities and of continuous improvement. We are not aware of many other libraries that have established PLOs for their department, but it is an extremely worthwhile exercise that helps align work priorities to educational value. If your university undertakes any similar review processes, adding tailored questions about the library can be far more valuable than attempting to survey the entire community yourself. Finally, the reflective component of coming together to have these conversations helped unite our vision of what the library is and does. There were many disagreements, but we were able to come to a hard-won consensus in most areas of our program review work. When I am called upon to explain the difference between librarians and other faculty members, I generally sum it up as follows: librarians support and educate students, but we also support a massive amount of educational infrastructure that is essential to the programs and services we provide. This infrastructure is maintained by our faculty because the choices we make in configuration and support are tied to the learning outcomes we want our students to achieve; they are also tied to our philosophy of service, informed by values such as promoting access and operating with an ethic of care for our community. Technology alone cannot support learning outcomes, because technology is a tool whose ends are determined by those in control. Having this holistic vision of the library can be extremely helpful in understanding and communicating what we are and what we do.
Anything I Can Do, You Can Do Meta: An EDI-Informed Process to Evaluate Your Libraries’ Assessment Practices
Harini Kannan, New York University Libraries
Additional Authors:
Hafeezah Hussein, NYU
Alexandra Provo, NYU
Evonn Stapleton, NYU
Lia Warner, NYU
Nicholas Wolf, NYU
Rachel Mahre, NYU
View Slides (PDF)
Draft Paper (PDF)
Keywords: library assessment, meta-assessment, inclusive data practices, academic libraries, internal research, user research
View “Anything I Can Do, You Can Do Meta” abstract
Purpose & Goals
Assessment is indispensable for libraries seeking to create inclusive programs, evaluate policies, and substantiate ongoing initiatives. However, despite this importance, it is not guaranteed that library assessment practices (including methods, instruments, and platforms) are themselves rooted in equity, diversity, and inclusion (EDI). As public and academic libraries move towards EDI, every facet of library work must align in this direction, including assessment work. Without EDI-informed assessment practices, libraries risk maintaining harmful consequences for the marginalized community groups we supposedly aim to equitably serve. With this challenge in mind, this paper represents the product of a year-long working group tasked with defining and executing an EDI-informed meta-assessment process. This means examining the means, gaps, and limitations of our current libraries’ assessment practices and exploring ways to shift our current and future practices into alignment with our libraries’ stated values. In this paper, while we will touch upon all aspects and results of our work, we will primarily focus on our process of meta-assessment, including grounding it in anti-racism and anti-oppression frameworks, determining effective methodologies, providing reproducible survey and analysis materials, and offering next steps. We believe that this focus will provide readers a clear model of meta-assessment that can be selectively reproduced at their own institutions, which will enable practitioners to develop findings and next steps that are tailored for their libraries.
Design & Methodology
Our team of seven library staff and faculty, representing assessment perspectives across the library, first tackled the “meta” aspect of meta-assessment: how are we currently approaching assessment, and what gaps emerge specifically around EDI? To answer these questions, we conducted a survey that was shared across all 10 of our libraries and 31 departments. As part of our survey design, it was important to have a clear, collective definition of EDI-informed assessment. We conducted a preliminary literature review that looked into common assessment challenges related to racism, disability, homophobia, gendered oppression, and autonomy and privacy. We chose these specific areas in order to create sharper avenues for examining EDI. Our survey, which had a 90% response rate, consisted of 27 questions (19 qualitative, 8 quantitative) and addressed the following areas: what do departments assess, how (e.g., tools, platforms, workflows), and why; what principles around power, anti-racism, or privacy do they consider, and how; and how is ethical assessment operationalized? Next, we created highly organized tools to divide the work and track our outreach. To analyze the data, we divided the dataset by question in order to better track themes across departments. We assigned 1-2 questions to a primary analyzer and a secondary reviewer to see if data was interpreted differently between group members (see the sketch below). Because of the number of members working on this and because the size of the data varied by question, we decided that manual data analysis worked best. Informed by traditional qualitative data analysis methodologies, we created an analysis worksheet that walked group members through a process of coding and thematic analysis, thereby doubling as an instructional artifact. We compiled the worksheets to identify salient themes, predominantly around gaps in EDI-informed assessment practices.
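A minimal sketch of that division of labor, assuming hypothetical member names and the question count above: each question receives a primary analyzer and an offset secondary reviewer, so the two readings are always independent:

```python
from itertools import cycle

# Hypothetical team and survey structure: 7 group members, 27 questions.
members = ["A", "B", "C", "D", "E", "F", "G"]
questions = [f"Q{n}" for n in range(1, 28)]

primary = cycle(members)
# Offset the reviewer rotation by one so the reviewer is never the analyzer.
secondary = cycle(members[1:] + members[:1])

assignments = {q: (next(primary), next(secondary)) for q in questions}

for q, (analyzer, reviewer) in list(assignments.items())[:5]:
    print(f"{q}: primary={analyzer}, reviewer={reviewer}")
```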
Findings
We found that our libraries use both quantitative assessments (reference, instruction, circulation, and collections data) and qualitative assessments (instructional feedback, user experiences, employee satisfaction surveys). The survey also revealed that our library departments want to create EDI-informed assessments but didn’t know how to start that work beyond digital accessibility. A few departments considered ideas around positionality and tokenization in demographic data collection and noted interest in learning how to incorporate harm reduction for marginalized communities. The majority of responses mentioned privacy and the accessibility of physical and digital assessment forms as their primary EDI considerations. More than half of respondents weren’t relying on any (internal or external) resources to guide them through assessments. Those that did use resource guides tended to use them for very specific types of assessment (e.g., utilizing W3C standards for assessing web accessibility). Despite these uncertainties, we discovered that the majority of departments, regardless of the type of assessment, indicated that they consider the who, what, where, when, why, and how of the project, its cost, whom it may affect, relevance, short- and long-term considerations, safety, accessibility, and impact before conducting an assessment project. Our libraries were also concerned with how assessment impacts students, departmental relations, or faculty-liaison relationships; however, overwhelmingly, departments prioritize students as the center of their assessment work. We synthesized that generally our libraries wanted to find ways to center EDI in assessment through: ethical representation (concerning the inclusion of marginalized or oppressed student communities in assessment projects); autonomy and safety of the user (concerning patron consent and anonymization of their information); ethics of data retention, storage, and reuse (concerning how patron information is maintained and used to protect their safety and autonomy); and presenting data (concerning the way data is synthesized and presented, and whether it authentically reflects the data collected).
Action & Impact
Our post-assessment action plan consists of two phases: a comprehensive literature review and a pilot program to act on the review’s recommendations. The literature review provided an initial foundation of knowledge in response to the confusion and lack of direction our colleagues, including our group, had around EDI-informed assessment. We structured the review into three large categories: assessment design; execution & analysis; and data retention & sharing. Within each section, we addressed the major themes from the findings, providing clear and approachable recommendations around questions such as: how to identify researcher positionality, how to ethically recruit marginalized participants, what informed consent looks like in assessment, and why we must have data destruction policies. In addition to our recommendations, we found an anti-racist assessment checklist created by the educational non-profit WestEd, which aligned closely with our findings and clearly itemized many of our recommendations. Our group’s final report to senior leadership included the survey and literature review findings and was approved in January 2024. In the next few months, we will reconvene to detail and implement the following action plan. In keeping with our findings around data transparency, we will host a series of workshops that share the report; these workshops will also serve to gather interest in our pilot program. The pilot will ask 3–4 assessment practitioners to choose a past, existing, or new assessment project and use our recommendations to determine and implement adaptations to their practice. We will evaluate the pilot’s efficacy through listening sessions with participants, assessing: the ease of digesting our group’s recommendations; the need for additional research; the capacity of individual practitioners to implement findings; and any remaining confusion or gaps in navigating our report. We hope the pilot, after revision, can become a model for departments to reproduce for every assessment project they take on.
Practical Implications & Value
Through this paper presentation, we aim to share a clear and comprehensive meta-assessment process that audiences can amend and reproduce at their own libraries, whatever their size. Readers will take away a blueprint that includes resources to: design and implement meta-assessment surveys; work with colleagues using template tools to effectively divide labor and collectively analyze large qualitative and quantitative data sets; and determine impactful avenues tailored to their institution’s assessment gaps and needs. We believe that our meta-assessment practice, with its particular lens of anti-racism, equity, and inclusion, is a vital reflexive process in any library’s assessment strategy. Above all, this process ensures that libraries pause to evaluate whether their assessment work is materially aligned with their stated values. With our paper, we hope those looking to engage with this process can do so without creating workflows, assessment instruments, and more from scratch, and can focus instead on the questions they’d like to explore with their peers.
Strategic Planning as an Iterative, Emerging Process
Louis Becker, University of Tennessee, Knoxville Libraries
Olivia Kelley, University of Tennessee
Draft Paper (PDF)
View Slides (PDF)
Keywords: Strategic Planning, Agile Methodology, Visioning, University Libraries
View “Strategic Planning as an Iterative, Emerging Process” abstract
Purpose & Goals
Strategic planning in academic libraries exists in tension between the strategic vision and plans of the university as a whole, and the long-term goals of library administrators, faculty, and staff. The library needs to support the vision of the university, while the university vision may not fully encompass library-specific concerns of collection development, access, and preservation. Successful strategic planning also requires input and support from personnel across the library. This paper considers how the strategic planning process can be designed in a way that produces a plan or vision that aligns with the priorities of diverse stakeholders and is adaptable to a rapidly changing future.
Design & Methodology
As a case study of the recent strategic planning process at the libraries of a state flagship research university, this paper explores how the university libraries built a new strategic vision aligned with university priorities, accounted for input and feedback from libraries personnel and university administration, and remained adaptable to future changes. References will be made to current literature on planning in academic libraries. Our planning process at first attempted to move directly to project proposals in each of the university’s vision areas. A review of the resulting proposals by administrators, together with feedback from across the libraries, made it apparent that aspects of many projects were already in process, while others would require substantial preparation and reorganization. Instead of becoming a strategic plan, these documents, together with input from university administrators, became source material for a second round of strategic visioning with a large group of stakeholders, resulting in the final strategic vision.
Conclusions
The standard visioning process begins broadly and proceeds to highly specific goals and timeframes. In our library, the previous strategic planning process had ended with an extensive list of action items. While attempts were made to track and celebrate these individual goals, many were superseded or rearranged by changes in the university context or institutional priorities. As we embarked on our next strategic visioning process, we used a variety of techniques to make our university’s recent strategic vision our own. Our process borrowed from agile methodology to form iterative phases that informed one another and the final vision. Ultimately, we produced a vision for 2023–2030 that has brought our work into alignment with university priorities through broad priority statements stretching across multiple teams and departments. These statements attempt to meet stakeholders’ needs for storytelling frameworks. Individual ‘strategic goals’ will be shorter-term departmental and individual goals justified through ties to the vision statement. We thus aim to be more adaptable to changes in the university and the broader environment. Now that we have completed our visioning process, we face another question: how do we track progress toward our vision in a way that is meaningful to stakeholders? Assessment was a consideration in our process, but demonstrating progress in meaningful ways requires more creativity than a simple checkbox or single statistic can provide. Our broad vision appeals to stakeholders’ desire for powerful storytelling, but it will need an assessment plan that is just as impactful.
Implications & Value
Strategic planning is an unavoidable part of life in an academic library, and assessment librarians will be called upon to contribute to planning with data and trend analysis. Our case study provides pointers and cautionary tales for those planning their own strategic visioning process. Assessing a strategic plan sometimes means assessing the planning process as well as progress toward goals. We will consider how strategic planning processes should remain flexible enough to guide an organization through transformative changes in technology, the student population, and university business decisions.
What “Assessment” Means to Us: A Case Study of a Department-level Assessment Framework within a University Library
Holly Surbaugh, University of New Mexico
Olivia Baca, University of New Mexico
Draft Paper (PDF)
View Slides (PDF)
Keywords: assessment framework, strategic planning, program development, planning templates
View “What ‘Assessment’ Means to Us” abstract
Purpose & Goals
In 2022–2023, the Learning & Outreach Services (LORS) department within the University of New Mexico Libraries responded to post-pandemic challenges with a renewed focus on project-driven strategic planning. The department set an annual goal to develop a LORS-specific assessment program and charged a small team with the task. The assessment team grappled with how to operationalize the literature on library assessment at a level that makes sense for a department of 12 librarians within a larger library organization. Many of the publications identified in our initial environmental scan described assessment either at an institutional level for an entire library or at a granular level for individual projects.
Design & Methodology
This paper will summarize the assessment team’s approach to developing a consensus-driven assessment framework tailored to our department’s scope and context, focusing on how we defined values to guide assessment efforts and created tools to promote long-term program sustainability. The process began with targeted reading and multiple methods for capturing LORS librarians’ preliminary ideas and concerns, which informed a series of exploratory conversations among assessment team members and led to the recommendation to create a custom assessment framework. The framework includes a guidance document, an impact map (adapted from work by Megan Oakleaf), and templates. The guidance document articulates deliberate choices LORS librarians made about what and how we plan to assess, pursuing projects and methods meaningful to our group. Intended as a living document, it also demonstrates alignment with library-wide and campus-level strategic planning, captures logistical and institutional information for the purposes of knowledge management, and outlines procedures for planning and tracking future assessment projects (i.e., instructions for how to use the templates).
Conclusions
Our team found it beneficial to strip relevant assessment concepts and resources down to basics in order to rebuild our departmental understanding of what we hoped to achieve with an assessment program. Our framework intentionally emphasizes projects that support decision-making and service enhancements over other types of assessment. After two years of refinement, the framework we have developed functions as a pragmatic toolkit that will see long-term use within our department.
Implications & Value
Our experiences can provide an example of one approach to developing and implementing an appropriately scaled assessment program. The lessons learned from this effort could prove applicable to similar groups at other institutions. Our assessment framework (particularly the templates) is suitable for reuse and remixing.
Paper Session 16: Ethics of Assessment
Library Virtue 1: excellence and ethics in library strategy and assessment
Stephen Town, University of York
View Slides (PDF)
Keywords: Ethics, excellence, research libraries, strategic measurement, value assessment, virtue theory
View “Library Virtue 1” abstract
Purpose & Goals
Libraries are faced with a growing set of challenges with significant ethical content. Their ‘goodness’ is increasingly judged and challenged on contributions to wider societal issues concerning the common and public good. Libraries are required to measure up to an implicit set of virtues which are critical to their credibility, their share of resources, and in some cases their very existence. In the academy, multiple definitions of excellence and political influences can cause uncertainty and confusion. Ethical challenges have been met positively by libraries, often ahead of their parent institutions, but mainly through one-off projects and initiatives rather than as part of a holistic framework for library relational, reputational, and transcendent strategies. This paper is intended to further the debate by providing an introduction to virtue ethics, and it seeks a conversation on the theory’s relevance and application to the understanding of libraries and on its potential absorption into assessment programs and projects. The ultimate aim of this study is to extend the understanding of the excellent research library and to close the gap between library values and ethical behaviors within organizational strategies and practices.
Design & Methodology
This paper builds on doctoral work on library value and seeks specifically to apply virtue theory to the organization, management, and leadership of research libraries. The method of investigation included an intensive period of desk research on the philosophical and psychological foundations of virtue ethics and its contemporary applications, with subsequent contributions to conferences and the literature on practical wisdom (phronesis), difference and diversity, loyalty, and the social turn in library research. An empirical investigation into research library leadership and organization across world-class universities in North America and the UK was conducted in 2023. This work used methodological techniques that have not been widely used in libraries but deserve increased consideration. This conceptual paper connects a number of different streams and provides an introduction to the long tradition of virtue theory in classical and modern philosophy. The idea has attracted increasing interest within management literature and practice over the last forty years, resulting in its application to organizational excellence, character, leadership, and phronesis (practical wisdom) within education, the professions, public administration, and business. Discussion and application of this ‘big idea’ to libraries, however, seem absent.
Conclusions
In times of pressure and financial constraint, some aspects of library excellence might be traded off, but what cannot be lost is a reputation gained by moral excellence. This depends on libraries behaving justly in fair dealing, in offering fair witness, and in equitable resource allocation and treatment of their communities. If this argument is accepted, then measurement of performance in these areas should be core to assessment efforts.
Implications & Value
The value of the contribution is to provide the assessment community with a new lens through which to view their work, particularly for the strategic and transcendent dimensions of library organizational value. This perspective opens a potential agenda for further research and development work for libraries to incorporate ideas of virtue into their ongoing assessment, strategy, and practice, and also into professional and leadership education and formation.
Library Virtue 2: assessing excellent leadership in the research library
Stephen Town, University of York
View Slides (PDF)
Keywords: Ethics, excellence, leadership, research libraries, strategic measurement, virtue theory
View “Library Virtue 2” abstract
Purpose & Goals
The ultimate goal of the author’s long-term research quest is to understand what constitutes the good library and how to assess and measure all aspects of its performance. This project follows doctoral work on library value and seeks specifically to apply virtue theory to the organization, management, and leadership of research libraries. Virtue theory has attracted increasing interest within management literature and practice over the last forty years, resulting in the application of ideas of organizational character, leadership ethics, and phronesis (practical wisdom) to education, the professions, and business generally. Discussion and application of this ‘big idea’ in libraries, however, seem absent.
Design & Methodology
This paper reports on a multi-participant study of research library leaders at world-class universities in North America and the UK, conducted to assess the relevance of virtue theory concepts to library leadership, strategy, and practice. In particular, the study sought to understand how leaders formed their ethical positioning, what they regarded as the key virtues for research library leadership practice, and how they handled difficult problems, crises, and critical incidents. The methodology was qualitative, using ethnographic and autoethnographic approaches to analyze and interpret autobiographical and socio-cultural data collected through in-depth, face-to-face interviews. In the process, researcher and participants co-created reflexive understanding to establish the meaning of their shared stories and narratives.
Findings
The interview process was successful in generating data on the three sub-question areas above (ethical formation, leadership practice, and practical wisdom). The analysis tested the data thematically against existing frameworks and prior research in the field to identify the important and distinctive virtues for leaders and for libraries as organizations. A novel model and categorization of virtues was consequently synthesized to show how leadership virtues interact with and influence library character, strategy, and operations. Some cases of leadership ethical formation and of practical wisdom in action will also be presented.
Action & Impact
The author is developing a program of dissemination and publication of the detailed findings and conclusions over 2024/25, through professional and academic channels. He is seeking interest and collaboration from the community in further developing these ideas into practical tools and in integrating them into assessment programs and library strategies.
Practical Implications & Value
The value of this work is to provide the assessment community with a new lens through which to view library assessment, particularly for the strategic and transcendent dimensions of library organizational value. This perspective centers ethical considerations within all levels of library practice and performance and provides a unique insight into how leaders affect organizational cultures and climates. Methodologically, this work provides further evidence of the value of (auto)ethnographic approaches for reaching a deeper understanding of the social world of libraries and of how excellence is achieved in practice. The model of leadership and organizational virtue provides a new, ethical, practice-oriented framework to add to the armamentarium of assessment tools. The paper opens an agenda for further research and development work for libraries to incorporate ideas of virtue into their ongoing assessment, strategy, and practice, and also into professional and leadership education and formation.
The Re-ID Risk is Real: Quasi-identifiers in Library Learning Analytics Data
Andrew Asher, Indiana University
View Slides (PDF)
Keywords: Learning Analytics, Reidentification, Privacy, Ethics
View “The Re-ID Risk is Real” abstract
Purpose & Goals
As academic libraries expand their capacity for and participation in learning analytics (LA) data collection and analysis, the datasets produced by these activities increasingly pose potential ethical and privacy-related risks. Library LA datasets are often presented as “deidentified” after direct identifiers (e.g., name, email address, or student ID) for the individuals represented have been removed. However, combinations of demographic information commonly retained in LA datasets produce potentially unique “quasi-identifiers” that might allow reidentification of large numbers of individuals within these data. Such quasi-identifiers can therefore render any associated confidential data publicly visible, posing a substantial risk to the privacy of research participants and a potential violation of ethical research conduct.
Design & Methodology
Using the pigeonhole principle, this study evaluated combinations of demographic variables contained in a dataset of approximately 40,000 students and calculated the number of individuals who are theoretically likely to be uniquely identifiable. These findings were then validated using cell counts of demographic combinations from the dataset.
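To make the cell-count check concrete, here is a minimal pandas sketch of the kind of calculation the study describes; the demographic columns named below are hypothetical illustrations, not the study’s actual variables.

    import pandas as pd

    # Hypothetical quasi-identifier columns; the study's actual variables may differ.
    QUASI_IDS = ["gender", "race_ethnicity", "class_year", "major", "residency"]

    def reidentification_risk(df: pd.DataFrame, quasi_ids=QUASI_IDS) -> pd.Series:
        """Count how many records fall into each cell (unique combination of
        quasi-identifier values); a cell of size 1 is a uniquely identifiable person."""
        cell_sizes = df.groupby(quasi_ids, observed=True).size()
        return pd.Series({
            "records": len(df),
            "occupied_cells": len(cell_sizes),
            "uniquely_identifiable": int((cell_sizes == 1).sum()),
            "cells_below_k5": int((cell_sizes < 5).sum()),
        })

    def possible_cells(df: pd.DataFrame, quasi_ids=QUASI_IDS) -> int:
        """Pigeonhole-style intuition: if the number of possible value combinations
        rivals or exceeds the number of students, most cells can hold only a few
        people, and many students are likely to occupy a cell alone."""
        n = 1
        for col in quasi_ids:
            n *= df[col].nunique()
        return n

If possible_cells() returns a count near or above the roughly 40,000 records, widespread uniqueness is to be expected, and the empirical cell counts from reidentification_risk() then confirm or refute it.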
Findings
This study found that information frequently retained in library learning analytics datasets renders a majority of individuals identifiable, and that the burden of this reidentification risk falls disproportionately on minority groups. Since these groups are often already subject to higher levels of discrimination and surveillance, these findings question whether learning analytics datasets meet the justice standard of ethical research with human participants.
Action & Impact
This presentation will suggest data collection and aggregation approaches that limit reidentification risk, as well as synthetic data analysis techniques that enable the statistical substitution of quasi-identifiers to remove that risk.
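As one illustration of the aggregation idea, a generic k-anonymity-style pass (not necessarily the presentation’s exact technique, and reusing the hypothetical QUASI_IDS above) collapses rare category values and suppresses any cells that remain small:

    def coarsen_to_k(df: pd.DataFrame, quasi_ids=QUASI_IDS, k: int = 5) -> pd.DataFrame:
        """Collapse category values held by fewer than k people into 'Other', then
        drop records still sitting in cells smaller than k, so that every remaining
        quasi-identifier combination describes at least k individuals."""
        out = df.copy()
        for col in quasi_ids:
            counts = out[col].value_counts()
            rare = counts[counts < k].index
            out[col] = out[col].where(~out[col].isin(rare), other="Other")
        cell_sizes = out.groupby(quasi_ids, observed=True)[quasi_ids[0]].transform("size")
        return out[cell_sizes >= k]

Synthetic data techniques, as the presentation notes, go a step further by replacing the quasi-identifier values themselves while preserving the dataset’s statistical structure.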
Practical Implications & Value
This paper provides a practical approach for assessment practitioners to evaluate privacy and reidentification risk in analytical datasets prior to data collection, along with procedures to minimize this risk. By providing an empirical assessment of the reidentification potential contained in a real student dataset, it contributes to conversations about data ethics and justice with verifiable examples.
Learning Lab 8: Level Up in Excel
Level Up in Excel: Using Excel to Perform Serials Value Analyses
Sephra Byrne, University of North Texas Libraries
Whitney Johnson-Freeman, University of North Texas
Lidia Arvisu, University of North Texas
View Slides (PDF)
View “Level Up in Excel” description
Learning Outcomes
- By the end of the session, participants will be able to conduct an e-resource value analysis using Excel, COUNTER 5 reports, and a list of annual costs. (A scripted sketch of the same analysis appears after this list.)
- Participants will be able to use Excel PowerQuery to pull in data from other Excel files or a variety of sources like databases, Power BI datasets, or SharePoint.
- Participants will be able to use Excel PowerQuery to compile multiple Excel files into a single table.
- Participants will be able to create pivot tables and choose different ways to aggregate data, such as sum, count, average, or percent of total.
- Participants will be able to unpivot data tables and identify when unpivoting data is appropriate.
- Participants will be able to use SUMIFS to apply multiple conditions when summing a column of data.
- Participants will be able to use Excel’s AND and OR functions to create complex SUMIFS statements.
- Participants will be able to use Excel’s IFERROR function to handle data problems such as dividing by zero.
- Participants will be able to use Excel’s PERCENTRANK function to rank the value of e-resources based on inflation, usage, and cost-per-use.
- Participants will be able to use conditional formatting to highlight high, low, or invalid values.
Please download these files onto your local machine: https://bit.ly/lac-2024-excel
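For attendees who later want a scripted counterpart to the lab’s Excel workflow, the following minimal pandas sketch mirrors the core steps: the merge stands in for PowerQuery, .where() for the IFERROR divide-by-zero guard, and rank(pct=True) for PERCENTRANK. File and column names are illustrative, not the lab’s actual materials.

    import pandas as pd

    # Illustrative inputs: a COUNTER 5 journal usage report and an annual cost list.
    usage = pd.read_excel("counter_tr_j1.xlsx")   # columns: Title, Total_Item_Requests
    costs = pd.read_excel("annual_costs.xlsx")    # columns: Title, Annual_Cost

    # Join usage to costs by title (the PowerQuery merge step).
    merged = usage.merge(costs, on="Title", how="inner")

    # Cost per use, left blank when usage is zero (the IFERROR guard).
    merged["Cost_Per_Use"] = (
        merged["Annual_Cost"] / merged["Total_Item_Requests"]
    ).where(merged["Total_Item_Requests"] > 0)

    # Percentile rank of cost per use (the PERCENTRANK step); lower cost per use
    # signals better value.
    merged["CPU_Percent_Rank"] = merged["Cost_Per_Use"].rank(pct=True)

    print(merged.sort_values("CPU_Percent_Rank").head(10))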
Learning Lab 9: A Participatory Process
A Participatory Process: Shaping Learning Materials with Minoritized Students’ Mental Models
Rebecca Greer, University of California, Santa Barbara
Tina Lin, University of California Santa Barbara
View Slides (PDF)
View “A Participatory Process” description
Learning Outcomes
This study’s topic and activity are relevant to the library assessment conference because they showcase user-centered assessment approaches that other organizations could incorporate into their evaluation strategies. In this learning lab, library instructors and assessment personnel are introduced to hands-on applications of UX and participatory design methods for assessing digital learning materials and video tutorials. They also learn strategies for analyzing the concept maps from the activity to explore students’ struggles and learning needs, and for using that information to build and teach foundational information literacy concepts in digital materials outside the classroom setting. By the end of the Learning Lab, participants will be able to…
- Describe how UX and participatory design can be blended to assess information literacy videos.
- Engage in an activity using participatory design techniques to represent mental models of a given process.
- Apply analysis techniques with provided guidelines using co-constructed concept maps.
- Evaluate results with the intention of modifying a given learning object or creating alternative learning materials based on students’ mental models.
This workshop is intended for library professionals who rely on asynchronous learning materials for instruction. Attendees need not be well versed in usability studies or participatory design methods; the lab orients them as inclusive practitioners who integrate student experiences into the library’s learning environment.
11:20 a.m.–11:45 a.m. | Closing Plenary Session
1:00 p.m.–4:30 p.m. | Post-Conference Workshops
Assessing EDI in Libraries and Information Organizations
Kawanna Bright, East Carolina University
Disambiguating Strategy: Plan the Work and Work the Plan
Maurini Strub, University of Rochester
Rebecca Greer, University of California, Santa Barbara
Starr Hoffman, University of Nevada, Las Vegas
Using National Data for Local Benefit
Devin Savage, Illinois Institute of Technology
Martha Kyrillidou, QualityMetrics LLC
