Citation:
Proceedings of the 2022 Library Assessment Conference: Building Effective, Sustainable, Practical Assessment, November 1–3, 2022, virtual conference, ed. Angela Pappalardo (Washington, DC: Association of Research Libraries, 2023).
Jump to a theme:
Diversity, Equity, and Inclusion
Collections Value/Impact
Services/Usability
Organizational Issues/Assessment
COVID-19
Data Presentation & Visualization
Organizational/Space/Critical
Methods & Tools/Digital Libraries
Teaching/Learning
Diversity, Equity, and Inclusion
Towards a Modern, Inclusive Library Advisory Board
Anita Hall (University of Louisville Libraries)
Paper
Full Text (PDF)
Slides (PDF)
The purpose of this study is to determine the prevalence, composition, and value of advisory groups for academic libraries in 2022 and to make recommendations for modernizing these groups and making them more inclusive. With an increasing focus on diversity, equity, and inclusion (DEI) initiatives in academic libraries, we must ensure that these groups are inclusive and representative of our user populations, that they provide a safe environment for members of diverse and minoritized groups to give genuine feedback, and that participation in library advisory groups leads to tangible improvements to members' library experience. Additionally, the study will explore how the COVID-19 pandemic and a broader increase in the availability of virtual and hybrid participation options have affected current practice around advisory group recruitment, participation, meeting formats, and overall group utility moving forward.
Design, methodology, or approach
This study has two components. The first (quantitative) is a census of ARL academic library websites to determine the prevalence and characteristics of library advisory groups in 2022. The second (qualitative) is a series of semi-structured interviews with assessment librarians and others responsible for facilitating these groups, to identify ways in which the groups have evolved and to determine best practices for utilizing them in the current moment. A modified grounded theory methodology will be used to analyze interview responses.
Findings
Anticipated findings include strategies for recruitment of diverse and representative advisory group members, meeting schedules and formats, and engagement of members to ensure quality feedback that can be used for meaningful improvements to library spaces, services, and/or collections.
Practical implications or value
This study will provide guidance for assessment librarians and others who currently facilitate advisory groups or are interested in doing so. There is very little published work on advisory groups in academic libraries from the last decade, and none that evaluates DEI in this context. The early stages of the study indicate that these groups are still widely used among ARL academic libraries, and a more modern look at their usage and utility will add to the overall understanding of assessment and user research in libraries.
Demographic Analyses in the US: An Insight-Based Approach to Studying Diverse Needs for Library Planning
Starr Hoffman (UNLV Libraries) and Martha Kyrillidou (QualityMetrics, LLC)
Paper
Full Text (PDF)
Slides (PDF)
This paper describes analyzing statewide demographic shifts in order to better identify changing library patron needs, as part of an approach to assessing the outcomes of Library Services and Technology Act (LSTA) grants (an IMLS initiative that provides funds to state library agencies). This practical approach to evaluation studies will explore:
How to put a demographic analysis together
How it relates to the LSTA “Measuring Success” framework outcomes (Focal Areas and Intents)
Design, methodology, or approach
For this statewide demographic analysis, primary sources included data from the U.S. decennial census and the American Community Survey (ACS), as well as related demographic reports. Demographic shifts over the past ten years were explored at various geographic levels to help determine changing library needs. The Public Library Statistics (PLS) and the Census Bureau's ACS data website were also used to understand library service locations and adjacent needs.
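The core computation behind this kind of ten-year shift analysis is simple percent change between two census vintages. The sketch below is a minimal illustration; the county names and population figures are hypothetical placeholders, not actual census or ACS counts.

```python
def pct_change(old, new):
    """Percent change from one decennial figure to the next."""
    return (new - old) / old * 100

# Hypothetical county populations for two census years (not real data).
counties = {
    "County A": {"2010": 52_000, "2020": 61_500},
    "County B": {"2010": 210_000, "2020": 204_300},
    "County C": {"2010": 18_750, "2020": 23_900},
}

for name, pop in counties.items():
    # A real analysis would repeat this at multiple geographic levels
    # (state, county, tract) and for specific demographic subgroups.
    print(f"{name}: {pct_change(pop['2010'], pop['2020']):+.1f}% change, 2010-2020")
```

The same calculation applied tract by tract, or to individual race and ethnicity tables, is what surfaces the "demographic shifts at various geographic levels" described above.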
Findings
Abbreviated findings for some states will be provided.
Practical implications or value
Similar demographic analyses could be useful for State Library Administrative Agencies (SLAAs) seeking to clarify their statewide goals, as well as for library directors of all types, community colleges, and universities seeking information on needs of state residents. This paper will explore which census tables are useful for particular topics (especially as related to race and ethnicity), locating census data or other sources, and special issues with census data. We will also discuss which demographic elements relate to the six LSTA outcome areas: Lifelong Learning, Information Access, Institutional Capacity, Employment and Economic Development, Human Resources, and Civic Engagement, and their associated intents.
Converting Climate Assessments into Evidence-Based Change
Brian Keith (University of Florida)
Paper
Slides (PDF)
Many organizations engage in climate assessment for diversity, equity, and inclusion (DEI) initiatives and other strategic purposes. However, the results, both qualitative and quantitative, are difficult for organizations, let alone individual employees, to interpret in order to understand their workplace or to use in guiding or contributing to change. This case study addresses that problem. The methods presented, data analysis and visualization, show how to make the results digestible to all employees. The resulting organizational development and change management processes presented here address the challenges of converting data into action. Attendees will be better able to deliver on employees' expectations for workplace growth and for evidence-based change arising from their participation in climate assessment.
Design, methodology, or approach
The ClimateQUAL climate assessment tool was the source of the initial data used here. Quantitative and qualitative analyses were used to interpret the voluminous data set in the standardized tabular report. Data visualization and communication methods used for the internal audience reflected adult learning practices. The evidence-based change actions that resulted were anchored in the organizational development and change management literature, including transformational leadership, and in modern management concepts, including procedural and contributive justice.
Findings
Data collection is the easiest and very much a preliminary stage of climate assessment. Most efforts fail at the subsequent interpretation or action phases. The methods presented here are effective in positioning leaders at various levels, as well as staff, to understand climate assessment results and to engage in visualizing and implementing change initiatives based on organizational strengths and weaknesses.
Practical implications or value
Libraries are increasingly focused on organizational improvement. Effectiveness, both generally and in terms of recruitment and retention of key personnel, depends on evidence-based planning and decision making. Employees also expect issues in their work experiences to be resolved. Climate assessments are therefore likely to be conducted at most institutions, and with some frequency, but delivering on the promise of these efforts is a challenge. Assessment professionals and other leaders need to be prepared with a variety of methods to lead data collection, analysis, and interpretation activities and to support change implementation. Each successive phase is more challenging, and this presentation will offer methodologies for each. The methods are transferable to various assessment instruments and organizational initiatives.
A Mixed-Methods Approach to Assessing Diversity, Equity, and Inclusion in Library Collections
Jayne Sappington, Sara Schumacher, Esther De Leon, Kimberly K. Vardeman, and Donell Callender (Texas Tech University)
Paper
Full Text (PDF)
Slides (PDF)
Diversity, equity, and inclusion (DEI) studies have increased in prominence on academic campuses along with calls to question privilege and power structures, making DEI collections assessment critical. Texas Tech University Libraries undertook a two-part project that evaluated user needs, collections usage, cataloging and discoverability, and user behavior in searching for and evaluating DEI resources.
Design, methodology, or approach
The researchers administered online surveys to understand faculty perceptions about the library’s collections and availability of DEI resources for research and class assignment purposes. They used a course syllabi scan to look for instructor assigned resources and reviewed DEI-related award lists to identify key resources to compare to library holdings. Usage reports and faculty resource requests were additional methods to measure the comprehensiveness and appropriateness of the Libraries’ collections to meet user needs.
Additionally, the researchers conducted 32 usability tests to evaluate the user experience of searching for library resources related to DEI topics. They measured user satisfaction and the perceived difficulty of the search process using observation, quantitative and qualitative survey and interviewing methods, and structured and closed-ended questions to understand the views of library users.
Findings
The researchers identified the potential for partnering with academic programs that currently utilize and request DEI-related resources. They found that even though most users reported they were satisfied by their search results, many of them expressed uncertainty in searching and evaluating DEI resources. Users expressed interest in search enhancements for better filtering, library website improvements, and more prominent guidance for DEI research help. An item’s description or abstract and its title were the components that users most often used to determine how relevant a source would be. This finding supports other research results that demonstrated the need for increased attention on cataloging and metadata, particularly Table of Contents and abstract or summary fields.
Practical implications or value
The researchers identified potential ways individual libraries can address patrons’ difficulties narrowing a topic or search and more large-scale cataloging changes that could improve discoverability and user experience. Areas for future study include research into what cataloging enhancements are most effective and identifying ways librarians can advocate for large-scale changes in cataloging practice involving collaborations between multiple libraries or even vendors. The researchers envision that other academic librarians could replicate and expand aspects of this study in their own libraries, while avoiding the challenges identified in this research.
An Equity Audit for DEI Data in an Academic Library
Ashley Lierman, Shilpa Rele, Samantha Kennedy, Marryam Naqvi, Marlowe Bogino, Christine Davidian, and Sharon An (Rowan University)
Paper
Full Text (PDF)
Slides (PDF)
When our university libraries’ DEI Committee first formed in late 2019, an initial self-assessment revealed a need for data regarding barriers to inclusion and equity arising from our libraries’ policies, services, and resources. As a framework for approaching this investigation, we decided to adapt a tool more commonly employed in K-12 schooling: an equity audit. While school equity audits tend to focus on such areas as equity of teacher quality, programs, and student achievement (Skrla et al., 2009), we adapted our audit to investigate areas of library service that would be more relevant for our organization: virtual collections and spaces, physical collections and spaces, and interactions with library personnel.
Design, methodology, or approach
Following the principles of Green’s (2017) Freirean-based model of community equity audits, we sought means of collecting data that would elicit the genuine concerns and needs of our university community. Our data was collected using a three-stage mixed-methods approach: 1) a short, primarily quantitative survey focused on online collections and services, as those were the only resources available at the time due to the COVID-19 pandemic; 2) a second, similar survey including all collections, policies, and services, after on-campus operations had resumed; and 3) a set of in-depth interviews with community members for qualitative data on their experiences.
In each case, our strategy for these surveys and interviews has been to avoid potentially loaded questioning about specific incidences of discrimination (which many community members may be understandably reluctant to name), but instead to inquire generally about positive and negative experiences of the library’s collections and services, collect demographic information from participants, and cross-analyze the results to reveal any patterns of inequity in users’ experiences by demographic group.
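The cross-analysis step described above amounts to grouping experience ratings by self-reported demographic group and comparing the group-level averages. A minimal sketch follows; the group labels and ratings are hypothetical, not data from this study.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical survey rows: (self-reported demographic group, experience
# rating on a 1-5 scale). Real data would carry many more dimensions.
responses = [
    ("Group A", 4), ("Group A", 5), ("Group B", 3),
    ("Group B", 2), ("Group B", 4), ("Group C", 5),
]

def mean_rating_by_group(rows):
    """Average experience rating per demographic group."""
    buckets = defaultdict(list)
    for group, rating in rows:
        buckets[group].append(rating)
    return {group: mean(ratings) for group, ratings in buckets.items()}

print(mean_rating_by_group(responses))
```

Systematically lower averages for one group relative to the others are the kind of pattern an equity audit would then investigate qualitatively.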
Findings
While our first survey had a low response rate, its results already tentatively suggest patterns of inequity in users' experiences of the libraries, with particular patterns of less positive experiences among Black and/or African-American users and among users with disabilities. The second survey drew far more responses, particularly from marginalized community members, so as we analyze its results and those of the interviews, we anticipate finding more definitive patterns in the quantitative results and specific issues to address in the qualitative results.
Practical implications or value
Our experiences can help other institutions use similar methods to investigate the status of equity and inclusion in their libraries, and what areas most need improvement. Equity audits have been little discussed outside of K-12 education, so this study will provide a valuable example of how this strategy can be used as a library assessment technique.
Assessing the Needs of Users with Disabilities in Pursuit of More Accessible, Inclusive Libraries
Emily Daly, Ira King, and Angela Zoss (Duke University Libraries)
Paper
Full Text (PDF)
Slides (PDF)
In 2021 and 2022, Duke University Libraries formed a cross-departmental research team of library staff to explore two primary questions: To what extent are the Libraries supportive of disabled users, caregivers, and allies? How might library staff make library spaces, web interfaces, collections, and services more supportive for these users? To answer these questions, we designed a multi-faceted study that included a literature review and environmental scan, informational interviews, a user survey, and follow-up user interviews. This is the third mixed-methods user study that Duke Libraries staff have led to learn more about the experiences and needs of marginalized or underrepresented students.
This paper will summarize the research team’s methodology, focusing on ways we engaged with users with disabilities. We will briefly describe our findings and highlight ways that library and campus stakeholders might implement the team’s recommendations to make libraries more inclusive and welcoming for individuals with disabilities, as well as caregivers and allies.
Design, methodology, or approach
This study began with a literature review and informational interviews with campus stakeholders to better understand current support for the target population and any prior research conducted. The research team then distributed a user survey and conducted four follow-up interviews. Team members used affinity mapping to analyze the interview transcripts and develop themes based on the data. The team then used findings to develop recommendations for improving library spaces, services, web interfaces and collections.
Findings
Preliminary findings indicate that while the university may outwardly welcome people with disabilities, there are many barriers that prevent people with disabilities from having their needs met. The library is often seen as more inclusive than campus as a whole, but there are clear pain points, like transportation difficulties and inaccessible electronic materials. Awareness of existing services is also low. Preliminary recommendations include updating library webpages on services available to users with disabilities, increasing support for individual private study spaces, and providing sensory-friendly spaces.
Practical implications or value
Findings and recommendations will be incorporated into a report to be shared with library staff, campus stakeholders, higher education communities interested in providing support for students with disabilities, and potential donors interested in funding relevant library services. This mixed-methods study also serves as a model for other libraries who wish to use research methods such as surveys or interviews to learn more about users from marginalized groups. Finally, this project highlights ways to collaborate with students and staff without formal assessment training at every stage of the research process, from literature review to recruitment to interviews to analysis.
Black Undergraduate Perceptions of Inclusion and Engagement in a Public Research University: Strengths, Challenges, and Recommendations for the Library
Jung Mi Scoulas, Elena Carrillo, and Linda Naru (University of Illinois Chicago)
Paper
Full Text Paper (PDF)
Slides (PDF)
The goal of this paper presentation is to re-examine a campus survey conducted during the COVID-19 pandemic, which assessed the cultural climate of university students, in order to better understand how Black/African American undergraduates engaged with university life and how their perceptions of inclusion compared with those of other racial and ethnic groups. While there have been efforts to understand the needs and challenges of university students using various assessments at the university, college, and unit levels, little work has focused on a deeper understanding of the Black/African American college experience.
University of Illinois Chicago (UIC) is committed to achieving racial equity at all levels. As part of a coordinated campus-wide initiative, the University Library recently completed an Achieving Racial Equity (ARE) plan for 2022. This paper presentation draws on broader campus data to develop support and resources to improve Black/African American perceptions of engagement and inclusion.
Design, methodology, or approach
This project reuses a survey conducted in Spring 2021 by the Office of Institutional Research (OIR) in collaboration with the Office of Diversity and the Office of the Vice Provost for Academic and Enrollment Services. This survey focused on undergraduate perceptions of overall university life, academic engagement, and interactions with campus resources and services. The research team received de-identified data from OIR and analyzed it for patterns and gaps, comparing student experiences across racial and ethnic groups.
Findings
A total of 912 undergraduates participated in the survey. Among them, 32% were Hispanic, followed by White (26%) and Asian (24%); 7% were international students, and an additional 7% self-identified as Black/African American. Results showed that Black/African American students had the highest average scores rating the quality of interactions with other students, academic advisors, and faculty. They also recorded the second-highest average scores for quality of interactions with student service staff and other administrative offices.
Black/African American students were most likely to seek academic support and other resources, and indicated the highest level of comfort speaking with academic advisors. They also had the second-highest perception of receiving sufficient academic support and were most likely to feel they had a positive family/school balance.
However, they also had the lowest perception of fitting in and the largest percentage of dissatisfaction with emotional support.
Practical implications or value
The findings may guide the University and the UIC Library to strengthen current programs, resources, and services and to make adjustments as needed; however, specific questions about students' perceptions of the Library were not included. Overall, members of UIC's Black/African American community have expressed their experiences with racism, systemic bias, and exclusion in many forms. As part of the University's commitment to a more welcoming and inclusive campus environment, the Library plans to conduct a climate survey focused on Black/African American students to gauge their feelings about experiences within the libraries and to determine what additional services can be provided to support their needs.
Collections Value/Impact
360 Degree Approval Plan Assessment
Eva Jurczyk and Naz Torabi (University of Toronto Libraries)
Paper
Library collections are one reason that students, faculty, and researchers access the library's physical and digital spaces. The University of Toronto Libraries (UTL) acquire a great number of monographs through approval plans (APs). For example, between April 2019 and March 2020, over 50% (10,895 titles) of the monograph acquisitions from a major US/UK vendor came through the library's AP. APs shape the library's collective collection, but these plans require regular assessment to ensure they perform as expected.
The goal of the project was to understand how well this major AP supported research, teaching, and new academic programs, and whether the plan reflected current, evidence-based collection development practices, including the degree to which it contributed to the diversity and inclusivity of our collections.
Design, methodology, or approach
Phase 1: (September – November 2021)
The first phase of the plan was to define the quantitative criteria for an optimally performing approval plan. This was undertaken through four focus groups with subject selectors and liaison librarians from across the system. The transcripts of these focus groups were coded using NVivo, and a follow-up survey was sent to the same group. The results of this phase were used to set the benchmarks by which we would define a successful approval plan.
Phase 2: (December 2021 – March 2022)
The project team gathered and analyzed data, including five years of approval plan and firm order purchase data, bibliometric data about activities of University researchers, circulation and course reading list information, and holdings data for other libraries in our network. This analysis was centred on answering how well our plan was performing against the benchmarks defined in Phase 1.
Findings
Through the focus groups, the project team found that a diversity of assessment approaches was needed for approval plan assessment and that a consultative, systematic, data-driven approach was necessary. This approach allowed us to set benchmarks, identify patterns, uncover the reasons for those patterns and determine what can be improved. While specific benchmarks varied by discipline, we addressed the following questions through data analysis:
Are there any gaps in geographical coverage?
Are there any gaps in the subject areas we collect?
Are there any significant differences in usage by acquisition types?
What is the level of overlap in print purchases between UTL and non-central libraries?
What percentage of materials used for teaching are covered by the approval plan?
What percentage of slipped material do we select?
Practical implications or value
For libraries using approval plans for print collections, these plans represent a significant portion of the acquisitions budget. They are complex, assembled by multiple stakeholders, and bound by the limitations of vendor systems. This multidimensional approach to assessment can serve as a model for other libraries hoping to undertake such a review.
Opting In or Out of Checkout History: What Drives Patrons’ Decisions about Their Library Data
Craig Smith and Ken Varnum (University of Michigan)
Paper
Slides (PDF)
Traditionally, libraries have been cautious about retaining patron data. In recent years, however, some academic libraries have embraced analytics as a way to understand and make inferences about patrons' library-related behavior. These types of studies require the preservation of large amounts of patron data; in many cases patrons are not given explicit choices about whether their data are retained. At the University of Michigan, we recently began offering patrons a choice about whether their checkout histories are retained. We used this as an opportunity to explore the following research questions: What reasons motivate library users to either retain or delete their checkout data, and how do these decisions relate to other factors such as campus role, patron demographics, and disciplinary area?
Design, methodology, or approach
To explore these questions, we sent a survey to each patron who logged into their account and opted in or out of retaining a checkout history. The survey asked each patron why they made the choice they did and also explored issues of trust in data management. Although the study is ongoing, we have already analyzed the responses from 180 members of our campus community.
Findings
One key finding is that, on average, patrons trust the library significantly more than the broader university with regard to data management, and they further trust the university significantly more than internet-based companies (e.g., Amazon, Netflix). This may be one reason why, when offered the choice about whether to retain their checkout histories, 90% of patrons have thus far chosen to have the library store their checkout data. The richness of the survey data lies in the reasoning supplied by patrons, when asked about the choices they made. These open-ended responses were coded independently by two raters who achieved acceptable agreement (all Cohen’s kappas > .70; discrepancies were easily resolved through discussion). Prominent reasons why people wanted the library to retain their checkout data included using their checkout histories as helpful reading lists, using their histories to facilitate research endeavors, and feeling confident that their data would be managed carefully. Prominent reasons why others chose to have their checkout data deleted included deep privacy concerns, the view that the library should not be storing such data, and even some astonishment that the library had historically been storing checkout data. We also explored whether patrons in historically marginalized groups had different levels of trust in the library’s data management, compared to people not in such groups; thus far we have not found differences.
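The inter-rater agreement reported above (Cohen's kappa) corrects raw percent agreement for the agreement two coders would reach by chance. A minimal sketch of the calculation follows; the two raters' code assignments below are hypothetical examples, not data from this study.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Inter-rater agreement corrected for chance agreement."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: proportion of items both raters coded identically.
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement under independence, from each rater's marginals.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    pe = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (po - pe) / (1 - pe)

# Hypothetical codes assigned by two raters to ten open-ended responses:
a = ["privacy", "reading", "reading", "trust", "privacy",
     "reading", "trust", "trust", "privacy", "reading"]
b = ["privacy", "reading", "reading", "trust", "reading",
     "reading", "trust", "trust", "privacy", "reading"]
print(round(cohens_kappa(a, b), 2))  # → 0.85
```

Values above 0.70, as in the study, are conventionally taken to indicate acceptable agreement between coders.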
Practical implications or value
These findings are novel in the area of library assessment and research; understanding how people think about their library data will allow libraries to offer appropriate choices to patrons, and may even point the way to new services for patrons who do want some of their data retained.
Services/Usability
Assessing Scan and Deliver during COVID-19 and Beyond
Lisa Levesque and Sonny Banerjee (Toronto Metropolitan University)
Paper
Full Text (PDF)
Slides (PDF)
Toronto Metropolitan University Library implemented the scan and deliver service in June 2020, during the COVID-19 pandemic. With this service patrons can request a portion of text, such as a chapter of a book or a journal article, be scanned by a library staff member and emailed to them. This and other complementary services were implemented because of limited access to the physical library building and the print collection due to pandemic lockdowns. As the Library space reopened to patrons, we assessed this service to understand its impact and plan for future service offerings. This paper addresses why Toronto Metropolitan University Library patrons have used the scan and deliver service: what benefits does it offer them, what role do scanned materials play in their scholarly research, and what barriers does it help them overcome? These questions will inform the future of this service at our library and can be used to address similar questions facing other libraries regarding services implemented mid-pandemic.
Design, methodology, or approach
This assessment used an email survey and qualitative coding to explore this research question.
Findings
Almost all survey respondents (97.5%) replied that the service had enabled them to overcome barriers to their academic pursuits, with one calling the service “critical to mission” during a time of limited access to print resources. The scan and deliver service extended access to Library resources in a manner that respondents describe as fast, convenient, reliable, and which made research possible even at great distances. Scan and deliver was used strategically by patrons in combination with other library services, such as interlibrary loan and online course reserves, to broaden access to print resources. We found that the scan and deliver service addressed patron needs that extend beyond those experienced during lockdown, as our patrons continue to travel long distances to reach campus, face challenges in accessing critical resources needed for research and coursework, and have health concerns that limit their access to the physical library.
Practical implications or value
This assessment has practical value for libraries that are considering implementing a scan and deliver service, or that are assessing services implemented mid-pandemic. In a time when there are no longer COVID-19 lockdowns, library assessment must consider the future of services implemented during the pandemic. Patrons told us that they want scan and deliver to continue; a “return to normal” should not undo service improvements. We envision the library assessment community drawing on our findings in relation to assessment projects that focus on barriers to access and the impact of the COVID-19 pandemic on sustained service improvements.
Optimizing a Library Website for Student Research: Comparing User Metrics between Encore and Google Scholar
Lindsay Ozburn, Ryan Bushman, Margaret Winward, and Liz Woolcott (Utah State University)
Paper
Full Text Paper (PDF)
Slides (PDF)
The paper addresses the methods and general conclusions of an experiment that evaluated user preference and search experience when using Google Scholar versus Utah State University Libraries' Encore discovery layer as a starting point for research. USU's 2019 Ithaka S+R faculty survey highlighted that our faculty rely more on Google Scholar as a starting point for their research. To triangulate these findings, the experiment attempts to identify which search methods undergraduates prefer.
Research questions:
- What is the average completion time for tasks performed in Google Scholar versus tasks performed in Encore?
- How many actions or clicks does it take to perform tasks in Google Scholar versus in Encore?
- What are the benefits and/or drawbacks that users perceive when using Google Scholar versus Encore to search for information?
Design, methodology, or approach
Methodology:
This research study included a pre-survey, task analysis, and post-survey for two groups of randomly selected undergraduate students.
The task analysis portion of the experiment utilized A/B testing, with the two groups evaluating two search platforms:
- The Library’s current search environment (Encore)
- A mock environment with a dual search tab set-up and direct access to Google Scholar.
Each group inversely performed 10 common information search and retrieval tasks, split between the two search platforms.
Loop11 (website usability testing software) tracked the number of clicks and completion time for each task and was also used to collect the data for the pre-survey and task analysis portion of the project.
Following the task analysis exercise, all participants completed a post-survey that asked them to reflect on the pros and cons of each interface.
To ensure generalizability to the campus population, the team used stratified random sampling with the following strata: STEM, non-STEM, and undeclared majors.
Using a random number generator, the team statistician randomly selected several sets of 112 students from the full list of undergraduates such that the proportion of STEM, non-STEM, and undeclared students in each sample was proportional to the campus population.
The research team then sent numerous recruitment emails to the samples, offering the experiment on two different days. The team was prepared to accept up to the first 56 participants, based on a power calculation that identified the number of participants needed for statistically robust results.
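The proportional stratified sampling step described above might be sketched as follows. This is an illustrative reading, not the team's actual procedure: the strata sizes and student IDs are invented, and simple proportional allocation with rounding stands in for whatever allocation rule the statistician used.

```python
import random

# Toy population: stratum -> list of student IDs. Real lists would come
# from the registrar; these counts are invented for illustration.
population = {
    "STEM": [f"stem_{i}" for i in range(6000)],
    "non-STEM": [f"non_{i}" for i in range(9000)],
    "undeclared": [f"und_{i}" for i in range(1000)],
}

def stratified_sample(population, n, seed=0):
    """Draw n students so each stratum's share matches the population."""
    rng = random.Random(seed)
    total = sum(len(members) for members in population.values())
    sample = []
    for stratum, members in population.items():
        k = round(n * len(members) / total)  # proportional allocation
        sample.extend(rng.sample(members, k))
    return sample

sample = stratified_sample(population, 112)
```

With these invented counts (6,000 / 9,000 / 1,000), a 112-student sample allocates 42, 63, and 7 students per stratum, mirroring the population proportions.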
Findings
Expected conclusions:
- Our discovery layer, while adequate, does not offer the best search experience.
- Undergraduates prefer to use a combination of Google Scholar and a more curated discovery layer to search for information
- A dual-tab search interface facilitates more streamlined information searching as opposed to a single search box experience.
Practical implications or value
- Our work provides a straightforward example of implementing A/B testing to evaluate user search experiences.
- The study provides an easy-to-understand statistical sampling methodology that enhances the validity of user testing data. Our methodology can be easily replicated in other research.
- Our outcomes will contribute to bodies of assessment work that evaluate ever-changing user preferences with information search and discovery.
Integrated and holistic project assessment for a library website redesign
Heidi Burkhardt (University of Michigan Library)
Paper
Full Text (PDF)
Slides (PDF)
Show Abstract
When assessing a project, it can be difficult to decide what to focus on, and the things you want to measure are often multidimensional and do not fit into a single assessment method. For the University of Michigan Library’s two-year website redesign project, assessment was integrated into the work and timeline from the start. The goals for our assessment plan were to be able to measure the legacy website against the redesign across a variety of metrics, to know how well the structure of the project team worked, and to learn whether our internal communication and outreach were successful.
Design, methodology, or approach
The assessment plan was split into seven metrics: usability, accessibility, mobile experience, content authoring experience, site performance, project management and structure, and internal communication and outreach. Within each metric, we articulated a desired outcome and a set of methods that we’d use to evaluate it. Each method included “plan statements” that laid out exactly what we would do. In some cases the statements were specific tasks (e.g. conduct tree testing on draft information architecture), while others stated an intention or practice (e.g. follow WCAG 2.1 (A and AA) guidelines, or use JIRA Project to track the progress and completion of deliverables and tasks). Overall, we used a combination of formative and summative assessments, while also employing programmatic strategies for how we worked and built the site to support achieving the desired outcomes.
Findings
We found the website redesign demonstrates significant improvement over the legacy website in usability, accessibility, site performance, and the content authoring experience. In addition to the results of our assessment metrics, the praise that came in from colleagues in the library and broader community illustrated an overwhelmingly positive response to the redesign. The project also demonstrated best practices in project management, stakeholder engagement, and internal communication, not only in what we did, but how we learned and adjusted throughout the project.
Practical implications or value
This paper will illustrate how building assessment into a project plan from the beginning benefits the project as a whole and makes the assessment manageable and meaningful. It will also demonstrate that there are lots of ways to measure success and being intentional about it is arguably the most important factor. The lessons learned that came out of this effort are broadly applicable to library assessment work, especially for time-bound projects.
Organizational Issues/Assessment
Supporting Development Initiatives through an Investigation of Stakeholder Insight
Steve Borrelli, Lana Munip, and Leigh Tinik (Penn State University Libraries)
Paper
Slides (PDF)
Show Abstract
In fall 2021, Penn State Library Assessment was asked to conduct a study of one of the Library Development Office’s core initiatives: Donor Community Meetings. While these meetings were aimed at connecting donors with librarians and building community around common areas of interest, donor engagement was proving difficult, and the sustainability of the initiative was in question. Guided by principles from the appreciative inquiry approach to organizational improvement, a strengths-focused and generative process that engages internal and external stakeholders, Library Assessment sought to investigate concrete ways to improve the meetings based on stakeholder insight. This investigation resulted in a series of recommendations to improve engagement and sustainability, which catalyzed a task force of Alumni Advisory Board members and Development Office and University Libraries personnel that prioritized a list of results-oriented actions to advance the initiative.
Design, methodology, or approach
Six 90-minute virtual focus groups were held over Zoom with donors, prospective donors, Development Office staff, and librarians. In all, 30 individuals participated in the study. Participants were asked a series of up to 10 open-ended questions, depending on their stakeholder group. The sessions were recorded and transcribed by the research team, and QSR International’s NVivo software was used for coding and analysis. By framing the analysis of this multi-stakeholder study through an appreciative inquiry lens, the research team sought to center the strengths of the community meeting initiative and the opportunities that existed for improvement.
Findings
The analysis revealed numerous opportunities for improvement related to the meeting structure, communications, participant interactions, and other areas. Recommendations from the study were “workshopped” in spring 2022 by the taskforce. In a series of meetings, this group further refined the recommendations and developed a plan of action to re-brand and re-package the initiative, while holding intact aspects of the meetings that the stakeholders valued.
Practical implications or value
The immediate practical implications of this project are that the Development Office was able to use the results of this study to refine and improve one of their core initiatives. Beyond this, the project had value in that it was a participatory and collaborative exercise that engaged donors (stakeholders) throughout the process, such that they were willing to share their time and expertise in working with the recommendations. This may lead to stronger connections between the Libraries and its donor base.
As public funding for higher education (and by extension, academic libraries) shrinks, fundraising plays an increasingly prominent role. Collaborations between a library’s assessment unit and its development office could potentially strengthen an organization’s donor-facing programs and fundraising initiatives.
Responding to Faculty Concerns: An Approach to Salary Equity Analysis and Course Correction
Leigh Tinik and Steve Borrelli (Penn State University Libraries)
Paper
Slides (PDF)
Show Abstract
In response to perceived salary inequities expressed at a Library Faculty Organization meeting, the Dean of Libraries charged a task force of personnel from Human Resources, Finance, Library Assessment, and Senior Administration to investigate. This paper describes a methodology for identifying and addressing salary inequities, including the development of a process to analyze salaries for market equity and compression, along with a commitment to corrective action and periodic review to minimize future inequities.
Design, methodology, or approach
Cases were systematically identified for an equity review using both market and compression analyses. To identify market inequities, a peer comparison group using the Big Ten Academic Alliance (BTAA) was compiled using the 2019 Association of Research Libraries (ARL) salary survey data. Cases falling below the BTAA 25th percentile by rank were flagged for review. To identify inequities due to compression based on rank and years of experience, a regression analysis was conducted. Cases where the actual salary fell 1.5 times below the predicted salary were flagged for review. A cost to fix salary inequities was estimated and a request for funding was submitted to the provost. Throughout this process, the taskforce sought to minimize subjectivity in selecting and reviewing cases to maintain transparency and trust among faculty.
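The compression analysis above could be sketched roughly as follows. The abstract does not spell out the regression model or the exact meaning of the "1.5 times below" threshold, so this sketch adopts one plausible reading: fit salary against years of experience by least squares and flag cases falling more than 1.5 residual standard deviations below the predicted value. All figures are invented for illustration.

```python
import statistics

# Invented data: salaries (in $1,000s) by years of experience, with one
# deliberately compressed case at 12 years.
years = [2, 4, 6, 8, 10, 12, 14, 16]
salary = [52, 56, 61, 64, 69, 60, 77, 81]

# Simple least-squares fit: salary ~= a + b * years
mx, my = statistics.mean(years), statistics.mean(salary)
b = sum((x - mx) * (y - my) for x, y in zip(years, salary)) / \
    sum((x - mx) ** 2 for x in years)
a = my - b * mx

predicted = [a + b * x for x in years]
residuals = [s - p for s, p in zip(salary, predicted)]
sd = statistics.stdev(residuals)  # spread of actual around predicted

# Flag cases well below the regression line for confidential review.
flagged = [i for i, r in enumerate(residuals) if r < -1.5 * sd]
```

In a real analysis the model would also account for rank, and the flagged cases would feed into the confidential review rather than being treated as conclusive.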
Findings
A first round of analysis identified nearly a third of all cases as potential salary inequities requiring confidential review. An additional handful of cases was flagged for review at the request of the Administration. A senior finance administrator developed a corrective plan of action, which launched a multi-year corrective strategy. Additionally, the resulting dataset has informed the process of setting initial salary offers, reducing calculation time by nearly 75% and prompting critical questions that minimize the potential for future salary inequities.
Practical implications or value
Academic libraries are increasingly engaging in critical reviews of historical practice to minimize discrimination in operations. This paper provides a methodology to consider when evaluating salary equity which minimizes subjectivity. It illustrates the value of data in equipping Human Resources personnel in calculating initial salary offers, improving upon traditional practices. It spotlights both a commitment to course correction and an approach to consider for multi-year corrective courses of action.
Assessing Student Employment in Libraries for Critical Thinking & Career Readiness
Rick Stoddart (Michigan State University Libraries), Jennifer Pesek (SJSU Graduate Student), and Kate Thornhill (University of Oregon)
Paper
Full Text (PDF)
Slides (PDF)
Show Abstract
Academic libraries typically employ a large percentage of the student employees at colleges and universities. Students employed at academic libraries can benefit greatly from high-impact practices that contribute to their academic retention, training as scholars, and future employability after graduation. This paper reports on an assessment study exploring the connections between skills acquisition, career competencies training, and high-impact practices in student employment in one academic library. The study seeks to understand how well current library employment practices prepare student employees for the skills and competencies most valued by employers, as measured by the National Association of Colleges and Employers (NACE). The proposed paper isolates the top employer-ranked competency (critical thinking) and interrogates the student employment experience for engagement with critical thinking during library employment and training, as well as students’ perceived importance of this competency to their academic majors and future careers.
Design, methodology, or approach
The study compares both quantitative and qualitative data from a survey of library student employees (N=30) and student employee supervisors (N=3) to data collected from NACE and other employer surveys. The research analyzes the connection between competencies sought by library supervisors and employers with the training and skills acquired by students while employed in the library.
Findings
While critical thinking is highly valued by students and by prospective employers, the student and supervisor responses indicate that training in, and application of, critical thinking during library employment are low. The paper provides constructive suggestions on how academic libraries might effectively leverage career readiness competencies with their student employees on the job and how assessment data might be collected to gauge improvement in these areas.
Practical implications or value
This paper will provide insight into the student employment experience in libraries and how to leverage this to demonstrate the impact of student employment in libraries as well as possible areas to focus student and supervisor professional development. Libraries invest significant resources in student employees and this paper will offer some pathways to enhance the return on investment on student employment in libraries.
Building Collective Capacity for Assessment and Advocacy: A Model for Academic Library Consortia
Lisa Hinchliffe (University of Illinois at Urbana-Champaign), Karen Brown (Dominican University School of Information Studies), and Anne Craig (Consortium of Academic & Research Libraries in Illinois)
Paper
Full Text (PDF)
Slides (PDF)
Show Abstract
While a growing body of evidence supports the assertion that academic libraries positively impact student success, libraries, individually and collectively, must make the argument to their higher education stakeholders in ways that are meaningful. Doing assessment is itself challenging; translating assessment data into advocacy strategy is a further challenge. CARLI Counts: Analytics and Advocacy for Service Development is a continuing education library leadership immersion program that took on this challenge. Funded in part by a four-year, $243,885 Laura Bush 21st Century Librarian Grant from the Institute of Museum and Library Services, CARLI Counts prepares librarians to make effective use of research findings on the impact of academic libraries on student success for service development and library advocacy. In three consecutive program cohorts, CARLI Counts participants have learned how to use local library data analytics to improve their services, demonstrate library value, and build their confidence in the ability to do so. Two program cohorts consisted of teams in which each individual worked on an issue or topic specific to their local campus; a third program cohort had teams undertaking a collaborative project focused on a specific topic (e.g., OER, online tutorials, library space). This paper compares these two approaches, assesses their relative efficacy, and provides insights into the potential contributions of academic library consortia in fostering collective impact.
Design, methodology, or approach
CARLI Counts is being evaluated using multiple methods, quantitative and qualitative, including feedback from participants, evaluation of program materials, and performance outcomes from participants. Methods include interviews, document analysis, and surveys. With each of the three cohorts, the evaluations have taken place before, during, and after the year-long experience. Additionally, the concluding year of the project includes participant surveys of the previous two cohorts to determine the impact of the program on the participants since completion of their cohort in the program. The evaluation addresses the participants’ perceptions and uses of evidence-based library practices, their leadership role around evidence-based practices at their library and on their campus, their level of engagement with CARLI, and their perspectives on the roles of an academic consortium in advancing cross-institutional, collaborative assessment initiatives.
Findings
Program evaluation data indicates that CARLI Counts is achieving its twin program outcomes of improving the ability of librarians to investigate and communicate the impact of academic libraries on student learning and success, and of growing the confidence that librarians have in their ability to do so. The evaluation results also document the benefits and challenges associated with multi-institutional team approaches to demonstrating library impact on student success.
Practical implications or value
CARLI Counts demonstrates that library consortia are well positioned to serve as a center for professional development on academic library impact on student learning and success. CARLI Counts will also be releasing an open version of the training curriculum that can be utilized by other groups. CARLI Counts personnel will make themselves available to consult with other consortia or libraries that may wish to adopt and/or adapt this model.
COVID-19
Envisioning our Future Phase III: The Pandemic Changed Everything
Nancy Turner (Temple University Libraries)
Paper
Full Text Paper (PDF)
Slides (PDF)
Show Abstract
The Envisioning Our Future project is a case study of how space supports how staff work at Temple University Libraries. We explore how physical and virtual spaces and technologies accommodate our individual work, as well as with colleagues and with users. We aim to understand how the hybrid environment has impacted our work and our organization.
Design, methodology, or approach
The project builds on previous research conducted to explore how physical space supports library work. The research took place in three phases, from Spring 2019 as the library staff were preparing for a move to a new library space, to six months post-move (Winter 2020), to Phase III, conducted in the fall of 2021. Each phase consisted of semi-structured interviews with staff (n=86). While Phase III was not part of the original project design, the pandemic and subsequent increase in hybrid work introduced important new aspects to the question of how space, increasingly digital, supports our work as individuals, with colleagues and users.
Phase III consisted of 28 interviews with staff working at a range of levels, in a variety of functional areas, and in work arrangements from fully onsite to fully remote. All interviews were conducted via Zoom, audio-recorded, fully transcribed, coded, and analyzed to discern themes.
Findings
Staff working remotely experience benefits in work productivity. They enjoy the ability to focus on individual work and have more control and flexibility around their day’s structure. Staff working remotely appreciate their supervisors’ trust and respect in providing this opportunity. If the technology is working, staff communications via Zoom and other means are seamless, and in some ways afford improved interactions with colleagues and with users.
While staff enjoy the privilege of working remotely, they describe feeling lonely and isolated at times. They describe less interaction with colleagues outside of their immediate department or project team, and less serendipitous connection. Some sense a lack of cohesion across the organization and a growing gap between onsite workers and those working from home, exacerbating what some perceive as an already siloed organization.
Staff working onsite during the height of the pandemic developed a special camaraderie from this shared experience. At the beginning of the pandemic, when vaccinations were unavailable, health and safety concerns contributed to anxiety and frustration for onsite workers, as well as for their remote-working colleagues.
Working in the same physical space offers opportunities for serendipitous meet ups and informal socializing. Being onsite also allows for a direct connection with students and the community who depend on the libraries’ physical spaces and resources, allowing for immediate assistance and communication with patrons, perhaps harder to accomplish through virtual means.
Practical implications or value
The project resulted in recommendations for next steps, from staff discussions of best practices for effective hybrid meetings to review of job descriptions for remote work opportunities across all levels of the organization. Of interest to all libraries, the research generates questions about how hybrid work environments impact organizational culture. Now that students are back on campus, we need a better understanding of their needs for in-person interactions with library staff.
Sharing is Caring: Empowering Voice and Engaging Library Staff During the Pandemic
Susanna Cowan (University of Connecticut)
Paper
Full Text (PDF)
Slides (PDF)
Show Abstract
From March 2020 through spring semester 2021, the majority of our library staff worked wholly remote schedules. For the 2021–22 academic year, staff worked mostly hybrid schedules, so that on a typical day only some fraction of staff was onsite concurrently. As at many academic libraries, work was mainly accomplished through virtual interaction launched from a wide array of offsite home offices or de facto (improvised, sometimes shifting) work spaces. Although our library staff had a variety of means for collaborating and communicating in both formal and informal ways, from WebEx and Teams to Slack, Miro boards, and more, gauging the experience of the pandemic along both work-professional and home-lived-experiential axes was more anecdotal than intentional.
Several questions persisted (and persist) across the span of the “pandemic months” – including not only the very human “how is everyone doing out there?” but also the organizationally critical “how is collaboration going?,” “how well are we connecting to each other and stakeholders,” “how engaged are staff as this continues,” and “how successfully are Things Getting Done”?
During the pandemic, things were mostly reactive, even in planning mode: shifting institutional and public health codes had to be reconciled rapidly with library norms and care for the wellbeing of our staff. In this climate, there was no time to launch formal large-scale climate surveys or service assessments. Instead, over the course of the pandemic, we conducted a series of assessments that complemented and extended each other, although that was not their initial intent. It would misrepresent this assessment sequence to describe it as planned; like most things in the pandemic, it emerged.
Nonetheless, in aggregate these assessments became critical pieces of feeling our way forward through this period. Powerfully, what also became evident was that staff were willing, perhaps eager, to be offered opportunities to share their experience of and perspective on the pandemic as it affected their work and personal lived experience. In this sense, these assessments, which had functional intent, turned out to be powerful tools for giving space for individual Voices to be heard in ways that pushed past the loose, informal exchanges in public forums like Slack. During a period of overall withdrawal, this engagement with questions about the workplace, work interaction, and the work itself was striking.
Design, methodology, or approach
There is no one tradition of scholarship or method that these assessments, or that this reflection on the organizational impact of these assessments, draws on, although it is fair to say that both are indebted to the methodology behind ARL’s ClimateQUAL organizational climate and diversity survey. We had only just finished running ClimateQUAL several months before the lockdown. As the library and University shut down in March 2020, we were finishing data analysis and writing first drafts of an executive summary of that survey’s findings to share with staff (work that would, like many things, be completed later than planned, as we had to push pause in the face of the immediate emergency).
ClimateQUAL was formative for our library, and every all-staff assessment implemented since has drawn on both principles in its design and findings specific to our organization. ClimateQUAL is a survey built to elicit often very personal experiences of an organizational culture in a manner that, in structure, content, and implementation, emphasizes care for privacy and personal autonomy. We launched ClimateQUAL in an organization that was suffering from rifts caused by distrust of leadership, feelings of disempowerment and resulting disengagement. Although the findings of ClimateQUAL showed those organizational characteristics to still exist, the way we ran ClimateQUAL worked to bolster organizational trust and individual empowerment. In pre-survey education, survey rollout, and post-survey analysis and data sharing, we were careful and meticulous. We made ample time for questions about survey data stewardship, and chose to share the full survey data report, with the exception of a single comment that attacked another staff person by name.
As a consequence of running ClimateQUAL, all of the organization-wide assessments designed during the pandemic, which included several surveys and a series of facilitated topical conversations, were intentionally structured to solicit feedback in ways that were empowering, transparent, and safe.
The experience of ClimateQUAL, and its almost uncanny placement on the eve of the pandemic, uniquely prepared us to design as-needed assessments over the course of two years of remote and hybrid work. The “sequence” of assessments began with an initial, quick, almost on-the-fly survey on “remote work.” Across fall semester 2021, a group picked up a piece of the ClimateQUAL findings by conducting a series of “conversations around teams and teamwork” aimed at uncovering why our organization had scored relatively low on “structural empowerment of teams,” a topic that felt particularly timely in this time of remote work. At the end of fall 2021, as we completed our first semester of mostly hybrid work schedules, we conducted an extensive evaluation of remote and hybrid schedules and work, asking staff to share their perspectives on several key aspects of “working,” including communication, productivity, engagement, and relationship/community building and maintenance. Finally, a strategic project group conducted a survey of the pandemic’s impact on library work and services in winter 2022.
Findings
Each of these instruments has its own local findings, and we are still drawing out of the cumulative data what we have learned, in sum, about various aspects of our organization and the work of its staff during this period.
But as important as the per-instrument findings, for this study, is what we learned about running intentionally-designed and empowering instruments at a time of individual partial or complete physical isolation. During such a period, one would expect, perhaps, to find exacerbated organizational issues such as disengagement and distrust. This would seem especially likely during a period in which “emergency” decisions were often top-heavy (or at least originating from narrow circles around leadership), as urgency dictated fast and definitive response.
But what we found, and will continue to tease out, is that staff were, regardless of specific feedback, eager to participate when invited to do so. When conducting ClimateQUAL, the work was to woo staff to participate, and the method of wooing was to be transparent, make promises (about confidentiality, data curation, and communication of results), and stick to those promises. The surveys and group conversations we conducted over the months of the pandemic took these lessons from ClimateQUAL: methods were transparent, protection of participants was paramount, and findings were shared as wholly as possible. When conducting the most sensitive of these assessments, the Fall ’21 (alternative work arrangement) Staff Self-Assessment, we designed a survey with an “employee” section and a “supervisor” section, but allowed any staff person to click through the questions of both to see what was being asked. Open-ended questions, often the “ask” that asks too much, were highly successful, and staff answered even multiple open-ended questions in single instruments.
So although the “findings” of each instrument are valuable, and this paper will highlight some of them, the “finding” that may be most lasting is the degree to which we can use assessments to cultivate Voice and engagement at a time when communication is literally channeled and constrained beyond our control.
The idea of self-assessment as critical to organizational health has, of course, been around for a long time, and it informs not only ClimateQUAL but also powerhouse organizational-corporate approaches such as TQM and its many iterations. That background is relevant to this paper, although the focus here is on how self-assessment was particularly powerful at a time of overall constraint of individual voice and empowerment.
Practical implications or value
We have learned many things about what works in assessing one’s own staff that can translate easily across organizations. There will be many, many institution-specific and regional/national (and international) studies to emerge about “what happened” in the pandemic, from services to staff experience. There is no doubt that trends will emerge from these studies that our local investigations will echo.
But the bigger lesson that will have more lasting impact on our organization – and more importance as we share with the greater assessment community — will be how “intentional” and organizationally-aware question asking can serve two purposes at once: the satisfaction of an immediate data need and a more “meta” purpose of making participants feel listened to and empowered to use their Voices, a feeling that in turn leads to continued participation (and empowerment). We have two imperatives, as we emerge from what we hope has been the worst of the pandemic: to learn about what happened and to learn from what happened.
This paper hopes to contribute to the latter. It’s a question worth asking: how does what we experienced locally extend beyond our own community? As we emerge into the next-normal, will such intentional, organization-savvy question-asking similarly elicit feedback from stakeholders such as students, faculty, and the public while also making them feel more empowered in the process? We will be running LibQUAL in the coming year: it may be our first broadly-impactful opportunity to consider both what we’re asking and what we’re accomplishing by asking. We have learned something about how question-asking, when designed on principles of personal empowerment and organizational transparency, can accomplish something undesigned (at least initially) within our organization. Can we draw on what we have learned to empower new voices, perhaps, that we have not heard from traditionally in library service assessments?
“They’ll Still Come, They Still Need You, Right?” Library Value after COVID-19
Amy McLay Paterson (Thompson Rivers University) and Nicole Eva (University of Lethbridge)
Paper
Full Text (PDF)
Show Abstract
This paper will discuss the work experiences of Canadian academic librarians during the COVID-19 pandemic, as they relate to participants’ thoughts on the value of libraries and librarians going forward. Throughout the semi-structured interviews, almost all study participants shared thoughts on how libraries should change as a result of COVID and how their work was valued (or not) by their patrons, colleagues and administration.
Design, methodology, or approach
As the goal of this study was to explore in-depth individual experiences, it was determined that semi-structured interviews would be the best method of capturing our participants’ thoughts, feelings and understandings of their work during the COVID-19 pandemic. Previous attempts to capture the phenomenon of librarian work during the COVID-19 pandemic have been through surveys, which inherently capture a wider breadth of experience; however, we wanted the chance to both probe into the depths of our participants’ experiences and to follow up or clarify any points that were raised. Our scope was limited to those working in non-administrative librarian positions at Canadian post-secondary institutions. While the observations of other library workers, such as library technicians or assistants, would undoubtedly be interesting and noteworthy, it was determined that their work and experiences would be distinct from that of librarians.
Findings
Most participants were resistant to returning to the "old normal" without myriad changes inspired by the COVID-necessitated adaptations. Proposed changes were varied and often specific to the participants' work areas, but most focused on either the future of remote work or the reevaluation of core services. However, participants raised concerns about whether or not their ideas would be implemented or even heard. Additionally, many participants felt caught between proving their value through productive (and measurable) labour and the care-work that felt necessary and pressing but was not externally validated.
Practical implications or value
Libraries often fall back into the refrain of "just the way we've always done it." Furthermore, austerity and resilience are constantly invoked as crisis responses for libraries. When we interviewed librarians in March and April of 2021, there was a resounding belief that COVID-19 was a different sort of crisis and an opportunity for real change, change that the librarians in our study were actively hoping for. However, in order to achieve these changes, library decision makers need to reevaluate their conceptions of both library value and core services.
Overcoming Technology Barriers, Particularly for Historically Underrepresented Students
Travis Teetor and Robyn Huff-Eibl (University of Arizona Libraries)
Paper
Full Text (PDF)
Slides (PDF)
Show Abstract
This paper will describe efforts at the University of Arizona Libraries to improve access to the internet and technology during the pandemic and to adapt to an ongoing hybrid instructional modality. We will highlight how our institution leveraged campus data and new partnerships to better meet student needs, particularly for underrepresented and first-generation students.
Design, methodology, or approach
The University of Arizona Libraries analyzed anonymized student demographic data, including race/ethnicity, first-generation student status, and Pell Grant receipt, to determine how existing service utilization aligned with the campus population. Our goal was to reach more underrepresented populations and students in need. The University of Arizona is located on the U.S.–Mexico border; 80% of its Distance Education students identify as Hispanic/Latinx and 74% are first-generation college students. The university typically uses Pell Grant eligibility to identify students as low-income, and in fall 2021, 40% of the total student population was Pell eligible or had received a Pell Grant at some point during their undergraduate academic career.
Findings
Reliable broadband internet and technology became increasingly essential educational resources during the COVID-19 pandemic. This is particularly true for students working in remote regions where access is limited or in multigenerational households where the environment is not conducive to learning. Inequities in accessing these resources were exacerbated by the pandemic and stay-at-home orders, especially for marginalized communities that have long been technologically disadvantaged. Consistent with national surveys, a fall 2020 UA survey indicated that one in three students faced limited internet access and two in ten reported that a lack of access to technology or software reduced their ability to perform well in classes delivered online. To address these needs, libraries can embrace new roles, collaborating in new partnerships and creating spaces that reduce students' barriers and increase academic success.
Practical implications or value
Traditionally, libraries have served as a central hub for information and resource dissemination across the campus community. New roles include bringing together units that have not traditionally worked together in order to provide increased access to technology and spaces for students. Our approach intentionally prioritizes students most in need, while acknowledging the historic inequities that our community members face. We will share approaches to looking at demographic data in order to better leverage partnerships between university units with a goal of designing efforts with significant reach and impact.
Data Presentation & Visualization
Visualizing Value of Library Collections Relative to the University Teaching and Research Enterprise: An Application of the CDL Journal Weighted Value Algorithm by Three BTAA Libraries
Sarah Murphy (The Ohio State University); Jim Stemper and Mary Schoenborn (University of Minnesota Libraries); and Lee Konrad and Stephen Meyer (University of Wisconsin-Madison Libraries)
Paper
Full Text Paper (PDF)
Slides (PDF)
Show Abstract
Academic libraries license many e-resources through state or regional consortia. Differences in school demographics, disciplinary emphases, budgets, priorities, and licensing restrictions can make the analysis of use and cost patterns in shared collections challenging. Studies have shown that a minority of e-journals in publisher packages get a majority of the downloads. However, not all articles downloaded are later used in teaching or research, leading to the question: What other metrics of use should be considered and how can they be presented to support decision-making?
Design, methodology, or approach
The Big Ten Academic Alliance (BTAA) recently purchased a subset of the Clarivate Web of Science dataset, which includes the entire WOS database. The authors, representing three of the fourteen schools in the Alliance, adapted the California Digital Library (CDL) Journal Weighted Value Algorithm, which generates a value score for individual journals using authorship, citation, and usage data, for analysis and visualization. This project shows how to model COUNTER download data in Tableau alongside bibliographic, authorship, and citation data from Web of Science, plus average cost data from Library Journal's annual periodicals price survey, and then effectively display these data through a series of dashboard visualizations. Specifically, the authors created three prototype dashboards to visually answer several questions by subject discipline and publisher package at both broad and more granular levels. The questions included:
Title dashboard
What journals do BTAA faculty and researchers publish their research in?
What journals do BTAA faculty and researchers cite?
What journals do BTAA faculty and researchers download?
Do 20% of the titles represent 80% of total downloads?
Publisher dashboard
Do a publisher’s titles in a subject area represent better value compared to the average score for all publisher titles in that subject?
Subject dashboard
Do authorship/citation rates vary widely by subject discipline?
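The CDL Journal Weighted Value Algorithm itself is not reproduced in this abstract; as a purely illustrative sketch of the general idea, a journal-level value score can be built as a composite of normalized authorship, citation, and download counts. The journal names, counts, and equal weights below are invented for illustration and are not the CDL specification:

```python
# Hypothetical per-journal counts; names, numbers, and equal weights are
# illustrative assumptions, not the actual CDL algorithm.
journals = {
    "Journal A": {"authorships": 120, "citations": 3400, "downloads": 80000},
    "Journal B": {"authorships": 45, "citations": 5100, "downloads": 42000},
    "Journal C": {"authorships": 10, "citations": 220, "downloads": 9000},
}

metrics = ["authorships", "citations", "downloads"]

def normalized(metric):
    """Min-max normalize one metric across journals so no single
    raw count dominates the composite score."""
    values = [j[metric] for j in journals.values()]
    lo, hi = min(values), max(values)
    return {name: (j[metric] - lo) / (hi - lo) for name, j in journals.items()}

norms = {m: normalized(m) for m in metrics}

# Equal-weighted composite value score per journal.
scores = {
    name: sum(norms[m][name] for m in metrics) / len(metrics)
    for name in journals
}

for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.3f}")
```

In a real implementation the weights, normalization, and input fields would follow the CDL algorithm's published definitions rather than the equal weighting assumed here.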
Findings
Each dashboard shows how many BTAA schools subscribe to each title. Dashboard users can also limit the view to a particular school.
Practical implications or value
Tableau offers an opportunity to automate the packaging and display of large datasets, allowing librarians to design useful visualizations and then refresh the underlying data on a daily, weekly, monthly, or custom schedule. This saves library analysts significant time when information is needed to inform decision-making. This project shows how to model, analyze, and assess the value of publisher journal collections held locally or by the consortium by visualizing trends for downloads, citations, and authorships. It also provides a proof of concept for the long-range potential of automating the visual analysis of big data to enhance academic library collection development.
Visualizing the Intersection of Impact and BTAA Libraries Investments in the Research Enterprise Using Open Government Data: An Exploratory Model Using Tableau
Sarah Murphy (The Ohio State University)
Paper
Full Text Paper (PDF)
Slides (PDF)
Show Abstract
Grant funding serves as an important proxy for quantifying the value of the academic library. Past studies rely on researchers to self-report whether they used and cited library resources when crafting successfully funded research proposals (Kaufman 2008; Tenopir et al. 2010). More recent studies seek to quantify library support for the grant-seeking process using data gleaned from Scopus, Web of Science, Journal Citation Reports, and other tools (Boukacem-Zeghmouri et al. 2016; Monroe-Gulick, Currie, & Weller 2014). With the advent of the Federal Funding Accountability and Transparency Act of 2006 and the NIH Public Access Policy, libraries can now leverage open government data to explore the relationship between grant funding and investments in library collections and services. This study explores modeling open data on government spending and federally funded research outputs to 1) visually demonstrate how libraries contribute to the research enterprise by providing information scholars need to both develop and sustain their research agendas and 2) allow libraries to visualize and utilize this same data to inform the development of library services and collections.
Design, methodology, or approach
Open government data now allows libraries to identify publications authored by their institution's faculty that result from federally funded research. Libraries may model this data to determine what journals faculty chose to publish their research in, what journals faculty cite, and the value of these individual research outputs. The model for this project was created by first identifying all NSF and NIH project grants awarded to BTAA faculty between FY2010 and 2018 using the Federal RePORTER (now USAspending.gov) data portal. BTAA schools collectively expended more than $11 billion on research in FY2019 and have a robust program for optimizing researchers' access to member libraries' collections. A list of publications associated with these grant projects was then downloaded with the corresponding link tables from the Federal ExPORTER and enhanced by pulling lists of citing papers and reference papers to identify what journals were used to inform the authors' research and what journals cite the authors' research. NlmIds were added wherever possible using the journal title for each publication to later identify and use MeSH terms as visualization filters. The Relative Citation Ratio (RCR) value was also pulled for each publication to use as a supplemental metric of value. All data was then modeled in Tableau using a series of relationships and joins and visualized in a series of interactive dashboards.
Practical implications or value
Modeling and visualizing the outputs of successful grant-seeking using Tableau allows libraries to explore this data at both a high aggregate level and a lower level of detail. This project demonstrates how to assemble and utilize such data to both illustrate libraries' ongoing contributions to the research enterprise and inform library collections and services.
Development of an IPEDS Academic Programs Dashboard: Leveraging Public Higher Education Data for Strategic Insight
Joe Zucca and Clair Johnson (University of Pennsylvania)
Paper
Slides (PDF)
Show Abstract
This paper addresses the question of how academic libraries can gather intelligence about patterns within the institution and across the ecosystem of higher education. More specifically, it explores a particular approach to monitoring changes in academic programs and the emergence of new areas of study.
Design, methodology, or approach
To demonstrate this approach to tracking academic programs, this paper explains how data from the Integrated Postsecondary Education Data System (IPEDS) were accessed and processed into an interactive dashboard developed in Microsoft Power BI. The final dashboard is publicly accessible and displays data gathered from the 132 institutions with a Carnegie classification of “doctoral university: very high research activity.”
Findings
This paper demonstrates the value of utilizing public data with data dashboarding methods to leverage crowdsourced insights from a massive trove of information. Insights that have already emerged from these efforts include:
- The growth of multi- and inter-disciplinary areas of study
- The emergence of data-focused areas of study alongside the decline of content-bound areas of study (e.g., data science in contrast to economics)
- The unique identities of Ivy League institutions based on their predominant areas of study
These insights (and others that have emerged or will continue to emerge) provide information that can be used to inform the development of academic libraries. This paper will discuss the implications for hiring, organizational structures, and services provided. One such implication is the need to hire librarians able to support multidisciplinary research and curricula and to create an organizational structure that does not position them in service to individual subject areas.
Practical implications or value
This paper provides a proof of concept for the use of public data in an interactive dashboard and demonstrates how other institutions can replicate this approach with any number of datasets and visualization tools. Furthermore, in an era when massive troves of data serve as an overwhelming source of information, this paper demonstrates how those working in library assessment can crowdsource insights most relevant to the context in which they work. Finally, this paper demonstrates the importance of thinking about library assessment with an outward-looking lens, assessing not only how the institution is functioning internally, but also monitoring the external trends and patterns that ultimately impact the work of the library.
Organization/Space/Critical
A Sense of Place
Holt Zaugg (Lee Library BYU)
Paper
Full Text Paper (PDF)
Slides (PDF)
Show Abstract
Recent efforts have aimed to help all students feel included and welcome on campus, and especially in the library. However, before embarking on efforts to help students feel more included and welcome, one needs to know the current state of inclusion and welcomeness. This paper describes our first efforts to establish baseline data for our library.
Design, methodology, or approach
Following a literature review, we identified six measures that we collectively call Sense of Place measures. These include:
A student’s sense of belonging
A student’s connection to the library
How respected the student feels in the library
How safe a student feels in the library
A student’s level of comfort in talking with a library employee
How welcome a student feels in the library
Using a survey and a stratified random sample of undergraduate students, we invited students to indicate the degree to which they experienced each of these measures. We examined mean ratings for all students and then disaggregated them by ethnicity, gender, and university status. We also used a principal components analysis to determine how the Sense of Place measures clustered for each group.
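The paper does not give its analysis code, but the logic of the clustering check can be sketched in miniature: if one principal component of the six measures' correlation matrix explains most of the variance, the measures behave as a single common factor. Everything below (the simulated 500 students, the Likert scale, the latent-factor strengths) is an invented example, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)
measures = ["belonging", "connection", "respected", "safe",
            "comfort", "welcome"]

# Simulate 500 students' 1-7 Likert ratings driven by one shared latent
# factor, mimicking the expectation that the six measures cluster together.
latent = rng.normal(0, 1, size=(500, 1))
noise = rng.normal(0, 0.5, size=(500, len(measures)))
ratings = np.clip(np.round(4 + 1.2 * latent + noise), 1, 7)

# Principal components via eigendecomposition of the correlation matrix:
# a dominant first eigenvalue means the measures load on one component,
# i.e., a common "Sense of Place" factor.
corr = np.corrcoef(ratings, rowvar=False)
eigvals = np.linalg.eigvalsh(corr)[::-1]  # largest first
explained = eigvals / eigvals.sum()

print("Proportion of variance explained by PC1:", round(explained[0], 2))
```

With real survey data one would also inspect component loadings, and rotate or retain components using standard criteria, before interpreting the clusters.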
Findings
There were strong mean ratings for five of the six Sense of Place measures, with one, connection to the library, having a lower mean rating. Disaggregation helped to identify specific groups with higher or lower mean ratings. The principal components analysis typically grouped all measures into one or two components, indicating a common Sense of Place measure.
Practical implications or value
The mean ratings provide baseline data for comparison following future initiatives. The six measures create an initial understanding of components that help students feel like the library is a positive part of their lives and that students are a positive part of the library’s life. Most importantly, the results offer value about what we are doing well with an eye towards where and how we can improve.
Methods & Tools/Digital Libraries
Understanding Library Reach and Impact with a CRM
Ellie Kohler (Virginia Tech University Libraries)
Paper
Show Abstract
This paper shares the approach taken by the Data Analytics Team at the University Libraries at Virginia Tech (Blacksburg, VA, USA) to measure library connections with university areas in an effort to understand the depth and scope of the library’s influence on the university. The purpose of this study is to examine the relationships created between library personnel and the Virginia Tech community through information recorded in the library’s Customer Relationship Management (CRM) software. Also included are discussions of necessary CRM modifications and descriptions of tools and methods used to transform and display results.
Design, methodology, or approach
This study will utilize 12 months of data collected by individuals in the University Libraries through LibConnect, a Springshare CRM software. All efforts were made to gather information in an ethical manner, and this paper will address necessary modifications to LibConnect and their impact on analysis. Analysis methods will include standard quantitative statistical analysis, application of time series algorithms to understand how seasonality affects the data, and clustering and network analysis to generate relationship-based mapping. This study will examine both the breadth and the depth of the respective relationships.
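The abstract does not describe LibConnect's data model, but the breadth/depth distinction it draws can be illustrated with a toy interaction log: depth as repeat contacts between the same librarian and unit, breadth as the number of distinct units a librarian reaches. All names and counts below are invented, not LibConnect fields:

```python
from collections import Counter, defaultdict

# Hypothetical CRM interaction log: (librarian, campus_unit) pairs.
interactions = [
    ("Lib1", "Engineering"), ("Lib1", "Engineering"), ("Lib1", "Physics"),
    ("Lib2", "History"), ("Lib2", "History"), ("Lib2", "History"),
    ("Lib3", "Engineering"),
]

# Depth: repeated contacts between the same librarian/unit pair
# (the "vertical" relationships the study anticipates).
depth = Counter(interactions)

# Breadth: number of distinct units each librarian reaches.
breadth = defaultdict(set)
for librarian, unit in interactions:
    breadth[librarian].add(unit)

for librarian, units in sorted(breadth.items()):
    print(librarian, "reaches", len(units), "unit(s)")
print("Deepest relationship:", depth.most_common(1)[0])
```

A full analysis would layer network metrics (e.g., treating librarians and units as the two sides of a bipartite graph) and time series decomposition on top of counts like these.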
Findings
It is anticipated that the findings will demonstrate that relationships correlate closely with the size of each respective college or department within the university. In part because of the liaison librarian organizational structure, it is also forecast that many relationships will be vertical, involving multiple instances of a single librarian interacting with a given department or college. It is also acknowledged that, since this is a relatively new system only recently adopted by the library, adjustments will need to be made for gaps in the data.
Practical implications or value
This study details the approaches taken by the Virginia Tech University Libraries Data Analytics Team to measure engagement, part of a greater effort to understand library users, provide the best possible service, and address gaps in outreach efforts. In recent years, there has been an effort to know how physical and electronic resources are utilized. This is incredibly valuable information; however, it does not fully demonstrate the value of the relationships created between library personnel and other members of their communities. The work will contribute to library assessment as a whole by demonstrating a way to build a system that can measure library connections created through instruction events, consultations, collaborations, and partnerships using a CRM. Through the setup and use of data collected through the CRM, this project ultimately hopes to create a blueprint of a library's influence while respecting ethical data collection principles.
The Meaningful Measurement of Liaison Librarian Services in an Uncertain World
Jennifer Thomas and Joanna Logan (Queensland University of Technology)
Paper
Full Text (PDF)
Slides (PDF)
Show Abstract
This paper discusses an approach the Queensland University of Technology (QUT) Library in Brisbane, Australia, took to more accurately measure the value and impact of its Liaison Librarian service. This service consists of teams of Liaison Librarians (faculty librarians) who liaise with the university's faculties and divisions, and an enduring issue has been the inaccurate measurement of this service. Reporting requirements have changed over the years, and the library was using a legacy system for tracking work that was no longer fit for purpose. There was also great variation in how individual librarians used the system, resulting in further issues with data integrity. This paper discusses how, based on the strategic imperative to report on liaison initiatives and engagement more meaningfully, a small working group within QUT Library redefined liaison data collection in an effort to future-proof the service and elevate the importance and value of liaison work.
Design, methodology, or approach
The Liaison Impact Working Group (LIWG) was formed, led by QUT Library’s Liaison Service Manager and consisting of four Liaison Librarians and the Library’s Quality and Planning Manager. The group carried out extensive stakeholder engagement, a comprehensive data audit and an environmental scan which included consulting with colleagues from other institutions. The group also gathered feedback on the use of the current system (SharePoint Online) and prototyped several updates (SharePoint Online and MS Forms), resulting in the final product that is currently in place. It was an iterative process that took place over approximately eight months between 2021 and 2022.
Findings
Early findings are promising. While change is hard, having Liaison champions on the working group was key in selling the value of the new procedures and system to its users. The new system has been operating since January 2022. By November 2022 we anticipate being able to share meaningful insights into liaison engagement, a reduced duplication of effort in capturing workload, and an elevated awareness of the value of liaison work which is critical in the current environment.
Practical implications or value
The new system has been implemented at no extra cost to the Library. Working group members volunteered their time, appraised options and chose to update an existing tool. This process could also be rolled out to other library services seeking more effective forms of measurement. The process could also assist colleagues in other institutions facing similar issues.
One-Size-Doesn’t-Fit-All: Differentiated Engagement Pathways for Transfer Student Success
Rebecca Croxton and Anne Moore (University of North Carolina at Charlotte)
Paper
Full Text (PDF)
Slides (PDF)
Show Abstract
Transfer students are an increasing sub-population of college and university students. High-transfer, four-year institutions strive to understand the indicators of transfer student adjustment, retention, and success to inform policies and services to support these students to succeed in their academic goals. As the number of students entering higher education from high schools decreases and the number of adults needing to complete or continue their education increases, we must develop a deeper understanding of the factors that contribute to transfer student retention and success. Which engagement activities should be promoted as critical pathways for success for this student population?
This study investigates which library, other co-curricular and extracurricular activities, pre-college, and demographic factors contribute to transfer versus first-time-in-college (FTIC) freshmen retention and success at a large, public university in the southeastern US with a high transfer population.
Design, methodology, or approach
This project is part of a longitudinal study of undergraduate student engagement and success of students who matriculated in fall 2012 through fall 2020. The dataset contains more than 130,000 student records and includes information about engagement with the library, other co-curricular and extracurricular services and activities, high impact practices, pre-college variables, demographic factors, and measures of student success.
Using Tinto’s “student integration theory” and Hills’s theory of “transfer shock” to frame the study, the researchers conducted an analysis of students who entered the university as FTIC freshmen and transfer students, including a deeper exploration of transfer student data disaggregated based on (1) the number of incoming credits, (2) first generation status, (3) in-state versus out-of-state originating institution, and (4) type of transfer institution. Data were analyzed using Analysis of Variance and binary logistic regression with propensity score matching.
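The abstract names binary logistic regression with propensity score matching but does not give procedural details. As a minimal illustration of the nearest-neighbor matching step only: each transfer student is paired with the unused FTIC freshman whose propensity score (normally the fitted probability of transfer status given pre-college and demographic covariates) is closest. The IDs and scores below are invented:

```python
# Hypothetical propensity scores; in practice these would be fitted
# probabilities from a logistic regression on covariates.
transfer = {"s1": 0.62, "s2": 0.35, "s3": 0.80}          # treated group
ftic = {"f1": 0.60, "f2": 0.33, "f3": 0.78, "f4": 0.10}  # comparison pool

matches = {}
available = dict(ftic)

# Greedy 1:1 matching without replacement: each transfer student gets
# the closest remaining FTIC student by propensity score.
for sid, p in sorted(transfer.items(), key=lambda kv: kv[1]):
    best = min(available, key=lambda fid: abs(available[fid] - p))
    matches[sid] = best
    del available[best]

print(matches)
```

Production matching would typically use caliper constraints and optimal (rather than greedy) matching, and would check covariate balance after matching.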
References
Hills, J. R. (1965). Transfer shock: The academic performance of the junior college transfer. The Journal of Experimental Education, 33(3), 201–215.
Tinto, V. (1993). Leaving College: Rethinking the Causes and Cures of Student Attrition, 2nd ed. University of Chicago Press.
Findings
Preliminary findings indicate that the undergraduate subgroups based upon admission status (transfer students and FTIC freshmen) and the number of incoming credits are uniquely different from each other with respect to engagement with the library and other co-curricular and extracurricular activities and in achieving the identified measures of success. The pathways for success are also nuanced based upon the subgroup and the success measures. Additional analyses related to (1) first generation status, (2) in-state versus out-of-state originating institution, and (3) type of transfer institution are underway.
Practical implications or value
This study is the first of its kind to compare the out-of-classroom engagement of transfer students with that of FTIC freshmen, nuanced by (1) the number of incoming credits, (2) first-generation status, (3) in-state versus out-of-state originating institution, and (4) the type of transfer institution. Findings will help universities structure support systems and services to help this growing population of students succeed and graduate. The methodologies used in this study can be adapted to explore engagement pathways to success for other student subpopulations.
Best Practices for Assessing Reuse of Digital Content: Educational and Instructional Design Perspectives
Joyce Chapman (Duke University), Caroline Muglia (University of Southern California), Santi Thompson (University of Houston), Ayla Stein Kenfield (University of Illinois at Urbana-Champaign), Liz Woolcott (Utah State University), Elizabeth Kelly (LOUIS: The Louisiana Network), Ali Shiri (University of Alberta), Derrick Jefferson (American University), Nicole Hennig (University of Arizona Libraries), and Ranti Junus (Michigan State University Libraries)
Paper
Full Text (PDF)
Show Abstract
While digital library practitioners measure "use" of digital collections using access metrics, they rarely measure or assess "reuse" in research, social media, instruction, and other contexts. Reuse metrics are often anecdotal and ephemeral, which poses a challenge to their collection and to comparison with other metrics. To that end, the Digital Content Reuse Assessment Framework Toolkit (D-CRAFT) project has developed Ethical Guidelines and Best Practices for practitioners to assess how users engage with, reuse, and transform digital content. D-CRAFT is a multi-year IMLS-funded project that began in summer 2019. At the last LAC conference, we presented on the Ethical Guidelines developed for this project. This presentation will present the completed Best Practices and discuss the development of the project's Educational Tools and online Toolkit.
Design, methodology, or approach
Because assessment, access, privacy, ethics, cultural competency, and educational tools are key pillars of the toolkit's design, the grant provides funds to hire part-time consultants specializing in Privacy, Diversity and Inclusion, Assessment, Instructional Design, and Accessibility. Consultants contribute valuable expertise to key product development.
Developing Best Practices and Engagement and Education Tools:
The project team began by conducting a wide-ranging literature review. The team used Dedoose to code, thematically group, and tag excerpts from the resulting corpus.
Sub-teams developed Best Practices and Educational Tools. Each group used the rich data in Dedoose to conduct a gap analysis, and perform further data gathering as needed.
Sub-teams authored the Best Practices for each Method, as well as for Tools associated with each Method. Supplemental Materials were also created where appropriate.
Subject experts from the GLAMR community were hired to review the Best Practices and enhance as needed.
The Instructional Design consultant began creating Educational Tools in March 2022.
Findings
The deliverables of D-CRAFT include Ethical Guidelines for assessing digital object reuse, Best Practices around assessment of digital content, and a suite of freely available engagement and Education Tools. Examples of instructional design modules for use cases that focus on methods and tools for digital object reuse assessment will be shared.
Practical implications or value
The D-CRAFT toolkit will be a vital GLAMR community resource that addresses the lack of common practices and instructional resources for assessing reuse of digital materials, provides definitive guidelines on what constitutes use and how that differentiates from reuse of digital content, and develops the first Ethical Guidelines for assessment and reuse of digital content.
D-CRAFT is a product of the GLAMR community. This session will enable the D-CRAFT project team to collect valuable feedback on the project from the assessment community.
Teaching/Learning
Assessing Synthesis of Information from Sources
Sarah Dahlen (CSU, Monterey Bay)
Paper
Full Text Paper (PDF)
Slides (PDF)
Show Abstract
When we teach information literacy, much of our attention is focused on students' ability to find information, evaluate it, and cite it. How students incorporate that information into their papers is equally important, as this allows students to achieve their communicative purpose. Many instructors expect students to go beyond summarizing information from sources to synthesizing that information, showing the reader the connections between sources. Assessment of students' ability to synthesize information has received scant attention in the scholarly literature, leaving librarians who wish to assess this area with little guidance. After spending three years working with multidisciplinary teams of faculty on the assessment of synthesis, the author has developed a set of tools and recommendations for assessing synthesis in student work, as well as instructional materials for making improvements to teaching and learning in this area.
Design, methodology, or approach
Information literacy assessment at the author’s institution is a collaborative process in which the author leads multidisciplinary teams of faculty in scoring authentic student work with a rubric. Initial assessments using an adapted version of AAC&U’s Information Literacy VALUE Rubric identified synthesis as an area in which students were not demonstrating proficiency at the desired level. This rubric, however, merely rates the presence/absence of synthesis as part of one criterion, prompting us to create a rubric dedicated to the synthesis of information from sources. The rubric developed by Lundstrom et al. (2015) served as a valuable starting point, but we needed a rubric broad enough to evaluate assignments from different courses, disciplines, and class levels. The first iteration of our synthesis rubric was employed in 2020 for program-level assessment in the Social and Behavioral Sciences major. Applying the rubric led to revisions, and its second iteration was employed in 2021 for campus-level assessment. A final round of revisions resulted in the version adopted by our campus.
Findings
Using a rubric such as the one we developed is a viable method for assessing students’ ability to synthesize information from sources in a way that can lead to improvements in teaching and learning.
Our results showed much room for improvement in this area. In an effort to close the loop, we developed an assignment guide that advises instructors how to incorporate synthesis into their assignment prompts, and a video showing students how to use a synthesis table to identify connections between sources.
Practical implications or value
The library assessment community will recognize synthesis as an important component of information literacy and one that can be assessed by applying a rubric to student papers. The rubric we developed is available to be used or adapted to meet the needs of other institutions, and our assessment methods may be a useful model for those considering similar endeavors.
Rubrics are not merely assessment tools, but also roadmaps for instructors and students seeking to better understand synthesis and its component parts. Our rubric, assignment guide, and instructional video can all be employed as teaching tools to assist librarians and other faculty in their efforts to improve students’ ability to synthesize information from sources.
Comparing Two Information Literacy Development Strategies for Online Doctoral Students
Carolyn Heine (California Baptist University)
Paper
Full Text (PDF)
Slides (PDF)
Show Abstract
Doctoral students appear to be under-supported by libraries as both adult learners and students conducting original research. Further complicating this issue is the increase of low-residency or online-only programs that limit librarians’ ability to offer face-to-face or synchronous instruction for all doctoral students. Experiential learning has been shown to be an effective pedagogical approach, but there is little research on its effectiveness with information literacy development in an asynchronous context.
Design, methodology, or approach
Design: This study used a pretest-posttest controlled experimental design to test the effectiveness of fully asynchronous modules that incorporated principles of Kolb’s (1984) Experiential Learning Theory and best practices in online instruction (Darby & Lang, 2019) to develop information literacy in first-year doctoral students.
Participants: Students from the Doctor of Social Work program and the Doctor of Public Administration program were the participants and were randomly assigned to a control or treatment group.
Intervention: The control modules contained only video tutorials, a common type of library support offered in an asynchronous context. The treatment modules employed an experiential learning intervention. Both sets of modules took roughly the same amount of time, and the content was centered around (a) conducting a literature search for original research and (b) strategies for tracking searches for a dissertation over several years.
Data Collection:
Pretest – Confidence (10 Likert-scale items) and IL knowledge (8 multiple-choice items)
Posttest – Pretest items (confidence and knowledge) plus the application of knowledge in a practical exercise (a 4-question activity graded using a rubric)
Data Analysis: A MANCOVA was conducted to determine whether there were significant differences in participants’ information literacy confidence, as well as their IL knowledge as assessed on a multiple-choice test, with their overall pretest scores used as the covariate. A MANOVA was conducted to determine whether there were significant differences in participants’ ability to demonstrate their IL in a practical exercise.
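The MANCOVA described above can be sketched in Python with statsmodels, which runs a MANCOVA when the covariate is included in the formula. This is a minimal illustration, not the author's actual analysis: the data are simulated, and all variable names (group, pretest, confidence, knowledge) are assumptions standing in for the study's real measures.

```python
# Hedged sketch of a MANCOVA on two posttest outcomes (confidence, IL
# knowledge) with the overall pretest score as covariate, using simulated
# data in place of the study's real measurements.
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(0)
n = 40  # hypothetical cohort: ~20 per group
df = pd.DataFrame({
    "group": np.repeat([0, 1], n // 2),    # 0 = control, 1 = treatment
    "pretest": rng.normal(50, 10, n),      # covariate: overall pretest score
})
# Simulated posttest outcomes; both depend on the pretest covariate
df["confidence"] = 0.5 * df["pretest"] + rng.normal(0, 5, n)
df["knowledge"] = 0.4 * df["pretest"] + 2 * df["group"] + rng.normal(0, 5, n)

# Adding the covariate to the right-hand side makes this a MANCOVA
mancova = MANOVA.from_formula("confidence + knowledge ~ group + pretest", data=df)
res = mancova.mv_test()
print(res)  # multivariate test statistics (Wilks' lambda, Pillai's trace, ...)
```

Dropping `pretest` from the formula yields the plain MANOVA used for the practical-exercise outcomes.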
Findings
Although the experiential learning treatment did not yield significant differences between the groups in confidence or IL knowledge, the treatment did produce a significantly greater ability to demonstrate IL in a practical exercise (F(4, 15) = 3.586; p < .05). Student feedback indicated that the online modules were well received and that IL development beyond the first semester would be beneficial.
Practical implications or value
A major implication of this research, as it relates to assessment, is that there is a difference between a student’s ability to demonstrate “knowledge” of IL and their ability to apply that knowledge to real-world assignments. A second, related implication is that multiple-choice questions alone may not accurately measure whether a student is information literate; assessing actual student work will likely be more accurate. A third implication, as it relates to instruction, is that doctoral students who engage in activities that approximate real-world tasks will be better equipped to transfer learned skills to coursework and the dissertation. I hope this study inspires librarians to design assessment efforts that will allow them to identify causal relationships, not just correlative relationships, between their instruction and student learning.