2022 Schedule

This is a preliminary schedule. Details are subject to change.
All times are listed in Eastern Daylight Time.

Tuesday, November 1, 2022

11:30 a.m.–1:00 p.m.: Assessment Accelerators

CARL Library Impact Framework
Justine Wheeler (University of Calgary), Mark Robertson (Brock University), and Tania Gottschalk (Thompson Rivers University Library)

A Roadmap to Practical Strategic Planning
Maurini Strub (University of Rochester) and Starr Hoffman (UNLV Libraries)

Present & Future Proficiency: Updating the ACRL Assessment Proficiencies to Reflect Current and Coming Realities
Rebecca Croxton (University of North Carolina at Charlotte), Megan Oakleaf (Syracuse University), and Jung Mi Scoulas (University of Illinois Chicago)

1:30 p.m.–3:00 p.m.: Assessment Accelerators

Choosing Your Own Scholarly Communication Assessment Adventure: Applying and Reviewing a Draft Engagement Matrix to Evaluate Program Growth and Development
Emily Chan (San Jose State University), Suzanna Yaukey (Towson University), Nicole Lawson (California State University, Sacramento), and Daina Dickman (Network of the National Library of Medicine Region 5)

Demystifying Qualitative Coding
Alisa B. Rod, Marcela Y. Isuster, and Tara Mawhinney (McGill University)

Online Participatory Design: Activities and Approaches for User Engagement in the Remote Environment
Jackie Belanger and Reed Garber-Pearson (University of Washington)

3:30 p.m.–4:45 p.m.: Assessment Accelerators

A Discussion on the Evolution of Assessment Work in Academic Libraries: Is this a fluke, or is this our future?
Kat Bell (George Mason University), Emily Guhde (Georgetown University), Steve Borrelli (Penn State University Libraries), and Maurini Strub (University of Rochester)

Data Ethics and Learning Analytics: Putting Privacy into Practice
Lisa Hinchliffe (University of Illinois at Urbana-Champaign)

Introducing the Values-Sensitive Library Assessment Toolkit: A Practical Approach for Ethical Assessment
Scott Young (Montana State University)

5:00 p.m.–6:00 p.m.: Attendee Welcome Session


Wednesday, November 2, 2022

11:30 a.m.–11:45 a.m.: Welcome Session

11:45 a.m.–12:45 p.m.: Keynote

Elusive Measures: The Imperative to Identify and Utilize DEI Indicators that Shape Stakeholder and Organizational Outcomes

Denise Stephens, Peggy V. Helmerich Chair and Dean of University Libraries, University of Oklahoma

Denise Stephens has served as Peggy V. Helmerich Chair and Dean of University Libraries since May 2021. She returned to OU after serving in library and information technology leadership roles at the University of Virginia, the University of Kansas, Syracuse University, the University of California, Santa Barbara, and Washington University in St. Louis. She is an Oklahoma native who received her undergraduate and graduate degrees from the University. During her professional career, she has focused a considerable amount of her leadership effort on how academic research libraries develop organizationally and on strategies for demonstrating relevance and value in the higher education landscape. Her contributions include service on the ARL Diversity and Research and Analytics committees, as well as board service for several national organizations advancing access to scholarship and cultural heritage materials. Among these are the Digital Public Library of America, the Center for Research Libraries Global Resources Network, the WEST Regional Storage Trust, and the Greater Western Library Alliance.

1:30 p.m.–3:00 p.m.: Concurrent Sessions

Diversity, Equity, and Inclusion

Towards a Modern, Inclusive Library Advisory Board
Anita Hall (University of Louisville Libraries)
Paper
Show Abstract

Purpose and goals
The purpose of this study is to determine the prevalence, composition, and value of advisory groups for academic libraries in 2022 and to make recommendations for modernizing these groups and making them more inclusive. With an increasing focus on diversity, equity, and inclusion (DEI) initiatives in academic libraries, we must ensure that these groups are inclusive and representative of our user populations, that they provide a safe environment for members of diverse and minoritized groups to offer genuine feedback, and that participation in library advisory groups leads to tangible improvements to members’ library experience. Additionally, the study will explore ways in which the Covid-19 pandemic and a broader increase in the availability of virtual/hybrid options for participation have affected current practice around advisory group recruitment, participation, meeting formats, and overall group utility moving forward.

Design, methodology, or approach
This study has two components. The first (quantitative) is a census of ARL academic library websites to determine the prevalence and characteristics of library advisory groups in 2022. The second (qualitative) is a series of semi-structured interviews with assessment librarians and others responsible for facilitating these groups, to identify ways in which these groups have evolved and to determine best practices for utilizing them in the current moment. A modified grounded theory methodology will be used for analysis of interview responses.

Findings
Anticipated findings include strategies for recruitment of diverse and representative advisory group members, meeting schedules and formats, and engagement of members to ensure quality feedback that can be used for meaningful improvements to library spaces, services, and/or collections.

Practical implications or value
This study will provide guidance for assessment librarians and others who currently facilitate advisory groups or are interested in doing so. There is very little published work on advisory groups in academic libraries from the last decade, and none that evaluates DEI in this context. The early stages of the study indicate that these groups are still widely used among ARL academic libraries, and a more modern look at their usage and utility will add to the overall understanding of assessment and user research in libraries.

Demographic Analyses in the US: An Insight-Based Approach to Studying Diverse Needs for Library Planning
Starr Hoffman (UNLV Libraries)
Paper
Show Abstract

Purpose and goals
This paper describes an analysis of statewide demographic shifts, undertaken to better identify changing library patron needs as part of an approach to assessing the outcomes of Library Services and Technology Act (LSTA) grants (an IMLS initiative that provides funds to state library agencies). This practical approach to evaluation studies will explore:

How to put a demographic analysis together
How it relates to the LSTA “Measuring Success” framework outcomes (Focal Areas and Intents)

Design, methodology, or approach
For this statewide demographic analysis, primary sources included data from the U.S. decennial census and the American Community Survey (ACS), as well as related demographic reports. Demographic shifts over the past ten years were explored at various geographic levels to help determine changing library needs. The Public Library Statistics (PLS) and the Census Bureau’s ACS data website were also used to understand library service locations and adjacent needs.

Findings
Abbreviated findings for some states will be provided.

Practical implications or value
Similar demographic analyses could be useful for State Library Administrative Agencies (SLAAs) seeking to clarify their statewide goals, as well as for library directors of all types, community colleges, and universities seeking information on the needs of state residents. This paper will explore which census tables are useful for particular topics (especially as related to race and ethnicity), how to locate census data and other sources, and special issues with census data. We will also discuss which demographic elements relate to the six LSTA outcome areas and their associated intents: Lifelong Learning, Information Access, Institutional Capacity, Employment and Economic Development, Human Services, and Civic Engagement.

Converting Climate Assessments into Evidence-Based Change
Brian Keith (University of Florida)
Paper
Show Abstract

Purpose and goals
Many organizations engage in climate assessment for diversity, equity, and inclusion initiatives and other strategic purposes. However, the results, both qualitative and quantitative, are difficult for organizations, let alone individual employees, to interpret in order to understand their workplace or to use in guiding or contributing to change. This case study addresses that problem. The methods presented, data analysis and visualization, show how to make the results digestible to all employees. The resulting organizational development and change management processes presented here address the challenges of converting data into action. Attendees will be better able to meet employees’ expectations for workplace growth and to ensure that their participation in climate assessment results in evidence-based change.

Design, methodology, or approach
The ClimateQUAL climate assessment tool was the source of the initial data used here. Quantitative and qualitative analysis were used to interpret the voluminous data set in the standardized tabular report. Data visualization and communication methods used for the internal audience reflected adult learning practices. The resulting evidence-based change actions that were implemented were anchored in organizational development and change management literature, including transformational leadership, and in modern management concepts, including procedural and contributive justice.

Findings
Data collection is the easiest, and very much preliminary, stage in climate assessment. Most efforts fail at the subsequent interpretation or action phases. The methods presented here are effective in positioning leaders at various levels, as well as staff, to understand climate assessment results and to engage in visualizing and implementing change initiatives based on organizational strengths and weaknesses.

Practical implications or value
Libraries are increasingly focused on organizational improvement. Effectiveness, both generally and in terms of recruitment and retention of key personnel, depends on evidence-based planning and decision making. Employees also expect issues in their work experiences to be resolved. Climate assessments are therefore likely to be conducted at most institutions, and with some frequency, but delivering on the promise of these efforts is a challenge. Assessment professionals and other leaders need to be prepared with a variety of methods to lead the data collection, analysis, and interpretation activities and to support change implementation. Each successive phase is more challenging, and this presentation will offer methodologies for each. The methods are transferable to various assessment instruments and organizational initiatives.

A Mixed-Methods Approach to Assessing Diversity, Equity, and Inclusion in Library Collections
Jayne Sappington (Texas Tech University)
Paper
Show Abstract

Purpose and goals
Diversity, equity, and inclusion (DEI) studies have increased in prominence on academic campuses along with calls to question privilege and power structures, making DEI collections assessment critical. Texas Tech University Libraries undertook a two-part project that evaluated user needs, collections usage, cataloging and discoverability, and user behavior in searching for and evaluating DEI resources.

Design, methodology, or approach
The researchers administered online surveys to understand faculty perceptions of the library’s collections and the availability of DEI resources for research and class assignments. They scanned course syllabi for instructor-assigned resources and reviewed DEI-related award lists to identify key resources to compare against library holdings. Usage reports and faculty resource requests were additional methods for measuring the comprehensiveness and appropriateness of the Libraries’ collections in meeting user needs.

Additionally, the researchers conducted 32 usability tests to evaluate the user experience of searching for library resources related to DEI topics. They measured user satisfaction and the perceived difficulty of the search process using observation, quantitative and qualitative survey and interviewing methods, and structured and closed-ended questions to understand the views of library users.

Findings
The researchers identified the potential for partnering with academic programs that currently utilize and request DEI-related resources. They found that even though most users reported being satisfied with their search results, many expressed uncertainty in searching for and evaluating DEI resources. Users expressed interest in search enhancements for better filtering, library website improvements, and more prominent guidance for DEI research help. An item’s description or abstract and its title were the components users most often relied on to determine how relevant a source would be. This finding supports other research demonstrating the need for increased attention to cataloging and metadata, particularly the Table of Contents and abstract or summary fields.

Practical implications or value
The researchers identified potential ways individual libraries can address patrons’ difficulties narrowing a topic or search and more large-scale cataloging changes that could improve discoverability and user experience. Areas for future study include research into what cataloging enhancements are most effective and identifying ways librarians can advocate for large-scale changes in cataloging practice involving collaborations between multiple libraries or even vendors. The researchers envision that other academic librarians could replicate and expand aspects of this study in their own libraries, while avoiding the challenges identified in this research.

Collections Value/Impact

360 Degree Approval Plan Assessment
Eva Jurczyk (University of Toronto Libraries)
Paper
Show Abstract

Purpose and goals
Library collections are one reason that students, faculty, and researchers access the library’s physical and digital spaces. The University of Toronto Libraries (UTL) acquire a great number of monographs through Approval Plans (APs). For example, between April 2019 and March 2020, over 50% (10,895 titles) of the monograph acquisitions from a major US/UK vendor came through the library’s AP. APs shape the library’s collective collection, but these plans require regular assessment to ensure they perform as expected.

The goal of the project was to understand how well this major AP was supporting research, teaching, and new academic programs, and whether the plan reflected current, evidence-based collection development practices, including the degree to which it contributed to the diversity and inclusivity of our collections.

Design, methodology, or approach
Phase 1 (September–November 2021)

The first phase of the project was to define the quantitative criteria for an optimally performing approval plan. This was undertaken through four focus groups with subject selectors and liaison librarians from across the system. The transcripts of these focus groups were coded using NVivo, and a follow-up survey was sent to the same group. The results of this phase were used to set the benchmarks by which we would define a successful approval plan.

Phase 2 (December 2021–March 2022)

The project team gathered and analyzed data, including five years of approval plan and firm order purchase data, bibliometric data about activities of University researchers, circulation and course reading list information, and holdings data for other libraries in our network. This analysis was centred on answering how well our plan was performing against the benchmarks defined in Phase 1.

Findings
Through the focus groups, the project team found that a diversity of assessment approaches was needed for approval plan assessment and that a consultative, systematic, data-driven approach was necessary. This approach allowed us to set benchmarks, identify patterns, uncover the reasons for those patterns and determine what can be improved. While specific benchmarks varied by discipline, we addressed the following questions through data analysis:

Are there any gaps in geographical coverage?
Are there any gaps in the subject areas we collect?
Are there any significant differences in usage by acquisition type?
What is the level of overlap in print purchases between UTL and non-central libraries?
What percentage of materials used for teaching are covered by the approval plan?
What percentage of slipped material do we select?

Practical implications or value
For libraries using approval plans for print collections, these plans represent a significant portion of the acquisitions budget. They are complex, assembled by multiple stakeholders, and bound by the limitations of vendor systems. This multidimensional approach to assessment can serve as a model for other libraries hoping to undertake such a review.

Opting In or Out of Checkout History: What Drives Patrons’ Decisions about Their Library Data
Craig Smith (University of Michigan)
Paper
Show Abstract

Purpose and goals
Traditionally, libraries have been cautious about retaining patron data. In recent years, however, some academic libraries have embraced analytics as a way to understand and make inferences about patrons’ library-related behavior. These types of studies require the preservation of large amounts of patron data; in many cases patrons are not given explicit choices about whether their data are retained. At the University of Michigan, we recently began offering patrons a choice about whether their checkout histories are retained. We used this as an opportunity to explore the following research questions: What reasons motivate library users to either retain or delete their checkout data, and how do these decisions relate to other factors such as campus role, patron demographics, and disciplinary area?

Design, methodology, or approach
To explore these questions, we sent a survey to each patron who logged into their account and opted in or out of retaining a checkout history. The survey asked each patron why they made the choice they did and also explored issues of trust in data management. Although the study is ongoing, we have already analyzed the responses from 180 members of our campus community.

Findings
One key finding is that, on average, patrons trust the library significantly more than the broader university with regard to data management, and they further trust the university significantly more than internet-based companies (e.g., Amazon, Netflix). This may be one reason why, when offered the choice about whether to retain their checkout histories, 90% of patrons have thus far chosen to have the library store their checkout data. The richness of the survey data lies in the reasoning supplied by patrons, when asked about the choices they made. These open-ended responses were coded independently by two raters who achieved acceptable agreement (all Cohen’s kappas > .70; discrepancies were easily resolved through discussion). Prominent reasons why people wanted the library to retain their checkout data included using their checkout histories as helpful reading lists, using their histories to facilitate research endeavors, and feeling confident that their data would be managed carefully. Prominent reasons why others chose to have their checkout data deleted included deep privacy concerns, the view that the library should not be storing such data, and even some astonishment that the library had historically been storing checkout data. We also explored whether patrons in historically marginalized groups had different levels of trust in the library’s data management, compared to people not in such groups; thus far we have not found differences.

Practical implications or value
These findings are novel in the area of library assessment and research; understanding how people think about their library data will allow libraries to offer appropriate choices to patrons, and may even point the way to new services for patrons who do want some of their data retained.

Importance and Impact of Library Resources and Services for Faculty
Jung Mi Scoulas (University of Illinois Chicago)
Poster
Show Abstract

Purpose and goals
This research aims to examine faculty perceptions of the university library’s programs and services and the frequency of library resource and service use, and to assess the library’s impact on faculty research productivity.

Design, methodology, or approach
An online survey was distributed to all faculty at the University of Illinois Chicago (UIC) in Spring 2022, asking about the use and importance of library resources and services, as they relate to faculty teaching and research. Participants’ demographic information was obtained in advance from the Office of Institutional Research. This study was approved by the Institutional Review Board at the UIC (research protocol #2021-1409).

Findings
A total of 557 faculty responded: 176 respondents were assistant professors, 136 were associate professors, and 125 were professors. Regarding the importance of library support related to teaching, the results showed that “Ensure that students who graduate from my program are skilled at locating, evaluating and applying information” was ranked the highest in importance (75.2%), followed by “Have a link in Blackboard to UIC Library resources” (53%). Respondents rated “Assign course readings in print” as least important (30.8%).

With respect to the importance of library support related to research, the top four resources/services rated “Very/Extremely important” were “Online journals” (97%), “Databases to find literature” (92%), “Interlibrary loan” (75.8%), and “E-books” (70.5%). The top three resources/services rated “Not important” were “Special Collections” (43%), “Digital images” (30.8%), and “Print books” (28.5%).

Faculty engaged in research or scholarship were further asked how often they used library resources for their research, using a four-point scale from “Never” to “Weekly or more often”. Among the 11 resources listed, “Online journals” (89.1%) and “Databases to find literature” (70.9%) were the most frequently used. More than 60% of respondents used “E-books” once a month or more, followed by “Interlibrary loan” (51.7%) and “Print books” (51.5%). About half of the respondents indicated that they never used the following resources: “Special Collections” (54.4%), “Comprehensive literature search support” (48.9%), and “Subject and course guides” (46.4%).

A Spearman rank correlation was employed to examine whether frequency of library resource use correlated with research productivity. Only certain library resources that faculty reported using in 2021 were correlated with their research productivity: print books (rs [407] = -.136, p < .01), online journals (rs [418] = .194, p < .01), databases (rs [419] = .124, p < .05), and subject and course guides (rs [400] = -.099, p < .05); notably, the directions of the correlations differed.

Practical implications or value
The findings from this research will be invaluable for the library to better understand our user community’s needs, make strategic changes to library resources, and demonstrate the library’s impact on faculty academic success.

Do Online Library Collections Impact Faculty Productivity?
Sandra De Groote (University of Illinois Chicago)
Poster
Show Abstract

Purpose and goals
Over the past 25 years, how faculty search for and access information has changed greatly. Where once faculty had to enter the physical library to access information, online databases and journals now dominate the information landscape in many disciplines. This research seeks to demonstrate how the availability and use of online library collections has affected faculty publication patterns and impacted faculty productivity.

Design, methodology, or approach
The publications of faculty at a large urban Research 1 institution were examined over a 25-year period. In the late 1990s the university library had approximately 16,000 active print journal subscriptions; by 2019, the library had approximately 28,000 online journal subscriptions. To explore the impact of library collections on faculty collection use and productivity, the following information was captured: collection use (measured by the number of references in the publications), grant funding (whether the article was funded), co-authorship size (number of co-authors), faculty productivity (number of publications per faculty member), and faculty demographics (e.g., rank and years at the institution). Retrospective journal publication data were collected to determine how publication patterns of faculty have changed over time as the size of the journal collection at UIC increased. This research project was initiated as part of participation in the Association of Research Libraries (ARL) Framework Project funded by IMLS.

Findings
The number of articles written by faculty has increased significantly over time, as has the average number of references included in the articles. The findings also suggested that the more productive a faculty member, the less use is made of library collections, as measured through the references included in publications. It was also observed that as the number of articles per author increased over time, so too did the number of co-authors per article. Grant-funded articles included more references than unfunded articles, and grant-funded publications also had a higher number of co-authors than non-grant-funded articles.

Practical implications or value
Demonstrating a relationship between library collections use and faculty productivity may help better illustrate to university administrators the role and importance of the academic library in the research lifecycle. Understanding changes in faculty publication patterns and disciplinary differences can provide insight into how faculty use the collections and into the growing participation in team science.

From Collections to Collaboration: Building for the Future
Chan Li (UC Berkeley)
Poster
Show Abstract

Purpose and goals
UC Berkeley Library maintains one of the most comprehensive research library collections in the world and strives to build deep and broad collections across many subject areas. At the same time, the Berkeley Library faces many challenges and changes, such as budget cuts, staff shortages, increasing demand for digital access, and the increasing need for research and teaching support. Undertaking deep and holistic collections analysis is critical to fostering a better understanding of Berkeley’s library collections, including identifying areas in need of digitization and collaborative collecting. The insights will help the Library develop a strategy to broaden access to uniquely held collections and expand the scope of Berkeley’s collections.

Design, methodology, or approach
OCLC’s GreenGlass is one tool that can help with this analysis. The snapshot data that GreenGlass provides can assist in identifying both distinctive and widely held collections through numerous holdings comparisons with other UC libraries, ARL peer institutions, and other major academic institutions in California and the U.S. The data can also shed some light on user demand by providing circulation history. HathiTrust coverage is critical for identifying collections that don’t currently have digital copies. In addition to GreenGlass, OCLC’s WorldShare Collection Evaluation service will provide more detailed holdings comparisons with other UC campuses. Together, these data points can paint a more comprehensive picture of the Berkeley collection in relation to other institutions, user demand, and digital access.

Findings
For digitization, we hope to create multiple lists of titles, ranked from high to low priority based on a range of criteria, that could be used for future digitization projects. GreenGlass is a tool that can help identify categories of collections to consider for digitization. The primary objective of digitization is to expand access to collections, and GreenGlass can be used to isolate collections that are rare, not under copyright, and not already available in HathiTrust. Although not a preservation strategy, digitizing print collections may reduce handling and, in turn, wear and tear on rare items in our general collections, particularly when print materials are frequently used.

For collaborative collecting, we hope to identify a few subject areas that are suitable for collaborative collection development with targeted institutions.

Practical implications or value
The analysis and findings will provide insights into potential strategies for maintaining and developing sustainable collections, enable us to perform targeted digitization and preservation, and foster collaborations with peer institutions to continue to build deep and diverse collections and provide enhanced access to scholarly resources in a myriad of formats.

OverDrive User Experience Within a Small, Rural Academic Library
Michelle Owens (Rogers State University)
Poster
Show Abstract

Purpose and goals
Due to the continuous evolution of higher education and declining public university budgets, academic libraries have become increasingly creative in ensuring they can offer the widest possible range of services and resources to their user base. This study assesses the experience of library users with OverDrive at a small, rural, regional university. The goal of this study is to raise awareness of the uses and benefits of this resource for academic libraries.

Design, methodology, or approach
The study utilizes a voluntary, quantitative online survey distributed by email. It does not request any personally identifiable information. The results will be used to understand the ways different user types at Rogers State University (RSU) do, or do not, use OverDrive and the Libby app, available through RSU Libraries. The data gathered will be analyzed with the goals of identifying patterns of use among different user types, areas for collection growth, and new ways of improving resource marketing.

Findings
RSU Libraries launched OverDrive during the late fall semester of 2018. Now that the RSU community has had access to this resource for three years, we hope to find that many users who utilize OverDrive at other libraries are also aware of the RSU Libraries OverDrive collection. We tentatively anticipate that many RSU Libraries users are accessing OverDrive primarily for academic purposes and secondarily for entertainment.

Practical implications or value
The findings of our survey will be integrated into a full case study to be published and presented at a state library conference. We hope to leverage our study to expand interest in the creation of an academic OverDrive consortium in order to reduce costs and expand access to more public Oklahoma universities. We would like to change the current perception in Oklahoma academic libraries that OverDrive is primarily for public library use.

Assessing Book Club-in-a-Bag: What We Learned from Piloting a Campus-Wide Book Club Initiative
Rachel Dineen (University of Northern Colorado)
Poster
Show Abstract

Purpose and goals
Students, staff, and faculty of our campus community continue to navigate life through challenging circumstances. In the Spring of 2021, in the midst of an ongoing pandemic and increased racially motivated violence, our academic library launched a campus-wide Book Club-in-a-Bag program. Modeled after public library programs, this pilot reading initiative was designed to promote a forum for safe and open discussions on issues of inclusivity, equity, and diversity through leisure reading. Our continuing goal for Book Club-in-a-Bag is to promote communication and respect among our diverse campus community through the shared experience of reading. This poster will outline our process for assessing this pilot, what we’ve learned, and our plans for the future of the program.

Design, methodology, or approach
Our purpose for assessing the Book Club-in-a-Bag pilot was to understand the following:

If the titles offered were engaging/interesting/appropriate
If the number of book titles/number of copies is sufficient for participating groups
If the check-out period is sufficient
The quality and value of the supplemental materials provided
The overall satisfaction of the participants

For the assessment, we designed a post-participation survey instrument of 13 questions. Paper questionnaires were included in the folder packed in each bag, or participants could respond digitally through an online survey system. We encouraged all participants to contribute to the assessment.

We chose this assessment method because of its convenience and its effectiveness. Data has been easy to gather and quick to analyze. Qualtrics, the survey system our campus uses, allows the program coordinators to be notified when a submission has been made, which saves time. The survey can also be quickly edited to include new book titles or additional assessment questions as the book club matures.

Findings
Our findings for the assessment of the pilot indicate great satisfaction with the program, and everyone indicated they would check out a different title. Some of the benefits of using the program include getting to know others in their departments, being able to participate in good discussions about complicated issues, and enjoying leisure reading. The suggestions for additional titles also help guide us in new purchases as we look to expand the program to have new titles each year.

Practical implications or value
Our objectives for this book club initiative were to:

Provide an opportunity for students, faculty, and staff to engage with each other through a book club
Encourage self-reflection and analysis of our personal and shared impact on perpetuating social inequities, systemic racism, and oppression
Highlight voices of those from marginalized communities or identities
Promote reading as an accessible and engaging activity

By developing a successful and sustainable assessment method, we can continue to ensure that we are working toward our overall objectives and grow our book club programming in the future. We hope to give other academic library practitioners some insight by sharing our experiences and processes openly.

Effects of Campus Closure on DDA
Keri Prelitz (California State University Fullerton)
Poster
Show Abstract

Purpose and goals
This paper will compare two DDA assessments conducted at Pollak Library at California State University, Fullerton, one in early 2019 prior to the campus closure due to the pandemic, and one in 2022 following the return to campus and in-person classes. The assessments look at statistics comparing expenditure, usage, how short-term loans impact ownership, changes in publisher costs and participation, and how to use the information to determine the best settings for moving forward. This comparison will provide insight on how a multi-disciplinary, robust Ebook Central DDA was impacted by off-campus research and instruction, and how this information can be used to help future planning and DDA expansion.

Design, methodology, or approach
The paper uses statistics pulled from LibCentral, the administrative site for Ebook Central DDA.

Findings
The hypothesis was that the campus closure would have increased our ebook DDA expenditure, and in particular that much of the expenditure would have gone toward short-term loans rather than ownership. However, the initial review suggests that expenditure did not increase as much as hypothesized. The study will suggest possible reasons why expenditure remained lower than anticipated, such as publishers no longer participating.

Practical implications or value
This paper will offer methods for assessing DDA that can be replicated by other institutions that find ebook DDA spending difficult to predict and worry about cost increases. It will also help other institutions adjust their own DDA settings based on the publisher settings and results examined in the two assessments in this study.

On the Assessment of Acquisitions Methods and Usage of Monograph Collections: UCB & UCLA Libraries
Osman Celik (UC Berkeley Library)
Poster
Show Abstract

Purpose and goals
Assessment of monographic resources is a critical prerequisite for the development of user-centric and sustainable collections in academic libraries. The relationship between acquisitions methods and their impact on monograph usage has been understudied in the literature, which this research investigates in a comparative style. The specific research question motivating this study is: what is the nature of the correlation between various acquisitions methods and the usage of print monographs? The study develops a spectrum of acquisitions methods ordered by user-centricity and argues that the more users and selectors are involved in selecting scholarly resources, the more likely those resources are to have higher usage on average. More formally, it is hypothesized that the more user-centric an acquisitions method is, the higher the circulation and overall usage of the resources acquired through it.

Design, methodology, or approach
In evaluating the hypothesis, the design of our comparative study includes cataloging, acquisitions, and usage data for print monographs acquired during fiscal years 2009–2019 at the UC Berkeley and UCLA Libraries. The dataset includes various acquisitions methods, bibliographic data, and multiple categories of usage statistics for monograph collections – in English and select European languages – allowing us to assess acquisitions methods such as blanket orders, approval plans, firm orders, and demand-driven acquisitions (DDA), which helps determine the user-centricity of various acquisitions of print monographs at both libraries. Data for the usage analysis is extracted from the ILS and its various modules, and hence is limited to the extent and categories of usage statistics that the ILS in use is designed to capture and retain.

English and non-English monographic collections have been built through various acquisitions methods at both libraries. There is an interesting variance across the two libraries’ methods of collection development that makes this comparative study more intriguing. For instance, area studies collections at both libraries were built through major approval plans and firm orders. While various acquisitions methods were used for UCLA’s English monographs, the UC Berkeley Library relied solely on firm ordering during the same period.

Findings
As expected, the empirical analysis identifies DDA as having the highest usage statistics among all acquisitions methods included in the study. The analysis also points to slightly higher usage statistics for firm-ordered English materials than for those acquired through approval plans at both libraries. Usage of select European-language monographs is found to be lower than that of English materials; however, firm-ordered non-English materials register higher usage than historical approval plans, partially confirming our hypothesis about the user-centeredness spectrum of acquisitions methods.

Practical implications or value
This research intends to help fill the gap in the assessment literature on monographic collections. The original contribution of this study is its analysis of acquisitions methods and their implications for the usage of monographic resources. In addition to its comparative approach to the two largest collections in the UC system, it outlines assessment techniques that can be replicated in academic and research libraries. Lastly, the study draws attention to the need to be cognizant of acquisitions methods and their connection to collection usage.

Narrowing the Gap: Using Data We Have to Add Content We Don’t
Liz Bober (Case Western Reserve University)
Poster
Show Abstract

Purpose and goals
Collection strategies and assessments have always been critical to research libraries, but even more so in this post-pandemic world. With budgets slashed, libraries are challenged to think differently and work collaboratively to build the best collections possible. This project focuses on how members of the assessment, finance, and technical services teams worked together to identify key data points that could help research services librarians make strategic collection decisions that support the research and curriculum needs of our faculty and students. To drive collection development decisions based on user needs, we analyzed trends from the COUNTER Journal Access Denied report, ILL and eJournal request data, vendor-supplied reports, and resource costs to determine which resources were worth the investment. By analyzing these data, we can identify the materials most pertinent to our users and narrow the gaps in our collection in a practical and effective way.

Design, methodology, or approach
Data is provided through COUNTER TR_J2 Journal Access Denied reports, vendor supplied reports, ARL Peer Review, ILL data, cost/value analysis and literature review.

Findings
We are starting to see that by linking turnaway data, ILL data, and cost/value analysis, we can identify researcher-driven gaps in the collections. Our findings are being shared with the research services librarians for consideration in future purchases.
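The linking described here is essentially a join across several data sources. A minimal sketch in Python with pandas, using fabricated journal titles, counts, costs, and column names (none of which come from the poster), might look like:

```python
import pandas as pd

# Hypothetical extracts: COUNTER TR_J2 access-denied (turnaway) counts,
# ILL request counts, and subscription quotes. All values are illustrative.
turnaways = pd.DataFrame({
    "journal": ["J. Applied Widgets", "Rev. of Gadgetry", "Acta Thingamajig"],
    "denials": [120, 45, 8],
})
ill = pd.DataFrame({
    "journal": ["J. Applied Widgets", "Acta Thingamajig"],
    "ill_requests": [30, 2],
})
costs = pd.DataFrame({
    "journal": ["J. Applied Widgets", "Rev. of Gadgetry", "Acta Thingamajig"],
    "annual_cost": [5000.0, 1200.0, 900.0],
})

# Link the three sources on journal title and compute a simple
# cost-per-expressed-need ratio; lower values suggest stronger candidates.
merged = (
    turnaways.merge(ill, on="journal", how="left")
             .merge(costs, on="journal", how="left")
             .fillna({"ill_requests": 0})
)
merged["demand"] = merged["denials"] + merged["ill_requests"]
merged["cost_per_use"] = merged["annual_cost"] / merged["demand"]
print(merged.sort_values("cost_per_use")[["journal", "demand", "cost_per_use"]])
```

A real workflow would of course start from exported report files rather than inline data, and the ranking metric is only one of many reasonable choices.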

Practical implications or value
By connecting the different acquisition data points with a cost breakdown, librarians can consider purchasing items that are critical to our users’ needs. This is a different approach to collection development, one that allows user-driven collection strategy decisions to be made at a time when many budgets have been significantly reduced.

Services/Usability

Assessing Scan and Deliver during COVID-19 and Beyond
Lisa Levesque (Toronto Metropolitan University)
Paper
Show Abstract

Purpose and goals
Toronto Metropolitan University Library implemented the scan and deliver service in June 2020, during the COVID-19 pandemic. With this service, patrons can request that a portion of text, such as a chapter of a book or a journal article, be scanned by a library staff member and emailed to them. This and other complementary services were implemented because pandemic lockdowns limited access to the physical library building and the print collection. As the Library space reopened to patrons, we assessed this service to understand its impact and plan for future service offerings. This paper addresses why Toronto Metropolitan University Library patrons have used the scan and deliver service: what benefits does it offer them, what role do scanned materials play in their scholarly research, and what barriers does it help them overcome? These questions will inform the future of this service at our library and can be used to address similar questions facing other libraries regarding services implemented mid-pandemic.

Design, methodology, or approach
This assessment used an email survey and qualitative coding to explore these research questions.

Findings
Almost all survey respondents (97.5%) replied that the service had enabled them to overcome barriers to their academic pursuits, with one calling the service “critical to mission” during a time of limited access to print resources. The scan and deliver service extended access to Library resources in a manner that respondents describe as fast, convenient, reliable, and which made research possible even at great distances. Scan and deliver was used strategically by patrons in combination with other library services, such as interlibrary loan and online course reserves, to broaden access to print resources. We found that the scan and deliver service addressed patron needs that extend beyond those experienced during lockdown, as our patrons continue to travel long distances to reach campus, face challenges in accessing critical resources needed for research and coursework, and have health concerns that limit their access to the physical library.

Practical implications or value
This assessment has practical value for libraries that are considering implementing a scan and deliver service, or that are assessing services implemented mid-pandemic. In a time when there are no longer COVID-19 lockdowns, library assessment must consider the future of services implemented during the pandemic. Patrons told us that they want scan and deliver to continue; a “return to normal” should not undo service improvements. We envision the library assessment community drawing on our findings in relation to assessment projects that focus on barriers to access and the impact of the COVID-19 pandemic on sustained service improvements.

Optimizing a Library Website for Student Research: Comparing User Metrics between Encore and Google Scholar
Lindsay Ozburn (Utah State University)
Paper
Show Abstract

Purpose and goals
The paper addresses the methods and general conclusions portion of an experiment that evaluated user preference and search experience between using Google Scholar and Utah State University Libraries’ Encore discovery layer as a starting point for research. USU’s 2019 Ithaka S+R Faculty survey highlighted that our faculty utilize Google Scholar more as a starting point for their research. To triangulate these findings, the experiment attempts to identify which search methods undergraduates prefer.

Research questions:

  1. What is the average completion time for tasks performed in Google Scholar versus tasks performed in Encore?
  2. How many actions or clicks does it take to perform tasks in Google Scholar versus in Encore?
  3. What are the benefits and/or drawbacks that users perceive when using Google Scholar versus Encore to search for information?
Design, methodology, or approach

This research study included a pre-survey, task analysis, and post-survey for two groups of randomly selected undergraduate students.

The task analysis portion of the experiment utilized A/B testing, with the two groups evaluating two search platforms:

  1. The Library’s current search environment (Encore)
  2. A mock environment with a dual search tab set-up and direct access to Google Scholar.

Each group inversely performed 10 common information search and retrieval tasks, split between the two search platforms.

Loop11 (website usability testing software) tracked the number of clicks and completion time for each task and was also used to collect the data for the pre-survey and task analysis portion of the project.

Following the task analysis exercise, all participants completed a post survey that asked them to reflect on the pros and cons of each interface.

To ensure generalizability to the campus population, the team utilized a stratified random sampling methodology with the following strata: STEM, non-STEM, and undeclared majors.

Using a random number generator, the team statistician randomly selected several sets of 112 students from a complete list of undergraduates, such that the proportion of STEM, non-STEM, and undeclared students in each sample was proportional to the campus population.

The research team then sent recruitment emails to the samples, offering the experiment on two different days. The team was prepared to accept up to the first 56 participants, based on a power calculation that identified the number of participants needed for statistically meaningful results.
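The proportional allocation step of stratified sampling can be sketched as follows. Only the 112-student sample size comes from the paper; the strata sizes, student IDs, and seed are fabricated for illustration:

```python
import random

# Fabricated population: (stratum, student_id) pairs. The real study drew
# from an actual undergraduate roster; these proportions are invented.
population = (
    [("STEM", i) for i in range(4000)]
    + [("non-STEM", i) for i in range(5000, 10000)]
    + [("undeclared", i) for i in range(12000, 13000)]
)
sample_size = 112
total = len(population)

rng = random.Random(42)  # fixed seed so the draw is reproducible
sample = []
for stratum in ("STEM", "non-STEM", "undeclared"):
    members = [p for p in population if p[0] == stratum]
    # Allocate seats in proportion to the stratum's share of the population.
    k = round(sample_size * len(members) / total)
    sample.extend(rng.sample(members, k))

print(len(sample))  # close to 112, up to per-stratum rounding
```

In practice the per-stratum counts may need a small adjustment when rounding does not sum exactly to the target sample size.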

Findings
Expected conclusions:

  1. Our discovery layer, while adequate, does not offer the best search experience.
  2. Undergraduates prefer to use a combination of Google Scholar and a more curated discovery layer to search for information
  3. A dual-tab search interface facilitates more streamlined information searching as opposed to a single search box experience.

Practical implications or value

  1. Our work provides a straightforward example of implementing A/B testing to evaluate user search experiences.
  2. The study provides an easy-to-understand statistical sampling methodology that enhances the validity of user testing data. Our methodology can be easily replicated in other research.
  3. Our outcomes will contribute to bodies of assessment work that evaluate ever-changing user preferences with information search and discovery.

Integrated and holistic project assessment for a library website redesign
Heidi Burkhardt (University of Michigan Library)
Paper
Show Abstract

Purpose and goals
When assessing a project it can be difficult to decide what to focus on, and the things you want to measure are often multidimensional and do not fit into a single assessment method. For the University of Michigan Library’s 2-year website redesign project, assessment was integrated into the work and timeline from the start. The goals for our assessment plan were to be able to measure the legacy website against the redesign across a variety of metrics, as well as know how well the structure of the project team worked and whether our internal communication and outreach were successful.

Design, methodology, or approach
The assessment plan was split into seven metrics: usability, accessibility, mobile experience, content authoring experience, site performance, project management and structure, and internal communication and outreach. Within each metric, we articulated a desired outcome and a set of methods that we’d use to evaluate it. Each method included “plan statements” that laid out exactly what we would do. In some cases the statements were specific tasks (e.g. conduct tree testing on draft information architecture), while others stated an intention or practice (e.g. follow WCAG 2.1 (A and AA) guidelines, or use JIRA Project to track the progress and completion of deliverables and tasks). Overall, we used a combination of formative and summative assessments, while also employing programmatic strategies for how we worked and built the site to support achieving the desired outcomes.

Findings
We found the website redesign demonstrates significant improvement over the legacy website in usability, accessibility, site performance, and the content authoring experience. In addition to the results of our assessment metrics, the praise that came in from colleagues in the library and broader community illustrated an overwhelmingly positive response to the redesign. The project also demonstrated best practices in project management, stakeholder engagement, and internal communication, not only in what we did, but how we learned and adjusted throughout the project.

Practical implications or value
This paper will illustrate how building assessment into a project plan from the beginning benefits the project as a whole and makes the assessment manageable and meaningful. It will also demonstrate that there are many ways to measure success, and that being intentional about measurement is arguably the most important factor. The lessons learned from this effort are broadly applicable to library assessment work, especially for time-bound projects.

Liaising through Numbers: Implementing a Transparent and Sustainable E-resource Assessment Plan
Taylor Ralph (Oregon State University)
Poster
Show Abstract

Purpose and goals
The purpose of this poster is to outline a guide for implementing a sustainable, systematic assessment plan for electronic resources. By providing a necessities checklist and a list of questions to consider when creating any collection assessment plan, this poster will act as a concise, low-barrier entry point for this complicated process and its results. With library budgets being cut and journal prices continuing to increase, there is a need to look more closely at expensive electronic resource subscriptions and make consistent decisions about renewing, cancelling, or unbundling them. Data abundance and staff shortages add to this equation, leaving library employees wondering where to begin. This scalable guide results from trial and error in creating an assessment plan from scratch, techniques drawn from relevant scholarship, and ongoing efforts to refine and translate collection data.

In this poster I hope to:

Provide a plan outline for sustainably assessing electronic resources, including tips on resource prioritization, timelines, and tools
Suggest questions to consider about current data collection practices and communication processes
Identify stakeholders with which to engage in discussion and decision-making to increase process transparency and enhance feedback
Share a necessities checklist to be consulted before beginning electronic collection assessment

Design, methodology, or approach
To kickstart an assessment plan for electronic resources, I consulted recent literature in library journals such as Collection Management and Serials Review, as well as ALA CORE presentations. With basic outlines in place, the assessment process went through trial and error. We collected informal feedback and asked questions: Which questions indicated a missing data point? What was the feedback on our communication method? Do participants have the capacity to keep up with this work? Which viewpoints are we missing? In the future, I hope to survey Library employees about their satisfaction with the transparency and efficacy of the process.

Findings
Having a defined electronic resource assessment plan makes collection assessment more manageable and consistent, resulting in better reporting and the ability to track trends over time. This, combined with a clear communication plan and prioritizing transparency about collection decisions, increases trust with the larger Library and the institution. Ultimately, better collection decisions are made with directed feedback and input.

Practical implications or value
From this poster I hope that Library employees identify electronic resource data collection and organization processes, tools, and communication methods that can be implemented at their own institutions. Library employees who are new to, or have been asked to engage in, this type of assessment may feel overwhelmed or lost at the prospect of starting the process. There are many excellent resources on qualitative and quantitative collection assessment, holistic collection assessment, and case studies; however, a basic starting guide is an important and often overlooked aspect of electronic collection assessment. This information is especially valuable now, because with budget cuts at many institutions it is important to share consistent and credible data to justify decisions.

Assessing Outcomes of an Undergraduate Archival Scholar Research Program (ASRA) at the University of Pittsburgh
Berenika Webster (U Pittsburgh)
Poster
Show Abstract

Purpose and goals
In 2016, the University of Pittsburgh’s Library System (ULS) and Office for Undergraduate Research (OUR) partnered to launch the Undergraduate Archival Scholar Research Program (ASRA). The program introduces undergraduate students to research. Students are awarded a $1,000 stipend to pursue an independent research project using ULS’s special and archival materials. Each student is assigned an academic mentor and a librarian or archivist to support them in completing their project. In the intervening years, 74 students have completed the program.

While we had anecdotal evidence of the program’s success from participant feedback, it became clear that a more formalized approach to assessment was needed, particularly as we increased our fundraising efforts to support the program.

In 2019, ULS and OUR colleagues agreed to initiate work to define the ASRA outcomes both partners wanted to achieve and to create mechanisms for assessing whether those outcomes are met.

Information gathered through the assessment could be used by both partners to advocate for continued funding for the program and/or identify areas in which the program may need to change to better meet its goals.

Design, methodology, or approach
To develop and map program outcomes, we used a logic model framework. It allowed us to think about the project outcomes in three stages:

1. immediate outcomes that focus on knowledge acquisition,

2. intermediate outcomes that deal with application of that knowledge, and finally,

3. long-term outcomes that attempt to show whether any material changes took place in program participants’ lives.

Once these were agreed on and documented (in a formal logic model document), the ULS team began considering the best mechanisms for assessing whether the program outcomes, as defined in the logic model, are being met. Since most of the program outcomes focused on the acquisition of research skills in general, and on working with primary materials in particular, we developed an assessment rubric based on the ACRL-RBMS/SAA guidelines.

We used the rubric to design additional assessment tools, including pre- and post-participation surveys, prompts for participant research outputs, and an interview schedule for past participants.

Findings
The logic model provided a clear articulation of the goals and outcomes of a multi-partner project. It allowed us to develop assessment tools that can be consistently applied to measure project outcomes. The rubric proved invaluable in guiding the development of additional assessment mechanisms consistent with program outcome goals.

Practical implications or value
We believe that the logic model framework for articulating program outcomes, and the rubric for operationalizing those outcomes, can be applied successfully as consistent assessment tools.

In our project we used the ACRL-RBMS/SAA guidelines to develop our rubric, as ASRA focuses on research skills for work with primary sources. However, the rubric can be adapted to other types of literacies, based on the focus of the evaluated program or service.

Dear Diary Study: Student and Participant Leadership in a Library Assessment Project
Truc Ho (University of Washington)
Poster
Show Abstract

Purpose and goals
In 2022, the Assessment team at the University of Washington explored a new model for incorporating student voices into decision-making and assessment of the Libraries reopening. Due to the shifting nature of reopening plans and the staff time demanded by previous collaborative, cross-portfolio Participatory Design projects, the Assessment department piloted a ten-week, semi-asynchronous online diary study with six undergraduate students. The project employed ethnographic and UX research methods to gather students’ insights in order to improve Libraries spaces. This poster presentation traces the evolution of the online diary study project and will provide attendees with key takeaways from implementation, the process of incorporating student leadership and analysis, and making improvements based on student responses. Additionally, the poster will highlight the skills and learning opportunities that were important outcomes of the project from the perspective of a current graduate student who engaged in the project as a facilitator and co-designer.

Design, methodology, or approach
This poster highlights how diary study methods can be employed to engage students and provide meaningful insight on students’ experiences in library spaces and how libraries can incorporate participant feedback and ideas into project design and implementation. Drawing from theories on participatory design, ethnographic methods and library assessment, the poster will offer attendees a case study on students’ reflections and perspectives that helped pinpoint and address key pain points in their library experiences. In addition, the poster will provide strategies for how to solicit meaningful feedback from students on the project itself and how the team incorporated the lessons learned from each session to improve the following one. This poster will be designed to engage attendees in discussions about meaningfully engaging undergraduate students in assessment work.

Findings
As a result of the various activities, we found that:

There won’t always be new data: feedback from students often reaffirms findings from other studies.

Different methodologies incentivized students to engage with the design and feedback process in creative ways: Students responded positively to prompts that allowed for more creativity and engagement in the design process (meme creation, library space design, etc.).

Relationship building is crucial: As the diary study progressed, students showed more comfort in sharing their thoughts and feedback. Students shared in a post-session reflection that over time, the environment created through the feedback sessions encouraged them to share their thoughts and feelings. Additionally, ongoing communication and coordination with Libraries staff facilitated immediate improvements students could see and experience.

We also found this to be a successful model for sustainably incorporating more complex and values-driven methodologies into existing assessment projects.

Practical implications or value
This poster contributes to current scholarship in assessment and user design by discussing how users’ voices can be incorporated into the project design and analysis stages of library assessment projects, as well as by sharing findings on how online, creative, and participatory assessment methods can engage students. We aim to create an opportunity for the community to discuss their own experiences incorporating student-centered design approaches in their assessment work.

Creation and Implementation of a Library-Wide Survey
Holt Zaugg (Lee Library BYU)
Poster
Show Abstract

Purpose and goals
There are several large-scale surveys that help libraries examine patrons’ experiences within their library and compare themselves with similar institutions (e.g., LibQUAL, Ithaka, MISO). While these surveys provide valuable input, they often miss the specifics of any one library’s operations (services, spaces, resources, patron/personnel interactions). This paper describes our efforts to develop and validate such a library-specific survey. It will also report results from the initial administration of one section of the survey.

Design, methodology, or approach
This effort used a survey of all library employees, followed by clarifying interviews, to determine what questions each employee would like to have answered by undergraduate students. Using this input, we created a pool of potential questions and categorized them by topic. Key library employees were asked to validate the questions. We also conducted a field test of the questions with library student employees to ascertain the questions’ validity. Both groups were also asked to identify any clarity or grammar issues with the questions.

Following these steps, all questions were submitted to library leadership with options for administering the survey. Library leadership opted to create four topic-centric sections of the full survey, with each section to be administered in a subsequent semester. Once these sections were approved, we began the university-level approvals, including approval from University Assessment and Planning for survey administration and sample size, scientific review, and IRB review.

Following the administration of the survey, and in addition to reporting descriptive statistics of the findings, we also computed reliability measures for the quantitative questions. This final validation step helped us ensure that the questions were sound and could be repeated in future survey administrations. The additional analysis also helped identify questions that could be improved. This process will be repeated for each subsequent administration of the survey.
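The abstract does not name the specific reliability measure used. Cronbach’s alpha is a common internal-consistency statistic for multi-item survey scales; a minimal sketch with fabricated Likert-style responses might look like:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each question
    total_var = items.sum(axis=1).var(ddof=1)   # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Fabricated responses (5 respondents x 4 questions on a 1-5 scale).
scores = np.array([
    [4, 5, 4, 4],
    [3, 3, 3, 4],
    [5, 5, 4, 5],
    [2, 2, 3, 2],
    [4, 4, 4, 4],
])
print(round(cronbach_alpha(scores), 3))
```

Values near or above 0.7 are conventionally read as acceptable internal consistency, though thresholds vary by context.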

Findings
The findings have two parts. First is the description of what we did and why. Second is the analysis of the data, which will take place between the submission of this proposal and the LAC conference.

Practical implications or value
The extra validity and reliability steps help ensure that the survey will provide data that library employees can use to describe their value to the university and to assist with decision-making and planning. The process will also be helpful as surveys specific to graduate students and faculty are developed and implemented.

They Came, They Saw, but Did They Actually Get Help?: Using Simple and Effective Mechanisms to Go Beyond Usage Data and Gather User Feedback to Assess the Value of Research Services
Morgan Stoddard (George Washington University)
Poster
Show Abstract

Purpose and goals
Almost all libraries gather usage data on research services. Libraries diligently track things like how many questions were asked at the reference desk and how many people attended a workshop. While this data provides insight into the demand for a service, it does not adequately capture the value of the service. Libraries recognize this limitation but face various impediments to gathering feedback on whether a service is actually helpful to users. For example, holding a focus group or launching an annual survey requires a potentially large investment of staff time and other resources. There are, however, some relatively easy ways to acquire qualitative feedback from users on library services. This poster will share how George Washington University Libraries and Academic Innovation (GWLAI) implemented a few simple, automated mechanisms to gather feedback from users on the value of its consultation services and virtual research support services.

Design, methodology, or approach
Feedback mechanisms were integrated into GWLAI’s system for scheduling research consultations, email reference service, and chat reference service. For the latter two services, a link to a short satisfaction survey was embedded and presented to users during the engagement. For research consultation services, GWLAI leveraged the automated follow-up message functionality in its scheduling system (Calendly) to send users a message with a link to the survey shortly after a consultation. The survey gathers the following information: (1) with whom did the users meet (if known), (2) did the users get the help they needed, (3) did the users feel welcome and respected, and (4) did users have any other comments. Responses are anonymous unless users choose to include their name. Feedback specific to a particular library staff member is shared with that person, and a report summarizing responses is shared with everyone involved in providing the services, their managers, and GWLAI senior leadership.

Findings
These mechanisms have enabled GWLAI to obtain meaningful data on the value of research support services beyond usage statistics. The response rates vary but are higher than anticipated. The feedback has been almost 100% positive. Most users choose to include written comments expressing deep appreciation for library staff members and their support. Because these mechanisms were relatively simple to implement and are now almost entirely automated, this work has a very high return on investment.

Practical implications or value
Poster session attendees will have the opportunity to learn how they might deploy similar mechanisms and strategies in their library and share their own successes in going beyond usage statistics to better understand the value they provide.

In addition to the poster itself, which will tell the story of this particular project using visually appealing and meaningful images and graphics, the session will also provide opportunities for attendees to learn and share from each other. Using a virtual space (e.g., Google Jamboard), attendees can share their ideas, questions, successes, and challenges related to finding simple and useful ways to get user feedback and other qualitative data on research services or other library services.

Improving a Library FAQ: Assessment and Reflection of the First Year’s Use
Vanessa Arce and Michelle Ehrenpreis (Lehman College, City University of New York)
Poster
Show Abstract

Purpose and goals
In 2020, the Leonard Lief Library used Springshare’s LibAnswers to create a searchable online knowledge base (FAQs) as a complement to virtual reference during the library’s pandemic-related closure. After the first year of use, user search queries recorded in the system were analyzed to assess the online knowledge base. This paper will discuss the assessment’s findings and detail planned improvements to the FAQs. The authors sought to answer the following research questions:

  1. What type of information are users seeking in the FAQs?
  2. Are users finding the information they seek?

Design, methodology, or approach
A content analysis of user queries from the online FAQs site during fall 2020 and spring 2021 was conducted to learn about the information users are seeking in the knowledge base. The deductive coding process was informed by classification models of reference interactions derived from the one developed by Katz (1997). A total of 106 queries were coded into ten categories.

The study also examined the actions users took after conducting a search on the FAQs page. Queries that resulted in clicking on a question were considered successful; those that did not were deemed unsuccessful. This information was used to determine the knowledge base’s success rate.
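The click-through rule above reduces the success rate to a simple proportion. A toy sketch, with a hypothetical query log and field names (the study’s actual data export format is not described):

```python
# Hypothetical sketch of the success-rate calculation: a query is
# "successful" if the user clicked one of the suggested FAQ questions.
# The log entries and field names below are invented for illustration.

queries = [
    {"text": "library hours",  "clicked_faq": True},
    {"text": "reset password", "clicked_faq": False},
    {"text": "borrow laptop",  "clicked_faq": True},
    {"text": "parking permit", "clicked_faq": False},
]

successful = sum(1 for q in queries if q["clicked_faq"])
success_rate = successful / len(queries)  # proportion of queries answered
```

Computed per semester, this yields figures directly comparable to the 47% (fall) and 51% (spring) rates reported below.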

Findings
During its first academic year of implementation, the knowledge base was successful in answering user questions roughly half the time. Fall semester data showed a 47% success rate, with a slight improvement in the spring semester (51%). The top three query categories were access, non-library, and instructional. The frequency of access-related queries was to be expected, given that the library building remained closed during the 2020–2021 academic year due to the COVID-19 pandemic. The prevalence of questions related to other campus units, such as admissions, information technology, and human resources (the non-library category), was unexpected. This finding suggests there is a perception of the academic library as a source of campus information and supports including this type of information in the library FAQs. It also presents the library with an opportunity for outreach to other departments, which can foster collaboration and assist with consistent information and messaging to the benefit of the campus community.

Practical implications or value
This study adds to the body of assessment research within reference services, an area not well represented in the LIS literature (Allen et al., 2018). The methodology employed provides a model for assessment of online FAQs that can be easily adopted by other libraries. It also sheds light on the types of information academic library users expect the library to provide. Further research building on this study’s findings includes usability studies of online FAQs and analysis of online or in-person reference questions to determine how often academic library users contact the library for information related to other campus departments.

References

Allen, E. J., Weber, R. K., & Howerton, W. (2018). Library assessment research: A content comparison from three American library journals. Publications, 6(1), 12. https://doi.org/10.3390/publications6010012.

Katz, W. A. (1997). Introduction to reference services. McGraw-Hill.

3:30 p.m.–5:00 p.m.: Concurrent Sessions

Diversity, Equity, and Inclusion

An Equity Audit for DEI Data in an Academic Library
Ashley Lierman (Rowan University)
Paper
Show Abstract

Purpose and goals
When our university libraries’ DEI Committee first formed in late 2019, an initial self-assessment revealed a need for data regarding barriers to inclusion and equity arising from our libraries’ policies, services, and resources. As a framework for approaching this investigation, we decided to adapt a tool more commonly employed in K-12 schooling: an equity audit. While school equity audits tend to focus on such areas as equity of teacher quality, programs, and student achievement (Skrla et al., 2009), we adapted our audit to investigate areas of library service that would be more relevant for our organization: virtual collections and spaces, physical collections and spaces, and interactions with library personnel.

Design, methodology, or approach
Following the principles of Green’s (2017) Freirean-based model of community equity audits, we sought means of collecting data that would elicit the genuine concerns and needs of our university community. Our data was collected using a three-stage mixed-methods approach: 1) a short, primarily quantitative survey focused on online collections and services, as those were the only resources available at the time due to the COVID-19 pandemic; 2) a second, similar survey including all collections, policies, and services, after on-campus operations had resumed; and 3) a set of in-depth interviews with community members for qualitative data on their experiences.

In each case, our strategy for these surveys and interviews has been to avoid potentially loaded questioning about specific incidents of discrimination (which many community members may be understandably reluctant to name) and instead to inquire generally about positive and negative experiences of the library’s collections and services, collect demographic information from participants, and cross-analyze the results to reveal any patterns of inequity in users’ experiences by demographic group.

Findings
While our first survey had a low response rate, its results already tentatively suggest patterns of inequity in users’ experiences of the libraries, with less positive experiences reported in particular by Black and/or African-American users and by users with disabilities. Because the second survey drew far more responses, particularly from marginalized community members, we anticipate finding more definitive patterns in the quantitative results and specific issues to address in the qualitative results as we analyze it alongside the interviews.

Practical implications or value
Our experiences can help other institutions use similar methods to investigate the status of equity and inclusion in their libraries, and what areas most need improvement. Equity audits have been little discussed outside of K-12 education, so this study will provide a valuable example of how this strategy can be used as a library assessment technique.

Assessing the Needs of Users with Disabilities in Pursuit of More Accessible, Inclusive Libraries
Emily Daly (Duke University Libraries)
Paper
Show Abstract

Purpose and goals
In 2021 and 2022, Duke University Libraries formed a cross-departmental research team of library staff to explore two primary questions: To what extent are the Libraries supportive of disabled users, caregivers, and allies? How might library staff make library spaces, web interfaces, collections, and services more supportive for these users? To answer these questions, we designed a multi-faceted study that included a literature review and environmental scan, informational interviews, a user survey, and follow-up user interviews. This is the third mixed-methods user study that Duke Libraries staff have led to learn more about the experiences and needs of marginalized or underrepresented students.

This paper will summarize the research team’s methodology, focusing on ways we engaged with users with disabilities. We will briefly describe our findings and highlight ways that library and campus stakeholders might implement the team’s recommendations to make libraries more inclusive and welcoming for individuals with disabilities, as well as caregivers and allies.

Design, methodology, or approach
This study began with a literature review and informational interviews with campus stakeholders to better understand current support for the target population and any prior research conducted. The research team then distributed a user survey and conducted four follow-up interviews. Team members used affinity mapping to analyze the interview transcripts and develop themes based on the data. The team then used findings to develop recommendations for improving library spaces, services, web interfaces and collections.

Findings
Preliminary findings indicate that while the university may outwardly welcome people with disabilities, there are many barriers that prevent people with disabilities from having their needs met. The library is often seen as more inclusive than campus as a whole, but there are clear pain points, like transportation difficulties and inaccessible electronic materials. Awareness of existing services is also low. Preliminary recommendations include updating library webpages on services available to users with disabilities, increasing support for individual private study spaces, and providing sensory-friendly spaces.

Practical implications or value
Findings and recommendations will be incorporated into a report to be shared with library staff, campus stakeholders, higher education communities interested in providing support for students with disabilities, and potential donors interested in funding relevant library services. This mixed-methods study also serves as a model for other libraries who wish to use research methods such as surveys or interviews to learn more about users from marginalized groups. Finally, this project highlights ways to collaborate with students and staff without formal assessment training at every stage of the research process, from literature review to recruitment to interviews to analysis.

Black/African American Perceptions of Inclusion and Engagement: Reassessing Campus Data to Reflect Current and Coming Realities
Jung Mi Scoulas (University of Illinois Chicago)
Paper
Show Abstract

Purpose and goals
The goal of this paper presentation is to re-examine a campus survey conducted during the COVID-19 pandemic, which assessed the campus climate experienced by university students in order to better understand how Black/African American undergraduates engaged with university life and their perceptions of inclusion compared with other racial and ethnic groups. While there have been efforts to understand the needs and challenges of university students using various assessments at the university, college, and unit levels, little has focused on a deeper understanding of the Black/African American college experience.

University of Illinois Chicago (UIC) is committed to achieving racial equity at all levels. As part of a coordinated campus-wide initiative, the University Library recently completed an Achieving Racial Equity (ARE) plan for 2022. This paper presentation draws on broader campus data to develop support and resources to improve Black/African American perceptions of engagement and inclusion.

Design, methodology, or approach
This project reuses a survey conducted in spring 2021 by the Office of Institutional Research (OIR) in collaboration with the Office of Diversity and the Office of the Vice Provost for Academic and Enrollment Services. The survey focused on undergraduate perceptions of overall university life, academic engagement, and interactions with campus resources and services. The research team received de-identified data from OIR and analyzed it for patterns and gaps, comparing student experiences across racial and ethnic groups.

Findings
A total of 912 undergraduates participated in the survey. Among them, 32% were Hispanic, followed by White (26%) and Asian (24%); 7% were international students, and an additional 7% self-identified as Black/African American. Results showed that Black/African American students had the highest average scores when rating the quality of interactions with other students, academic advisors, and faculty. They also recorded the second-highest average scores for quality of interactions with student service staff and other administrative offices.

Black/African American students were the most likely to seek academic support and other resources and indicated the highest level of comfort speaking with academic advisors. They also had the second-highest perception of receiving sufficient academic support and were the most likely to feel they had a positive family/school balance.

However, they also had the lowest perception of fitting in and the largest percentage reporting dissatisfaction with emotional support.

Practical implications or value
The findings may guide the University and the UIC Library in strengthening current programs, resources, and services and making adjustments as needed; however, specific questions about students’ perceptions of the Library were not included. Overall, members of UIC’s Black/African American community have expressed their experiences with racism, systemic bias, and exclusion in many forms. As part of the University’s commitment to a more welcoming and inclusive campus environment, the Library plans to conduct a climate survey focused on Black/African American students to gauge their experiences within the libraries and determine what additional services can be provided to support their needs.

Their Challenges are Our Challenges Too: Librarian Involvement in the Grand Challenges in Assessment Project
Megan Oakleaf (Syracuse University)
Poster
Show Abstract

Purpose and goals
The Grand Challenges in Assessment Project (https://assessment.charlotte.edu/excellence-assessment/grand-challenges-assessment-project) is a national, collaborative effort to create strategic plans addressing pressing challenges facing assessment in higher education. It is endorsed by key higher education assessment organizations, including the Association for the Assessment of Learning in Higher Education (AALHE), the American Association of Colleges and Universities (AAC&U), and the National Institute for Learning Outcomes Assessment (NILOA), as well as the Assessment Institute. Assessment practitioners and community members across higher education participated in identifying three grand challenges facing assessment in academia: (1) using assessment findings to increase equity, (2) using assessment findings to direct immediate pedagogical improvements, and (3) producing visible and actionable assessment findings that drive innovation and improvement. These goals are shared by library assessment professionals, and a small number of librarians serve on the project committees. This poster seeks to share the goals, work, and outcomes of the Grand Challenges project with the assessment community and to highlight the contributions librarians are making to this undertaking, as well as the broader benefits library assessment practitioners may gain from it.

Design, methodology, or approach
This poster will focus on the strategic plan and structure of the Grand Challenges project, the role of librarian participants, and the project’s intended outputs and outcomes that can augment library assessment efforts focused on equity and improvement.

Librarian members of the Grand Challenges project have contributed in multiple ways. For example, some librarians have worked to identify existing equity-focused assessment practices, opportunities, and structures in order to benchmark or document them. Future contributions in this vein include promoting equitable assessment practices nationally, creating a database of professional development resources, developing strategies to empower students in higher education to assess their learning, and identifying inequities in existing national surveys and datasets. Other librarian contributions focus on improving the quality of data, identifying professional development and resource needs related to data visualization and storytelling, and identifying and/or creating best practices and professional development opportunities for how assessment professionals can carry out this work so that stakeholders can focus on implementation strategies rather than understanding the data.

Findings
Thus far, the project has led to the identification of numerous equitable assessment practices, related professional development needs, resources, and existing equity-increasing structures. Librarians participating in this project are also positioning themselves to contribute to wider conversations at their institutions about all these challenges, and equity-focused assessment in particular.

Practical implications or value
By identifying and documenting these existing practices, resources, and structures, the project will support higher education assessment practitioners. Librarian involvement in national assessment projects that seek to improve assessment approaches, particularly those focused on equity and learning improvement, can serve as connective tissue among higher education assessment practitioners and infuse practices employed by general assessment practitioners into the library assessment community.

Inter-Institutional Mobile Hotspot Lending Program
Rebecca Croxton (University of North Carolina at Charlotte)
Poster
Show Abstract

Purpose and goals
In 2021, seven North Carolina universities were awarded a State Library of North Carolina LSTA EZ Grant to develop an inter-institutional mobile hotspot lending program. Grant funding has enabled partners to purchase 180 mobile hotspots, cover monthly service fees, and loan these devices to students, faculty, and staff dispersed throughout the state who do not have reliable internet access at home and require this connectivity to participate in courses, complete coursework, or fulfill work obligations.

This presentation will share how this program is helping to alleviate connectivity barriers for individuals across North Carolina and provide recommendations for (1) managing lending policies and guidelines, (2) balancing supply and demand needs across institutions, (3) managing pooled resources and service contracts, and (4) assessment strategies.

Design, methodology, or approach
The primary outcome for this grant is the improved ability for students, faculty, and staff across the state of North Carolina to participate in online courses, complete course assignments, or fulfill work obligations. Both circulation and Verizon usage data are serving as indicators that the project is advancing towards its goal of reducing or eliminating a barrier, at least for some individuals, in achieving academic and/or work success. User follow-up surveys, particularly the qualitative responses, are helping to provide rich, contextual information that will help partners understand whether/how the devices are helping to fulfill an unmet need by reducing or eliminating the internet connectivity barrier.

The secondary goal of the project, to develop a sustainable model for a shared mobile hotspots lending program, will be measured by the completion of a handbook that outlines recommendations for how other collaborative entities can launch a similar inter-institutional lending program.

Findings
Preliminary assessment findings from check-out and Verizon usage data indicate increasing use of the hotspots since they were first made available and marketed to campus populations in fall 2021/spring 2022. Comments from the user feedback survey indicate that the hotspots are fulfilling a vital need for many students. Survey respondents noted that without the hotspot, they would need to “use their cell phone data” or go to “coffee shops / libraries / parking lots” to complete their work. Others indicated that without the hotspots, they would be “unable to complete required tasks” or would have to go to the library “late at night and do not feel safe walking back [home].”

Practical implications or value
The inter-institutional lending model and assessment strategies developed through this project can be readily adopted by other libraries and partnerships to help alleviate barriers to success and level the playing field for students, faculty, and staff.

Find the Gap: The Bibliography as a Starting Point to Assess the Diversity of A Library Collection
Eva Jurczyk (University of Toronto Libraries)
Poster
Show Abstract

Purpose and goals
The University of Toronto Libraries holds Canada’s most comprehensive research collection. This collection serves each broad discipline as either the raw material for research or the overarching structure that places its scholarly production in context.

Within the humanities, the raw materials of research that are available to scholars are often limited to the traditional Western literary canon. This creates a tension for scholars when their research interests fall outside that traditional canonical scope.

The goals of the project were to assess how well the library’s collection had been built to meet the needs of scholars in the area of literature by Black Canadians and, more specifically, to identify the “gaps”: the authors and publishers that had been neglected over the years and could now be added to the library collection. The project sought to expand the availability of the creative, political, documentary, and scholarly writings of Black Canadians that have been insufficiently studied due to the difficulties scholars have encountered in accessing the works in question.

Design, methodology, or approach
While new methods have emerged in recent years, the traditional method of assessing the diversity of a collection is to compare that collection with a standard bibliography. The project team used the work of a renowned scholar of Black Canadian literature, Odysseys Home: Mapping African-Canadian Literature by George Elliott Clarke, as the basis for assessing the library collection. The bibliography in this seminal text was OCR’d and ingested into Zotero, and the project team then compared it with the library’s holdings, applying tags depending on whether each work was available in print, electronically, or not at all.

Findings
The conversion of the bibliographic information into tabular format, and the application of the tags, allowed the project team to “find the gaps”: to identify the authors, publishers, or even geographical areas where the library’s holdings were weak and to set priorities for acquisitions.

Practical implications or value
Considering the diversity of a library collection must be done in partnership with the library’s users, but at the same time, diversity work should not put additional labor on members of our library user community. Relying on the scholarship created by our faculty allows these scholars to point to the voices that are missing from the library collection and the use of more modern tools and approaches ensures that the gaps, once identified, are filled.

Organizational Issues/Assessment

Supporting Development Initiatives through an Investigation of Stakeholder Insight
Steve Borrelli (Penn State University Libraries)
Paper
Show Abstract

Purpose and goals
In fall 2021, Penn State Library Assessment was asked to conduct a study of one of the Library Development Office’s core initiatives: Donor Community Meetings. While these meetings were aimed at connecting donors with librarians and building community around common areas of interest, donor engagement was proving difficult, and the sustainability of the initiative was in question. Guided by principles from the appreciative inquiry approach to organizational improvement, a strengths-focused and generative process that engages internal and external stakeholders, Library Assessment sought to identify concrete, stakeholder-informed ways to improve the meetings. This investigation resulted in a series of recommendations to improve engagement and sustainability, which catalyzed a taskforce of Alumni Advisory Board members and Development Office and University Libraries personnel that prioritized a list of results-oriented actions to advance the initiative.

Design, methodology, or approach
Six 90-minute virtual focus groups were held over Zoom with donors, prospective donors, Development Office staff, and librarians. In all, 30 individuals participated in the study. Participants were asked a series of up to 10 open-ended questions, depending on stakeholder group. The sessions were recorded and transcribed by the research team, and QSR International’s NVivo software was used for coding and analysis. By framing the analysis of this multi-stakeholder study through an appreciative inquiry lens, the research team sought to center the strengths of the community meeting initiative and the opportunities that existed for improvement.

Findings
The analysis revealed numerous opportunities for improvement related to the meeting structure, communications, participant interactions, and other areas. Recommendations from the study were “workshopped” in spring 2022 by the taskforce. In a series of meetings, this group further refined the recommendations and developed a plan of action to re-brand and re-package the initiative, while holding intact aspects of the meetings that the stakeholders valued.

Practical implications or value
The immediate practical implications of this project are that the Development Office was able to use the results of this study to refine and improve one of their core initiatives. Beyond this, the project had value in that it was a participatory and collaborative exercise that engaged donors (stakeholders) throughout the process, such that they were willing to share their time and expertise in working with the recommendations. This may lead to stronger connections between the Libraries and its donor base.

As public funding for higher education (and by extension, academic libraries) shrinks, fundraising plays an increasingly prominent role. Collaborations between a library’s assessment unit and its development office could potentially strengthen an organization’s donor-facing programs and fundraising initiatives.

Responding to Faculty Concerns: An Approach to Salary Equity Analysis and Course Correction
Leigh Tinik (Penn State University Libraries)
Paper
Show Abstract

Purpose and goals
In response to perceived salary inequities expressed at a Library Faculty Organization meeting, the Dean of Libraries charged a taskforce composed of personnel from Human Resources, Finance, Library Assessment, and Senior Administration to investigate. This paper describes a methodology for identifying and addressing salary inequities, including the development of a process to analyze salaries for market equity and compression and a commitment to corrective action and periodic review to minimize the potential for salary inequities.

Design, methodology, or approach
Cases were systematically identified for an equity review using both market and compression analyses. To identify market inequities, a peer comparison group drawn from the Big Ten Academic Alliance (BTAA) was compiled using 2019 Association of Research Libraries (ARL) salary survey data. Cases falling below the BTAA 25th percentile by rank were flagged for review. To identify inequities due to compression based on rank and years of experience, a regression analysis was conducted. Cases where the actual salary fell 1.5 times below the predicted salary were flagged for review. The cost of fixing salary inequities was estimated, and a request for funding was submitted to the provost. Throughout this process, the taskforce sought to minimize subjectivity in selecting and reviewing cases to maintain transparency and trust among faculty.

Findings
A first round of analysis identified nearly a third of all cases as potential salary inequities requiring confidential review; an additional handful of cases were flagged for review at the request of the Administration. A corrective plan of action was developed by a senior finance administrator and used to begin a multi-year corrective strategy. Additionally, the resulting dataset has informed the process of setting initial salary offers, reducing calculation time by nearly 75%, and has raised critical questions that inform salary offers and minimize the potential for salary inequities.

Practical implications or value
Academic libraries are increasingly engaging in critical reviews of historical practice to minimize discrimination in operations. This paper provides a methodology to consider when evaluating salary equity, one which minimizes subjectivity. It illustrates the value of data in equipping Human Resources personnel to calculate initial salary offers, improving upon traditional practices. It spotlights both a commitment to course correction and an approach to consider for multi-year corrective courses of action.

Assessing Student Employment in Libraries for Critical Thinking & Career Readiness
Rick Stoddart (Michigan State University Libraries)
Paper
Show Abstract

Purpose and goals
Academic libraries typically employ a large percentage of student employees at colleges and universities. Students employed at academic libraries can benefit greatly from high-impact practices that contribute to their academic retention, training as scholars, and future employability after graduation. This paper will report on an assessment study to explore the connections between skills acquisition, career competencies training, and high-impact practices in student employment in one academic library. The study seeks to understand how well current library employment practices are preparing student employees for the skills and competencies most valued by employers, as measured by the National Association of Colleges and Employers (NACE). The proposed paper isolates the top employer-ranked competency (critical thinking) and interrogates the student employment experience for engagement with critical thinking during library employment and training, as well as students' perceived importance of critical thinking to their academic majors and future careers.

Design, methodology, or approach
The study compares both quantitative and qualitative data from a survey of library student employees (N=30) and student employee supervisors (N=3) to data collected from NACE and other employer surveys. The research analyzes the connection between the competencies sought by library supervisors and employers and the training and skills acquired by students while employed in the library.

Findings
While critical thinking is highly valued by both students and prospective employers, the student and supervisor experiences indicate that training in and application of critical thinking during library employment are low. The paper provides constructive suggestions on how academic libraries might effectively leverage career readiness competencies with their student employees on the job and how assessment data might be collected to gauge improvement in these areas.

Practical implications or value
This paper will provide insight into the student employment experience in libraries and how to leverage it to demonstrate the impact of student employment, as well as possible areas on which to focus student and supervisor professional development. Libraries invest significant resources in student employees, and this paper will offer some pathways to enhance the return on that investment.

Building Collective Capacity for Assessment and Advocacy: A Model for Academic Library Consortia
Anne Craig (Consortium of Academic & Research Libraries in Illinois)
Paper
Show Abstract

Purpose and goals
While a growing body of evidence supports the assertion that academic libraries positively impact student success, libraries, individually and collectively, must make the argument to their higher education stakeholders in ways that are meaningful. Doing assessment is itself challenging; translating assessment data into advocacy strategy is a further challenge. CARLI Counts: Analytics and Advocacy for Service Development is a continuing education library leadership immersion program that took on this challenge. Funded in part by a four-year, $243,885 Laura Bush 21st Century Librarian Grant from the Institute of Museum and Library Services, CARLI Counts prepares librarians to make effective use of research findings on the impact of academic libraries on student success for service development and library advocacy. In three consecutive program cohorts, CARLI Counts participants have learned how to use local library data analytics to improve their services, demonstrate library value, and build their confidence in the ability to do so. Two program cohorts consisted of teams in which each individual worked on an issue or topic specific to their local campus; a third program cohort had teams undertaking a collaborative project focused on a specific topic (e.g., OER, online tutorials, library space). This paper compares these two approaches, assesses their relative efficacy, and provides insights into the potential contributions of academic library consortia in fostering collective impact.

Design, methodology, or approach
CARLI Counts is being evaluated using multiple methods, quantitative and qualitative, including feedback from participants, evaluation of program materials, and performance outcomes from participants. Methods include interviews, document analysis, and surveys. With each of the three cohorts, the evaluations have taken place before, during, and after the year-long experience. Additionally, the concluding year of the project includes participant surveys of the previous two cohorts to determine the impact of the program on the participants since the completion of their cohort in the program. The evaluation addresses the participants’ perceptions and uses of evidence-based library practices, their leadership role around evidence-based practices at their library and on their campus, their level of engagement with CARLI, and their perspectives on the roles of an academic consortium in advancing cross-institutional, collaborative assessment initiatives.

Findings
Program evaluation data indicates that CARLI Counts is achieving its twin program outcomes of improving the ability of librarians to investigate and communicate the impact of academic libraries on student learning and success, and of growing the confidence that librarians have in their ability to do so. The evaluation results also document the benefits and challenges associated with multi-institutional team approaches to demonstrating library impact on student success.

Practical implications or value
CARLI Counts demonstrates that library consortia are well positioned to serve as a center for professional development on academic library impact on student learning and success. CARLI Counts will also be releasing an open version of the training curriculum that can be utilized by other groups. CARLI Counts personnel will make themselves available to consult with other consortia or libraries that may wish to adopt and/or adapt this model.

Road to R2: Assessing Library Needs Related to an Expansion of University Research Capacity
Cathy Meals (University of the District of Columbia)
Poster
Show Abstract

Purpose and goals
The University of the District of Columbia, the country’s only urban, public, land-grant university and an HBCU, has recently added doctoral programs and is seeking to expand its capacity and public profile in research. The university’s research office has developed a proposal called “Road to R2,” which outlines a vision for this expansion and the attainment of the R2 (Doctoral Universities – High research activity) Carnegie Classification. As part of this proposal, the UDC Library is conducting an assessment project that seeks to compare its current levels of service, offerings, and expenditures to those of R2 peer and aspirational-peer institutions, with the goal of identifying the additional resources and capacity that the library will need to appropriately serve a university community that is producing and publishing more research.

Design, methodology, or approach
For this assessment project, the library is reviewing several sources of data:

Library Benchmark (ACRL Trends & Statistics) data for 2018–2020 for peer and aspirational peer institutions, to identify possible areas of library service that will require additional investment in order to meet expanded research and publishing needs.

Qualitative data about existing research services at R2 peer and aspirational peer institutions, to include research- and publication-oriented staffing, presence and size of university digital institutional repositories, and special research-oriented library offerings (e.g., workshops, research data management).

Qualitative assessment of unmet faculty and student research and publication needs, obtained through interviews and/or focus groups.

UDC’s institution-level survey data results from a multi-institution survey on faculty knowledge of predatory publishing practices, led by the University of Northern Colorado.

Findings
The project has just begun. However, based on existing knowledge and cursory examinations of data, we anticipate that the project will at least identify the need to offer librarian support around scholarly communications, an expansion of print and digital collections, and the establishment and maintenance of an institutional repository. We hope that the analysis will help illuminate the depth of need in particular areas and help prioritize the most needed services.

Practical implications or value
The findings of this assessment will contribute to a proposal for library service and budget expansion to be presented to university leadership; that is, they are anticipated to help the library make the case for additional resources, both through comparisons to other institutions and through reporting on the internal needs assessment. It will provide an example of assessment as advocacy and of a project that draws on diverse data sources.

Re-accreditation as an Opportunity for Improving Library Assessment: A Case Study
Becca Brody (Westfield State University)
Poster
Show Abstract

Purpose and goals
Westfield State University’s 2023 re-accreditation self-study generated several questions for Ely Library: how can the self-study inform the assessment efforts of library staff? How can the library utilize data gathered by the New England Commission of Higher Education (NECHE), the Integrated Postsecondary Education Data System (IPEDS), and other reporting bodies to develop an ongoing culture of planning and assessment? How does the data collected for these reporting bodies connect to the larger goals and outcomes of the library and the university? This poster outlines the work of the Director of the Library and the Head of Library Collections and Content to understand current data collection and reporting practices and determine the best way to develop an assessment model that supports re-accreditation while allowing library staff to measure progress made in achieving their goals.

Design, methodology, or approach
The Head of Library Collections and Content used NECHE self-study guidelines and data collection templates, a literature review, and data provided to internal and external stakeholders to re-envision existing data collection practices. The Director of the Library, co-chair of the accreditation steering committee, focused efforts on developing an annual reporting template that would be flexible, include progress on measurable outcomes, and tie into larger organizational goals and strategies.

Findings
A review of the NECHE standards identified requested metrics that were not routinely collected by the library. There is no documentation coordinating all data collection. This review prompted the Head of Library Collections and Content to draft an updated data inventory to support ongoing collection of metrics that are of value to multiple constituencies as well as to document data collection procedures. The Library Director’s internal and external scan of reporting procedures identified both challenges and opportunities for tying data collection to library, university, and broader system goals and objectives.

Practical implications or value
Academic libraries participating in the re-accreditation process can approach it as an opportunity for reviewing data collection practices, reviewing or establishing annual reporting procedures, and ensuring that the data collection process is meaningful. Mapping library assessment data to stated outcomes, and then providing a way to routinely assess progress, can improve library services and collections while making data collection more efficient.

It’s Time for an Update: Analysis of Assessment Skills in Library Job Postings
Emma Brown (Syracuse University)
Poster
Show Abstract

Purpose and goals
Although the importance of assessment is acknowledged by librarians, its absence is often keenly felt. Due to lack of staff, training, or funding, assessment often falls by the wayside. This poster explores assessment trends within librarianship by analyzing library job postings for assessment skills. Using visualizations, the poster reveals the frequency of assessment skills by library type, job category, and geographic location to identify emerging trends and patterns.

Design, methodology, or approach
This research project originated with the desire to determine whether assessment skills are mentioned more frequently in library job postings in comparison to previous examinations such as “Recruiting for Results: Assessment Skills and the Academic Job Market” by Megan Oakleaf and Scott Walter and “Core Competencies for Assessment in Libraries: A Review and Analysis of Job Postings” by Sarah Passoneau and Susan Erikson, both completed prior to 2014. The current study differs from Oakleaf and Walter’s examination, which focused on academic libraries and did not include visualizations. Like Passoneau and Erikson’s examination, this poster will focus on all library types and include a visual component. The current study uses the ALA JobList ad archive as a data source and focuses on post-pandemic realities by limiting the selection to job postings from January 2020 to January 2021. Finally, the resulting visualizations will provide a more detailed understanding of results from previous examinations. Each job posting was examined to determine whether the job description included assessment skills and, if so, which areas of assessment the skills described fall into (instructional, user services, programming, collection development, and/or general). Once data was collected, RStudio was used to analyze and visualize the results.
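The core tallying step — counting, per job category, how many postings mention assessment skills — can be approximated with a minimal sketch. The study used RStudio; the Python version below is only illustrative, and the postings, categories, and keyword list are invented, not drawn from the ALA JobList data.

```python
from collections import Counter

# Hypothetical, simplified postings -- the actual study coded the ALA JobList archive.
postings = [
    {"category": "Instruction", "description": "Assess information literacy outcomes"},
    {"category": "Instruction", "description": "Teach undergraduate research skills"},
    {"category": "Cataloging", "description": "Create and maintain MARC records"},
    {"category": "Digital Projects", "description": "Lead assessment of digital initiatives"},
]

# Keyword matching is an assumed stand-in for the study's manual coding scheme.
KEYWORDS = ("assess", "evaluation")

# Count postings per category that mention an assessment-related term.
mentions = Counter(
    p["category"]
    for p in postings
    if any(k in p["description"].lower() for k in KEYWORDS)
)
totals = Counter(p["category"] for p in postings)

for category in totals:
    print(f"{category}: {mentions[category]}/{totals[category]} postings mention assessment")
```

These per-category ratios are the raw material for the kind of frequency visualizations the poster describes; a real pipeline would also normalize category labels and code each match into the study's five assessment areas.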

Findings
Although the project is still in the data collection phase, analysis of the first 182 job postings reveals that positions related to information literacy and instruction mention assessment the most, followed by digital projects and initiatives and IT/systems/web. In comparison, collection development, cataloging/metadata, and data services all show a low presence of assessment skills. Positions related to administration/management, knowledge organization, outreach and public services, subject specialist/liaison, special collections, technical services, and youth services have, as yet, none. Overall, assessment seems to be mentioned infrequently in job postings.

Practical implications or value
By failing to include assessment skills within job descriptions, the library field as a whole is devaluing assessment. As many LIS students or early career professionals use job postings as a tool to drive their education and professional development, this absence may cause them to underestimate the role of assessment in librarianship and deemphasize assessment skill acquisition in their learning. By including assessment within job postings, the library community can encourage librarians to develop assessment skills. The hope is that this project will encourage the library community as a whole to realize the importance of including assessment skills within job postings.

Challenge Talks

A “Culture of Assessment”? Or an “Assessment-Minded Culture?”
Lindsay Ozburn (Utah State University)
Show Abstract

Key Project Details
In 2018, I started in a brand new Library Assessment Coordinator position at a mid-sized land-grant institution. The position emerged based on loosely identified assessment needs and followed the hiring trends at the time. Primarily, library staff and administration wanted this position to define and build a “culture of assessment.” After several dozen conversations, I identified a handful of projects that would, in theory, meet their expectations of cultivating a “culture of assessment.” I initiated and implemented several, with very mixed results.

Central Issue/Problem/Challenge

  1. Not having established “assessment” practices or a “culture of assessment” was, from the get-go, used to rationalize why projects and initiatives across the Library had been failing. Many expected assessment practices to be the sole solution to those issues rather than a tool that could lead to better outcomes. Their preconceived notions led to unreasonable expectations.
  2. Library staff could not agree on several key details about cyclical assessment practices that could cultivate a “culture of assessment.”
  3. Practicing assessment was always the least important task on everyone’s work plan.
  4. Buy-in for assessment initiatives was already low prior to the position being formed. Many staff members were intimidated by the term “assessment,” assuming it meant only quantitative data would be examined, and felt their jobs were threatened.

Lessons/Ideas/Questions
All the complications, unfinished initiatives, and lack of ownership over key tasks were not truly the heart of the issue. While the library asked for firmly established assessment practices, guidelines, and processes (formally named a “culture of assessment”), they could not accommodate a “culture of assessment” flipping their workflows on their heads. What they really needed were tools to reframe their already existing practices to be assessment-minded. Organically grown, assessment-minded tools were more effective and used more frequently (e.g., rather than a complex assessment plan template, a more general goal planning form was more helpful).

Complications

  1. Because there were preconceived notions about what a “culture of assessment” would solve, any initiative that didn’t result in perfection was heavily scrutinized. This is, in many ways, the antithesis of what assessment stands for: a cyclical practice meant to improve and change over time.
  2. A lack of agreement on several key details about assessment practices resulted in delayed implementation and/or not all units adopting practices in the same way.
  3. Staff not seeing assessment as important meant that initiatives were frequently not done, citing “time constraints.”
  4. A lack of buy-in translated overall to a lack of enthusiasm. Assessment was a chore rather than a tool that could help show impact.

Making a Raft from a Sinking Ship: How to Put It Together When It’s All Falling Apart
Manda Sexton (Kennesaw State University)
Show Abstract

Key Project Details
How can we make data-driven decisions without the data?

In the fall of 2020, the Kennesaw State University Library System began the undertaking of choosing the best fit for vendors to host our large-scale survey, a part of our three-year assessment cycle. Diverting from our typical use of the LibQUAL+ vendor, we decided to partner with the MISO survey team due to their adaptability and use of stringent population sampling.

Despite hiccups with the Institutional Review Board and approvals from IT regarding outside vendors, the process continued for a full year, until it came time to gather the student emails for population sampling. We were denied.

Central Issue/Problem/Challenge
A new policy for student data protection prohibited the use of student emails for surveys by any university entity not coming from the president’s office. In the days before the MISO survey was to debut, I made a final plea before the Data Governance Committee to allow us to use the student emails for this last assessment. The appeal was denied.

The committee feared that if the library was permitted to run the MISO survey, other organizations would use our case as an argument to initiate their own surveys. A year of work and money was now gone, yet the three-year assessment cycle needed to continue.

How do we keep our three-year assessment cycle with no prior planning, no budget, no plan, and no survey?

Complications
After the plans had been approved by library administration and money had been spent on MISO, we were surprised to discover that the university’s Data Governance Committee had objections to the MISO plan based not only on student privacy issues but also on the danger of setting precedents for other university organizations. The MISO survey was subsequently denied just days before its debut.

Several obstacles emerged during this assessment endeavor. Our team was forced to create a new survey from scratch and market it on campus with no student emails and no budget, while still demonstrating trends from previous survey years. And we needed to do it fast.

Lessons/Ideas/Questions
This experience was a lesson in acceptance, resilience, and pursuing goals despite the upheaval of our best-laid plans. One team member, who assisted with this project for her internship, claims that she learned more valuable lessons about summoning grit and maintaining professionalism during difficulties than she would have if everything had gone perfectly.

We learned that communicating with the right people in the university, by verifying upwards, is more important than communicating with those you believe are responsible for certain decisions.

Most importantly, we learned that we could pull it together when necessary and prioritize what’s truly important to stay afloat.

The Long and Winding Road of an Assessment Committee
Karen R. Harker (University of North Texas)
Show Abstract


Key Project Details
Develop a library-wide committee with focus on assessment.

  • Members from many library divisions representing different perspectives.
  • Started in 2011.
  • Original purpose was to build a culture of assessment.
  • Most efforts were on training on methods and skills.
  • Centered largely on The Value of Academic Libraries report.

Evolved over time, becoming more project-centric.

  • Ran several feedback surveys
  • Participated in university-wide goals
  • Developed a library-wide data inventory

Revived in 2021–2022

  • Sunset review
  • Recognized our accomplishments
  • Accepted our challenges
  • Revised our mandate—Become Leaders of Library Assessment

Central Issue/Problem/Challenge
The members of the committee felt frustrated by the lack of progress, as well as a sense of apathy and disinterest, most likely due to growing responsibilities and time commitments.

Complications
Feedback surveys had diminishing returns

Lost focus, energy and drive

  • Unable to hire a full-time assessment librarian

COVID brought work to a standstill

Lessons/Ideas/Questions
The committee completed a sunset review, mandated for all committees, that involved a review of our successes and difficulties. With several new members, we have revised our mission to Become Leaders of Library Assessment, and are developing a strategic plan on how to succeed at this. We hope to learn from others who have had similar experiences, particularly regarding sustaining interest and drive in a time of many commitments demanding attention and time.

The Diverse Library Spaces Survey: A Tale of Unfocused Responses, Low Engagement and Naysayers
Jena Styka Payne (Case Western Reserve University)
Show Abstract

Key Project Details
Case Western Reserve University Libraries launched a Diversity Working Group in Spring 2021 to identify areas for growth and improvement in the areas of diversity, equity and inclusion. One of the areas of focus for the working group was physical spaces for all five libraries. We were specifically tasked to “create a survey to assess spaces and identify structures that work against CWRU Libraries’ anti-racism, equity, inclusion, and diversity efforts.”

During the 2021–2022 Academic Year, a small subcommittee developed and launched a survey, consisting of matrix style and open-ended questions. We hoped to identify student-driven changes in the libraries’ physical spaces that would support diversity and inclusion.

Central Issue/Problem/Challenge
Our hope was that students would be excited about finding ways to make library spaces more inclusive, and that they would take this time to share creative and thoughtful changes they wanted to see in the library. The major issue at hand was that respondents used this survey to voice all their thoughts and opinions of the libraries. We had to sift through unfocused responses to find quality feedback pertaining to the diversity of library spaces.

Complications
Unfocused Responses:

  1. Students addressed other issues unrelated to diversity (library hours, collection development, etc.)

Response Rate:

  1. We launched the survey at the end of the Fall semester. We recognize this contributed to the low response rate.
  2. We did provide an incentive with a gift card raffle. However, this did not seem to attract students.

Naysayers:

  1. In a highly divided political landscape, we had several respondents who were not just neutral, but hostile towards the libraries’ initiative to diversify our spaces.

Lessons/Ideas/Questions
Analysis:

  1. What does it mean when students do not stay on topic?
  2. Were students not interested in creating a diverse and inclusive library space?
  3. Do students believe libraries have more important priorities?

Logistics:

  1. Should we try this again in a few years?
  2. How could we have focused the respondents to get the details we wanted?
  3. How could we have reached a wider audience?

Despite low engagement, we received some positive ideas from our student respondents. For example:

  1. Host a student art contest where students contribute art that reflects their identity/culture/ethnicity
  2. Post a world map where students can identify their home

Signage—An Untraditional Assessment Necessity
Will Cook (Jackson)
Show Abstract

Key Project Details
This project highlights the assessment of library signage. In partnership with our Director of Communications and Marketing, we inherited the need to revise how signage was presented to our users. For different reasons, our jobs made signage an informal priority, in order to increase the chances of a positive user experience. Though it was intended to be a one-time task, it turned into a five-year commitment to making continual improvements to all signage in the building. Oddly enough, had we not started this project in 2018, we would have been much worse off at the start of the pandemic.

Central Issue/Problem/Challenge
The central issue lies in the fact that signage can be an afterthought. In many cases, as long as it is posted and the information is still valid, then all is well, right? That became the biggest sticking point. Though our signage was in some cases correct, with nothing wrong from an informational point of view, there were other issues that had been ignored for far too long. Branding was not present as it should have been, fonts and organizational details were missing, and even placement had not been considered. In addition to all these tricky elements, working with staff in the midst of some of the changes was also challenging.

Complications
A cliché but true answer: the informal assessment of signage in the building led us to the dead ends of tradition. Much of what we worked to replace was installed and created by staff who are no longer here, and trying to keep the integrity of the existing work was much more difficult than expected. Complications came frequently in making sure that new signage was produced and installed properly, which took the involvement of many of our partners, such as the print shop and facility services. At the same time, keeping the organization informed as changes took place was necessary, and our changes were not always understood or met with openness.

Lessons/Ideas/Questions
I learned moving forward to:

  • Keep signage a part of your strategic conversation within the organization
  • Be a champion for the work and changes that come with keeping up with something as fluid as accurate branded signage
  • Make signage a part of normal assessment efforts so it does not lag behind
  • Be creative and take all suggestions from ALL user levels, from leadership down to end-user
  • Work with other campus partners (e.g., Campus Rec, Communications) to create new possibilities for materials that might need a new home

Taking Stock of Your Data Holdings: Pitfalls and Pearls
Christopher Hergert (University of North Texas)
Show Abstract

Key Project Details
This talk will cover the process behind, and lessons learned over the course of, a library-wide data inventory. This project’s goal was to create an inventory of what data sets any and all departments were using, were collecting, or already held, as well as which staff or faculty member owned each data set. The UNT Libraries’ Collection Assessment Committee’s higher-level goal in this project was to find where overlaps or discrepancies existed in data holdings, and to attempt to improve efficiency by removing overlaps and other obstacles.

Central Issue/Problem/Challenge
Several problems hit this project in turn: one of the leaders departed UNT for another job; there were considerable territorial attitudes, with departments initially unwilling to share their data; and, most significantly, the COVID pandemic began during the departmental interview phase of the project. This final obstacle meant that the early planning for all of the interviews with departmental leaders, which were intended to bring them on board and secure their buy-in while defining the project’s high-level objectives for those leaders, was all thrown out the window.

Complications
The complications included major communication difficulties as we all adjusted to remote work, changes to the timeline, difficulties with getting departmental leaders to devote work hours to surveying their data holdings, and finally pivoting when the project’s official labor availability was cut.

Lessons/Ideas/Questions
Lessons learned included the need to achieve buy-in from the top down at the very beginning of a project, rather than starting with the people whose input is most materially critical to it; those people all have other things to do with their time, and many of them aren’t strongly incentivized to help with an outside project unless their supervisor incorporates that time into their departmental planning. Similarly, many departments feel a need to justify their costs and projects by keeping their data products to themselves, but taking the time to sit down with these departments’ leaders to define the goals of the project and account for their concerns is critical to achieving that high-level buy-in.

When Empowerment becomes Overwhelming: Strategies for Collaborative Institutional Assessment
Krystal Wyatt-Baxter (University of Texas at Austin)
Show Abstract

Key Project Details
In 2018, I presented an LAC paper focused on implementing an assessment program that distributed responsibility for program assessment across the library system. The intention of the program was to take advantage of my colleagues’ domain expertise while spreading assessment knowledge and skills throughout the organization. I held info sessions, workshops, one-on-one meetings, and drop-in sessions, and basically supported my colleagues through the process of writing and implementing assessment plans in any way I could think of. Plans, and later outcomes and strategies for improvement, were submitted to the campus assessment office as part of our institutional assessment framework.

Central Issue/Problem/Challenge
As the program continued, it became apparent that while my colleagues serving as assessment representatives had good intentions for assessing their programs, they struggled to make time for the kinds of impactful assessment they would like to be able to do. Between the ongoing challenge of making time across the organization to focus on assessment and the onset of the pandemic, we completely changed our institutional assessment strategy to be more centralized and carried out by the library assessment team.

Complications
Simply put, even when the will to assess was there, the bandwidth was not. Assessment was often seen as an add-on and I saw our distributed model causing my colleagues stress rather than empowering them. As an organization, we had bitten off a little more than we could chew.

Lessons/Ideas/Questions
I learned that the best intentions may not match what is possible, and that sometimes it’s best to let a strategy go while thinking about ways to improve it. I would like to hold a discussion with attendees on strategies for including library staff across the organization in assessment work without overburdening staff who already have full plates.

The Challenge of Capturing and Communicating Meaningful Data on Open Educational Resources (OER) Outreach Initiatives
Morgan Stoddard and Hannah Sommers (George Washington University)
Show Abstract

Key Project Details
In 2016, George Washington University Libraries & Academic Innovation (GWLAI) began an initiative to promote and support the use of open educational resources (OER) in courses at the university. The team dedicated to this initiative established and experimented with various mechanisms for capturing and sharing relevant data on its work. This included not only data on outcomes (e.g., OER adoption rates and value of OER adoption), but also data on efforts to raise awareness, establish relationships, and understand impediments. It was important to establish various metrics for evaluation so the team could assess the efficacy of its outreach and library leadership and campus administrators could appreciate the level of effort involved in increasing the use of OER among faculty.

Central Issue/Problem/Challenge
As with many initiatives where data collection is not automated, GWLAI had difficulty tracking data on OER outreach because it required staff to manually record information, such as conversations with faculty about OER. Staff did not always remember or have time to capture this information, in addition to everything else they were asked to track. Additionally, GWLAI defined “OER outreach” broadly to include promoting affordable course materials like those available from library subscriptions. Staff throughout the organization supported faculty adoption of affordable course materials, but they didn’t always think of, for example, acquiring an e-book for a course as something worth recording. Finally, to determine the impact of outreach, follow-up was often required. It proved difficult to establish a good mechanism for tracking the status and outcome of a particular effort, such as whether a conversation with a professor eventually resulted in OER adoption.

Lessons/Ideas/Questions
GWLAI is in a period of transition as it relates to OER outreach after the departure of librarians and staff who had been leading OER initiatives. As new colleagues join GWLAI, there is an opportunity to re-envision how the organization approaches OER outreach and what data it gathers to assess its work and demonstrate value to stakeholders. Presenters are interested in learning from attendees: What data do they collect in addition to, for example, OER adoption rates and student savings? What tools are used to capture and analyze this data? What data points have stakeholders found most useful or persuasive?

Complications
In addition to the challenges noted in other parts of this grid–manual data entry, defining which efforts to capture, and staff departures–the main challenge has been that promoting the use of OER at GW is not anyone’s primary responsibility. Like most universities, GW does not have a position in the library or elsewhere dedicated to open education. Even for members of the library team working on promoting textbook affordability, OER outreach was just one part of their many responsibilities. Not having a position dedicated foremost to promoting and supporting OER adoption may be an impediment to establishing better systems, capturing meaningful data, assessing strategies, and communicating value and impact. How can libraries like GWLAI that don’t have an “open education librarian” position overcome this complication?

Recruiting a Diverse Participant Pool for Academic Library Web Testing
Laura Spears (University of Florida)
Show Abstract

Key Project Details
A team of academic library staff and faculty was tasked with examining a set of web pages that had proven problematic for users in prior usability testing. After many revisions were made to the web pages, the Usability Task Force (UTF) developed five key tasks to examine ease of use and ability to complete the task in a new round of testing. Additionally, the UTF sought to ensure that the study sample represented the diversity of the campus community by targeting recruitment at student services centers (Disability Resources Center, Office of Multicultural Affairs, Department of English Language Learners) on campus serving vulnerable populations (i.e., people of color, non-native English speakers, and those with disabilities).

Central Issue/Problem/Challenge
As students increasingly engage with academic libraries’ online resources, it is important that a library’s website serve all types of users with varying understandings of core library functions and how to find and use them. Previous testing had not monitored the representation of different student and faculty users, but accessibility must be understood along many dimensions that span physical and neurodiverse capacities. Recruiting vulnerable populations, however, undergoes heavy scrutiny from the Human Subjects Committee and requires significant coordination with the centers and adaptation of the language in recruitment emails. It nonetheless remains an important criterion for the library’s assessment program to establish successful recruiting methods that better represent the diverse stakeholders using the library’s websites.

Complications
For this study, the UTF had to rely on others to distribute a recruitment email and the sample ended up not meeting the study needs in terms of size and demographic makeup. Recruitment emails were requested to be edited for each specific student services center and then had to be resubmitted to the Human Subjects Committee for approval of any changes.

The recruiting period was also not long enough to repeat recruitment or try other means. The UTF’s only backup recruiting channel was social media (Twitter and Facebook), which also had limited success.

Lessons/Ideas/Questions
The Assessment Program director must establish long-term relationships with student services centers on campus that exist for more than study participant recruitment. While the UTF’s targeting of specific student populations was well intentioned, the leaders of the centers need to feel confident that their stakeholders will not be exploited. This trust takes time to build.

Include enough recruitment time to adapt the message and utilize other channels.

Develop backup channels as part of the original protocol.

5:10 p.m.–6:00 p.m.: DEI Discussion

Thursday, November 3, 2022

11:30 a.m.–1:00 p.m.: Assessment Accelerators

Developing and Implementing Library Inventories
Holt Zaugg (Brigham Young University)

Learning Analytics: It’s Coming…Get Ready!
Ken Varnum (University of Michigan), Megan Oakleaf (Syracuse University), and Rebecca Croxton (University of North Carolina at Charlotte)

Six Dimensions: Evaluating and Planning Your Assessment Portfolio
Gregory A. Smith and Kory Quirion (Liberty University)

1:30 p.m.–3:00 p.m.: Concurrent Sessions

COVID-19

Envisioning our Future Phase III: The Pandemic Changed Everything
Nancy Turner (Temple University Libraries)
Paper
Show Abstract

Purpose and goals
The Envisioning Our Future project is a case study of how space supports how staff work at Temple University Libraries. We explore how physical and virtual spaces and technologies accommodate our individual work, as well as our work with colleagues and with users. We aim to understand how the hybrid environment has impacted our work and our organization.

Design, methodology, or approach
The project builds on previous research conducted to explore how physical space supports library work. The research took place in three phases, from Spring 2019 as the library staff were preparing for a move to a new library space, to six months post-move (Winter 2020), to Phase III, conducted in the fall of 2021. Each phase consisted of semi-structured interviews with staff (n=86). While Phase III was not part of the original project design, the pandemic and subsequent increase in hybrid work introduced important new aspects to the question of how space, increasingly digital, supports our work as individuals, with colleagues and users.

Phase III consisted of 28 interviews with staff working at a range of levels, in various functional areas, and in various work arrangements, from fully onsite to fully remote. All interviews were conducted via Zoom, audio-recorded, fully transcribed, coded, and analyzed to discern themes.

Findings
Staff working remotely experience benefits in work productivity. They enjoy the ability to focus on individual work and have more control and flexibility around their day’s structure. Staff working remotely appreciate their supervisors’ trust and respect in providing this opportunity. If the technology is working, staff communications via Zoom and other means are seamless, and in some ways afford improved interactions with colleagues and with users.

While staff enjoy the privilege of working remotely, they describe feeling lonely and isolated at times. They describe less interaction with colleagues outside of their immediate department or project team, and less serendipitous connection. Some sense a lack of cohesion across the organization and a growing gap between onsite workers and those working from home, exacerbating what some perceive as an already siloed organization.

Staff working onsite during the height of the pandemic developed a special camaraderie from this shared experience. At the beginning of the pandemic, when vaccinations were unavailable, health and safety concerns contributed to anxiety and frustration for onsite workers, as well as their remote-working colleagues.

Working in the same physical space offers opportunities for serendipitous meet ups and informal socializing. Being onsite also allows for a direct connection with students and the community who depend on the libraries’ physical spaces and resources, allowing for immediate assistance and communication with patrons, perhaps harder to accomplish through virtual means.

Practical implications or value
The project resulted in recommendations for next steps, from staff discussions of best practices for effective hybrid meetings to review of job descriptions for remote work opportunities across all levels of the organization. Of interest to all libraries, the research generates questions about how hybrid work environments impact organizational culture. Now that students are back on campus, we need a better understanding of their needs for in-person interactions with library staff.

Sharing is Caring: Empowering Voice and Engaging Library Staff During the Pandemic
Susanna Cowan (University of Connecticut)
Paper
Show Abstract

Purpose and goals
From March 2020 through spring semester 2021, the majority of our library staff worked wholly remote schedules. For the 2021–22 academic year, staff worked mostly hybrid schedules, resulting in, on a typical day, only some fraction of staff being onsite concurrently. As at many academic libraries, work was mainly accomplished through virtual interaction launched from a wide array of offsite home offices or de facto (improvised, sometimes shifting) work spaces. Although our Library staff had a variety of means for collaborating and communicating in both formal and informal ways – from WebEx and Teams to Slack, Miro boards and more – gauging the experience of the pandemic along both work-professional and home-lived-experiential axes was anecdotal more than intentional.

Several questions persisted (and persist) across the span of the “pandemic months” – including not only the very human “how is everyone doing out there?” but also the organizationally critical “how is collaboration going?,” “how well are we connecting to each other and stakeholders,” “how engaged are staff as this continues,” and “how successfully are Things Getting Done”?

In the pandemic, things have, even when in planning mode, mostly been reactive: shifting institutional and public health codes had to be mashed up rapidly with library norms and care for the wellbeing of our staff. In this climate, there was no time to launch formal large-scale climate surveys or service assessments. Instead, over the course of the pandemic, we conducted a series of assessments that complemented and extended each other, although that was not their initial intent. It would be misrepresenting this assessment sequence to describe it as planned – like most things in the pandemic, it emerged.

Nonetheless, in aggregate these assessments became critical pieces of feeling our way forward through this period. Powerfully, what also became evident was that staff were willing, perhaps eager, to be offered opportunities to share their experience of and perspective on the pandemic as it was affecting their work and personal lived experience. In this sense, these assessments, which had functional intent, turned out to be powerful tools in giving space for individual Voices to be heard in ways that pushed past the loose, informal exchanges in public forums like Slack. During a period of overall withdrawal, this engagement with questions about the workplace, work interaction, and the work itself was striking.

Design, methodology, or approach
There is no one tradition of scholarship or method that these assessments, or that this reflection on the organizational impact of these assessments, draws on, although it is fair to say that both are indebted to the methodology behind ARL’s ClimateQUAL organizational climate and diversity survey. We had only just finished running the ClimateQUAL several months before the lockdown. As the library and University shut down in March 2020, we were finishing data analysis and writing first drafts of an executive summary of that survey’s findings to share with staff (work that would, like many things, be completed later than planned, as we had to push pause in the face of the immediate emergency).

ClimateQUAL was formative for our library, and every all-staff assessment implemented since has drawn on both principles in its design and findings specific to our organization. ClimateQUAL is a survey built to elicit often very personal experiences of an organizational culture in a manner that, in structure, content, and implementation, emphasizes care for privacy and personal autonomy. We launched ClimateQUAL in an organization that was suffering from rifts caused by distrust of leadership, feelings of disempowerment and resulting disengagement. Although the findings of ClimateQUAL showed those organizational characteristics to still exist, the way we ran ClimateQUAL worked to bolster organizational trust and individual empowerment. In pre-survey education, survey rollout, and post-survey analysis and data sharing, we were careful and meticulous. We made ample time for questions about survey data stewardship, and chose to share the full survey data report, with the exception of a single comment that attacked another staff person by name.

As a consequence of running ClimateQUAL, all of the organization-wide assessments designed during the pandemic, which included several surveys and a series of facilitated topical conversations, were intentionally structured to solicit feedback in ways that were empowering, transparent, and safe.

The experience of ClimateQUAL, and its almost uncanny placement on the eve of the pandemic, uniquely prepared us to design as-needed assessment over the course of the two years of remote and hybrid work. The “sequence” of assessments comprised an initial, quick, almost on-the-fly survey of “remote work.” Running across fall semester 2021, a group picked up a piece of ClimateQUAL findings by conducting a series of “conversations around teams and teamwork” aimed at uncovering why our organization had scored relatively low on the “structural empowerment of teams” climate, a topic that felt particularly timely in this time of remote work. At the end of fall 2021, as we completed our first semester of mostly hybrid work schedules, we conducted an extensive evaluation of remote and hybrid schedules and work, asking staff to share their perspectives on several key aspects of “working,” including communication, productivity, engagement, and relationship/community building and maintenance. Finally, a strategic project group conducted a survey of the pandemic’s impact on library work and services in winter 2022.

Findings
Each of these instruments has its own local findings, and we are still drawing out of the cumulative data what we have learned, in sum, about various aspects of our organization and the work of its staff during this period.

But as important as the per-instrument findings, for this study, is what we learned about running intentionally-designed and empowering instruments at a time of individual partial or complete physical isolation. During such a period, one would expect, perhaps, to find exacerbated organizational issues such as disengagement and distrust. This would seem especially likely during a period in which “emergency” decisions were often top-heavy (or at least originating from narrow circles around leadership), as urgency dictated fast and definitive response.

But what we found, and will continue to tease out, is that staff were, regardless of specific feedback, eager to participate when invited to do so. When conducting ClimateQUAL, the work was to woo staff to participate, and the method of wooing was to be transparent, make promises (about confidentiality, data curation, and communication of results), and to stick to those promises. The surveys and group conversations we conducted over the months of the pandemic took these lessons from ClimateQUAL – methods were transparent, protection of participants was paramount, and findings were shared as wholly as possible. When conducting the most sensitive of these assessments, the Fall ’21 (alternative work arrangement) Staff Self-Assessment, we designed a survey with an “employee” section and a “supervisor” section, but allowed any staff person to click through the questions of both to see what was being asked. Open-ended questions, often the “ask” that asks too much, were highly successful, and staff answered even multiple open-ended questions in single instruments.

So although the “findings” of each instrument are valuable, and this paper will highlight some of them, the “finding” that may be most lasting is the degree to which we can use assessments to cultivate Voice and engagement at a time when communication is literally channeled and constrained beyond our control.

The idea of self-assessment as critical to organizational health has, of course, been around for a long time, and informs not only ClimateQUAL but also powerhouse organizational-corporate approaches such as TQM and its many iterations. That background is relevant to this paper, although the focus here is on how self-assessment was particularly powerful at a time of overall constraint of individual voice and empowerment.

Practical implications or value
We have learned many things about what works in assessing one’s own staff that can translate easily across organizations. There will be many, many institution-specific and regional/national (and international) studies to emerge about “what happened” in the pandemic, from services to staff experience. There is no doubt that trends will emerge from these studies that our local investigations will echo.

But the bigger lesson that will have more lasting impact on our organization – and more importance as we share with the greater assessment community — will be how “intentional” and organizationally-aware question asking can serve two purposes at once: the satisfaction of an immediate data need and a more “meta” purpose of making participants feel listened to and empowered to use their Voices, a feeling that in turn leads to continued participation (and empowerment). We have two imperatives, as we emerge from what we hope has been the worst of the pandemic: to learn about what happened and to learn from what happened.

This paper hopes to contribute to the latter. It’s a question worth asking: how does what we experienced locally extend beyond our own community? As we emerge into the next-normal, will such intentional, organization-savvy question-asking similarly elicit feedback from stakeholders such as students, faculty, and the public while also making them feel more empowered in the process? We will be running LibQUAL in the coming year: it may be our first broadly-impactful opportunity to consider both what we’re asking and what we’re accomplishing by asking. We have learned something about how question-asking, when designed on principles of personal empowerment and organizational transparency, can accomplish something undesigned (at least initially) within our organization. Can we draw on what we have learned to empower new voices, perhaps, that we have not heard from traditionally in library service assessments?

“They’ll Still Come, They Still Need You, Right?” Library Value after COVID-19
Amy McLay Paterson (Thompson Rivers University)
Paper
Show Abstract

Purpose and goals
This paper will discuss the work experiences of Canadian academic librarians during the COVID-19 pandemic, as they relate to participants’ thoughts on the value of libraries and librarians going forward. Throughout the semi-structured interviews, almost all study participants shared thoughts on how libraries should change as a result of COVID and how their work was valued (or not) by their patrons, colleagues and administration.

Design, methodology, or approach
As the goal of this study was to explore in-depth individual experiences, it was determined that semi-structured interviews would be the best method of capturing our participants’ thoughts, feelings and understandings of their work during the COVID-19 pandemic. Previous attempts to capture the phenomenon of librarian work during the COVID-19 pandemic have been through surveys, which inherently capture a wider breadth of experience; however, we wanted the chance to both probe into the depths of our participants’ experiences and to follow up or clarify any points that were raised. Our scope was limited to those working in non-administrative librarian positions at Canadian post-secondary institutions. While the observations of other library workers, such as library technicians or assistants, would undoubtedly be interesting and noteworthy, it was determined that their work and experiences would be distinct from that of librarians.

Findings
Most participants were resistant to returning to the “old normal” without myriad changes inspired by the COVID-necessitated adaptations. Proposed changes were varied and often specific to the participants’ work area but often focused on either the future of remote work or the reevaluation of core services. However, there were concerns raised about whether or not their ideas would be implemented or even heard. Additionally, many participants felt caught between proving their value through productive (and measurable) labour and the care-work that felt necessary and pressing but was not externally validated.

Practical implications or value
Libraries often fall back into the refrain of “just the way we’ve always done it.” Furthermore, austerity and resilience are constantly invoked as a crisis response for libraries. When we interviewed librarians in March and April of 2021, there was a resounding belief that COVID-19 was a different sort of crisis, that it was an opportunity for real change–changes that the librarians in our study were actively hoping for. However, in order to achieve these changes, library decision makers need to reevaluate their conceptions of both library value and core services.

Uncovering More Treasures in the Data
Carolyn Dennison (University of Hawai‘i at Mānoa Library)
Poster
Show Abstract

Purpose and goals
E-resources are a ubiquitous part of the educational and research activities of institutions of higher education. The need for e-resources became even more apparent when universities and colleges were forced to shut down their campuses with the onset of the COVID-19 pandemic in March 2020. Instructors taught their classes virtually. With very few exceptions, researchers conducted their work from off-campus locations. Campus libraries responded by providing access to e-resources and online services and to very limited, if any, physical resources and in-person services. This presentation explores how local evidence obtained from an ezProxy server can uncover the impact of COVID-19 on patron behaviors, as well as reveal changes in e-resource usage over an extended period of time.

Design, methodology, or approach
The University of Hawai‘i at Mānoa Library serves a land-, sea-, and space-grant research institution and requires almost all of its patrons to be authenticated through an ezProxy server in order to access e-resources. Using ezProxy log data captured over a five-year period (January 2017 to December 2021), this presentation will answer who is accessing resources and where and when they are accessing them, determine whether patron usage patterns have changed over time, and examine whether COVID-19 has affected patron usage.
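The who/when/where aggregation described above can be sketched with a few lines of log parsing. This is a minimal illustration, not the Library's actual pipeline: the log line layout, field positions, and usernames below are assumptions, since EZproxy log formats are locally configured.

```python
import re
from collections import Counter
from datetime import datetime

# Hypothetical combined-style log line with the session username in field 3;
# real EZproxy deployments configure LogFormat, so positions will differ.
LINE_RE = re.compile(r'^(?P<ip>\S+) \S+ (?P<user>\S+) \[(?P<ts>[^\]]+)\]')

def parse_line(line):
    """Extract (user, timestamp) from one log line, or None if it doesn't match."""
    m = LINE_RE.match(line)
    if not m:
        return None
    # Drop the timezone offset and parse the local timestamp.
    ts = datetime.strptime(m.group('ts').split()[0], '%d/%b/%Y:%H:%M:%S')
    return m.group('user'), ts

def usage_by_hour(lines):
    """Count accesses per hour of day -- a 'when are patrons connecting' view."""
    hours = Counter()
    for line in lines:
        parsed = parse_line(line)
        if parsed:
            _, ts = parsed
            hours[ts.hour] += 1
    return hours

# Two made-up log lines for illustration only.
sample = [
    '128.171.0.1 - jdoe [15/Mar/2020:22:04:11 -1000] "GET /login HTTP/1.1" 200 512',
    '128.171.0.2 - asmith [16/Mar/2020:09:30:02 -1000] "GET /connect HTTP/1.1" 200 512',
]
print(usage_by_hour(sample))  # Counter({22: 1, 9: 1})
```

The same counter, keyed on the user field joined to patron-group data, would support the per-group comparisons over time that the presentation describes.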

Findings
An initial analysis of the log data indicates consistent usage patterns for all patron groups over time. We expect that circumstances stemming from COVID-19 will have some impact on when and where patrons access the Library’s e-resources.

Practical implications or value
Analyzing usage data over an extended period of time may provide more generalizable conclusions regarding patron usage of library e-resources at a Research 1 university.

Shifting to Hybrid Work: An Evaluation of Flexible Work Arrangements in the Library
Krystal Wyatt-Baxter (University of Texas at Austin)
Poster
Show Abstract

Purpose and goals
In Fall 2021, the library system at our large public university participated in a flexible, university-wide pilot implementation of Flexible Work Arrangements (FWAs) for staff. FWAs in this case are defined as “a variation in where a job is performed (e.g. teleworking) or the time the work is performed (e.g. flexible schedule).” The pilot included a mandate to assess the impact of FWAs on the organization, but didn’t specify goals or metrics. With the latitude to decide what success of FWAs meant for our libraries, we devised a two-prong approach to answering the following question: What challenges and opportunities accompany a shift to more flexible work arrangements in an academic library, and how can supervisors best implement these changes?

Design, methodology, or approach
To thoroughly explore staff perceptions and experiences around the implementation of FWAs, we designed two surveys. The first survey consisted of open-ended questions designed to elicit information about how supervisors decided which FWA options would be available to their staff, what methods they used to roll out the implementation, what (if anything) they were considering changing, and their perceptions of how work in their unit was or was not affected by the changes. The second survey went to all library staff members and consisted of a mix of multiple-choice questions and open-ended questions designed to learn about how FWAs affected job satisfaction and respondents’ abilities to achieve their work goals. Analysis of both surveys was shared with the entire staff through brief reports and a set of visualizations designed to spur discussion on how the pilot implementation of FWAs could be improved upon.

Findings
Findings show that the vast majority of staff members report being able to adequately achieve their work goals and communicate with colleagues, and staff who benefit from individual FWAs demonstrate the most satisfaction. Staff who had more flexibility expressed appreciation for FWAs, but many staff suggested seeking ways to extend flexibility to all staff members by finding creative ways to add flexibility to, for example, front-line service positions.

Practical implications or value
As workplace expectations shift following the onset of the Covid pandemic, flexibility will likely be essential to the retention of library workers. While flexibility offers great promise, it can also lead to further inequality in the workplace if not carefully considered and designed. This study offers a case study in how to assess flexible work arrangements and contributes to a larger conversation about what we want our workplaces to value.

Adapting to “New Normal”—Response to COVID-19: Village Libraries’ Services
Jelena Rajić (Radislav Nikčević Public Library)
Poster
Show Abstract

Purpose and goals
The aim of this paper is to outline the impact of COVID-19 on the services that the village libraries provide their patrons with and to demonstrate the adaptability of village librarians to drastic changes during the COVID-19 pandemic while the facilities were closed and then re-opened. It was important for the “Radislav Nikčević” Public Library to gain a better understanding of the shift from in-person services to online and door-to-door services that the village librarians provide, since these libraries are meeting places and cultural and informational hubs of their local communities for patrons of all ages.

Data Presentation & Visualization

Visualizing the Intersection of Impact and BTAA Libraries Investments in the Research Enterprise Using Open Government Data: An Exploratory Model Using Tableau
Sarah Murphy (The Ohio State University)
Paper
Show Abstract

Purpose and goals
Grant funding serves as an important proxy for quantifying the value of the academic library. Past studies rely on researchers to self-report whether they used and cited library resources when crafting successfully funded research proposals. (Kaufman 2008, Tenopir et al. 2010) More recent studies seek to quantify library support for the grant-seeking process using data gleaned from Scopus, Web of Science, Journal Citation Reports, and other tools. (Boukacem-Zeghmouri, et al. 2016; Monroe-Gulick, Currie, & Weller 2014) With the advent of the Federal Funding Accountability and Transparency Act of 2006 and the NIH Public Access Policy, libraries can now leverage open government data to explore the relationship between grant funding and investments in library collections and services. This study explores modeling open data on government spending and federally funded research outputs to 1) visually demonstrate how libraries contribute to the research enterprise by providing information scholars need to both develop and sustain their research agendas and 2) allow libraries to visualize and utilize this same data to inform the development of library services and collections.

Design, methodology, or approach
Open government data now allows libraries to identify publications authored by their institution’s faculty that result from federally funded research. Libraries may model this data to determine which journals faculty chose to publish their research in, which journals faculty cite, and the value of these individual research outputs. The model for this project was created by first identifying all NSF and NIH project grants awarded to BTAA faculty between FY2010 and FY2018 using the Federal RePORTER (now USAspending.gov) data portal. BTAA schools collectively expended more than $11 billion on research in FY2019 and have a robust program for optimizing researchers’ access to member libraries’ collections. A list of publications associated with these grant projects was then downloaded with the corresponding link tables from the Federal ExPORTER and enhanced by pulling lists of citing papers and reference papers to identify which journals informed the authors’ research and which journals cite it. NLM IDs were added wherever possible using the journal title for each publication to later identify and use MeSH terms as visualization filters. The Relative Citation Ratio (RCR) value was also pulled for each publication to use as a supplemental metric of value. All data was then modeled in Tableau using a series of relationships and joins and visualized in a series of interactive dashboards.
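The link-table joins at the heart of this pipeline can be illustrated with a minimal Python sketch; all field names (project_id, pmid, journal_title, rcr) and records below are invented stand-ins for illustration, not the actual RePORTER/ExPORTER schema.

```python
# Hypothetical sketch: join grant awards to publications via a link table,
# carrying journal and RCR data along. Field names and rows are invented.
grants = [{"project_id": "R01-0001", "pi": "Smith", "fiscal_year": 2015}]
link_table = [{"project_id": "R01-0001", "pmid": "PM1"}]
publications = [{"pmid": "PM1", "journal_title": "J Example Sci", "rcr": 1.8}]

def join_grants_to_journals(grants, link_table, publications):
    """Return one merged row per (grant, publication) pair."""
    pubs_by_pmid = {p["pmid"]: p for p in publications}
    pmids_by_project = {}
    for row in link_table:
        pmids_by_project.setdefault(row["project_id"], []).append(row["pmid"])
    joined = []
    for grant in grants:
        for pmid in pmids_by_project.get(grant["project_id"], []):
            if pmid in pubs_by_pmid:
                joined.append({**grant, **pubs_by_pmid[pmid]})
    return joined
```

In Tableau this merge corresponds to the relationships and joins described above; the sketch simply makes the shape of the underlying data explicit.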

Practical implications or value
Modeling and visualizing the outputs of successful grant-seeking in Tableau allows libraries to explore this data at both a high aggregate level and a lower level of detail. This project demonstrates how to assemble and utilize such data both to illustrate libraries’ ongoing contributions to the research enterprise and to inform library collections and services.

Visualizing Value of Library Collections Relative to the University Teaching and Research Enterprise: An Application of the CDL Journal Weighted Value Algorithm by Three BTAA Libraries
Sarah Murphy (The Ohio State University)
Paper
Show Abstract

Purpose and goals
Academic libraries license many e-resources through state or regional consortia. Differences in school demographics, disciplinary emphases, budgets, priorities, and licensing restrictions can make the analysis of use and cost patterns in shared collections challenging. Studies have shown that a minority of e-journals in publisher packages get a majority of the downloads. However, not all articles downloaded are later used in teaching or research, leading to the question: What other metrics of use should be considered and how can they be presented to support decision-making?

Design, methodology, or approach
The Big Ten Academic Alliance (BTAA) recently purchased a dataset from Clarivate comprising the entire Web of Science (WOS) database. The authors, representing three of the fourteen schools in the Alliance, adapted for analysis and visualization the California Digital Library (CDL) Journal Weighted Value Algorithm, which generates a value score for individual journals using authorship, citation, and usage data. This project shows how to model, in Tableau, COUNTER download data with bibliographic, authorship, and citation data from Web of Science, plus average cost data from Library Journal’s annual periodicals price survey, and then effectively display this data through a series of dashboard visualizations. Specifically, the authors created three prototype dashboards to visually answer several questions by subject discipline and publisher package at both broad and more granular levels. The questions included:

Title dashboard

What journals do BTAA faculty and researchers publish their research in?

What journals do BTAA faculty and researchers cite?

What journals do BTAA faculty and researchers download?

Do 20% of the titles represent 80% of total downloads?

Publisher dashboard

Do a publisher’s titles in a subject area represent better value compared to the average score for all publisher titles in that subject?

Subject dashboard

Do authorship/citation rates vary widely by subject discipline?
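The 80/20 question in the title dashboard can be checked directly wherever per-title download counts are available. A minimal Python sketch (the download counts below are invented for illustration):

```python
def pareto_share(downloads, top_fraction=0.2):
    """Fraction of total downloads captured by the top `top_fraction` of titles."""
    ordered = sorted(downloads, reverse=True)
    top_k = max(1, int(len(ordered) * top_fraction))
    total = sum(ordered)
    return sum(ordered[:top_k]) / total if total else 0.0

# Example: with these (invented) per-title counts, the top 20% of titles
# account for roughly 87% of all downloads.
share = pareto_share([100, 50, 10, 5, 2, 1, 1, 1, 1, 1])
```

The same calculation can be reproduced in Tableau with a rank-and-running-total calculated field; the function form just states the rule unambiguously.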

Findings
Each dashboard shows how many BTAA schools subscribe to each title. Dashboard users can also limit the view to a particular school.

Practical implications or value
Tableau offers an opportunity to automate the packaging and display of large datasets, allowing librarians to design useful visualizations and then set a schedule to refresh the data on a daily, weekly, or monthly basis, or on a customized schedule. This saves library analysts significant time when information is needed to inform decision-making. This project shows how to model, analyze, and assess the value of publisher journal collections held locally or by the consortium by visualizing trends for downloads, citations, and authorships. It also provides a proof of concept for the long-range potential of automating the visual analysis of big data to enhance academic library collection development.

Development of an IPEDS Academic Programs Dashboard: Leveraging Public Higher Education Data for Strategic Insight
Clair Johnson (University of Pennsylvania)
Paper
Show Abstract

Purpose and goals
This paper addresses the question of how academic libraries can gather intelligence about patterns within the institution and across the ecosystem of higher education. More specifically, it explores a particular approach to monitoring changes in academic programs and the emergence of new areas of study.

Design, methodology, or approach
To demonstrate this approach to tracking academic programs, this paper explains how data from the Integrated Postsecondary Education Data System (IPEDS) were accessed and processed into an interactive dashboard developed in Microsoft Power BI. The final dashboard is publicly accessible and displays data gathered from the 132 institutions with a Carnegie classification of “doctoral university: very high research activity.”
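As a hedged illustration of the kind of processing involved, the sketch below filters hypothetical completions records to one Carnegie class and totals awards by two-digit CIP family; the field names and values are invented stand-ins, not the actual IPEDS schema, and the real dashboard was built in Power BI rather than Python.

```python
from collections import defaultdict

# Invented completions rows; "30" is the multi/interdisciplinary CIP family,
# "45" is social sciences. The unitid/carnegie/cip/awards names are assumed.
completions = [
    {"unitid": 1, "carnegie": "R1", "cip": "30.7001", "awards": 40},
    {"unitid": 1, "carnegie": "R1", "cip": "45.0601", "awards": 25},
    {"unitid": 2, "carnegie": "R2", "cip": "30.7001", "awards": 10},
]

def awards_by_cip_family(rows, classification="R1"):
    """Total awards per two-digit CIP family, limited to one Carnegie class."""
    totals = defaultdict(int)
    for r in rows:
        if r["carnegie"] == classification:
            family = r["cip"].split(".")[0]
            totals[family] += r["awards"]
    return dict(totals)
```

Aggregations of this shape are what the dashboard visualizes over time to surface growing and declining areas of study.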

Findings
This paper demonstrates the value of utilizing public data with data dashboarding methods to leverage crowdsourced insights from a massive trove of information. Insights that have already emerged from these efforts include:

  • The growth of multi- and inter-disciplinary areas of study
  • The emergence of data-focused areas of study alongside the decline of content-bound areas of study (e.g., data science in contrast to economics)
  • The unique identities of Ivy League institutions based on their predominant areas of study

These insights (and others that have emerged or will continue to emerge) provide information that can be used to inform the development of academic libraries. This paper will discuss the implications for hiring, organizational structures, and services provided. One such implication is the need to hire librarians able to support multidisciplinary research and curricula and to create an organizational structure that does not position them in service to individual subject areas.

Practical implications or value
This paper provides a proof of concept for the use of public data in an interactive dashboard and demonstrates how other institutions can replicate this approach with any number of datasets and visualization tools. Furthermore, in an era when massive troves of data serve as an overwhelming source of information, this paper demonstrates how those working in library assessment can crowdsource insights most relevant to the context in which they work. Finally, this paper demonstrates the importance of thinking about library assessment with an outward-looking lens, assessing not only how the institution is functioning internally, but also monitoring the external trends and patterns that ultimately impact the work of the library.

The Missing Data: The Absence of Data Literacy Standards in Librarianship
Heather Charlotte Owen (Syracuse University)
Poster
Show Abstract

Purpose and goals
Data literacy is the ability to read, understand, work with, analyze, and communicate with data. In recent years the importance of data literacy has grown dramatically, with many libraries and institutions recognizing it as a new priority. The ability of librarians to instruct users and assess their data literacy skills is hampered, however, by the lack of an official data literacy standard or framework. What skills should individuals possess to be considered data literate? What ethical or social dilemmas should they reflect on before they work with data? How can users gain the skills necessary to share their data with the world and become part of the data conversation? To answer these questions, this poster proposes that an official working group be created to draft a data literacy standard or framework.

Design, methodology, or approach
According to Chapter 14 of ACRL’s Guide to Policies and Procedures, the creation of a new standard or framework requires analyzing current standards to see if any satisfy current needs. Although there are no data literacy standards within the field of librarianship, this project will examine general information literacy standards such as ACRL’s Framework for Information Literacy in Higher Education. Following this, it requires an analysis of standards outside of the field of librarianship, and the development of a working paper suggesting a structure for the proposed document, a development timetable, and supporting requirements.

In order to satisfy these requirements, this poster will analyze current data literacy standards, such as the SLDS Data Use Standards, and Pre-K-12 Guidelines for Assessment and Instruction in Statistics Education II: GAISE II. This project will follow in the footsteps of Maybee and Zilinski’s article “Data informed learning: A next phase data literacy framework for higher education,” and will propose several questions to the poster audience. Should an official working group be created to draft a data literacy standard or framework? How can we ensure this data group possesses diverse perspectives, and incorporates data ethics and data bias into the framework?

Findings
This project anticipates discovering that no library standards currently support data literacy and that a new standard will need to be created, using the following standards as a guide. GAISE II, which is focused on PreK-12 education, centers on formulating statistical questions, collecting/considering data, analyzing data, and interpreting the results. SLDS, which is focused on educator training, features awareness of data quality, data types, data sources, data analysis, data collection, data assumptions, data privacy and ethics, data management, data discovery, data interpretation, data presentation and visualization, data laws, and data collaboration.

Practical implications or value
The goal of this project is to inspire other individuals to create a working group so the process of writing a working paper for ACRL can begin. By presenting at LAC, a diverse group of individuals, including individuals from marginalized communities, will hopefully be galvanized to participate.

Finding a Home for A–Z and Everything In Between
Anne Koenig (University of Pittsburgh)
Poster
Show Abstract

Purpose and goals
This poster describes how analytical and visualization tools were used for planning and charting collection moves during Hillman Library (main library at the University of Pittsburgh) renovations. This is a follow-up to our presentation at the last Library Assessment conference which outlined how we used analytical and visualization tools to determine what collections to send to storage and enable users to locate materials as they moved to various swing spaces.

Now that we knew what would be moved to our high-density storage facility as part of the renovation of Hillman Library, we needed to develop the final mapping to move collections from various swing spaces to specific shelves within the renovated space. Our goals for the mapping were to ensure that:

1) The items remaining in Hillman would fit in the space allowed and meet a maximum shelf capacity of 75% to allow for growth;

2) Call numbers would follow the prescribed order and be divided logically between floors;

3) Contractors engaged to move the collection would know exactly which shelves would contain specific call numbers.

Design, methodology, or approach
We used inventory reports from Alma Analytics and Tableau visualizations to estimate the footprint of the collection remaining in Hillman Library after the criteria for sending items to storage were applied. We also examined recent acquisition rates across call number ranges to estimate future growth in different subject and physical areas. This information was incorporated into a spreadsheet that calculated how many shelves would be needed in the renovated space for each LC call number class. Using this information along with building floorplans, we developed a color-coded spreadsheet illustrating each shelf, the call number range of its contents, and the empty shelves incorporated into the mapping to allow for growth and to ease collection shifting and adjustments.
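The shelf-count arithmetic behind that spreadsheet can be sketched as follows; only the 75% maximum fill comes from the project goals, while the 35-inch shelf width and all linear-footage and growth figures are assumptions for illustration.

```python
import math

MAX_FILL = 0.75       # project target: fill shelves to at most 75% to allow growth
SHELF_INCHES = 35     # usable width of one shelf (an assumed value)

def shelves_needed(linear_inches_by_class, annual_growth_inches, years=10):
    """Shelves required per LC call number class, sized for projected growth."""
    plan = {}
    for lc_class, inches in linear_inches_by_class.items():
        projected = inches + annual_growth_inches.get(lc_class, 0) * years
        plan[lc_class] = math.ceil(projected / (SHELF_INCHES * MAX_FILL))
    return plan
```

A spreadsheet formula does the same job; the point is that each class's shelf allocation is driven by measured footprint plus an acquisitions-based growth projection, divided by usable capacity per shelf.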

Findings
We are currently in the final stages of transferring material to Storage and initial stages of the collection shift within Hillman. As more items are transferred, it is easier to discover mismatches between inventory reports and what is actually on the shelves. This has led to some quick catalog clean-up projects as well as plans for future inventory scanning and updates to the catalog to fine-tune reporting and ongoing collection maintenance and weeding for Storage. Building buffer space in the mapping was critical to account for any inventory anomalies, and this also provided targeted start and end ranges for call numbers for the movers to use and to ensure we had enough room for material to be shifted.

Practical implications or value
All our techniques are easily adaptable and replicable, and they can use freely available tools like spreadsheets and any visualization or graphing software. They can be used for large or small moving projects alike and can aid in evaluating the composition and distribution of collections.

Download, Clean, Repeat: Creating a Sustainable Data Dashboard Workflow
Kaypounyers Maye (Tulane University)
Poster
Show Abstract

Purpose and goals
In academic libraries, data dashboards usually present internal library data related to spaces, collections, personnel, and services in an easily digestible visual format. These dashboards can be used to inform data-driven decisions and strategic planning… but they can also be labor-intensive to build and maintain. For this pilot project, Clemson Libraries used several common academic library software platforms as well as user-friendly data visualization tools to create a simple dashboard, with a special focus on prioritizing sustainability of the workflow. The goal was to maximize automation so that the dashboard can remain current with minimal labor; it should serve as a resource and complement to existing library assessment initiatives, rather than another new initiative competing for employee time.

Design, methodology, or approach
Before building the dashboard, the project team met with key stakeholders (Dean of Libraries, Head of Reference, etc.) to determine exactly what visualizations they would like to see, especially for decision-making, planning, or outreach purposes. From there, we took an inventory of existing data sources; these included Sensource (gate count data for the main library on campus), Alma Analytics (collections data from our ILS), and Springshare (instruction and reference consultation data). We used Tableau for our visualizations, as it provides one interface that can connect to live data sources, transform them into visualizations, and customize the look and feel of the dashboard for our university branding, and it is easily embedded in our library website.

Findings
As is often the case, what seemed initially like a straightforward project quickly became more complicated. Although Tableau supports linking directly to data sources, our proprietary library systems do not currently have the corresponding functionality to accomplish this. As a workaround, we developed a schedule of specific reports to run and a shared Google Drive folder for these reports (mostly CSV files) that links to Tableau. We are currently partnering with our library IT department to set up automated reports and university server space for the data so that it is not tied to an individual’s Google account.
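One way to keep a Tableau connection stable across scheduled exports is to copy the newest dated file to a fixed path that the dashboard always reads. The sketch below is a hypothetical version of that workaround; the file-naming convention and paths are invented, not the actual report names.

```python
import shutil
from pathlib import Path

def refresh_stable_copy(export_dir, stable_path):
    """Copy the most recent dated CSV export to the fixed path Tableau reads.

    Assumes exports are named gatecount_YYYY-MM-DD.csv (an invented
    convention) so lexicographic order matches chronological order.
    Returns the copied file's name, or None if no exports were found.
    """
    exports = sorted(Path(export_dir).glob("gatecount_*.csv"))
    if not exports:
        return None
    shutil.copy(exports[-1], stable_path)
    return exports[-1].name
```

Run on a schedule (cron, Task Scheduler, or a cloud function), this keeps the dashboard's data source current without anyone editing the Tableau connection.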

Practical implications or value
For our own institution, the dashboard will have the immediate value of providing quick visualizations for commonly requested statistics, such as the gate count for a certain date range, number of checkouts, or reference statistics. This will save time for employees that need these numbers, and present them in an official format consistent with university standards. More broadly, many academic libraries have either recently created or are in the process of creating data dashboards to communicate with stakeholders to show the impact of their work, and many libraries use the same software systems. Although no two institutions will have exactly the same dashboards, we hope to shed light on the work behind our dashboard so that others may use it as a starting resource.

Organization/Space/Critical

A Sense of Place
Holt Zaugg (Lee Library BYU)
Paper
Show Abstract

Purpose and goals
Recent efforts have aimed to help all students feel included and welcome on campus, and especially in the library. Before embarking on efforts to help students feel more included and welcome, however, one needs to know the current state of inclusion and welcome. This paper describes our first efforts to establish baseline data for our library.

Design, methodology, or approach
Following a literature review, we identified six measures that we collectively call Sense of Place measures. These include:

A student’s sense of belonging
A student’s connection to the library
How respected the student feels in the library
How safe a student feels in the library
A student’s level of comfort in talking with a library employee
How welcome a student feels in the library

Using a survey and a random stratified sample of undergraduate students, we invited students to indicate the level to which they experienced each of these measures. We examined mean ratings for all students and then disaggregated by ethnicity, gender, and university status. We also used a principal components analysis to determine how the Sense of Place measures clustered for each group.
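The disaggregation step can be sketched in a few lines of Python; the survey rows below are invented, with only the measure names taken from the list above.

```python
from collections import defaultdict
from statistics import mean

# Invented example responses on a 1-5 scale; only the measure names
# (belonging, connection, welcome) come from the study itself.
responses = [
    {"group": "undergraduate", "belonging": 4, "connection": 2, "welcome": 5},
    {"group": "undergraduate", "belonging": 5, "connection": 3, "welcome": 4},
    {"group": "graduate",      "belonging": 3, "connection": 2, "welcome": 4},
]

def disaggregated_means(responses, measures, by="group"):
    """Mean rating per measure, overall and split by one demographic field."""
    result = {"all": {m: mean(r[m] for r in responses) for m in measures}}
    grouped = defaultdict(list)
    for r in responses:
        grouped[r[by]].append(r)
    for group, rows in grouped.items():
        result[group] = {m: mean(r[m] for r in rows) for m in measures}
    return result
```

The same pattern applies to disaggregating by ethnicity or university status; the principal components analysis is a separate step performed on the full response matrix.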

Findings
There were strong mean ratings for five of the six Sense of Place measures, with one, connection to the library, having a lower mean rating. Disaggregation helped to indicate specific groups that may have higher or lower mean ratings. The principal components analysis typically indicated that all measures fell into one or two groups, indicating a common Sense of Place measure.

Practical implications or value
The mean ratings provide baseline data for comparison following future initiatives. The six measures create an initial understanding of components that help students feel like the library is a positive part of their lives and that students are a positive part of the library’s life. Most importantly, the results offer value about what we are doing well with an eye towards where and how we can improve.

Re/envision, Re/imagine: Student and Employee Assessment for a Library Space Redesign
Laura Uglean Jackson (University of Northern Colorado)
Poster
Show Abstract

Purpose and goals
This paper focuses on creative methods to gather employee and student opinions for use in a possible interior redesign of a 50-year-old academic main library, and to generate useful and shareable data for administrators and architects. The UNC Libraries Assessment Committee partnered with a campus research unit, the Social Research Lab (SRL), to create interactive and visually compelling ways to engage both employees and students, moving beyond the traditional questionnaire, to learn what features, design elements, and services are desired in a remodeled library. This partnership utilized a variety of hands-on activities for library employees and an online survey for students.

Design, methodology, or approach
Our approach was based on previous UNC Libraries space assessment projects and existing literature about working with architects and gathering student feedback for remodels. We also partnered with colleagues in SRL, who are experts in survey instrumentation and data analysis. These resources were invaluable for determining the best approaches for soliciting feedback in a fun, creative, and engaging way.

The first method focused solely on library employees. During a monthly all-staff meeting, employees had four activities to help reimagine the building’s interior: draw or mark up blank floorplans; place comments or stickers on printed images of public and staff spaces to express likes and dislikes; write their top three priorities on a form; and comment anonymously through an online form. Employees were encouraged but not required to participate, and their responses were anonymous. Three months after this meeting, we deployed an online survey to collect student feedback. The survey ran for two weeks and asked students to rank images, comment on their selected images, and upload other images they would like to see included in a newly designed space. The Assessment Committee distributed the survey via signs and posters throughout campus, by sharing it with classes and groups, and by tabling at the dining hall during lunch hours. As an incentive, students could enter a drawing for a $25 Amazon gift card.

Findings
Nearly 500 students responded to the survey. Based on guidance from SRL, we had set a goal of 100 student responses; given these high participation numbers, we think the interactive and visual approaches succeeded in persuading students to complete the survey, and the response rate demonstrates the importance of the building to students. Preliminary results show that students want more private study spaces and seating, and offer insights into popular (and unpopular) design elements.

Practical implications or value
This work provides useful examples and information for gathering employee and student data about space redesign, working with an on-campus research entity, and sharing information with an architect. By the conference date, we will have solid conclusions from the survey and will be able to address the types of information useful for an architect and which data was used for architectural renderings.

Semiotic Analysis of a Science Library: Inclusion and Messaging
Sarah Fitzgerald (University of Massachusetts Amherst)
Poster
Show Abstract

Purpose and goals
The purpose of this study is to investigate what semiotic analysis can discover about how welcoming and inclusive a science library space is for patrons. Semiotic analysis examines the meanings that individuals interpret places as having. It involves the study of objects, which can range from images and words to physical items, and their meanings as individual interpreters understand them (Hall, 1997). We chose to study a science library space because the lack of racial and gender diversity in STEM is a persistent challenge despite the growth in the number of STEM jobs and STEM degrees earned (Pew, 2021).

Design, methodology, or approach
We conducted a semiotic analysis of a science and engineering library to determine how well the signs and signifiers in the space reflect its goals. To that end, we examined how diverse groups of patrons might interpret elements in the space, which behaviors are encouraged and discouraged, and whether the space promotes scientific disciplines to its visitors. The space we investigated serves as a case study highlighting the ways in which library spaces can communicate messaging to patrons of various backgrounds.

Findings
Based on our semiotic analysis, the library space’s communication to its patrons succeeds more in promoting science and encouraging desired behavior (or discouraging undesired behavior) than in promoting diversity and inclusion.

Practical implications or value
In keeping with universal design, libraries should provide clear and visible signage for the library itself, as well as its elevators, exits, restrooms, quiet study spaces, group study spaces, and browseable stacks. Libraries should provide gender inclusive restrooms and clearly marked spaces for religious reflection. Libraries can make an effort to choose inclusive art and display artifacts to appeal to patrons from a variety of backgrounds.

Libraries can learn from our findings that signage prohibiting activities frequently performed by patrons should be accompanied by signage directing patrons to where they may participate in these activities without disturbing others. This will balance the prescriptive, negative messaging in libraries with positive, inviting messaging. Libraries should consider their priorities in terms of safety versus a feeling of surveillance for patrons when designing study spaces. While glass can help library staff monitor activity in the library, glass walls can also lead to a lack of privacy and a feeling of distrust. Likewise, libraries must weigh the security for their materials provided by wired glass and theft detectors against a more welcoming atmosphere of trust.

A science library should be updated with modern, clean furnishings in good condition to show respect for its patrons and their work. It is important to represent the interests of the patrons a library wishes to welcome in a balanced way. A science library should not have a predominance of items from any particular science discipline it serves, but should provide appealing displays from a variety of disciplines representing both the history of science and modern advances. Educational disciplinary displays can incorporate diverse scientists to promote the inclusion of diverse patrons.

Building a Culture of Assessment: Library Data Days
Steve Borrelli (Penn State University Libraries)
Poster
Show Abstract

Purpose and goals
As Library Assessment has matured as a sub-domain of librarianship, institutions have responded by creating assessment-focused positions, committees, and departments. While structures and approaches differ, a commonality is the aim of developing a culture of assessment across an organization. Assessment work aims to catalyze change, which implicitly threatens the status quo. To combat resistance to change, institutions across higher education have aimed to develop a culture of assessment, which Ennis (2010) asserts “is code for not just doing assessment, but liking it.”

This poster describes Penn State University’s Library Data Days (Data Days), an annual in-house professional development event, now in its sixth year, which promotes developing an assessment culture by spotlighting efforts that have informed decision-making across the organization. Data Days provides a safe environment for staff and faculty to share projects that inform decision-making across 24 physical campuses. Data Days is a keystone of the Library Assessment Department’s efforts to build a culture that embraces assessment work, which often manifests as the inclusion of assessment as a standard component in planning efforts, service evolutions, and research design.

Design, methodology, or approach
To demonstrate the value of the event to participants, six years of engagement metrics will be explored, including the extent and breadth of participation, the broad scope of content delivered, and historical post-program evaluation data.

To evidence the developing “culture of assessment,” we will present a quantification of assessment work integrated across the University Libraries strategic plan, as well as consultation metrics for library personnel supported by the Library Assessment Department.

Findings
Data Days is an in-service event which functions as a low-cost, highly participatory component of the Library Assessment Department’s efforts to develop a culture of assessment. It is an effective means of promoting and developing culture, sharing organizational knowledge and informing day-to-day decision-making.

Practical implications or value
Developing any culture requires deliberate actions. In-service events designed to engage staff and faculty alike in sharing work that has informed decisions instill the concept that assessment is the responsibility of everyone across an organization. When combined with additional support for integrating assessment broadly, events like Data Days contribute strongly to developing culture, in this case a culture of assessment.

Did We Get It Right? Foundations for Ongoing Space Assessment
Matthew Barry (University of Western Ontario)
Poster
Show Abstract

Purpose and goals
The main branch of Western Libraries is undergoing a major renovation. While the renovation itself was inspired by user feedback, design decisions were primarily made by the contracted architects. This space assessment project seeks to assess the success of the renovation by gathering data about how newly renovated spaces are being used by students and comparing those data to the intentions for each space as stated by the architects. The results of this assessment will be used to identify necessary modifications to the newly renovated spaces; baseline data about our users’ space use gathered in this assessment will also be used to guide future renovations and furniture purchases in libraries across the system.

Design, methodology, or approach
Renovated spaces will be observed at key points of the term: early term, mid-term (avoiding midterm exams), and during final exams. These observations, grounded in a typology of library spaces based on presentations by the architects, will identify to what degree users are using spaces and compare usage and observed noise levels to the intention of each space as set by the architects. In addition to space counts, qualitative observations will be noted for each space to capture themes like creative space modifications and unusual use of furniture.

Findings
Certain furniture configurations or environmental elements may have an impact on which spaces library users choose to spend time in, as well as on the way users interact with the space around them and with other users. These interactions may also change based on the current academic workload of the user.

Practical implications or value
Understanding when and how different types of furniture configurations are used will help libraries select furniture and furniture arrangements that better meet students’ needs. We also hope to inform the next stage of our own library’s renovation with our findings.

Share the Space—Actions to Make Library Spaces More Accessible
Greg Davis (Iowa State University)
Poster
Show Abstract

Purpose and goals
In the fall of 2021, the ISU Library surveyed students supported by our campus student accessibility services department. The goal of the survey was to collect ideas for making our library spaces more accessible. The goal of this LAC poster is to describe our survey and share the action steps we took based on the information collected.

Design, methodology, or approach
A mixed methods study was conducted using a Qualtrics survey sent to 1400 students served by our campus student accessibility services department. The survey allowed students to self-identify their disabilities and to provide comments related to the accessibility of library spaces. The project was a collaborative effort between the library’s assessment department, the library’s Assistant Dean for DEI, and the campus Student Accessibility Services department.

Findings
The project findings are grouped into the following categories:

Need for more quiet study spaces and procedures for maintaining quiet
Need for more low- to very-low-distraction study spaces
Mapping and signage: knowing what to expect and where to find it
Need for dedicated accessibility spaces and areas
Mobility issues

Conclusions: We have determined ways to make our library spaces more accessible, especially related to the needs of neurodiverse students.

Practical implications or value
Many academic libraries are increasing their DEI efforts. Ensuring library spaces are accessible to everyone is an important part of this work. We believe other libraries will be interested in hearing about the action steps we took based on the feedback we received.

‘What We Celebrate Workshops’: A Participatory Approach to Aligning What is Celebrated in Strategic Planning and Other Evaluative Contexts
Steve Borrelli (Penn State University Libraries)
Poster
Show Abstract

Purpose and goals
In response to learning that personnel across the libraries felt the recently updated University Libraries Strategic Plan was so narrowly focused that many struggled to see how their roles and efforts contribute to advancing planning initiatives, a taskforce of the Strategic Planning Implementation Committee developed the 'What We Celebrate' workshop series. The workshops engaged personnel in brainstorming exercises to surface activities 'celebrated' in evaluative and other contexts (e.g., annual performance reviews and promotion and tenure reviews), informing a revision of our strategic plan that aims to integrate and align those activities with the planning context. In total, 67 library personnel, including 35 faculty and 22 staff, participated across 11 campuses. The workshops informed dozens of modifications to the plan, including more than 20 refinements and additions to key performance indicators. They also illustrated that many of the contributions personnel wanted to share were well aligned with the University Libraries Foundational Values, catalyzing the integration of "Libraries Foundational Values in Action" as a storytelling mechanism in annual university reporting that spotlights how the Libraries live our foundational values.

Design, methodology, or approach
Workshops were designed to engage stakeholder groups including library personnel broadly, staff supervisors, faculty supervisors, faculty who have served on promotion and tenure committees, and the Dean’s Library Council. Each workshop consisted of small group activities and report outs, focused on what’s celebrated presently and what we’d like to celebrate in the future. Results of the activities were analyzed to inform new and updated measures to the strategic plan.

Findings
In addition to informing plan revisions, two key findings emerged from the workshops that improve personnel's ability to see their efforts reflected in planning. First, the broad scope of activities library practitioners engage in is often more easily represented through effort-based measures, which allow personnel to articulate actions taken to advance an objective when quantitative measures are insufficient to illustrate progress. Second, roles often support operationalized activities that seem absent from forward-looking strategies; providing opportunities for peers to see their efforts reflected can strengthen perceived connections to planning efforts.

Practical implications or value
Developing or implementing a strategic plan can often leave personnel feeling absent from the plan, as if their roles have little to contribute to advancing the future of the organization. A core challenge in strategic planning is maintaining stakeholder engagement throughout the duration of the plan. Committing to treating the plan as a 'living document' provides an opportunity to consistently engage personnel in its content. Spending the effort to listen and respond to concerns can create additional opportunities to promote engagement with strategic planning, in addition to leading to iterative improvements. Finally, effort-based measures allow a broader range of library personnel and departments to contribute to reporting, enhancing their connections with planning efforts.

Mental Models of the Organization of Scholarly Information: A Theoretical Framework
Joy Nam (University of Glasgow and the Bodleian Libraries, University of Oxford)
Poster
Show Abstract

Purpose and goals
Existing literature on user experience with library search and discovery focuses on the difference between user expectations and what is delivered; differences between “digital natives” and “digital immigrants”; and individual preferences for interface design. However, research at the Bodleian Libraries hints that different user experiences might be explained by disciplinary differences in expectations—something not covered in the literature.

This poster presents a new theoretical framework for investigating information-seeking behavior, which will provide a novel lens with which to examine user experiences of library discovery systems.

Design, methodology, or approach
The research presented in this poster draws on multiple disciplinary traditions including psychology, organizational studies, cognitive anthropology, library and information science, and transcultural studies. Existing models of information behavior and mental models theory from the literature in these disciplines were synthesized into a holistic, qualitative and interpretivist framework that inter-relates individuals, groups, and systems in context.

Findings
The research paradigms in information behavior studies and mental models theory have evolved to a socio-cognitive and context-oriented approach.

The growing recognition of the importance of social and cultural factors in the field of information behavior has motivated the move towards a more holistic understanding of the human being as a situated, embodied, and complex agent, acting in a dynamic environment with an agency of its own.

The theory of mental models has also shifted from a focus on an individual’s cognition in interaction with a single system to a notion that attempts to factor in the shared, collective aspects of human thought and action.

This poster brings together these two perspectives into a holistic theoretical framework for investigating disciplinary differences in the mental models of the organization of knowledge in the academy. The holistic framework presented in this poster integrates individual differences with social and cultural ones in order to arrive at a contextualized understanding of disciplinary influence on information-seeking behavior.

Practical implications or value
There is currently an incomplete understanding of differences in users’ approaches to using library search and discovery, specifically the impact of disciplinary differences.

The theoretical framework described in this poster will be applied to the next stage of the research to investigate the information-seeking behaviors of library users at the Universities of Glasgow and Oxford.

It is anticipated that the framework will also provide a novel lens for other researchers to investigate user experiences with other areas of academic libraries.

A Value Statement For Social Science Data Services
Cameron Tuai (Drake University)
Poster
Show Abstract

Purpose and goals
This poster presents a value statement for improving librarians' ability to ethically analyze moral dilemmas within social science data-oriented practices such as learning analytics, metadata standards, and data curation and preservation.

Design, methodology, or approach
The value statement draws upon the following materials:

Theories of identity and social identity
Social construction of technology and social informatics
Institutional theories of legitimacy
Concepts of community and the common good
Aligned professions’ value statements
Ethics of care

Findings
The value statement focuses on “freedom of social identity.”

Freedom of social identity occurs when we recognize and support the right of individuals represented within the data to self-identify and to associate into communities based upon those identities without fear of oppression, othering, misrepresentation, or other harm.
Freedom of social identity ensures that the application of social science data results in communities being able to plan and control the social processes through which internal community values are defined and realized, and the external common good strengthened.

Practical implications or value
The application of the value statement will guide ethical analysis of moral dilemmas within the practices associated with social science data.

Creating, Administering, and Responding to a Library Workplace Climate Assessment
Craig Smith (University of Michigan)
Poster
Show Abstract

Purpose and goals
This poster will lay out the steps we recently took at the University of Michigan library to create, administer, and respond to a homegrown workplace climate and culture survey. The poster will lay out the journey involved in creating and piloting our comprehensive survey, the steps involved in administering it in a way that instilled a sense of safety in respondents, the process we used to form a diverse team responsible for reporting on the results, and the ways that the library community and its leaders have responded to the findings and recommendations. We hope that sharing these experiences will help and inspire other libraries that want to assess their own workplace climates and cultures.

Design, methodology, or approach
The poster will display some example questions from the survey, and will also show examples of how we reported on the survey results using various types of written reports, presentations, and data visualizations.

Findings
Some very basic examples of survey findings will be presented in the poster, but the key 'findings' here will be an account of both the challenges and successes we experienced in going from initial conceptualization all the way to responding to the survey data. It is hoped that others interested in workplace climate and culture work will gain valuable insights from the lessons our library learned along the way. Key lessons involved best practices for communication, how to build and work with a diverse survey team, how to coordinate with library administration, and how to move from survey findings to organizational change.

Practical implications or value
The hope is that other libraries that are interested in assessing workplace climate and culture can benefit from hearing about both our missteps and our accomplishments. We are also happy to share our instrument with others, and will have copies of the whole instrument on hand.

An Intersectional, Mixed Methods Analysis of First-year First-generation Students’ Library Perceptions and Use
Megan Hodge (Virginia Commonwealth University)
Poster
Show Abstract

Purpose and goals
Though first-generation students are acknowledged in the library literature to be a heterogeneous group, this recognition rarely informs scholars' methodologies, resulting in researchers drawing inferences tacitly assumed to apply to all first-generation students. The purpose of this study was to enhance understanding of first-generation students' perceptions and use of academic libraries and how students' intersecting identities manifest in differing library needs: How are first-year first-generation students best classified based upon their academic capital and library anxiety? What does each cluster of students perceive as the most and least helpful library services and spaces for themselves?

Design, methodology, or approach
To effectively support first-generation students, academic libraries must first understand their needs. However, limited research has been conducted into how these students, who often embody multiple marginalized identities, perceive and use academic libraries, or how first-generation students' intersecting identities manifest in differing library needs. This study builds understanding of first-year first-generation students' library perceptions and use. To do so, an explanatory sequential mixed methods design informed by a critical pragmatism worldview was used. First-year first-generation students at a large university were surveyed using Winkler's (2013) Academic Capital Scale and Anwar et al.'s (2012) AQAK: A Library Anxiety Scale for Undergraduate Students. Participants were grouped by their subscale scores via cluster analysis. Explanatory interviews with demographically representative informants from each cluster provided deeper understanding of the ways in which students experience their academic library and how these differ by cluster.
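
The clustering step described above could be sketched with a minimal k-means over two subscale scores. Everything here is hypothetical: the scores, the number of clusters, and the use of k-means are illustrative assumptions, not the study's actual data or method.

```python
import math
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means: group respondents by their subscale scores."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k), key=lambda i: math.dist(p, centers[i]))
            clusters[idx].append(p)
        # Recompute each center as the mean of its cluster members.
        for i, members in enumerate(clusters):
            if members:
                centers[i] = tuple(sum(d) / len(members) for d in zip(*members))
    return centers, clusters

# Hypothetical (academic_capital, library_anxiety) subscale scores.
scores = [(4.5, 1.2), (4.2, 1.5), (1.8, 4.1), (2.0, 3.9), (3.1, 2.2), (2.9, 2.4)]
centers, clusters = kmeans(scores, k=3)
```

In practice, researchers would standardize the subscales and choose k with a diagnostic such as a scree plot or silhouette scores rather than fixing it in advance.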

Findings
Because little library research has been conducted on first-year first-generation students or on how students’ intersectional identities manifest in differing library use, the number of clusters that will emerge cannot be determined conclusively in advance. However, it is anticipated that three clusters may emerge. One will have high levels of academic capital and low levels of library anxiety due to enrollment in selective high schools or courses. A second cluster will draw confidence from their first-generation status and be more willing to seek help. The third cluster will have students who aren’t able to spend as much time on campus developing relationships with faculty and academic staff, and who are more likely to experience library anxiety and rely on friends and family for assistance with college tasks such as finding sources for research papers.

Practical implications or value
The practical significance of this study derives from academic libraries’ imperative to serve all their students. Previous researchers examining first-generation students’ library use have primarily taken a deficit approach that identifies students rather than institutions as problematic. These scholars have also generally inferred that individual factors, such as race or first-generation status, explained their findings when it’s likely from their research designs that other, related variables may be equally if not more plausible explanations. Library services informed by these studies’ findings may therefore be of limited efficacy and could even be counterproductive for the students they are intended to support. Study findings will enable librarians to identify services that are problematic, or needed but as-yet nonexistent, to better support their first-generation students.

The Creativity Index: Developing a Scalable Library Space Study Model
Laura Spears (University of Florida)
Poster
Show Abstract

Purpose and goals
In January 2020, librarians from a large academic research library and researchers from the university's interior design department in the College of Design, Construction and Planning (DCP) collaborated on a study to develop an effective and scalable research framework for examining how library spaces can support students' problem-solving and innovation processes. The team, from a public university in the Southeastern U.S., developed a survey designed around a "creativity matrix" of adjective pairs created by DCP researchers to examine student views of an environment that supports the creative process. Combined with open-text questions, the survey offered a measurable view of the conditions students desire, with tangible characteristics that future renovations can easily incorporate. The survey was conducted in Fall 2020, delayed by the response to COVID-19, which closed the campus for five months. Findings revealed several statistically significant differences between the current library setting and an ideal one that students indicate would facilitate their creativity, resulting in greater innovation and problem-solving.

Design, methodology, or approach
Study participants selected terms from a validated instrument used in construction design, Yarnell's Adjective Checklist (ACL) Creativity Scale, which was adapted for library use. The ACL framework was designed to complement Graham Wallas's model of creative thought, which comprises five stages: preparation, incubation, intimation, illumination, and verification. The survey also included traditional queries used in library space studies, including frequency of use, activities engaged in, and specific spaces utilized. Descriptive details collected included class status and major.

Findings
Results include students' needs for a variety of spaces (System-wide Diversity) and flexibility of use (Choice and Control), which provide them the type of space they need depending on the activity they are engaged in or the stage of the creative process in which they find themselves. Also, while graduate students are expected to require silent spaces for more intensive research, they want to be part of a community of users rather than work in isolating spaces. Analysis revealed that eight of the adjective pairs could be considered an index that identifies ideal space characteristics. The ACL results aligned with the qualitative text comments, which provided tangible details for renovating library spaces.

Practical implications or value
Analysis of the study findings allowed the research team to pair qualitative comments with the quantitative results, which demonstrated a statistically significant difference between the current space and the students' vision of an ideal space. Using the comments, the team was able to articulate design elements that would become part of a current renovation of another floor of the science library.

This presentation will describe the creativity matrix, the concepts of “choice and control” and “system-wide diversity” that guide thinking about the creative process and present findings that map the framework to the practical suggestions by participants.

3:30 p.m.–5:00 p.m.: Concurrent Sessions

COVID-19

Overcoming Technology Barriers, Particularly for Historically Underrepresented Students
Travis Teetor (University of Arizona Libraries)
Paper
Show Abstract

Purpose and goals
This paper will describe efforts at the University of Arizona Libraries to improve access to internet and technology during the pandemic and while adapting to an ongoing hybrid instructional modality. We will highlight how our institution leveraged campus data and new partnerships to better meet student needs, particularly for underrepresented and first-generation students.

Design, methodology, or approach
The University of Arizona Libraries analyzed anonymized student demographic data, including race/ethnicity, first-generation student status, and Pell Grant receipt, to determine how existing service utilization aligned with the campus population. Our goal was to reach more underrepresented populations and students in need. The University of Arizona is located on the U.S.–Mexico border; 80% of its Distance Education students identify as Hispanic/Latinx and 74% are first-generation college students. The University of Arizona typically uses Pell Grant eligibility to identify students as low-income, and in fall 2021, 40% of the total student population was Pell eligible or had received a Pell Grant at some point during their undergraduate academic career.

Findings
Reliable broadband internet and technology became increasingly essential educational resources during the COVID-19 pandemic. This is particularly true for students who work in remote regions where access is limited, or in multigenerational households where the environment is not conducive to learning. Inequities in accessing these resources were exacerbated by the pandemic and stay-at-home orders, especially for marginalized communities that have been technologically disadvantaged. Consistent with national surveys, a Fall 2020 UA survey indicated that one in three students faced limited internet access and two in ten reported that a lack of access to technology or software reduced their ability to perform well in classes delivered online. To address these needs, libraries can embrace new roles, collaborating in new partnerships and creating spaces to reduce students' barriers and increase academic success.

Practical implications or value
Traditionally, libraries have served as a central hub for information and resource dissemination across the campus community. New roles include bringing together units that have not traditionally worked together in order to provide increased access to technology and spaces for students. Our approach intentionally prioritizes students most in need, while acknowledging the historic inequities that our community members face. We will share approaches to looking at demographic data in order to better leverage partnerships between university units with a goal of designing efforts with significant reach and impact.

Finding a ‘New Normal’ for Library Assessment: Lessons & Reflections from COVID-19
Jackie Belanger (University of Washington)
Paper
Show Abstract

Purpose and goals
This paper explores how the COVID-19 pandemic and the focus on equity and social justice over the past two years have changed library assessment activities at a large research university. The paper discusses concrete project examples undertaken since the start of the pandemic through Spring 2022, highlights lessons learned about our assessment approaches, and explores how lessons will result in longer-term changes to our program. The paper also raises broader questions about the role of library assessment in the context of continued uncertainty and change, with the aim of sparking dialogue among practitioners about their own lessons learned and the implications for the future of library assessment.

Design, methodology, or approach
First, the authors use a case study approach highlighting activities and changes in one program. Second, we offer a literature review from institutional research, higher education, and library assessment designed to identify changes to assessment practices and their potential implications. This provides a wider context for the case study and highlights emerging areas of interest for the library assessment community.

Findings
Between March 2020 and April 2022, our library undertook a number of projects that form the basis of our conclusions:

A review of institutional, library, and higher education data sources to illuminate student experiences during COVID-19 and the Black Lives Matter movement.
Creation of a data portal with multiple sources of previously siloed information to strengthen data-informed decision making.
A year-long mixed methods study of student needs during reopening, including an online diary study.
An 11-week online participatory design project with undergraduate students.
Participation in an institutional survey on student preferences for online, hybrid, or in-person campus services.

The projects often focused on answering the question: “is this working (right now)?” in order to make rapid changes in response to user needs. However, the pivot to fully remote assessment and a focus on equity during this time also highlighted opportunities for program development after we moved out of crisis mode. Our activities resulted in lessons shaping our program for the future:

Prioritizing equity through our project choices and by raising the visibility of existing external data sources, enabling action-oriented conversations about how libraries can respond to wider trends in student needs.
Continuously demonstrating how existing library data can be used effectively.
Strengthening capacity for online assessment projects, including interactive, participatory methods.
Investing in library and institutional partnerships to support taking action on results and participating in campus-wide assessment efforts.

Looking forward, we may continue to face an environment of heightened ambiguity and change related to user needs, expectations, and library use – which, in turn, will require us to continuously examine our assessment approaches.

Practical implications or value
This paper contributes to an overall body of work in assessment by providing examples of how future assessment practices and programs may change in response to what we learned during this unprecedented time. It provides an opportunity for the community to share their own lessons and engage in discussions about the future of library assessment.

Understanding Changing Needs of Students and Faculty: A Comparison between 2018 and 2022 User Surveys
Grace YoungJoo Jeon (Tulane University)
Poster
Show Abstract

Purpose and goals
In spring 2022, Tulane University Libraries (TUL) launched a survey of all Tulane students and faculty. The survey followed up on a 2018 user survey and was designed to gather feedback from key stakeholders on how well the Libraries support their coursework, teaching, and research. Results will help us learn more about their needs and make improvements to our spaces, resources, and services.

Design, methodology, or approach
This study employed an online survey of students and faculty, our primary user community, to understand their evolving needs. The survey instrument was built upon the questionnaire used for our 2018 survey as well as the questionnaires used at Duke University Libraries and Boston University Libraries with their permission. The survey consisted of the following six sections: Introduction, Demographics, Library Spaces, Library Resources, Library Services, and Conclusion, including 21 to 33 questions depending on the respondents’ answers.

In March 2022, an email invitation to the survey was sent to Tulane students and faculty via various listservs with help from campus partners, including student organizations, the Center for Engaged Learning & Teaching, and the Office of Assessment and Institutional Research. The survey remained open for five weeks, during which three reminders were sent to encourage participation. Data collection was completed when the survey closed on April 10, 2022.

Findings
Data analysis will consist of descriptive statistics identifying library user perceptions of the libraries overall as well as by user group (i.e., students and faculty), user status (i.e., year for students and track for faculty), and school. In addition, for questions comparable to the 2018 survey, results from 2018 and 2022 will be compared to understand changes in the needs of Tulane students and faculty. The poster will focus specifically on findings from this comparison. We believe these findings will allow us to understand how much the pandemic shaped the needs of students and faculty, to evaluate changes we have implemented since the pandemic began, and to identify ways to better meet those needs.
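
The descriptive comparison by year and user group amounts to a grouped-mean computation, which could be sketched as follows. The response tuples and the satisfaction-rating variable are hypothetical stand-ins for the survey's actual items.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical responses: (survey_year, user_group, satisfaction_rating).
responses = [
    (2018, "student", 4), (2018, "student", 3), (2018, "faculty", 5),
    (2022, "student", 4), (2022, "faculty", 4), (2022, "faculty", 5),
]

# Group ratings by (year, group) cell, then take the cell mean.
by_cell = defaultdict(list)
for year, group, rating in responses:
    by_cell[(year, group)].append(rating)

means = {cell: mean(vals) for cell, vals in by_cell.items()}
```

Comparing `means[(2018, g)]` with `means[(2022, g)]` for each group g gives the year-over-year change the poster focuses on; real analyses would add counts and dispersion per cell.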

Practical implications or value
This proposal will allow us to share our experience using campus-wide surveys to gather feedback from students and faculty and to share our findings related to the impact of the pandemic on their needs and library use. For those who may consider conducting a comprehensive user survey at their organization, this session will help with survey methodology and best practices. This will also provide a venue where the community can share their experiences and insights related to the pandemic’s impact on their organization through interactions with the presenter as well as with other conference attendees, promoting mutual learning among community members.

The Importance of Library Assessment in HR: A Case Study of Transitioning to a Hybrid Work Environment after the COVID-19 Pandemic
Sally Bowler-Hill (University of New Mexico)
Poster
Show Abstract

Purpose and goals
The purpose of this study was to explore and understand employee perceptions of remote work as a potential option for normal operations, based on experiences from the COVID-19 pandemic. The goals were to obtain feedback from library faculty and staff about remote work and telecommuting in order to develop a guideline, and to measure satisfaction with the hybrid work environment several months after the guideline was implemented.

Design, methodology, or approach
Two web-based surveys were sent to all library faculty and staff, one in April 2021 and the other in March 2022. The first survey included multiple-choice and free-text responses regarding how often employees wanted to work remotely; what tasks they believed could be done productively from home; what equipment they needed at home; and what challenges they believed the library faced with employees working remotely. This feedback was used to develop a remote work and telecommuting guideline that was implemented in July 2021. The second survey assessed employees’ experience with telecommuting or remote work since the guideline was implemented, including questions about schedules, satisfaction, equity, what worked better than expected, and continuing challenges. Both surveys were declared minimal risk by our institutional IRB.

Findings
The second survey showed satisfaction with the hybrid work environment, including a high degree of satisfaction with how telecommuting schedules were implemented and allocated within library units. The average amount of time employees spent working from home closely mirrored how much time they responded that they wanted to spend in the first survey. Significant challenges to implementing a hybrid work environment identified in the first survey had been resolved, while other challenges had arisen or continued in the months since the guideline was implemented and the library had resumed normal operations.

Practical implications or value
Librarians responsible for assessment within their libraries should look for assessment and research opportunities related to the administrative operations of their organizations. When integrated at the beginning of a major operational change, these studies can provide valuable data that help inform needed adjustments after the change has been implemented and, longitudinally, as operations are continually evaluated. While libraries’ administrative operations can vary greatly, sharing these studies can provide useful insight to other organizations looking to implement similar changes.

Raising an Already High Bar: Post-COVID Expectations in the LibQUAL+ Survey Dimension of Information Control
Kirsten Kinsley (Florida State University)
Poster
Show Abstract

Purpose and goals
To examine whether there is a statistically significant increase in expectations in the LibQUAL+® dimension of Information Control (IC) by comparing pre-COVID-19 with post-COVID-19 cohort results. The hypothesis is that faculty and graduate student expectations increased in this dimension after the COVID-19 pandemic began affecting academic library services across the globe following March 2020.

To ascertain whether this is a national and/or global trend among other LibQUAL+® cohorts that have pre- and post-COVID results.

To explore the implications for academic libraries if this is a national and/or global trend. Should these high expectations in the LibQUAL+® Information Control dimension continue, how should libraries advocate for improved access so that users can find the information they require in the format of their choosing?

Design, methodology, or approach
We will reach out to approximately 12 North American and European institutions with an ARL Memorandum of Understanding (MOU) to request participation in the comparison. Using the MOU, we will coordinate our efforts with ARL to collect de-identified, aggregate LibQUAL+® results in the dimension of Information Control.

We will compare pre-COVID and post-COVID LibQUAL+® cohort means in the dimension of Information Control for the following groups: Faculty, Graduate, and Undergraduate. For example:

  1. Examine minimum and desired means for all items in the Information Control dimension (IC1–IC8) between cohorts, using ANOVAs as appropriate
  2. Compare Superiority Gap Scores between cohorts
  3. Analyze overall IC dimension means between years and groups using ANOVA or t-tests as appropriate
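The cohort comparisons described above can be sketched with a simple two-sample test. The institution-level desired-mean scores below are invented for illustration; a real analysis would use the full LibQUAL+® item-level data and a statistics package (e.g., SciPy or R) rather than this hand-rolled statistic:

```python
from statistics import mean, variance
from math import sqrt

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    va, vb = variance(a), variance(b)  # sample variances
    return (mean(a) - mean(b)) / sqrt(va / len(a) + vb / len(b))

# Hypothetical desired-mean scores for one IC item, one value per institution
pre_covid = [7.9, 8.0, 8.1, 7.8, 8.2, 8.0]
post_covid = [8.3, 8.4, 8.2, 8.5, 8.3, 8.6]

t = welch_t(post_covid, pre_covid)
print(round(t, 2))  # a large positive t suggests expectations rose post-COVID
```

The same pattern applies to minimum means and superiority gap scores; comparing more than two cohorts or years would call for an ANOVA instead.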

Conclusions/Hypotheses Being Tested

Expectations have significantly increased in the area of LibQUAL+® IC overall for institutions included in the study, comparing pre-COVID-19 cohorts with post-COVID-19 cohorts. Both minimum mean scores and desired mean scores will have increased between cohorts.

The gaps between what is perceived and what is desired (superiority mean) in IC have increased from pre-COVID-19 to post-COVID-19 LibQUAL+® results.

Expectations in IC may have steadily increased since 2015, which would suggest a general upward trend rather than an effect specific to COVID-19.

Practical implications or value
We envision that this could spark conversations about rising expectations. Library administrators could share these results with stakeholders to discuss questions such as the following:

If expectations for faculty and students are increasing in the LibQUAL+® dimension of
Information Control, which is related to access to materials that they need for their scholarship, what implications does this have for libraries?

How do we meet these rising expectations while library materials budgets remain flat?

How might we address the fact that rising expectations in IC might remain at higher levels regardless of whether the COVID-19 pandemic continues to change demands for remote access and services?

Crowdsourcing Safety: Using a Dashboard to Keep the Library Safe
Hector Escobar (University of Dayton)
Poster
Show Abstract

Purpose and goals
In fall 2020, an academic library in the Midwest was tasked with cautiously reopening to the campus community. The library faced reopening at a time when unprecedented guidelines were put into place, while at the same time trying to keep staff and students safe from COVID-19. As part of the protocol, employees conducted hourly floor counts and safety checks.

While guidelines were put in place, adherence was a different story. Roesch Library developed a reporting mechanism and dashboard that allowed all staff, including student employees, to report and see which floors had problem areas of non-compliance in real time. Non-compliance with safety protocols included refusal to wear facial coverings, lack of social distancing, and room occupancy violations. The purpose and goal of this poster is to highlight a library’s use of a crowdsourced dashboard and how it served as an awareness tool for public health and safety during the pandemic.

Design, methodology, or approach
Designed initially as a reporting form, the data collected was automatically shared via a public dashboard. All staff were expected to be stewards and enforce safety protocols, reporting instances of non-compliance. The reporting form allowed employees to record the type of violation, the floor on which the violation took place, whether students were asked to comply, and any additional comments that aided in identifying problem areas.

Findings
The dashboard allowed employees to see in real-time which floors experienced instances of non-compliance. The data included the number of instances by floor and the number of individuals. We had quantifiable data about violations by library location, time and day of the week.

Violation types were described in a comment box on the form. As a result, we discovered that violators would often use empty food containers as a pretext to avoid wearing masks. Some violators would plead ignorance even when previously reminded. We also recorded instances when individuals were rude or verbally aggressive.

The data gathered also allowed the library to make changes and address issues as the semester progressed, and to enact changes for the following semester. Adjustments also took into consideration the stress on staff of dealing with these incidents.

Practical implications or value
Library dashboards are still an emerging area. The notion that one person can contribute data while others view it in real time illustrates the utility of displaying live data, as well as the impact that data submissions have in highlighting specific activities. The hope is that this pilot could serve as a catalyst for future dashboards and other crowdsourced information gathering for decision-making or awareness purposes.

Centering Transparency and Empathy in Employee Surveys to Build Community and Generate Dialogue
Steve Borrelli (Penn State University Libraries)
Poster
Show Abstract

Purpose and goals
When the Covid-19 pandemic emerged in March of 2020, library personnel transitioned to remote work. Personnel who were used to working on-site, side-by-side not only needed to figure out how to work remotely but to learn to live for a time confined to their homes, disconnected from friends and family. For many, the erosion of personal agency amplified feelings of organizational mistrust, illustrated professional inequities, and catalyzed anxieties that typically are not at the forefront of employer concerns.

Libraries responded to the moment in part by surveying personnel about their work-related needs as well as their personal well-being. At the Pennsylvania State University, the Libraries surveyed all personnel consistently between March 2020 and April 2021, securing over 2,100 responses to questions aimed at identifying workplace challenges and at better understanding and supporting personnel needs. This poster discusses survey techniques that leveraged the reporting function of Qualtrics, along with related practices, to maintain a sense of community, communicate organizational empathy, and enable action at the department level while maintaining and building trust in assessment initiatives.

Design, methodology, or approach
Survey instruments typically include several elements to communicate their purpose: the reason the survey is being conducted, how results will be used, how anonymity or confidentiality will be addressed, and how long the survey will take to complete. These elements felt insufficient given the moment.

Qualtrics features once overlooked became components of an evolving survey strategy aimed at soliciting sensitive information, promoting community, facilitating empathetic practice, and enabling action highly valued by respondents.

Updated practices, communicated through a change log integrated into the survey introduction, aimed to enhance transparency, address concerns over anonymity, allow broad real-time sharing of results, and direct responses to the leaders closest to respondents so that expressed concerns could be acted on. Results of the survey(s) were summarized and presented at organization-wide virtual forums after each iteration.

Findings
Library personnel appreciated the updated practices, which allowed for openly sharing personal struggles, supported by enhanced measures to maintain anonymity. Respondents reported feeling more connected to peers working remotely and on-site. The practice of real-time sharing of responses was initially met with concerns over privacy, although this quickly changed to support as the practice was found to provide relief through the realization that many were struggling similarly. Because results were available publicly, real-time sharing also made it easier to respond to concerns.

Practical implications or value
Evolutions in practice are catalyzed by myriad drivers. This case study illustrates how technology advancements and situational contexts can inform those evolutions. The authors envision participants using this work to inform their own survey practice, maximizing the impact of the effort while protecting and communicating respondent privacy in order to build and maintain trust.

Embracing New Normal Set Up: Emerging Challenges and Changes in Library Services during the Pandemic Period in Selected Higher Educational Institutions in Metro Manila: Basis for Library Contingency Plan
Mary Jane De Vera (QCU)
Poster
Show Abstract

Purpose and goals
The rationale for this study is to prove that libraries are an essential part of
institutional operations and can provide different types of services in any situation, even if
physical interaction is limited or not possible.

Design, methodology, or approach
This research employed a qualitative research design to describe the nature of a situation that existed at the time of the study and to serve as a guide for further research. The researcher conducted in-depth interviews in the phenomenological tradition, interviewing participants individually via the Zoom platform using a semi-structured interview questionnaire. The data collected were tabulated and interpreted using thematic analysis with manual coding. A purposive sampling technique was used: seven (7) library personnel from selected higher educational institutions in Metro Manila participated in the interviews. All of them were actively working, either from home or on-site, during the first semester of the school year 2020–2021.

Findings
Libraries were able to adapt to the new normal setup, despite its challenges, through different library practices and social media. The findings also show reliability, accessibility, and consistency in resources and services. On-site work still matters, while internet issues and limited budgets and resources remain the challenges brought by the current situation.

Practical implications or value
This study will serve as a guideline for libraries preparing continuity or contingency plans.

Methods & Tools/Digital Libraries

Understanding Library Reach and Impact with a CRM
Ellie Kohler (Virginia Tech University Libraries)
Paper
Show Abstract

Purpose and goals
This paper shares the approach taken by the Data Analytics Team at the University Libraries at Virginia Tech (Blacksburg, VA, USA) to measure library connections with university areas in an effort to understand the depth and scope of the library’s influence on the university. The purpose of this study is to examine the relationships created between library personnel and the Virginia Tech community through information recorded in the library’s Customer Relationship Management (CRM) software. Also included are discussions of necessary CRM modifications and descriptions of tools and methods used to transform and display results.

Design, methodology, or approach
This study will be utilizing 12 months of data collected by individuals in the University Libraries through the use of LibConnect, a Springshare CRM software. All efforts were made to gather information in an ethical manner, and this paper will address necessary modifications to LibConnect and the impact that has on analysis. Analysis methods will include standard quantitative statistical analysis, application of time series algorithms to understand how seasonality affects the data, and clustering and network analysis to generate relationship-based mapping. This study will be looking at both the breadth and the depth of the respective relationships.
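As a rough illustration of the relationship-mapping idea, a minimal sketch, with invented librarian names and campus units rather than LibConnect’s actual export format, might tally the depth (repeated contacts per pair) and breadth (distinct units per librarian) of relationships like this:

```python
from collections import Counter, defaultdict

# Hypothetical CRM interaction log: (librarian, campus unit) pairs
interactions = [
    ("ortega", "Engineering"), ("ortega", "Engineering"),
    ("ortega", "Business"), ("chen", "Engineering"),
    ("chen", "Chemistry"), ("chen", "Chemistry"), ("chen", "Chemistry"),
]

# Depth: how often each librarian-unit relationship recurs
depth = Counter(interactions)

# Breadth: how many distinct units each librarian reaches
breadth = defaultdict(set)
for librarian, unit in interactions:
    breadth[librarian].add(unit)

print(depth[("chen", "Chemistry")])           # repeated contacts for one pair
print({k: len(v) for k, v in breadth.items()})  # distinct units per librarian
```

The same pair counts could feed a network library (e.g., NetworkX) for the clustering and relationship mapping the study describes.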

Findings
It is anticipated that the findings will demonstrate that relationships correlate closely with the size of each respective college or department within the university. In part because of the liaison librarian organizational structure, it is also forecasted that many relationships will be vertical, involving multiple instances of a single librarian interacting with a respective department or college. It is also acknowledged that, since this is a relatively new system only recently adopted by the library, adjustments will need to be made for gaps in the data.

Practical implications or value
This study details the approaches taken by the Virginia Tech University Library Data Analytics Team to measure engagement, and is part of a greater effort to understand library users, provide the best possible service, and address gaps in outreach efforts. In recent years, there has been an effort to know how physical and electronic resources are utilized. This is incredibly valuable information; however, it does not fully demonstrate the value of the relationships created between library personnel and other members of their communities. The work will contribute to library assessment as a whole by demonstrating how to create a system that measures library connections created through instruction events, consultations, collaborations, and partnerships using a CRM. Through the setup and use of data collected through the CRM, this project ultimately hopes to create a blueprint of a library’s influence while respecting ethical data collection principles.

The Meaningful Measurement of Liaison Librarian Services in an Uncertain World
Jennifer Thomas (Queensland University of Technology)
Paper
Show Abstract

Purpose and goals
This paper discusses an approach the Queensland University of Technology (QUT) Library in Brisbane, Australia, took to more accurately measure the value and impact of its Liaison Librarian service. This service consists of teams of Liaison Librarians (faculty librarians) who liaise with the university’s faculties and divisions, and an enduring issue has been the inaccurate measurement of this service. Reporting requirements have changed over the years, and the library was using a legacy system for tracking work that was no longer fit for purpose. There was also great variation in how individual librarians used the system, resulting in further issues with data integrity. The paper describes how, based on the strategic imperative to report on liaison initiatives and engagement more meaningfully, a small working group within QUT Library redefined liaison data collection in an effort to future-proof liaison work and elevate its importance and value.

Design, methodology, or approach
The Liaison Impact Working Group (LIWG) was formed, led by QUT Library’s Liaison Service Manager and consisting of four Liaison Librarians and the Library’s Quality and Planning Manager. The group carried out extensive stakeholder engagement, a comprehensive data audit and an environmental scan which included consulting with colleagues from other institutions. The group also gathered feedback on the use of the current system (SharePoint Online) and prototyped several updates (SharePoint Online and MS Forms), resulting in the final product that is currently in place. It was an iterative process that took place over approximately eight months between 2021 and 2022.

Findings
Early findings are promising. While change is hard, having Liaison champions on the working group was key in selling the value of the new procedures and system to its users. The new system has been operating since January 2022. By November 2022 we anticipate being able to share meaningful insights into liaison engagement, a reduced duplication of effort in capturing workload, and an elevated awareness of the value of liaison work which is critical in the current environment.

Practical implications or value
The new system has been implemented at no extra cost to the Library. Working group members volunteered their time, appraised options and chose to update an existing tool. This process could also be rolled out to other library services seeking more effective forms of measurement. The process could also assist colleagues in other institutions facing similar issues.

One-Size-Doesn’t-Fit-All: Differentiated Engagement Pathways for Transfer Student Success
Rebecca Croxton (University of North Carolina at Charlotte)
Paper
Show Abstract

Purpose and goals
Transfer students are an increasing sub-population of college and university students. High-transfer, four-year institutions strive to understand the indicators of transfer student adjustment, retention, and success to inform policies and services to support these students to succeed in their academic goals. As the number of students entering higher education from high schools decreases and the number of adults needing to complete or continue their education increases, we must develop a deeper understanding of the factors that contribute to transfer student retention and success. Which engagement activities should be promoted as critical pathways for success for this student population?

This study investigates which library, other co-curricular and extracurricular activities, pre-college, and demographic factors contribute to transfer versus first-time-in-college (FTIC) freshmen retention and success at a large, public university in the southeastern US with a high transfer population.

Design, methodology, or approach
This project is part of a longitudinal study of undergraduate student engagement and success of students who matriculated in fall 2012 through fall 2020. The dataset contains more than 130,000 student records and includes information about engagement with the library, other co-curricular and extracurricular services and activities, high impact practices, pre-college variables, demographic factors, and measures of student success.

Using Tinto’s “student integration theory” and Hills’s theory of “transfer shock” to frame the study, the researchers conducted an analysis of students who entered the university as FTIC freshmen and transfer students, including a deeper exploration of transfer student data disaggregated based on (1) the number of incoming credits, (2) first generation status, (3) in-state versus out-of-state originating institution, and (4) type of transfer institution. Data were analyzed using Analysis of Variance and binary logistic regression with propensity score matching.
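The matching step can be illustrated with a minimal greedy nearest-neighbor sketch. The scores and IDs below are invented; in practice, propensity scores would come from a logistic regression of transfer status on pre-college and demographic covariates, and production analyses typically use dedicated packages (e.g., R’s MatchIt) rather than hand-rolled matching:

```python
def greedy_match(treated, control, caliper=0.05):
    """Match each treated unit to the nearest unmatched control
    propensity score within the caliper (greedy 1:1 matching)."""
    available = dict(control)  # id -> propensity score, copied so we can remove matches
    pairs = {}
    for t_id, t_score in treated.items():
        best = min(available.items(),
                   key=lambda kv: abs(kv[1] - t_score),
                   default=None)
        if best and abs(best[1] - t_score) <= caliper:
            pairs[t_id] = best[0]
            del available[best[0]]  # each control matched at most once
    return pairs

# Hypothetical scores for transfer (treated) and FTIC (control) students
transfer = {"t1": 0.62, "t2": 0.35}
ftic = {"c1": 0.60, "c2": 0.37, "c3": 0.90}
print(greedy_match(transfer, ftic))  # {'t1': 'c1', 't2': 'c2'}
```

Outcome comparisons (retention, GPA) would then be run on the matched pairs only, so the groups are balanced on observed covariates.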

References

Hills, J. R. (1965). Transfer shock: The academic performance of the junior college transfer. The Journal of Experimental Education, 33(3), 201–215.

Tinto, V. (1993). Leaving College: Rethinking the Causes and Cures of Student Attrition, 2nd ed. University of Chicago Press.

Findings
Preliminary findings indicate that the undergraduate subgroups based upon admission status (transfer students and FTIC freshmen) and the number of incoming credits are uniquely different from each other with respect to engagement with the library and other co-curricular and extracurricular activities and in achieving the identified measures of success. The pathways for success are also nuanced based upon the subgroup and the success measures. Additional analyses related to (1) first generation status, (2) in-state versus out-of-state originating institution, and (3) type of transfer institution are underway.

Practical implications or value
This study is the first of its kind to compare out-of-classroom engagement of transfer students with FTIC freshmen that is nuanced based upon (1) the number of incoming credits, (2) first generation status, (3) in-state versus out-of-state originating institution, and (4) the type of transfer institution. Findings will help universities structure support systems and services to help this growing population of students succeed and graduate. Methodologies used in this study can be adapted to explore engagement pathways to success for other student subpopulations.

Best Practices for Assessing Reuse of Digital Content: Educational and Instructional Design Perspectives
Joyce Chapman (Duke University)
Paper
Show Abstract

Purpose and goals
While digital library practitioners measure “use” of digital collections using access metrics, they rarely measure or assess “reuse” in research, social media, instruction, and other formats. Reuse metrics are often anecdotal and ephemeral, which poses a challenge to collecting them and comparing them with other metrics. To that end, the Digital Content Reuse Assessment Framework Toolkit (D-CRAFT) has developed Ethical Guidelines and Best Practices for practitioners to assess how users engage with, reuse, and transform digital content. D-CRAFT is a multi-year, IMLS-funded project that began in summer 2019. At the last LAC conference, we presented on the Ethical Guidelines developed for this project. This presentation will present the completed Best Practices and discuss the development of the project’s Educational Tools and online Toolkit.

Design, methodology, or approach
As assessment, access, privacy, ethics, cultural competency, and educational tools are key pillars of the toolkit’s design, the grant provides funds to hire part-time consultants specializing in Privacy, Diversity and Inclusion, Assessment, Instructional Design, and Accessibility. Consultants contribute valuable expertise to key product development.

Developing Best Practices and Engagement and Education Tools:

  1. The project team began by conducting a wide-ranging literature review, using Dedoose to code, thematically group, and tag excerpts from the resulting corpus.
  2. Sub-teams developed the Best Practices and Educational Tools, using the rich data in Dedoose to conduct a gap analysis and perform further data gathering as needed.
  3. Sub-teams authored the Best Practices for each Method, as well as for Tools associated with each Method; Supplemental Materials were also created where appropriate.
  4. Subject experts from the GLAMR community were hired to review the Best Practices and enhance them as needed.
  5. The Instructional Design consultant began creating Educational Tools in March 2022.

Findings
The deliverables of D-CRAFT include Ethical Guidelines for assessing digital object reuse, Best Practices around assessment of digital content, and a suite of freely available engagement and Education Tools. Examples of instructional design modules for use cases that focus on methods and tools for digital object reuse assessment will be shared.

Practical implications or value
The D-CRAFT toolkit will be a vital GLAMR community resource that addresses the lack of common practices and instructional resources for assessing reuse of digital materials, provides definitive guidelines on what constitutes use and how that differentiates from reuse of digital content, and develops the first Ethical Guidelines for assessment and reuse of digital content.

D-CRAFT is a product of the GLAMR community. This session will enable the D-CRAFT project team to collect valuable feedback on the project from the assessment community.

Research Library Impact Framework: Exploring Value and Impact in Research Libraries
Sue Baughman (Association of Research Libraries)
Poster
Show Abstract

Purpose and goals
The Association of Research Libraries’ (ARL) Research Library Impact Framework initiative encompassed four goals: (1) fostering a culture of assessment through methodologically sound research projects; (2) involving library teams in conducting in-depth investigations into five research question areas selected from the framework; (3) gauging how well the teams’ research effectively provides the Association of Research Libraries community with an understanding of the impact of library services; and (4) building collaborative library research partnerships.

The poster will reflect on the value of the Framework, lessons learned during the process of supporting teams in their research efforts, and evaluation of the initiative’s goals.

Design, methodology, or approach
The Framework was developed by members of the assessment community in 2018 as an aid to organize, prioritize, and focus on research libraries’ common issues and collective solutions. The Framework was intended as a tool to help libraries understand, design, and align their desired impacts, measures, and data with those of their broader, evolving ecosystems.

Eighteen library teams conducted one of two types of research activities: (1) a project, which is a formal, original research study; or (2) a practice brief, which documents a case study or research-based information intended to improve library assessment work. Each project addressed one of five questions.

Project teams used a variety of methods including literature reviews, citation analysis, instruction and visitor data, focus groups, semi-structured interviews, and surveys to conduct their research.

Findings
Each pilot project and practice brief focused on different aspects of the research question that it was related to. The overarching product of the RLIF project is a suite of tools and examples that provide inspiration for other libraries to model, reinvent, or apply the findings of the RLIF pilot projects and practice briefs to their own communities. This will, in turn, support the development of a library assessment community of practice. Additionally, there is an extended cadre of library staff with enhanced skills in research methodologies who can develop and contribute to future research endeavors as a result of their participation in the initiative, whether in their own library or as advisors to other library staff engaging in research projects.

Practical implications or value
The research reports and practice briefs are available for the community for their information and use. In the case of some projects, library staff can use the report to create their own study, establish a new process, and/or present the findings internally to key stakeholders. In other cases, a report provides foundational and substantive information that can be used in future research projects by members of the library community. Most importantly, the reports or practice briefs address five different questions that were considered top priority for the library community.

Completing a Quality Assurance Assessment on a Consortial Virtual Reference Service
Sabina Pagotto (Scholars Portal, Ontario Council of University Libraries)
Poster
Show Abstract

Purpose and goals
Ask a Librarian is a bilingual, collaborative virtual reference service for members of the Ontario Council of University Libraries. The service launched in 2011 and many of our practices and training materials have remained consistent over the last decade. In 2021, a working group was struck to develop a tool to assess the quality of service provided by Ask operators. Performing regular quality assurance assessments will ensure we are providing the highest service quality to our users, as well as help to identify knowledge gaps and areas that require additional training for our operators.

Design, methodology, or approach
The working group developed a service standard and codebook to assess the quality of chat responses in 5 categories: approachability, interest, teaching (reference questions only), answering, and wrapping up. For each category, the working group considered the RUSA guidelines, our training materials, the user experience of chat, and other factors to set a service standard. Multiple rounds of testing were conducted to ensure that our codebook encapsulated different scenarios to our satisfaction, and to ensure inter-rater reliability, so that all group members shared a common understanding of what did and did not count as a “yes” for meeting the standard.

Each member of our group reviewed at least 55 chats over the course of four weeks, totaling 387 randomly selected chats drawn from the period of September 2020 to August 2021, excluding chats in French or submitted via SMS. The data was partially de-identified by removing the institution name and operator name from the metadata.

Chats were coded as either reference or customer service questions. For each category, chats were assigned a simple “yes” or “no” based on whether they met the standard as identified in our codebook. Coders could enter a free text explanation of why a chat did not meet the standard, and these explanations were later grouped and analyzed to understand specific areas that needed improvement.
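The yes/no coding scheme lends itself to simple aggregation. A minimal sketch, with invented chats and only three of the five categories, might compute per-category pass rates like this:

```python
# Hypothetical coded chats: one dict per chat, "yes"/"no" per category
coded_chats = [
    {"approachability": "yes", "interest": "yes", "answering": "yes"},
    {"approachability": "yes", "interest": "no",  "answering": "yes"},
    {"approachability": "no",  "interest": "yes", "answering": "yes"},
    {"approachability": "yes", "interest": "yes", "answering": "no"},
]

def pass_rates(chats):
    """Percentage of chats meeting the standard in each category."""
    categories = chats[0].keys()
    return {
        c: round(100 * sum(ch[c] == "yes" for ch in chats) / len(chats), 1)
        for c in categories
    }

print(pass_rates(coded_chats))
```

In a real assessment, the free-text explanations for each “no” would be grouped separately to identify the specific areas needing improvement.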

Findings
The development and testing phases were the most important and the most intensive parts of this assessment. We were able to build cohesive understandings of our service standard and a strong codebook; with this, our actual assessment was done quite quickly.

Overall, the vast majority of chats met our standards of service. In the categories Approachability, Interest, Teaching, and Answering, a collective average of 93.5% met our standards. In the category of Wrap-Up, 96% met our standard, but a sub-question of whether the user was invited to return only had a rate of 66%.

Practical implications or value
This assessment identified important areas to focus on during the training and retraining of our operators, as well as the essential service metrics to look for going forward. This initiative provided the first data-driven assessment of our service quality and demonstrated our current high service standard. With the methods and findings of this study, we hope other virtual services can apply similar strategies for quality assurance assessments.

Teaching/Learning

Assessing Synthesis of Information from Sources
Sarah Dahlen (CSU, Monterey Bay)
Paper
Show Abstract

Purpose and goals
When we teach information literacy, much of our attention is focused on students’ ability to find information, evaluate it, and cite it. How students incorporate that information into their papers is equally important, as this allows students to achieve their communicative purpose. Many instructors expect students to go beyond summarizing information from sources to synthesizing that information, showing the reader the connections between sources. Assessment of students’ ability to synthesize information has received scant attention in the scholarly literature, leaving librarians with a desire to assess this area with little guidance. After spending three years working with multidisciplinary teams of faculty on the assessment of synthesis, the author has developed a set of tools and recommendations for assessing synthesis in student work, as well as instructional materials for making improvements to teaching and learning in this area.

Design, methodology, or approach
Information literacy assessment at the author’s institution is a collaborative process in which the author leads multidisciplinary teams of faculty in scoring authentic student work with a rubric. Initial assessments using an adapted version of AAC&U’s Information Literacy VALUE Rubric identified synthesis as an area in which students were not demonstrating proficiency at the desired level. This rubric, however, merely rates the presence/absence of synthesis as part of one criterion, prompting us to create a rubric dedicated to the synthesis of information from sources. The rubric developed by Lundstrom et al. (2015) served as a valuable starting point, but we needed a rubric broad enough to evaluate assignments from different courses, disciplines, and class levels. The first iteration of our synthesis rubric was employed in 2020 for program-level assessment in the Social and Behavioral Sciences major. Applying the rubric led to revisions, and its second iteration was employed in 2021 for campus-level assessment. A final round of revisions resulted in the version adopted by our campus.

Findings
Using a rubric such as the one we developed is a viable method for assessing students’ ability to synthesize information from sources in a way that can lead to improvements in teaching and learning.

Our results showed much room for improvement in this area. In an effort to close the loop, we developed an assignment guide that advises instructors how to incorporate synthesis into their assignment prompts, and a video showing students how to use a synthesis table to identify connections between sources.

Practical implications or value
The library assessment community will recognize synthesis as an important component of information literacy and one that can be assessed by applying a rubric to student papers. The rubric we developed is available to be used or adapted to meet the needs of other institutions, and our assessment methods may be a useful model for those considering similar endeavors.

Rubrics are not merely assessment tools, but also roadmaps for instructors and students seeking to better understand synthesis and its component parts. Our rubric, assignment guide, and instructional video can all be employed as teaching tools to assist librarians and other faculty in their efforts to improve students’ ability to synthesize information from sources.

The Effect of Experiential Learning on Information Literacy Development in Online Doctoral Students
Carolyn Heine (California Baptist University)
Paper
Show Abstract

Purpose and goals
Doctoral students appear to be under-supported by libraries as both adult learners and students conducting original research. Further complicating this issue is the increase in low-residency or online-only programs, which limits librarians’ ability to offer face-to-face or synchronous instruction to all doctoral students. Experiential learning has been shown to be an effective pedagogical approach, but there is little research on its effectiveness for information literacy development in an asynchronous context.

Design, methodology, or approach
Design: This study used a pretest-posttest controlled experimental design to test the effectiveness of fully asynchronous modules that incorporated principles of Kolb’s (1984) Experiential Learning Theory and best practices in online instruction (Darby & Lang, 2019) to develop information literacy in first-year doctoral students.

Participants: Students from the Doctor of Social Work program and the Doctor of Public Administration program were the participants and were randomly assigned to a control or treatment group.

Intervention: The control modules contained only video tutorials, a common type of library support offered in an asynchronous context. The treatment modules employed an experiential learning intervention. Both sets of modules took roughly the same amount of time, and the content was centered around (a) conducting a literature search for original research and (b) strategies for tracking searches for a dissertation over several years.

Data Collection:
Pretest – Confidence (10 Likert scale items) and IL knowledge (8 MC items)
Posttest – Pretest items (confidence and knowledge) plus the application of knowledge in a practical exercise (4 question activity graded using a rubric)

Data Analysis: A MANCOVA was conducted to determine whether there were significant differences in participants’ information literacy confidence, as well as their IL knowledge as assessed on a multiple-choice test, with their overall pretest scores used as the covariate. A MANOVA was conducted to determine whether there were significant differences in participants’ ability to demonstrate their IL in a practical exercise.
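The analysis described above can be sketched in code. The example below uses synthetic data (the variable names, group sizes, and effect sizes are all hypothetical stand-ins for the study’s actual measures) and statsmodels’ multivariate tests; adding the pretest covariate to the model turns the MANOVA into a MANCOVA on the two outcome measures.

```python
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# Hypothetical synthetic data standing in for the study's real scores:
# 'group' = control vs. treatment assignment,
# 'pretest' = overall pretest score (the covariate),
# 'confidence' and 'knowledge' = posttest outcome measures.
rng = np.random.default_rng(0)
n = 40
df = pd.DataFrame({
    "group": np.repeat(["control", "treatment"], n // 2),
    "pretest": rng.normal(50, 10, n),
})
df["confidence"] = 0.5 * df["pretest"] + rng.normal(0, 5, n)
df["knowledge"] = 0.4 * df["pretest"] + rng.normal(0, 5, n)

# Including the pretest covariate alongside the group factor makes this
# a MANCOVA on the two dependent variables.
model = MANOVA.from_formula("confidence + knowledge ~ group + pretest", data=df)
print(model.mv_test())
```

The printed table reports Wilks’ lambda and related statistics for each model term; the row for `group` is the test of interest after adjusting for pretest scores.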

Findings
Although the experiential learning treatment did not yield significant differences between the groups in confidence or IL knowledge, the treatment did produce a significantly greater ability to demonstrate IL in a practical exercise (F(4, 15) = 3.586; p < .05). Student feedback indicated a positive reception to the online modules and that IL development would be beneficial beyond the first semester.

Practical implications or value
A major implication of this research, as it relates to assessment, is that there is a difference between a student’s ability to demonstrate “knowledge” of IL and their ability to apply that knowledge to real-world assignments. This informs a second implication: multiple-choice questions alone may not accurately indicate whether a student is information literate; assessing actual student work will likely be more accurate. A third implication, as it relates to instruction, is that doctoral students who engage in activities that approximate real-world tasks will be better equipped to transfer learned skills to coursework and the dissertation. I hope this study inspires librarians to design assessment efforts that allow them to identify causal, not just correlative, relationships between their instruction and student learning.

Measuring the Effects of Library Workshops on Success Outcomes for First-generation College Students
Melissa Bauer (KSU)
Poster
Show Abstract

Purpose and goals
Libraries must find ways to demonstrate how they successfully contribute to student learning and support institutional goals. This study assesses the impact of a library workshop series on first-generation college students attending a mid-size community college. The goal of the assessment is to understand the relationships between first-generation college students, library instruction workshops, and the success metrics of retention and grade point average.

Design, methodology, or approach
Drawing on two years of assessment data and analysis, the poster details how the workshop series has affected first-generation college students’ grade point average and retention. Data were obtained from the Office of Institutional Effectiveness and the Office of e-Learning. First-generation student attendance will be compared against that of their continuing-generation peers to determine whether statistically significant differences exist in retention and grade point average.
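A group comparison like the one described above is commonly run as an independent-samples t-test on GPA and a chi-square test on retention. The sketch below uses invented sample sizes and values purely for illustration; the real analysis would use the institutional data the poster describes.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical GPA samples for workshop attendees by generation status
# (stand-ins for the institutional data described above).
first_gen_gpa = rng.normal(3.1, 0.5, 120)
cont_gen_gpa = rng.normal(3.0, 0.5, 300)

# Welch's t-test for a GPA difference (does not assume equal variances).
t_stat, p_gpa = stats.ttest_ind(first_gen_gpa, cont_gen_gpa, equal_var=False)

# Retention is binary, so a chi-square test on a 2x2 contingency table
# (retained vs. not retained, by generation status) is the usual choice.
table = np.array([[100, 20],    # first-gen: retained, not retained
                  [260, 40]])   # continuing-gen: retained, not retained
chi2, p_ret, dof, expected = stats.chi2_contingency(table)
print(p_gpa, p_ret)
```

For a 2x2 table the chi-square test has one degree of freedom; with small expected cell counts, Fisher’s exact test (`stats.fisher_exact`) would be the safer alternative.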

Findings
In the literature, conclusions vary on the relationship between library instruction and retention and grade point average. Wong and Cmor (2011) found a positive correlation between library instruction and student achievement, while Soria et al. (2014) found that library workshops had no impact on student retention or cumulative GPA. More research is needed on library instruction and student success outcomes. This study tentatively anticipates a positive relationship between first-generation students attending library workshops and higher grade point average and retention. (Results will be available and added to the poster before the conference.)

Soria, K. M., Fransen, J., & Nackerud, S. (2014). Stacks, serials, search engines, and students’ success: First-year undergraduate students’ library use, academic achievement, and retention. The Journal of Academic Librarianship, 40(1), 84–91.

Wong, S. H. R., & Cmor, D. (2011). Measuring association between library instruction and graduation GPA. College & Research Libraries, 72(5), 464–473.

Practical implications or value
This study contributes to the growing body of research on library instruction and student success outcomes. It can be used as an example of how to demonstrate library impact and value with campus stakeholders and encourage other librarians to gather data and conduct research on outcomes assessment.

How Do Faculty Design Student Research Experiences? A Qualitative Study
Cathy Meals (University of the District of Columbia)
Poster
Show Abstract

Purpose and goals
Faculty-created research assignments are the primary vehicle through which students develop research experience and information literacy skills. Librarians, as information experts, seek to teach both research and information literacy skills to students, but typically play little to no role in designing student research experiences. We identified a need to better understand the research assignments that faculty give to students, from the design of the assignment to faculty assessment of the end product.

We are currently conducting a study that seeks to answer the following research questions:

  1. Why do faculty choose to assign a research project to students in a particular class? What are their intended outcomes for students when they assign research projects? How do faculty design and develop these assignments?
  2. What are faculty perceptions of the research assignments students submit to them?
  3. Do faculty teach research skills to their students? If so, what do they teach and how do they do it?

Design, methodology, or approach
We have conducted in-depth, semi-structured interviews with faculty who teach in UDC’s general education writing sequence and plan to analyze our qualitative interview transcript data using inductive thematic analysis.

Findings
We have not yet begun formal analysis of our data, but these themes may emerge:

Faculty typically had little formal training on conducting research, or exposure to information literacy, during their own education. As such, they have had to learn to teach research through the process of teaching itself, by trial and error and by experimenting with assignments. There may be a divergence in assignment types between younger and older faculty.
Faculty explicitly teach, or work with librarians to teach, research and carefully scaffold their assignments because students come to their classes with extremely different previous experiences in research and writing. However, since general education writing classes attempt to achieve large numbers of learning outcomes, finding time to adequately teach research skills is challenging.
Faculty want their assignments to be relevant to students’ lived experiences and hope to increase students’ consideration of themselves as scholars who offer unique and important perspectives.
Faculty find that the quality of student work varies significantly. They would most like for students to deepen their exploration and strengthen their evaluation of information sources.

Practical implications or value
Our project is essentially a needs assessment that will help us identify opportunities to better support the development of student information literacy and research skills through instruction and other interventions. It will also help us assess faculty’s current understanding of student research and how the library can better support their research assignment design and professional development related to information literacy. Ultimately, by contributing insight into faculty members’ planning and perception of student research, we hope our project will help us achieve our goals of supporting student research experiences and learning outcomes.

Getting the Most Information Out of Your Escape Room Program
Leslie Drost (Kennesaw State University)
Poster
Show Abstract

Purpose and goals
Gamified orientation and information literacy programs can help get students into the library and show them a good time, but do they actually teach the students anything? This poster session will show how our team used multiple methods to assess and evaluate a first-year orientation escape room.

Design, methodology, or approach
We designed the program to incorporate a pre- and post-test to monitor the students’ new and known information, an experience survey to allow for improvements during later iterations, and collaboration with the Office of Institutional Research to follow the students through the semester and beyond.

Findings
Current conclusions show that the students gained knowledge of library services and resources that they did not have prior to participating in the program. The findings from the Office of Institutional Research, which include GPA, retention, and searching behavior, are still being processed.

Practical implications or value
The value of these multiple assessments, including following participating students as they move through their college careers, is to show that a program that on the surface is just for fun can make a difference in how participating students interact with the library, to their benefit.

Evaluation and Searching Skills, Generally Speaking: Standardized Information Literacy Assessment in Undergraduate General Education Courses
Cynthia Kane (Emporia State University)
Poster
Show Abstract

Purpose and goals
Embedding information literacy outcomes into undergraduate general education courses is one approach to integrating information literacy across an academic curriculum. However, measuring these outcomes is challenging if library time or staff expertise is lacking to create local assessments. This poster analyzes the use of one standardized information literacy assessment, the Threshold Achievement Test for Information Literacy (TATIL), focusing on two of its modules: Evaluating Process and Authority, and Strategic Searching. These modules are currently employed as pretests and posttests in the University Libraries and Archives’ General Education course, UL100 – Research Skills, Information and Technology. In Fall 2020 and Fall 2021, the library collaborated with the Office of Institutional Effectiveness to use these modules as pretests and posttests in sections of undergraduate General Education courses. A goal of this project is to create a benchmark for basic information literacy competencies across the General Education curriculum, measured against competencies taught in UL100.

Design, methodology, or approach
Emporia State University’s General Education program includes a Core Skill Goal, “Demonstrate effective skills in Information Technology and/or Information Literacy Skills.” UL100 fulfills this goal, and its curriculum is grounded in the ACRL Framework for Information Literacy for Higher Education. Similarly, TATIL incorporates several of the Frames and crosswalks with UL100 learning outcomes. In Summer 2020, Carrick Enterprises (the company behind TATIL) offered a flat fee per academic institution for unlimited use of the TATIL modules in the 2020/21 academic year. The flat fee continued in 2021/22. The ESU Instruction and Assessment Librarian and the Assistant Provost for Institutional Effectiveness reached out early in both semesters to faculty teaching General Education courses that appeared to incorporate the ACRL Framework into their student learning outcomes. In Fall 2020, 106 students took the Evaluating Process and Authority pre-test and post-test, and 105 students took the Strategic Searching pre-test and post-test. In Fall 2021, 77 students completed both modules’ pre-tests and post-tests. Students in the General Education courses who were concurrently enrolled in UL100 sections those semesters took the TATIL tests as part of UL100, and their results were analyzed separately.

Findings
Results from Fall 2020 and Fall 2021 show an overall increase on the posttests in students’ knowledge of evaluating resources and searching skills. UL100 students concurrently enrolled in the participating General Education course sections showed a greater increase between pretest and posttest scores in both semesters. The tentative conclusion is that UL100 may have a positive impact on these students’ information literacy competencies, compared to the General Education students not taking UL100 concurrently.
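Pretest-to-posttest gains like those reported above are typically checked with a paired-samples t-test. The sketch below is a minimal illustration on invented scores (the sample size matches the Fall 2021 count of 77, but the values, units, and effect size are hypothetical).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical paired pretest/posttest scores (percent correct) for one
# TATIL module; stand-ins for the actual Fall 2020/2021 results.
pre = rng.normal(60, 12, 77).clip(0, 100)
post = (pre + rng.normal(6, 8, 77)).clip(0, 100)

# A paired-samples t-test checks whether the mean posttest gain
# differs significantly from zero.
t_stat, p_value = stats.ttest_rel(post, pre)
gain = (post - pre).mean()
print(f"mean gain = {gain:.1f}, p = {p_value:.4f}")
```

Comparing UL100-enrolled students against the other General Education students would then be a between-groups test on these gain scores (e.g., `stats.ttest_ind` on the two groups’ gains).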

Practical implications or value
The evidence will help the Libraries and Archives faculty connect with other General Education faculty to identify core information literacy competencies for students in the courses. The project is also a key document for ESU’s Higher Learning Commission Comprehensive Review in 2024.

Why Do Faculty Choose Asynchronous Library Instruction?
Karen Reiman-Sendi (University of Michigan Library)
Poster
Show Abstract

Purpose and goals
The COVID-19 pandemic has accelerated creation and use of asynchronous digital learning objects (DLOs) in academic libraries. These DLOs provide library instruction on topics such as Academic Integrity, Searching Databases, Evaluating Sources, and Reading Scholarly Articles.

While there is a strong, renewed emphasis on in-person engagement on our campus, there continues to be strong use of asynchronous library DLOs. Librarians developed a lightweight, sustainable method of assessment to understand this trend, using DLO metadata, surveys, and semi-structured interviews. This assessment allows greater understanding of how faculty integrated library-created Canvas modules into their courses, how faculty characterized the broader learning objectives of the modules in context with their discipline, and what motivated faculty to choose this asynchronous method of library instruction.

Design, methodology, or approach
Librarians obtained data detailing course websites that imported one or more of the library modules from Canvas Commons during the 2021-2022 academic year. A questionnaire was designed and sent to faculty for these courses: 90 Fall 2021 courses taught by 55 individual faculty, and 81 Winter 2022 courses taught by 52 faculty. We removed 11 faculty from the Winter list who were contacted about Fall courses, which resulted in 41 unique faculty to whom we sent the Winter survey. 17 surveys were completed for an 18% response rate. Interviews were then conducted with 6 faculty who volunteered to talk in depth about their experiences.

Findings
Although response rates limit the extent to which findings can be generalized, the following themes were observed:

  1. Instructors like the convenience of students being able to review the modules at any point during the semester.
  2. Instructors want discipline-specific examples in the modules, expressing willingness to collaborate with the Library to get tailored content or a specific module recommendation.
  3. Instructors perceived that students prioritized multimedia content, expressing that this generation of students “won’t read things”. Some instructors noticed web accessibility features related to this content.
  4. Instructors expressed that Library modules are valuable when they scaffold ways to practice research skills not necessarily tied to discipline content, and/or when used as a way to communicate and hold students to a common standard.

Practical implications or value
Understanding why and how faculty integrate asynchronous library instruction into university courses can significantly help libraries with strategic planning. Insight from this assessment will factor into future space-planning for onsite instruction; module design which meets the needs of faculty whether the modules are required or optional; differentiation of modules to reflect discipline or level of course; and how to meaningfully assess DLOs as both a communication tool and a learning object.

5:10 p.m.–6:00 p.m.: COVID-19 & Libraries—Past, Present, Future Discussion