2020 Schedule

Thursday, October 29: Diversity, Equity, Ethics, and Privacy


12:00 p.m.–12:30 p.m.: Opening and Welcome Remarks

Steve Hiller (University of Washington Libraries) and Sue Baughman (ARL)
Mary Lee Kennedy (ARL) and Betsy Wilson (University of Washington Libraries)

12:30 p.m.–3:10 p.m.: Paper Presentations

12:30 p.m.
On Ethical Assessment: Locating and Applying the Core Values of Library and Information Science (Watch on YouTube; link opens in new tab)
Presented by Professor Scott W.H. Young (Montana State University)

Slides (PDF)
View on Scott’s website (link opens in a new tab)

PURPOSE AND GOALS
Assessment professionals want to do what is right—but the right thing to do is not always clear. This uncertainty represents the fundamental question for ethical decision-making: what is the right thing to do in certain situations? The current assessment landscape presents a number of intriguing points of ethical reflection, including learning analytics, third-party technologies, data privacy, parent-entity alignment, and impact studies. To navigate through these areas, core values are often cited in professional discourse as a guide for ethical action, with the ALA Core Values functioning as a key reference point within Library and Information Science (LIS). The literature of LIS further indicates that ethics and values are interrelated elements that guide professional conduct. Ethics is a system of determining the right thing to do; values are seen as a key component in this system by serving as the basis of deliberation and decision-making. In applying values to everyday professional scenarios, practical (or applied) ethics is the main theoretical lens through which values are studied in the LIS literature.

The purpose of this paper is 1) to review the full extent of LIS literature related to practical ethics and professional values, dating from Ranganathan’s Five Laws of Library Science in 1931, and 2) to present an analysis of core values and values-in-conflict found in contemporary library assessment practice. The goal of this research is to add a useful historical viewpoint to our contemporary understanding and application of professional values, and to prompt new discussion and reflection on ethical assessment practices in academic libraries.

DESIGN, METHODOLOGY, OR APPROACH

This paper charts the historical development of shared values within LIS, drawing on the literature of LIS, including publications related to professionalism, value studies, impact assessment, and applied ethics. The paper is structured into four parts:

  1. Defining a profession: the paper begins by briefly tracing the outlines of professional identity as a way of staking out a claim to values.
  2. Defining a value and defining the purpose of values: the definition and purpose of values are then discussed.
  3. Enumerating the common values: this main section presents and discusses the range of values found in the published discourse of library and information science, focusing on the period 1931-2003 and culminating with the release of the ALA Core Values in 2004.
  4. Core Values in practice: Finally, the paper presents an overview and analysis of the professional conversation from 2004-2020 related to ethics and values, focusing on the practical application of core values in the context of academic library assessment.

The main approach of this research is a review and analysis of LIS literature related to ethics and values, focusing particularly on 1931–2003, the period between Ranganathan’s Five Laws in 1931 and the release of the ALA Core Values Statement in 2004. Literature from 2004–2020 is further discussed, with the ALA Core Values as the primary reference for a discussion of practical ethics in library assessment.

FINDINGS

The findings from this research represent a new analysis and discussion of LIS Core Values. Notably, the findings include a comprehensive list of professional values discussed throughout the 20th century, ranked by commonality. For example, values such as service and access are well represented in the literature and appear at the top of the commonality ranking. Values such as tolerance and beauty, on the other hand, are each mentioned only once and appear at the bottom of the ranking. This exhaustive enumeration of core values deduced from the literature stands both in complement and in contrast to the ALA Core Values. Areas of assessment practice where values are in conflict—for example, providing access to data while respecting privacy—represent particularly constructive points of consideration and discussion. (This research is complete and will be published for the first time in the Library Assessment Conference proceedings, if accepted.)

PRACTICAL IMPLICATIONS OR VALUE

This research contributes to the body of knowledge in the field by advancing our understanding of professional ethics in support of an assessment practice that is relevant and sustainable vis-à-vis core values. The consideration and application of professional values can generate insightful and useful ethical deliberations for assessment practitioners, ultimately leading to strengthened professional practice and integrity. In examining values and values-in-conflict, the conference presentation and paper are intended to prompt new reflection on ethical assessment practices in academic libraries.


12:50 p.m.
Mindful Self-Compassion at Harvard Library (Watch on YouTube; link opens in new tab)
Presented by Rachel Lewellen (Harvard Library), Dr. Richa Gawande (Center for Mindfulness and Compassion, Cambridge Health Alliance), and Kim Noh (Harvard Library)

Slides (PDF)

PURPOSE AND GOALS
This paper presents the results of a staff development course evaluation. Harvard Library piloted a workplace-oriented modification of the evidence-based Mindful Self-Compassion (MSC) program developed by Christopher K. Germer and Kristin Neff. Offered as part of the Harvard Library Diversity, Inclusion, and Belonging (DIB) framework, the class was open only to library staff and marked the first time the program was taught in an organizational setting. The course met for 2.5 hours per week for 9 weeks. Library staff learned:

  • mindfulness and self-compassion practices for work and everyday life
  • how self-compassion can strengthen inclusion, belonging, and allyship through bringing awareness to patterns of exclusion, bias, and judgement of oneself and one’s own story and culture; through cultivating compassion for and a sense of common humanity with others; through compassionate listening; and through development of skills related to advocacy and difficulties in relationships
  • how to bring your inner ally rather than your inner critic to work
  • how to motivate yourself with encouragement rather than criticism
  • how to transform difficult relationships
  • how to handle difficult emotions with greater ease
  • how to find joy and meaning in everyday life

This paper summarizes the training program, the evaluation components, the results, and a discussion on the impact of workplace MSC training on Diversity, Inclusion and Belonging.

DESIGN, METHODOLOGY, OR APPROACH

Harvard Library contracted with the Center for Mindfulness & Compassion (CMC) at the Cambridge Health Alliance to teach the course and conduct the program evaluation. Drawing on the Center’s research program capacity and expertise, a robust course evaluation was collaboratively designed to evaluate the impact and value of the class for participants and the organization.

Pre and post course quantitative data include validated measures of:

  • Self-compassion
  • Perspective taking
  • Openness to diversity
  • Inclusion at work
  • Workplace wellbeing
  • Fear of failure
  • Acceptance of change
  • Perceptions of dissimilarity
  • Self-criticizing
  • Emotion regulation
  • Body sensation awareness

Qualitative data include participant motivation for taking the course, weekly participant feedback, teacher observations, and a post-course survey, which will include questions about program content, use of program materials and practices at work, and whether participants would recommend that the program be offered again.
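
Analytically, each of the validated pre/post measures listed above lends itself to a straightforward paired comparison across participants. The sketch below is purely illustrative; the file name and column names are hypothetical placeholders, not the study's actual instruments or data:

```python
# Minimal sketch of a paired pre/post comparison on one validated measure.
# "msc_evaluation.csv", "self_compassion_pre", and "self_compassion_post" are
# hypothetical names used only for illustration.
import pandas as pd
from scipy.stats import ttest_rel

scores = pd.read_csv("msc_evaluation.csv")  # hypothetical export of participant scores

pre = scores["self_compassion_pre"]
post = scores["self_compassion_post"]

t_stat, p_value = ttest_rel(post, pre)   # paired t-test across participants
mean_change = (post - pre).mean()        # average pre-to-post change

print(f"Mean change: {mean_change:.2f}, t = {t_stat:.2f}, p = {p_value:.3f}")
```
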
FINDINGS

The Harvard Library workplace MSC course is being piloted for the first time. Twenty-seven Harvard Library staff have enrolled, and attendance has been 90% across the six of nine course sessions completed so far. Pre-course quantitative and qualitative data have been collected, and weekly course feedback is collected on an ongoing basis. Post-course data will be collected in March 2020. We will report on the results of the measures listed in the design section. Analysis will also consider the overall potential organizational impact of the course, and specifically the impact related to diversity, inclusion, and belonging.

Research on the community-based course from which this program is adapted has demonstrated that MSC significantly increased self-compassion, compassion for others, mindfulness, and life satisfaction, as well as decreased depression, anxiety, and stress.1 MSC also increases motivation, performance, and resilience. Additionally, emerging research indicates that self-compassion is strongly associated with positive workplace attributes including emotional resilience and the ability to learn from mistakes.3 The paper will report on the findings of the Harvard Library course evaluation.

1. Neff, Kristin D., and Christopher K. Germer. “A Pilot Study and Randomized Controlled Trial of the Mindful Self-Compassion Program.” Journal of Clinical Psychology 00(00): 1–17 (2012). DOI: 10.1002/jclp.21923.

2. Neff, Kristin D., and Elizabeth Pommier. “The Relationship between Self-compassion and Other-focused Concern among College Undergraduates, Community Adults, and Practicing Meditators.” Self and Identity 12, no. 2 (2013): 160–176. DOI: 10.1080/15298868.2011.649546.

3. https://self-compassion.org/the-research/#areaofstudy

PRACTICAL IMPLICATIONS OR VALUE
The paper will discuss the challenges facing evaluation related to Diversity Inclusion and Belonging programming.

Relevant to other pilot program evaluations, the combination of qualitative and quantitative assessment will provide a nuanced understanding of the strengths, weaknesses, outcomes, and value of investing in a staff development program. The use of pre- and post-course measures is especially applicable to other kinds of program evaluation.

We will also share practical lessons learned and best practices related to contracting for program evaluation. This includes a discussion of data ownership, confidentiality, and licensing considerations for use.

This pilot is undertaken as an innovation endeavor, in part, to learn how MSC programming supports the diversity, inclusion, and belonging framework for Harvard Library, which aligns with University DIB goals. Results will be shared with colleagues in the Office of Human Resources, the Office of Work/Life, Harvard Medical School, and the Center for Wellness & Health Promotion. The data and analysis for this pilot will inform a decision about offering the course university-wide.

Results will also be shared with founding MSC program developers to further the workplace curriculum adaptation.

 


1:10 p.m.
Being Black at Duke: Partnering with Black students to learn about their experiences (Watch on YouTube, link opens in new tab)
Presented by Joyce Chapman (Duke University Libraries) and Emily Daly (Duke University Libraries)

Slides (PDF)
Read the Full Report (link opens in a new tab)
Read the Graduate Student Moderator’s Report (link opens in a new tab)

PURPOSE AND GOALS
In 2019, the Assessment & User Experience (AUX) Department at Duke University Libraries (DUL) worked with Black students to better understand their experiences in the Libraries and on campus, and identify things the Libraries can change to increase the positive experiences Black students have with library services, facilities, and materials. The multi-faceted study included a literature review and environmental scan, informational interviews with campus stakeholders, focus groups and Photovoice sessions with Black students, and analysis of library satisfaction survey data focusing on race. This study is part of a multi-year, mixed methods approach to understand the experiences and needs of different student populations more fully, beginning in 2017 when the Libraries focused its attention on first-generation college students.

This paper will summarize the research team’s methodology, focusing on ways we made the study highly participatory for the student moderators and study participants. We will briefly describe our findings and ways that library and campus stakeholders have implemented the team’s recommendations to improve library services and make spaces more inclusive and welcoming for Black students at Duke.

DESIGN, METHODOLOGY, OR APPROACH

To prepare for this study, staff conducted a literature review and spoke to campus stakeholders to better understand what support exists for Black students at Duke and what prior research had been done to understand the Black student experience. We formed a cross-departmental research team of library staff to design the methodology, draft focus group scripts, recruit participants, analyze results, and develop recommendations. The team pursued a number of questions, including two focal questions: To what extent are the Libraries viewed as an inclusive space by Black students? To what extent is the University viewed as an inclusive space by Black students?

To answer these, AUX staff coordinated two traditional focus groups and three Photovoice sessions that included 32 undergraduate and graduate students. Photovoice is a community-based, participatory research method to gather qualitative data. We hired two Black graduate students with extensive experience in qualitative research to moderate the discussion groups, knowing that in order to have honest discussions without racial power imbalances, White people should not be in the room, either as moderators or note takers. Sessions were recorded with participants’ consent, and a research team member transcribed the five recordings for the team to review. In addition to including Black students in the role of moderator, the research team hired one of the moderators to conduct an independent analysis of the transcripts using NVivo and produce a report of findings from a non-library perspective. Team members also analyzed the transcripts and used affinity mapping to develop themes based on the data. Additionally, the research team analyzed 2,700+ responses to the 2020 biennial library student satisfaction survey by race, paying particular attention to Black students’ comments and the extent to which they agree with statements such as “I feel safe from discrimination, harassment, and emotional and physical harm” and “For me the library is a welcoming place.”
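
For context on the survey portion of this analysis, comparing agreement with a statement across racial groups reduces to a simple cross-tabulation. The sketch below is illustrative only; the file name, column names, and response labels are assumptions, not the actual survey instrument or coding scheme:

```python
# Illustrative sketch: tabulating agreement with one survey statement by self-reported race.
import pandas as pd

responses = pd.read_csv("student_satisfaction_2020.csv")  # hypothetical survey export

statement = "I feel safe from discrimination, harassment, and emotional and physical harm"
agree_levels = {"Strongly agree", "Agree"}                 # hypothetical response labels

# Flag respondents who agree with the statement
responses["agrees"] = responses[statement].isin(agree_levels)

# Percentage of respondents in each racial group who agree
agreement_by_race = (
    responses.groupby("race")["agrees"].mean().mul(100).round(1).sort_values()
)
print(agreement_by_race)
```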

FINDINGS

Research team members used the themes generated from affinity mapping, the independently written supplementary report, and the student satisfaction survey results to develop 34 recommendations that address the issues study participants and survey respondents raised. Recommendations include dedicating a library space to Black scholarship, increasing visual representations of people of color in library spaces, and investigating ways to provide curricular support for faculty who wish to include more diverse scholarship in their course materials. Findings were incorporated into a report to be shared with library staff, campus stakeholders, higher education communities interested in providing support for Black students, and potential donors interested in funding relevant library services.

PRACTICAL IMPLICATIONS OR VALUE

This study was instrumental in helping library staff understand Black students’ experiences at Duke and with the libraries. Staff developed and began implementing more than thirty recommendations for making library services and spaces more supportive, usable, and welcoming for Black students. Additionally, this mixed-methods study serves as a model for other libraries that wish to use participatory research methods such as Photovoice to reach users from underrepresented minority groups. Finally, this project highlights ways to partner with students at every stage of the research process, from literature review to recruitment to discussion groups to analysis.


1:30 p.m.
The Diversity Stalemate: An Analysis of How Collection Development Policies in Academic Libraries Address Diversity Inequities in Children’s Books (Watch on YouTube; link opens in new tab)
Presented by Andrea Jamison (Illinois State University)

PURPOSE AND GOALS
Diversifying children’s literature has been a topic of discussion among American schools and libraries for more than half a century. An analysis of the research suggests that children benefit greatly from exposure to diverse books. Despite a significant amount of literature about this topic and an increasingly multicultural society, diversity inequities in children’s books persist. Given that diversity is articulated as a core value by the American Library Association (ALA), the lack of diversity in children’s books is germane to libraries. The purpose of this study is to examine whether or not the policies that govern the selection of children’s books address diversity inequities. Utilizing a mixed-methods research design, a content analysis was performed on collection development policies at academic libraries within ALA-accredited schools that have curricula tied to children’s services. The analysis was performed to examine whether “manifest” messages of diversity exist in policies and the degree of congruency, if any, with ALA’s Diversity in Collection Development: An Interpretation of the Library Bill of Rights. A survey was also utilized in this study to help contextualize each policy and determine whether additional questions emerge about diversity in collection policies.

DESIGN, METHODOLOGY, OR APPROACH

This study provides an analysis of the collection development policies of academic libraries located at universities with ALA-accredited programs in library and information studies that offer Master’s degrees or library programs with areas of concentration or career pathways in children’s services. Only academic libraries within the Midwest and Northeast regions of the United States were included in this study. Using an embedded research design, this study examines the extent to which collection development policies address diversity inequities and whether congruency exists between university library collection development policies and ALA’s Library Bill of Rights. A content analysis was the primary technique used to gather data. However, a survey was also embedded into this study to gather contextual information about the policies themselves, policy creators, and practices. Both quantitative and qualitative data sets were collected; however, quantitative data are given priority. In this study, qualitative data are used to confirm findings and provide additional insight on policies.

Since policies are the crux of this study, collection development policies governing children’s books were chosen as the sampling units. Using Diversity in Collection Development: An Interpretation of the Library Bill of Rights as the coding unit (Appendix B), the researcher created a checklist. The checklist was used to perform a content analysis of the manifest language of diversity prevalent in policies in order to answer the following questions:

  1. To what extent do collection development policies address diversity?
  2. Do collection development policies adhere to ALA’s Interpretation of the Bill of Rights for diversity in collection development (Appendix B)? If so, how?

FINDINGS

Preliminary findings from a content analysis of 13 policies show that specific diversity-related terms were largely absent from the policies. Nineteen terms and phrases used to represent the idea of diversity were included in the evaluation checklist; however, 53 percent of those terms did not manifest in any of the policies. Sixty-one percent of the policies examined did not manifest the term “diversity” or a variation of it. Policies that did manifest diversity-related terms did so through stand-alone diversity statements rather than measurable items embedded throughout the policy.
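
Mechanically, this kind of manifest-content check amounts to testing each checklist term against each policy text. The sketch below is illustrative only; the abbreviated term list and file paths are hypothetical stand-ins for the study's 19-item checklist and 13 policies:

```python
# Illustrative sketch: checking which checklist terms "manifest" in each policy text.
import re
from pathlib import Path

# Abbreviated, hypothetical stand-in for the study's 19-term evaluation checklist
checklist_terms = ["diversity", "diverse", "multicultural", "inclusion", "equity"]

# Hypothetical directory of policy texts, one file per policy
policies = {p.name: p.read_text(encoding="utf-8").lower() for p in Path("policies").glob("*.txt")}

def manifests(term: str, text: str) -> bool:
    """Return True if the term appears as a whole word in the policy text."""
    return re.search(rf"\b{re.escape(term)}\b", text) is not None

term_counts = {
    term: sum(manifests(term, text) for text in policies.values())
    for term in checklist_terms
}

absent = [term for term, count in term_counts.items() if count == 0]
print(f"{len(absent)} of {len(checklist_terms)} checklist terms appear in none of the {len(policies)} policies")
```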

At present, the researcher is still analyzing data. A more in-depth analysis is forthcoming and will be presented along with the researcher’s recommendations for moving forward.

PRACTICAL IMPLICATIONS OR VALUE

This research study provides data that can help inform the professional practice of academic librarians. Studies like these are useful to the profession because they raise awareness about diversity, encourage conversation, and help normalize exposure to diverse content (Cooke & Jacobs, 2018, p. 3). It is essential to assess how different types of libraries address diversity through their policies, as policies influence practice and can have an impact on service. By providing a glimpse into how some academic libraries address diversity in their respective collection development policies, this study can add to the overall conversation about diversity in libraries.

Librarians who oversee youth collections in other types of library settings may be able to glean information from this study as well. Youth benefit significantly when exposed to diverse content. An analysis of research about multiculturalism suggests that children benefit greatly from exposure to texts that are representative of a variety of cultures (Arsenault & Brown, 2007; Gilton, 2012; Hinton-Johnson, 2005; Strehle, 1999). Such exposure helps to develop language skills, create a culture of inclusion, and foster engagement in reading (Agosto, 2007; Arsenault & Brown, 2007; Bishop, 1990; Boyd et al., 2015; Chen & Brown, 2015; Newell, 2017). Conversely, lack of exposure to multicultural literature perpetuates misinformation, inequities, racial stereotypes, and biases that have been systemic in American culture (Boyd et al., 2015; Campbell, 2017; Morgan, 2011; Mosely, 2012; Nilsson, 2005; Turner, 2016). Examination of policy can better position libraries for more inclusivity in library collections and increase access to diverse content.

Policy examination can also inform library culture. This study is situated within the context of academic libraries at universities with ALA-accredited programs in library and information studies that offer Master’s degrees or library programs, as identified by ALA, with areas of concentration or career pathways in children’s services (as of December 2019). ALA-accredited library programs provide the requisite skills and training needed to work in librarianship. ALA-accredited programs in children’s services provide the skills and training necessary for providing library services to children and young adults. Academic libraries provide collections that support the curriculum (Dinkins, 2003; Gessesse, 2000; Herzon, 2004; Snow, 1996). Thus, these libraries can be viewed as a model for building diverse collections. Library students who interact with these collections can draw on their encounters with diverse content and apply those encounters to their professional practice.

This study also has the potential to inform policy. Collection development policies are guidelines that influence how librarians build collections. These guidelines work to ensure that collections are fair, equitable, and supportive of a library’s mission (Lukenbill, 2001; Olaojo & Akewukereke, 2006; Woolls & Coatney, 2018). A policy analysis is a useful tool for determining current practices with regard to how diversity is articulated in academic library policies. Policy analysis can also shed light on how policies align with the professional values articulated by ALA. Results from this study can provide a glimpse into the practices of academic libraries and highlight practices that may or may not align with the core values of librarianship as articulated by ALA. This study can also “spotlight procedures and practices that can inform strategic plans and overall institutional change and growth” (Cooke & Jacobs, 2018, p. 3). Results can also add to the body of literature about academic library practices and highlight diversity strengths or weaknesses within policies.


1:50 p.m.
Tools for determining equitable representation of women in LIS publications (Watch on YouTube; link opens in new tab)
Presented by Amalia Monroe-Gulick (The University of Kansas Libraries) and Marla Schleuder (Robert J. Dole Institute of Politics, The University of Kansas)

Slides (PDF)

PURPOSE AND GOALS

This project aims to explore one method, along with the tools used, of gathering data to assess the representation of women among authors in Library and Information Science (LIS) publications.

Librarianship has long been viewed predominantly as a profession of women, and this holds true across sub-fields such as academic and public librarianship. The gender imbalance is, however, less pronounced in academic librarianship than in public librarianship, a pattern similar to the difference between K-12 education and higher education.

A 2017 article, “Gender in the Journals: Publication Patterns in Political Science,” inspired the current study: we wanted to know whether a publication gap similar to the one documented in that male-dominated profession also exists in a profession dominated by women.

At a technical level, we show how data can be pulled from a database using an API and then cleaned and transformed into a usable tabular format using open-source tools.

More broadly, we discuss how data obtained from published sources can be used to fill in gaps that would not otherwise be obtainable, such as demographic data, which can then be assessed and analyzed.

DESIGN, METHODOLOGY, OR APPROACH

Ten journals covering the predominant subfields of academic librarianship were selected, with a focus on the most recent 15 years’ worth of articles for each. Around 9,000 citations were pulled using a combination of open-source tools.

The EBSCOhost API was used to access these articles through the Library, Information Science and Technology Abstracts (LISTA) and Academic Search Complete databases. Requests for citation data were made through OpenRefine, where the resulting raw XML was parsed and cleaned.

We then narrowed our focus to exclude non-scholarly document types such as editorials and book reviews, leaving us with a data set of 6191 citations.

R’s gender package uses U.S. Census and Social Security data sets to predict gender from first names and birth years. This package was used to generate gender information, along with a confidence level, based on each author’s first name. Low-confidence or unidentified names accounted for approximately 10%–15% of the authors listed. These were then hand-coded by the researchers.
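
As a rough illustration of this coding step (the study itself used R's gender package), the Python sketch below assigns gender from a name lookup table and routes low-confidence or unmatched names to manual review. The file names, column names, and threshold are hypothetical assumptions for illustration only:

```python
# Illustrative Python analogue of the name-coding workflow described above.
import pandas as pd

CONFIDENCE_THRESHOLD = 0.85  # hypothetical cutoff below which names go to hand-coding

authors = pd.read_csv("authors.csv")           # hypothetical: one row per author, with "first_name"
name_table = pd.read_csv("name_gender.csv")    # hypothetical reference: first_name, gender, proportion

coded = authors.merge(name_table, on="first_name", how="left")

# Low-confidence or unmatched names (roughly 10-15% in the study) are flagged for manual review.
needs_hand_coding = coded["proportion"].isna() | (coded["proportion"] < CONFIDENCE_THRESHOLD)
coded.loc[needs_hand_coding, "gender"] = None

print(f"{needs_hand_coding.mean():.1%} of authors flagged for hand-coding")
```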

FINDINGS

We have a complete data set and analysis will be complete in Spring 2020. The structure of the data set will allow for several parts to the analysis, including a discussion of the social implications and validity of assigning gender identity by first name, the percentage of women among authors in multi-author publications, and the prominence of women as authors in major publication topic areas (i.e. collections, information literacy, and scholarly communications).

We have recognized some limitations in this project.

Due to the subjectivity and cultural/linguistic bias of assuming and ascribing gender identity based on a given name, and the focus on biological essentialism inherent in the original data collection the gender package is based on, this study is non-inclusive of individuals with a non-binary gender identity. Furthermore, the number of subjects whose gender could not be guessed may skew the final results.

The gender package did not have a value for all first names, especially non-Western European names. Some authors provide only abbreviations of their first names.

PRACTICAL IMPLICATIONS OR VALUE

The results of this study can help the library profession understand the current state of data available on sex and gender representation in LIS publications, while the methods offer improved knowledge of APIs, OpenRefine, and R.


2:10 p.m.
Supporting Student Success: Pell Grant Student Experiences with Library Technology Lending Services
Presented by Lara Miller (University of Arizona Libraries)

PURPOSE AND GOALS

As national enrollment trends change, universities are looking to close the achievement gap in graduation rates between Pell Grant recipients and non-Pell Grant-eligible students. In Fall 2019, a large R1 university announced the restructuring of financial aid to make college more accessible for Pell Grant-eligible students. With the implementation of the new financial aid award for in-state Pell-eligible students, the library began a project to better understand and support the Pell student journey to graduation. This research study—an extension of a larger campus assessment project—digs into the role of library peripherals in creating equitable experiences and cutting costs for Pell students. Access to peripherals and other equipment services, like maker tools, aims in part to close the achievement gap, and libraries are central in supporting learning via peripherals. This presentation not only provides findings on demographic patterns of technology circulation (laptops, tablets, camera equipment, and calculators), it also outlines a framework for collaborating with the campus assessment community on ways to merge assessment data to support low-income students and campus strategic goals.

Research Questions:

  1. Do Pell Grant students check out technology at higher rates than non-Pell Grant students?
  2. How can the library develop policies, practices, and services to enhance the Pell Grant student experience and their success?
    1. How can the library lower costs for Pell Grant Students?

DESIGN, METHODOLOGY, OR APPROACH

This study takes a mixed-methods approach to understanding technology circulation and Pell student success. The library has begun assessing circulation data from the library services platform via Alma Analytics. Working with the university’s Office of Analytics and Institutional Research, the dataset will be linked with demographic data to identify technology checkouts by students whose Expected Family Contribution (EFC) is zero.
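
As a rough illustration of that linking step, the sketch below joins a hypothetical checkout export to institutional demographic data and compares per-student checkout rates for EFC-zero students against other students. The file names, column names, and the EFC-zero proxy for Pell eligibility are assumptions for illustration, not the actual data pipeline:

```python
# Illustrative sketch: linking checkout records to demographics and comparing rates.
import pandas as pd

checkouts = pd.read_csv("tech_checkouts.csv")    # hypothetical circulation export: student_id, item_type
demographics = pd.read_csv("student_demo.csv")   # hypothetical institutional file: student_id, efc

linked = checkouts.merge(demographics, on="student_id", how="left")
linked["pell_proxy"] = linked["efc"] == 0        # EFC of zero used as a proxy for Pell eligibility

checkout_counts = linked.groupby(["pell_proxy", "item_type"]).size()
student_counts = (
    demographics.assign(pell_proxy=demographics["efc"] == 0).groupby("pell_proxy").size()
)

# Checkouts per enrolled student in each group, by item type
rates = checkout_counts.div(student_counts, level="pell_proxy")
print(rates.unstack("item_type").round(2))
```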

The next phase of the project will conduct one-on-one semi-structured interviews with Pell Grant students. Students participating in the qualitative phase will be randomly sampled and contacted for interviews from a pool of Pell students. Students will be asked about technology use, interaction with technology lending services, library fee policies, sense of belonging, and other cost-saving strategies. Interviews will be recorded, transcribed, and analyzed with NVivo. Interviews will also incorporate journey mapping—a user experience method to visualize student actions and decision points.

FINDINGS

The Pell Grant student project is still in the data collection phase, and findings are forthcoming. Based on research generated from a campus assessment project and preliminary data from the library services platform analysis phase, we expect the library’s overdue and lost-technology fee policy to be a barrier for Pell Grant recipients. We also expect to see students with an EFC of zero use the library’s technology lending services at higher rates than non-Pell Grant students.

PRACTICAL IMPLICATIONS OR VALUE

Currently, there is little research exploring how libraries can better support Pell Grant students, and more scholarship is needed in this area. With national enrollment trends projected to shift by 2025, it is paramount for libraries to anticipate and plan services for changing student demographics.


2:30 p.m.
Walking the PII tightrope: Creating a privacy policy safety net (Watch on YouTube; link opens in new tab)
Presented by Kirsten Kinsley (Florida State University) and Susannah Miller (Florida State University)


PURPOSE AND GOALS

The purpose of this paper is to outline how we at FSU Libraries established a privacy policy to set a standard for how we treat personally identifiable information (PII) of our users. In our practice of collecting user data, we sought to be transparent with our users about what data we use, how we use it, and why.

DESIGN, METHODOLOGY, OR APPROACH

Our approach to establishing a privacy policy was to engage librarians and users to first learn about privacy laws, rules, and regulations, and then to discuss how libraries are unique in connecting privacy with free speech, thought, and association, and in ensuring open and unimpeded inquiry.

We held a series of information sessions for library personnel with the following: the University Registrar, regarding the Family Educational Rights and Privacy Act of 1974 (FERPA); the FSU Institutional Review Board (IRB) and Office for Human Subjects Protection (OHSP); the FSU Office of Compliance and Ethics; and the FSU Information Security and Privacy Office. We also consulted ALA’s Privacy Policy Toolkit regarding Fair Information Practice Principles, the ALA Code of Ethics, Florida Statutes, and FSU policy, and we conducted a literature review.

Lastly, we will be educating our users about our privacy policy and surveying them to determine what they know about the data we collect, why we collect it, what we use it for, and whether they have concerns about our practices.

FINDINGS

We will present our policy and the process of its development. We will also share the steps the policy went through to be approved by the University Libraries and reviewed by the university, and discuss whether it impacted or was impacted by the campus-wide Data Governance initiative. We will share how other departments, such as Student Affairs, have looked to our approach to shape and guide their own privacy policies. Our experience of creating a policy addressing the use of PII related to third-party data will also be shared. As part of our desire to be transparent, we will share survey results related to user awareness of, and comfort levels with, the third-party PII data we collect about users.

PRACTICAL IMPLICATIONS OR VALUE

This policy will serve as a model for other libraries establishing a privacy policy that provides users with transparency about how and for what purposes their data are used. It contributes to the body of knowledge in the field by showing a real and practical example of establishing and utilizing a privacy policy that adheres to the Libraries’ values, as well as state and national laws and campus policy. The policy also serves as a framework to guide assessment work using PII data. It also has implications for data management practices for all PII data and includes provisions for educating our staff about how to implement the policy in their daily work.


3:10 p.m.–3:20 p.m.: Questions and wrap-up

Moderated by Megan Oakleaf (Syracuse University) and Martha Kyrillidou (QualityMetrics LLC)

Wednesday, November 18: Assessment Programs and Organizational Issues


1:00 p.m.–1:15 p.m.: Welcome Remarks

Martha Kyrillidou (QualityMetrics, LLC) and Elizabeth Edwards (University of Chicago)

1:15 p.m.–4:15 p.m.: Paper Presentations


1:15 p.m.
Revisiting a Library Impact Map (Watch on YouTube; link opens in new tab)
Presented by Holt Zaugg (Lee Library BYU)

Slides (PDF)

PURPOSE AND GOALS

Six years ago, the BYU Library conducted a Library Impact Map (LIM) as proposed by Megan Oakleaf (2012). The LIM identified the degree to which data was collected when cross-referencing areas of university focus with library services. This proposal examines changes between the first and second administrations of the LIM at the BYU Library.

The purpose of this proposal is to compare the degree of data collection within the Lee Library in 2013 and 2019. The comparison helps to indicate the degree to which a culture of assessment is taking hold within the library. It also examines patterns of data expectation, that is, where data may or may not be collected within a library.

DESIGN, METHODOLOGY, OR APPROACH

The assessment followed the design outlined in Academic Library Value (Oakleaf, 2012), with modifications for the BYU Library. For example, we added or deleted some areas of university focus as they applied to BYU. Similarly, we adjusted library services to fit the BYU Library organization. We also identified individuals within the library who were best situated to indicate the level of data collection. Finally, we added a level of data collection by splitting “could be collecting data” into “could be” and “could be – want to collect”; for analysis purposes, however, we combined these two levels into “could be.”

Using the 2013 LIM as a template, we adjusted library services and areas of university focus on the LIM. In both instances, some services and some areas of university focus were modified, added, or deleted. We then identified library employees who had responsibility for specific areas within the Lee Library services. In some instances one person was responsible for more than one service and in other instances more than one person had responsibilities for a given library service. In the latter case, if the indications of the degree of data collection differed, we used the highest level of data collection. We emailed the identified library employees the tables indicating their library service responsibility and asked them to fill out the degree of data collection for each area of university focus.

As we received responses from each library employee, we built the complete LIM. These results were analyzed in a similar manner to the 2013 LIM. Additionally, we compared the level of data collection between the two years to determine whether the level had remained the same, increased (e.g., moved from “could be” to “yes”), or decreased.
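
To make that comparison concrete, the grid comparison can be expressed as a cell-by-cell difference between two coded matrices. The sketch below is illustrative only; the file names and the numeric coding of the "no" / "could be" / "yes" levels are assumptions, not the BYU Library's actual data:

```python
# Illustrative sketch: comparing two Library Impact Map grids cell by cell.
import pandas as pd

LEVELS = {"no": 0, "could be": 1, "yes": 2}  # hypothetical ordinal coding of collection levels

# Hypothetical grids: rows are library services, columns are areas of university focus
lim_2013 = pd.read_csv("lim_2013.csv", index_col="service").replace(LEVELS)
lim_2019 = pd.read_csv("lim_2019.csv", index_col="service").replace(LEVELS)

change = lim_2019 - lim_2013  # positive values mean a higher level of data collection in 2019

summary = {
    "unchanged": int((change == 0).sum().sum()),
    "increased": int((change > 0).sum().sum()),
    "decreased": int((change < 0).sum().sum()),
}
print(summary)
```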

FINDINGS

At this time, we are continuing our analysis, but preliminary results indicate that half of all data collection options did not change between the two administrations of the LIM. Slightly more data collection points increased than decreased. Further analysis will seek to explain the data collection points in terms of intersections of library services and university focus that may never collect data, why there were increases or decreases, and how these points may be used in library decisions and strategic planning.

PRACTICAL IMPLICATIONS OR VALUE

There are two practical implications. The first is how the LIM is used to provide one indication of how a culture of assessment is evolving within the library. Second, the current LIM serves as a starting point to improve data collection, especially in the instances where “could be – want to” was identified and for the “no” categories. In the case of the “no” categories, the aim would be to determine whether data exist and whether they would be of value to collect and use to inform and improve library decisions and strategic planning.


1:35 p.m.
Are you ready to assess? Extending Lewin’s model to build readiness into your planning (Watch on YouTube; link opens in new tab)
Presented by Aaron Noland (James Madison University Libraries)

PURPOSE AND GOALS

Assessment initiatives bring high levels of uncertainty and elevate organizational anxiety. Much attention is paid to how to manage that uncertainty throughout a change or assessment process, but that attention often comes too late. This paper will focus on the pre-work of an assessment initiative. Specifically, the paper outlines a holistic model that extends Lewin’s three-stage (unfreeze, change, refreeze) change model and discusses how it applies to assessment. The research adds a missing stage prior to unfreezing: readiness. The paper outlines the importance of readiness, how it fits into the model, and how it benefits change and assessment initiatives. Finally, the paper provides insights for assessing and increasing readiness in organizations.

DESIGN, METHODOLOGY, OR APPROACH

This paper is a theory extension. I am offering an extension, in this case adding readiness as a stage, to an existing theoretical approach. Thus, the paper does not involve quantitative or qualitative data collection, but importantly, fosters theory development in the process of assessment – a space with sparse research.

FINDINGS

Increasing readiness for organizational change and assessment increases the effectiveness of an initiative by reducing uncertainty, establishing clear expectations, and providing appropriate outlets for organizational pressure release.

The Lewin model falls short of prioritizing humane approaches to change (and, in this case, assessment), whereas integrating readiness recognizes the importance of helping people prepare. This allows for more focus on the assessment initiative itself when it launches, instead of initiating an assessment project and having to spend the first several phases increasing readiness or risking a preventable failure.

PRACTICAL IMPLICATIONS OR VALUE

This research extends a theoretical model, but also offers concrete implications for practice. First, the paper discusses concrete ways to assess readiness. Second, it provides practitioners ways to increase readiness in their organizations. Finally, the paper concludes with ways to bridge readiness and the initial phases of an assessment initiative.


1:55 p.m.
Putting a Library Assessment Culture into Practice (Watch on YouTube; link opens in new tab)
Presented by Selena Killick (The Open University)

Slides (PDF)

Selena Killick is the Associate Director at The Open University Library where she leads the Student and Academic Services Team. Her remit includes leading the development and delivery of strategies for the Library to improve service delivery. Her research field is in library assessment and performance measurement. She is the co-author of Putting Library Assessment Data to Work, and a member of the Library Performance Measurement Conference Board.

Introduction and Context

Established in 1969, The Open University (OU) is the United Kingdom’s largest academic institution dedicated to distance learning, with over 175,000 students. Initially delivering education via correspondence courses through the post, supported by BBC programmes, we were the first online university, waiting for the internet to be invented. The Library supports all members of the OU community through the provision of electronic and printed publications, digital and information literacy training, virtual and physical spaces, and 24/7 support.

We have a strong library assessment culture which informs everything we do. This paper explores how we integrate insight into our strategic planning processes, instilling our culture of library assessment. There are three key facets to our methodology: insight, strategy, and expertise.

Insight

Insight is an umbrella term we use to describe the wealth of qualitative and quantitative data we use to inform our strategy. The sources for this vary and can include the customer relationship management system through which we capture all Library service desk enquiries (Killick, 2017), resource usage and embedment data (Brock, 2020), and Library Learning Analytics (Nurse, Baker and Gambles, 2017; Killick, Nurse and Clough, 2018). The richest source of insight we have developed is our Library Research Panel (Dick and Killick, 2016). A representative sample of 500 students volunteer as research panel members annually. Both users and non-users of the Library volunteer to be members of the panel, with the overall composition closely aligning with our main student population. We use a variety of user experience research methods to gain rich insight into our students’ needs, expectations and perceptions – predominantly at a distance. Their views inform service improvement strategies, for example the development of our online Library, the procurement of our library discovery system, and the provision of our online digital and information literacy tutorials.

Strategy

The next key component is the parent strategy, in our case the wider University strategy. Academic libraries do not exist in a vacuum; we are ultimately responsible to our wider parent organisation. By aligning our insight with the wider University strategy, we can ensure that service improvements support the organisational goals. Without this key component, our initiatives would be unable to secure the resources needed to implement them, in terms of both direct and indirect costs. Our role as leaders is to interpret the University strategy for our teams, enabling them to identify where and how our service can support the organisational goals. Conversely, we interpret the Library endeavour for our senior stakeholders so they can see the clear, tangible link between their goals and ours.

Expertise

The final key component is the most important one: our own professional expertise. Our staff are the most valuable asset the Library will ever have; however, the role our own expertise plays in interpreting insight and strategy is often overlooked and underplayed. We offer the community we serve unique expertise on how to achieve their aims, whether those aims are those of a freshman student or a provost. We develop this expertise through qualifications, years of experience, and a commitment to continued professional development. Through combining our expertise with the needs of our users and organisations, we shape the Library strategy.

The Library Assessment Culture in Practice

Through combining insight, strategy and expertise we are able to form the Library strategy. For plans to succeed, all three elements must be present. Initiatives based only on our professional expertise and insight require us to advocate the vision to stakeholders in order to secure funding. Services aligned to the University strategy and our professional expertise, but not to the needs of our users, will falter without refinement. Where insight and University strategy align but the appropriate expertise is lacking, the Library must develop the capabilities to respond to emerging trends.

Conclusions

Strategic planning is an art form which evolves over time with continued refinement. Our approach to assimilating library assessment into our planning was developed over many years, building on the successes initially achieved within an individual library team. Through effective advocacy within the Library, insight-informed decision making is now established practice and highly valued by the whole team and the wider University we support.

Practical Implications or Value

This paper provides a practical example of how a library assessment culture has been used to inform library strategic planning. It is one key facet of three within the methodology we apply. Our approach can be applied in a variety of service department contexts, ensuring the insight we gain from our users is at the heart of everything we do.


2:15 p.m.
Designing outcomes for multifaceted organizational planning and assessment: A case study of the logic model framework for the UC San Diego Library organizational renewal (Watch on YouTube; link opens in new tab)
Presented by Erik Mitchell (UC San Diego Library) and Jeffery Loo (UC San Diego Library)

PURPOSE AND GOALS

In 2019, the UC San Diego Library engaged in an organizational renewal activity that included parallel efforts to renew our strategic priorities, recruit new senior leadership, revise leadership portfolios, and evaluate the impact of previous organizational changes for adjustments. To engage in these initiatives in a cohesive way, we needed a framework to connect these planning activities to the intended service outcomes of our organization. This paper describes our use of the logic model planning and assessment framework to define service outcomes that align Library priorities with campus goals, professional trends, and long-term impact areas, and to set up assessment for future organizational development.

DESIGN, METHODOLOGY, OR APPROACH

Logic models are visual planning and assessment diagrams that depict the connections between the work of an organization and the intended results. They are like concept maps that depict an if-then relationship between resources, activities, outputs, and outcomes/impact, i.e., if we have the resources, then we expect to conduct the activities; if we conduct the activities, then we expect to produce the outputs; if the outputs are delivered, then we may realize specific outcomes/impact. Outcomes may be divided by different types of change in our users — specifically in their learning, actions, and conditions. Additionally, outcomes may also be differentiated by time range as short, intermediate, or long term. The logic model framework has been used by governmental, philanthropic, academic, and library organizations. It may be especially useful for aligning the current activities of the organization towards new or revitalized long-term service outcomes and developing assessment methods to inform this change.
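
As a purely illustrative aside, the if-then chain described above can be encoded as a simple data structure so that each outcome traces back to outputs, activities, and resources. The entries below are generic placeholders, not the UC San Diego Library's actual model:

```python
# Illustrative sketch: encoding a logic model's resources -> activities -> outputs -> outcomes chain.
from dataclasses import dataclass

@dataclass
class LogicModel:
    resources: list[str]
    activities: list[str]
    outputs: list[str]
    outcomes: dict[str, list[str]]  # outcomes grouped by time range, each a change in users

model = LogicModel(
    resources=["instruction librarians", "learning spaces"],
    activities=["teach research skills workshops"],
    outputs=["workshops delivered", "students reached"],
    outcomes={
        "short term": ["students learn to evaluate sources"],
        "intermediate": ["students apply skills in coursework"],
        "long term": ["improved student success and retention"],
    },
)

for horizon, changes in model.outcomes.items():
    print(f"{horizon}: {', '.join(changes)}")
```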

FINDINGS

We describe how the logic model approach contributed to a unified framework for coordinating strategic, leadership, and organizational planning during our multifaceted organizational renewal. We focus on three beneficial features of the logic model: the framework is centered on clear outcomes around user change, it connects work activities to organizational impact, and it gathers holistic perspectives in a manageable and structured manner. These features contributed to strategic planning by framing priorities in a resonant way to users and external stakeholders, providing a roadmap for implementation and assessment, and facilitating shared understanding and decision-making. In terms of leadership planning, the logic model approach was instrumental to aligning conversations around mission and strategic priorities, ensuring clear paths between leadership and organizational structures, and providing methods to align leadership with organizational work and operations. Finally, logic models contributed to organizational planning by ensuring organizational structures support service priorities and users’ needs, prioritizing work based on outcomes and resources, and facilitating assessment at an operational and organizational level.

PRACTICAL IMPLICATIONS OR VALUE

The value of logic models for multifaceted organizational planning falls along four themes: (1) operational- and activity-centered strategic planning, (2) mission- and outcome-aligned organizational planning, (3) user impact-focused communication, and (4) contextual and additive assessment to coordinate planning. The underlying common element to these positives is the clearly articulated organizational outcome that is framed around user change and broader campus goals. Such outcomes lay the groundwork for evaluation, continuous improvement, and future organizational development. They also serve as indicators of progress and help identify course corrections in a changing environment. And by anchoring outcomes to long-term campus goals, we have latitude to adjust our organization to suit changing needs and trends. This paper may help the reader engage in multifaceted Library planning through the design of organizational outcomes for planning and assessment.


2:35 p.m.
Like Peas and Carrots: A Case for Assessment and UX Teams (Watch on YouTube; link opens in new tab)
Presented by Emily Daly (Duke University Libraries) and Michael Luther (Emory University Libraries)

Slides (PDF)

PURPOSE AND GOALS

As Assessment and User Experience (AUX) becomes more complex and increasingly integrated into the administrative functions of academic libraries, some institutions have formed small teams to focus on this important work. Drawing from the examples of Duke University Libraries in Durham, NC, and Emory University Libraries in Atlanta, GA, this paper will describe two AUX teams, providing detail on where the libraries’ approaches align and diverge. Both libraries have combined the roles of assessment and user experience (UX), and both define UX broadly, to include web interfaces, physical spaces, services, and collections. Both employ teams of five in service to these functions. Beyond these somewhat superficial similarities, the teams have grown organically, reflecting the goals and priorities of their library and university. One notable difference is that at Duke, the department head developed and expanded the team over time, while at Emory, the department head was hired to lead a newly formed team.

DESIGN, METHODOLOGY, OR APPROACH

The Duke University Libraries AUX department was formed in January 2013 as a User Experience department with just two staff. In August 2014, UX staff advocated for library assessment (organizationally separate from UX at the time) to be combined with UX, and the library hired a new Assessment Analyst and Consultant. Since that time, AUX has continued to increase its capacity and responsibilities, expanding to include multiple student assistants and five full-time staff, including two web developers/project managers, two assessment analysts, and a department head. By combining UX, assessment, and project management in one department, AUX staff are able to approach web, assessment, and UX projects holistically and with UX and project management principles firmly in mind.

While the Emory University Libraries assessment program goes back to 2005, AUX as a team was founded in late 2018 in an effort to bring a more coordinated and strategic approach to these functions. Acknowledging the close relationship between assessment and user experience, a conscious choice was made to combine these roles into a single department and to situate it within public services rather than IT. The team of five consists of an Assessment Coordinator for the Services Division, an Assessment Coordinator for the Libraries more broadly, a Service Design Librarian focusing on UX generally and the Springshare suite in particular, a data analyst focused on collection assessment, and a department head who coordinates the program and seeks to better integrate AUX throughout Emory’s nine libraries.

FINDINGS

While these departmental structures work well at Duke and Emory, there are certainly many ways to approach assessment, UX, and project management. The nature of the work is such that it may be configured in any number of ways and performed by any number of individuals. For instance, in addition to their AUX departments, both Duke and Emory employ cross-departmental teams responsible for overseeing large-scale assessment projects, such as the campus-wide library survey. Many committees and task forces also include a member from AUX, ensuring that assessment and UX perspectives are represented broadly throughout the organization. The fact that UX is not siloed within IT makes it easier to direct the UX focus wherever it is needed. Maintaining flexibility and adaptability can be a huge benefit as one makes a case for expanding AUX principles within an institution.

In addition to describing what works well about Duke’s and Emory’s AUX structures, we will discuss the challenges of the ways we are currently organized. For instance, what does it look like to manage staff who are doing different types of work? What are the common threads that tie this work together, not only in a single department but across the entire library? What are the drawbacks to having such a large department or even multiple teams contributing to assessment and UX efforts? We will share what we have learned in our roles as department heads, drawing on real-world examples, from missteps to best practices.

The presenters will also talk about their future visions of AUX at their respective institutions. Even with a team of five, issues of capacity, priority, and focus arise regularly. A strategic mindset is required if the team is to have a broader impact without the addition of resources.

PRACTICAL IMPLICATIONS OR VALUE

Presentation attendees will learn how two large academic libraries have employed AUX teams to coordinate this complex work. The presentation is intended for a broad audience but will likely be of particular interest to those curious about combining Assessment and User Experience, building AUX teams, or approaching AUX work holistically.


2:55 p.m.
Analyzing Library Expertise: How Gonzaga University’s Foley Library Interrogates Expertise Data Towards Data-Driven Recruitment and Professional Development (Watch on YouTube; link opens in new tab)
Presented by Tony Zanders (Skilltype) and Paul Bracke (Gonzaga University)

Slides (PDF)
Show Abstract

PURPOSE AND GOALS

In February 2018, Gonzaga’s Foley Library became the first organization to begin research and development with Skilltype – a team of library technologists and developers focused on rethinking talent management for the LIS community. Leading a cohort of nearly a dozen academic libraries across the United States, Gonzaga sought to develop sustainable models for analyzing in-house expertise and for using that analysis to make better talent decisions. This paper aims to showcase the community’s findings during two years of research and development, while illustrating what’s possible for libraries, their team members, and their consortia. Readers will learn the challenges of developing a data-driven talent management framework, how to create incentives for data sharing while being mindful of privacy, policy considerations at the individual and organizational levels, and more.

DESIGN, METHODOLOGY, OR APPROACH

At the outset of the partnership, the Foley Library was undergoing a reorganization. As part of this reorganization, a committee focusing on organizational effectiveness was formed, composed of staff from various units within the organization. One of this committee’s projects has been the Skilltype partnership, representing our organization’s views and needs within the broader Skilltype community. Our work in the cohort was divided into two areas of research and development: the technology itself, and the policy that governs its design. Each month, the Skilltype team met with our cohort as part of a feedback loop to solicit input for each release. Live sessions were archived online for later viewing. Representatives from a variety of academic libraries participated, including private and public institutions, and large research institutions as well as small liberal arts colleges. The community worked toward the goal of developing a sustainable solution for organizations of all sizes to analyze their expertise, develop personalized professional development pathways for each staff member, and eventually share this expertise across the community with other organizations. In this paper, we will publish for the first time before-and-after snapshots of how each institution managed expertise analysis within their organization.

FINDINGS

One of our earliest findings was that the descriptive methods used in the library community for competencies and expertise were both fragmented and redundant. This led us to analyze, merge, and deduplicate nine separate core competency frameworks to create a single controlled vocabulary for describing staff skills and interests, producing an inventory of over 500 machine-readable skills and interests across the information science domain and adjacent areas in the Skilltype vocabulary. Another finding was related to staff motivations and incentives. In today’s privacy-conscious age, we had to address questions of data ownership, data privacy, and motivation to share personal data with an employer. A top-down approach in which institutional leadership required staff members to create profiles describing their expertise would result in less engagement, thinner data, and potentially a negative impact on morale. So we collaborated to understand what incentives would encourage voluntary participation while still generating a robust enough data set to create meaningful insights for an organization’s administration. This led us to begin collecting professional development resources from conferences, associations, and sites across the library web, describing them with our vocabulary to produce personalized learning and development feeds for staff. Activity from engaging with learning resources, coupled with expertise data, creates two additional layers of insight for an organization seeking to make data-driven talent management decisions.
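As a rough illustration of the merge-and-deduplicate step described above (not Skilltype’s actual implementation), overlapping terms from multiple frameworks can be normalized onto a single key so that near-duplicates collapse into one controlled-vocabulary entry with provenance. The framework names and terms below are invented.

```python
# Hypothetical sketch only: deduplicating skill terms from several competency
# frameworks into one controlled vocabulary. Framework names and terms are invented.
from collections import defaultdict

frameworks = {
    "framework_a": ["Metadata", "Data curation", "Copyright"],
    "framework_b": ["metadata ", "Data Curation", "Scholarly communication"],
}

def normalize(term: str) -> str:
    """Collapse case and whitespace so near-duplicate terms merge on one key."""
    return " ".join(term.lower().split())

merged = defaultdict(set)  # normalized term -> frameworks that mention it
for framework, terms in frameworks.items():
    for term in terms:
        merged[normalize(term)].add(framework)

# One entry per deduplicated skill, with provenance back to the source frameworks.
vocabulary = [
    {"skill": term, "sources": sorted(sources)}
    for term, sources in sorted(merged.items())
]
for entry in vocabulary:
    print(entry)
```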

PRACTICAL IMPLICATIONS OR VALUE

Early indications show that the development partnership between Gonzaga, Skilltype, and our other partner libraries has produced the beginnings of a data-driven paradigm of talent development for the broader LIS community. One of the first practical implications is that surfacing organizational expertise data can create new alternatives for acquiring talent. For example, rather than defaulting to recruitment as the first option when new expertise is needed, managers have insight into both full and partial skill matches within the organization. A full match could lead to an internal conversation regarding promotion or role change, while a partial match could lead to a conversation about reskilling or upskilling. Having a real-time pulse on organizational expertise and learning activity that uses a shared vocabulary also implies that expertise sharing will become easier. As organizations use the same language to describe their needs, our community will come one step closer to smarter collaboration, better utilization of expertise, greater cost savings, and increased retention.


3:15 p.m.
Assessing the effectiveness of a university library’s strategic initiative to foster data-informed decision making (Watch on YouTube; link opens in new tab)
Presented by Dana Peterman (UCLA Library) and Sharon Shafer (UCLA Library)

Slides (PDF)
Handout (PDF)

Show Abstract
PURPOSE AND GOALS

“Maturity models” aid enterprises in understanding their current and target states. A recently enacted library strategic plan at the University of California, Los Angeles (UCLA) identified an initiative to integrate a culture of assessment throughout the organization to encourage data-informed decision-making processes. As part of its charge, the newly formed Assessment for Change Team (ACT) identified dimensions for an assessment rubric to measure the progress of its initiative and to provide a potential road map for success. The purpose of this study is to determine where UCLA Library is in its assessment maturity. Using a maturity model, we will determine whether the strategy, roles, instruction, methodology, tools, data awareness, and services offered by ACT to library staff resulted in measurable changes in behaviors linked to library goals by answering the questions: 1) What categories of assessment criteria can be measured and analyzed? 2) What evidence shows that library staff adopted the practices and principles of assessment as a process of change? 3) What evidence shows that decision-makers’ use of assessment practices resulted in data-informed decisions?

DESIGN, METHODOLOGY, OR APPROACH

ACT launched a four-pronged approach to inculcate a culture of assessment: initiating workshops and education, developing a planning platform, investigating and promoting data repositories, analyzing the documentation of case studies, and offering consultations. Evidence of enterprise assessment maturity will be measured by collecting data from forms filled out by library staff in the assessment platform and analyzing the staff’s adoption of practices and principles of assessment. Participants in assessment educational activities will be surveyed and interviewed to identify their assessment work and any data-informed decisions that resulted. Documented assessment work, reports, and associated data-informed decisions made between the launch of the strategic plan and ACT’s interventions will be identified and entered into the assessment platform by members of ACT. Finally, library leaders will be asked to participate in a focus group to discuss the potential impact of the planning platform on their decision-making processes.

FINDINGS

Preliminary findings indicate the key role of consultations and mentoring in assessment. Workshops and education, while important, appear to play a supportive or initiating role. The use of planning document templates as a platform provided focus for assessments, but some staff approached them with timidity. Assessment was perceived by some library staff as a punitive, difficult, unrewarded, or low-ranking activity. Nevertheless, upper-level management supported and valued assessment. While complete findings are not yet available, the analysis will focus on determining where UCLA Library is in its assessment maturity.

PRACTICAL IMPLICATIONS OR VALUE

The rubric developed will allow ACT to understand what stage of assessment maturity the library has reached and what needs to happen next to progress. By locating the specific measures that demonstrate the most impact on decision-makers and on those conducting assessments, ACT can concentrate its efforts on activities and tools that encourage data-informed decision making as a valued process. Feedback from interviews will act as a sort of thick description to better shape assessment within the UCLA Library. It is hoped that these efforts could serve as a model for similar library organizations.


3:35 p.m.
ClimateQUAL & people-focused strategic planning (Watch on YouTube; link opens in new tab)
Presented by Susanna Cowan (University of Connecticut) and Lauren Slingluff (University of Connecticut)

Slides (PDF)

Show Abstract
PURPOSE AND GOALS

Parallel institution-level priorities at the UConn Library, a strategic planning framework process and a new emphasis on professional and organizational development, resulted in two major assessment initiatives at the library: the implementation of ARL’s ClimateQUAL survey and data gathering in support of the Library’s Strategic Framework process. The juxtaposition of the two, which ran simultaneously, added richness to both. It enabled us to keep focused on our stakeholders while reflecting constantly on our own organizational strengths and challenges.

DESIGN, METHODOLOGY, OR APPROACH

The UConn Library conducted ClimateQUAL during three weeks in October 2019, while the Strategic Framework Steering Committee simultaneously worked through the data gathering stage of developing a Strategic Framework for the Library. ClimateQUAL, ARL’s Organizational Climate and Diversity Assessment, is a well-regarded and highly vetted survey-based instrument that uses quantitative measures to assess an institution according to various “climates” (scales) measuring organizational climate and attitudes. By contrast, the qualitative methods developed for the Strategic Framework process comprised a survey and a series of conversations based on just three intentionally broad, open-ended questions intended to solicit a wide range of responses from faculty, students, and staff.

ClimateQUAL reflected in its own right a focused commitment to organizational health, but for the purposes of the Strategic Framework process it also provided a benchmark for assessing the culture and climate of the UConn Library and highlighted our particular strengths and challenges. These areas of focus provided insight into what existing strengths we could leverage in our strategic thinking and where organizational health might need to be bolstered in planning for our future. Additionally, we added a range of other data sources to the framework process that bridged quantitative and qualitative data and rounded out our own data, such as ARL and IPEDS metrics and results from the Ithaka S+R faculty survey.

FINDINGS

From the start, it was clear that ClimateQUAL would inform the work of the Strategic Framework process. A decision was made to share the full results of the ClimateQUAL report with the members of the Strategic Framework Steering Committee before the ClimateQUAL executive summary had been drafted or the results shared with staff at large. Discussing ClimateQUAL findings with the group helped frame the results of thematic analysis on our survey and dialogue data. We again bore ClimateQUAL in mind as we crafted the language of the Framework itself, particularly the “Empower” section, which focused on staff diversity, learning, collaboration, innovation, and leadership.

In turn, the Strategic Framework influenced the framing and language of the executive summary of the ClimateQUAL findings. In addition to highlighting three areas for focused improvement, the executive summary underscored that these were strategic areas of focus essential to the library’s forward motion. In other words, the Strategic Framework further grounded the purpose behind ClimateQUAL: to bolster our strengths and address our challenges so that we can do great work. The Strategic Framework provided an articulation of our future as an organization and thus also became a framework for introducing ClimateQUAL to our staff.

PRACTICAL IMPLICATIONS OR VALUE

That our implementation of ClimateQUAL and the launch of the strategic framework process ended up mutually informing each other was initially an accident. It was early in mapping the stages of the strategic framework process that we realized just how critical staff visions of the library would be to that process. As a result, ClimateQUAL became an essential component of the data gathering and reflection stages of the process. It is difficult now to conceive of one process without the other.

Strategic planning requires significant input from key stakeholders. Our Strategic Framework data gathering drew initially from three main sources: existing institutional data, open-ended questioning, and national benchmark data. Adding ClimateQUAL as a powerful quantitative source of data providing a macro view of staff perceptions of our organization was a tremendous boon to the process and added context to the range of responses that emerged from our qualitative work. This paper will make a strong case for why a library might want to intentionally incorporate ClimateQUAL into a strategic planning process from the outset.


3:55 p.m.
Optimizing Staffing through Quantifying Workload per Position (Watch on YouTube; link opens in new tab)
Presented by Jennifer Livingstone (independent Data Analytics Consultant) and Vicki Thompson (Inasmuch Foundation)

Show Abstract

PURPOSE AND GOALS

Staffing decisions are some of the most difficult to make. In library systems with many branches, library managers compete over a finite set of resources – staff. Oftentimes, the managers with the loudest voices and largest spans of influence have their staffing requests approved most frequently. Many times, perception is the primary if not the only guiding factor in staffing decisions. This can result in systems using their most important, and most costly, resources ineffectively and inefficiently.

The paper explains how staffing decisions can be made more effective and data-driven.

Goals:

  • Managers will have data-based evidence to back up and justify staffing decisions, both upward to management and downward in explanations to staff
  • Staffing resources are used more efficiently
  • Staff are given equitable workloads across position types across branches
  • Managers will move away from perception and ‘gut-based’ staffing decisions
DESIGN, METHODOLOGY, OR APPROACH

Staffing decisions often occur at two levels. The first is determining how many public library staff are allocated to each branch; the second is determining which shifts those staff members are scheduled to work. Many times, managers are not equipped with the right tools and data to make these decisions effectively.

To address these challenges, we initiated a system-wide front-line staffing study that quantified total staff workload at each branch by position. The study identified over 30 different variables to measure as inputs. Using these variables, we produced a total workload index by position type for each branch. By comparing these workload indexes to actual FTE allocations, we were able to see which branches had higher estimated workloads per staff member and which had lower. Equipped with this information, management could better reallocate FTE. Additionally, these reports gave administration evidence to show managers that declined or accepted staffing requests were based not on favoritism or neglect, but on equity.
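The study’s 30-plus variables and their weighting are not reproduced here, but the general shape of the calculation can be sketched as follows. The branch names, input variables, weights, and FTE figures are invented, and max-scaling is only one plausible way to combine inputs into an index.

```python
# Hypothetical sketch of a branch workload index and a workload-per-FTE comparison.
# All names, weights, and figures are invented for illustration.
branch_inputs = {
    "Branch A": {"circulation": 120_000, "visits": 80_000, "programs": 150},
    "Branch B": {"circulation": 60_000, "visits": 95_000, "programs": 300},
}
weights = {"circulation": 0.5, "visits": 0.3, "programs": 0.2}  # assumed weights
fte = {"Branch A": 6.0, "Branch B": 4.5}  # assumed current FTE allocations

# Scale each variable against the system-wide maximum so branches are compared
# on a common 0-1 index rather than on raw counts, then apply the weights.
maxima = {k: max(b[k] for b in branch_inputs.values()) for k in weights}

def workload_index(inputs):
    return sum(weights[k] * inputs[k] / maxima[k] for k in weights)

for branch, inputs in branch_inputs.items():
    index = workload_index(inputs)
    print(f"{branch}: workload index {index:.2f}, index per FTE {index / fte[branch]:.3f}")
```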

The second level of staffing decisions occurs locally, in shift scheduling. To provide decision makers with information at this level, we created a series of workbooks within our data visualization platform, Tableau. These were connected directly to our ILS, our door counter software, and our computer reservation system to create automated, regularly updated reports. These allow managers to see actual and average activity levels at different times of the day and days of the week. Managers are now able to see, for every day of the week and hour of the day, the average occupancy, the number of items checked out from different parts of the collection, the number of items returned, and the number of people in different age groups using the computers, along with a variety of other data points – information that is key in helping them schedule their staff most effectively.
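Independent of Tableau itself, the underlying aggregation such dashboards rely on can be approximated as below: event timestamps from the source systems are rolled up into average activity by day of week and hour. The events and the four-week reporting window are invented for this sketch.

```python
# Hypothetical sketch: averaging activity counts by weekday and hour.
# Real events would stream from the ILS, door counters, or reservation software.
from collections import defaultdict
from datetime import datetime

events = [
    (datetime(2020, 1, 6, 10, 5), "checkout"),
    (datetime(2020, 1, 6, 10, 40), "door_in"),
    (datetime(2020, 1, 13, 10, 15), "checkout"),
    (datetime(2020, 1, 14, 14, 2), "pc_reservation"),
]
window_weeks = 4  # assumed length of the reporting period, in weeks

counts = defaultdict(int)  # (weekday, hour, event_type) -> total events
for ts, kind in events:
    counts[(ts.strftime("%A"), ts.hour, kind)] += 1

for (weekday, hour, kind), total in sorted(counts.items()):
    print(f"{weekday} {hour:02d}:00 {kind}: {total / window_weeks:.2f} avg per week")
```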

FINDINGS

A ratio for workload per FTE was developed and revealed that staffing levels were not always equitable across branches. Based on this, recommended changes to FTE at each branch were developed and in some cases incorporated into the next year’s budget.

Activity level data was successfully shared and communicated through Tableau for managers to use.

PRACTICAL IMPLICATIONS OR VALUE

Using the data and recommendations shared, the library system was able to redistribute staffing resources more effectively and create more equity across branches for each position type.

On a local level, the Tableau dashboards helped managers make more fact-based decisions around when to schedule their staff to accommodate both peak and quiet periods.

At the system level, the success of this project and its direct implications also helped create buy-in for using data more in decision-making.


4:15 p.m.–4:25 p.m.: Questions and wrap-up

Moderated by Martha Kyrillidou (QualityMetrics, LLC) and Elizabeth Edwards (University of Chicago)

Wednesday, December 16: Critical/Theoretical Assessment and Space

Watch the full session:

1:00 p.m.–1:15 p.m.: Welcome Remarks

Jackie Belanger (University of Washington) and Maurini Strub (University of Rochester)

1:15 p.m.–4:15 p.m.: Paper Presentations


1:15 p.m.
Showing the Way: A UX project into navigation and wayfinding in 21 libraries (Watch on YouTube; link opens in new tab)
Presented by Frankie Wilson (University of Oxford)

Slides (PDF)
Show Abstract

PURPOSE AND GOALS

This paper presents the results of a large-scale, twelve-month project to investigate the usability issues relating to navigation and wayfinding in 21 of the libraries that make up the Bodleian Libraries, University of Oxford. The purpose of this project was four-fold. The primary purpose was to make it easier for users of the physical libraries to find things within the building – from physical features such as stairs and toilets, to services such as printers and self-issue machines, to books on shelves.

The second purpose of the project was to cascade UX techniques across the libraries, so that staff in the libraries are able to use these techniques for other projects in the future. Thirdly, the project was designed to provide a development opportunity for lower-level, frontline staff to take part in a key strategic project, including all aspects: project management, collaboration, data gathering, problem solving, proposal scoping, presenting, and writing up. Finally, it was hoped that all these aspects would contribute to the embedding of a culture of assessment across the libraries.

DESIGN, METHODOLOGY, OR APPROACH

Following approval through the line management structure, seven project groups were formed, each consisting of one or two staff members from between two and four libraries, self-organised broadly along subject lines.

Each project group followed a UX approach – which melds ‘ethnographish’ (Lanclos & Asher, 2016) and design research techniques. This approach started with investigation of how library users actually behave when they try to locate things in the library, and how this makes them feel. The methods used to elicit this information were: observation – repeated observation of the people in specific ‘navigational decision-point’ spaces at different times; touchstone tours – experienced users of the library give the investigator a tour of the library; and ‘treasure hunts’ – experienced users of the library are asked to take photos of things that meet criteria such as “something confusing” or “something I would show a new student”, then talk through their choices. These were supported by the use of eye-tracking glasses with novice library users, following a standard protocol mimicking the journey to locate a book, photocopy it, and then take it to a specific location (e.g. a group study room). This enabled investigators to literally see through the users’ eyes.

Following the initial data gathering, each project group participated in a workshop to analyse the data, identify the issues, acquire insight into the underlying reasons for the issues, generate ideas about how the issues might be solved, and identify ‘low-hanging fruit’ to develop into the first set of prototypes to try. Each group selected two issues per library and tested two prototypes for each issue. These prototypes ranged from temporary signs made from cardboard boxes stuck to shelving with tape, to hand-drawn maps, to new shelf-end labels, to removing existing signs.

Following a third workshop to review the information gained by prototyping and share learning from the other groups, the groups worked through two more iterations of the prototypes.

FINDINGS

In addition to specific findings for each library (e.g. the best place for the self-issue machine; the most eye-catching location for a sign to the toilets), the project produced some overarching insights into navigation and wayfinding in libraries.

  • Library users navigate differently from staff. One reason is that users do not feel fully comfortable in the space and are sensitive to the impact their presence has on other users, which affects the routes they take.
  • Unless they are very experienced, library users feel anxious and uncertain when navigating the library. They recognise that it is complicated, and therefore need reassurance that they are on the correct ‘track’.
  • Signage cannot make up for poor layout design.
  • Users can cope with more than one classification scheme in the same space if and only if they are obviously different (e.g. Library of Congress and Dewey).
  • Not all information has to be presented at once – maps, signs and navigation aids should be specifically designed for how they will, at the point they are encountered, help the users on their journey.
  • Library users are not psychic – the location name / shelfmark presented on the catalogue has to be an exact match to the location name / shelfmark in the library. They are also unlikely to refer to a compass, so a name such as “the north stairs” is unhelpful. Names that need no explanation (e.g. “upper floor”, “entrance hall”) are least confusing.
  • When it comes to signs, subtle is bad, and less is more.
PRACTICAL IMPLICATIONS OR VALUE

The project has resulted in extensive, specific changes to the navigation and wayfinding in all 21 of the libraries that took part, including new signs and maps, moving collections or services, and changes to the catalogue.

The project has also resulted in a ‘handbook’ of the principles of useful signage, maps, navigation and wayfinding for the Bodleian Libraries, including templates and examples.

In addition, the project engaged 34 front-line staff with UX process, providing them with the skills, experience and confidence to continue user-focused assessment.

Finally, the project demonstrated that library users accept prototyping and experimentation, even in historic libraries, paving the way for this approach in future projects.


1:35 p.m.
Envisioning our Future and Living It: New Library Space and How It Works for Us (Watch on YouTube; link opens in new tab)
Presented by Nancy Turner (Temple University Libraries)

Slides (PPT)
Show Abstract

PURPOSE AND GOALS

Temple University Libraries is participating in a broad assessment effort initiated by the Association of Research Libraries related to how library spaces facilitate innovative research, creative thinking and problem-solving. Here at Temple we are exploring how changes in space impact the work of staff, our work with colleagues and with users.

We had an opportunity to explore these questions when we opened a new main library in August 2019. In semi-structured interviews, we asked staff how they envisioned their work changing.  In the second phase of the project, conducted after our move to Charles Library, we are replicating the interview process.  We are positioned now to consider how our expectations, in terms of opportunities and challenges, played out and how those differences might inform decisions about space going forward. How is the space making an impact on how staff work?

  • As individuals?
  • With our colleagues?
  • With users?
  • What are the opportunities in the new spaces?
  • What are the challenges?
  • What are the ways in which staff are supported in making these transitions?
DESIGN, METHODOLOGY, OR APPROACH

Our research design includes two phases. The first phase consisted of structured interviews with library staff moving to the Charles Library in August of 2019. During the months of June and July, the research team conducted interviews with 29 staff members from all levels of the organization and from diverse functional areas. In addition, we facilitated two open sessions for library staff to confirm the emerging themes with a wider group of staff. Because the project was conceived as a “participatory action” project, participants reviewed the transcripts before they were shared with the research team and had the opportunity to review the final report draft before it was shared externally.

A second round of interviews is taking place now that we have been in the space for one semester. Our questions focus on the similarities and differences that staff perceive in the new spaces – the challenges and opportunities the new spaces provide for working with users, with colleagues, and in doing individual library work.  Because the spaces are still undergoing change, the timing of interviews now provides us an opportunity to explore issues of space as well as change management within organizations.

FINDINGS

Our findings at present relate to the pre-move interviews. Staff concerns centered most on the loss of private offices and the need to schedule rooms for all small meetings and consultations, including those with students. Another potential cost is shared spaces that are noisy and full of distractions. Yet for many staff members, these have always been part of their work environment, and they appreciate the collegiality and fun that can develop in these environments.

Staff did acknowledge that private offices can create isolation from colleagues, and a lack of visibility to students. Staff imagined that the new spaces will provide far more opportunities for interacting with colleagues in informal settings.

In the months prior to the move, the anxiety level of many staff was high. That stress needs to be acknowledged, with safe opportunities provided for open discussion.

The first phase of the research project led to a series of Workplace Norms discussions for staff – useful for many moving from private offices to shared office spaces.

Communication is key as staff prepare for changes. While most recognize that information may not be available, even at the highest levels, and is always subject to change, many cited as valuable the frequently scheduled open chat sessions at which administrators were available to address staff questions.

PRACTICAL IMPLICATIONS OR VALUE

This project was designed as a Practice Brief under the umbrella of the ARL Framework for Library Impact. As a research project of practical use to ourselves and other libraries, we address these questions:

1. As we move into new spaces and initiate new services, can we identify areas for staff to be engaged with assessment practice towards continuous improvement of those new spaces?

2. What are the best practices for supporting staff as they prepare for changes to learning spaces in the library?

3. What are the ways in which the library as an organization (administration, management, peer groups) might utilize this process for surfacing issues related to change and change management as it develops new types of spaces for innovation and creativity?

4. What is the feasibility and effectiveness of using an open, participatory action research process to surface how staff are experiencing the changes in how libraries are organizing space for work with users in new ways? Is this a process that might be useful for other libraries as they explore how change is managed in their organizations?


1:55 p.m.
Modern and Motivating: An exploration of student perceptions of the impact of design on productivity through architectural and library assessment lenses (Watch on YouTube; link opens in new tab)
Presented by Steve Borrelli (Penn State University Libraries), Lawrence Payne (WTW Architects), and Lana Munip (Penn State University Libraries)

Slides (PDF)
Show Abstract

PURPOSE AND GOALS

This study investigates the impact of a recent renovation of an approximately 10,000-square-foot space, converting a news library and café into the Collaboration Commons, a flexible student-centered workspace. The session will present design considerations from the perspective of WTW Architects, including a shift in pedagogical practices increasingly supportive of collaboration, the psychology of color, proxemics guidelines, and flexible infrastructure in support of collaborative activities. It will also present the results of a mixed-methods investigation into student use of the Collaboration Commons and their perceptions of how the space supports their academic work. A secondary goal included exploring sound measurement relative to other similar open study areas in the Libraries.

DESIGN, METHODOLOGY, OR APPROACH

This mixed-methods study draws heavily on data from semi-structured interviews and one focus group session with the Libraries’ Student Advisory Board. The goal of the interviews was to collect multiple accounts of students’ experiences in the Collaboration Commons, from which cross-case themes related to use and perception of the space would emerge. By augmenting interviews with a focus group session, the researchers aimed to bring greater depth to data collection. In addition to the study’s qualitative component, an observation study of occupancy by seat type and of decibel level by location and time of day was conducted for a period of three weeks.

FINDINGS

Students reported that, relative to other modern study spaces available on campus, what sets the Collaboration Commons apart is the availability of resources and the perception that the library has historically communicated the idea of being a place to focus and work. Students described feeling “motivated” in the space, impacted by design elements including spatial configuration, color, lighting, and noise-dampening technologies. Powered tables, flexible furniture, writable work surfaces, and group study rooms equipped with technology were described as enabling them to “get work done.” While the space was designed to support collaboration, only 25% of interviewees reported actively collaborating in it, while 75% reported being engaged in individual work or parallel studying. Observation data demonstrated that the space never topped 50% capacity. However, participants reported that the space often felt “full,” suggesting that furnishing a space for groups of four or more can negatively impact occupancy percentages in practice.

PRACTICAL IMPLICATIONS OR VALUE

The study illustrates how contracting with an architectural firm with a proven track record in designing library and higher education facilities can result in successful spaces that contribute to student motivation and productivity. The finding that less active collaboration is happening in these spaces than perceived shows the impact of furnishing choices when optimizing spaces to support student work. Implications of the findings for future renovations will also be discussed. The results of the study have been shared with the Provost’s Office and university facilities, and used to facilitate library development efforts in addition to multiple award submissions.


2:35 p.m.
Leveraging Library Assessment for Research Evaluation (Watch on YouTube; link opens in new tab)
Presented by Lauren Di Monte (University of Rochester)

Show Abstract
PURPOSE AND GOALS

In this paper I argue that academic libraries should prioritize research evaluation in their suite of research support initiatives, and that library assessment professionals are uniquely positioned to lead this type of service transformation, as well as deliver research support services to clients and stakeholders.

DESIGN, METHODOLOGY, OR APPROACH

I begin this paper by providing a literature review of research evaluation to contextualize its place within higher education, and to develop a definition that can be used by academic libraries for planning and program development. Then I discuss how research evaluation activities intersect with the domain knowledge of library professionals in order to make a case for the strategic value of shifting library research support efforts towards research evaluation. Next, I demonstrate how the skills and knowledge of library assessment professionals are particularly well suited to forming partnerships with research evaluation stakeholders across campus, as well as informing the development of new evaluation- and assessment-based approaches to library research services. Finally, through a discussion of specific research evaluation projects completed at the University of Rochester, I show how the lens of library assessment has broadened our approach to research support and helped us develop a high-impact research intelligence service for the university.

FINDINGS

Research evaluation refers to a set of practices designed to measure the quality of a research organization’s outputs. Universities are increasingly creating research evaluation units to help them understand and strategically position themselves within the global research landscape. Academic libraries are well positioned to support and lead such efforts. Focusing on research evaluation helps libraries leverage the knowledge and the unique and valuable skills of library staff, and helps connect their research support programs to high-impact and high-priority strategic work happening at the enterprise level. Assessment professionals are particularly well suited to this task. Assessment librarians have the knowledge and perspective to identify the needs of stakeholders across the institution, and the skills to design and deliver programs that better align library services with university-wide research evaluation initiatives. Assessment professionals can help libraries add significant value to the cross-functional teams supporting research, and their work and perspectives should be engaged through the entire service lifecycle, not only during post-hoc assessment.

PRACTICAL IMPLICATIONS OR VALUE

Assessment librarians can transform and enrich research support initiatives at academic libraries. Staff in these roles can capture institutional needs and gaps related to research evaluation and use those data to transform research services so that they help make the research enterprise more measurable, and the outputs of that measurement more meaningful across the university. Further, assessment professionals can integrate into cross-functional research support teams to help the library deliver robust research support services that help transition the institution from indicator-focused evaluation towards outcomes-based initiatives.


2:55 p.m.
Undergraduate students’ attitudes about search data privacy in academic libraries: A qualitative research study (Watch on YouTube; link opens in new tab)
Presented by Laura Gariepy (Virginia Commonwealth University Libraries)

Slides (PDF)
Show Abstract

PURPOSE AND GOALS

This paper presents the findings of a qualitative study on undergraduate students’ attitudes about search data privacy in academic libraries, and their preferences for how librarians should handle information about what they search for, borrow, and download. This is an important topic due to the increasingly data-driven nature of assessment in higher education, contrasted with libraries’ professional commitment to privacy which has historically limited the amount of data collected about student library use. Although the literature is rich with evidence of librarians’ commitment to user privacy, very few studies address user perspectives on this matter.

The central research questions that guided this study were:

  1. What are undergraduate students’ attitudes about whether academic libraries should collect and maintain user search data, and why?
  2. What are acceptable and unacceptable uses of students’ library search data according to undergraduate students, and why?
  3. In what ways do undergraduate student attitudes about search data privacy differ in the context of using academic libraries and commercial search engines such as Google?
  4. What do students perceive as the risks and benefits of libraries collecting student search data, and how do these perceptions influence their search behavior?
DESIGN, METHODOLOGY, OR APPROACH

Using a qualitative approach through the lens of interpretive description, I used the constant comparative method of data collection and analysis to conduct in-depth, semi-structured interviews consisting of questions and vignettes with 27 undergraduate students at a large, urban public research institution. Interview transcripts were data-rich and totaled over 700 pages. Through inductive coding, I organized the data into interpretive themes and subthemes to describe students’ attitudes, and developed a conceptual/thematic description that illustrates how those attitudes are formed.

FINDINGS

Students revealed that a variety of life experiences and influences shaped their views on search data privacy in academic libraries. They viewed academic library search data as less personally revealing than internet search data and, as a result, were generally comfortable with libraries collecting search data so long as it is used for their benefit. They were comfortable with data being used to improve library collections and services, but were more ambivalent about the use of search data for personalized search results and for learning analytics-based assessment. Most students expressed a desire for de-identification and user control of data. Some expressed concern about search data being used in ways that reflect bias or favoritism, and many were aware that privacy issues may be more significant for marginalized or vulnerable groups. Most participants had little concern about their library search data being used by government agencies to protect public safety. Although some disagreed with the practice in concept, most did not feel that the search data would be useful, nor would it reveal much about their personal interests or selves. Students who were not comfortable with search data collection in academic libraries often held their convictions more strongly than peers who found the practice acceptable, and often identified as members of vulnerable or oppressed groups.

PRACTICAL IMPLICATIONS OR VALUE

The findings of this study provide some of the first in-depth, exploratory information about student perspectives on search data privacy in academic libraries. The results raise questions useful for shaping future library policies related to privacy. For example, whom should librarians primarily focus on when developing privacy policies: the many, or the few? Libraries should investigate whether privacy policies should be based on the most conservative privacy-related views held by students, even when those views are infrequent, or on the more liberal privacy views held by many. This question, and others raised in light of the results of this study, position librarian-researchers to further this line of inquiry in order to inform libraries’ practices for assessment and evaluation.


3:15 p.m.
Indigenizing Library Spaces Using Photovoice Methodology (Watch on YouTube; link opens in new tab)
Presented by Susan Beatty, Cheryl Jeffs, and Alix Hayden (University of Calgary)

Slides (PPT)
Show Abstract

PURPOSE AND GOALS

The purpose of the study was to explore and understand how Indigenous undergraduate students experience their learning within informal library spaces and other spaces on campus. The results will inform and identify steps that the library might take to make the informal learning spaces more supportive of their learning.

DESIGN, METHODOLOGY, OR APPROACH

The study began in January 2020; it is expected that the data will be collected by March 2020 and that initial analysis will be completed by June 2020. Because the primary researchers are non-Indigenous librarians/researchers, we wanted to conduct the research collaboratively with Indigenous students. We are experts in librarianship and education, but novices in Indigenous ways of knowing. We chose a research methodology that places student voices at the centre of the research: a community-based participatory research (CBPR) framework, which Castleden and Garvin (2008) note has the potential to contribute to efforts to decolonize the university researcher-Indigenous community relationship.

Indigenous students were recruited to act as co-researchers. Julien et al. (2013) note that “due to the unique way Photovoice participants are involved in data gathering, analysis, and sometimes even the planning and dissemination phases of the study, they become researchers in their own right” (p. 259). The students were asked to take photographs of informal spaces in the library and elsewhere on campus that inform questions such as “who am I as a learner?” and to share reflective stories about learning that grow from the photographs. Through a series of workshops designed by the students, we explored the meaning and relationship of space and learning from their point of view. Photovoice is a method designed to explore and uncover individual perspectives. By focussing on the informal spaces where students learn, we uncover and explore the relationship that Indigenous students have to space and learning. Students tell the stories that accompany their photographs, allowing them to explain to the researchers what was really going on in each photo.

To augment the main research question, the researchers conducted both pre- and post-study interviews with the students to discuss their learning and their experiences in the study. The researchers also kept field notes during each workshop to further investigate the Photovoice study process.

A final element in the study is a scoping review of the current literature on learning and learning services, supports and spaces for Indigenous students.

FINDINGS

Although the literature is peppered with suggestions for Indigenizing libraries, such as the personal librarian program for first-year Aboriginal students at the University of Alberta (Farnel et al., 2018), there has been little attention specifically on Indigenous students’ lived experience of learning within academic library spaces. Encouragingly, recent studies have investigated Indigenous students’ experiences and perceptions of academic libraries. Neurohr and Bailey (2016) conducted a photo-elicitation study that explored the role of academic libraries in the lives of Native students. The results focused on tangible issues such as uncertainty about library services (using the collection, signage, and printers facilitating student work). However, the researchers did not investigate the students’ experiences of informal learning within the library spaces. The findings of this study will address the following questions:

1. What is the perceived relationship between space and learning from an Indigenous point of view?

2. What is the value of Photovoice methodology in uncovering students’ perceptions of space?

3. What is the value/learning related to students as co-researchers?

4. What does the literature tell us is the current state of library and learning services, supports, and spaces for Indigenous students?

PRACTICAL IMPLICATIONS OR VALUE

We anticipate providing an authentic exploration of Indigenous students’ learning. Our study will help guide our library, as well as the academy, in Indigenizing learning spaces. We do believe that this research will be transformational for the students and for us as librarians/educators, the Library and the broader academic community. Our project is sustainable as it will bring to the forefront the ways in which Indigenous students learn in informal learning spaces, and will inform future initiatives on informal learning space design.


3:35 p.m.
Mapping Sense of Belonging in Library Spaces (Watch on YouTube; link opens in new tab)
Presented by Ted Chodock (College of Southern Nevada)

Show Abstract
PURPOSE AND GOALS

The purpose of this paper is to present a method to assess physical library spaces’ contribution to student academic success through spatially mapping student sense of belonging (SB). Its relevance for academic libraries emerges from SB’s association with persistence and improved grades for college students, overall, and for those in historically marginalized communities. Through locating SB-related experiences and learning activities in library spaces, it provides rich data connecting in-person library use with learning and academic achievement. These associations may assist academic libraries in aligning physical spaces, practices, and policies with college-wide goals related to student success outcomes and reducing achievement gaps.

DESIGN, METHODOLOGY, OR APPROACH

This paper is based on an ongoing qualitative comparative case study to be completed during spring 2020, which is being funded by a 2019 ACRL Academic Library Impact Research Grant. Its cases are the physical spaces within the main libraries at two urban Southwestern Minority Serving Institutions, a community college and a research university. Primary data consist of observations, student-generated reflexive photography, and photo elicitation interviews with undergraduates. A triangulation dataset includes space planning documents and 10 informant interviews with librarians and library staff.

This study’s theoretical framework centers on sense of belonging, viewed as a psychological, sociocultural, and spatial component of students’ experiences in academic libraries. Research suggests that there are links between informal and situated learning activities typical of academic libraries and SB, including studying alone, communally, or with a diverse group of students, discussing assignments, developing academic-focused student networks, and receiving academic support. In addition, various categories of higher education student experiences, attested in studies located in physical libraries and in other campus locations, have been linked to SB. These include feeling supported, comfortable, or at home, having favorite places, and being oneself. In the library context, examples include receiving support from librarians, which students experience as being cared about; studying by windows, experienced as attention-restoring favorite places; relaxing while engaged in social learning, experienced as feeling at home; and participating in safe spaces or counter spaces, which provide a place for marginalized students to be themselves and form community.

The SB-connected learning activities and experiential categories described above are the focus of this study’s observations and reflexive photography. When analyzed, these data will be used to generate a map of SB-related experiences in each site library’s physical spaces. An initial map layer will be formed by eight hours of unobtrusive observations. Grand tour observations are being used to identify locations with concentrated SB-related activities. These are being followed by library section-specific seating sweeps used to locate particular SB-related learning activities and experiences and to associate them with student demographics. Approximately 30 hours of photo elicitation interviews and 150 student generated images will overlay the data from observations with student perspectives on learning activities and experiences that have taken place in the photographed locations.

FINDINGS

Data analysis is ongoing but will mostly occur over the summer of 2020, with findings not yet available. Findings from a pilot version of this study, conducted at the community college site during the spring of 2019, include that learning activities, often with a social component, centered on group study rooms, and were a key factor in students’ sense of belonging. In addition, receiving support from librarians and library staff, as well as from fellow students, contributed to students’ emotional and social connection to the library as a community. Finally, favorite places, which students associated with views of natural scenes through windows or wall-mounted photographs, had restorative qualities connected to relaxation and SB.

PRACTICAL IMPLICATIONS OR VALUE

Despite a growing interest in the contributions of academic libraries to student sense of belonging, academic library research to date has connected SB neither to student learning and academic success outcomes nor to library space assessment. Filling this gap by employing this study’s assessment method, academic libraries can communicate how library physical spaces contribute to student academic success and make a case for additional resources to remediate library spaces that do not support SB. In addition, by mapping student demographics with learning activities and experiences, libraries can better understand how well, where, and in which ways they support marginalized student communities and make changes toward improving equity. Finally, this study contributes toward an ongoing reconceptualization of academic library physical spaces from locations for resource access to places that nurture learning while fostering community.


3:55 p.m.
Ring of Fire: Assessment Comes Full Circle for Rebuilding K-State’s Hale Library (Watch on YouTube; link opens in new tab)
Presented by Laurel Littrell (Kansas State University)

Slides (PDF)
Show Abstract

PURPOSE AND GOALS

What would you do, as an assessment librarian, if in one afternoon your entire main library suddenly closed for more than a year?

Brief information will be shared about the impact of such a large-scale tragedy, but the focus of the paper is on the importance of building and maintaining a body of useful assessment data that can be quickly incorporated in multiple ways when needed, using library space design as a specific example.

DESIGN, METHODOLOGY, OR APPROACH

On May 22, 2018, a fire broke out in the roof of Hale Library, the main library at Kansas State University. The fire was in a difficult-to-access attic and resulted in a 550,000-square-foot building fully permeated with toxic soot and approximately 650,000 gallons of water. Although no one was injured and most of the collections were ultimately saved, the interior of the building was almost completely destroyed, requiring a nearly complete rebuild and redesign. Assessment opportunities permeated every aspect of rebuilding Hale Library, including collections curation, providing services, and facilities/space planning.

In the previous year, the assessment librarian had conducted a study of several sources of recent assessment data, compiling this information into a unified report of online and physical library usage across user groups and disciplines. Much information about Library as Place had already been gathered in preparation for a renovation of the first floor of the library, to take place in the summer of 2018. Because these plans were already in place, the first-floor project was implemented immediately to enable this part of the library to open as soon as possible. In the meantime, realizing the fire had expanded renovation plans to the entire building, a small team developed a more detailed survey and focus group project to assist in developing a larger renovation plan to meet user needs for library space, access to technology and physical collections, and library and technology services. The results from this project, along with previously compiled information, provided useful information to address needs for study spaces, collections footprints and rearrangement, locations of services, and building navigation.

When the first floor opened in August 2019, a post-occupancy assessment plan was implemented to learn how users responded to the new space, with the idea of applying this information elsewhere in the building. Using Suma (from North Carolina State University Libraries), staff observed building and furniture usage patterns and reported them to architects and designers for further design consideration.
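The actual Suma observation data are not reproduced here, but the basic post-occupancy analysis can be sketched as follows; the furniture categories and counts in this example are invented for illustration.

```python
# Hypothetical sketch: turning repeated seat-occupancy observations into
# utilization rates by furniture type. Categories and counts are invented.
observations = [
    # (furniture_type, seats_available, seats_occupied) per observation sweep
    ("zigzag carrel", 40, 31),
    ("zigzag carrel", 40, 36),
    ("large table", 60, 18),
    ("large table", 60, 22),
    ("lounge seating", 25, 7),
]

totals = {}  # furniture_type -> [total seats observed, total seats occupied]
for furniture, available, occupied in observations:
    seats = totals.setdefault(furniture, [0, 0])
    seats[0] += available
    seats[1] += occupied

for furniture, (available, occupied) in sorted(totals.items()):
    print(f"{furniture}: {occupied / available:.0%} average utilization")
```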

FINDINGS

Consistency among the various sources of data was further borne out by the post-occupancy assessment, providing reliable information for the redesign of the rest of the building. The greatest users of the physical library building are undergraduate students. When the first floor of the library opened, students immediately took ownership of the available space, and the rate of occupancy has been consistently very high. The second floor of the library will open in March 2020, and the occupancy study will be extended to that area to assist with the design of the remaining three floors, scheduled to open in late 2020.

Library User Preferences:

  • Collaboration rooms for small group study
  • A variety of seating configurations available
  • Tables with expansive surfaces
  • Power outlets
  • Whiteboards
  • Private study options such as shielded chairs or individual study rooms
  • Brighter, updated décor and restful colors in the newer wing of the building
  • Restoration to a traditional library setting for the original building

Efficiency and density of seating were important considerations. Users disperse themselves as broadly as possible within a space, but furniture design and placement can encourage a higher percentage of seating utilization.

More people per square foot

  • “Zigzag carrels” with angles and sightlines
  • Small tables with 1-2 chairs
  • Roller chairs with large swing arms

Fewer people per square foot

  • Larger tables occupied by one person
  • Bench seats
  • Lounge seating

PRACTICAL IMPLICATIONS OR VALUE

Demonstrating library value was easy once the library suddenly became unavailable, but the closure created heartbreaking hardship for the university community. The opportunities for reconfiguring services, collections, and the facility are unprecedented and must be fully leveraged. This tragedy demonstrates the importance of keeping current, reliable assessment data from multiple sources at the ready.

Studying patterns of library space usage during both peak and non-peak times helps determine space and furniture needs that can be applied to future renovations. Balancing user comfort with efficiency of space usage can help in designing library spaces that are attractive to users yet accommodate more people comfortably.


4:15 p.m.–4:25 p.m.: Questions and wrap-up

Moderated by Jackie Belanger (University of Washington) and Maurini Strub (University of Rochester)

Thursday, January 21: Measurement and Methods

Watch the full session:

1:00 p.m.–1:05 p.m.: Welcome Remarks
Frankie Wilson (University of Oxford) and Steve Hiller (University of Washington)

1:05 p.m.–5:25 p.m.: Paper Presentations

1:05 p.m.
Assessing library contributions and impact across the research lifecycle: a collaborative approach (Watch on YouTube; link opens in a new tab)
Presented by Fern Brody (University of Pittsburgh) and Jackie Belanger (University of Washington)

Show Abstract

PURPOSE AND GOALS

The purpose of this project, carried out under ARL’s Research Library Impact Framework Pilots Program, is to explore the ways in which the research library community can work together on developing and testing tools for measuring library contributions to researcher productivity.

Teams from University of Washington and University of Pittsburgh are working together to pilot approaches for investigating the impact libraries may have on the research lifecycle. Each institution will examine a different aspect of the research lifecycle: discovery (Pitt) and research impact (UW). Both partners will focus on the experiences and practices of early career researchers within STEM-related and health science disciplines.

The aim of the project is threefold:

  1. To provide feedback to ARL on the effectiveness of the collaborative approach to impact assessment work (based on the experiences of the team).
  2. To develop tools for measuring library contribution and impact that can be used across a range of research libraries.
  3. To provide each institution with actionable assessment data to inform service development and improvement.

DESIGN, METHODOLOGY, OR APPROACH

Each institution will focus on a specific stage of faculty research using semi-structured interviews. The University of Pittsburgh will conduct interviews with early-career engineering and basic sciences researchers to investigate the role of library resources in the information-seeking behavior of researchers in the discovery phase. The University of Washington will conduct interviews with post-doctoral researchers and early-career faculty in selected STEM and health sciences departments to understand their current needs for demonstrating the impact of their research and to explore the Libraries’ contribution to this phase of the research lifecycle. In both cases, interviews are designed to understand the impact of any current services, learn about researcher needs in order to develop future services that could have an impact on research productivity, and explore how users understand and define library value in terms of research support. Each institution will conduct its data collection phase in Spring/Summer 2020, with a view to completing all analyses and sharing findings over the Fall 2020 term.

The two institutions are collaborating throughout all stages of the project to identify target audiences, develop data collection tools, obtain IRB clearances, develop data analysis methods and to discuss the effectiveness of their respective approaches to their individual projects. The aim is to develop a set of tools that could be applied at other institutions and with other disciplinary populations to help libraries more effectively assess their contribution to faculty productivity at various stages of the research lifecycle.

FINDINGS

While results from the interviews will be in hand by October 2020 and briefly highlighted in this paper, the focus of the conference paper will be on discussing how the two institutions collaborated to develop an overarching shared research question, methodologies for understanding library contribution to researcher productivity, and an approach to impact assessment that can work across various libraries. The library research services (discovery and research impact) selected for this project are in different stages of maturity within the library landscape and we are particularly interested in developing and testing tools that will indicate if there are differences in researchers’ understanding of the libraries’ role in providing these services.

PRACTICAL IMPLICATIONS OR VALUE

The findings of this project will

  1. Inform ARL’s future work on the development of indicators of impact and contribution.
  2. Provide each institution with tangible data to act on in considering development or changes to existing services.
  3. Provide a set of assessment/data collection tools for use by other institutions.

1:25 p.m.
Where do we go from here: managing collection moves with data visualization (Watch on YouTube; link opens in a new tab)
Presented by Anne Koenig (University of Pittsburgh) and Berenika Webster (University of Pittsburgh)

Slides (PPT)
Show Abstract

PURPOSE AND GOALS

Hillman Library, the main library at the University of Pittsburgh, is undergoing a multi-year, multi-floor renovation. Collections are being moved and shifted throughout the building and to our high-density storage facility as floors within Hillman Library are closed and others are redesigned with significantly less shelf space. Our goals were twofold: to model various selection criteria for what collections would remain in the reinvented Hillman Library, and to assist our users with finding collections that were moved to temporary or new permanent locations within the building.

DESIGN, METHODOLOGY, OR APPROACH

To model selection criteria, we created a Tableau visualization based on inventory reports which included filters for call number ranges (LC Class), historic circulations, date of last circulation, and publication dates.  Estimated linear footage calculations were incorporated into the visualizations, so we could test and model the impact of different selection criteria on the size and scope of the collection footprint.
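As an illustration of the selection-criteria modeling described above, the following Python sketch estimates the shelf footprint of a retained collection under different criteria. It is not the authors' Tableau workbook; the inventory column names, the inches-per-volume figure, and the shelving total are hypothetical.

    import pandas as pd

    inventory = pd.read_csv("inventory_report.csv")  # hypothetical: one row per volume
    INCHES_PER_VOLUME = 1.2                          # assumed average shelf width per volume

    def footprint_linear_feet(df, min_circs=0, last_circ_after=None, pub_after=None, lc_classes=None):
        """Estimate linear feet of shelving needed for volumes kept under the given criteria."""
        keep = df
        if lc_classes is not None:
            keep = keep[keep["lc_class"].isin(lc_classes)]
        keep = keep[keep["total_circulations"] >= min_circs]
        if last_circ_after is not None:
            keep = keep[keep["last_circulation_year"] >= last_circ_after]
        if pub_after is not None:
            keep = keep[keep["publication_year"] >= pub_after]
        return len(keep) * INCHES_PER_VOLUME / 12.0

    # Compare candidate retention policies against the shelving available in the plans.
    available_shelving = 18000  # hypothetical linear feet from architectural plans
    scenarios = {
        "circulated since 2010": footprint_linear_feet(inventory, last_circ_after=2010),
        "2+ circulations, published after 1990": footprint_linear_feet(inventory, min_circs=2, pub_after=1990),
    }
    for label, feet in scenarios.items():
        print(f"{label}: {feet:,.0f} ft needed vs. {available_shelving:,} ft available")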

To assist users in locating collections, we developed an interactive map in Tableau. Users select the call number/subject area they wish to find, and the location of the item/collection is highlighted on the map. This interactive visualization is publicly available via a link from our website and on iPad displays; it is updated as collections move.

FINDINGS

Our AUL for Collections and Technical Services and our Collections Move Management Group have used the selection criteria modeling tool to shape the scope of the collection in the Main Library to fit in the allotted space and meet the needs of our users by keeping the most relevant material in the building and sending items in less demand to our high-density storage facility. The visual aspect of the tool enabled more streamlined interactions with the data. It allowed for combining data from inventory reports (size of collection) and architectural plans (linear footage of available shelving), replacing massive Excel spreadsheets with easy-to-understand visual representations of the size of collections and shelf space, and allowed for quick modeling based on selected criteria, such as LC class, publication year (ranges), volume of circulations, and dates of last circulation.

For several years now, staff and patrons have used our interactive map to locate collections.  The visualization is easy to update and maintain, and changes can be made much more quickly than to permanent/physical building signage.

PRACTICAL IMPLICATIONS OR VALUE

The methodologies that we used could easily be replicated by other libraries coping with large collection moves and renovation projects as well as shifting collection priorities and weeding projects.


1:45 p.m.
Search, Report, Wherever You Are: A Novel Approach to Assessing User Satisfaction with a Library Discovery Interface (Watch on YouTube; link opens in a new tab)
Presented by Rachel Vacek (University of Michigan), Ken Varnum (University of Michigan), and Craig Smith (University of Michigan)

Slides (PDF)
Show Abstract

PURPOSE AND GOALS

In the summer of 2018, the University of Michigan Library launched a new discovery interface, Library Search (https://search.lib.umich.edu), for discovering the library’s resources, collections, spaces, and expertise. Following our assessment plan for Library Search, we have iteratively measured the success of Search since its launch across a variety of measures using a mixed-methods approach. We have tested for accessibility, monitored system performance, and conducted considerable usability testing as part of the design and development process. In general, our assessments have indicated that Search works well according to many of our metrics, including accessibility, usability, design, and the general pattern of how users search for items, narrow result sets, and access materials.

There have been, however, a number of concerns about the catalog search in particular, and a general anecdotal sense that this important part of the interface is not quite meeting users’ needs. Therefore, we created a tool that we could use to first get a baseline measure of overall user satisfaction, and second, use again after we have made changes to Library Search to understand the degree to which our changes improved satisfaction.

Our paper will report on the launch of our novel data collection tool and its use with multiple groups of stakeholders on campus. We will also detail how we plan to use the tool to gauge user satisfaction over time and to regularly gather actionable data on the strengths and shortcomings of our catalog interface.

DESIGN, METHODOLOGY, OR APPROACH

An initial round of data collection focused on the search experiences of library employees whose work involves using Library Search to assist members of our campus community. Focusing on this group allowed us to survey people who had clear expectations of how catalog searching should function for library staff and users, and also to ensure that our data collection tool worked well before we used it to collect data from large numbers of faculty members and students on campus. The first full round of data collection, with faculty and students, is taking place in the winter of 2020; the results will be included in our paper and presentation.

The invitation to participate in the initial round of data collection was sent in December 2019 to 96 Library employees. As an incentive, participants were given a chance to enter a drawing for one $50 gift card. Participants took part in the study online, via a survey on the Qualtrics platform. Forty people provided enough data to be included in some analyses (a 41% response rate); of those, 36 completed the whole survey. The final sample had decent variability with regard to respondents’ library division and the number of years they had worked in the library.

A unique aspect of our data collection approach was to ask participants to conduct searches, to report on those searches, and to share the URLs associated with their search results. This allowed data from the survey to be interpreted while seeing exactly what participants were seeing as they did their searching. Thus, the first three sections of the survey asked participants to keep the survey tab open in their browsers while conducting specific types of catalog searches in a separate tab. The types of searches — known item, known set, and exploratory — were derived from another recent investigation of our library search interface. Participants used the Qualtrics survey to answer questions about those three search experiences. Specifically, participants were asked to report on their satisfaction with the relevance of results, the speed with which results were returned, and the adequacy of various pieces of information contained in item records. When participants encountered unexpected results in their searches, they were given an opportunity to share more about what they expected to see, in relation to what they saw.

A final section of the survey asked people to provide more global ratings and comments related to recent uses of Library Search (not limited to catalog searching; this could also include focused searches for articles, databases, etc.). For those who remembered using Search a year prior, a small set of questions also asked people to compare their current satisfaction with Search to what they remember feeling a year ago. These final questions, about recent experiences and comparisons to a year ago, were answered by most participants.

FINDINGS

We asked questions about three broad areas of search interactions: known item searches, known set searches, and exploratory searches. Known item searches are for specific, individual items. Known set searches are for collections of items (plays by Shakespeare, sonatas by Mozart, jazz CDs, etc.), from which the searcher would be more or less satisfied with any specific item. Exploratory searches are subject or topic-related.

For known item searching, respondents were asked two questions about what they saw in the results: did the item appear in the results as expected, and was the position of the item in the result set satisfactory. Seventy-five percent saw the item in the results as expected, and the majority (92%) were either very or moderately satisfied with the position.

We asked several additional questions for known item searches (we did not ask these questions for the other search categories, as we felt the responses would not be substantially different). When asked about the speed of search results, most (95%) expressed some level of satisfaction. In terms of ease of determining availability of print or online access to the items found, most were moderately or very satisfied (85%) but a notable minority were dissatisfied. And most people were moderately or very satisfied (86%) with identifying where physical items were located, with a notable minority expressing dissatisfaction.

For known item searches — and for the other two search types — participants who saw unexpected search results were asked to share what they expected to see, and what they did see. Comments touched on concerns such as the relevance of results and the way that holdings were displayed. These findings serve as guides for the continued fine-tuning of Library Search. In the paper we will present examples of how such comments were paired with recreated searches in order to guide the work of our developers.

For known set searches, just over half (58%) of the 36 participants who did this search saw what they expected; respondents were satisfied with the ranking of the results, split roughly evenly between very and moderately satisfied. For exploratory searches, just over half (56%) saw what they expected in the results. Most (80%) were moderately or very satisfied, and a sizable minority (20%) reported some level of dissatisfaction. As noted, where participants saw unexpected results, they provided comments that shed additional light on their closed-ended survey responses.

In the final section of the survey, participants were asked about their recent experiences with Search (not limited to Catalog Search), and their views on whether Search has improved or not compared to a year ago (for those with memories of Search at that time).

Thirty-three participants had used Search within the previous two weeks. Of these, roughly three-quarters were moderately or very satisfied, with the rest expressing dissatisfaction. When asked about their satisfaction with the recent relevance of Search results, few were very satisfied (12%); most were moderately satisfied, and 30% expressed dissatisfaction. The same results were obtained when people were asked to rate their overall level of satisfaction with their recent experiences with Search.

When asked to compare their current satisfaction with the speed of Search with what they remember from a year prior, most (81%) were somewhat or much more satisfied currently. When asked to compare their current satisfaction with the relevance of Search results with what they remember from a year prior, 72% were somewhat or much more satisfied currently. Finally, when asked to compare their current overall satisfaction with Search compared to a year ago, 82% were somewhat or much more satisfied.

PRACTICAL IMPLICATIONS OR VALUE

This study provides an example of how libraries can use an online data collection tool to reach key stakeholders when evaluating satisfaction with a web-based library interaction. The general categorization of “known item”, “known set”, and “exploratory” searches, itself based on user research into kinds of searching, could easily be extended to other kinds of library interactions in which a user is seeking something. The general method of the survey allows disintermediated user research to take place, with the efficiency of gaining detailed user feedback about specific interactions without the investment of a commensurate amount of staff time. Another key advantage of our methodology is that it facilitates the repeated evaluation of the search interface over time. The inclusion of both library staff and campus users enables us to identify high-priority issues via staff insights and also to understand how users with a wide range of search expertise experience a core library discovery interface.


2:05 p.m.
From Analyzing Abundant Data to Identifying Actionable Steps: A Closer Look at Library Student Data and Success (Watch on YouTube; link opens in a new tab)
Presented by Mariya Gyendina (University of Minnesota Libraries), Jan Fransen (University of Minnesota Libraries) and Kate Peterson (University of Minnesota Libraries)

Slides (PDF)
Show Abstract

PURPOSE AND GOALS

This presentation follows up on the previous work in the field establishing connections between students’ use of libraries and student success measures such as GPA and retention (e.g. Soria, Fransen & Nackerud, 2014; Soria, Fransen & Nackerud, 2017; Nackerud, Fransen, Peterson & Mastel, 2013). Critiques of this body of work (e.g. Robertshaw & Asher, 2019) suggest that the correlations between students’ use of libraries and their GPA have low to non-existent effect sizes. This critique prompted a deeper analysis of more specific library usage types and the distinct populations that use them, since we know that students from different backgrounds experience college in disparate ways and have varying needs.

Analyses of Fall 2011 first-year undergraduates showed correlations between use of library resources and typical success measures (GPA, retention). Effect sizes were small when compared to predictors such as ACT score and college credits earned while in high school, but still significant. The studies led to different kinds of outreach efforts, and by Fall 2016, 88 percent of undergraduates used the library, compared to 77 percent in Fall 2011.

Buoyed by our belief that using libraries helps students attain their goals, we sought to do more with what we have. The rate of library use is now so high overall that looking for correlations, while controlling for the many other factors that influence student success, has little value. Instead, we shifted our focus to specific populations, defined by characteristics such as college of enrollment and first-generation status. For example, if data show that first-generation students in the college of biological sciences are less likely to use the digital resources, and that those students in that cohort who DO use digital resources are more successful, we can devote time and effort to reaching those students. Yet, how can we efficiently analyze all of these potential characteristics?

DESIGN, METHODOLOGY, OR APPROACH

Using multiple demographic and institutional indicators, we took an iterative approach to building ANOVA models with one, two, and three predictor variables. This allowed us to estimate correlations and examine effect sizes and eta-squared measures, giving a more nuanced understanding of how important specific correlations are. This approach follows the blueprint established in the previous round of analysis (Gyendina and Fransen, 2019). We started by creating one-way ANOVA models based on focused subsets defined using three demographic and institutional variables: for example, bringing together characteristics like age, college (e.g., science and engineering), and whether the student had received library instruction, to see if there is a meaningful correlation.

After analyzing the preliminary results, we identified key areas with the highest potential for actionable steps and developed relevant three-way ANOVA models.
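For readers unfamiliar with the approach, the following Python sketch shows what a one-way ANOVA with an eta-squared effect size might look like for a focused subset of students. The table layout, variable names, and subset definition are hypothetical, not the University of Minnesota data.

    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    students = pd.read_csv("student_library_use.csv")  # hypothetical: gpa, college, first_gen, used_digital_resources

    # Focused subset defined by demographic/institutional variables.
    subset = students[(students["first_gen"] == 1) & (students["college"] == "Biological Sciences")]

    # One-way ANOVA: does GPA differ by digital-resource use within this subset?
    model = smf.ols("gpa ~ C(used_digital_resources)", data=subset).fit()
    table = anova_lm(model, typ=2)

    # Eta-squared: proportion of GPA variance associated with the predictor.
    eta_sq = table.loc["C(used_digital_resources)", "sum_sq"] / table["sum_sq"].sum()
    print(table)
    print(f"eta-squared = {eta_sq:.3f}")

    # A three-way model simply adds predictors, e.g.:
    # smf.ols("gpa ~ C(used_digital_resources) + C(had_instruction) + C(age_group)", data=students).fit()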

FINDINGS

We present the relevant models with moderate to high effect sizes and corresponding actionable steps we are proposing. The presentation also describes the process we used and invites the audience to consider this methodology.

PRACTICAL IMPLICATIONS OR VALUE

The presentation invites the audience to consider the potential use of data to communicate with college stakeholders.


2:25 p.m.
15-minute break


2:40 p.m.
Have library users’ expectations decreased? A statistical comparison of LibQUAL+ scores for Affect of Service, Information Control, and Library as Place over the past two decades (Watch on YouTube; link opens in a new tab)
Presented by Martha Kyrillidou (QualityMetrics LLC), Giovanna Badia (McGill University), and Colleen Cook (McGill University)

Slides (PDF)
Show Abstract

PURPOSE AND GOALS

Most members of the Association of Research Libraries spend the largest portion of their budgets on collections, as shown in the 2017-2018 ARL Library Investment Index. While academic library budgets focus on collections, a recent report by the Ohio Library and Information Network calls for a focus on user needs rather than collections, since the population being served aims to find and access a variety of sources beyond the holdings of their local libraries (Evans & Schonfeld, 2020). This is reinforced by the majority of users starting their search in Google or Google Scholar instead of the library. The report also mentions that “user expectations of finding, accessing, tracking, and using content and services are increasingly driven by consumer technology product experiences that have dramatically re-centered on the user, for example, Amazon Prime memberships and delivery, streaming media, Uber, …” (Evans & Schonfeld, 2020, p. 16). With the shift to increased user-centeredness in technology applications, have individuals’ expectations of the library changed over time? More specifically, do users perceive specific library functions, such as collections, reference services, or physical spaces, as being more important now than they did in the past? If so, is a particular generation or group of users driving this change? Since changes in users’ expectations have the potential to affect library strategic directions and budget allocations, the authors sought to answer these questions by comparing the LibQUAL+ survey results of libraries that have administered this standardized survey multiple times since the early 2000s, with their most recent survey administration being in the past 5 years.

DESIGN, METHODOLOGY, OR APPROACH

LibQUAL+ results were used to determine changes in library users’ expectations over time since respondents are asked to assign one score for each statement in the survey, from lowest to highest on a 9-point scale, for the minimum service level they find acceptable, the level of service they personally desire, and the level of service they believe is being provided. Statements in the LibQUAL+ survey address “Affect of Service” (about interactions between library staff and users), “Information Control” (about access to resources) and “Library as Place” (about physical spaces). Differences between LibQUAL+ scores for each question over time were analyzed to determine whether the variation was statistically significant. The authors studied differences between scores for LibQUAL’s three categories of questions (i.e., Affect of Service, Information Control, and Library as Place), both for individual libraries and among different libraries, as well as examined whether rises in one category were predicted by changes in the other(s).
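A minimal sketch of this kind of over-time comparison, assuming a hypothetical table of institution-level dimension means, might use a paired t-test across institutions; the file layout, years, and column names below are assumptions, not the LibQUAL+ data model or the authors' analysis.

    import pandas as pd
    from scipy import stats

    scores = pd.read_csv("libqual_dimension_means.csv")  # hypothetical: institution, year, dimension, desired_mean

    dim = "Information Control"
    early = scores[(scores["year"] == 2004) & (scores["dimension"] == dim)]
    recent = scores[(scores["year"] == 2019) & (scores["dimension"] == dim)]

    # Pair each institution's early score with its most recent score, then test the shift.
    paired = early.merge(recent, on="institution", suffixes=("_2004", "_2019"))
    t_stat, p_value = stats.ttest_rel(paired["desired_mean_2004"], paired["desired_mean_2019"])
    print(f"{dim}: 2004 mean = {paired['desired_mean_2004'].mean():.2f}, "
          f"2019 mean = {paired['desired_mean_2019'].mean():.2f}, t = {t_stat:.2f}, p = {p_value:.4f}")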

FINDINGS

Preliminary findings indicate that users’ expectations of collections have decreased over time in some institutions, while their expectations have increased in other areas, such as the library as place. The idea of libraries as infrastructure has been the focus of much architectural work in recent years (Mattern, 2014). The authors will discuss the extent to which this phenomenon and other observed changes are widespread.

PRACTICAL IMPLICATIONS OR VALUE

Changes in library users’ expectations over the past two decades, especially in regard to collections, will either call into question or reinforce the traditional role of libraries as content assimilators and providers. Knowing which areas have seen increased user expectations will help library administrators with strategic planning and allocation of funding.


3:00 p.m.
What Does It Take? Evidence-based strategies for student survey engagement (Watch on YouTube; link opens in a new tab)
Presented by Nicole Betancourt (Ithaka S+R)

Slides (PDF)
Show Abstract

PURPOSE AND GOALS

The risks of not understanding and effectively engaging with any target population when conducting research are clear: low response rates and skewed sample parameters can introduce bias and error, decreasing generalizability and representativeness.

In fall 2018, we surveyed students across seven community colleges to assess the value of and demand for proposed services designed to address student goals, challenges, and needs. As part of this research initiative, we concurrently examined strategies for increasing survey participation and subsequently generated recommendations for creating survey communications, distributing surveys via email, and determining effective incentives.

DESIGN, METHODOLOGY, OR APPROACH

We specifically explored in depth the effectiveness of using different student email address types in outreach, including both institutional and alternative email addresses. We also A/B tested Amazon and Visa gift cards in a lottery incentive to assess which incentive was more likely to resonate with students and therefore maximize response rates. While testing both of these interventions, we also drew on a suite of outreach strategies that we have refined over the past two decades in our long-running national faculty survey as well as our local surveys of faculty and students.
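One common way to analyze an A/B test of incentives like this is a chi-square test on response counts by arm; the Python sketch below is illustrative only, and the counts are placeholders rather than the study's data.

    from scipy.stats import chi2_contingency

    # Rows: incentive arm; columns: [responded, did not respond]. Counts are placeholders.
    contingency = [
        [210, 1790],  # Amazon gift card arm (hypothetical)
        [185, 1815],  # Visa gift card arm (hypothetical)
    ]
    chi2, p_value, dof, expected = chi2_contingency(contingency)
    print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")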

FINDINGS

Above all, local context matters; there often just isn’t a one-size-fits-all conclusion for every institution. However, there are clear steps that colleges can broadly take to understand their context and employ strategies to drive survey response rates, including requesting information from institutional databases early; meticulously cleaning lists of survey invitees; contacting survey invitees at key times of the day/week; personalizing survey communications; and strategically selecting signatories for survey messaging.

Additionally, in testing different student email address types, we found a substantial degree of variation across each of the partner colleges on this project, indicating that each has unique challenges and opportunities for contacting students in communications for surveys and beyond.

When examining rates of response among the two different prize types, we found that identifying an incentive that resonates with students without influencing responses in a biased fashion is important in bolstering results. In our session, we will discuss how to determine an effective incentive to offer students for participating in a research project.

PRACTICAL IMPLICATIONS OR VALUE

Implementing this large-scale survey allowed us both to utilize existing best practices developed for other higher education communities and to gain greater insight into newly tested approaches. We particularly found that taking a proactive approach in the early stages of a project helps ensure a successful survey administration. Additionally, we highly recommend gathering information on whether students primarily use an institutional or alternative email address, as participation can vary depending upon which approach is used. It is also advisable to determine an incentive that resonates with students without influencing responses in a biased manner, as this can help to increase response rates. Our aim in sharing these findings is to continue to uncover and improve upon survey administration practices for surveying community college students, students more broadly, and the greater academic community.


3:20 p.m.
Secret Shopping as a Method to Understand User Experience: A Case Study (Watch on YouTube; link opens in new tab)
Presented by Grace YoungJoo Jeon (Tulane University)

Slides (PDF)
Show Abstract

PURPOSE AND GOALS

In 2019, the Tulane Libraries conducted a user experience study that employed a secret shopping method at four service points: the Circulation Desk, Research Help Desk, and Media Services Desk at Howard-Tilton Memorial Library (i.e. main library), and the Information Services Desk at Rudolph Matas Library (i.e. health sciences library). The goal was to understand how Tulane students experience the libraries at different service points. Specifically, this study was intended to document users’ experience with current library services, provide benchmark data, and provide inputs for refining or developing training for library employees if necessary.

DESIGN, METHODOLOGY, OR APPROACH

This study applied the concept of secret shopping, a research method that evaluates customer service by having trained people act as customers and rate their experience based on given criteria, to the library setting. Participants posed as patrons and asked library employees questions based on assigned scenarios. Once participants completed each scenario, they submitted online evaluation forms to provide information about their experience with the service point they visited/contacted.

The author investigated previous work that employed the secret shopping method in the academic library setting to develop effective data collection instruments. As part of this process, the author attended the Library Assessment Conference 2018 and spoke with colleagues from Texas State University who had years of experience conducting secret shopper assessments. This study’s instruments were developed based on samples shared by these colleagues.

The study included three phases: (1) analysis of service points’ transaction data, (2) development of scenarios, and (3) implementation of secret shopper assessment. First, an analysis of transaction data for the past three years (2016-2018) obtained from Ref Analytics in LibAnswers was carried out to understand formats and topics of questions student users commonly asked, resulting in the development of 45 scenarios. These included various combinations of question formats (i.e. in-person visit, phone call, chat, and email) and question topics (i.e. check-out, directional, literature search, reference including research help, resource, technology help, and miscellaneous), depending on the service point.
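As a rough illustration of the first phase, transaction records like those exported from LibAnswers could be summarized by service point, format, and topic to surface the most common question types as raw material for scenarios; the file name and column names below are assumptions, not the Tulane data.

    import pandas as pd

    transactions = pd.read_csv("ref_analytics_2016_2018.csv")  # hypothetical: service_point, format, topic

    # Most common combinations of format and topic at each service point.
    counts = (
        transactions.groupby(["service_point", "format", "topic"])
        .size()
        .rename("n")
        .sort_values(ascending=False)
    )
    print(counts.head(20).to_string())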

In implementing the secret shopper assessment, participants first attended an introductory session to ensure that they all had a common understanding of the study procedure. Each was given three randomly assigned scenarios and two weeks to complete them. They could visit or contact the service point at any time or day of the week during the service point’s hours. After completing the scenario, participants submitted an online evaluation form to provide information about how they perceived the customer service at the service point they visited/contacted based on their interaction with the library employee. The form asked about participants’ experience at the service point, such as their satisfaction with and comprehension of the answer provided and their perception of approachability and attentiveness of the library employee. Participants who completed all tasks received a $25 Amazon gift card for their participation.

FINDINGS

Twenty-one undergraduate and graduate students participated. The results show that overall, participants found their experience pleasant and positive. The majority reported that they were satisfied with the answer provided regardless of the mode of interaction and location of service points. Participants felt that the library employees treated them respectfully, were attentive and approachable, and answered their question in layperson’s language. Although the sample is small, a few participants who interacted with the library employees via means other than face-to-face contact reported some dissatisfaction with their experience, including a lack of contextual cues in phone interaction, lack of personalization in an email response, and difficulty in connecting to a chat.

The results allowed the Libraries to identify several ways to improve user experience at these service points, including reminding users at the end of each interaction that they could turn to the libraries for additional help, and refining ways to provide non-face-to-face interactions (i.e. phone, chat, and email).

PRACTICAL IMPLICATIONS OR VALUE

This proposal will demonstrate the reproducibility of secret shopping as a user experience study method in the academic library setting. Secret shopping has been used in many libraries to understand user experience and evaluate library services. However, in prior published work, relatively little information has been provided regarding the development of data collection instruments. Our study shares details about designing and developing data collection instruments to show that secret shopping can be a robust user experience research method in the academic library setting. Furthermore, this study indicates that sharing first-hand experiences among librarians in designing and implementing assessment activities offers one way to build effective and sustainable assessment.


3:40 p.m.
Unraveling the (search) string: Assessing library discovery layers using patron queries (Watch on YouTube; link opens in new tab)
Presented by Robert Heaton (Utah State University) and Liz Woolcott (Utah State University)

Slides (PDF)
Show Abstract

PURPOSE AND GOALS

Discovery tools are inherently complex. Users bring expectations and mental models from their experience with search engines, librarians often come from a different background of library catalogs and databases, and vendors and technical staff must have new and growing expertise in order to configure discovery services and interrelated tools. Anecdotes from each of these stakeholder groups can muddy important conversations and decisions about discovery. In a period of high uncertainty about the future availability of our discovery tool, Encore Duet, we hoped to move forward with a thoughtful approach to potential alternatives. Intense vertical integration in the industry limits competitive options and requires libraries to define their priorities carefully when assessing software. As librarians in two technical-services units, we devised a systematic method for capturing user behaviors in our current discovery tool.

DESIGN, METHODOLOGY, OR APPROACH

Our integrated approach centered on analyzing the “found data” of search logs, where we looked for patterns in user search strategies, their interactions with interface features, and the effectiveness of the results returned, including the relative prevalence of book and article results. We extracted search query data from Google Analytics on three separate occasions and coded the queries to determine use of faceting, search repetition, and known-item searching. These methods draw on real-world data that offer more accurate insight into users’ behavior than self-reports gathered through surveys, interviews, or focus groups. Similar methods (a mixture of automated and manual processes) can be applied to any discovery tool or search engine. Our library has a rich recent history of usability testing, to which our research adds a much larger body of data, allowing us to identify authentic trends among our user population. These strategies equipped us with key facts about our users and key questions to consider when a potential migration approaches.
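As an illustration of how parts of this coding could be automated, the Python sketch below flags facet use and repeated/adjusted queries in a generic search-log export. The column names, the facet= URL convention, and the term-overlap heuristic are assumptions, not the authors' actual coding scheme.

    import pandas as pd

    log = pd.read_csv("search_log_export.csv", parse_dates=["timestamp"])  # hypothetical: session_id, timestamp, query, page_url
    log = log.sort_values(["session_id", "timestamp"])

    # Facet use: assume applied facets show up as a "facet=" parameter in the results URL.
    log["used_facet"] = log["page_url"].str.contains("facet=", na=False)

    # Repeated/adjusted query: a later query in the same session sharing words with the previous one.
    def shares_terms(a, b):
        return bool(set(str(a).lower().split()) & set(str(b).lower().split()))

    log["prev_query"] = log.groupby("session_id")["query"].shift()
    log["adjusted_query"] = [
        pd.notna(prev) and shares_terms(q, prev)
        for q, prev in zip(log["query"], log["prev_query"])
    ]

    print(f"queries using facets: {log['used_facet'].mean():.0%}")
    print(f"repeated/adjusted queries: {log['adjusted_query'].mean():.0%}")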

FINDINGS

Initial analysis of the data reveals that users employed faceting to refine their search in only 21% of queries, while using repeated, adjusted queries to refine their searches approximately 60% of the time. This suggests that users may not be using the facets as frequently because search-term adjustment allows more control over the search process. Known-item searches were more commonly used to find books rather than articles, but overall, known-item queries accounted for approximately one-third of searches. These known-item searches were compared to circulation numbers within two weeks of users’ searches to determine whether they culminated in checkouts, helping to gauge the success rates of queries. In looking at the relative value of an integrated catalog and article-level search, researchers found that only 47% of searches would retrieve any results in the catalog alone, without article-level data. Within that 47%, the average number of results per page dropped from 23 to 15, indicating that article-level data provided the bulk of search results in the discovery layer. This indicated that splitting apart catalog and article-level searching could present problems for library users unfamiliar with the content in each index.

PRACTICAL IMPLICATIONS OR VALUE

Where much of collections-usage analysis relies upon aggregate numbers, the use of anonymous search logs offers a much more intimate portrait of how patrons use our discovery tool. For example, a series of related searches gives us insight into the patron’s mind, and definite patterns appear across patrons. Our research brings valuable patron-focused data to local decision-making related to discovery. This aids in making practical changes to our current tool, such as the inclusion or exclusion of article results or emphasizing or deemphasizing advanced-search features. Our findings also indicate possible new directions for discovery-related library instruction and tutorials, either to correct or to accommodate common search behaviors. As we consider implementing alternative discovery tools, our findings suggest a structure for the questions we should ask, whether in a formal RFP or in preliminary research. Finally, other libraries can adopt this analytical approach to understand their own patrons better and compare their findings to ours.


4:00 p.m.
15-minute break


4:15 p.m.
The CARL library impact toolkit project: An overview (Watch on YouTube; link opens in new tab)
Presented by Justine Wheeler (University of Calgary) and Mary-Jo Romaniuk (University of Calgary)

Slides (PDF)
Show Abstract

PURPOSE AND GOALS

The Canadian Association of Research Libraries (CARL) represents Canada’s twenty-nine largest university libraries and two federal institutions. Our libraries make a difference to people’s lives (CARL, About, 2020). CARL libraries support students, faculty, and society. We contribute to student recruitment, engagement, retention and graduation, research grants, faculty recruitment, community welfare, stewardship, culture and identity, economic growth, job creation, and community development (CARL Assessment Committee, 2019).

As librarians, how do we communicate to university administration, government, faculty, current and future students, parents, alumni, the local community or wider society that the library contributes to the things that matter to them? (CARL Assessment Committee, 2019).

Library impact assessment is one approach that demonstrates the significant role academic libraries play in society. CARL’s culture of assessment within and across Canadian research libraries enables the articulation of libraries’ contributions towards higher education and the research enterprise, and facilitates the continuing evolution of assessment capacity and services (CARL Strategic Directions, 2019). By using meaningful library impact measures, libraries can demonstrate our value, tell our stories, and evolve our services, programming, collections and research enterprises.

With this in mind, in 2019 the CARL Assessment Committee agreed to form a task-oriented project team to undertake the creation of an online library impact resource for Canadian research libraries.

The purpose of the project is to develop a framework that articulates the diverse impact of research libraries and to provide indicators that contribute to the story of impact (CARL Library Impact Working Group, 2019). The product of this project is a toolkit to help CARL libraries communicate the ways they contribute to their universities, communities, and society.

DESIGN, METHODOLOGY, OR APPROACH

CARL Library Project team members were selected through a call for participation on the CARL mailing list. The CARL Assessment Committee appointed interested participants based on a complementary mix of perspectives and expertise such as assessment, strategic perspective, and communication (CARL Assessment Committee, 2019).

As established by the CARL Assessment Committee, the framework for the project centres around four impact areas: 1) research & scholarship; 2) student learning & experience; 3) community; and 4) institutional reputation. For each impact area, the project team researched and decided upon definitions, scope, indicators, relevant methodologies, case studies (with a focus on CARL institutions), key resources, and further research questions.

Once the team was established, sub-groups were formed around the four impact areas, with two to three members on each sub group.

FINDINGS

This initiative is still underway, and thus findings are not currently available. However, this project will be completed prior to the Library Assessment Conference. We anticipate that findings from this project will contribute to the body of literature on library assessment by informing our understanding of best practices and making visible potential new library impact measures. Findings will include a review of library assessment scholarly literature, grey literature, and library websites. Emphasis will be placed on qualitative and quantitative methods and methodologies that reveal key indicators, which move beyond measurement and towards demonstrating impact and value. Challenges and next steps will be addressed. This project has the potential to further our understanding of how we assess and understand the value of libraries.

PRACTICAL IMPLICATIONS OR VALUE

The audience for this presentation is academic librarians and other professionals in academic libraries. In particular, administrators, communication staff, assessment librarians, and user experience librarians may benefit from this session. On a pan-Canadian basis, the CARL Library Impact Toolkit has the potential to assist in the facilitation, strengthening, and communication of ongoing qualitative and quantitative library assessment (CARL Strategic Directions, 2019). At an institutional level, the toolkit may be used both to advocate for a broader perspective on the impact of libraries (CARL Strategic Directions, 2019) and to assist libraries in describing and documenting the broad range of impacts of research libraries (CARL Assessment Committee, 2019). Additionally, this resource provides library assessment strategies and methodologies. Finally, for individual academic librarians, this resource is a flexible tool that contextualizes the importance of assessment and provides pathways for conducting library impact studies (TOR, 2019).


4:35 p.m.
Practical service design blueprinting: using service design techniques in a library task force (Watch on YouTube; link opens in new tab)
Presented by Sarah Tudesco (Yale University Library)

Slides (PDF)
Show Abstract

PURPOSE AND GOALS

The paper will demonstrate the value of the service design blueprinting protocol in a short-term library task force charged with making recommendations for improvements to the purchase request process for users in the science and medical communities.

DESIGN, METHODOLOGY, OR APPROACH

The task force partnered with the library’s assessment team who facilitated a series of service design blueprint workshops. The team ran four workshops and consulted with individuals across the organization to blueprint multiple use cases that covered the major acquisition formats requested by the science and medical communities: databases, e-journals, datasets, and e-books. The workshops involved staff from across the service spectrum, including selectors, technical services, collection development, and the business office. In the workshops, facilitators guided staff in a conversation that identified the actors, systems, policies, questions, and ideas involved in each step of the process. The draft blueprints were built with post-it notes on whiteboards with extensive notes and comments being collected simultaneously. The assessment team transcribed this information into a digital blueprint that was used to identify key themes. The task force is set to make a final set of recommendations to library administration in May 2020. When the work concludes and the recommendations report is submitted, the assessment team will interview the co-chairs about the utility and impact of the service blueprinting exercise.

FINDINGS

The blueprinting process has proved to be an effective tool – it has helped facilitate cross-departmental conversation and identified opportunities that will be explored in the coming months. An effective service blueprint visualizes organizational processes with the goal of optimizing how an organization delivers a user experience. The tool has helped departments understand the complexities of the processes and systems they navigate.

PRACTICAL IMPLICATIONS OR VALUE

The technique of service design blueprinting has already proved effective in encouraging and facilitating cross-departmental conversations. The ability to visualize the request-to-delivery process as it is currently practiced has also proved effective in identifying pain points in the system as well as highlighting opportunities to optimize workflows.


4:55 p.m.
So, what now? Moving through tensions to practice in critical library assessment (Watch on YouTube; link opens in new tab)
Presented by Ebony Magnus (Simon Fraser University) and Maggie Faber (University of Washington)

Slides (PDF)
Show Abstract

PURPOSE AND GOALS

This paper will explore the practical implications for library assessment of the question: what happens to assessment when we conceptualize it as “an ethical, value-based social practice for the public good”? (1)

The paper will focus on three key themes:

  1. Drawing on Stephen Town’s work, what happens to our assessment practices when we disentangle the meanings and emphasis of value and values in our work? (2)
  2. How does the shift in perspective towards an “ethical, value-based social practice for the public good” change our day-to-day assessment activities?
  3. How does this shift in perspective challenge the ways in which we operate in the wider institutional context around demonstrating value and accountability?

In 2018, the authors presented a Library Assessment Conference paper designed to highlight questions related to power structures, equity, and social justice in library assessment. We argued that the emphasis on practicality can obscure necessary professional conversations about the ways in which our assessment approaches need to be explored in order to dismantle oppressive power structures in our work. While conference attendees were receptive to the ideas, many (rightly) asked “what does this actually look like for our work?”

In this paper, we return to those questions from the perspective of practitioners who have been attempting to translate our inquiry and reflection into changes to our assessment activities, program, and engagement with institutional colleagues. While the focus is on the practical, we will continue to challenge the privileging of the practical over theoretical/conceptual considerations by highlighting where there is not an easy translation between critical assessment concepts and meaningful changes to our work.

(1) Wall, A. F., Hursh, D., & Rodgers III, J. W. (2014). Assessment for whom: Repositioning higher education assessment as an ethical and value-focused social practice. Research & Practice in Assessment, 9, 5-17 (p.5).
(2) Town, S. (2011). The value of libraries: the relationship between change, evaluation and role. In Baker, D., & Evans, W. (Eds.), Libraries and Society: Role, Responsibility and Future in an Age of Change, 303-325. Oxford: Chandos.

DESIGN, METHODOLOGY, OR APPROACH

The paper draws on an ongoing literature review and structured discussions of that literature from a range of disciplinary perspectives, including higher educational assessment and institutional research, student affairs assessment, cultural and gender studies in education, and indigenous studies. The paper also draws on reflective practices used to explore and inform our own approaches to assessment at our respective institutions.

FINDINGS

Presenters will detail the ongoing outcomes of our continued research and structured discussions over the past three years, which have led into a concerted effort to recognize inequity and work towards addressing power imbalances in our everyday assessment work. While we cannot report dramatic transformations, we will highlight some of the ways in which we are attempting to reshape our work to center equity and attend to power dynamics. These include using intentional and consistent reflective practices before, during and after assessment activities to understand whose interests are being served by our work, and why we are undertaking a particular activity.

PRACTICAL IMPLICATIONS OR VALUE

This paper will expand wider conceptual discussions about library assessment and the ways in which practices can be informed by critical perspectives. The paper will provide attendees with reflective strategies and considerations of how assessment activities might align with values centered on equity, inclusion, and power sharing. The authors believe that many assessment practitioners may have experienced conflicts between institutional priorities, administration expectations, user needs and experiences, methodological integrity, and personal and professional values. This paper is intended to continue discussions started at the 2018 Library Assessment Conference about how practitioners engage meaningfully with these (and other) potential tensions in order to transform their assessment work.


5:15–5:25 p.m.: Questions and wrap-up
Moderated by Frankie Wilson (University of Oxford) and Steve Hiller (University of Washington)

Thursday, February 18: Services

Watch the full session:

1:00 p.m.–1:05 p.m.: Welcome Remarks
Elizabeth Edwards (University of Chicago) and Maurini Strub (University of Rochester)

1:05 p.m.–4:55 p.m.: Paper Presentations

1:05 p.m.
Professional Development and Professional Identity: A Qualitative Assessment of the Art of Teaching Program (Watch on YouTube; link opens in a new tab)
Presented by Kris Markman (Tufts Clinical and Translational Science Institute)

Slides (PDF)
Show Abstract

PURPOSE AND GOALS

Instruction has become an integral part of an academic librarian’s responsibilities (Cox & Corrall, 2013; Julien & Genuis, 2011; Hagman, 2015); however, there is a disconnect between the centrality of instruction and the training librarians receive (Brecher & Klipfel, 2014). Confidence and self-presentation as a teacher are linked to training, professional identity, and applied practice (Esty, 2017; Hagman, 2015; McGuinness, 2011), while fear, insecurity, and feeling underprepared are barriers to adopting a teacher identity (Esty, 2017; Saunders, 2015). Professional development for librarians is needed to teach instruction skills and prepare them for the level of instruction needed in the workplace (Brecher & Klipfel, 2014; McGuinness, 2011).

To address this issue, in 2016 Harvard Library launched an in-house professional development program, The Art of Teaching for Librarians (AoT). The first iteration was offered in July 2016 as a three-week blended learning experience. Feedback from the program assessment indicated that the time frame was too compressed, and more time was needed to assimilate concepts and practice skills. A revised program was offered in spring 2019 as a semester-long seminar, which met three hours per week.

While developing librarians’ practical teaching skills was the primary goal, a secondary goal was to foster a sense of teacher identity. This paper will present the results of an IRB-approved research project designed to assess our success in meeting these goals.

DESIGN, METHODOLOGY, OR APPROACH

Participants were recruited from the 2016 and 2019 AoT cohorts (N = 21) to participate in semi-structured interviews. Thirteen agreed to be in the study, six from the 2016 cohort and seven from 2019. The 2019 participants completed pre- and post- surveys. The 2016 participants completed an identical survey after they signed up for an interview slot. The survey included a self-assessment of skills related to the course, participants’ agreement with 10 statements related to teaching and professional identity, and in the post-test, subjective ratings of the seminar and facilitators. Qualitative content analysis (Schreier, 2014) was used to interpret the interview data.
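A hedged sketch of the pre/post rating comparison implied here might use a paired, non-parametric test on item-level ratings; the file layout, the item wording used as a lookup key, and the choice of the Wilcoxon signed-rank test below are assumptions, not the study's analysis.

    import pandas as pd
    from scipy.stats import wilcoxon

    ratings = pd.read_csv("aot_pre_post.csv")  # hypothetical: participant_id, item, pre_rating, post_rating

    item = "Teaching is an important part of my professional identity"
    sub = ratings[ratings["item"] == item].dropna(subset=["pre_rating", "post_rating"])

    # Paired, non-parametric comparison of pre vs. post ratings on this item.
    stat, p_value = wilcoxon(sub["pre_rating"], sub["post_rating"])
    print(f"pre mean = {sub['pre_rating'].mean():.1f}, post mean = {sub['post_rating'].mean():.1f}, "
          f"Wilcoxon p = {p_value:.3f}")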

FINDINGS

Overall, participants rated librarianship as highly important to their professional identities, particularly the 2019 cohort, whose ratings for this survey item increased from 6.3 to 7 (strongly agree). Similarly, for the statements “Teaching is an important component of librarianship” and “Librarians are educators,” the 2019 cohort’s ratings increased from the pre- to post-test, and were also slightly higher than the 2016 cohort. However, for the item “Teaching is an important part of my professional identity,” the 2019 cohort rating dropped from 5.9 to 5, putting it below the 5.5 rating from the 2016 cohort.

It is notable that this item provoked the widest range of perceptions, with participants becoming more ambivalent after the professional development program. This ambivalence was reflected in our interview data. Some participants eagerly embraced a teacher identity, and noted that they had already thought of themselves that way before participating in AoT. Others drew a distinction between the act of teaching and being a teacher. For some, this was explicitly linked to not having a formal qualification as a “teacher”, such as a faculty rank. Others talked about instruction as “just what librarians do,” with their primary identifications being librarians.

Finally, we also found that participants did not always see the connection between what they saw as “teaching,” the techniques taught in the program, and their actual work. While some participants expressed that they had gained new skills or the training had bolstered their confidence, others were not sure how to put into practice what they had learned, given the informality (i.e. one-shots) of much of their instruction work. This also played into the resistance to adopting the identity of “teacher,” with that being reserved for “real faculty.”

PRACTICAL IMPLICATIONS OR VALUE

Although data analysis for this study is currently ongoing, the preliminary findings indicate that while teacher training can help librarians increase their skills and confidence, it is still unclear if these programs can contribute positively to a sense of teacher identity, particularly in the absence of a formal instruction program. In a related project we found that institutional factors such as librarian faculty status can influence how librarians communicate their professional and organizational identifications (Ishii, Markman, Arnow, & Carr, 2019). It is also possible that the lack of faculty status contributes to a diminished sense of teacher identity, irrespective of the amount of instruction work librarians do. Finally, our data suggest improvements for future iterations of AoT that will help bridge the gap between participants’ perceptions of their work and concepts related to effective teaching.


1:25 p.m.
Impact of Library Collections on Faculty and Researcher Recruitment and Retention Decisions (Watch on YouTube; link opens in a new tab)
Presented by Maria Chiochios (University of Texas Libraries)
Email: Maria.Chiochios@austin.utexas.edu

Slides (PDF)
Show Abstract

PURPOSE AND GOALS

In 2019, the University of Texas Libraries conducted an in-depth study to examine if and how library collections play a role in attracting and retaining top researchers and faculty to the University of Texas at Austin. This research project was part of a larger suite of pilot studies being undertaken for the Association of Research Libraries under its “Research Library Impact Framework.” The resulting paper will detail the quantitative and qualitative findings of the study and offer a nuanced account of the role library collections played in the decision-making processes of recently hired and promoted researchers and faculty. Findings could be used to guide future library collections-related decisions as well as inform researcher and faculty hiring and retention processes at the provost level.

DESIGN, METHODOLOGY, OR APPROACH

The target audience of this assessment was faculty and researchers newly recruited or promoted within the past five years (2013-2018). The study gathered data through two independent but related processes: an online survey and one-on-one semi-structured interviews with the target audience. The survey was distributed to 991 people and received 284 responses, a response rate of 28.66%. Interviews were conducted with 13 faculty and researchers evenly distributed across disciplines (Fine Arts and Humanities, Social Science, and STEM) and without overlap in departments to ensure a diversity of viewpoints. Survey and interview questions and methodology were developed based on: (1) a similarly focused research project conducted in 1987 by Dale Cluff and David Murrah (“The Influence of Library Resources on Faculty Recruitment and Retention”) and (2) a previous Ithaka S+R project that examined Asian Studies collections and research trends, issues, support needs, and challenges. The project team analyzed the quantitative survey results for trends and then used a grounded theory approach to code the interview transcripts and survey comments, identify key concepts, and determine broad categories.

FINDINGS

While data collection has been completed, data analysis is still in the early stages. Preliminary findings indicate that, in general, library collections are not the foremost driver of career decisions for faculty and researchers at the University of Texas at Austin. Other factors such as family needs, spousal hires, departmental or institutional reputation, salary, and geographic location were cited as taking priority. However, in certain disciplines, library collections did play a primary role in the recruitment and retention of faculty and researchers, especially those using special collections. In addition, faculty and researchers expressed that if easy and quick access to certain library materials was reduced, then library collections would play a much larger role in their career decisions. Furthermore, many interviewees and survey comments indicated that the symbolic and structural signaling of the university’s commitment to and investment in the libraries – or lack thereof – was becoming a stronger driving force in their criteria for current or future career decisions.

PRACTICAL IMPLICATIONS OR VALUE

This conference paper will offer insights into an area of research that has not been studied in depth and will provide a methodological approach that can be replicated by other assessment practitioners. While higher education literature abounds with studies on how to recruit and retain faculty to universities in general, a review of the literature revealed a significant lack of published research on the connection between library collections and faculty hiring and retention decisions. Only one notable exception exists from 1987 wherein faculty from four Texas universities were surveyed to determine the impact of the library on their professional decision making (Cluff & Murrah, “The Influence of Library Resources on Faculty Recruitment and Retention”). Gaining a better understanding of this relationship and taking it into account as part of decision-making processes enables libraries to more effectively attend to faculty and researcher needs and to more appropriately and responsibly allocate resources.


1:45 p.m.
Assessing the Role of Reference: Prioritizing Users and Emphasizing Critical Thinking in Collaborative Workflows (Watch on YouTube; link opens in a new tab)
Presented by Jasmine Spitler (George Mason University Libraries), Ashley Blinstrub (George Mason University Libraries), and Melanie Bopp (George Mason University Libraries)

Slides (PDF)
Show Abstract

PURPOSE AND GOALS

Many academic libraries have replaced the reference desk with different models for reference services (Bodemer, 2014; Bunnett et al., 2016; Rodriguez, Meyer, & Merry, 2017). At Mason Libraries, we developed an “on-call” reference service at Fenwick Library on our main campus, where subject librarians volunteer for shifts. In order to move forward in a productive manner, we performed an assessment collaboratively with the intent of making data-driven decisions and facilitating regular discussion with stakeholders while developing workflows.

Our main research questions include:

1) Do users have good access to reference help?

2) Are Access Services staff overburdened with reference questions?

3) What are the job expectations of subject librarians, whose focus has shifted from providing reference services solely within the library to doing more outreach and working with users beyond the library?

DESIGN, METHODOLOGY, OR APPROACH

We designed an assessment project for the Fall 2019 semester to understand the reference activities occurring at Mason Libraries, from both the perspectives of subject librarians and Access Services staff. After gathering a representative task force of managers, subject librarians, and Access Services staff, we defined three different levels of questions (Bunnett et al., 2016; Gerlich, n.d.). These questions were divided into general categories: directional (Tier Level 1), reference (Tier Level 2), and research consultations (Tier Level 3).

Access Services staff submitted data on which Tier Level question was asked, as well as their response. Response options included: referred to the on-call librarian; referred to virtual reference (VR); unable to refer; or did not need to refer. Subject librarians submitted data on which Tier Level question was asked, as well as whether they referred the user to a specific subject librarian. We collected dates and times of all reference interactions, followed by three open feedback sessions to hear from staff providing the service. After data collection, the task force discussed recommendations based on analysis of the data and submitted a final report to the Dean of Mason Libraries.
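
For illustration, the kind of record each desk interaction could produce under this tiered scheme might look like the following Python sketch; the field names and values are hypothetical stand-ins, not the task force’s actual data-collection form.

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class ReferenceInteraction:
        logged_at: datetime   # date and time of the interaction
        desk: str             # e.g., "Access Services" or "On-call librarian"
        tier: int             # 1 = directional, 2 = reference, 3 = research consultation
        referral: str         # e.g., "on-call", "virtual reference", "subject librarian", "none"

    # One hypothetical Information Desk entry: a Tier Level 2 question referred to the on-call librarian.
    example = ReferenceInteraction(datetime(2019, 9, 9, 13, 30), "Access Services", 2, "on-call")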

FINDINGS

Data collected within a seven-week period showed that a significant number of users did not follow up with an on-call librarian after a referral from the Information Desk. On-call librarians recorded a majority of Tier Level 2 and 3 questions, including those referred to the appropriate subject librarian. Overall trends show the Information Desk was busiest at the beginning of the week and during the middle of the day, and statistics from reference on-call librarians showed a similar pattern.

The open feedback sessions revealed an overall consensus that the Fenwick Library should provide some kind of reference service. However, coverage and staffing of the reference on-call service in its current form are difficult to manage. The lack of official communication procedures between Access Services staff and subject librarians hindered the service’s success. There were also several problems regarding the location of the service, as some librarians would stay in their offices, while those traveling from other campuses were located in one of the labs in Fenwick.

After the final task force meeting, subject librarian and Access Services supervisors were designated as the point people to continue communication and evaluation of the process, with assistance from Assessment for data and trends. However, staffing changes, space changes, and the beginning of a new semester limited the time and focus devoted to this project.

Supervisors and staff are still dedicated to making reference services available for the university community. The reference on-call staff look forward to bridging departmental structures to ensure users have the services they need, in whatever form they take, using data assessment as the basis of those decisions.

PRACTICAL IMPLICATIONS OR VALUE

Defining the role of reference has been a challenge amongst academic libraries for several years now. Assessing reference may bring necessary, positive changes to the way libraries engage with their users. Another key component of this study that brings value to the larger assessment community is the focus on collaboration: assessment done in a team environment, rather than in a siloed one, is critical, as there are many stakeholders who need to buy in to any changes made.

At Mason, we made on-call hours more strategic and requested a designated research consultation space within eyesight of the Information Desk to reduce the number of patrons who did not follow up on their reference questions. The Dean of Libraries approved these recommended changes based on the data.

We have updated our data collection forms so that we may better track reference on-call data. Subject librarian managers and the Head of Access Services will continue to collect qualitative data in facilitated cross-departmental conversations with staff, while also developing a clear communication procedure.

This project has contributed to an environment of collaboration and critical thinking in developing user services. Furthermore, we are assessing the feasibility of training Peer-Referral Coaches as part of the reference on-call service. We believe that this re-prioritization of public services assessment will better serve our users and promote more data driven decision-making in our institution.


2:05 p.m.
15-minute break


2:20 p.m.
Looking deeply at journey points and disciplinary discourse practices in support of graduate education (Watch on YouTube; link opens in a new tab)
Presented by Elizabeth Kline (University of Arizona Libraries)

Slides (PDF)
Show Abstract

PURPOSE AND GOALS

The core focus of graduate education is to train and develop independent scholars who contribute scholarship in their chosen areas. The graduate educational landscape largely moves along a commonly structured path from acceptance to graduation. Investment in graduate education is expensive, so completion is a major concern for universities. Libraries offer services geared to this population, including literature review assistance, one-on-one reference help, data management support, and copyright guidance, and they work to reach graduate students at different points in the research life cycle. The extant library literature is replete with studies attempting to measure library users’ awareness of services, use of and experience with information resources, and experience conducting research. Those same studies often report similar findings: users are largely unaware of library services, Google is the typical go-to tool, and information literacy skills are underdeveloped. Why are library users still largely unaware of library services? Why do students seemingly struggle during the course of their graduate education?

This study takes a different approach from the current literature and focuses on critically exploring the disciplinary discourse practices that shape the graduate students’ research identity in different disciplines. The author used interviews to discover the numerous intellectual journeys of graduate students in different disciplines and identified where faculty perceive the greatest struggle for graduate students.

DESIGN, METHODOLOGY, OR APPROACH

Researchers recruited faculty from different disciplines to participate in one-on-one, hour-long interviews. Discussions were audio recorded, transcribed, and then coded in NVivo. Iterative review of the data continued until themes emerged.

FINDINGS

This presentation will provide a discussion of graduate student personas revealed through intellectual journeys, assess the issues students encounter, share critical time points and key places within these intellectual journeys where significant development occurs, and suggest how libraries can and should connect with graduate committee members to establish missing support structures.

PRACTICAL IMPLICATIONS OR VALUE

By aligning services to journey points where students struggle, libraries can build missing support structures to advance graduate education.


2:40 p.m.
Implementing Rapid Assessment to evaluate e-resource subscriptions for an immediate cancellation project (Watch on YouTube; link opens in a new tab)
Presented by Theresa Carlson (Northern Arizona University)

Show Abstract
PURPOSE AND GOALS

This paper describes the process and results of an e-resource collection management project using a rapid assessment procedure. The project group had less than a year to conduct a comprehensive e-resource evaluation in order to make change or cancellation recommendations to address a severe budget deficit. The tools of a rapid assessment procedure enabled librarians to conduct a thorough assessment of e-resource holdings.

DESIGN, METHODOLOGY, OR APPROACH

This paper defines the rapid assessment procedure (RAP) and describes how the procedure was adopted by librarians in order to complete a thorough e-resource collection evaluation using a combination of qualitative and quantitative methods. RAP is process oriented, yet flexible, and involves a holistic assessment approach, examining data from multiple angles (Scrimshaw and Gleason, 1992). Significantly, it involves many stakeholders and carefully considers their involvement in decision-making. The key aspects of RAP proved critical for the library’s cancellation project given the diverse backgrounds of the project team and the time constraints on the project.

FINDINGS

Our assessment provided critical information about faculty’s knowledge and use of the library’s e-resources. This analysis revealed unexpected insights into the ways that faculty perceive licensed content, in general, and how they experience seeking and collecting information from these resources. The results of our approach have proven effective in recommending changes or cancellations to e-resource subscriptions with little faculty unrest. Furthermore, it has provided invaluable insights for our ongoing collection management procedures.

PRACTICAL IMPLICATIONS OR VALUE

This paper fills a gap in the literature for academic libraries by providing an overview of the rapid assessment procedure for an e-resource collection management project. It offers a comprehensive yet efficient way to analyze collections with community input. Furthermore, this approach could be adapted for continuous review and evaluation of library collections. Indeed, this approach remains part of the library’s maintenance plan for collections.


3:00 p.m.
Modeling Complex DDA Purchasing and Use Patterns with Machine Learning (Watch on YouTube; link opens in a new tab)
Presented by Kevin Walker (The University of Alabama)

Show Abstract
PURPOSE AND GOALS

This study describes a replicable implementation of an adaptive boosting (AdaBoost) model that predicts the likelihood of DDA titles being triggered for purchase. The predictive capacity of this model is then compared with that of a more traditional regression-based model. Through this comparative study, the researchers are able to explore how machine learning might be leveraged toward more effective collection development and management strategies via the predictive modeling of complex collection use and purchasing patterns.

DESIGN, METHODOLOGY, OR APPROACH

An RStudio Server environment, running eight (8) dedicated processing cores with 32GB of RAM, was used as the analytical and development platform for this project. A traditional regression-based predictive model was built using R’s standard generalized linear model (glm) function, while the AdaBoost algorithm was deployed using the adabag package. Twenty-four months of e-book purchasing and use data formed the core dataset, which featured four variables: publisher, publication year, LC class, and price. A random 80/20 split of those data was used to create training (20%) and testing (80%) datasets for the AdaBoost model, while the entirety of the original dataset informed the regression model.
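
As a rough illustration of the comparison (a Python/scikit-learn sketch rather than the authors’ R implementation with glm and adabag; the file path, column names, and split proportions are assumptions), the workflow could look like this:

    import pandas as pd
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    # Hypothetical dataset: one row per DDA title, with the four predictors named in the
    # paper and a binary outcome indicating whether the title was triggered for purchase.
    df = pd.read_csv("dda_usage.csv")
    X = pd.get_dummies(df[["publisher", "publication_year", "lc_class", "price"]])
    y = df["triggered"]

    # Hold out a test set; the paper describes a random training/testing split of the data.
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

    # Baseline: a regression-style classifier, analogous to the paper's glm model.
    logit = LogisticRegression(max_iter=1000).fit(X_train, y_train)

    # Boosted ensemble of weak learners, analogous to the paper's AdaBoost model.
    boost = AdaBoostClassifier(n_estimators=200, random_state=42).fit(X_train, y_train)

    for name, model in [("logistic regression", logit), ("AdaBoost", boost)]:
        print(name, accuracy_score(y_test, model.predict(X_test)))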

FINDINGS

This research describes a replicable implementation of an adaptive boosting (AdaBoost) model that predicts the likelihood of DDA titles being triggered for purchase. When compared with a traditional regression model, which featured a pseudo-R2 value of .104 (i.e., a 10% prediction rate), the AdaBoost model provided accurate predictions in 82.4% of cases.

PRACTICAL IMPLICATIONS OR VALUE

This research serves as a proof of concept for the deployment of machine learning within the library environment. The incredibly robust predictive capacity of the AdaBoost algorithm, when compared with traditional regression-based predictive modelling, shows great promise as a practical solution to a variety of difficulties associated with building library collections.


3:20 p.m.
15-minute break


3:35 p.m.
Experiences and Expectations of a Library Document Delivery Service: A Study with Service Users and Non-Users (Watch on YouTube; link opens in a new tab)
Presented by Craig Smith (University of Michigan) and Emily Campbell (University of Michigan)

Slides (PDF)
Show Abstract

PURPOSE AND GOALS

Like many libraries, the library at the University of Michigan is re-imagining how services, collections, and spaces are organized to serve the needs of its users. With more content available to users online, and with library document delivery services designed to move both on- and off-site materials to users quickly, there is less of a need to fill library floors with stacks of physical materials. Indeed, the University of Michigan is planning to build a large off-site repository to house many of its physical materials, and is also planning to create larger spaces in its buildings for individual student work, student group work, and library offerings such as consultation, digital scholarship services, and instruction.

One consequence of these changes is that it will be more common for users who want access to non-electronic materials that are not in the stacks to make requests for document delivery. This includes both the delivery of physical items (e.g., books), and also the emailing of scanned documents. Given the centrality of the Library’s document delivery services in this vision for the future, it is critical to understand how users currently experience the Library’s document delivery service, what expectations they have of the service, and what suggestions they have for improvement. The purpose of the present study was to explore these questions with a group of Library patrons who used document delivery services recently (in fiscal year 2019), with the assumption that they would have relatively fresh memories to report on. We also included a group of non-recent users — people who had not used the service within the past five years or at all — in order to understand why some people do not make use of the service, and to explore whether non-users have different expectations regarding document delivery, compared to recent users. The Library is using the findings of this study to inform improvements to its document delivery service, and to the way that we communicate about the service to students and faculty members on campus.

DESIGN, METHODOLOGY, OR APPROACH

The study focused on graduate students and faculty members who fell into two groups: (1) fiscal year 2019 (FY19) users of document delivery services, and (2) graduate students and faculty members who had not used the service between FY14 and FY19. The study was designed to include three streams of data: (1) library-maintained document delivery usage statistics for the FY19 user group, (2) survey data from respondents in both the FY19 user group and the comparison group, and (3) campus-level administrative data on all survey participants (e.g., track, rank, and discipline for faculty; program type and discipline for graduate students).

All of the FY19 service users were invited to participate in the survey; 1,366 completed surveys, representing a 25% response rate. The sample of FY19 users was almost evenly split between graduate students (52%) and faculty members (48%; this included tenure-, research-, and clinical-track faculty members at all ranks, and also included lecturers). The FY19 user sample was also characterized by a broad distribution across disciplinary areas: 28% from the arts and humanities, 44% from STEM fields, and 25% from social sciences (a small percentage were in cross-disciplinary fields such as public health).

Other students and faculty members were identified as good matches for a comparison sample through propensity score matching. This involved using the characteristics of FY19 users (e.g., area of study, type of role on campus, amount of time at the University, race, gender, etc.) in a logistic regression model to find ‘matching’ individuals who had not used document delivery between FY14 and FY19. (A propensity score is an estimated probability that a person might have been exposed to an experience — in this case the use of document delivery.) Those with good propensity score matches were included in the comparison group and sent survey invitations; 12% completed a survey (n = 458); 67% were graduate students and 33% were faculty members representing all tracks and ranks. Thirteen percent of the comparison sample respondents were in arts and humanities fields, 58% were in STEM fields, and 25% were in the social sciences (a small percentage were from cross-disciplinary fields).
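
A minimal sketch of this matching step, assuming a hypothetical data file and a reduced covariate set (real matching would use the full set of characteristics and balance diagnostics), might look like the following:

    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    df = pd.read_csv("patron_records.csv")          # one row per graduate student / faculty member
    covariates = pd.get_dummies(df[["role", "discipline", "years_at_univ", "gender", "race"]])
    treated = df["used_doc_delivery_fy19"]          # 1 = FY19 user, 0 = non-user (FY14-FY19)

    # Propensity score: estimated probability of being an FY19 user, given the covariates.
    model = LogisticRegression(max_iter=1000).fit(covariates, treated)
    df["pscore"] = model.predict_proba(covariates)[:, 1]

    # Greedy 1:1 nearest-neighbor matching on the propensity score, without replacement.
    users = df[treated == 1]
    pool = df[treated == 0].copy()
    matches = []
    for _, user in users.iterrows():
        idx = (pool["pscore"] - user["pscore"]).abs().idxmin()
        matches.append(idx)
        pool = pool.drop(idx)

    comparison_group = df.loc[matches]              # candidate non-users to invite to the survey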

We measured respondents’ use of document delivery services in two ways. First, all participants were asked whether they had requested items for delivery or scan, and whether they had done this once or multiple times. Second, outside of the survey, we also had the number of times different aspects of the service were actually used by FY19 users, based on logs maintained by the document delivery department.

The survey also asked both FY19 users and the comparison group about their preferred turnaround times (TAT) for physical item deliveries, and for emailed scans of documents; we also asked what would be too long where TAT was concerned. For the FY19 users, the survey also asked about their recent memories of how long items actually took to arrive.

One key advantage of the study design was that we had a large group of document delivery non-users. Thus, one section of the survey posed questions about why non-users hadn’t taken advantage of the service.

Finally, the survey asked all respondents to rate their feelings about two future scenarios. In the repository scenario, respondents indicated how they felt about more Library materials being housed in a local repository, with quick delivery available upon request. In the collective collections scenario, respondents rated their feelings about more materials being in networked collections, shared with peer institutions and available for quick delivery.

PRACTICAL IMPLICATIONS OR VALUE

As the U-M Library plans for changes to the way that spaces are allocated for onsite collections versus onsite services, the importance of a fast and reliable document delivery service is underscored. Assessment of this service allows us to plan for users’ needs as we make changes to how people can access our materials. The current assessment points toward multiple action steps, including:

  • Targeting communication about the service to graduate students in particular (they were more likely to be unaware of the service).
  • Making scans easier to read for a wide range of users.
  • Providing better options for the return of delivered items.
  • Adjusting workflows to aim for delivery times that match or do better than user expectations.
  • Identifying the small percentage of people on campus who are troubled by the repository and collective collections scenarios; these people are often vocal about their concerns, so understanding and addressing them proactively will be important.

3:55 p.m.
Exploring Undergraduate Experiences with Obtaining Course Texts, Including a Novel Look at ‘Adoptable Monographs’ (Watch on YouTube; link opens in a new tab)
Presented by Craig Smith (University of Michigan) and Charles Watkinson (University of Michigan)

Slides (PDF)
Show Abstract

PURPOSE AND GOALS

It is well understood that textbook costs are a major source of financial stress for many undergraduate students. The present study sought to explore these issues, and how they are navigated, with a large and diverse sample of undergraduate students at the University of Michigan. In addition to exploring students’ experiences with acquiring course texts, which has the potential to inform both library interventions (e.g., expanded course reserves) and university policy (e.g., additional financial support for vulnerable students), a goal of the study was to examine student acquisition patterns for ‘adoptable monographs’ assigned as course books. In our study, adoptable monographs are books published by university presses that are adopted by instructors as course books. This type of course adoption is a major source of revenue for university presses. Understanding how students make decisions about acquiring these types of books has the potential to shed light on why revenues for books that have traditionally been adoptable monographs are falling. This issue is of particular importance to the University of Michigan Library, as the University of Michigan Press is a core component of the University Library’s publishing division.

DESIGN, METHODOLOGY, OR APPROACH

Data for the study were collected in a survey effort that was launched in the spring of 2019. Two groups of undergraduate students at the University of Michigan were invited to participate in the survey. A smaller group was invited to participate based on the type of book they had been assigned for one of their courses. These books were deemed to fit the profile of adoptable monographs (i.e., university press books — from any university — that were adopted as course texts). These students were asked some general questions about their textbook acquisition behaviors, and then some specific questions about their acquisition of the adoptable monographs being used in their courses. A larger group of undergraduates was invited to take a nearly-identical survey; these students were asked questions about how they acquired their course books more generally, without questions about specific books.

The final sample included 213 students in the adoptable monograph survey, and 534 students in the general textbook survey. The overall sample included 262 first-year students (35%), 173 sophomores (23%), 162 juniors (22%), and 148 seniors (20%). The sample as a whole was quite diverse, as we over-sampled underrepresented groups for the general textbook survey. Traditionally-underrepresented racial/ethnic minority students made up 26% of the sample, Asian/Asian-North American students made up 23% of the sample, and white students made up 48% of the sample. Because we were constrained in terms of who we could invite to join the adoptable monograph part of the study, the general textbook portion of the sample was significantly more diverse with regard to race/ethnicity, χ2(3, N = 743) = 26.62, p < .001, φ = .19.

Of the 747 in the sample, 166 students indicated that they were first-generation students (22%). This included 21% in the general survey (115 of 534) and 24% in the adoptable monograph sample (51 of 213). The two study groups did not differ in terms of these percentages (p = .48).

Both versions of the survey asked students where they typically first search for course books, where they actually purchased/rented course books during the current semester (if relevant), how much money they spent on course books, and all of the ways they went about acquiring their books. The survey also asked about how students preferred to get their books (if money was not an obstacle), and what kinds of impacts the cost of books had on their acquisition decisions and on their financial lives. There were additional questions about when students choose to acquire their books, and what they do with the books after a course is finished. Students in the adoptable monograph version of the study were asked some of these questions specifically about the adoptable monograph that was assigned to them (the title of the relevant book was piped into the Qualtrics-hosted survey using back-end data).

FINDINGS

All students, in both versions of the survey, were asked where they first go to search for course books; the wording made it clear that this question was not asking about how students ultimately obtained their books. Nearly half (46.6%) started their searches on Amazon, and nearly one third (29.9%) started on Google. Smaller percentages started with the University Library catalog (9.9%), Barnes and Noble (6.2%), and a physical bookstore (1.5%); 6% of students wrote in another response via an open-text option.

For students who indicated that they bought or rented books, the survey asked them about where they did so. Across the two study groups, nearly identical percentages indicated using Amazon: 65.5% of students in the general textbook survey, and 65.2% in the adoptable monograph survey, p = .94. There was no significant difference between the study groups with regard to using Barnes and Noble: 11.8% in the general textbook survey, and 6.7% in the adoptable monograph survey, p = .10. Significantly more students in the general textbook survey (15.7%), compared to the adoptable monograph respondents (5.2%) used Chegg, p = .002. The students in the general textbook survey were also more likely than the adoptable monograph students to have purchased their books at a local campus bookstore (21.6% vs. 8.1%; p = .001), and second-hand from fellow students (10.9% vs. 1.5%; p = .001).

All respondents were asked how much they spent on all their books during the semester of the survey. The mean expenditure was $165.09, but there was a tremendous amount of variability around the mean (range: $0 to $1,000; SD = $136.39). Interestingly, the mean expenditure for the adoptable monographs was quite low ($12.57), and 35% of students in this version of the survey reported getting the adoptable monograph for free. Thus, although book costs in general are quite high, many students assigned adoptable monographs do not pay a lot of money for them (perhaps to the benefit of students but to the detriment of vulnerable university presses).

Although students in the general textbook survey were expected to spend a good deal of money on textbooks, 30.6% reported that multiple assigned books were not needed at all in their classes, 19.5% reported that one book was not needed at all, and 37.1% reported that all books were needed but at least one assigned book was barely used. When students in the adoptable monograph survey were asked about the focal book specifically, 7% reported that it was not needed at all, 44.6% reported that it was barely used, and 45.5% reported that it was truly needed.

Of students in the general survey who indicated that they had classes with unused books, 45.5% indicated that they paid for and kept the book(s), 12.1% paid for and returned the book(s), 24.1% rented the book(s) and did not get a refund, and 5.4% rented the unused book(s) and did get a refund. Thus, a substantial group of students not only pay a lot for course books, but then experience having books that are un- or under-used.

Students were asked how the cost of books influenced their behavior in a variety of ways. Students in the general version of the survey were asked about the costs of course books, and students in the adoptable monograph version were asked about the cost of the focal monograph. Similar numbers — over half — indicated that costs influenced them to not purchase a required book (62.8% in general version; 52.4% in adoptable version). Very few reported not taking a course due to book costs (6% in general version; 0% in adoptable version). Very few also reported taking fewer courses due to book costs (2.5% in general version; 2.4% in adoptable version); the same small percentages in each survey version reported dropping a course to save on book costs. Nearly a quarter of respondents in each survey indicated that book costs added to difficulties in a course (23.1% in general version; 21.4% in adoptable version). Larger numbers reported experiencing stress about affording other resources such as food and rent due to book costs (45.4% in general version; 40.5% in adoptable version). None of the study group differences in this section of the survey were significant.

In both versions of the survey students were asked the same question — without reference to a particular book — about ways they attempt to reduce course book costs. First, a very small minority (6.1%) reported no attempts to reduce costs. Common approaches to reducing costs were finding free copies online (not via the library; 59%), buying used books from a bookseller (53%; more common for adoptable monographs), buying used copies from peers (40%), renting print copies of books (39%), buying digital versions of books (33%), sharing books with friends (28%), using earlier editions (27%), checking out library print copies (18%; more common for adoptable monographs), using library digital versions (16%; more common for adoptable monographs), and renting digital books (15%).

The diversity of the sample allowed us to explore differences in responding as a function of race/ethnicity and of first-generation student status. Not surprisingly, students who identified as members of underrepresented racial/ethnic groups and as first-generation students experienced more stress than their peers; these differences will be presented in the talk and the paper.

PRACTICAL IMPLICATIONS OR VALUE

The findings of this investigation are very much in line with other recent large-scale surveys that describe the impacts of course book costs on undergraduate students. Many students in our current study reported taking multiple approaches to cutting course book costs, and described multiple ways that book costs caused stress or strain. These impacts are felt by some students more than others, with family socioeconomic background playing a role. These findings will be presented to the leadership of the University Library, and may inform discussions about how the Library approaches maintaining and communicating about its course reserve collection. The findings will also be presented to administrators of the University’s undergraduate programs; these leaders might consider some relief for especially vulnerable students. Finally, one unique goal of the study was to shed light on factors contributing to reduced revenues for adoptable monographs, books that have traditionally been quite important for university presses. We found that many students pay little or nothing for such books, and also take many steps to avoid paying top dollar for these types of materials. As presses like the University of Michigan Press form strategies to manage the current publishing context, these types of empirical findings, which bolster anecdotal evidence, will be critical.


4:15 p.m.
Don’t Pay for Free: updating Cost of Use for the age of Open Access (Watch on YouTube; link opens in a new tab)
Presented by Heather Piwowar (Our Research)

Show Abstract

PURPOSE AND GOALS

The purpose of this paper is to explain an advanced version of Cost Per Use, called Net Cost per Paid Use (NCPPU), which values citations and authorships as types of use, excludes papers that can be obtained for free, and incorporates the cost of ILL.

DESIGN, METHODOLOGY, OR APPROACH

Net Cost Per Paid Use is calculated by taking the subscription cost, subtracting the anticipated ILL cost, and dividing the result by the usage (measured by number of downloads plus weighted authorships and citations) that is not available for free via Open Access or perpetual-access backfile. Some of these terms involve complicated calculations (anticipated ILL cost, the proportion of usage available for free), which will be explored in the paper.
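
As a rough sketch of the calculation as described (the inputs below are hypothetical; estimating the anticipated ILL cost and the freely available share of usage is where the real complexity lies):

    def ncppu(subscription_cost, expected_ill_cost, downloads,
              weighted_authorships_citations, fraction_free):
        """Net Cost Per Paid Use for one journal, per the description above."""
        total_use = downloads + weighted_authorships_citations
        paid_use = total_use * (1 - fraction_free)   # use not available via OA or backfile
        return (subscription_cost - expected_ill_cost) / paid_use

    # Example (hypothetical numbers): a $5,000 subscription, $600 anticipated ILL cost if cancelled,
    # 1,200 downloads plus 50 weighted authorships/citations, 40% of use freely available.
    print(ncppu(5000, 600, 1200, 50, 0.40))   # ~5.87 per paid use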

FINDINGS

We graph NCPPU against traditional Cost Per Use to show examples where collections decisions made with traditional cost per use would over- or under-value journals, given their ILL costs, Open Access proportion, and/or authorship interest for a given university.

PRACTICAL IMPLICATIONS OR VALUE

This matters a great deal when institutions are deciding how much they would be willing to pay for a given serial, and which serials they should subscribe to.


4:35 p.m.
Data and Method as Assessment Terministic Screens: Why Improvement is Lost (Watch on YouTube; link opens in a new tab)
Presented by Aaron Noland (James Madison University Libraries)

Show Abstract
PURPOSE AND GOALS

In response to Stiggins’s (2002) call for assessment for learning, not assessment of learning, we have doubled down on process-oriented discussions at the expense of outcomes-oriented discussions. A consistent theme in assessment conferences and professional development opportunities centers on how to get people to buy in to assessment, understand it, and build an assessment culture. Pair these themes with the ways in which assessment is positioned on campus, the academic conversation about assessment, and the manner in which it privileges advanced quantitative methodologies, and it is no wonder there is a canyon between assessment professionals and faculty and staff. Assessment professionals must pay more attention to communicating about assessment and how assessment discourses are framed. In order to achieve greater buy-in and more impactful assessment, we must provide entry points to the discussion that emphasize why we should practice assessment rather than how assessment is practiced.

The purpose of this research is to investigate how the discourse of assessment impacts how we communicate the practice. That is, when assessment discourses focus on data and method, assessment becomes about data and method, crowding out space for any focus on learning or improvement. This approach thereby deflects attention from what many call the core of assessment – learning improvement. Additionally, this research provides an analysis of common assessment discursive practices, applying Kenneth Burke’s terministic screens and pentadic analysis to illustrate how the focus on data and method impacts perceptions of assessment. Finally, practical implications for shifting the conversational approach and widening the scope and communicated value of assessment will be provided.

DESIGN, METHODOLOGY, OR APPROACH

This research takes a critical approach to the discursive landscape of Library assessment. Using Kenneth Burke’s Dramatism, I discuss conventional conversation, both academic and praxis, and how this conversation creates an assessment reality focused on data and method while obscuring the alleged goal of assessment – improvement. Analyzing the assessment discourse using Burke’s pentadic analysis and terministic screens will allow the reader (or audience) to understand how assessment professionals, unintentionally, keep the focus of assessment on data and method.

FINDINGS

Analysis of assessment discourse examines assessment cycles, models, methods, approaches, and case studies that primarily focus on surveys, research design, learning objectives, and other process-oriented elements. When these elements are centered in assessment conversations, it is logical for one to conclude that assessment is about those elements. As such, these elements become terministic screens, attracting attention to the process-oriented elements and methodology of assessment while deflecting attention from outcome-oriented elements like learning or process improvement.

The pentadic analysis concludes that the data–method ratio serves to reinforce the worst perceptions of assessment as concerned only with counting and quantitative methodologies, leaving improvement and learning out of the discourse. According to Burke, and this interpretation would likely hit home with assessment critics, we can interpret motive from what we speak about. Therefore, the abundance of discourse about data and method reflects their primary importance and, most importantly, the paucity of discourse about improvement and learning reflects their peripheral importance.

PRACTICAL IMPLICATIONS OR VALUE

Assessment professionals cannot simply assume the value of assessment is known and inherent. This truism has led us to pursue methodological rigor, a push to close the loop and demonstrate impact, and a focus on data integrity. It seems we have missed the mark. What we really need to focus on is how we talk about assessment as a practice. In talking about data and method, we make assessment process oriented. Instead, if we take lessons from Burke’s Dramatism, terministic screens, and pentadic analyses, we can learn that audiences will infer motive from what we talk about and how much time we dedicate to component parts. Assessment professionals can use this analysis to, as Simon Sinek says, start with why. An assessment discourse anchored in whys such as process or learning improvement is a compelling starting point for even seasoned assessment critics. Instead, we have started our conversation with technical jargon. We have elected to start with the complex and then try to convince our audiences that our focus is simple – improvement. We must reverse our course – we must start with the simple – improvement – and move from there to the complexities of data and method. Doing so, according to Burke, will center the core purpose of assessment and make space to access the process-oriented elements through an improvement framework. This can make a sizable impact for assessment professionals attempting to gain faculty/staff buy-in, incentivize assessment engagement, and demonstrate the value of assessment.


4:55 p.m.–5:05 p.m.: Questions and wrap-up

Moderated by Elizabeth Edwards (University of Chicago) and Maurini Strub (University of Rochester)

Wednesday, March 17: Teaching & Learning and Value

Watch the full session:

1:00 p.m.–1:05 p.m.: Welcome Remarks
Megan Oakleaf (Syracuse University) and Klara Maidenberg (University of Toronto)

1:05 p.m.–5:15 p.m.: Paper Presentations

1:05 p.m.
The Collective: Assessing a State-Wide Training Program on Library Impact (Watch on YouTube; link opens in a new tab)
Presented by Lisa Hinchliffe (U of IL Urbana) and Karen Brown (Dominican University)

Show Abstract
PURPOSE AND GOALS

CARLI Counts: Analytics and Advocacy for Service Development is a continuing education library leadership immersion program that prepares librarians to make effective use of research findings on the impact of academic libraries on student success for the twin purposes of service development and library advocacy. CARLI Counts addresses the need of academic librarians to demonstrate their libraries’ impact on student learning and success in competitive campus budgeting processes, accreditation reports, and program reviews. Doing so requires thoughtful service design and delivery based on clear goals and outcomes that are informed by documented best practices and that reflect local institutional priorities and strategies. While a growing body of evidence supports the assertion that academic libraries positively impact student success, libraries must make the argument individually to their stakeholders in ways that are meaningful to local administrators. Program participants learn how to use local library data analytics to improve their services and demonstrate their value, and they build confidence in their ability to do so. This paper reports on the results of this program.

DESIGN, METHODOLOGY, OR APPROACH

CARLI Counts is being evaluated using multiple quantitative and qualitative methods, including feedback from participants, evaluation of program materials, and performance outcomes from participants. Methods include interviews, focus groups, document analysis, and pre-, during-, and post-program surveys. The evaluations take place before, during, and after the year-long experience of CARLI Counts participants. The evaluation is designed to assess participants’ understanding and use of evidence-based library practices, the impact of the projects at their institutions, the effectiveness of team-based professional development, and the collective statewide impact of the program.

FINDINGS

CARLI Counts is achieving its twin program outcomes of improving the ability of librarians to investigate and communicate the impact of academic libraries on student learning and success and of improving the confidence that librarians have in their ability to do so.

PRACTICAL IMPLICATIONS OR VALUE

CARLI Counts demonstrates that library consortia are well-positioned to serve as a center for professional development on academic library impact on student learning and success. Funded by IMLS, CARLI Counts will also be releasing an open version of the curriculum that can be utilized by other groups and CARLI Counts personnel will make themselves available to consult with other consortia or libraries that may wish to adopt and/or adapt this model.


1:25 p.m.
Focusing on the Forest (Not Just Leaves) when Preparing Faculty for and Assessing Online Course & Curriculum Design (Watch on YouTube; link opens in a new tab)
Presented by Aaron Noland (James Madison University Libraries)

Slides (PDF)
Show Abstract

PURPOSE AND GOALS

Extended learning opportunities through online environments provide arenas for faculty members to expand their teaching proficiencies. Like translating a teaching method into a different language with fluency, these opportunities can be fulfilled effectively only through preparation grounded in the understanding and application of learning theories and instructional design principles, the practice of innovative pedagogy, informed selection and integration of technology, the acquisition of skill sets and literacy in using technologies, and partnership with a network of experienced scholars and practitioners. This evidence-based mixed-method research presents the assessment design of an immersive faculty development program based on a framework of online teaching efficacy.

The program is a 40-hour, 10-week institute that equips faculty with the capacity to teach in online environments. The institute uses an efficacy framework, unique in faculty development programming, that necessitates assessment. There is little research on assessing collaborative, immersive faculty development institutes aimed at increasing online teaching efficacy. The goal of this research is to fill that gap and work toward improving the institute experience and building faculty online teaching efficacy.

The purpose of this research assessing the institute is to build continuous improvement into the process of institute design. Our approach centers on an improved, more user-centered design. In making adjustments to the institute based on assessment data, we are better positioned to improve faculty online teaching efficacy over time. This model is consistent with a learning improvement model whereby information gathered from assessment data informs adjustments in the institute design that yield a more effective and efficient faculty development experience (Fulcher, Good, Coleman, and Smith, 2014).

The institute uses a self-efficacy framework. Using efficacy as a target is a different approach than many faculty development programs for online teaching that focus more specifically on discrete course design activities. An efficacy framework is preferred because of its broad applicability. Increasing efficacy addresses both the what and the how of online pedagogy, thus broadening faculty capacity – focusing on the forest, not the leaves. This approach allows the faculty member to use their skills in a variety of course formats and in future online course development.

DESIGN, METHODOLOGY, OR APPROACH

Assessment of the online and blended institute in this project follows the assessment for learning approach rather than assessment of learning (Stiggins, 2002). The purpose of assessing this institute is to build continuous improvement into the process of institute design. Our approach centers on an improved, more user-centered design.

The assessment of the institute follows a mixed-method longitudinal research design, with an online teaching efficacy questionnaire before and after the institute, an analysis of artifacts of designed courses (e.g., syllabi, assignments, course modules), and narrative inquiry into continued communication with faculty. The scale used to assess efficacy is a modified version of Robina and Anderson’s (2010) online teaching efficacy scale. Modifications were made to make language and focus consistent with the institute curriculum. The pre-institute scale consists of closed-ended online teaching efficacy questions, while the post-institute scale repeats the pretest and includes open-ended questions about participant perceptions of the institute experience.

FINDINGS

We will present data analysis of the pre- and post-test, qualitative responses to post-institute questions, and an analysis of artifacts provided by participants from two semesters. Data collection will end in May 2020, and analysis will occur in June and July 2020.

PRACTICAL IMPLICATIONS OR VALUE

Assessment of immersive faculty development initiatives for online teaching and instructional design is a nascent space for library assessment. At JMU Libraries, as at a growing number of other libraries across the country, educational technology and instructional design are housed within the library as core academic infrastructure. This research provides practitioners with a best-practice model for integrated assessment in a faculty development institute aimed at improvement. The institute’s efficacy focus maximizes broad applicability, focusing on overall online pedagogy development rather than simply the mechanics of creating an online course.


1:40 p.m.
Associations between Library Usage and Undergraduate Student GPA, 2016–2019 (Watch on YouTube; link opens in a new tab)
Presented by Felichism Kabo (University of Michigan), Doreen Bradley (University of Michigan), Stephanie Teasley (University of Michigan), and Ken Varnum (University of Michigan)

Slides (PDF)
Library Learning Analytics Project website
Show Abstract

PURPOSE AND GOALS

The Library Learning Analytics Project (LLAP; https://libraryanalytics.org/) examines how libraries impact learning outcomes, specifically in the areas of course instruction, publications, and funded research. Learning in these three areas requires that members of the university community engage in activities such as accessing digital data and publication repositories, conducting literature reviews and managing citations, and creating data management plans. These activities usually entail interacting with the library physically, such as attending a library instruction session, or virtually, such as when a user retrieves materials through the library’s proxy server. We hypothesize that the degree of use of library resources among individuals is associated with learning outcomes. Finally, multi-institutional studies are more likely than analyses of single institutions to enable holistic analysis of the complex impacts of the library on learning. The ability to design and implement studies of this nature has been limited by the lack of cross-institutional frameworks to enable the sharing of scripts, protocols, and other resources critical to library learning analysis. LLAP bridges this gap by partnering with 14 diverse institutions, our project advisory group (PAG), in the development of sharable data dictionaries, scripts, and protocols on the basis of principled and inclusive engagement.

DESIGN, METHODOLOGY, OR APPROACH

This paper focuses on the association between electronic resource use captured by EZproxy event logs (defined as being associated with one or more EZproxy sessions during an academic term) and course instruction outcomes (semester GPA and cumulative GPA) among the 29,337 undergraduate students enrolled at the University of Michigan (U-M) during the six-semester period from fall 2016 through winter 2019. Building on models of information-seeking behavior, we developed a theoretical framework that correlates short-term and long-term student performance with EZproxy use, controlling for variables such as socio-demographics and academic background (Johnson, 1997; Wilson, 1999). We took raw, unstructured EZproxy logs and cleaned and structured them using Python scripts and regular expressions. We then entered the data into a relational database using Structured Query Language (SQL) scripts. Over 80% of the EZproxy data have strong university identifiers, which facilitated merges with other administrative data, such as course instruction and student data. Using SQL and R scripts, we merged the data and exported the resultant data set into the R programming environment for modeling and analysis.
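
As an illustration of the cleaning and structuring step, a simplified Python sketch is shown below; the log layout and field names are assumptions (EZproxy log formats are locally configurable), and this is not the project’s actual script.

    import re

    # One named group per field we want to keep from a hypothetical log layout.
    LOG_LINE = re.compile(
        r'^(?P<ip>\S+) (?P<session>\S+) (?P<user>\S+) '
        r'\[(?P<timestamp>[^\]]+)\] "(?P<request>[^"]*)" (?P<status>\d{3})'
    )

    def parse_log(path):
        """Yield structured dicts for log lines that match the expected layout."""
        with open(path, encoding="utf-8", errors="replace") as fh:
            for line in fh:
                m = LOG_LINE.match(line)
                if m:
                    yield m.groupdict()

    # Each dict could then be loaded into a relational database and joined, via the
    # university identifier, with course instruction and student administrative data.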

The theoretical framework suggests that student outcomes are a function of factors such as race and gender that apply to all students (“fixed effects”) and of factors such as academic background, defined by disciplines, schools, and colleges (“random effects”). We also account for student random effects to capture unobserved, time-invariant factors, such as motivation or grit. Thus, we ran panel linear mixed-effects regression models of the association between EZproxy use and student GPA, contingent on students being enrolled in at least four semesters. We cleaned and documented the Python, SQL, and R scripts used for data management and analysis. We uploaded these scripts to a GitHub repository accessible to the PAG members, in addition to the data dictionaries for the EZproxy and associated university data.
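
A minimal sketch of such a panel model, written in Python/statsmodels rather than the project’s R workflow and using hypothetical variable and file names, could look like this:

    import pandas as pd
    import statsmodels.formula.api as smf

    panel = pd.read_csv("student_semesters.csv")   # one row per student per semester

    # Semester GPA regressed on an EZproxy-use indicator plus student-level controls,
    # with a random intercept per student to absorb unobserved, time-invariant factors.
    model = smf.mixedlm(
        "semester_gpa ~ used_ezproxy + high_school_gpa + first_gen + gender + race + class_status",
        data=panel,
        groups=panel["student_id"],
    )
    result = model.fit()
    print(result.summary())   # the coefficient on used_ezproxy is the quantity of interest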

FINDINGS

We were able to identify 58% of all undergraduates as having one or more EZproxy sessions during an academic term. The regression models showed positive associations (at a significance level p < 0.1) between EZproxy use and student GPA, controlling for “intervening” variables including race, gender, high school GPA, family income, first-generation status, and class status. Using EZproxy during an academic term was correlated with a semester GPA 0.135 points higher than that of students whom we could not identify as EZproxy users. Similarly, EZproxy use was correlated with a student’s cumulative GPA being 0.017 points higher than that of a comparable student whom we could not identify as an EZproxy user. These findings suggest that using specific library services has differential impacts on students’ short- and long-term performance. For example, for semester GPA, first-generation students had a lower GPA (-0.119) than their peers, and males had a lower GPA (-0.091) than females. Thus, gender and first-generation status had impacts on semester GPA that were smaller in magnitude than the impact of using EZproxy during the academic term.

PRACTICAL IMPLICATIONS OR VALUE

Because library data are often not integrated with other university data, there are major obstacles to demonstrating the complex value of academic library use to the students who use these resources. We show how merging EZproxy and student outcome data at one institution yields valuable insights into the value of the academic library, and we have created and shared scripts and tools that enable replication of our work in other institutional settings.


2:00 p.m.
Privacy Expectations: Student Perspectives on Learning Analytics in Libraries & Higher Education (Watch on YouTube; link opens in a new tab)
Presented by Michael Perry (Northwestern University) and Andrew Asher (Indiana University-Bloomington)

Show Abstract
PURPOSE AND GOALS

This paper will highlight findings from the second phase of the Data Doubles Project, a three-year, student-centered, Institute of Museum and Library Services (IMLS)-funded project that seeks to understand student perspectives on privacy issues associated with academic library participation in learning analytics (LA) initiatives. A total of 40,000 students across eight institutions were invited to participate in a quantitative survey regarding their perspectives on privacy and learning analytics.

DESIGN, METHODOLOGY, OR APPROACH

The phase II survey built on the findings of the Data Doubles Project’s first phase (112 qualitative interviews) by developing and validating a quantitative instrument to evaluate student perspectives on privacy in a statistically representative format and to examine demographic differences that may affect students’ opinions. Construct and content validity were evaluated using a subject-matter expert panel review and cognitive interviews with 10 students before the survey was finalized. Stratified random samples were utilized at each of the eight participating institutions to ensure a representative survey population. The survey closes on March 30, 2020, and first findings will be available for this presentation.
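
As an illustration of the sampling approach (not the project’s code; the roster columns and sampling fraction below are assumptions), a stratified random sample can be drawn in Python along these lines:

    import pandas as pd

    roster = pd.read_csv("student_roster.csv")           # hypothetical institutional roster
    # Strata defined by, e.g., class level crossed with school/college (assumed columns).
    roster["stratum"] = roster["class_level"] + "_" + roster["college"]

    # Sample 10% within each stratum so the invitation pool mirrors the population;
    # GroupBy.sample requires pandas >= 1.1.
    invitees = (roster.groupby("stratum", group_keys=False)
                      .sample(frac=0.10, random_state=42))
    invitees.to_csv("survey_invitees.csv", index=False)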

FINDINGS

The survey addresses five thematic areas related to student expectations of learning analytics practices: knowledge about learning analytics; data sharing; access and limits; rights and consent; and privacy and trust. Selected first findings in each of these modules will be available for this presentation.

PRACTICAL IMPLICATIONS OR VALUE

Learning analytics continues to be an important trend for both libraries and higher education. The findings from this study will allow practitioners to understand student perspectives and incorporate them into library learning analytics practices.


2:20 p.m.
20-minute break


2:40 p.m.
Open Access Publishing: A study of UC Berkeley faculty views and practices, with implications for Library support (Watch on YouTube; link opens in a new tab)
Presented by Chan Li (University of California, Berkeley), Becky Miller (University of California, Berkeley), and Mohamed Hamed (University of California, Berkeley)

Slides (PDF)
Show Abstract

PURPOSE AND GOALS

The UC Berkeley Library has taken many actions to advocate for open access (OA) publishing and to promote a publishing ecosystem where the impact of research can be maximized by removing readership barriers. One example is negotiating transformative agreements with academic publishers. The success of these efforts relies not only on the Library and publishers, but also on the support and commitment of our researchers, particularly our faculty. But what are our faculty’s opinions regarding OA issues? How often do they publish OA? What are the possible barriers for faculty to publish OA? How do their publication behavior and OA views interact? Finally, how should the Library help?

Our research (as part of the ARL Research Library Impact Pilots) aims to look into possible areas of correlation between Berkeley faculty’s OA research output, research impact, and their opinions on various open access issues. We also plan to investigate demographic differences, including faculty’s discipline, years of experience, job title, and research funding support availability, to identify the behavior and perception differences among different groups in order to provide more customized library support.

DESIGN, METHODOLOGY, OR APPROACH

The Berkeley Library Ithaka Faculty Survey conducted in October 2018 included several questions related to open access and the library’s role in this area. A total of 811 faculty (30%) responded to the survey, and their responses will be analyzed to evaluate faculty’s perceptions of open access publishing. In addition, scholarly publication output data for the same group of faculty for the years 2016 to 2019 will be retrieved from the Scopus database using multiple APIs. Scholarly output will be measured by number of publications, number of open access publications, and number of citations. Furthermore, the survey responses, publication output, and faculty demographic information will be analyzed together to study their correlation.
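
As an illustration of the retrieval step, the sketch below queries the Elsevier Scopus Search API with the requests library. The author-ID query syntax and the response fields accessed ("openaccess", "citedby-count") are assumptions based on the public API documentation and should be verified; pagination and error handling are omitted for brevity.

    import requests

    API_KEY = "YOUR-ELSEVIER-API-KEY"              # placeholder credential
    SEARCH_URL = "https://api.elsevier.com/content/search/scopus"

    def faculty_output(author_id, start_year=2016, end_year=2019):
        """Return (publications, open access publications, citations) for one author."""
        query = (f"AU-ID({author_id}) AND "
                 f"PUBYEAR > {start_year - 1} AND PUBYEAR < {end_year + 1}")
        response = requests.get(SEARCH_URL,
                                headers={"X-ELS-APIKey": API_KEY},
                                params={"query": query, "count": 200})
        response.raise_for_status()
        entries = response.json().get("search-results", {}).get("entry", [])
        n_publications = len(entries)
        n_open_access = sum(1 for e in entries if e.get("openaccess") == "1")
        n_citations = sum(int(e.get("citedby-count", 0)) for e in entries)
        return n_publications, n_open_access, n_citations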

FINDINGS

Our 2018 Ithaka Faculty Survey revealed that 66% of our faculty consider it very important that the Library provide active support that helps to increase the productivity of their research and scholarship. Also, 71% of our faculty (a higher percentage than at our peer institutions) would be happy to see the traditional subscription-based publication model replaced entirely by an open access publication system in which all scholarly research outputs would be freely available to the public. However, the survey also indicated that a higher percentage of faculty prefer no cost to publish over no cost to read, so cost appears to be a barrier to open access publishing for our faculty.

At the same time, by looking at their current scholarly publication output, we found that more than a third of their publications are open access. Our in-depth correlation analyses will reveal more information about different demographic groups, such as which of these groups have the highest open access output, which groups are more supportive of open access publishing, which groups are more concerned with the publishing cost, and which groups value the Library’s role in increasing their research productivity the most.

PRACTICAL IMPLICATIONS OR VALUE

This research will help us to obtain a better understanding of faculty’s views on OA and their OA publishing practices. It will not only identify the user groups who would most need the Library’s support in this area, but also provide insight into ways the Library can support open access publishing and maximize the Library’s impact in these areas. As more and more transformative OA agreements are negotiated with academic publishers, this research should be repeated in the future to see how faculty’s views on OA and OA publishing practices have changed and to study the long-term outcome.


3:00 p.m.
Developing best practices and ethical guidelines for assessing reuse of digital content (Watch on YouTube; link opens in a new tab)
Presented by Joyce Chapman (Duke University Libraries) and Caroline Muglia (University of Southern California)

Slides (PDF)
Show Abstract

PURPOSE AND GOALS

While digital library practitioners measure “use” of digital collections using access metrics, they rarely measure or assess “reuse” in research, social media, instruction, and other formats. Reuse metrics are often anecdotal and ephemeral, which poses a challenge to collecting them and comparing them with other metrics. To that end, the Digital Content Reuse Assessment Framework Toolkit (D-CRAFT) is developing guidelines and best practices for practitioners in galleries, libraries, archives, museums, repositories, and data (GLAMRD) organizations to assess how users engage with, reuse, and transform digital content. D-CRAFT is a multi-year IMLS-funded project that began in summer 2019 and builds on the previous Digital Library Federation project, “Developing a Framework for Measuring Reuse of Digital Objects” (hereafter, “Measuring Reuse”).

DESIGN, METHODOLOGY, OR APPROACH

Ethical Guidelines for assessing reuse:

  • Using an evidence-based approach to authoring Ethical Guidelines for assessing reuse of digital content, the project team conducted an environmental scan of existing codes of ethics for a wide range of GLAMRD organizations as well as computing and information management organizations. The project team employed both qualitative and quantitative methods to analyze the resulting corpus. The team utilized Voyant Tools to visualize the corpus and perform text analysis. The group performed both gap and theme analyses. The final document will focus on core values, ethical standards and issues related to digital object reuse assessment, responsibility and accountability, privacy, inclusion, professionalism, and transparency. The standards will center traditionally underrepresented communities’ concerns and ideas, as well as overall user privacy. The draft guidelines will be reviewed by the D-CRAFT grant’s Advisory Group, traditionally underrepresented populations, and by the cultural heritage and assessment communities.
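
As an illustration of the quantitative side of this corpus analysis (the project itself used Voyant Tools; the snippet below is a generic term-frequency sketch with hypothetical file names, not the project’s workflow):

    from collections import Counter
    from pathlib import Path
    import re

    # Corpus of ethics codes gathered during the environmental scan (hypothetical directory).
    corpus_dir = Path("ethics_codes")
    term_counts = Counter()
    doc_frequency = Counter()                    # how many codes mention each term

    for doc in corpus_dir.glob("*.txt"):
        words = re.findall(r"[a-z']+", doc.read_text(encoding="utf-8").lower())
        term_counts.update(words)
        doc_frequency.update(set(words))

    # Terms tied to the guidelines' themes; low counts can point at gaps across codes.
    themes = ["privacy", "consent", "inclusion", "transparency", "accountability"]
    for term in themes:
        print(f"{term}: {term_counts[term]} mentions in {doc_frequency[term]} codes")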

Best Practices and Engagement and Education Tools:

  • The project team began by conducting a wide-ranging literature review around the seven Use Cases and the Levels of Engagement of the use and reuse matrix, developed based on focus groups conducted by the “Measuring Reuse” project. The team used Dedoose — a web application for qualitative data analysis aimed at facilitating collaborative mixed methods research — to code, thematically group, and tag excerpts from the resulting corpus. Sub-teams were then designated to begin developing Best Practices and educational tools by Use Case. Each group used the rich data in Dedoose to conduct a gap analysis, and perform further data gathering as needed (in some cases from less traditional sources, such as institutional websites or affinity group listservs).

As assessment, access, privacy, ethics, cultural competency, and educational tools are key pillars of the toolkit’s design, the grant provides funds to hire part-time consultants specializing in Privacy, Diversity, Assessment, Instructional Design, and Accessibility. Consultants contribute valuable expertise to key product development.

FINDINGS

The deliverables of D-CRAFT include Ethical Guidelines for assessing digital object reuse, Best Practices around assessment of digital content, and a suite of freely available engagement and education tools. The project also seeks to define use and reuse of digital assets and to delineate between them, addressing long-held frustrations in this area among GLAMRD practitioners.

Timeline for deliverables (all three areas are currently well underway):

  • Ethical Guidelines – April 2020
  • Use and Reuse Matrix and Definitions – Spring 2020
  • Best Practices and Educational Tools – Fall 2020

PRACTICAL IMPLICATIONS OR VALUE

The D-CRAFT toolkit will be a vital GLAMRD community resource that addresses the lack of common practices and instructional resources for assessing reuse of digital materials, provides definitive guidelines on what constitutes use and how it differs from reuse of digital content, and develops the first Ethical Guidelines for assessment and reuse of digital content.

D-CRAFT is a product of the GLAMRD community. This session will enable the D-CRAFT project team to collect valuable feedback on the project from the assessment community.


3:20 p.m.
Data Repositories as a Service Supporting Research and Teaching: A Conceptual Framework
Presented by Qiong Xu (Queens College, City University of New York)

Show Abstract

Purpose and goals

The purpose of the study is to explore a conceptual framework for data repositories as a service nurturing research data engagement to support research and teaching in an academic community. In the current study, research data engagement refers to the practice of research data management (RDM) and the use of research data for teaching. RDM practice includes research data deposit, sharing, and reuse for research, while using research data for teaching includes such activities as using the data for teaching research methods, teaching statistical concepts, and assigning research projects to students.

Design, methodology, or approach

A mixed-methods approach is employed to examine the relationships between intentions to use institutional data repositories (IDR) and related factors. The investigated relationships include (1) the relationship between faculty’s intentions to use IDR for research and such factors as awareness of research funders’ RDM policies, needs for RDM support, attitudes toward data services, and perceived usefulness of disciplinary data repositories (DDR); (2) the relationship between intentions to use IDR for teaching and such factors as attitudes toward using data for teaching, data engagement in teaching, and perceived usefulness of DDR; and (3) the relationship between research data engagement and other factors.

An online-administered survey questionnaire, which includes both closed-ended questions and open-ended comments, is used as the research instrument of this study. The survey questionnaire is adapted from prior studies and further designed based on an integrated theoretical framework combining the theory of planned behavior (TPB) and the Technology Acceptance Model (TAM). The questionnaire is hosted on Qualtrics and distributed to faculty participants via emails, flyers, social media, and website postings at a senior college in the United States.

The quantitative survey data (the responses to the closed-ended questions) are imported into statistical software packages (e.g., IBM SPSS and R) for data analysis. Statistical techniques such as principal component analysis, regression analysis, and structural equation modeling are used to build and test a conceptual model. The qualitative survey data (the respondents’ comments) are analyzed as a supplement.
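
For illustration only (the study uses SPSS and R; the file and column names below are hypothetical), the first two quantitative steps might look like this in Python, with structural equation modeling following on the reduced variables:

    import pandas as pd
    import statsmodels.api as sm
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    survey = pd.read_csv("faculty_survey.csv")                       # hypothetical Qualtrics export
    item_cols = [c for c in survey.columns if c.startswith("att_")]  # assumed attitude-item prefix
    survey = survey.dropna(subset=item_cols + ["intent_idr_research"])

    # Principal component analysis condenses correlated Likert-scale items.
    components = PCA(n_components=3).fit_transform(
        StandardScaler().fit_transform(survey[item_cols]))
    survey["comp1"], survey["comp2"], survey["comp3"] = components.T

    # Regression of intention to use the IDR for research on the components.
    X = sm.add_constant(survey[["comp1", "comp2", "comp3"]])
    print(sm.OLS(survey["intent_idr_research"], X).fit().summary())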

Findings

Preliminary results show that attitude toward data reuse and awareness of RDM policies significantly influence faculty’s data deposit activity, while subjective norms of data reuse significantly impact data sharing and reuse practice. Attitudes toward data services such as data curation, data finding, and data analytics have significant associations with awareness of RDM policies and needs for RDM support. The results also reveal some significant indirect effects through mediator elements, such as (1) the indirect effect of perceived usefulness of DDR on intentions to use IDR for teaching through attitudes toward using data for teaching and (2) the indirect effect of data engagement on perceived usefulness of DDR through attitudes toward using data for teaching. Furthermore, the results indicate that perceived usefulness of DDR significantly impacts faculty’s intentions to use IDR for research, while subjective norms of data sharing positively impact awareness of RDM policies.

Practical implications or value

Based on the investigation and findings about factors affecting intentions to use IDR, a conceptual model for data repositories as a service nurturing research data engagement is built and tested. The model explains whether and how data repositories can be used to support research and teaching in an academic community. With the conceptual framework, this study theoretically contributes to data repository and data curation research in library and information science through exploring and defining the concepts of research data engagement and related factors which impact the use of data repositories. With the findings, this research also practically provides insights for academic libraries to develop data services supporting research and teaching activities.


3:40 p.m.
Engagement pathways to transfer student success (Watch on YouTube; link opens in a new tab)
Presented by Anne Cooper Moore (University of North Carolina at Charlotte) and Rebecca Croxton (University of North Carolina at Charlotte)

Slides (PDF)
Show Abstract

PURPOSE AND GOALS

As the number of students entering higher education institutions from high schools decreases and the number of adults needing to complete or continue their education increases, we must develop a deeper understanding of the factors that contribute to transfer student retention and success. What role do out-of-the-classroom engagements play in transfer student success in comparison to first-time freshmen? A continuation of a previous study of undergraduate students who matriculated in summer/fall of 2012 through summer/fall of 2017 focuses on which co-curricular, extracurricular, pre-entry (high school GPA, number of incoming credits, Pell grant eligibility), and demographic factors (under-represented minority status, gender) contribute to transfer versus first-time freshman student retention and success at a large, public, research university in the southeast with a high transfer student population. The study investigates which student-level engagements in activities in the library (and total engagements in academic support and extracurricular activities) for transfer students versus first-time freshmen relate to student success as measured by year 1 to year 2 retention, cumulative GPA after 4 years, and time to graduation. The study reveals the role of the library and other academic support and extracurricular engagement in transfer student success and helps institutions understand what engagements they should emphasize with incoming transfer students.

DESIGN, METHODOLOGY, OR APPROACH

This project is part of an ongoing, longitudinal study of undergraduate student engagement and success data for students who matriculated in summer/fall 2012 through summer/fall 2017. For this project, the researchers conducted a comprehensive comparative analysis of students who entered the university as first-time-in-college freshmen and as transfer students, including an exploration of transfer student data to better understand the co-curricular, extracurricular, pre-college, and demographic factors associated with their success. The full dataset was analyzed using analysis of variance and linear regression on the following dependent variables: year 1 to year 2 retention, cumulative GPA after 4 years of enrollment, and time to graduation.
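
As an illustrative sketch only (the study’s own tooling is not specified, and the dataset and column names below are hypothetical), one form of this analysis in Python is:

    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.formula.api import ols

    students = pd.read_csv("engagement_outcomes.csv")       # hypothetical merged dataset

    # Linear model of 4-year cumulative GPA on engagement counts and incoming
    # characteristics, contrasting transfer students with first-time freshmen.
    model = ols("cum_gpa_4yr ~ library_engagements + total_cocurricular"
                " + hs_gpa + incoming_credits + pell_eligible"
                " + C(student_type)",                        # 'transfer' vs. 'first_time'
                data=students).fit()
    print(model.summary())
    print(sm.stats.anova_lm(model, typ=2))                   # ANOVA table for the fitted model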

FINDINGS

The fewer the total number of engagements with co-curricular services, the lower the retention rate and the 4-year cumulative GPA for transfer students. Conversely, engagement in co-curricular activities significantly relates to reduced time to graduation. Transfer students tend to use co-curricular services more in the first year than freshmen; however, they have significantly lower retention rates and 4-year cumulative GPAs than freshmen. Variance in first-year retention, 4-year GPA, and time to graduation for transfer students is explained by engagement in the following library activities: instruction, database access (authentication), computer logins, study room reservations, and book checkouts. Myriad other co-curricular and extracurricular engagements significantly contribute to transfer student success.

PRACTICAL IMPLICATIONS OR VALUE

The findings of the study will provide a model of the engagements in the library, other co-curricular activities, and extracurricular activities for transfer students as opposed to first-time freshmen with various incoming characteristics, such as incoming credits, high school GPA, Pell grant eligibility, under-represented minority status, and gender. As part of a longitudinal project that creates an institutional repository of student-level data that can be mined to understand the factors that contribute to student success, this focus on pre-existing characteristics and on out-of-the-classroom engagements for transfer students as a population will help libraries and universities determine how to help this important population succeed.


4:00 p.m.
15-minute break


4:15 p.m.
From Mapping to Menus to Modules: A Comprehensive Instructional Services Assessment leads to changes in library instruction and the university curriculum (Watch on YouTube; link opens in a new tab)
Presented by Devin Savage (Illinois Institute of Technology) and Yi Han (Illinois Institute of Technology)

Slides (PDF)
Show Abstract

PURPOSE AND GOALS

Assessment of instructional services is growing into maturity as an area of expected activity across academic libraries. At our mid-sized, research-intensive, doctoral-granting institution, we wanted not only to push forward an understanding of individual efficacy as we grappled with the opportunities offered by the Information Literacy Framework, but also to better understand the possible areas for learning interventions throughout each program at the university. Neither information literacy nor its associated skills were officially recognized anywhere in the curriculum at our institution, and our value proposition for library instruction sessions largely centered on increasing students’ searching efficacy. We needed a comprehensive and structural effort to change what we did and where we were allowed to do it, so that we could better serve the learning outcomes of our students.

DESIGN, METHODOLOGY, OR APPROACH

In the Fall of 2017, we led our library liaison team in a process of curriculum mapping, wherein we gathered and reviewed syllabi from across the campus to identify learning objectives, assignments, and resources used. This included a literature review on curriculum mapping and other assessments of instructional services. Although we encountered some significant challenges, we also found some immediate, actionable insights. We began to craft and categorize the information in January of 2018, but we continued to receive further syllabi until the Fall of 2018, which is when we scoped our study and did our initial analysis. During this time, the Head of Research and Instruction Services also held a weekly workshop series for the instruction team to engage with the Framework for Information Literacy. This process provided opportunities for a comprehensive assessment of instructional needs on campus, adjusting in-class activities and assessment instruments, changing outreach strategies, and identifying our librarians’ needs for instruction support. These last few opportunities allowed us to work through the shift afforded by the Framework for Information Literacy, create an instruction toolkit, and change our more generic offerings into an instruction menu which articulated what we would be able to provide to campus partners.

FINDINGS

We found a variety of intriguing results, from the somewhat expected findings that there was no clear home for information literacy in the curriculum and that there was a dearth of undergraduate interaction with databases or research literature within some programs, to the unexpected discovery of certain academic programs. We also found there were drawbacks to the on-demand one-shot model we currently see. Students may not have been through sessions, or may have been through them multiple times, and the content covered may not match sequentially or appropriately to the needs of the students. The curriculum mapping exercise, as well as the internal and external conversations that accompanied it, provided multiple opportunities for assessment, evaluation, adaptation, and implementing, communicating, and assessing further changes. Such changes included in-class activities, our formative assessments, and not only the way we approached faculty but also what we offered to them. This was particularly helpful once conversations about addressing information literacy began to take shape on campus, which led directly to the formation of the Information Literacy module pilot program. Our Information Literacy module assessments have already provided some rather surprising results regarding students’ keyword searching and plagiarism awareness.

PRACTICAL IMPLICATIONS OR VALUE

Not only were we able to more than double the number of instruction sessions within two years, we were also able to provide the University’s senior leadership with data-informed arguments for embedding information literacy within the curriculum. We presented our findings and our proposals for changes to the incipient campus Teaching and Learning Center, the University Faculty Council, and the Undergraduate Studies Committee. This helped position us for a collaboration with our Provost, in coordination with Deans and Chairs, to create a two-part Information Literacy module to embed in a composition class, which we piloted in the Spring of 2020 and are planning to enact across the board in the 2020-2021 academic year. This also necessitated a new approach to assessing existing student skills as well as student learning outcomes, and so we have piloted a pre-test and post-test in the module along with an annotated bibliography. The practical implications of our experience in this multi-year endeavor include the value of 1) creating a comprehensive investigation, 2) scoping that comprehensive exercise into representative components, 3) creating opportunities for “action research” and adjusting implementation and outreach mechanisms, and 4) having decision-makers at the university level be informed by data. This comprehensive assessment process ended up on a much slower timeline than anticipated, but it has provided substantive improvements in our instructional offerings at the individual, programmatic, and university-wide levels.


4:35 p.m.
Developing an Assessment Plan for 100-level Information Literacy Courses at the University of Northern Colorado (Watch on YouTube; link opens in a new tab)
Presented by Brianne Markowski and Lyda McCartin (University of Northern Colorado)

Slides (PDF)
Show Abstract

PURPOSE AND GOALS

After the release of the ACRL Framework for Information Literacy for Higher Education, the University of Northern Colorado Libraries revised our shared student learning outcomes (SLOs) for all 100-level information literacy credit courses. In Fall 2015, we began work on a plan to assess these SLOs across 100-level courses using signature assignments. Signature assignments are course-embedded assignments, activities, projects, or exams that are (ideally) collaboratively created by faculty to collect evidence for a specific learning outcome. These assignments are then embedded in all sections of a course, regardless of instructor. This paper will describe the process we used to develop the assessment plan and signature assignments, discuss benefits of this approach, and share lessons learned along the way.

DESIGN, METHODOLOGY, OR APPROACH

The assessment plan for the shared 100-level SLOs was collaboratively developed by the Libraries’ Curriculum Committee, comprised of all librarians who teach in the information literacy credit course program. In the first year, we agreed to begin developing the plan with what we felt was the easiest SLO to assess: students will be able to evaluate information. We used the following process to develop the assessment plan.

Step 1: Develop signature assignment. Each librarian proposed assignments for the SLO, based on how we were already assessing evaluation in our courses and ideas drawn from the literature. After vigorous debate, we agreed upon a signature assignment to pilot.

Step 2: Pilot signature assignment. In order to evaluate how the proposed signature assignment would work in the classroom, two librarians embedded the assignment in their courses and collected data from students in their course sections.

Step 3: Determine analysis procedure. While the pilot was underway, we discussed how we would analyze the data once collected, looking for examples of how others had assessed similar learning outcomes and modifying those procedures to fit our needs. At the end of the semester, we tested the analysis procedure on data collected from the pilot.

Step 4: Set achievement benchmark. After discussing the results from the pilot, we agreed upon an initial benchmark level of achievement for the SLO. This would be used to measure our success in achieving the SLO.

Step 5: Document the assessment method for the SLO.

In the second year, we implemented the assessment in all sections of the 100-level courses and began the process again for the next SLO. Now in the fifth year, we have successfully embedded signature assignments for each SLO into all sections of 100-level information literacy courses taught at the University of Northern Colorado. At the end of the 2019-2020 academic year, we will have completed one full cycle of the assessment plan.

FINDINGS

This process resulted in a robust assessment plan for our information literacy course program. The plan outlines a method of assessing each of the SLOs, a schedule for when data are collected and analyzed, who is responsible for this work, and how the results are shared. The process of collaboratively developing signature assignments has led to valuable conversations among librarians about what we teach and why. The process also allows us to discuss effective strategies for teaching as we look at student achievement across courses. Implementing this plan for systematically assessing SLOs in our courses has also improved teaching and student learning.

PRACTICAL IMPLICATIONS OR VALUE

Throughout this process there were a number of lessons learned. The length of time needed to fully develop the assessment plan was longer than we initially thought. During this process, we also revised multiple SLOs as we discovered the SLO language did not truly reflect what we wanted to know about student learning. With all the changes taking place, it was important to stay flexible and document each decision we made along the way. Finally, even though we approached the assessment plan through a collaborative design process, challenges with buy-in arose as librarians joined and departed the Curriculum Committee.

Though assessment of information literacy has become widespread in recent years, few examples describe assessment of SLOs in information literacy credit courses. Those that do typically look at one course rather than assessing SLOs across multiple sections and instructors. While the process described here was used to assess shared credit course SLOs, the process could be used by libraries looking to assess library-wide SLOs for one-shot or embedded information literacy instruction programs.


4:55 p.m.
Assessing the United Nations’ Sustainable Development Goals in academic libraries (Watch on YouTube; link opens in a new tab)
Presented by Clare Thorpe (University of Southern Queensland)

Slides (PDF)
Show Abstract

PURPOSE AND GOALS

The United Nations’ 2030 Agenda for Sustainable Development identifies 17 goals and 169 targets as a shared blueprint for peace, prosperity, people, and the planet (United Nations, 2015). Australian university libraries, through the Council of Australian University Librarians (CAUL) in partnership with the Australian Library and Information Association (ALIA), have started documenting and planning how university libraries contribute to the Sustainable Development Goals (SDGs), including the identification of assessment frameworks and key performance indicators for CAUL members. In 2019, the University of Southern Queensland (USQ) Library stepped through an exercise of understanding how our day-to-day work and annual planning targets mapped to the SDGs. This paper will outline how USQ Library undertook the mapping process and discuss how academic libraries can assess their own contribution to the SDGs.

DESIGN, METHODOLOGY, OR APPROACH

The approach of this paper is a case study. The authors will outline how we mapped USQ Library’s services, projects and action plans to the SDGs and how the mapping exercise was communicated to library staff, senior university leaders and industry professionals. The paper will situate this activity among the broader approaches being taken by the Australian library community, including the 2030 stretch targets for Australian libraries (Australian Library and Information Association, 2019).

FINDINGS

USQ Library staff found that existing services, collections, and projects correlated to 8 of the 17 SDGs. Activities were mapped to these eight goals and reported to the senior executive of the university. The mapping exercise increased library staff’s awareness of the broader cultural and societal implications of their roles. The communication strategy led to conversations that increased university leaders’ awareness of the SDGs and of the value and impact of USQ Library in improving access to information and the library’s role in transforming the lives of USQ students and community.

PRACTICAL IMPLICATIONS OR VALUE

The United Nations’ 2030 Agenda for Sustainable Development provides an opportunity for university libraries to assess, demonstrate, and communicate the breadth and depth of their contribution to quality education, reducing inequalities, supporting innovation and economic growth, and building communities. By undertaking an exercise to map collections, services, and projects to the UN SDGs, USQ Library has been able to demonstrate the knowledge and information infrastructures by which we enable student achievement and research excellence. By sharing our experience we hope to encourage more university libraries to engage with the SDGs both as a benchmarking tool and as a challenge to set stretch targets aligned with the United Nations’ 2030 agenda.


5:15 p.m.–5:25 p.m.: Questions and wrap-up

Moderated by Megan Oakleaf (Syracuse University) and Klara Maidenberg (University of Toronto)