Session 1: Measurement and Measures Indicators I
Outcome Measurement in Academic Libraries: Adapting the Project Outcome Model
Eric Ackermann (Radford University), Sara Goek (Association of College & Research Libraries), and Emily Plagman (Public Library Association (PLA), ALA)
In 2014, the Public Library Association (PLA) launched Project Outcome, a free online toolkit designed to help public libraries understand and share the impact of their programs and services by providing simple surveys and an easy-to-use process for measuring and analyzing outcomes. Project Outcome also gives libraries the resources and training support needed to apply their results and confidently advocate for their future. This session will begin with an overview of the Project Outcome model and the results of activity after four years of engagement and use in the public library field.
Expanding upon this successful model, PLA has partnered with the Association of College & Research Libraries (ACRL) to develop a version of Project Outcome for academic libraries. While outcome assessment may be common in higher education already, the Project Outcome model will offer practitioners access to a standardized set of measures and a free, easy-to-use toolkit. It includes seven patron-focused surveys, an online dashboard of interactive tools for collecting and analyzing the data, and practical guidance on using the results. This toolkit will provide academic libraries of any size the means to easily measure the learning outcomes of their programs and services and to use that data as the basis for improvements and advocacy. ACRL appointed a Task Force for this work in early 2018 and will launch the new tool in April 2019. In this session the Task Force chair and ACRL staff will share the theory of change for learning outcomes in academic libraries, initial results from field-testing of the new surveys, and how Project Outcome can create opportunities for growth or change.
With over 1,000 libraries collecting more than 130,000 patron surveys in the system, PLA’s Project Outcome has helped the public library field collectively move towards the use of standardized outcome measures. In adapting this model for academic libraries, ACRL intends to support its members and the field to create momentum towards “outcome measurement as common practice.”
Aligning Textbook Affordability with State Performance Based Funding Metrics
Penny Beile (University of Central Florida)
In 2016, the Florida Virtual Campus (FLVC) administered a state-wide survey to higher education students to examine how the cost of textbooks impacted their education, purchasing behaviors, and academic success. More than 22,000 students responded to the invitation, and the FLVC ultimately reported that “the high cost of textbooks is forcing many Florida higher education students to make decisions that compromise their academic success.” Survey responses specific to the home institution, University of Central Florida (UCF), revealed that of the 1,975 UCF students who completed the survey, 53% “frequently” or “occasionally” had not purchased a textbook due to cost, and 19% attributed obtaining a poor course grade to not having the textbook. This environment served as an impetus for librarians and instructional designers to begin collaborating to promote and facilitate adoption of affordable textbook alternatives. On another front, in 2014 the Florida Board of Governors (BOG) approved the Performance Based Funding (PBF) Model, which is designed to reward excellence or improvement across ten different metrics. Metrics are most closely associated with what is traditionally thought of as student success indicators, such as graduation and retention rates, degrees awarded in areas of strategic emphasis or without excess hours, and average cost to the student. Since inception of the PBF model, UCF has received over $330 million in recurring base and performance based funding, which comprises a significant portion of the institutional budget. The “cost to attend college” metric is based on tuition and fees, books and supplies, and financial aid provided to the student. The cost of books and supplies as calculated by the College Board served as a proxy until last fall, when a new methodology was approved. The current books and supplies sub-metric is now calculated using bookstore information and the percentage of open access use. 
Data related to affordable adoptions have been tracked since 2016, consisting of faculty name and college, course information, year and semester of adoption, type of adoption, cost of the traditional text, and savings, calculated as the cost of the replaced textbook multiplied by the number of student enrollments. These efforts also have been assessed using the COUP Framework for Evaluating OER, with results of the study widely disseminated across campus. The COUP Framework suggests various methods for assessing the impact of affordable textbooks, covering cost savings, student academic outcomes, student use of the OER, and perceptions of the resource. The Institutional Effectiveness office was aware of this work and consequently asked the author to submit textbook affordability metrics in support of the “cost of books and supplies” PBF sub-metric. Session attendees will learn of the FL PBF model (a model widely adopted nationwide), different methods used to facilitate adoptions of affordable textbooks at UCF, internal data collection procedures, and how the program was assessed using the COUP Framework and disseminated widely, in turn positioning the library and its sister online learning unit to be key entities in supporting student success and providing institutional funding metrics.
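As a minimal sketch, the savings calculation described above (old textbook cost multiplied by student enrollments) might look like this; the function name and figures are illustrative, not UCF's actual procedure:

```python
# Estimated student savings from an affordable-textbook adoption:
# the cost of the replaced (traditional) textbook multiplied by the
# number of enrolled students in each section that adopted it.

def adoption_savings(old_text_cost, enrollments):
    """Total estimated savings across all sections using the adoption."""
    return old_text_cost * sum(enrollments)

# Example: a $150 textbook replaced in three sections of 40, 35, and 25 students.
savings = adoption_savings(150.00, [40, 35, 25])
print(f"${savings:,.2f}")  # $15,000.00
```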
Finding Hidden Treasures in the Data
Carolyn Dennison and Jan Sung (University of Hawai’i at Mānoa)
Purpose: Libraries rely on statistics to capture who their patrons are and what resources they use. Vendor-provided statistics, such as how many times an e-resource was accessed, may provide some insights about our patrons. However, they do not capture all aspects of patron behavior. This presentation explores how local evidence obtained from an EZproxy server can be utilized to illustrate a broader picture of patron behaviors. This local evidence, in combination with research evidence and librarian expertise, forms the basis of an evidence-based approach to assessment and decision making.
Approach: The University of Hawai‘i at Mānoa Library, which serves a land-, sea-, and space-grant research institution, requires almost all of its patrons to authenticate through an EZproxy server in order to access e-resources. Using EZproxy log data captured from July 2016 to June 2017 at the point of entry, this presentation will answer who is accessing resources, what resources are being used, and where and when they are being accessed.
Findings: By analyzing around 350,000 entry points, differences in behavior among undergraduate students, graduate students, and faculty members were identified in terms of time and day of access, location, and the databases or resources used. Repeated access to particular articles or books by a single user still needs to be evaluated.
Practical implications or value: Analyzing statistics such as when and where users access e-resources may help library staff determine when staff assistance is needed and which patron groups may require additional or modified support services to encourage and facilitate use. Analyzing patron access to e-resources can also clarify aspects of patron behavior, such as which patrons access the same resource (e.g., an article or e-book) multiple times, which can be used to determine whether vendor-provided statistics are inflated. This may ultimately influence the decision-making process in managing e-resources.
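EZproxy log layouts are locally configurable, so the sketch below assumes a common NCSA-style format with the authenticated username in the third field; the patron-group lookup is a hypothetical stand-in for campus directory data:

```python
import re
from collections import Counter
from datetime import datetime

# Assumed NCSA-style EZproxy log line (actual layouts vary by configuration):
# 128.171.0.1 - jdoe [15/Jul/2016:14:02:31 -1000] "GET http://db.example.com/x HTTP/1.1" 200 5120
LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ (?P<user>\S+) \[(?P<ts>[^\]]+)\] "(?P<method>\S+) (?P<url>\S+)'
)

# Hypothetical mapping from username to patron group, e.g. from directory data.
PATRON_GROUPS = {"jdoe": "undergraduate", "asmith": "faculty"}

def summarize(log_lines):
    """Count accesses by patron group, hour of day, and resource host."""
    by_group, by_hour, by_host = Counter(), Counter(), Counter()
    for line in log_lines:
        m = LOG_RE.match(line)
        if not m:
            continue  # skip malformed lines
        ts = datetime.strptime(m["ts"], "%d/%b/%Y:%H:%M:%S %z")
        host = m["url"].split("/")[2] if "://" in m["url"] else m["url"]
        by_group[PATRON_GROUPS.get(m["user"], "unknown")] += 1
        by_hour[ts.hour] += 1
        by_host[host] += 1
    return by_group, by_hour, by_host
```

Aggregating the resulting counters by user group against time, location, and resource host yields the kind of who/what/where/when picture the presentation describes.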
Smart Data, Smart Library: Assessing Implied Value through Big Data
Jin Xiu Guo (Stony Brook University) and Gordon Xu (Northern Michigan University)
The growing expenditure on electronic resources has become a new norm for academic libraries. It is crucial for library administration to measure the impact of such investment consistently and persistently, and then develop collection strategy accordingly. Big data technology provides an arena for management to gain insights from meaningful data and allows libraries to optimize collection operations in real time. The purpose of this study is to assess the implied value of a research library by analyzing resource-use data with BigQuery, a cloud-based data warehouse. The authors develop a systematic approach to process structured data, including e-resource usage and interlibrary loan transactions, and then analyze the data in BigQuery. Meanwhile, Google Data Studio is utilized to visualize the results. The findings of this study not only manifest the implied value of the research library but also offer an innovative approach to predicting future collection needs. The methodology employed in the study also provides a new opportunity for libraries to adopt big data technology and artificial intelligence to tackle intricate problems and make smart, informed decisions in this big data era.
Do Download Reports Reliably Measure Journal Usage?
Douglas Steigerwald, Ted Bergstrom, and Alex Wood-Doughty (University of California, Santa Barbara)
Download rates of academic journals have joined citation rates as commonly used indicators of the value of journal subscriptions. While citation rates reflect worldwide influence, the value that a single library places on access to a journal is probably more accurately measured by the rate at which it is downloaded by local users. If local download rates accurately measure local usage, there is a strong case for employing download rates to compare the cost-effectiveness of journals. We examine download data for more than five thousand journals subscribed to by the ten universities in the University of California system. We find that, controlling for measured journal characteristics (citation rates, number of articles, and year of download), download rates, as captured by the ratio of downloads to citations, differ substantially across academic disciplines. This suggests that discipline-specific adjustments to download rates are needed to construct a reliable tool for estimating local usage. Even after adding academic discipline to the variables we control for, we find that there remain substantial “publisher effects,” with some publishers recording significantly more downloads than would be predicted by the characteristics of their journals. While the usage tool can be modified to incorporate the publisher effect, this raises the question of what causes such substantial differences across publishers once journal and discipline characteristics are accounted for.
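One common way to estimate such discipline and publisher effects (a sketch, not necessarily the authors' specification) is a log-linear regression with fixed-effect dummy variables; the data here are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic journal-level data standing in for the UC download dataset.
n = 400
disciplines = rng.integers(0, 4, n)   # 4 hypothetical disciplines
publishers = rng.integers(0, 3, n)    # 3 hypothetical publishers
log_citations = rng.normal(5, 1, n)
disc_effect = np.array([0.0, 0.5, -0.4, 0.2])
pub_effect = np.array([0.0, 0.8, -0.3])
log_downloads = (1.0 + 0.9 * log_citations
                 + disc_effect[disciplines] + pub_effect[publishers]
                 + rng.normal(0, 0.1, n))

# Design matrix: intercept, log citations, then dummy variables for
# discipline and publisher (the first category of each is the baseline).
X = np.column_stack(
    [np.ones(n), log_citations]
    + [(disciplines == d).astype(float) for d in (1, 2, 3)]
    + [(publishers == p).astype(float) for p in (1, 2)]
)
beta, *_ = np.linalg.lstsq(X, log_downloads, rcond=None)

# beta[2:5] recover the discipline effects, beta[5:7] the publisher effects:
# a journal's expected downloads shift by these amounts beyond what its
# citation rate alone would predict.
print(np.round(beta, 2))
```

Large residual publisher coefficients after controlling for journal and discipline characteristics are exactly the "publisher effects" the paper describes.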
Data Modelling and Why it Matters
Frankie Wilson (University of Oxford)
Purpose: This paper will describe my journey into the world of data modelling, and why it is a useful tool for assessment librarians.
Approach: As part of a project to automate the collection, reporting and visualisation of key assessment data, the Bodleian Libraries (University of Oxford) contracted with Counting Opinions to create five easy-to-use online forms to gather data from library staff and turn it into a single output file suitable for importing into Tableau.
The statistics to be gathered were not complicated: for example, the number of enquiries answered during the sample week. The Bodleian Libraries is a large library system, but not atypical: 700+ staff, 29 libraries, broadly subject-focussed, plus ‘back office’ and administration buildings, and a book storage warehouse.
However, I struggled to explain to the developers in Canada how the enquiry statistics were organised: Every person in the library, for every day in a specific week, keeps a tally of the number of enquiries they receive. They report the daily number to a team ‘rep’ who sums them to get a team daily total. The rep then enters this figure into the form. Some libraries have more than one team; some teams cover more than one library; and some teams are not in any library. It took hours of phone calls, a face-to-face meeting and hundreds of emails to correctly set up the ‘rules’ for combining the data from the forms into a single output.
This is no reflection of the developers’ ability to understand, nor actually of mine to explain, but is because so much of our knowledge about our data is tacit.
Findings: As humans, we do not experience the world as pure input from our senses. We situate that data in context: if I ask you how to get to Birmingham, your answer will depend on whether we are in Alabama, Indiana, Iowa, or England. This tacit knowledge is something only revealed by immersion in the context and regular shared experience with a community of practice. No wonder I couldn’t explain it over the phone!
Computer programmes only work if all the necessary inputs are available. Developers are therefore used to breaking down complicated concepts so there is no ambiguity. The questions the developers asked me teased out my tacit knowledge, and led me into the hitherto unknown world of data modelling.
A data model is a conceptual framework that represents the world as accurately as possible. It defines the actors, actions and rules that govern the ways that processes work, representing them in a standard syntax.
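As a sketch, the enquiry-statistics example above might be modelled like this; the entity names, fields, and the combining rule are illustrative, not the Bodleian's actual schema:

```python
from dataclasses import dataclass, field

# Actors and rules: staff keep daily tallies; a team rep sums them into a
# team daily total; teams map many-to-many onto libraries (some teams
# cover more than one library, and some belong to no library at all).

@dataclass
class Person:
    name: str
    daily_tallies: dict = field(default_factory=dict)  # date -> enquiry count

@dataclass
class Team:
    name: str
    members: list = field(default_factory=list)
    libraries: list = field(default_factory=list)  # may be empty, or more than one

    def daily_total(self, date):
        """The figure the team rep enters into the online form."""
        return sum(p.daily_tallies.get(date, 0) for p in self.members)

def library_total(library, teams, date):
    """One illustrative rule for combining team figures into a per-library
    output row: a team covering several libraries counts fully in each."""
    return sum(t.daily_total(date) for t in teams if library in t.libraries)
```

Making the combining rule explicit in code like this is precisely the tacit knowledge the developers' questions had to tease out.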
If I had known about data models at the start of the process of working with Counting Opinions, my ability to communicate what the system needed to do would have been greatly improved.
Value: Data models enable humans to convey their tacit knowledge to computers—essential as automation of reporting and visualisations become more prevalent. Data models can also be valuable in supporting shared understanding across assessment teams. Not just a human-to-computer tool, but a human-to-human one too.
Using Student Survey Data to Build Campus Collaborations
Elizabeth Edwards and Rebecca Starkey (University of Chicago)
Unlike many undergraduate programs, the “Core” curriculum for first and second-year students at the University of Chicago focuses on analyses of assigned texts while discouraging outside research. As a result, many undergraduates may not have the opportunity to conduct research until much later in their college years, often not until beginning a senior thesis. While the library was aware of challenges presented by this curricular model, responses to its 2017 Survey of Undergraduates, conducted in partnership with Ithaka S+R, offered new windows into areas of student need, and opened doors to new opportunities for collaboration with campus units to address these needs.
Findings from this survey, administered to all enrolled undergraduates, made it clear that these students enter the university expecting to have opportunities to conduct original research and anticipate that the research skills honed while in college will be useful in their future careers. However, by their senior years, only 39% of respondents reported being regularly assigned research papers or projects, and less than a quarter of respondents reported having collaborated with faculty or graduate advisors on original research. Additionally, respondents expressed uncertainty as to which campus unit, if any, held responsibility for furthering their research skills, echoing sentiments in previous survey responses from their graduate student instructors.
However, responses to the survey also demonstrated the positive impact that participation in library instruction has on student research skills. While only 38% of respondents reported having attended a library program or orientation, those who had the opportunity to participate in library instruction—in any form—consistently rated their research skills higher than those who did not.
With data that demonstrated existing student needs as well as the impact of library interventions, the library was able to approach campus partners on more even footing to explore opportunities for collaboration. Outreach to College support services informed by these data resulted in identifying a shared mission and led to new partnerships and collaborations that seek to prepare and engage undergraduate students in research.
Attendees at this presentation will learn:
- How to organize and structure survey findings in ways that resonate with different audiences.
- How to utilize findings from user research to strengthen collaboration with campus stakeholders.
- How data from the Ithaka S+R local surveys can be used in a variety of institutional contexts.
Knowing our Users: Deriving Value from the Ithaka S+R Local Surveys at the University of Missouri
Jeannette Pierce and Gwen Gray (University of Missouri)
Purpose: In 2017, the University of Missouri Libraries worked with Ithaka S+R to distribute the Ithaka S+R Local Graduate Student Survey and the Ithaka S+R Local Faculty Survey with the goal of gathering information from these user groups that would help us evaluate our services and inform strategic planning for the future. A secondary goal was to use evidence derived from the results to engage campus partners in helping us to define and evolve our services, especially those related to emerging aspects of scholarly communication, research data management, and space planning. In addition to the aggregate results, we are examining the data by broad disciplines in order to better understand the needs and perspectives of these distinct user populations.
Methods: The Ithaka S+R Graduate/Professional Student Survey and the Ithaka S+R Faculty Survey were administered to all faculty and graduate students at the University of Missouri in October 2017. Our assessment team worked with the Office of Institutional Research, the libraries’ management team, the Office of Graduate Studies, the Division of IT, and the libraries’ marketing team to finalize a myriad of decisions: which Ithaka survey modules to implement, how to customize questions, and how to determine and manage distribution lists of faculty and student recipients.
Findings: Findings from the survey shed light on the practices and preferences of important campus communities and the usefulness of relevant library services. While graduate students at the university generally rated the library’s roles of providing access to resources, facilitating learning and study, and assisting in finding resources as highly useful, they were less enthusiastic about the provision of data management services. Faculty members were much more interested than graduate students in data management services offered by the libraries, and also indicated strong support for the traditional strengths of shared resources, support for undergraduate teaching, and preservation. A majority of both graduate students and faculty members expressed support for open access publishing.
Implications: This paper will share the University of Missouri Libraries’ experience implementing and analyzing the surveys, as well as key findings that will help us with strategic planning and our priorities for services. The paper will also reflect on our efforts and experiences sharing results with key strategic partners on campus. We intend this paper to be presented as part of a panel that will provide attendees with an opportunity to learn about the Ithaka S+R local surveys in a variety of institutional contexts.
The Group Within: Analyzing a Large-Scale User Survey to Focus on a User Subpopulation of Interest
Lisa Hinchliffe (University of Illinois at Urbana-Champaign)
Purpose: International student enrollment in the United States has grown considerably in the last decade, and international graduate students now represent a significant portion of graduate students in the United States. Colleges and universities increasingly rely on international students for tuition and fee revenue, as international students often pay significantly higher tuition and fees than domestic students. As such, for those universities with significant numbers of international students, that subpopulation represents a strategically important group for the institution, and thus for the library. Analyzing a large-scale user survey enables a focus on a subpopulation of interest and puts the analysis of that group in a comparative context that provides additional insights into its information practices and library needs.
Methods: The Ithaka S+R Graduate/Professional Student Survey (with the International Student Module) was administered on campus. In addition to overall analysis of the results, comparative analysis for international and domestic students was conducted.
Findings: Some of the findings from the survey confirm results previously reported in the literature, while others raise questions about commonly held beliefs about language difficulties and prior library experience shaping library use and research practices. In addition, this is the first known library user study to investigate how completing an undergraduate degree in the United States versus another country affects students’ information behavior and perceptions.
Implications: This paper will share findings from the survey analysis and posit how those findings can inform the library’s service strategy for a strategically important population. Session attendees will have the opportunity to reflect on their own institutions’ strategically important populations, their libraries’ efforts to meet those populations’ needs, and the potential for subpopulation survey analysis to assist in doing so.
Assessing the User Needs of STEM Graduate Students: A Comparative Analysis
Juliet Rumble and Adelia Grabowsky (Auburn University Libraries)
Purpose: Graduate students are significant contributors to research activity on university campuses, and their professional education is typically central to the mission of their home institutions. Supporting the user needs of this population is thus of prime importance for academic libraries. However, graduate students are far from a monolithic group. As library services for graduate students have expanded from providing access to collections to offering support throughout the entire research cycle, understanding disciplinary differences in researcher practices and expectations has become essential to effective library liaison work. This paper aims to contribute to this understanding by reporting on findings from a large-scale survey of graduate students’ attitudes and scholarly practices conducted at a public land grant university. The paper presents a comparative analysis of the responses of STEM and non-STEM graduate students and reflects on the implications the findings have for designing library services for STEM students.
Methodology: During Spring 2018, the Ithaka S+R Survey of Graduate & Professional Students was administered on our campus. The survey closed in early May, with approximately 20% of graduate and professional students completing the survey. In addition to a summary analysis of responses, comparative analyses of the STEM and non-STEM subgroups are being conducted, to be completed in June.
Findings: While findings are not yet available, the comparative analysis will focus on responses to questions related to (a) patterns of information discovery and usage, (b) research skills that respondents believe contribute to academic and professional success, and (c) respondents’ perceptions regarding the library’s role in supporting different parts of the research cycle (e.g., locating academic information, organizing and managing information/data, disseminating research findings, etc.).
Practical Implications: Findings from the analysis of these graduate student subgroups will inform the development and implementation of library services geared towards promoting STEM students’ academic success and their development as effective, independent researchers. Attendees of this multi-paper session will have the opportunity to reflect on the suitability of the Ithaka S+R survey tool for analysis of subgroups of interest on their own campuses as well as to learn of additional applications of the survey in a variety of institutional contexts.
Step Aside, Tableau: The Pros and Cons of Analyzing and Reporting Ithaka S+R Survey Results Using Google Data Studio
Emily Guhde (Georgetown University Library)
Academic libraries invest significant financial and personnel resources in fielding surveys to gather research data about their users. But once a survey closes and the data files are downloaded, librarians face the challenge of finding the right tool to analyze the results and share the findings with their audience. Tableau is often recommended as a powerful tool to explore survey data and highlight findings, but it comes with a trade-off between price and privacy. Google Data Studio (Beta), a new competitor to Tableau, avoids the price/privacy trade-off by integrating its tool into the Google Apps Suite, though there are other caveats to consider.
This paper details the benefits and disappointments of using Google Data Studio to analyze and report research data from the Ithaka S+R Undergraduate Student, Graduate and Professional Student, and Faculty Surveys at Georgetown University, which were fielded in fall 2017. The paper is structured around key decision points that will help librarians begin to consider and compare analysis and reporting options before they have the results in hand, including usability, cost, learning curve, security/privacy, collaboration, speed, updates, data formatting requirements, data visualization options, dynamic features (e.g., filters and scorecards), and exit strategy.
While other users of Google Data Studio are likely to discover additional benefits and frustrations with this tool, the experience at Georgetown University Library has been by and large very positive. Some of the tool’s strongest features include no additional cost if on a Google campus, sharing access (view or edit) securely via the Google apps interface, and the ability to present engaging and clear visualizations that include flexible data views, customizable to a specific audience. The principal disadvantages include speed, frequent updates, and only moderate sophistication in terms of the data visualization options.
The unique contribution of this paper, especially when presented in the context of other Ithaka S+R studies as part of a panel, is its focus on data presentation considerations. Libraries spend a great deal of time and energy gathering these data, but a well-crafted strategy for analysis and reporting adds significant value to the survey results. By introducing a new data analysis and visualization tool, this research will provide additional options for libraries as they prepare to communicate their survey research to their stakeholders.
Disseminating Findings on User Needs Across a Multi-Campus Library System: An Approach to Packaging and Communicating Results from the Ithaka S+R Survey of Undergraduates at Penn State University
Steve Borrelli and Lana Munip (Pennsylvania State University)
How do you share key findings and survey results across a multi-campus library system and integrate findings into practice? That was the question the newly formed Assessment Department at Penn State University needed to answer after receiving the results of the Ithaka S+R Undergraduate Survey in its third week in existence. To inform the wider organization and promote utilization of results, the department developed a multipronged approach, which included packaging survey data in accessible and usable formats, engaging key stakeholders, communicating broadly across the organization, and using results to prompt further investigations. This approach facilitated the use of findings in communicating outwardly in annual library reporting and presentations to university committees and administrators, provided location-based evidence supporting campus head librarians in conversations with local administrators, informed renovation plans, and acted as a catalyst for multiple investigations into subpopulations in support of the libraries’ focus on promoting diversity and inclusion. This approach demonstrates how providing multiple exposures to key stakeholders and packaging results for end users influences the ultimate utilization of results.
This session will be valuable for attendees interested in enhancing their own strategies for disseminating research on their users as well as those who are interested in learning about the Ithaka S+R local surveys from a variety of institutional contexts.
Library Impact with International Rankings—One Library’s Continuous Journey to Figure it Out
Liz Bernal (Case Western Reserve University)
Over the past year, a major initiative at Case Western Reserve University (CWRU) has sought to determine why the university’s international ranking keeps declining year after year. In the summer of 2017, the university president convened a task force to determine why CWRU’s rank kept dropping and to recommend how to improve the rankings moving forward. Using the big three rankers (Times Higher Education, QS, and Shanghai), the task force began examining the components that make up the score. The library was brought in after the task force learned that all three use citation data and academic reputation as components of the scoring that builds the overall ranking, and realized that bibliometrics are a significant component with the potential to be corrected. The assessment librarian was tapped to complete an initial analysis of the bibliometric data to see whether there were issues that could be affecting the ranking. Thus began the process of data mining inside the two tools that the big three rankers rely on for their citation data: Web of Science, from Clarivate Analytics, and Scopus, from Elsevier. After a deep dive to identify errors, the assessment librarian made the changes necessary to clean up the records. The actions the library took to clean the citation data have put it at the forefront of the initiative, and many other departments are starting to take their lead from the library. Through data mining in the abstract and citation databases to correct mistakes, clear communication with the vendors and international ranking organizations, and sheer tenacity, the library has put in place procedures and processes to strive for accurate bibliometric data for the university and to work with faculty as a whole to improve its reputation.
Communicating Library Impact through the Assessment Website
Kristin Hall (University at Buffalo/Stony Brook University) and Janet H. Clarke (Stony Brook University)
Library assessment activities are designed to help improve programs and services and to demonstrate library impact on student academic success and research productivity. One effective way to share assessment processes and outcomes is through scholarly publications in library and information science. However, how do academic libraries effectively communicate their impact on the research and learning enterprise to other stakeholders through their websites? Who are their target audiences? What assessment information should they share to optimize their online platform?
Stony Brook University is designing a minisite as part of the overall Libraries web presence to share our assessment activities, findings, and statistics with our university community. Our goal is to demonstrate library impact on academic success in a visually compelling way. As part of this project, a thorough review of the AAU (Association of American Universities) member institutions’ library webpages was conducted to locate and study their assessment information for comparison. This examination revealed a lack of a library assessment presence on a number of academic library websites. Where assessment information did exist, some of it was difficult to follow because of library-centric presentation and jargon, as well as issues of volume, scope, and types of materials. This presentation will discuss these findings and deliberations, and how they informed our process of creating a library assessment minisite that helps us “demonstrate the library’s value,” as cogently articulated in ACRL’s Academic Library Impact: Improving Practice and Essential Areas to Research (2017) and ACRL’s The Value of Academic Libraries: A Comprehensive Research Review and Report (2010). Recommendations for best practices in design, visual impact, and communication of assessment goals and processes on a website will be explored.
Connaway, L. S., Harvey, W., Kitzie, V., & Mikitish, S. (2017). Academic library impact: Improving practice and essential areas to research. Association of College & Research Libraries, a division of the American Library Association.
Oakleaf, M. (2010). The value of academic libraries: A comprehensive research review and report. Association of College & Research Libraries, a division of the American Library Association.
Quantifying the Value of the Academic Library
Rebecca Croxton and Anne Moore (University of North Carolina at Charlotte)
Student engagement and success are critical, with more than 40% of individuals seeking a four-year degree dropping out within six years (NCES, 2017). Tinto’s social integration theory posits that students need integration into formal and informal academic and social systems of the university to be successful (Tinto, 1993). Engagement strengthens students’ academic intentions, goals, and institutional commitment, thereby increasing the likelihood of graduation. While universities are implementing High Impact Practices (Kuh, O’Donnell, & Reed, 2013) to engage and retain students, myriad other factors may be at play. Through the lens of social integration theory, formal integration may also include (1) library engagement, (2) use of student support services, and (3) participation in co- and extracurricular activities.
To determine which engagement factors contribute to student success at a large, public research university in the southeast, the university library, along with representatives from Academic Affairs, Student Affairs, and other academic and support units across campus, has agreed to collaborate in the alignment and analysis of student data. This will not only allow the library to quantify its impact on student success, but will also help university leaders identify other critical areas of student engagement.
Methodology: A two-phase mixed model has been designed to include three data collection strategies. In Phase I, interviews with university stakeholders will occur, which will inform Phase II activities. In Phase II, researchers will access and align datasets and conduct statistical analyses (e.g., ANOVA, Regression) to identify significant factors relating to student engagement and success. Concurrently, all research activities will be documented and comments analyzed in order to create a transferable model for institutional data alignment and analysis.
- May 2018: Conduct, transcribe, and analyze stakeholder interviews
- June–July 2018: Align library data across multiple systems; access, align, and de-identify student data from other campus units
- August–September 2018: Statistically analyze student engagement and success data (e.g., ANOVA, Regression)
- October 2018: Summarize findings and prepare for dissemination
Findings: Findings will be ready for dissemination by October 2018. As outlined above, qualitative data collection will begin in May 2018. During the summer, student engagement and success data from multiple university units will be aligned and analyzed. In early fall, findings will be summarized and available for dissemination.
Practical Implications and Value: Study findings will quantify the library’s role in student success, thereby demonstrating value when competing for university resources. Alignment of institutional metrics will also help university leaders identify key factors that contribute to student success. A transferable model will be developed for institutional data alignment and assessment which includes the library.
Kuh, G., O’Donnell, K., & Reed, S. (2013). Ensuring quality and taking high-impact practices to scale. Washington, DC: Association of American Colleges and Universities.
National Center for Education Statistics, Institute of Education Sciences. (2017). Fast facts: Graduation rates. Retrieved from https://nces.ed.gov/fastfacts/display.asp?id=40
Tinto, V. (1993). Leaving college: Rethinking the causes and cures of student attrition (2nd ed.). Chicago, IL: The University of Chicago Press.
Textbook Affordability Options: Assessing eBook Purchase Models for Value and Impact
Athena Hoeppner and Sara Duff (University of Central Florida)
Textbook affordability (TA) is gaining momentum in academia. Libraries are well positioned to implement TA projects by leveraging their ebook collections and purchasing power. However, the ebooks have been acquired under a variety of purchase models, with different DRM restrictions, simultaneous-usage levels, access options, and platforms. Currently, libraries are making best guesses about which ebook options provide the best ROI and UX for assigned textbooks. We lack established best practices, based on a body of quantitative and qualitative data, to guide acquisitions decisions.
Two years ago, the University of Central Florida Libraries Acquisitions began a series of textbook affordability projects. After demonstrating a high ROI for students based on assigned texts already in the collection, we proactively purchased a subset of assigned textbooks when a DRM-friendly version was available. We now have the complete list of assigned textbooks and are in the process of identifying which titles the library owns, which can be purchased, and under which model. During the fall semester, the library will systematically gather data on the number of matched textbooks, the spending to acquire the online versions, the chosen purchase model and platform, the courses and disciplines, the number of students, and usage of the identified texts.
These three projects, combined with usage information, provide a base of data we can use to look for trends in usage by platform, DRM model, outreach effort, and program, and to compare use of the textbooks with that of “regular” ebooks. The findings have implications for collection and purchasing decisions, and will help libraries address questions such as:
- Are DRM free or unlimited user models worth the extra cost?
- Should we prefer publisher-hosted versus aggregator-hosted ebooks?
- Which subjects or programs yield the best ROI or usage?
- What is the tipping point at which nonlinear lending becomes a viable model for online courses?
- What factors should be used to prioritize purchases?
The Continuing Adventures of Library Learning Analytics: Exploring the Relationship between Library Skills Training and Student Success
Selena Killick, Richard Nurse, and Helen Clough (The Open University)
The Open University (OU) is the UK’s largest academic institution dedicated to distance learning, with over 173,000 students. Learning analytics is a key organisational strategic driver at the OU, where we have been an international leader in this research field (Rienties & Toetenel, 2016). Library Services within the university provide students and staff with access to an extensive online collection of library resources, digital and information literacy skills training, and 24/7 support. This paper is a continuation of our work in the field of library learning analytics. Previously published research has identified a positive relationship between library collection access and student success (Nurse, Baker and Gambles, 2018), but what about our skills training provision?
Library Services have been delivering a suite of online synchronous information literacy training sessions embedded into the curriculum in partnership with faculty colleagues since 2014. Attendance is optional for the students and approximately 20% of qualifications have added the library sessions to their tuition strategies. Following a platform provider change in 2017 data on student attendance at online tutorials, and any subsequent views of sessions after the event, have been collected as part of the institutional learning analytics strategy. This research will investigate the relationship between students who participate in the library-provided training sessions during the academic year 2017–18 and their attainment at the end of the module of study. Attainment data, defined as fail, pass or pass with distinction, will be available in October 2018. Attainment scores of students who chose to attend live, and those who watched the session at a later date, will be compared with students from the same module who did not participate. The research will be conducted in accordance with the institutional Ethical use of Student Data for Learning Analytics Policy (The Open University, 2014). Findings of the research will be completed by November 2018.
An early pilot study with a single module has found that students who participated in a training session had on average a 6.5% higher score in their next assignment than the students who chose not to attend. The initial pilot also suggests that there may be a difference between students attending live and those viewing the recording. It should be noted however that a number of factors will impact on student success alongside the library training session.
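The pilot comparison amounts to a difference in mean assignment scores between attendees and non-attendees. A minimal sketch, using invented numbers chosen only to mirror the reported 6.5-point gap (not the study's actual data):

```python
from statistics import mean

# Hypothetical assignment scores; not the Open University's actual data.
attended = [70, 73, 76, 81]        # students who joined a training session
did_not_attend = [66, 70, 68, 70]  # students from the same module who did not

# Percentage-point difference in mean assignment scores between the groups.
difference = mean(attended) - mean(did_not_attend)
```

As the abstract notes, a raw difference like this does not control for the many other factors affecting student success, which is why the full study compares groups within the same module.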
The driver for this research is to identify whether the online library training sessions are having an impact on student success, in line with key institutional strategic drivers. If they are having a positive effect, the information will be used to advocate for the service with key stakeholders, with the aim of increasing resources for the service; with faculty, to ensure students from all disciplines are able to benefit; and with students, to encourage participation. If they are not having a positive impact on student success, future research will investigate the reasons why, with the ultimate aim of improving student success.
Nurse, R., Baker, K. and Gambles, A. (2018). ‘Library resources, student success and the distance-learning university’, Information and Learning Science, 119(1/2), pp. 77–86. doi: 10.1108/ILS-03-2017-0022.
Rienties, B. and Toetenel, L. (2016). ‘The impact of learning design on student behaviour, satisfaction and performance: A cross-institutional comparison across 151 modules’, Computers in Human Behavior, 60, pp. 333–341.
The Open University (2014). Policy on Ethical use of Student Data for Learning Analytics. Milton Keynes. Available at: http://www.open.ac.uk/students/charter/sites/www.open.ac.uk.students.charter/files/files/ethical-use-of-student-data-policy.pdf.
Impact and Ethics: A Meta-Analysis of Library Impact Studies
M. Brooke Robertshaw (Oregon State University) and Andrew Asher (Indiana University)
Following trends in higher education that emphasize quantitative analytical approaches to learn about educational outcomes, academic libraries are increasingly turning to “big data” methods in an attempt to demonstrate their impacts on student learning and demonstrate their value to the university’s educational mission. By applying learning analytics techniques to library use and instructional data, libraries have especially focused on attempting to measure the impact of the library on student retention and achievement measures (Crawford, 2015; Cox & Jantti, 2012; Soria, Fransen & Nackerud, 2013).
This presentation has two purposes. The first is to discuss the methodological issues detected while conducting a meta-analysis of library impact studies. The second is to present findings from that meta-analysis of studies published between January 1, 2008, and April 30, 2018.
There are currently debates within the learning analytics field about informed consent and transparency when undertaking these methodological approaches (Prinsloo & Slade, 2013; Richards & King, 2013). These debates exist because learning analytics studies require large datasets of personally identifiable information (PII) that can be inherently risky due to the potential for breaching the confidentiality and privacy rights of individuals and groups within the dataset. The methodological issues detected while conducting this meta-analysis fall within this debate because these studies subject individuals in library datasets to the same risks outlined above.
This presentation will also explore the statistical findings from this meta-analysis of learning analytics studies in libraries that examine the effects of library use on students’ retention and GPA. Based on these results, we will delve back into the debate about privacy, transparency, learning analytics, and libraries. We will examine the issue through two lenses, one pragmatic and one ethical. While we recognize the institutional importance of these types of studies, as well as the environmental conditions that create pressure on libraries to participate in learning analytics initiatives, we will also question the beneficence of these studies from a research ethics point of view.
Session 4: Space I
Assessing a Graduate Commons in the Library: Graduate Students Need an Identified Third Space
Susan Beatty (University of Calgary)
Graduate students need space to do their work. Their needs tend to be linked to the type of research they are doing and the nature of their work. This assessment of library space provides an opportunity to understand not only graduate space preferences but why that space should be in a library. To improve services and spaces, academic libraries are looking to develop specific spaces for different user groups: undergraduates, graduates, and faculty. In response to a perceived need, an academic library undertook to redesign a reading room in a primarily undergraduate library to create a secure space for graduate students. After the end of the first year of operation, students were surveyed on the effectiveness of the new Graduate Commons and its utility for graduate students.
Design: Students who registered to use the commons were asked to complete an online survey identifying which elements of the space they preferred; the nature of the work they accomplished in the space; and the length of stay, frequency, and preferred times and days of the week for their visits. They were also asked to rate the features of the space and to indicate their preferred seating and their activities. Additionally, students commented on their goals in using the space and how the space compared to other graduate spaces on campus.
Findings: Students value a guaranteed, secure space that is their own. They need to work in a space that enables them to achieve their goals. Students prefer spaces that provide natural light, individualized and flexible seating, quiet areas for concentration, good wifi access, secure surroundings, and a place to use and store their belongings. Using their time well means that graduate students read, write, study, conduct research, mark papers, and create class assignments in order to accomplish their learning, research, and productivity goals. In general, the variety of their activities and the pace of work needed to accomplish their goals are supported and enhanced by guaranteed access to their own space in the library.
Implications: By going beyond the simple feedback survey focussed on likeability, this survey yields results that illuminate desirable features of a graduate space that supports and enhances student learning. Most students preferred the unique features of the commons over the spaces offered to them across campus. They could get things done faster and with fewer disruptions in the library space. Additionally, the results further our understanding of the behaviours of students in the library and the value of identifiable spaces for graduate students. By applying this better understanding of student learning behaviours, and of the value unique library space holds for students, to our planning, libraries can be seen as vital learning spaces for all users.
Seating Our Patrons: A Multi-Year Approach to Creating and Assessing User Space
Margaret Fain and Jennifer Hughes (Coastal Carolina University)
Like many academic libraries, Kimbel Library at Coastal Carolina University has a space problem. Use of the library building increases as our student population grows, while our physical facility has barely kept pace. Between 2009 and 2018, the student population grew by 27%, from 8,100 to 10,300, while the library’s square footage increased by 38%, from 44,400 to 61,200 square feet. In an effort to increase seating and workspaces for students, over the past nine years we have been conducting the Kimbel Library Space Study, which uses assessment data to effect change in floor plans and seating options. This paper will outline the process through which we developed a longitudinal study that utilized multiple assessment measures for continual improvement of library space. Session attendees will receive information that will allow them to transition elements of this study to their home libraries.
The space study included two administrations of the LibQual survey, one in 2009 and one in 2012, the same semester that the new 16,800-square-foot Learning Commons opened. In 2014, an internal space use study was developed using an ethnographic approach. 2016 saw the implementation of the final changes prompted by that study. In 2018, seat count data was evaluated to determine the impact of those changes. One initial target was met: reducing the student-to-seat ratio. In 2009, with a population of 8,100 students, the ratio was 18:1; by 2018 it had improved to 11:1.
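The seat-ratio arithmetic behind these figures can be sketched as follows. This is a minimal illustration using the abstract's rounded numbers; the actual seat counts are not reported, so the seat totals here are back-calculated estimates:

```python
# Illustrative seat-ratio arithmetic; seat totals are estimates derived
# from the reported ratios, not counts published in the study.
def seat_ratio(students, seats):
    """Students per available seat, rounded to the nearest whole number."""
    return round(students / seats)

# An 18:1 ratio with 8,100 students implies roughly 450 seats in 2009.
seats_2009 = 8100 / 18
# An 11:1 ratio with 10,300 students implies roughly 936 seats in 2018.
seats_2018 = 10300 / 11
```

The improvement from 18:1 to 11:1 despite a growing enrollment reflects the roughly doubled seat count after the Learning Commons addition and furniture changes.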
We used quantitative measures (LibQual, seat use surveys, seat counts, building use counts) and qualitative measures (LibQual comments, whiteboard surveys) to create composite pictures of how students were actually using available space and what needs were being met or unmet. While the initial cost of the LibQual surveys was high, the in-house assessments were low cost.
We found that students had a variety of needs and were actively creating the spaces they wanted to be in. Our task was to ensure that the physical space was conducive to the expected use. This involved removing tables from quiet areas and increasing them in noise zones, rearranging furniture to maximize access to electrical outlets, and changing food and drink policies. When designing the library addition, furniture selection was based on versatility and the ability to promote formal and informal collaborative spaces. Throughout the nine-year cycle, we assessed existing space, created and implemented changes based on the data, and assessed the changes, developing new approaches to space use with each assessment cycle. The final result is a library that accommodates students and makes the most effective use of the limited space available.
Headcounts on Steroids: A Lightweight Method for Evaluating Space and Furniture Use
Katherine Gerwig and Carolyn Bishoff (University of Minnesota)
Purpose: The study identified patterns of space and furniture use to inform planning and vision for the busiest library on the University of Minnesota-Twin Cities campus.
Methodology: Library staff manually gathered headcount and user behavior data in Walter Library during fall 2017 and spring 2018 semesters. Data was gathered three times a day, three days per week, during three weeks throughout the semester. The data included counts of people by furniture type and was augmented with time and location data. These data were combined with total seat counts by furniture type, room, and floor and compared across time and space. The instrument was updated and refined to improve data collection.
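The core of this method, combining periodic headcounts with total seat counts to derive occupancy rates by furniture type, can be sketched as follows. The field names and figures here are illustrative, not the study's actual instrument:

```python
# Hedged sketch of headcount-based utilization by furniture type.
# Furniture labels and counts are hypothetical.
from collections import defaultdict

def utilization(observations, capacity):
    """observations: list of (furniture_type, occupants) from repeated sweeps;
    capacity: dict mapping furniture_type to total seats of that type.
    Returns the average occupancy rate per furniture type."""
    totals = defaultdict(int)   # sum of occupants seen, per furniture type
    sweeps = defaultdict(int)   # number of observations, per furniture type
    for ftype, occupants in observations:
        totals[ftype] += occupants
        sweeps[ftype] += 1
    return {f: (totals[f] / sweeps[f]) / capacity[f] for f in totals}

obs = [("carrel", 10), ("carrel", 14), ("group table", 36)]
cap = {"carrel": 20, "group table": 40}
rates = utilization(obs, cap)  # carrel: avg 12 of 20; group table: 36 of 40
```

Comparing these rates across rooms and floors is what surfaces mismatches such as individual-study furniture sitting underused in group-study areas.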
Findings: Library users’ furniture preferences varied drastically from room to room. We found that spaces with furniture and atmosphere designed for collaborative work were very popular, as were spaces designed for quiet, individual study. Furniture supportive of individual study was underutilized in rooms and areas more conducive to group or parallel study, and vice versa.
Implications: We want flexible spaces and a nimble decision-making process, but face limitations due to the constraints of our historic building. The study has encouraged creative, user-centered thinking. The methodology is lightweight enough to repeat each semester, yet produces actionable information that has informed major decisions and a vision for our library space as a whole. The datasets we generate answer big-picture questions about library use and inform individual decisions about the placement and use of pieces of furniture. Most importantly, the study has challenged many of our assumptions about how people use the library’s spaces.
Developing the Metrics to Assess the Library’s Active Learning Spaces
Karen Hum, Ningning Kong, Yue Li, and Nanette Andersson (Purdue University)
Located at the heart of the Purdue campus, the Wilmeth Active Learning Center is a 164,000-square-foot building serving as a central place for active-learning classrooms and library space. The building was initiated by the university libraries and opened to the public in fall 2017. During the building design, library faculty and staff worked with learning-design experts to create a variety of active-learning classrooms, library spaces, and collaboration spaces, making it easy for students to move from one form of learning to another. The building is open 24/7.
Following the successful design and opening of the building, the libraries are interested in assessing space usage in different parts of the building, which will guide future decisions about library services, staffing, and adjustments. The research questions are: 1) How are the spaces used by students at different times? 2) What library services and technology are used by students? 3) What kinds of activities do students engage in within the building? With this in mind, we worked collaboratively to design a mobile-device-based, location-enabled system through which library staff members collect this information at designated time periods.
Guided by our research questions, we first designed the database schema and implemented it in an enterprise database. Since data collection is based on location information, we used Geographic Information Systems (GIS) to aid the data collection process. The database is composed of digital maps of the floor plans for the different building levels, as well as student activity information associated with the different designed spaces, such as the total number of students, technology used, and learning activities. We then developed a mobile-device-based interface connected to the database, so that library staff members can enter information as they walk around the building collecting data. In addition, we designed a secured login system so that we can track data collection sessions. Finally, we automated the reporting system so that data collection reports can be generated on a daily basis.
This space usage data collection system was tested during fall 2017. Detailed training materials were provided to library staff members to ensure consistency in data and technology use. Based on training feedback, the database and data collection interface were modified to avoid possible confusion among different user groups. The system was then put into production. Observation data were collected every evening by night building attendants and during the daytime in two designated time periods. The results indicate that this is an effective system for collecting library space usage information with minimal training requirements, enabling different staff members to collect consistent information. The initial analysis also helped us understand learning space usage statistics and the different learning activities happening in the interspersed spaces, which will serve as a guideline for our future service design.
Participatory Data-Gathering and Community Building
Emily Puckett Rodgers, Denise Leyton, and Kat King (University of Michigan)
The University of Michigan Library’s Library Environments department was established in 2016 to facilitate strategic directions and resource investment in our space design and planning efforts across our major library buildings housed on four campuses. Our system hosts over 4,200,000 users per year with 652,848 square feet of public, collections, and staff space, including two 24/7 facilities. The Library Environments department facilitates user and data-driven research to inform projects meant to create safe, welcoming, and accessible library spaces. We also lead strategic space planning efforts to transform our buildings so that they better represent and support our expertise, tools, collections and the needs of our academic community.
Because the scale and distribution of our services, buildings, and collections span multiple buildings across several campuses, our small team of three must rely on a community-centric effort to achieve our work. Our paper and presentation will describe our distributed, team- and consultation-based approach to large-scale strategic data collection and assessment. We will describe how we built our network of partners and provide examples of our programs and their impact to date.
Our department approaches our work with a philosophy of deep collaboration and consultation. Last year, we created a “data stewards team” of staff in different departments that leverages their interests and skills. This team reviews our data collection for accuracy and develops functional spreadsheets and monthly reports for staff across six departments to use in decision-making. Their input also informs our departmental processes from the ground up.
With our staff of over 80 division employees, we have developed a data collection program that includes training, participation in specific initiatives, and cross-departmental partnerships. Staff are trained to use data-gathering tools such as Suma and Desk Tracker, and to pull and read specialized reports from our Integrated Library System. We use these to gather statistics related to patron use of open study space within our buildings, activity at service desks, and our building-to-building materials delivery. Alongside these efforts, we are developing processes and workflows to share the data we collect so that it can contribute to data-driven decisions throughout our organization.
We are in year two of this large-scale effort to collect, track, and report on patterns of activity across our buildings. Although we are still in an early stage of implementation, our data collection efforts have already contributed to resource investments and decisions. We have made furniture purchase decisions based on patterns of activity observed within our spaces, and have used the data to inform collection access and distribution as well as student staff training. We are also developing a data literacy campaign to train supervisory staff to understand and use our monthly reports in their scheduling efforts, and we have identified additional information to incorporate in our wayfinding signage across our buildings. These efforts have also generated partnerships and collaborations across our organization, especially with our Collections and Learning and Teaching divisions.
Session 5: Methods and Tools I
Using the LibQUAL+ Survey to Inform Strategic Planning
Patricia Andersen and Christine Baker (Colorado School of Mines)
Purpose: The Arthur Lakes Library at Colorado School of Mines participated in an extensive strategic planning process in the spring of 2017 resulting in the development of a Strategic Plan for 2017–2020. The impetus for this planning process was the addition of seven new faculty and staff members, including a new university librarian. Strategic planning involved input from all library employees as well as library stakeholders (students, faculty, and university staff members). In February 2018, the library’s Assessment Committee conducted the latest round of LibQUAL+ Survey data collection. The library has participated in the LibQUAL+ Survey every four years since 2003. Our main focus with previous surveys had been to make improvements in the library based on survey comments. After reviewing the results and comments of the 2018 LibQUAL+ Survey, committee members Patricia Andersen and Christine Baker observed that information gleaned from the survey data and comments could be connected to goals and objectives in the library’s Strategic Plan. In-depth analysis of survey data and comments could be used to assess relevance and achievement of goals in the current plan and as a tool for developing future strategic plans.
Approach: The first step involved analyzing the LibQUAL+ data and utilizing a coding system for the comments. The authors chose to adapt Brown University Library’s Methodology for Coding Qualitative Data (User Comments). Once the comments were categorized and the LibQUAL+ Survey data analyzed, connections between the results and the Strategic Plan were identified. The next step entailed an in-depth examination of the library’s Strategic Plan highlighting components that related to data and comments from the Survey.
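As a rough illustration of this approach, coded comments can be tallied by category to surface the most common themes. The comments and category labels below are hypothetical, not drawn from Brown's codebook or the Mines data:

```python
# Illustrative sketch of tallying coded survey comments by category.
# Comments, codes, and counts are invented for demonstration.
from collections import Counter

coded_comments = [
    ("more group study rooms please", ["space"]),
    ("too noisy on the second floor", ["ambiance"]),
    ("love the natural light in the reading room", ["ambiance", "space"]),
]

# Count how often each category was applied across all comments.
tally = Counter(code for _, codes in coded_comments for code in codes)
```

Category totals like these make it straightforward to map clusters of comments onto specific goals and objectives in a strategic plan.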
Findings: We found that many of the comments (both positive and negative) linked directly to goals, objectives, and actions in the Strategic Plan. Most of the comments involved physical space and use of space (e.g. more study space/rooms or needs updating, etc.) and ambiance (e.g. too noisy, good natural light, etc.). The responses to the core questions for Library as Place are aligned with the comments. The library is currently advocating for a renovation and the results from this survey demonstrate that stakeholders agree with the need to improve library spaces and ambiance. Several stakeholders mentioned the need for renovating the existing space or building a new library, adding evidence to the library’s advocacy efforts. Addressing other aspects of the Strategic Plan, the library recently acquired new resources and implemented new services prior to the 2018 LibQUAL+ Survey. Survey data and comments indicated that these resources and services were both welcomed and well-publicized.
Practical Implications or Value: Strategic Planning and LibQUAL+ Survey results and comments can be used together to assess resources, services, and space in the library. 2018 LibQUAL+ Survey results and comments support and validate the direction of our current Strategic Plan and can be used as an assessment method as we move forward with our plan and develop future strategic plans.
Impacting Student Success: A Practical Guide to Assessing Library Services at the Community College Level
Faith Bradham (Bakersfield College)
Library assessment is a well-established necessity for proving a library’s worth to administration. However, it is difficult to find reports of assessments for libraries at two-year institutions. In addition, few library assessments focus on library services and instead concentrate on library collections. Library services can include reference, information literacy instruction, and library outreach, all of which directly tie to students’ academic success. This paper seeks to fill the gap in the literature on community college assessment of library services by presenting the results of a mixed methods assessment of library services at Bakersfield College during the Fall 2017 semester.
Because the Bakersfield College library had never completed a formal assessment of its services before Fall 2017, the assessment design was built from the ground up. The questions this assessment answered were:
- Does the current level of library services adequately fulfill the needs of the student population at Bakersfield College?
- How should the library change or enhance its services to better meet the needs of its community?
This mixed-methods assessment was accomplished by gathering quantitative usage statistics that included daily tallying of student presence in the library, the breadth of the library’s information literacy instruction, and the amount of reference help given. The qualitative aspect of the assessment was accomplished through targeted surveys for students and faculty at the college.
The results of this study were quite positive: the library is heavily used by the campus community and is perceived very positively by both students and faculty at Bakersfield College. Two gaps in services were identified: a lack of consistently quiet study space for students, and a lack of awareness among faculty of the services available to them at the library. As a result, the library has implemented steps to address these gaps. The assessment was structured to be easily replicable, and the library plans to repeat it annually, with the goal of having an ever-greater impact on students' academic success. The model is replicable for any community college library that wishes to begin its own formal assessment practices, as well as for any academic library interested in beginning to assess its services.
Disparate Data Sources: Assessing a Library Program Impact on Student Achievement
Karin Chang and Jennifer Boden (Kansas City Area Education Research Consortium, University of Kansas)
Background: The Mid-Continent Public Library (MCPL) has been providing summer reading programs in the Kansas City metro area since the 1970s. The programs serve approximately 7,500 school-age students each summer, encouraging them to engage in various summer learning activities. In 2015, MCPL partnered with the Kansas City Area Education Research Consortium (KC-AERC) to assess the effect of its summer reading program on student achievement.
Methods: To assess program impact, data sharing agreements were established with eight public school districts. Student demographic characteristics and academic performance data (math and reading benchmark scores) were collected from schools for years before and after the summer program. One major challenge to the study was the use of various benchmark assessment tools within and across districts. To overcome this challenge, performance data were standardized by assessment tool. Researchers then created a matched sample to compare intervention and nonintervention students.
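The two analysis steps described above, standardizing benchmark scores within each assessment tool and then drawing a matched comparison sample, can be sketched roughly as follows. The column names, tools, and exact matching criteria are illustrative assumptions, not the study's actual variables.

```python
# Sketch: standardize benchmark scores within each assessment tool, then
# match each program participant to a demographically similar nonparticipant.
from statistics import mean, stdev

students = [
    {"id": 1, "tool": "MAP",  "score": 210, "participant": True,  "grade": 3, "low_income": True},
    {"id": 2, "tool": "MAP",  "score": 195, "participant": False, "grade": 3, "low_income": True},
    {"id": 3, "tool": "STAR", "score": 520, "participant": True,  "grade": 4, "low_income": False},
    {"id": 4, "tool": "STAR", "score": 480, "participant": False, "grade": 4, "low_income": False},
    {"id": 5, "tool": "STAR", "score": 505, "participant": False, "grade": 4, "low_income": False},
]

# Step 1: z-score each student's benchmark within its assessment tool, so
# scores from different tools are on a comparable scale.
by_tool = {}
for s in students:
    by_tool.setdefault(s["tool"], []).append(s["score"])
for s in students:
    scores = by_tool[s["tool"]]
    s["z"] = (s["score"] - mean(scores)) / stdev(scores)

# Step 2: for each participant, find an unused nonparticipant exactly
# matched on demographic characteristics (here: grade and income status).
matches = []
used = set()
for p in (s for s in students if s["participant"]):
    for c in students:
        if (not c["participant"] and c["id"] not in used
                and c["grade"] == p["grade"]
                and c["low_income"] == p["low_income"]):
            matches.append((p["id"], c["id"]))
            used.add(c["id"])
            break

print(matches)  # → [(1, 2), (3, 4)]  (participant id, matched comparison id)
```

In practice, a matched sample like this would be built with richer covariates (or propensity scores) and the standardized scores compared between the two groups before and after the program.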
Findings: After the summer program, participating students scored higher than students who did not participate. MCPL programming was especially effective for male students and low-income students. These findings have been replicated across several years.
This presentation will be of interest to researchers studying library program outcomes for K–6 youth. The paper provides detailed information on the methods used to collect and utilize data from disparate sources.
Qualifying for Services: Investigating the Unmet Needs of Qualitative Researchers on Campus
Alexa Pearce, Alix Keener, Russel Peterson, Caroline He, Karen Downing, and Elizabeth Yakel (University of Michigan)
This paper will present initial findings from a study in progress that investigates the potential for libraries to improve support for qualitative research. Though qualitative research is not new, its value in elevating stories that cannot emerge from quantitative methods alone has gained traction in recent years (Pelto, 2015). Accordingly, many disciplines have experienced increased interest in, and adoption of, emerging qualitative and mixed-methods approaches (Alasuutari, 2009). Libraries have not generally matched this trend with the provision of dedicated tools and expertise. While libraries have built, enhanced, and expanded service offerings to support scholarly needs across all stages of the research life cycle, with particular emphasis on research data management, these services have tended to privilege quantitative approaches (Swygart-Hobaugh, 2016). Similarly, researchers are likely to encounter greater support for quantitative methods than qualitative ones elsewhere on campus and in dialogue with external funders (Denzin and Lincoln, 2011; Lather and St. Pierre, 2013).
This paper will report on a needs assessment, conducted through a series of stakeholder interviews, in order to better understand and eliminate existing gaps in services. The authors have conducted interviews with 12 campus stakeholders, including faculty researchers, doctoral students, and librarians, selected from multiple programs at a research-intensive, four-year institution. The interviews were conducted during the first year of a multi-year, grant-funded project that seeks to develop models for library assessment through collaboration among library professionals and faculty and graduate students from the institution's school of information. Researchers from social sciences disciplines are most heavily represented in the initial round of interviews, though the needs of researchers in the health sciences are also reflected. All interviews have been transcribed and coded thematically, using a grounded theory approach.
Currently, findings are preliminary and will require additional analysis using NVivo software. Analysis to this point has indicated that researchers do not perceive the library as a source of support or partnership in the provision or development of tools and expertise for qualitative analysis, though there is generally strong awareness of library services that support quantitative analysis. Additionally, qualitative researchers report many barriers to carrying out their scholarship, related to availability and reliability of both campus support and external funding. Over time, the four-year institution considered in this study has provided some qualitative support, though inconsistently and rarely based in the library. Moreover, findings support and reinforce existing evidence that qualitative researchers tend to rely on self-formed communities of practice in order to develop methodological and tool-based competencies (Roger and Halas, 2012).
Libraries have experienced notable successes in their efforts to build, market, and assess research data services. In many cases, libraries have situated research data services as components within comprehensive portfolios of research life cycle service offerings. This paper’s findings suggest that libraries have an opportunity to make these services more inclusive by incorporating qualitative analysis and the management of qualitative data among the full range of scholarly activities that they recognize and support.
Community College Libraries & Academic Support for Student Success
Christine Wolff-Eisenberg (Ithaka S+R) and Dr. Braddlee (Northern Virginia Community College)
Community colleges are vital engines of our higher education system, enrolling 39% of all undergraduate students and placing special emphasis on serving underrepresented minorities, low-income students, first-generation college students, new Americans, and other diverse populations. Yet the vast majority of research on how to adapt library services to support new priorities has been conducted at four-year colleges and universities, and the definitions of student success used in these projects have often derived from higher education institutions, state boards of education, and the federal government, omitting students' own perspectives on what defines success. To continue evolving in support of their students, community college libraries need strategic intelligence about how to adapt their services. Ithaka S+R and Northern Virginia Community College, along with six other community college partners and with support from the Institute of Museum and Library Services, are taking an important step to strengthen the position of the community college library through the Community College Libraries & Academic Support for Student Success (CCLASSS) project.
Christine Wolff-Eisenberg from Ithaka S+R and Dr. Braddlee from Northern Virginia Community College will speak about the first phase of this project, which focused on defining student success in a way that is inclusive of how students express their own needs. The presenters will share findings from interviews conducted across the seven community colleges involved in the project on students' general experiences, education goals, coursework and academics, and definitions of success. They will also discuss the library service offerings that the seven community colleges have generated based on the interview findings, which will be tested via survey at the time of the Library Assessment Conference. When the project is completed in 2019, in addition to published reports of findings, the research dataset and survey instrument will be released under open licenses with the intention that other institutions may both benefit from and contribute to the initiative.
This session will be valuable for attendees interested in employing qualitative methods to understand the practices and needs of their users as well as those who are interested in learning more about the practices and needs of community college students broadly.
Session 6: Organizational Issues I
The Career Paths of Assessment Librarians: An Exploration of Professional Growth
Sarah Murphy (The Ohio State University)
Purpose: As the ability to gather, analyze, and use evidence to inform decision-making, and to articulate the impact of library programs and services, has emerged as a key leadership and management competency for library administrators, assessment librarians may be ideal candidates for higher-level administrative roles. Whether assessment librarians are able to mature and grow into such roles is of particular interest. This study explores the career paths of Library Assessment Conference attendees from 2008 through late 2017. It asks whether there is a typical career pattern for assessment librarians by examining the education and experiences of individuals solely tasked with library assessment, or with assessment included within their job titles. Specifically, the study explores three questions: 1) do assessment librarians have a common educational background beyond the MLS? 2) is assessment a role typically assumed by entry-level or mid-career librarians? and 3) do assessment librarians progress to higher-level administrative or leadership roles?
Design/Methodology/Approach: This study updated the methodology of a previous study of associate library directors by Moran, Leonard, and Zellers (2009), which relied on information gleaned from the American Library Directory, and of a study of academic law library directors by Slinger and Slinger (2015), which utilized CVs harvested online. This study focused on publicly available LinkedIn profiles, after finding that nearly 70% of LAC attendees since 2008 had complete profiles meeting the study criteria. Select data from these profiles were harvested over a two-week period in August 2017 and arranged for analysis following procedures outlined by Koch, Forgues, and Monties (2017).
Findings: By 2016, 63.1% of LAC attendees had job titles indicating they were solely responsible for assessment. The educational backgrounds of the 194 individuals either solely tasked with assessment or with assessment included within their job titles varied widely, from undergraduate and graduate degrees in English to graduate degrees in public and business administration in addition to the MLS. While the average number of years of work experience between earning a professional library degree and assuming an assessment librarian role was 7.4 years for those working at doctoral institutions and 12.7 years at master's-degree institutions, a visual distribution of years of experience before an individual became an assessment librarian revealed that most LAC attendees had recently joined the profession: more than half (107 of the 194) had fewer than five years of experience.
Practical Implications or Value: Results indicated that librarians tasked with assessment do not share a common educational background beyond the MLS. Because a growing number of librarians assume the assessment librarian role soon after graduating from library school, it may be too early to determine whether assessment librarians are moving into management roles. Since the majority of assessment librarians in this study were new to the field, more work is needed to determine both the skills and experiences these individuals bring to the job, and whether the skills and experiences they obtain on the job help them mature and grow into higher-level leadership and administrative roles.
Assessing the Success of a Mentoring Program for Academic Librarians
Karen Harker, Catherine Sassen, Seti Keshmiripour, Marcia McIntosh, and Erin O’Toole (University of North Texas)
Purpose: A continuous cycle of assessment contributes to the effectiveness, relevance, and sustainability of a mentoring program. Lunsford (2016) has identified several reasons for assessing a mentoring program. First, assessment can provide information about changes needed in the program. In addition, assessment results can identify how resources should be allocated to improve the program. Furthermore, assessment information highlighting the success of the program can be shared with stakeholders as well as current and potential participants, thus contributing to the continuation of the program. The assessment of many mentoring programs is limited to participant satisfaction. Although this measure could be used to address areas of dissatisfaction, it is limited in its scope. In many cases, such assessments do not determine the impact of the program on the participants’ abilities, skills, and future careers.
Approach: This case study concerns the assessment of a mentoring program implemented at a large academic library. The program includes mentor-protégé dyads, peer-mentoring groups, and mentor training. Assessment measures address the goals of the mentoring program, which include increasing participants' confidence, improving mentoring competencies, and expanding future participation in the program. Assessment results are drawn from three years of administering the Mentoring Competencies Assessment (Center for the Improvement of Mentored Experiences in Research, 2017a and 2017b) and the End of Program Evaluation (University of Illinois at Chicago Administrative Professional Mentoring Program, 2014). Results also include information gathered from focus groups conducted with program participants.
Findings: The results from these assessments indicate that the program is meeting its goals of improving mentor competencies and improving the confidence of participants. The success of the program has been instrumental in securing administrative support as well as attracting new participants.
Practical Implications: This case study is of interest to mentoring program organizers because it demonstrates why assessment is essential to the success and sustainability of a mentoring program. The study also provides practical information about how to apply multiple measurements to gather information that can be used to implement needed changes in a mentoring program.
References: Center for the Improvement of Mentored Experiences in Research. (2017a). Post MCA Survey. [Measurement instrument]. Retrieved from http://cimerproject.org/-/evaluation/mentor-training
Center for the Improvement of Mentored Experiences in Research. (2017b). Pre MCA Survey. [Measurement instrument]. Retrieved from http://cimerproject.org/-/evaluation/mentor-training
University of Illinois at Chicago. Administrative Professional Mentoring Program. (2014). End of Program Evaluation. [Measurement instrument].
Obligations and Intentions: An Exploratory Study of Indirect Cost Recovery Monies from Research Grants as a Revenue Stream for Funding Research Library Budgets
Devin Savage (Illinois Institute of Technology) and Chad Kahl (Illinois State University)
As revenue streams for academic library budgets come under increasing scrutiny, driven by growing instability in higher education funding and questions about the future sustainability of library expenditures, one revenue stream in particular has been neither broadly understood nor studied: the slice of grant money dedicated to academic libraries' support of research, known as Indirect Cost Recovery (ICR). This money is specifically written into research awards and processed by academic institutions in order to support overhead costs, including access to research literature and information. There may be perceived benefits or drawbacks to different models of handling ICR monies, but there has been a significant dearth of literature on this topic.
This study examines the level of awareness of ICR at academic libraries at Carnegie-designated highest and higher research activity institutions; whether they have specific policies about either ICR designation or expenditures; how ICR fits in with their other revenue streams; and what emergent organizational and institutional relationships, themes, or correlations can be identified. Findings from a national survey that targeted key stakeholders at academic libraries in these institutions will be discussed and analyzed, along with follow-up interviews conducted to further explore several key issues related to ICR. Notably, is there assessment of the application of ICR monies to support resource acquisition, support services, or other uses? How are the effects of the use of ICR monies understood, valued, and/or articulated by the library? The practical applications of this study should help serve and inform any academic library organization interested in understanding and leveraging possible revenue streams.
Developing Objective Criteria for Promotion
Heather Scalf (University of Texas Arlington)
While the Association of College and Research Libraries provides guidelines for the appointment, promotion, and tenure of librarians with faculty status, few academic libraries have created an objective set of categories and activities designed to reflect the broad range of professional work produced by librarians and archivists in today's academic libraries. This paper briefly discusses and describes the policy and rubric created by the University of Texas Arlington (UTA) Libraries Associates of the Faculty Promotion Policy Task Force. Since 1999, the UTA Libraries has had a promotion policy, called Career Status, that applies to librarians and archivists; it was intended to provide a mechanism for advancement outside of the management structure within the libraries. In 2012, a task force was charged by the dean of libraries to begin drafting a new policy and an associated evaluation rubric. The policy and rubric were designed to provide an objective, progressive tool that candidates could use to self-assess their progress over the course of their careers, and that the promotion committee could use to evaluate the evidence submitted in a candidate's dossier when an application for promotion was received. They address performance and evidence in three categories: librarianship, scholarly materials and activities, and service.
The rubric is designed not only to expand upon the evidence definitions created by the policy, but also to provide examples and criteria that indicate the level of success evidenced by the supporting documentation. It provides a scoring strategy for each of the three evaluated categories based upon the quantity, quality, or complexity of the activity, evaluating not only traditional types of scholarship and service but emerging ones as well. It is a tool that can assist librarians and archivists in their career development, whether or not they are pursuing the promotion track.
Thus far, the rubric has been applied to three different pools of promotion candidates, with minor changes recommended. It has been noted that candidates and committee members alike benefit from a thorough reading and understanding of the documents. While some ambiguity within both documents was intentional, in order to allow for activity that might not have been imagined by the committee during the creation of the documents, a lack of understanding of the basic categories has motivated the promotion committee to offer both training and one-on-one mentoring through the process to any candidate who requests assistance.
The completion and application of the rubric has practical implications in libraries where librarians and/or archivists do not have faculty status within their university system. This policy and rubric are not only valuable to UTA but could also be adapted for use in any other academic library, as well as in other types of libraries.
Meta-Assessment: The ARL Assessment Framework in Practice at Montana State University
Scott Young and David Swedman (Montana State University); Martha Kyrillidou (QualityMetrics, LLC)
How can we apply an assessment lens to our own assessment practices? To answer this question, we follow a mixed-methods approach to assess the assessment framework described in the recent ARL Assessment Program Visioning Task Force Recommendations (December 17, 2017). Our research synthesizes the evidence and insights gathered through three methods: a case study analysis, a comparative analysis, and a gap analysis.
First, we examine a case study of a UX and Assessment (UX&A) program recently developed at Montana State University (MSU). The vision of the UX&A program at MSU is to build and sustain a library that is useful, usable, and desirable for our diverse community of users. UX&A personnel work collaboratively with other library departments to continually measure, assess, and improve users’ experience of library services and instruction, both physical and online. This new UX&A program was developed in tandem with a new library strategic plan, which is based on the Balanced Scorecard framework.
With the new assessment program and strategic plan in place, we conducted a second phase of research: a comparative analysis of the MSU UX&A program vis-à-vis the assessment landscape described in the ARL Recommendations. In this analysis, we highlight which framework elements are currently in place, which elements are in development, and which still need to be developed at MSU.
Next, we conducted a gap analysis comparing the ARL recommendations with established and emerging user experience and assessment programs in place at other research libraries to determine if there are additional elements outside of these recommendations that may be useful for describing, assessing, and improving a library’s assessment framework.
Finally, we synthesized the insights gathered from our meta-assessment to create an enhanced version of the ARL framework as applied to the MSU library. In terms of practical impact, this enhanced meta-assessment framework can be applied to comprehensively evaluate and improve a library’s user experience and assessment ecosystem. Our research ultimately demonstrates and models an approach for meta-assessment that can help inform the development of more effective and sustainable library UX and assessment programs, for the ultimate benefit of our users.
Session 7: Digital Libraries
Testing Assumptions—Does Enhancing Subject Terms Increase Use of Digital Library Content?
Chelsea Dinsmore and Todd Digby (University of Florida)
In modern library systems, access to digital content is heavily dependent on effective metadata. The University of Florida (UF) Digital Collections (UFDC) are an actively growing, open access digital library comprising over 500,000 records. As with any large-scale digital library project, a well-known challenge is the varying quality and quantity of legacy metadata available for each title. Inconsistent metadata makes digitized materials harder to find, and if users cannot find the content they are looking for, a great deal of human effort has been wasted and the investment in digital collections is not realized. Subject terms are often one of the most efficient means of accessing desired materials, and subject terms created from controlled vocabularies deliver the most consistent results. To date, applying and editing subject metadata has been a record-by-record, labor-intensive process, making retrospective projects cost-prohibitive. The UF team is investigating the capacity of research library staff to implement a machine-assisted indexing system that automates the selection and application of subject terms, based on a rule set combined with controlled vocabularies, to the metadata of a body of already digitized content. To execute the project, the Smathers Libraries team at UF is collaborating with Access Innovations (AI) consultants to implement the system and mitigate the challenges discussed above.
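As one illustration of the general approach (not the actual Access Innovations system), a rule-based indexing pass can be sketched as matching controlled-vocabulary patterns against record text and proposing subject headings for a cataloger to review. The vocabulary, rules, and record below are invented examples.

```python
# Sketch: propose controlled-vocabulary subject headings for a record by
# matching each heading's rule patterns (synonyms and phrases) against the
# record text. A real system would use far richer rules and review steps.
controlled_vocabulary = {
    "Citrus fruits": ["citrus", "orange", "grapefruit"],
    "Agriculture -- Florida": ["florida agriculture", "florida farms"],
    "Irrigation": ["irrigation", "drip watering"],
}

def suggest_subjects(text, vocabulary):
    """Return the subject headings whose rule patterns occur in the text."""
    text = text.lower()
    return sorted(
        heading
        for heading, patterns in vocabulary.items()
        if any(p in text for p in patterns)
    )

record = ("Effects of drip watering on orange groves: a survey of "
          "Florida farms, 1952-1958.")
print(suggest_subjects(record, controlled_vocabulary))
# → ['Agriculture -- Florida', 'Citrus fruits', 'Irrigation']
```

Because the rules and vocabulary are applied programmatically, the same pass can be run over an entire collection at once, which is what makes retrospective subject enhancement affordable compared to record-by-record editing.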
Pilots are currently being run on two collections. The UF team is comparing access rates for the Electronic Theses and Dissertations (ETDs) collection before and after enhancing the subject metadata. The hypothesis is that, when added to the existing and openly available catalog records, the enhanced metadata should improve accessibility and, by extension, use of the affected content. This can be measured by looking at page hit rates and overall collection use rates before and after the enhancement. A second pilot assessment focuses on a long run of a journal with strong historical ties to agriculture in Florida. Random issues of the title were selected for machine-assisted indexing, and the use of those issues will be measured against the use of the other issues in the series.
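The simplest form of the before-and-after comparison described here is a summary of monthly page hit rates with the percentage change. The counts below are invented placeholders, not UFDC data.

```python
# Sketch: compare average monthly page hits for a collection before and
# after metadata enhancement, reporting the percentage change.
def mean(xs):
    return sum(xs) / len(xs)

hits_before = [1200, 1150, 1320, 1275]   # monthly page hits, pre-enhancement
hits_after  = [1510, 1480, 1625, 1590]   # monthly page hits, post-enhancement

before, after = mean(hits_before), mean(hits_after)
pct_change = 100 * (after - before) / before
print(f"before: {before:.1f}/mo, after: {after:.1f}/mo, change: {pct_change:+.1f}%")
```

A fuller analysis would control for seasonal patterns in use (semester cycles, for instance), which is one reason the second pilot's design, comparing enhanced issues against unenhanced issues of the same journal, is attractive.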
The paper will address the methods and outcomes of these two pilot projects, as well as next steps and more in-depth assessment methodologies. Through this assessment we aim to improve and streamline our workflows and to determine whether our enhancements have increased access to and discovery of these pilot digital collections.
Assessing Transformation: Findings from the Measuring Reuse Project
Santi Thompson (University of Houston), Caroline Muglia (University of Southern California), Genya O’Gara (Virtual Library of Virginia), Liz Woolcott (Utah State University), Ayla Stein (University of Illinois at Urbana-Champaign), and Elizabeth Kelly (Loyola University New Orleans)
Purpose: A key indicator of the impact of digital collections is content reuse, specifically how materials are utilized and repurposed. There have been ongoing efforts within the digital library community to demonstrate and measure reuse. These investigations have produced meaningful results, but efforts have been scattered among institutions and organization types. Additionally, in this relatively new arena, few recommended practices or gold standards have emerged. The lack of reuse metrics, combined with the lack of community assessment norms, has a two-fold effect: it makes it difficult for institutions to use appropriate data to develop strong infrastructures and collections that are responsive to user needs, and it suppresses the ability of a digital library to demonstrate its value to stakeholders, including administrators, funding bodies, and potential users.
An IMLS national leadership grant, Developing a Framework for Measuring Reuse of Digital Objects (LG-73-17-0002-17), attempts to address the challenges faced in assessing content reuse through a comprehensive needs assessment of the digital library community. The eventual goal of the grant is a multidimensional framework to support digital libraries in demonstrating their value and better advocating for the resources and platforms that will best serve their communities. Doing so requires a deeper understanding of how digital objects are used and repurposed, coupled with an understanding of what types of resources already exist, how reuse is or is not valued differently from access use, and what approaches and tools need to be developed.
Design, Methodology, or Approach: This section of the paper will mirror the methods used to compile data for the project. It will include details on: (a) an initial survey, which identified how cultural heritage organizations currently assess digital library reuse, barriers to assessing reuse, and community priorities for potential solutions; (b) content reuse needs, limitations, and success stories derived from focus group sessions held in person and virtually during the span of the grant project; (c) a follow-up survey, which focused on determining priorities for potential toolkit components and functionality among survey participants; and (d) reuse use cases and functional requirements for constructing a future toolkit for reuse assessment.
Findings: This section of the paper will share the quantitative and qualitative analysis and results of the community needs assessment conducted in 2017 and 2018 that aims to illuminate digital library reuse strengths, weaknesses, and community applications. Specifically, it will outline the results and implications of a community survey that rates and prioritizes use cases and functional requirements for a future assessment toolkit. It will also include data and results from an initial survey and focus groups conducted by the project team.
Practical Implications: The paper will engage in discussions on the long-term outputs of a successfully implemented toolkit, including the ability to identify sustainable and vetted assessment techniques that can be applied at a wide range of institutions and encourage the development of streamlined approaches and best practices for communicating the economic, scientific, educational, scholarly, cultural, and social impact of digital collections.
Building the Measuring Stick: A Model for Continuous Review and Improvement of Institutional Repository Policies
Christy Shorey (University of Florida)
The Institutional Repository at the University of Florida (IR@UF) was founded in 2006, and the policies have not been substantially reviewed or updated since that time. As the new Institutional Repository (IR) manager, I wanted to create a list of current IR best practices and policies from peer institutions. Once collected, this list would serve as a guide to identify needed updates that would allow the IR@UF to best address the needs of the UF community within the current scholarly publishing environment.
The first step was a literature review to identify the policies necessary for a thriving IR. I then compared current IR@UF policies, both public and internal, against this list to identify missing or weak policies. By evaluating the size of their IRs, the years they were founded, and the types of objects collected, we identified 25 peer institutions. I then conducted an environmental scan of these IRs, visiting their websites and searching for documentation of policies and practices.
With the help of our assessment librarian, Laura Spears, I created a Qualtrics survey, drawing from the environmental scan to craft focused questions about policies in four areas: Administration, Submissions, Collections, and Other (e.g., theses and dissertations, how related items are treated, etc.). I invited the 25 peer institutions to participate in the survey as a pilot; 15 replied. Using these results and feedback from peers, I updated the survey and sent it to a broader audience, yielding 95 domestic and international participants.
Some trends were easily identifiable, such as a majority of IRs being hosted on the DSpace or Digital Commons platforms, and spikes in the creation of IRs in 2006, 2011, and 2015. Also present were general trends in how metadata was collected to describe items within the IR, and in who set and maintained policies. Initial policies were developed mostly by advisory boards or a library department, while maintenance was handled primarily by a library department or an individual.
This qualitative analysis was undertaken to evaluate IR@UF policies and suggest policy revisions, but the results also speak to broader implications. The state of institutional repositories within scholarly communication is clearly under scrutiny, as evidenced by recent articles such as Clifford Lynch's 2017 "Updating the Agenda for Academic Libraries and Scholarly Communications" and "The Evolving Institutional Repository Landscape," a Choice White Paper by Judy Luther released earlier this year. Recent surveys of the IR landscape examine topics ranging from metadata collection to the creation and maintenance of IRs in Canada, as well as the application of Family Educational Rights and Privacy Act (FERPA) regulations to student works in IRs. Within this focus on IRs, understanding the policies of peer institutions is an important factor. My survey provides results from 95 participants and also serves as a case study that other institutions can review, based on their needs, in comparison to peer institutions. The results further suggest research opportunities into the relationship between IR platform and policies.
Bringing IRUS to the USA: International Collaborations to Standardize and Assess Repository Usage Statistics
Jo Lambert and Ross Macintyre (JISC); Santi Thompson (University of Houston)
Purpose: The value of Open Access (OA) in supporting effective research is widely recognised. Institutional repositories perform a key role, facilitating global knowledge sharing and enabling academic institutions to share research outputs with a wider audience. Within this context, measuring the reach of research is key. Tracking, monitoring and benchmarking usage of scholarly resources helps to demonstrate value and impact. It supports understanding of an institution’s research, identifies emerging trends against a local, a national and often an international context, and informs both policy and process for a wide range of stakeholders.
Part of Jisc’s Open Access offer, IRUS-UK (Institutional Repository Usage Statistics) enables Institutional Repositories to share and compare usage data based on the COUNTER standard. The service provides access to authoritative, standards-based statistics supporting universities to gain a better understanding of the breakdown and usage of their institution’s research, which they can share with key stakeholders. It provides a clear indication of the significant level of repository usage in the UK, and through the IRUS-USA pilot project it offers potential for national benchmarking in the US, as well as international comparison.
This session will focus on the IRUS-USA pilot project, a joint effort between Jisc in the UK and the Council on Library and Information Resources (CLIR)/Digital Library Federation (DLF) in the US. The presentation will highlight the outcomes of the pilot and suggest opportunities for international collaboration to support comparison and benchmarking.
Design, methodology, or approach: The benefits of IRUS-UK, particularly the ability to access standards-compliant usage data so that participating institutions can run complex reports, make cross-institutional comparisons, and better visualize and benchmark their own usage statistics, were attractive to members of the DLF and CLIR, who have worked with Jisc to bring IRUS-USA to the United States.
The IRUS-USA pilot project includes eight higher education institutions across America (University of Virginia, University of Pittsburgh, Montana State University, Indiana University, University of Maryland, University of Houston, Smithsonian Libraries, and the California Institute of Technology).
Following a yearlong period of access to a web portal and the compiling of usage statistics, pilot participants will engage in a survey and focus group to assess the benefits and barriers of IRUS-USA as well as the opportunities to expand the pilot to a larger, sustained service model.
Findings: Drawing upon quantitative and qualitative assessment data, this session will update the LAC community on the IRUS-USA pilot project achievements and lessons learned at the pilot’s conclusion. It will highlight examples of best practice, as well as opportunities for international coordination and cooperation. It will conclude by sharing the benefits of standardizing data to create clear and understandable impact measures and emerging common practices with similar tools and metrics.
Practical implications or value: The IRUS-USA Pilot Project replicates an approach initiated in the UK to aggregate COUNTER-conformant (standards based and comparable) usage statistics across different repositories to support comparison and benchmarking. At the same time it explores opportunities for international measurement and assessment and the potential impact.
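The aggregation approach described above can be sketched in a few lines. This is an illustrative example only, with hypothetical repository names and counts; the actual IRUS service ingests raw usage events and filters them against the COUNTER Code of Practice before producing comparable statistics:

```python
from collections import defaultdict

# Hypothetical COUNTER-style monthly item-request records, one per
# (repository, month). Repository names and counts are invented.
records = [
    {"repository": "Repo A", "month": "2018-01", "item_requests": 1200},
    {"repository": "Repo B", "month": "2018-01", "item_requests": 950},
    {"repository": "Repo A", "month": "2018-02", "item_requests": 1340},
    {"repository": "Repo B", "month": "2018-02", "item_requests": 1010},
]

def totals_by_repository(records):
    """Sum standardized monthly counts per repository for benchmarking."""
    totals = defaultdict(int)
    for rec in records:
        totals[rec["repository"]] += rec["item_requests"]
    return dict(totals)

print(totals_by_repository(records))
# {'Repo A': 2540, 'Repo B': 1960}
```

Because every repository reports counts conforming to the same standard, totals like these can be compared across institutions, which is precisely what makes national and international benchmarking feasible.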
Launching the Resource Repository for Assessment Librarians: From Needs Assessment to Pilot and Beyond
Nancy Turner (Temple University), Kirsten Kinsley (Florida State University), and Melissa Becher (American University)
Under the auspices of the LLAMA Assessment Community of Practice's Organizational Practices Committee, we conducted a needs assessment survey in the spring of 2017. The purpose was to gauge interest in an open access repository for assessment resources, including instruments, literature reviews, datasets, and unpublished "grey literature." The repository would also serve as a space for networking and support for professionals engaged in library assessment. We received an enthusiastic 379 survey responses. Users' most desired functionalities included:
- Search the repository for examples of instruments or tools to use (95.5%)
- Review results, reports or case studies as examples of best practice (85.1%)
- Deposit instruments (surveys, rubrics, questionnaires) or tools created by my library or organization to share with others (82.2%)
- Locate peers or colleagues doing similar work as me (77.2%)
- Review findings from similar studies to compare to my own findings (76.4%)
With these results in hand, we moved forward with a request to the LLAMA Executive Committee for financial support to test the feasibility of using Omeka.net as a repository platform. Our successful bid resulted in the website libraryassessment.omeka.net, and a call for volunteer participation yielded work groups to develop the technical infrastructure and metadata standards, policies and procedures for submission, and to conduct usability testing on the pilot site.
We will conduct usability testing in May 2018 to gather feedback on the ease with which prospective users can upload an item, apply metadata accurately, search the repository for relevant materials, and download an item of interest. Results of those tests will allow us to assess how closely our prototype meets users' expectations and functional requirements. These results, along with recommendations for moving forward, will be addressed in our report to the LLAMA Executive Committee in summer 2018.
While still in the pilot phase, we've learned that enthusiasm for this important resource for the profession brings high expectations and challenges. For instance:
- How do we create a repository of quality resources that is also comprehensive in scope, while maintaining a space that is welcoming and receptive to library professionals of all types?
- How do we balance the need for editorial rigor and consistent metadata standards with a barrier-free submission process?
- What incentives for use and for sharing would be of value to librarians working in large academic settings as well as small public libraries?
- How do we foster and sustain an online community environment that can be supported into the future, remaining vibrant and growing?
Our presentation will describe this important initiative that supports librarians doing the work of assessment, but also speaks to more general challenges in building an open access repository that is cost-effective, sustainable, and meets the needs of a wide range of practitioners.
Session 8: Diversity, Equity, and Inclusion
Taking AIM: Integrating Organization Development into the Creation of a Diversity, Equity & Inclusion Audit
Kawanna Bright (University of Denver) and Nikhat Ghouse (American University)
The assessment of diversity, equity, and inclusion (DEI) is often limited to a simple count of people and materials, or to the use of climate assessments that were originally built for nonlibrary settings and may not be suitable for academic libraries. While common, these approaches are limited in their ability to fully assess DEI initiatives and do not provide libraries with a personalized way to assess DEI within their own organizations. This paper details the first phase of the development of an audit designed to offer a more holistic assessment of DEI within an academic library setting. Currently called AIM, this audit is designed to serve as a diagnostic tool that indicates both status and progress in terms of DEI efforts. The development of the audit relied on the application of research methodology and Organization Development (OD) principles, particularly the Galbraith Star Model, with a focus on strategy, structure, processes, rewards, people, and external environment. The Galbraith Star Model integrates DEI efforts into the larger library organization and institution. In addition to these areas, the audit allows libraries to determine where they are in terms of their AIM: Awareness, Intentionality, and ability to Measure their DEI efforts. While the entire audit will not be shared, the overall structure and two sections will be shared in full detail. This paper will focus on the steps that went into the integration of OD into the audit development process, how this helped to both strengthen and provide structure to the audit, and report preliminary results from the piloting phase of audit development.
A Consideration of Power Structures (and the Tension They Create) in Library Assessment Activities
Ebony Magnus (Southern Alberta Institute of Technology); Jackie Belanger and Maggie Faber (University of Washington)
Purpose: Assessment is not neutral. Evidence is not infallible. Data are not immune to oppressive structures of power. In this paper, we will take up these principles as a starting point to explore how librarians might meaningfully engage critical perspectives to interrogate the structures of power and methodologies that both motivate and facilitate assessment work in academic libraries.
Approach: This presentation draws on work in progress that considers critical methodologies employed in social sciences, data studies, and educational research, and talks back to questions posed by these methodologies to imagine what inclusive assessment practices might look like in different institutional contexts. The goal of our work, and this paper, is to engage librarians in nuanced critical discussions about assessment.
Work on critical assessment has largely taken place in the context of library instruction and in critiques of library assessment that highlight the alignment of quantitative approaches and efforts to demonstrate library value with neoliberal trends towards the commodification of higher education. In response to this perceived overemphasis on quantitative approaches and demonstrating value, there has been a growing interest in qualitative methods such as ethnography. However, such qualitative approaches also come with potentially problematic histories and inequities.
In this presentation, we will share professional and personal experiences that led us to explore structures of power inherent in our assessment work. We will describe the sites of greatest tension in our daily work—the practices in which systemic influence has become most apparent, yet can’t be entirely undone. We do not seek to offer packaged solutions (if they even exist); but will explore ways in which librarians might begin to interrogate bias and power in our assessment activities.
Findings: This paper will detail the ongoing outcomes of our research and structured discussions, which have led to a concerted effort to recognize inequity and work towards correcting power imbalances in our everyday assessment work. There are no formal quantitative findings, nor can generalizable statements be made. The aim of the paper is instead to explore what it means to consider practical approaches to assessment through a lens focused on power and equity.
Value: While assessment literature and conferences have focused on issues such as student data and privacy (exemplified by a recent call for papers for a special issue of Library Trends on critical approaches to learning analytics), there has been less attention among assessment professionals to broader critical perspectives on their work. This paper aims to expand the current discussion of assessment in order to recognize and more effectively address issues of inclusion and inequality in various aspects of our practices.
We expect that most assessment practitioners have at some point come up against a conflict between institutional priority, administration expectations, student experiences, and methodological integrity. The presenters hope that this paper will spark discussions about how assessment practitioners engage meaningfully with the potential tensions in this work.
Social Justice Metrics for Libraries: A Community Identification of Possibilities
Lisa Hinchliffe (University of Illinois at Urbana-Champaign), Krystal Wyatt-Baxter (University of Texas), and Cameron Tuai (Drake University)
The mission of the academic institution to support student learning and success has embedded within it the need to attend to issues of equity, inclusion, and justice. However, while much work has been done to identify metrics for learning and success outcomes (e.g., the ACRL Assessment in Action project), little attention has been given to identifying potential metrics for library contribution to social justice efforts. Moreover, academic institutions likely have as part of their mission community engagement and social development and libraries will also contribute to this public effort. This paper seeks to begin to fill the gap in the assessment practice relative to social justice by offering potential metrics for consideration as well as methods and strategies for gathering and analyzing data as evidence related to those metrics.
The development of the draft social justice metrics, methods, and strategies is based on a series of community of practice conversations held during 2017–2018. Participants self-identified through a call for participants on the ARL-Assess listserv, and the work was supported through conference calls, shared documents, a Zotero library, and a Slack discussion channel. Democratically organized subgroups focused on different areas of interest with the goal of exploring a range of possibilities. At the time of this proposal, the conversations are ongoing and so specific metrics, methods, and strategies have not yet been finalized; however, areas under discussion include information literacy, staff cultural competence, diversity in hiring practices, legitimacy, partnerships with campus diversity organizations and units, and safe spaces.
By presenting the potential metrics, methods, and strategies in this paper, the authors wish to garner broader community input into how libraries can pursue and document contributions to social justice and the feasibility of different approaches as well as catalyzing pilot projects to gather and analyze data across institutions to evaluate the different possibilities.
Setting our Cites on Gender: Toward Development of Inclusive Scholarly Support Services for All Faculty
Laura Robinson and Anna Newman (Worcester Polytechnic Institute)
Understanding gendered practices and biases in scholarly communication can help librarians develop the right mix of relevant faculty support to encourage diversity, equity, and inclusion on our campuses, while contributing to broader work in strengthening equity in research practices. A number of recent studies explore gender differences and biases in peer-review (Helmer, Schottdorf, Neef, & Battaglia, 2017) and citation practices (King, Bergstrom, Correll, Jacquet, & West, 2017; Nielsen, 2016), which are key issues for librarians to consider when providing services in these areas. This work reports on a method to understand gender-specific faculty practices throughout the research and scholarly lifecycle, with particular focus on awareness of and attitudes toward research profile development, open access, and citation metrics and practices. We completed brief structured interviews with 20 faculty across disciplines and at varied points in their career trajectory, divided evenly by gender identification, in order to understand the following: Are there differences by gender in what scholarly profiles and social media accounts faculty wish to maintain? Which impact measures are prioritized, and how and why are these profiles and measures used? What motivates faculty to participate in open access publishing, or what are the deterrents? Considering the answers to these questions, how do librarians best market and deliver the appropriate services as we struggle for funding and time? Results showed that our male subjects were more active in the areas we explored while several women indicated hesitancy to engage in scholarly profile building due to personal security and privacy issues based on being female. Female subjects had direct examples of gender biases they or their colleagues had experienced, whereas several male subjects acknowledged biases but were not aware of particular examples in their disciplines. 
Few subjects of either gender deemed traditional impact measures as an accurate reflection of the importance of their work, and most subjects suggested measures that would be more meaningful and more customized to illustrate real-world value. This study has illustrated the array of faculty needs on our campus as well as the array of mindsets and gendered experiences that we must consider when providing faculty research services; future work exploring gendered practices by discipline and faculty rank will further elucidate these considerations.
Assessing the Social Value of Library Services at Drake University
Cameron Tuai (Drake University)
Research Question: How did Drake University's Cowles Library assess the social value of its library services within the university-mandated framework of the balanced scorecard?
Design: Using a case study method, this paper tested two propositions: (a) The social value of library services can be assessed using organizational theories of legitimacy; and (b) that the assessed value will be recognized in terms of the library’s influence on the allocation of resources tied to supporting the university’s social mission. To test these propositions, the library’s Planning and Assessment Committee applied concepts of legitimacy to consolidate its current strategic documents into a balanced scorecard format. This effort involved: (a) Aligning the ideals of the university’s social mission to the beliefs of the library profession; (b) Connecting the values of the university’s social mission to library practice; and (c) Linking the principles guiding the university’s evaluation of its social mission to library standards.
Findings: Applying theories of legitimacy, the Planning and Assessment Committee began its assessment of the social value of library services by reviewing university documents to identify the beliefs through which the university sought to realize the ideals of its social mission. These beliefs included: (a) community-engaged service; (b) diversity and inclusiveness; and (c) global and intercultural competencies. The committee then reviewed library strategic documents and identified specific services that supported these beliefs, such as instruction for first-generation students and multicultural collections. Lastly, the committee reviewed other university units' balanced scorecards and met with personnel charged with supporting the university's social mission in order to develop a vocabulary for describing the social value of library services in terms that would be understood and valued by the university community.
Early findings support the study's two propositions. Internally, the library has developed greater capacity to recognize and develop the social value of its services. Externally, the library has become more purposeful in establishing strategic partnerships to support the university's social mission. For example, a partnership with the Office of International Programs has provided the library with the resources necessary to assess library staff and faculty intercultural competency. The provision of these resources supports the study's proposition, in that the university understands the social value of the library to the degree that it provides the resources necessary to sustain and grow the services that create this value.
Practical Implications: As universities rediscover the importance of their social mission, libraries have an opportunity to improve the validity of their assessment efforts by including measures that account for the social value of library services. The concept of legitimacy provides one means of assessing this social value in terms that the university both understands and supports through the allocation of resources.
Assessing and Improving the Experience of Underrepresented Populations: A Participatory Design Approach
Scott Young and David Swedman (Montana State University); Haille Fargo, Steve Borrelli, Zoe Chao, and Carmen Gass (Pennsylvania State University)
How can we ensure that underrepresented populations succeed at our institutions? Participatory Design offers one answer to this question. Participatory Design is a socially-active, values-driven approach to co-creation that seeks to give voice to those who have been traditionally unheard. A team of librarians from Penn State University (PSU) and Montana State University (MSU) followed a parallel design process with two different populations: PSU worked with first-generation students and MSU worked with Native American students. Each project team facilitated a series of 10 workshops with student participants that followed a three-phase process: exploration, generation, and evaluation.
During the exploration phase, we explored topics, concepts, and problems relating to the library experience of participants.
During the generation phase, we generated ideas and potential solutions around key topics explored in the first phase.
During the evaluation phase, we evaluated and implemented the most desirable, feasible, and viable solutions generated in the second phase.
Through this process, the first-generation student group at PSU produced new service designs for engaging other first-generation students, while the Native American student group at MSU produced a seven-part poster series and social media campaign designed to welcome Native American students into the library.
In addition to co-designing new services, Participatory Design also aims to generate political outcomes that focus on empowering participants. The foundational values of Participatory Design include mutual learning, power sharing, and the equal recognition of expertise among all participants. Within this equity-focused, participatory framework, the student participants became expert library users who expressed readiness to advocate for the library to their peers. More than that, the students—members of underrepresented populations who often feel at the margins—developed a stronger sense of place and confidence on campus that will contribute to their success at our institutions. And for the librarian facilitators, the in-depth co-design process enhanced our ability to understand these student populations. We gained new insights into the experience of our student participants that we can apply to better serve these important populations. Ultimately, the Participatory Design process equipped us with the tools and insights to assess and improve the conditions of their success.
In this way, we found Participatory Design to be an empowering, compassionate, and effective approach for designing and assessing library services and experiences. This paper will present the principles of Participatory Design, our step-by-step process, and the challenges and limitations of this approach. The key takeaway of this paper will be practical recommendations for building a sustainable, participatory design and assessment program with underrepresented populations.
Session 9: Organizational Issues II
Strategic Library Assessment: Aligning with your University’s Strategic Plan
Kathy Crowe (University of North Carolina at Greensboro)
Demonstrating academic libraries’ value has been an important issue since the publication of ACRL’s The Value of Academic Libraries in 2010. More recently, in its 2017 report, ACRL Academic Library Impact: Improving Practice and Essential Areas to Research, ACRL recommended that libraries:
- Match library assessment to institutions’ missions
- Include library data in institutional data collection
And, in ACRL’s new Standards for Libraries in Higher Education (2018), the Association included as performance indicators for the Institutional Effectiveness principle:
- The library defines and measures outcomes in the context of institutional mission.
- The library develops outcomes that are aligned with institutional, departmental and student affairs outcomes.
These guidelines confirm that academic library administrators and assessment librarians need to align strategic planning and assessment efforts with their university’s planning requirements. This positioning serves as an important strategy to show the library’s value and contributions to the campus.
At a high-research-activity university, a new chancellor began a strategic planning process in fall 2016 with a focus on "big ideas" and "giant steps." The planning framework emphasizes student, knowledge, and regional transformation in three areas: Health and Wellness, Vibrant Communities, and Global Connections. Each year, units are required to submit their strategic planning report to the Office of Accreditation and Assessment with objectives, measures, and targets. At the end of the year, units submit a report showing whether these objectives were accomplished; it includes findings on whether targets were met, along with an action plan for improvement. For both the planning and annual reports, units are asked to link their goals and accomplishments to the university's strategic plan where appropriate. These reports are ultimately sent to the Chancellor's Office, and a campus-wide report is developed.
The university libraries linked several of their objectives to supporting student and knowledge transformation through information literacy, student employment opportunities, learning spaces, and open educational resources. Specific metrics provided concrete evidence of goal attainment. It was gratifying that all of the libraries' submitted initiatives appeared in the chancellor's first strategic planning report in spring 2017.
The libraries also took steps to develop specific assessment measures to provide additional evidence of their contributions to the strategic plan. The annual assessment plan includes studies that will support both the libraries' and the university's goals. For example, surveys justified the need for renovations that will enhance learning spaces for students and thus advance the university's student transformation goals. A student success study supported information literacy goals.
This paper and session will provide a case study with practical direction on how libraries can construct and align their goals with their university’s strategic plan, identify metrics and develop assessments that demonstrate their value and contributions to their institution.
Tracking Unicorns: A Multi-Institutional Network Analysis of Library Functional Areas
Emily Guhde (Georgetown University) and Brian Keith (University of Florida)
Libraries in academic and research environments have experienced a sustained period of profound change. The work of library professionals has expanded into diverse new fields, such as data management and assessment, while longstanding specializations, such as reference and cataloging, have adapted. But how do the new and the traditional functional areas relate and connect in the work of libraries? This paper utilizes a unique source of big data and innovatively applies research methodologies to investigate the current state of librarianship and answer this question.
Since its launch in 2013, the ARL PD Bank has grown and now represents a unique digital collection and an associated metadata set depicting the ways in which work is defined and organized. These metadata include the identification of the functional area(s) for each position, like “GIS Systems/Data” or “Liaison.” These functional areas are selected from a comprehensive, controlled vocabulary (see https://arlpdbank.uflib.ufl.edu/docs/ARLPDBankDefinitions.pdf), which allows us to study the frequency of functional areas and the relationships between the functions within individual positions.
In this study, we use network analysis to examine the relationships between the 37 functional areas used as descriptive data. The relationships take the form of co-occurrence of functional areas at the position level. For example, how frequently does "Liaison" co-occur with "Instruction"? Or "Offsite Storage" with "Preservation/Conservation"? We also explore gaps in co-occurrence to understand which functions most frequently occur in isolation. This analysis expands our understanding of functional relationships beyond anecdote and assumption.
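The co-occurrence counting at the heart of this analysis can be sketched briefly. The position records below are hypothetical stand-ins, not actual ARL PD Bank metadata; each list represents the functional areas assigned to one position:

```python
from itertools import combinations
from collections import Counter

# Hypothetical positions, each listing its assigned functional areas
# drawn from the controlled vocabulary.
positions = [
    ["Liaison", "Instruction", "Reference/Research"],
    ["Liaison", "Instruction"],
    ["Offsite Storage", "Preservation/Conservation"],
    ["Library Information/Systems"],  # a function occurring in isolation
]

def cooccurrence_edges(positions):
    """Count how often each pair of functions appears within one position."""
    edges = Counter()
    for funcs in positions:
        # sorted() gives a canonical pair ordering, so (A, B) == (B, A)
        for pair in combinations(sorted(set(funcs)), 2):
            edges[pair] += 1
    return edges

def isolated_functions(positions):
    """Count functions that appear as the sole function of a position."""
    return Counter(funcs[0] for funcs in positions if len(funcs) == 1)

edges = cooccurrence_edges(positions)
print(edges[("Instruction", "Liaison")])  # prints 2
```

The resulting edge weights can be fed directly into a network visualization tool, with edge thickness proportional to co-occurrence frequency, while the isolation counts identify the disconnected nodes discussed in the findings.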
With the organization of the dataset and preliminary analysis complete, a number of initial findings have emerged, including:
- Professional positions tended to have more frequent co-occurrences between distinct functions, whereas support positions tended to have more frequent occurrences of functional isolation (e.g., "Library Information/Systems" only).
- Among the support positions, six functions (Information Technology/Systems, Cataloging/Bibliographic Control/Metadata, Circulation, Acquisitions, Digital Services, and Archiving/Curatorial/Rare Books) occurred in isolation more frequently than they co-occurred with other distinct functions.
- Among Senior Management positions, the most common co-occurrences were between distinct functions that represent traditional library services: Access Services and Circulation; Reference/Research and Collection Development/Management; Reference/Research and Instruction; Reference/Research and Liaison.
Further analysis is underway, as is the refinement of network visualizations that vividly demonstrate the functional relationships.
This research is valuable because network analysis and visualization have not yet been applied to studying library work. The size of this dataset, with over 1,800 positions and thousands of relationships depicted, makes it uniquely valuable as a resource for expanding our understanding of how a diverse professional landscape is organized and how the requisite skills, knowledge, and abilities coexist and relate. This research can inform library leaders in organizing the work of libraries and can orient professionals and MLS students to the need for cross-functional skill sets in this new environment. The findings can also inform MLS programs in considering the curricular connection with the professional experiences of their graduates.
Two Years and Change: Building a Sustainable Culture of User-focused Assessment
Katy O’Neill (Loyola Notre Dame Library)
As part of the transformation initiated by the current director, library senior leadership at a midsize academic library made a commitment to assessment as a tool to document success and support lean operations. The strategy included hiring a two-year, term-limited librarian in summer 2016 to focus on assessment and act as the change champion, building capacity within the staff. In the last two years, the library's assessment culture has been recast into an environment where staff use assessment in planning and decision making. After documenting the library's assessment environment in a status report, the assessment librarian and library administration developed a plan to stabilize the data gathering infrastructure, implement best practices and standards, and build needed skills in survey design, data visualization, and other areas. Kotter and Schlesinger, in their landmark 1979 article "Choosing Strategies for Change," suggested six methods for dealing with change resistance. The library administration used three of these approaches—education and communication, participation and involvement, and facilitation and support—to engage staff in assessment and to build a sustainable culture of user-focused assessment. These change management approaches may be applicable to all libraries. Today supervisors and staff recognize that data and assessment are an integral part of their ongoing work and consistently use assessment in developing user-focused services and identifying needed resources.
“Buttoned Up Assessment”: Organizational Culture and Improving Department-Wide Assessment Efforts at Brown University Library
Boaz Nadav-Manes and Jeanette Norris (Brown University)
In concert with library-wide strategic planning and collections realignment practice, Public Services and Technical Services departments at Brown are reimagining how assessment happens on a daily basis and how to provide meaningful information about our services and the effectiveness of our practices. Historically, the assessment of access and technical services departments has been grounded in counting how much we do for the purpose of reporting those figures out; staff argue, however, that volume alone is not a sufficient metric and is not representative of the work they actually do or the impact it has. In an effort to increase staff participation in assessment efforts and improve the meaningfulness of the data, we’re engaging all levels of staff in a project to design the metrics we use in assessing our work as well as the tools used to capture the data. We’re designing an assessment plan in an effort to improve our ability to prepare for future service and collections needs using predictive analytics, tell the story of our work’s impact, and ensure that our limited time and resources are spent on the work and collection support that truly make the most difference to our faculty and students, as well as our colleagues in the library.
We will discuss our process of designing the metrics, describe how they compare with previous assessment strategies, and assess how effective the approach has been in encouraging staff buy-in.
Finding Value in Unusual Places: Transforming Collaboration Workshop Data to Inform a Library Cooperative’s Strategic Plan
Laura Spears, Melissa Powers, and Bess De Farber (University of Florida)
Academic library professionals are increasingly involved with the academic research community while also supporting their libraries’ digital collections, web-based discovery systems, and the growing use of social media and mobile devices in information seeking. Public library professionals face demanding social-service and information-seeking needs from communities for which libraries are the key institutions providing equitable access. These demands require dynamic professional development, where experts agree that “the informal network developed through many library leadership training programs is often the most valuable and durable benefit of training.”
Despite the long-term positive impact of facilitating networking opportunities for participants, this engagement can be costly and time-consuming. To address the need for engagement among regional library professionals, the Northeast Florida multitype library cooperative, NEFLIN, enlisted the Collaborating With Strangers (CoLABs) Workshop team to facilitate a 45-minute CoLAB mini-workshop during the region’s 2017 annual meeting. CoLABs offer methods for breaking down barriers, encouraging participants to share ideas and create new connections, resulting in some of the most productive and memorable sessions participants experience in a conference setting. The challenge of networking is that most people avoid conversing or working with “strangers,” limiting their access to other people’s assets, the basic ingredients necessary for innovation. However, experience shows that discovering available individual or community-held assets, whether in a library, a classroom, or at a conference, can lead to a dramatically greater sense of community and awareness of resources.
The conference workshop provided a structured environment where participants connected during one-on-one, three-minute speed-meetings, using profile cards produced during the workshop to practice 1) conversing with others; 2) discovering untapped resources; and 3) initiating cooperative, collaborative, or mentoring partnerships. More than 100 participants generated a large amount of qualitative data captured by the profile cards that included eleven demographic questions, contact information and four questions about the participant’s role in the library, passion for their work, projects or interests, and an unknown personal fact.
Drawing on Bryson’s (2017) work in strategic planning for public and nonprofit organizations, we structured the data into a narrative describing the emerging themes as competencies, opportunities, and community assets. These findings 1) tell a story about these regional libraries; 2) identify librarian needs and interests; and 3) suggest desired training and development. Our study will reveal substantial characteristics and interests of NEFLIN members, useful both for strategic planning and for advocacy initiatives led by the NEFLIN staff, member libraries, and other vested stakeholders.
The resulting data also can be used to demonstrate the efficacy of CoLAB Workshops for quickly extracting substantial amounts of qualitative data and generating insights, some of which may result in potential long-term impacts on specific communities such as the libraries and patrons served by NEFLIN.
The study team aims to enhance the understanding of useful data hidden in unexpected activities like CoLABs, which speaks to the need for assessment in libraries not only to measure value, but also to capture it.
Zero to Sixty: Implementing Outcomes Assessment for an Entire Organization
Krystal Wyatt-Baxter (University of Texas)
Beginning in Fall 2017, a large research university library system was asked to participate in a campus-wide outcomes assessment program. This paper will detail the process of implementing the program, which involved helping staff from every area of the library write and carry out unit-level improvement-focused assessment plans. It will share lessons learned in the first year of implementation, and will highlight successful assessment methods and changes made based on findings.
Steps to implementing the program included holding workshops for library leadership to introduce the concept of “Continuous Improvement Plans”; working with individual leaders to determine the number of plans needed in their area and designate “plan writers” for each plan; holding workshops for plan writers; meeting individually with plan writers to review their drafts and implementation plans; and providing continuous assessment support throughout the process. While time-consuming, this approach ensured that each plan was tailored to the area of the library it addressed and that staff new to assessment received adequate support.
Each plan was devised to fulfill the “Continuous Improvement Framework” designed by the institutional assessment office, consisting of nested goals, outcomes, strategies for achieving outcomes, assessment methods, and targets. The reporting phase of the framework will consist of findings and next steps for each assessment method.
The first round of findings is due in September, but the effects of implementing the program are already apparent. Through the process of getting staff from across the library system involved in assessment, the culture of assessment has begun to change to one in which assessment is everyone’s responsibility. Staff are empowered to take ownership of assessment in their areas, and to use only methods that prove to be helpful in decision making.
Library staff without assessment experience are often wary of assessment and feel uncomfortable with, or incapable of, implementing assessment methods in their areas. However, they are often best positioned to decide which methods and measures are most useful for improving their day-to-day work. This paper details a practical approach to supporting library staff in assessing their work using tailored, domain-specific methods that can be replicated by other assessment librarians.
Session 10: Methods and Tools II
From Indifference to Delight: Gauging Users’ Preferences Using the Kano Model
Gabriela Castro (Cornell University) and Zoe Chao (Pennsylvania State University)
How do you measure your users’ satisfaction with services or resources? How do you know if users care about a new product or service that you want to develop? The ability to isolate pain points in the services or products we provide allows us to make better decisions about how to improve the user experience. Doing so with increasingly limited resources requires succinct and efficient methods of gathering feedback.
The Kano model, developed in the 1980s by Professor Noriaki Kano, is a method to gauge customer satisfaction and the performance of a product or service by isolating component features and asking users how they feel about the presence or absence of each feature separately in the format of a simple questionnaire. The framework relies on a Likert-type scale for answers which are then charted onto a scoring table with a hierarchy from indifference to must-have.
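The scoring step described above can be sketched in a few lines of code. This is a minimal illustration using the standard published Kano evaluation table; the feature and response data are hypothetical, not drawn from the studies discussed here.

```python
# Minimal sketch of Kano-model scoring. The evaluation table below is the
# standard published Kano table; the response data are hypothetical.
from collections import Counter

# Answer scale used for both the functional ("How do you feel if the feature
# is present?") and dysfunctional ("...if it is absent?") question forms.
ANSWERS = ["like", "expect", "neutral", "live_with", "dislike"]

# Rows = functional answer, columns = dysfunctional answer.
# A = Attractive, M = Must-be, O = One-dimensional,
# I = Indifferent, R = Reverse, Q = Questionable
TABLE = {
    "like":      ["Q", "A", "A", "A", "O"],
    "expect":    ["R", "I", "I", "I", "M"],
    "neutral":   ["R", "I", "I", "I", "M"],
    "live_with": ["R", "I", "I", "I", "M"],
    "dislike":   ["R", "R", "R", "R", "Q"],
}

def classify(functional: str, dysfunctional: str) -> str:
    """Map one respondent's answer pair to a Kano category."""
    return TABLE[functional][ANSWERS.index(dysfunctional)]

# Hypothetical answer pairs for one proposed feature; the most frequent
# category across respondents is taken as the feature's classification.
responses = [("like", "dislike"), ("like", "neutral"), ("expect", "dislike")]
tally = Counter(classify(f, d) for f, d in responses)
print(tally)
```

A feature most respondents would like present and dislike absent scores as one-dimensional (O), while one they would like present but tolerate absent scores as attractive (A), giving the indifference-to-must-have hierarchy described above.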
Although the Kano model is frequently used in business to measure customer satisfaction and needs, fewer studies have applied the method to libraries. In this study, an assessment analyst at Cornell and a user experience librarian at Penn State applied the Kano model to three separate types of library services: space, digital signage, and web content. In the space and signage studies, user responses identified the essential features they desired. Graduate students at Cornell showed strong interest in selected features of a planned graduate-only study room but expressed aversion to relinquishing other existing graduate-student spaces. Students at Penn State favored a more informative digital sign over one with attractive images. By aligning users’ responses with the Kano evaluation table, the authors categorized perceptions of each feature of space design and digital signage and prioritized the functionality and attributes of each. Though the responses to the web content study failed to yield clear outcomes, the results helped clarify the proper scope for applying this method.
The Kano model’s structured question-and-answer format, scored on a table, was an efficient and time-saving method for both library staff and library patrons. It forced us to isolate from the start the service aspects we wanted feedback on, without resorting to more time-consuming research methods. Likewise, it provided quick answers and supporting evidence to facilitate decision making for our administrators, allowing us to calibrate our services to match our users’ needs.
From Default to Design: Design-Based Assessment for Libraries and Librarianship
Rachel Ivy Clarke (Syracuse University)
There can be no doubt about the recent rise of interest in assessment in librarianship. Popular assessment methods range from quantitative approaches, such as user surveys, usability heuristics, and data and search logs, to qualitative techniques like user interviews, photo elicitation, immersive ethnographies, and more. Many discussions ensue about which of these science-based methods best applies to the library work at hand, but few have questioned the assumption that scientific methods are the most relevant and applicable assessment methods for librarianship overall.
However, new diverse perspectives on librarianship—distinct from science—are emerging. In recent years, a well-established record of research has demonstrated design as an alternative approach to science. Design is often conceptualized in a limited way in librarianship, focusing on architecture and interior spaces or technological applications like web user experience (UX). But design is not limited to furniture choices or usability testing. Scholars have identified consistent factors and aspects of design processes across a diverse range of domains that unite design as a unique discipline, distinct from science. Designers from all fields—from architecture to engineering, from fashion to technology—undergo similar processes, revealing a common set of fundamental principles that underlie what Cross calls a “designerly way of knowing.” This is more than just the popular model of the “design thinking” process—it is a disciplinary approach to knowledge. While science observes and describes the existing world with the goal of replicability and prediction, design creates artifacts intended to solve problems and, ultimately, change the world from its existing state to a preferred state (Simon 1969, 1996; Cross 2011; Nelson and Stolterman 2012). Emergent research demonstrates that the field of librarianship is more aligned with these designerly ways of knowing than with science.
This paper argues for the inclusion of design evaluation techniques in library assessment. First, the paper introduces techniques common to design, such as rationale and critique, comparing and contrasting each with more traditional scientific approaches. After a brief definition of each technique, examples from both design fields and librarianship will be presented and discussed to illustrate application and relevance to the field. The paper concludes with a discussion of implications for library assessment, including barriers to communicating the value of design-based assessment, the need for advocacy regarding intellectually diverse methods of assessment, and how design-based approaches contribute to diversity and equity in librarianship and library services.
 Nigel Cross, “Design Research: A Disciplined Conversation,” Design Issues 15, no. 2 (1999); Nigel Cross, Design Thinking (Oxford: Berg, 2011).
 See, for example, Herbert Simon, The Sciences of the Artificial (Cambridge, MA: MIT Press, 1969); Donald A. Schön, The Reflective Practitioner: How Professionals Think in Action (New York: Basic Books, 1983); Nigel Cross, Design Thinking (Oxford: Berg, 2011); Harold G. Nelson and Erik Stolterman, The Design Way: Intentional Change in an Unpredictable World, 2nd ed. (Cambridge, MA: MIT Press, 2012).
 Rachel Ivy Clarke, “It’s Not Rocket Library Science: Design Epistemology and American Librarianship” (doctoral dissertation, University of Washington, 2016); Rachel Ivy Clarke, “Toward a Design Epistemology for Librarianship,” The Library Quarterly: Information, Community, Policy 88, no. 1 (2018): 41–59.
Reflections on Creating a Multi-site, Mixed Methods, and Interpretive Assessment Project
Darren Ilett and Natasha Floersch (University of Northern Colorado); Emily Dommermuth Gari, Juliann Couture, and Lindsay Roberts (University of Colorado Boulder); Renae Watson, Kristine Nowak, and Jimena Sagàs (Colorado State University)
Collaborative assessment research is capable of providing rich, multifaceted insights into library users’ behaviors and attitudes, but it can also present unique challenges. This paper describes the research experiences of our team of eight library faculty in conducting a collaborative assessment research project at three sites, all public research universities in a Mountain West state. The aim of the paper is to share our experiences with others who are interested in undertaking a similar project at their institutions, so that they might identify and employ best practices while remaining aware of possible challenges.
Our assessment project explored first-generation college students’ use of and attitudes toward the libraries at the three sites. Much of the research on first-generation students in Library and Information Science is informed by deficit thinking regarding this student population. Instead, we employed a constructivist framework and a multi-site, mixed-methods, and interpretive research design that focused on students’ strengths and assets. While the initial survey provided a snapshot of how first-generation students used library services and spaces in quantitative terms, follow-up interviews added qualitative depth by giving us a glimpse into students’ experiences, knowledge, and skills. Currently, we have completed data collection and are analyzing both datasets. Our timeline is to complete analysis by Fall 2018.
In this paper, we focus on the researcher experience rather than the findings of our assessment project. From research design to data collection, data analysis, and reporting findings, we reflect on what worked well and what could be improved in our collaboration. Successful elements included drawing on diverse individual and institutional strengths, taking on clear but flexible roles, and communicating frequently and openly. Possible improvements could include earlier and more robust planning for funding sources, more frequent check-ins during data collection, and adjustments to data collection instruments.
Based on our experiences, we also consider the advantages and drawbacks of collaborative assessment research generally. On the one hand, a collaborative, mixed-methods research design draws from many data sources and types and thus offers a more complete picture of the first-generation student experience in the state. It also provides a chance to compare sites and thus view respective institutional practices in a new light. On the other hand, such a research design presents challenges in managing data, reconciling multiple researcher voices, and writing research results.
After this session, attendees will gain an appreciation for the possibilities of collaborative assessment research as well as ideas for making such a project a success.
Using LibQUAL+® as a Foundation for the Library’s Support of Accreditation and Re-Certification Efforts
Michael Maciel (Texas A&M University)
Academic libraries play a vital role in higher education. All six of the US regional accrediting agencies recognize this role and, as a result, include specific expectations that libraries must meet in order for an institution to become accredited or recertified.
The objective of this presentation is to use LibQUAL+® as the foundation for building the reporting and data-collection analyses needed to meet regional accrediting agency guidelines and standards. LibQUAL+® is a powerful tool with many diverse and impactful applications in library operations. The survey and its subsequent analytical tools, however, require additional assessment efforts and reports in order to meet regional (and programmatic) accrediting agency requisites.
This presentation will, using the six regional accrediting agency standards, first demonstrate how the LibQUAL+® survey can act as a unifying theme for a successful compliance reporting effort and, second, provide a template, using LibQUAL+® and other assessment devices, for compliance reporting.
The LibQUAL+® user-centered survey provides an excellent keystone for tracking and reporting on a library’s contributions to higher education. The survey findings can, and should, have a major impact on users’ access to relevant information resources, services, personnel, and facilities that support and enhance an institution’s learning, teaching, and research missions. As such, the LibQUAL+® program makes an excellent building block for demonstrating compliance with standards as well as for identifying areas of potential improvement or growth.
Specifically, the presentation will include assessment and reporting devices to address a library’s role in providing:
- Relevant and effective information resources,
- Easy-to-use access to these resources that supports an institution’s learning, teaching, and research missions,
- Services that support the informed and responsible use of information resources,
- Facilities that provide environments that enable users to access information, learn, and teach,
- Qualified personnel to support the development, acquisition, access to, and user support of information resources,
- Qualified personnel to support the services and facilities that enable users to effectively learn, teach and research, and,
Ultimately, the presentation will demonstrate how a library supports an institution’s mission and strategic plans.
Building a “Library Cube” from Scratch
Kirsten Kinsley, Louis Brooks, and Jesse Klein (Florida State University)
Purpose: For over a year, our academic library has been exploring how to design a prototype of a multidimensional data warehouse, or “Library Cube,” to address the growing need to consolidate, store, access, and analyze increasingly complex library statistics in a more cost-effective and sustainable way. We want to create a warehouse that provides the infrastructure for analyses integrating both library and university-wide data. In creating this prototype, we are striving for an innovative solution that facilitates organizational transparency and demonstrates high-impact practices for leveraging library statistics in communicating our value to various stakeholders.
Currently, our library data is siloed and inaccessible to stakeholders who may want to offer a more nuanced view of the relationship between libraries, student outcomes, and institutional goals. The accessibility of libraries, use of materials and space, management and development of print and digital resources, engagement of students and faculty, and provision of specialized and evolving services—to name a few—are all essential to a comprehensive analysis of organizational success indicators. There is a significant need for a centralized, integrated system to manage and analyze this data. Though there are currently a few commercial systems available to libraries, such as LibInsights, that collate some data specific to library services, they tend to be cost-prohibitive and/or limited—certainly in amassing, linking, and standardizing data from several organizational units.
Design/Methodology/Approach: We are proposing the development of a prototype that brings together disparate datasets for comprehensive analysis—a system for which all planning and implementation documentation will be shared in easily accessible formats for use by other libraries. Our prototype data warehouse will:
- Be easy to set up, access, and utilize
- Be cost effective to implement, sustain, and operate
- Support multiple data sources and types
- Be easily modified and extended to meet different institutional needs
- Provide useful data analytics
- Offer methods and mechanisms to address ethical considerations about user privacy
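The objectives above can be illustrated with a minimal star-schema sketch in SQLite. All table names, columns, and figures below are hypothetical illustrations, not the authors’ actual design:

```python
# Hypothetical star-schema sketch of a "Library Cube" prototype in SQLite.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: anonymized patron groups (storing only aggregate
# categories, never identifiable records, speaks to the privacy objective).
cur.execute("CREATE TABLE dim_patron_group "
            "(group_id INTEGER PRIMARY KEY, label TEXT)")

# Fact table: counts of library interactions per group per term.
cur.execute("""CREATE TABLE fact_usage (
    group_id INTEGER REFERENCES dim_patron_group(group_id),
    term TEXT, visits INTEGER, checkouts INTEGER)""")

cur.executemany("INSERT INTO dim_patron_group VALUES (?, ?)",
                [(1, "undergraduate"), (2, "graduate")])
cur.executemany("INSERT INTO fact_usage VALUES (?, ?, ?, ?)",
                [(1, "2018FA", 1200, 300), (2, "2018FA", 400, 250)])

# One analytic slice of the cube: checkouts per visit by patron group.
cur.execute("""SELECT g.label, SUM(f.checkouts) * 1.0 / SUM(f.visits)
               FROM fact_usage AS f
               JOIN dim_patron_group AS g USING (group_id)
               GROUP BY g.label ORDER BY g.label""")
rows = cur.fetchall()
print(rows)  # one (group, checkouts-per-visit) pair per patron group
```

New data sources become additional fact or dimension tables, which is what makes a design like this easy to extend for different institutional needs.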
Findings: We will report back on our ability to create a prototype which meets the six objectives listed above. By sharing what we learned from developing our own prototype we hope to facilitate the ability of other libraries to create a similar system that is readily applicable and adoptable.
Practical Implications or Value: By designing a usable prototype, an academic library can have organization-wide access to data that is normally siloed both by where it is located and stored and by its lack of connection to other important data points. Competition for limited resources and increasingly sophisticated data infrastructure in numerous sectors necessitate data-driven decisions in libraries that align with internal and external goals. University administration, faculty, and library leadership require evidence-based performance evaluations as the basis for maintaining and/or increasing support for library activities. As the field of competition and opportunity continues to emphasize data-driven decision making, libraries are uniquely positioned to aggregate, collect, store, analyze, and disseminate data reflecting the multidimensionality of the university or community experience.
LibQUAL+ Results Bring More Questions than Answers
Kimberly Vardeman and Jingjing Wu (Texas Tech University)
The Texas Tech University Libraries conducted the LibQUAL+ survey in 2017. After receiving the survey results from LibQUAL, the libraries had many unanswered questions—What is the next step? How should the results be shared? What are the problem areas? Which problems should be addressed first?
The User Experience (UX) department changed the data visualization format and created infographics and new charts to highlight key findings. They coded open-ended responses into emergent topics and subtopics. The revisualization and reorganization of the data made relevant information more understandable to stakeholders.
The UX department identified the website as a potential problem area that merited further study. They collaborated with the web librarian to outline projects to gather more evidence to guide their action. In addition to subsequent surveys, they used a variety of research methods to assess the website: X/O tests to allocate valuable home page real estate to the services and features of most interest to users; card sorting to design a more understandable website navigation; heuristic evaluations of frequently used webpages; usability testing to evaluate whether common tasks could be performed easily; and A/B tests to compare different design prototypes. By triangulating these data sources, they made informed decisions about how to improve the website.
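As one illustration of that final step, the result of an A/B test between two page prototypes can be checked with a standard two-proportion z-test. The counts below are invented for illustration, not actual study data:

```python
# Hypothetical sketch of judging an A/B test between two page prototypes
# with a two-proportion z-test (pure standard library; invented counts).
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """z statistic for H0: the two task-success rates are equal."""
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (success_a / n_a - success_b / n_b) / se

# Say 45 of 60 participants completed the task on design A, versus 30 of
# 60 on design B; |z| > 1.96 suggests a real difference at the 5% level.
z = two_proportion_z(45, 60, 30, 60)
print(round(z, 2))
```

A check like this guards against redesigning a page on the basis of a difference that could plausibly be noise.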
As an initial step, LibQUAL+ does not offer specific answers but suggests potential directions for further study. This paper presents ways to represent data meaningfully for a target audience. It describes how to iteratively test the UX of a website using several complementary methods following an exploratory survey. These strategies extend the value of survey results, making assessments more effective and practical. The same pattern can be used not only for a website but for evaluating other services.
Concurrent Session IV
Session 11: Nontraditional Users
Perspectives on the Limits of Assessment: Ithaka S+R’s Indigenous Studies Project
Danielle Cooper (Ithaka S+R)
Library practices that are inclusive to Indigenous communities and scholarship are grounded in the recognition that Indigenous cultures feature unique experiences and knowledges. This work cannot be fully supported without a commitment to decolonizing academic libraries, including the assessment and user experience research that informs how library services and tools are designed and implemented. Indigenous studies scholars utilize methodologies that challenge Western conceptualizations of “knowledge” and “research,” which necessitate library services and tools that diverge from Western models of research support, including those provided by libraries. In recognition of this, 35 librarians at 12 academic libraries are currently conducting a collaborative qualitative study on supporting their users in Indigenous Studies utilizing Indigenous methodologies, to be completed in Winter 2019. The work is being undertaken with Ithaka S+R, a not-for-profit organization that specializes in conducting applied research in library settings.
This presentation explores the project’s unique methodology with the goal of reflecting on the limitations of mainstream library assessment approaches and sharing how Indigenous approaches can be developed in support of decolonizing academic library practices. Effective engagement with Indigenous studies is a multilayered and holistic process, necessitating library user research approaches that are attuned to connectivity and cooperation. The paper discusses this research process in detail including: working with Indigenous scholars and librarians to advise on all aspects of the project, building ongoing relationships with participants throughout and beyond the formal research engagement, developing data collection and analysis methods that are reflective of and responsive to Indigenous epistemologies, and identifying next steps arising from the research that are beneficial to Indigenous communities. By attending to and honoring user positionality, the project demonstrates how inclusive user research begins at the point of research design.
1G Needs are Student Needs: A Mixed-Methods Approach to Understanding the Experiences of First-Generation College Students
Emily Daly and Joyce Chapman (Duke University)
Colleges and universities are devoting increased attention to the needs of first-generation (1G) college students, who are attending college in record numbers yet not graduating at the same rate as their non-1G peers. Research indicates that 1G students experience unique struggles in a university setting—they often arrive without family experience in navigating college life, and many come to college with financial challenges and unique responsibilities that can have an impact on their success.
In 2017, a team of library staff became interested in learning more about 1G students at Duke University. The team reviewed relevant literature and interviewed colleagues at other libraries who had undertaken similar studies. The team then worked with campus offices to understand the overall landscape for 1G students, including services, scholarships, and support provided by other groups on campus. Library staff conducted six focus groups with 1G students and tracked the first-generation status of respondents to the library’s 2018 biennial user survey in order to analyze aggregate survey findings for 1G respondents. The 2018 user survey included a set of questions about how students perceive the libraries and how confident they feel about using library spaces and services. For instance, students rated the extent to which they agree with such statements as, “The library is an important part of my experience at Duke”; “For me, the library is a welcoming place”; and “I am confident in my ability to use library resources.”
This mixed methods study revealed several challenges that 1G students face and confirmed many others. For instance, students find that locating resources is not always straightforward and that finding books using call numbers can be daunting. Participants discussed financial stresses, particularly the cost of textbooks. The study revealed that getting help from library staff is important, but not always easy; that an ecosystem of supportive offices on campus is critical; and ultimately, that 1G needs are student needs—by developing services to support 1G students, the library is better positioned to help all students.
Research team members incorporated findings from the focus groups and survey results into a report that they shared with library staff, campus stakeholders, higher education communities interested in providing support for 1G students, and even potential donors interested in funding relevant services or programs. The report included a set of recommendations ranging in scale from posting information about library services to a campus listserv for 1G students and expanding the textbook lending program, to funding a library position that would focus on developing programs for 1G students and others identified as potentially at-risk and on promoting the library as a partner in support of undergraduate student success.
This paper summarizes the research team’s methodology and findings and describes ways that library staff and campus stakeholders have implemented the team’s recommendations to improve library services and access for 1G students.
Tracking the Elusive Undergraduate Nonuser: Triangulating a Senior Survey, Library Instruction Data, and LibQUAL+® Results
Laurel Littrell (Kansas State University)
Purpose: Generally, obtaining library use and satisfaction information from library users is not particularly difficult. Finding information from nonusers of the library is quite another matter.
Design and methodology: At Kansas State University, library assessment takes many forms, as is typical for most institutions. Among these, a campus-wide LibQUAL+® survey is launched every three to four years, and every semester the central Office of Assessment sends a locally developed “Senior Survey” to all students graduating with a bachelor’s degree. Information about library instruction is tracked through LibAnalytics.
Likert-type or rating scales from LibQUAL+® and many other surveys are familiar to most assessment librarians, but the K-State Senior Survey asks questions from a slightly different angle. In 2009, four questions regarding library use were added to the survey, and since that time these questions have been tracked longitudinally and compared with LibQUAL+® results, providing reinforcement and validity to the results of both surveys. Instead of being asked to rate their experiences or perceptions on a scale, the Senior Survey simply lists an attribute, such as “Library staff: assistance in research, effectively finding information, learning how to use the libraries” and asks the students to select one of four options:
- Used and was satisfied
- Used and was dissatisfied
- Knew about and did not use
- Did not know about
Along with the four questions about the library, students are also asked about many other aspects of campus life, as the survey is quite extensive and comprehensive. Through the use of PowerBI, one can easily extract results from 2013 onward by undergraduate academic college and further compare these responses with LibQUAL+® discipline-specific results, along with examining statistics on library instruction taking place in those disciplines.
Findings: Notable consistency exists between the LibQUAL+® data and the Senior Survey regarding library use and awareness in various disciplines. The level of instruction activity mirrors these results as well and can assist librarians in pinpointing areas of the university that are underserved by the libraries, particularly areas that showed greater dissatisfaction or less awareness of available library resources. Information from both the 2018 Senior Survey and the 2018 LibQUAL+® results, with earlier longitudinal comparisons, will be shared, along with samples of more detailed disciplinary analysis showing similarities and differences between the two data sets, triangulated with library instruction statistics illustrating the degree of librarian interaction in these disciplines.
Practical implications and value: For areas that report more “did not know about” or “did not use” responses, it may be possible to probe more deeply with students and instructors about why—what resources are they using instead? Are there locally available resources within the program? Are they using tools that are provided by the libraries but are unaware who provides them? How can we improve our communications and outreach with these areas?
The ability to identify areas of the university with more nonusers is valuable and could provide a model for other institutions that use a similar student survey.
Collaborative Assessment for Student Success: Analyzing Nontraditional Students’ Library Perceptions and Usage
Samantha Harlow and Karen Stanley Grigg (University of North Carolina at Greensboro)
Assessing the library’s impact on student success is vital for all library departments, but many assessments exclude nontraditional students. According to the National Center for Education Statistics (NCES, https://nces.ed.gov/pubs/web/97578e.asp), a nontraditional student is defined by many characteristics, such as: delayed enrollment into higher education programs, part-time attendance, working full-time, financial independence from parents, caretaking responsibilities, single parenthood, and having received a GED. Because nontraditional students can feel isolated from campus resources, reaching out to this population of students is crucial. Though there is literature on creating library outreach and instruction for nontraditional students, there is a gap in the literature about assessing nontraditional students’ research needs. Creating assessments that include and focus on nontraditional students is key to improving library resources and services.
Two librarians from a midsize public university created assessments focused on two student groups with many nontraditional students: online students and transfer students. Assessing transfer students has been a long-standing strategic mission for this library, and extending assessment to online students is a more recent step. This paper will cover an introduction to nontraditional students, our assessment methodology and approach to these student groups, our findings, and the value of assessing nontraditional students at other institutions.
The purposes of these assessments were to target online and transfer students and assess their use and perceptions of library resources and services. The assessment plan for nontraditional students included many approaches. Transfer students were assessed through surveys, a pretest and posttest, and focus groups; this work was also expanded under the Association of College & Research Libraries (ACRL) Assessment in Action (AiA) program. Online students were targeted through surveys and usability studies of library websites, as well as by evaluating nontraditional student chat transcripts and creating online advisory groups. With all assessments, these librarians collaborated across library departments and across the institution. When assessing online students, public services, technical services, and library information technology teamed up for usability testing and surveys. To create the advisory board for online students, one librarian worked with nonlibrary departments across campus; for transfer students, the other teamed up with nonlibrary departments and regional community colleges.
The findings of our assessments indicate that better marketing of services to nontraditional students is needed, as well as an outreach plan to reach students at the beginning of their academic careers. Continued collaboration is key to improving our resources, including working with nonlibrary departments to better communicate with nontraditional students and to create workshops and orientations geared toward them. The findings also point to the need to work with regional community colleges to reach transfer students and learn from their online learning initiatives. Looking ahead, creating a position focused on student success, including nontraditional students, could help strengthen these initiatives on campus and regionally.
Session 12: Organizational Issues III
Toward a Culture of Inquiry: Reducing Barriers to Engagement in Assessment
Jeremy Buhler (University of British Columbia)
Many library assessment professionals are charged with implementing or developing a “culture of assessment” in their organizations. Based on observations at the University of British Columbia (UBC) Library, this paper will consider how terms like “assessment” and “culture of assessment” may unintentionally exclude some library employees and limit engagement with assessment activities. Drawing from the literature on inquiry-based learning, the author suggests that in some contexts the goals of assessment may be served better by fostering a culture of inquiry rather than assessment.
Ideas for this paper arose from facilitated group discussions with library managers and administrators about the practice of evidence-based planning at the UBC Library. Participants described the characteristics of their workplace that encourage evidence-based decision making, as well as the characteristics that inhibit or undermine it.
One participant described an environment where managers encourage employees to propose new ideas, but also expect them to provide convincing evidence to support their ideas. The managers guide employees where necessary, helping them understand what is realistic, what questions they may wish to pursue, and how to gather evidence. By creating an environment where relevant inquiry is rewarded, this branch effectively distributes assessment-related activities and maintains high motivation.
This enviable state of affairs is not the norm at UBC Library. Across the library there is much support for the concept of assessment: that it’s important, that it should be done, that we should have employees who specialize in this area. But in practice assessment is sometimes treated as a box to tick on a list of project tasks or as a specialized activity that is unfamiliar or intimidating. In these cases emphasis is often on the how of assessment rather than the why, and the process can become routine and unrewarding, detached from the spark of inquiry that could bring assessment to life.
In response the author is changing how he promotes assessment practices at UBC Library. A phrase like “culture of assessment” can suggest that assessment is a worthy end in itself. In contrast, promoting behaviors that support a culture of inquiry suggests that the goal is to ask and answer questions, to exercise our curiosity in the workplace. On the one hand this change of focus represents a shift from the how back to the why. It also has the potential to engage a wider audience in the library, since nearly everyone can relate to curiosity and the satisfaction of asking and answering good questions.
These ideas will be explored in several workshops with library managers in July and August 2018. Much content for the paper and conference presentation is expected to arise from discussions in the workshops.
Engaging Graduate Students in Research and Scholarly Life Cycle Practices: Localized Modeling of Scholarly Communication for Alignment with Strategic Initiatives
Anjum Najmi and Scott Lancaster (Texas A&M University)
Librarians have for some time been vocal about the fact that the scholarly communication business cycle, essential as it is to the function of an academic library, is not economically sustainable. Academic and research libraries have been quick to rise to the challenge of integrating scholarly communication into library operations and connecting these efforts to broader institutional goals. After more than a decade of effort, significant problems continue to reshape the landscape of scholarly communication.
Developing scholarly research skills continues to be a paramount need and remains strongly related to one’s success as a research scholar. As academic institutions struggle to adapt, librarians should leverage their expertise to advocate for positive change. Reexamining areas of need and identifying likely paths for collaboration can be beneficial to determining value and impact in this ever-changing environment. This proposal examines graduate students’ place in campus research and scholarly life cycle expectations and practices to identify gaps, form partnerships, and find solutions.
The research and scholarship life cycle is “the creation, publication, discovery and dissemination of scholarly research” (Scholarly Communication Toolkit, 2016). Managing this scholarly record requires conducting effective literature searches, managing reference information, understanding modes of publication, and demonstrating productivity as a researcher. In support of these goals, Andrea Ketchum (2017) suggests that a more effective approach would be for libraries to focus not on the product of scholarly communication but rather on its process. This localized model will identify not only needs but also institutional stakeholders positioned to fill those needs, which in turn will assist the library to “demonstrate alignment with and impact on institutional outcomes” (ACRL Plan for Excellence, 2017). Resources and services can be mapped to research tasks and institutional needs, and librarians can engage with faculty and graduate students alike to build collaborative relationships across departments.
This presentation will outline the process of designing a strategic plan, the elements of that plan, its implementation, and the results. The steps include reviewing the institution’s latest strategic plan, research class syllabi, and library statistical data, as well as consultations with the graduate school, a citation analysis of recent theses and dissertations, and outreach to the various departments and individual advisors. We identified three burning questions to address:
- What skills, tools, and guidance do students need to conduct the effective literature searches expected by their academic disciplines?
- How can students efficiently and effectively manage data and other information, reducing preparation time?
- What do our students need to understand about copyright, both as consumers and creators of knowledge?
Through strategic planning, libraries can demonstrate value to the communities and institutions they serve by supporting all aspects of the scholarly research life cycle as well as the needs of the research community.
Choose Your Adventure: A Library Reorganization Case Study
Heather Scalf (University of Texas Arlington)
In 2013, after extensive learning, scanning, and planning, and armed with a new vision, the UT Arlington Libraries underwent a comprehensive reorganization. To facilitate this, the leadership team at the time had to determine what kinds of roles would be required to succeed with the new focus, and what skills and characteristics would be necessary in those roles. After a five-day retreat, 283 KSAPs—knowledge, skills, abilities, and preferences—were identified. These KSAPs were then combined in the development of 63 possible roles in the new organization. The entire staff was then asked to engage with the reorganization process by participating in self-evaluation and reflection and completing two tools via survey. First, each staff member used the KSAP tool to indicate whether they felt they possessed certain skills at a beginner, intermediate, or advanced level, as well as their preference for certain types of work. Second, based on the list of all possible roles within the new organization, they were asked to make a rank-ordered list of their top seven jobs and a list of the five jobs that they did not want to do, with a commitment from the dean that they would not be placed into any of those five positions. All positions were structurally agnostic, as no one except the dean knew the final organizational structure, and every position except the dean, the associate dean, and a digital research fellow was on the table.
After completion of the two tools, the dean interviewed each staff member to discuss their choice of roles, using their self-evaluation as a means of adding context to the discussion. With 15 minutes per staff member, this stage of the process took about three weeks to complete. After these interviews, the dean placed staff into roles in the new structure that better reflected the new vision of the libraries. Notifications were made to the new leadership team first, followed by the remainder of the staff. The new organizational structure was shared with all staff the following week. Evaluation of the process showed that 86.5 percent of staff got a role that was in their top 3, with 62.8 percent getting their top choice, and no one was permanently assigned to a role that was on their bottom 5. One lesson learned after the process was complete was that it is critically important to manage expectations in a time of transition. There were some organizational challenges inherent in such dramatic change that were exacerbated by the perception by some staff that the transition was as simple as stepping from one role into another, without consideration for the needs of our users. One of the guiding principles for the libraries is “perpetual beta,” and the organization itself has continued to change and develop as staff have since chosen different roles for a better fit, in some cases, or as new departments have been formed to move strategic priorities forward.
Coming Full Circle: Exploring the Life Cycle of a Staff Training Program Evaluation
Jennifer Sweeney (San Jose State University)
Purpose: The Get Involved: Powered by Your Library Collaborative was developed to expand the visibility and contributions of skilled volunteers in public libraries by training library staff to effectively recruit, train, and manage skilled volunteers (California State Library, 2017). In its second of three years, this IMLS-funded grant program provides training, resources, and coaching support to public library staff participants in California, Arizona, Texas, and Idaho. The evaluation plan for the project was designed to measure outputs, outcomes, and impact throughout the program life cycle, as well as to demonstrate the potential validity of the scaled approach for measuring program cost-effectiveness, using the Kirkpatrick and Guskey training program evaluation models (Guskey, 2002; Kirkpatrick, 1996).
This paper will report on the design and implementation of the evaluation plan, as well as reflect on lessons learned for ongoing evaluation model development.
Evaluation Design: The evaluation design addresses the multiple dimensions of the Kirkpatrick and Guskey training evaluation models with a menu of instruments and tools, including surveys of participant perceptions of benefits of training and learning; surveys and interviews exploring effectiveness of implementation of new practices in the workplace; interviews to measure increase in understanding of how new knowledge enhances library service and public perception of libraries; and metrics to document changes in level of staff time and quality of engagement with volunteers and increase in participation of skilled volunteers.
Findings: Evaluation measures to date have included satisfaction and pre- and post-surveys for the on-site trainings and selected polling for online webinars. Survey data has provided uniformly high ratings for presentation quality, trainer preparedness, qualifications, and other measures of training quality. Pre- and posttest measures indicated learning gains in all learning outcome areas, with the highest gains in best practices for recruiting, resources for building volunteer engagement, and creating meaningful job descriptions. Metrics associated with use of online volunteer management sites, volunteer hours logged, and staff perceptions of the usefulness of resources will be available beginning in 2018.
Value of the project: Professional development is a necessary and deserved activity for information professionals whose work environments are in constant evolution. Training program funders are eager to see evidence of the effectiveness of continuing education investment, yet too often brief satisfaction surveys or generic pre- and posttests, which lack validity, are the sole tools used to capture outcome data.
The evaluation model developed for the Get Involved program captures a more robust measure of training effectiveness. Beyond reaction to program quality, the evaluation investigates achievement of learning outcomes, as well as a measure of the information and skill that is actually applied in everyday work, and the ultimate value or impact to the organization.
California State Library (2017). Get Involved: Powered By Your Library. Retrieved August 1, 2017 from http://getinvolvedclearinghouse.org/
Guskey, T. R. (2002). Does It Make a Difference? Evaluating Professional Development. Educational Leadership, 59(6), 45–51.
Kirkpatrick, D. L. (1996). Great ideas revisited: Revisiting Kirkpatrick’s four-level model. Training & Development, 50(1), 54–59.
Diffusing Organizational Change through Service Design and Iterative Assessment
Rachel Vacek, Emily Puckett Rodgers, and Meghan Sitar (University of Michigan)
At the University of Michigan Library, dozens of librarians and staff are engaged in a series of activities to reimagine the ways our organization designs and implements our services. Between 2016 and 2017, we collaborated with brightspot strategy to develop a service philosophy, framework, and principles to help us begin transforming our physical and digital spaces to better represent our expertise, collections, and tools, and to meet the evolving needs of our academic community today and tomorrow. As we look to transform our spaces to serve the needs of our research community, we are taking care to ensure that whatever form our buildings and web presence take will follow the function and intent of our services.
Our efforts in this work are collaborative and distributed in nature, diffusing the shift in design and evaluation across the organization. With it, we aim to facilitate organizational change that puts our users at the center of service design and delivery. It also fundamentally recognizes that our departments play a role in supporting the academic needs of our faculty, students, and staff at the University of Michigan.
This work is structured by established approaches in design thinking and user-centered design. Multiple teams of librarians and staff are applying this approach to redesign services. Topics include consultation, digital scholarship, staff innovation, citation management, and developing a persona-based toolkit that staff across our organization may use to design new services or improve existing ones. While each team is using the same overall approach to its service design work, the application and outcomes are unique to each domain. Within four design cycles, each team engages in a retrospective to review the process and the impact of the work, and to consider its potential effects on our organizational structures.
Additionally, the service design efforts support our organization’s adoption of an assessment-driven mindset through embedding evaluation into our processes. Once the service design phase is complete, each team will generate a series of pilots or prototypes to test aspects of their designs in the context of our organization. In the Summer and Fall of 2018, teams will implement those pilots and prototypes, testing their ability to scale effectively or meet our programmatic and mission-based goals, yielding a series of small-scale but impactful activities or processes that will further diffuse the design-thinking approach throughout our organization. Each of these sets of activities will be assessed before moving on to future stages of the work. We will also evaluate how they help us enact our service philosophy, framework, and principles in the practice of our everyday work.
Ultimately this work will yield a culture shift within our organization, enabling us to embrace a user-centered, service-based approach to how we develop services and how we expect to collaborate and connect with colleagues within our library and academic community. It will also enable us to embed assessment practices into various facets of our work, from the beginning stages of design through testing and into implementation at scale.
Session 13: Space II
Getting to Scale: Developing a Sustainable, Collaborative, Mixed-Method Approach to Space Assessment at the University of Washington Libraries
Jackie Belanger, Maggie Faber, and Jenna Nobs (University of Washington)
Purpose: What does a scalable, sustainable, multimethod approach to space assessment look like for a large research library? How can library staff be meaningfully engaged in this process? Studies on library space assessment abound, but scaling many of the approaches across a complex library system and institutional environment can be challenging. This paper details changes to space assessment at the University of Washington (UW) Libraries piloted during 2016–18. Presenters will discuss how they transformed the UW Libraries approach to space assessment to focus on ongoing data gathering, mixed methods, and staff engagement in order to establish a sustainable, coordinated program that was responsive to user needs. The strategies piloted in this project will be relevant to libraries of all types and sizes.
Design, Methodology, or Approach: The authors piloted a number of changes to space assessment during 2016–18, including:
- Assessment staff broadening their focus from a large-scale survey and ad hoc project consultation to working closely with libraries staff to identify fundamental questions about libraries users and the use of library spaces. This was framed as an ongoing, inclusive conversation about space assessment needs across the libraries system.
- Piloting and scaling a mixed-method approach involving space counts, observations, a survey, and targeted projects focused on specific questions.
- Convening key stakeholders to discuss data, collaboratively formulate recommendations, and act on results. Discussions also focused on achieving clarity about who was responsible for acting on results.
- Creating tailored reports and visualizations that combined multiple data sources to provide a more holistic picture of activity in specific spaces and across the libraries, and to highlight key messages from users. Staff discussions focused on questions arising out of the data and strategies for acting on results, increasing the likelihood that they would implement changes.
Findings:
- Tools and methods are easily adaptable for staff wishing to conduct ongoing data gathering and assessments in their own units.
- Using a mixed-method approach provides staff with a more holistic picture of how spaces are being used and the value library spaces provide to the campus community.
- Engaging staff throughout the process resulted in wider buy-in for assessments and greater enthusiasm for using the results.
- Starting with smaller pilots and scaling up allowed the authors to learn lessons about new approaches before implementing them more widely.
- Generating a list of space-related questions through staff discussions revealed where coordinated approaches might be taken. This helped to establish a schedule for ongoing assessments, manage the increasing demand for projects, and maintain consistent data gathering over time.
Practical Implications & Value: This paper will highlight strategies for developing a programmatic, mixed-method approach to space assessment that fully engages a variety of library staff. The paper will provide attendees with ideas for creating scalable and sustainable solutions at their own institutions, including tested ways to employ and scale a variety of methods and successful approaches that enable staff to take action on results.
Discovering Access: Uncovering the Connection between Office Spaces and the User Experience
Tobi Hines and Sara Wright (Cornell University)
Since the renovation of our library ten years ago, we have taken a user-centered approach to the improvement of our public spaces. As their needs have changed, we have taken every opportunity to listen to our users in order to create spaces and services that contribute to their success. But can we say the same about our approach to staff office spaces? In our access services department, the roles and responsibilities of staff have changed significantly over the past decade, and yet their office workspaces have stayed largely the same. In response to staff feedback, we have spent the last year working closely with them—and using many of the same user research methods we have used to learn more about our patrons—to design an office environment that will help them be successful in their work. Our findings revealed that many of the issues with the staff offices had a direct impact on our circulation desk and the quality of service being offered to our patrons, and what began as a behind-the-scenes effort to improve our work environment has become a complete overhaul of our desk and service model. We will share our methods and findings, as well as our plans to renovate the circulation desk and move towards an integrated single service point.
How Many Seats Do We Need in Our Library? A New Utilization-Based Forecasting Model
Martha Kyrillidou (QualityMetrics, LLC) and Elliot Felix (brightspot strategy)
Library assessment has been looking at physical space planning tools as libraries transform. As libraries reimagine their services and spaces and incorporate smart spaces with active learning in mind, questions about the purpose and use of different spaces are constantly at the forefront. One of the most important drivers for library projects is often the amount of user seating to be provided. Unfortunately, institutions are poorly equipped to answer this important question. Library spaces, services, and staffing have changed, and campuses are creating study spaces throughout their grounds. Peer benchmarking information is often difficult to obtain, and even when you have it, knowing how to use it is complicated. Previous planning standards have been rescinded, and the remaining standards can’t be trusted. This leaves institutions in a bind.
Faced with this challenge, some institutions simply try to maximize their seating by carefully reducing the space allocated to collections and increasing the space allocated for people, hoping that demand will equal supply. Others target a percentage of their population in a throwback to previous standards or based on a peer average, without taking into account the actual usage of their spaces. Others have no plan at all.
So, how can institutions more simply, reliably, and accurately ballpark the amount of user seating they need? By coupling peer analysis of library utilization with their student population so that their forecast is determined not simply by population but by the predicted usage of space as well. Using ARL data on gate count and population, we have developed a new methodology that uses the key metrics of “visit per seat” and “visit per student” in order to determine future seat needs:
- By taking a future student population and multiplying by the peer average of “visits per student”, an institution can forecast its future gate count (being selective to use peers who have recently renovated and thus have a gate count more indicative of the future state).
- By taking this future projected gate count and dividing it by the peer average of “visits per seat” institutions can then determine the future number of seats needed (and understand this as a percentage of their student population).
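The two steps above amount to simple arithmetic: a projected gate count from enrollment and the peer visits-per-student ratio, then a seat count from that gate count and the peer visits-per-seat ratio. A minimal sketch, using hypothetical figures (the enrollment and peer ratios below are illustrative, not drawn from actual ARL data):

```python
def forecast_seats(future_students, peer_visits_per_student, peer_visits_per_seat):
    """Forecast future seating need from peer utilization ratios.

    Step 1: projected gate count = future students x peer "visits per student".
    Step 2: seats needed = projected gate count / peer "visits per seat".
    """
    projected_gate_count = future_students * peer_visits_per_student
    seats_needed = projected_gate_count / peer_visits_per_seat
    return projected_gate_count, seats_needed

# Hypothetical inputs: 30,000 projected students; peers average 40 library
# visits per student and 300 visits per seat annually.
gate_count, seats = forecast_seats(30_000, 40, 300)
print(gate_count)       # projected annual visits
print(seats)            # seats needed
print(seats / 30_000)   # seats as a share of the student population
```

As the bullets note, the peer ratios should be drawn from comparable institutions that have recently renovated, so the forecast reflects likely future utilization rather than the usage patterns of outdated spaces.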
Both ARL and ACRL data collection interests reflect this new interest in library facilities. ARL most recently collected data on facilities in 2012. An initial analysis of that data collection was presented at the 2016 Library Assessment Conference, though the research presented in that paper did not look at the seating capacity issue. In this paper we take a closer look at the seating capacity data, offer a model for determining requirements and best practices based on seating utilization rather than simply FTE, and offer a tool that can be used and updated by the community as institutions make these decisions in the coming years. This will provide a valuable planning tool to hundreds of institutions faced with complex and costly decisions that will affect their libraries for decades to come.
Where Students Want to Spend the Night: A Two-Phase Examination of Overnight Study Spaces at the University of Florida
Laura Spears (University of Florida)
This paper presents a completed two-phase study about where to locate the overnight study hours at the University of Florida (UF). Over the past three years, the overnight study hours have shifted between two different library branches and a new, student-run facility that was intended to be a learning commons managed by the UF Student Government. Each facility has its devotees and critics: the Marston Science Library is loved for its often noisy but contemporary styling on the lower floors, which ascend into more traditional library spaces on the higher floors. Library West is the beloved humanities and social sciences main library, but it has limited parking nearby and is far from student housing. Newell Hall, the newest space, is modern and brightly lit; it was considered the prime location to host the overnight hours, with restricted access, campus police in house, and proximity to student housing.
But, as was previously experienced when overnight hours moved from Library West to Marston, there was considerable resistance to overnight hours not being available at a library, with students citing insufficient seating and quiet spaces. Since Student Government funds these hours, the assumption was that Newell Hall should host them.
To understand the source of complaints, a two-phase study was conducted in the Fall 2017 and Spring 2018 semesters, examining students’ perceptions, preferences, and needs, along with an occupancy study of the hours between 12 a.m. and 8 a.m. Examining traffic data alone was not enough, as it does not capture users who enter at 11 p.m. and remain for multiple hours. So, in addition to an online student survey collected in Fall 2017, we collected headcounts at each location eight times per night between 12 a.m. and 8 a.m. for six weeks in Spring 2018.
This paper will present details of the findings, which include: 1) when students were offered a choice between Library West and Newell Hall for overnight study hours, almost 15% chose Marston by writing it in; 2) students commented about space use in general, not just overnight use, so the survey had marginal value about those hours; 3) students clearly indicated that certain elements make a facility a ‘library,’ and these were not present at Newell Hall; 4) occupancy counts were analyzed for each facility’s capacity utilization, identifying features students used overnight such as public access computers and group study; and 5) Newell Hall lacks the seating to accommodate the occupancy required during much of the overnight hours.
This paper presents the findings of both phases, including analysis of the more than 5,500 text comments submitted and of occupancy and feature use in each facility during the overnight hours. The findings suggest that students have passionate and concrete ideas about what an overnight study space should provide, and that study space design requires a more nuanced approach to providing the appropriate number of seats and the types of features users want available, even overnight.
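The capacity-utilization analysis described above can be sketched in a few lines. All numbers below are invented for illustration; they are not the UF headcount data or actual seat counts:

```python
# Toy overnight headcounts: eight counts per facility (12 a.m. to 8 a.m.).
headcounts = {
    "Marston": [180, 160, 140, 120, 90, 70, 60, 80],
    "Newell": [220, 200, 190, 170, 150, 130, 110, 140],
}
seats = {"Marston": 300, "Newell": 150}  # assumed seating capacities

def peak_utilization(facility):
    # Highest observed headcount as a fraction of available seats.
    return max(headcounts[facility]) / seats[facility]

# Facilities whose peak overnight occupancy exceeds their seating.
over_capacity = {f for f in seats if peak_utilization(f) > 1.0}
```

A utilization ratio above 1.0 flags the kind of seating shortfall the study reports for Newell Hall; the same calculation per time slot would show which hours are affected.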
Session 14: Collections
When to Hold On and When to Let Go: A Distributed Retrospective Print Repository Program for Rarely-Held Monographs
Jean Blackburn (Vancouver Island University) and Lisa Petrachenko (University of Victoria)
As “last copy” shared print archive initiatives and strategies take shape around the world, a shared print monograph initiative undertaken by ten academic libraries in Western Canada demonstrates a promising distributed model of collaborative preservation and access to materials rarely-held and/or of regional interest. The Shared Print Archive Network (SPAN) Monograph Project, an initiative of the Council of Prairie and Pacific University Libraries consortium (COPPUL) undertaken in 2016 and 2017, complements and builds on the success of other SPAN shared print archive phases targeting journals and government publications. Outcomes for the SPAN Monograph Project include preserving the print record for members in a cost-effective way, providing reliable access to shared print archive materials, and creating opportunities for the reallocation of library space. Using OCLC’s GreenGlass® data modeling and visualization tool, group members analyzed their collective holdings and circulation data—7.3 million records across the ten participating libraries, supplemented with holdings data from several “comparator libraries” across Canada and the US—to assess collection usage, duplication, and dispersion of holdings. In the project’s distributed network approach, each participating library agreed to retain items identified by the retention model on behalf of others in the group. SPAN Monograph Project participants used a consensus process to develop a shared retention model focusing on local-interest materials and those rarely-held within the region and across Canadian research libraries, ultimately deciding to retain 16% of the group’s collective holdings. Other key decisions, also reached by consensus, included retention commitment terms and the design of a shelf-validation process. 
Group members have also employed project data at a local level to facilitate deselection of low use, widely held materials and advance other local collection management goals in responsible, sustainable, evidence-based ways. The COPPUL SPAN Monograph Project demonstrates a distributed model of shared print archiving wherein smaller institutions without explicit preservation mandates or extensive storage facilities can effectively contribute to solving the problem of preserving the print scholarly record into the future. All academic libraries have rare or unique materials within their collections; if shared print archives can be seen as a “print safety net,” ensuring reliable access to the print record into the future, then the distributed shared print archive model allows all libraries the opportunity to form part of the fabric.
Ranking Data Outliers for Collection Budget Analysis: Allocating for the Future
Elizabeth Brown, James Galbraith, Jill Dixon, and Mary Tuttle (Binghamton University Libraries)
Finding an objective and reliable means of allocating annual collection development budgets is a perennial challenge in research libraries. Many libraries rely on methodologies such as applying standard inflationary increases across all or some types of funds. These methods tend to maintain and perpetuate funding priorities from year to year. Changing campus needs, including new programs and curricula, innovations in research methodologies and teaching, and new campus-wide strategic priorities, constantly challenge us to overcome the collection allocation inertia that may set in if empirical data is not used to test budgetary assumptions and then allocate resources to meet changing priorities.
Faced with new campus-wide priorities, including the launch of new academic programs and emphasis on multidisciplinary initiatives, as well as fiscal pressures such as budget cuts for library materials, our library developed a system to evaluate our budget allocation methodology using over 20 factors.
Our methodology began by creating a dataset consisting of internal library and campus data, with externally created cost information for books and journals by discipline. Library data included costs for books, journals, electronic databases as well as circulation and interlibrary loan data. Campus data included faculty FTE, degrees granted, number of students by level (undergraduate and graduate), and course hours by department or program. External data included average book cost and serials cost by discipline. Each data category was ranked from highest to lowest in value. Library budget categories rankings (book, journal and databases) were compared to the rankings of all the other categories to determine under- and over-funded areas. A summary sheet was compiled to determine disciplines with under- and over-funded indicators. The summary sheet also indicates trends over the budget years examined.
With these recent findings we will be able to identify areas for reallocation of collections funds and better address anticipated campus curriculum and research needs. Qualitative measures will also be considered including support of general education courses, interdisciplinary nature of discipline, and dependence on monographs or journals. This empirical data methodology can be expanded to include future annual data to determine if areas of campus priorities have changed and to re-examine trends. Note: Empirical data has been compiled and analysis has begun. The results/findings will be implemented by fall 2018 (prior to the conference).
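The rank-comparison step in the methodology above can be sketched as follows. The disciplines, dollar figures, and demand measure are invented for illustration and are not Binghamton data:

```python
# Hypothetical funding and demand figures per discipline.
funding = {"History": 40000, "Biology": 90000, "Chemistry": 85000}
demand = {"History": 1200, "Biology": 900, "Chemistry": 1500}  # e.g., circulation

def rank(values):
    # Rank each discipline from highest (rank 1) to lowest value.
    ordered = sorted(values, key=values.get, reverse=True)
    return {name: i + 1 for i, name in enumerate(ordered)}

funding_rank = rank(funding)
demand_rank = rank(demand)

# Positive gap: a discipline ranks lower in funding than in demand,
# suggesting it may be under-funded; negative suggests over-funded.
gap = {name: funding_rank[name] - demand_rank[name] for name in funding}
```

In practice the paper compares budget category rankings against twenty-plus internal, campus, and external categories; repeating this gap calculation per category pair and tallying the indicators per discipline yields the summary sheet described.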
Collecting Globally, Connecting Locally: 21st Century Libraries
Susan Edwards and Chan Li (University of California, Berkeley)
Purpose: Many academic libraries strive to create collections closely aligned with institutional research and teaching needs. But how to make informed decisions about what researchers need in global studies, where the number of potential sources is vast? Usage statistics, the traditional assessment metric, are of limited value for inherently lower use non-English sources. Our research goal was to better understand what the social science faculty at Berkeley need to support their global research.
Methodology: This project used mixed methods to explore both faculty attitudes (how they feel/what they believe) and their behaviors (what they cite). A database of all faculty (509) in the social sciences at Berkeley was created, and their geographic focus was entered. The departments with significant research outside of the US/UK were Anthropology, History, Political Science and Sociology.
Publications of the 107 faculty (2013–2017) in those four departments with global research interests were identified through searching Scopus, Web of Science, CVs, disciplinary databases, and Google Scholar. References from Scopus were pulled using an API; the remainder were entered manually. Language coding was applied for all bibliographic references.
A survey was sent to the 107 faculty exploring how they acquire the non-English material they use, what kinds of foreign sources they use, how often they use this material and their level of satisfaction with the research support provided by the library.
Findings:
- A high percentage (73%) of the faculty in the four disciplines had global research interests.
- The vast majority of their cited references were to English-language sources.
- The non-English sources cited comprised 9% for Anthropology and Political Science and 10% for Sociology; History was the outlier at 32%.
- Twenty languages besides English were cited, the most common (in frequency order) were French, Spanish, Russian, Chinese and German.
- Scopus indexed only 61% of the publications of these faculty.
- The faculty survey (50% response rate) showed that 78% felt the library support in this area was satisfactory or great, but it also elicited concerns about potential cuts to collections.
Practical Applications: This work has already helped us get a better sense of faculty needs and behaviors. An interactive visualization (http://rpubs.com/jq834488/296789) of the faculty’s primary geographic focus enabled us to see the distribution across all 500+ social sciences faculty, and the database of faculty research interests by country enabled area studies librarians to identify all the social sciences faculty with research interests in their areas. Our findings demonstrate the necessity for collaboration between area studies and subject librarians in the globally focused social sciences, and the importance of multilingual sources, especially for History. We may have identified a mismatch between some of our historic collection strengths and the faculty’s current research focus, including language usage, requiring further investigation. While overall we were pleased with the faculty’s level of satisfaction with our collections, we did elicit some negative comments which we will explore more deeply in structured interviews, the next phase of this research project.
The Collection Assessment is Done… Now What?
Karen Harker, Coby Condrey, and Laurel Crawford (University of North Texas)
Collection analyses, evaluations, and assessments are an important aspect of collection development services provided by libraries. Librarians in most academic institutions conduct evaluations as sporadic projects based on ad hoc needs, notably accreditation reviews or the influx of funding for a particular subject. Indeed, the lack of positive change (in policy, selection, funding, or patron perception) resulting from these time-consuming projects has been noted by some in collections management. Furthermore, while librarians allude to potential uses or outcomes of such evaluations in the form of “knowing the collection” or adjusting the “collection and managing activities to increase congruence between collection and [institutional] mission,” few professional resources on the topic provide specific methods of applying the results of these time-consuming and data-intensive assessments. The collection development leadership at the University of North Texas Libraries has opted to take a more inclusive approach: this library combines its evaluation cycle with an ongoing effort to incorporate findings such as gaps and strengths into subject-based projects to “enhance” targeted subsets of the overall collection.
We have implemented a method that continuously builds an academic library collection in a systematic manner that connects with regularly scheduled assessments that identify the strengths and gaps of subject-based holdings. While demand-driven acquisitions (DDA) can help fill gaps in monographs by fulfilling current information needs, it is not a reliable method to predict near-future needs. Routine collection evaluations inform collection development decisions by providing the framework for a more intensive collection development. This enhancement collection development requires carving out a segment of the budget specifically for this purpose. The amount of funding for any enhancement depends on the total available funding and the scope of gaps identified by the collection evaluation. Evaluations, in turn, are scheduled in advance and the scope of each evaluation may be adjusted based on available resources.
This paper will include information on the planning and execution of infrastructure changes necessary to coordinate the application of the large amount of data collected. The collection development librarians changed the collection budget structure, freeing funding for such enhancement projects. Selection was centralized to enable greater control over gap areas and evaluation of potential purchases. The collection development librarians frequently discussed principles of application of data and evidence-based decision-making and realigned staff duties to accommodate enhancement activities.
This method ensures that collection development efforts address gaps systematically and holistically. We reduce selection bias in any one disciplinary area by having a team of expert librarians consult on the overall enhancement plan. Enhancement planning may coincide with accreditation reporting deadlines or support new programs. The Collection Development unit can better manage workload and budget by planning ahead and analyzing needs for efficient decision-making.
In our paper, we will provide the context of our subject-based collection evaluations, details on the methods and metrics used, an overview of the planning of evaluations and enhancements, and specific examples of the application of the process of enhancement based on the results of the evaluations.
Assessing Textbook Cost and Course Data for a High-Impact Textbook Lending Program
Jan Kemp and Posie Aagaard (University of Texas at San Antonio)
Purpose: Open education initiatives are gaining traction at many higher education institutions, although the promise of open access or free textbooks for all courses is not yet a reality. Meanwhile, the high cost of textbooks is an obstacle to academic success for many students. A recent survey found that 31% of college students do not purchase their textbooks (Statista, 2017). At a public research university with an enrollment of 30,000, 47% of students are first-generation college students, and nearly that many are economically disadvantaged or eligible for Pell grants. Providing equal access to education is a key goal, and librarians made the decision to help support student success by offering a selection of textbooks on reserve. Given a relatively modest annual textbook allocation of $70,000, it was important to identify textbooks that would benefit the largest number of students, effectively supporting the university’s student retention and graduation goals. To maximize the impact of the program, librarians needed to assess the potential of textbooks to support student success.
Design: Librarians focused on supporting students most at risk for dropping out—freshmen and sophomores—looking primarily at lower-division, large survey or core classes with expensive textbooks. Librarians negotiated access to the university bookstore’s textbook information system, capturing course numbers, textbook titles, and costs for currently adopted textbooks. They also accessed data on enrollment for each course, aggregating the enrollments for all course sections. The university’s Office of Institutional Effectiveness provided data on the percentage of students who earned low grades of D or F, or who withdrew, for each course. By aggregating the data on high textbook cost (over $100), high course enrollment, and poor student grade outcomes, a set of “high-impact” textbooks was identified. These results were also checked against the list of required core or gateway courses for undergraduate majors, and those textbooks received additional weight in the assessment. Based on course enrollment, the library purchased one to six copies of each title, usually one copy for every 100 students.
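The aggregation and weighting described above might be sketched like this. The course records, the score formula, and the 1.5× core weight are illustrative assumptions, not the authors' actual method:

```python
# Hypothetical course records combining bookstore, enrollment, and grade data.
courses = [
    {"course": "BIO 1404", "cost": 250.0, "enrollment": 1200, "dfw_rate": 0.28, "core": True},
    {"course": "HIS 2053", "cost": 85.0, "enrollment": 900, "dfw_rate": 0.22, "core": True},
    {"course": "MAT 1214", "cost": 180.0, "enrollment": 1500, "dfw_rate": 0.35, "core": False},
]

def impact_score(c):
    # Only textbooks over $100 qualify; weight by enrollment and the
    # D/F/withdraw rate, with extra weight for core/gateway courses.
    if c["cost"] <= 100:
        return 0.0
    score = c["enrollment"] * c["dfw_rate"]
    return score * (1.5 if c["core"] else 1.0)

high_impact = sorted((c for c in courses if impact_score(c) > 0),
                     key=impact_score, reverse=True)

# Roughly one copy per 100 enrolled students, capped at six copies.
copies = {c["course"]: max(1, min(6, c["enrollment"] // 100)) for c in high_impact}
```

The one-copy-per-100-students and six-copy cap figures come from the abstract; the rest of the scoring is a plausible reconstruction.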
Findings: High-impact textbooks are heavily used by students. In FY2017, 4,000 textbooks accounted for over 50% (84,672) of the total physical materials circulation (168,725). The service desk maintains waiting lists for the most popular textbooks, and when demand dictates, additional copies are purchased.
Practical implications or value: The print textbook lending program is one piece of an overall textbook strategy. Physical copies have limitations, since each textbook can only be used by one person at a time. The library proactively pursues multiuser e-textbooks as a preferred option, and it leads the campus in the movement to incorporate OER materials into courses. In the future, the increased availability of textbooks in electronic format and OER together should reduce the demand for print textbooks on reserve. Until then, the high rate of circulation for print textbooks reflects a compelling need and provides a solid justification to expend funds for high-impact textbooks. Considering that 4,000 textbooks generate half of the library’s print circulation, this seems like an excellent value.
Mining EZProxy Data: User Demographics and Electronic Resources
Ellie Kohler and Connie Stovall (Virginia Tech)
After a mandate to utilize data to demonstrate impact on student success, Virginia Tech’s University Libraries began diving into previously untapped data sources. Given that the collections budget makes up 48% of the total library budget, roughly 90% of which streams to electronic resources, it was deemed necessary to make more direct connections between electronic resource usage and student success.
Usual practice prior to the charge involved analyzing usage from COUNTER reports and cost data, primarily for the purposes of serials budgeting and negotiations. Because of these past data collection and analysis practices, the University Libraries could only draw basic inferences about library electronic resource users. In order to create more robust user inferences, the University Libraries turned to EZproxy logs as well as university-collected student data and began a multiphase research project based on connecting the two data streams.
The long-range purpose of the research project is to create a better understanding of student user demographics by connecting electronic resource usage information with university-held student demographic information. Ultimately, the plans include measuring the University Libraries’ impact on Virginia Tech’s overall success, and the project constitutes the start of a broader systematic study of the impact of dollars spent on databases. Development of this study includes research into encryption and anonymization techniques, as well as current best practices in the security of personal information. Discussion will include challenges, including on- and off-campus usage access and meeting resistance to utilizing personally identifiable data, as well as the tools utilized in the study: EZproxy, Graylog, Python, and Tableau.
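A minimal sketch of the kind of log parsing and anonymization such a project involves is shown below. The log line, field layout, and salted-hash scheme are assumptions for illustration, not Virginia Tech's actual EZproxy configuration or pipeline:

```python
import hashlib
import re

# A simplified EZproxy-style log line (format is illustrative).
log_line = ('10.0.0.1 jdoe42 [12/Mar/2018:01:23:45 -0500] '
            '"GET https://www-jstor-org.ezproxy.example.edu/stable/123 HTTP/1.1" 200')

def parse_and_anonymize(line, salt="per-project-secret"):
    # Extract IP, username, timestamp, method, and URL from the line.
    m = re.match(r'(\S+) (\S+) \[([^\]]+)\] "(\S+) (\S+)', line)
    if not m:
        return None
    ip, user, timestamp, method, url = m.groups()
    # A one-way salted hash lets usage records be joined to demographic
    # data keyed on the same hash, without retaining the raw username.
    user_hash = hashlib.sha256((salt + user).encode()).hexdigest()
    return {"user": user_hash, "timestamp": timestamp, "url": url}

record = parse_and_anonymize(log_line)
```

Hashing at ingest is one common way to reconcile the demographic linkage the project describes with the privacy concerns it anticipates; the salt would need to be held securely and consistently across both data streams.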
Concurrent Session V
Session 15: User Experience
Shopping for Sustainability: Reenvisioning the Secret Shopper Assessment
Tricia Boucher and Jessica McClean (Texas State University)
For years librarians have used Secret Shopper assessments to evaluate the quality of customer service patrons receive. These snapshots of interactions at library service points reveal both positive trends in service and opportunities for improvement in training and best practices. However, this application of the data has a relatively narrow value considering the labor-intensive quality of the Secret Shopper assessment—developing scenarios, training participants, following up on assignments, and evaluating data.
After collecting four years of Secret Shopper survey data, the question arose: how can we maximize the sustainability of the project by finding novel applications for the information collected?
We collect data across multiple modalities (e.g., reference transactions, circulation statistics, door counts, surveys). Examining all the data together paints a much fuller picture of our patrons’ use patterns and attitudes. It should also reveal data gaps that hinder our ability to make informed decisions. Working from an outcomes-based approach, we will consider whether we are using the Secret Shopper assessment appropriately, determine whether there are new applications for Secret Shopper data, and modify this assessment to match our needs. We expect we will use this exercise as the foundation for creating a holistic data collection plan in the future—one that will maximize the use of the data we choose to collect.
Tell Me What You Want, What You Really, Really Want: Understanding User Perspectives with Comparative Analysis
Zoe Chao (Pennsylvania State University)
As user experience (UX) gains traction in libraries, the focus of user research has shifted from usability to the broader question of user experience. Libraries’ online presence is no longer only about usability and findability, but about connecting to users and understanding their needs. From fall 2016 to spring 2018, 186 small-scale UX tests were conducted at the Penn State University’s main library entrance. Participants spent 5 to 10 minutes completing one or two tasks in exchange for a cup of coffee and a snack. In this setting, called the UX Café, different UX methods were used in addition to usability testing. In this paper, the author will demonstrate how undertaking a comparative analysis helps us understand user preferences for a library’s online environment.
In product development, competitive analysis measures a product’s strengths and weaknesses against competitors. However, our goal in using this method is more than to evaluate the product itself; we seek to learn the users’ perspective. In our study, participants were asked to complete one usability task, for example, “find the open hours for the Maps Library,” on the websites of three peer institutions plus the Penn State University Libraries website. They were then given a short interview to share their opinions based on their experiences. The advantage of such a study is that participants become more aware of possible alternative designs after seeing and experiencing other interfaces, which expands their vocabulary for describing their perspectives and preferences.
This comparative method was used to gauge users’ perceptions of the top navigation menus, the frequently used pages, and the search results pages for the discovery tool at the Penn State University Libraries. It was evident that a small design decision can trigger different reactions towards the interface. Nowadays, users are learning and adopting new technology at an unprecedented speed, and their expectations for the web environment continue to evolve. Creating positive online user experiences should be part of libraries’ ongoing endeavors. Though the scope of each study is discrete due to the time constraints of the UX Café, we have gained broad insight into users’ perspectives, which will help guide future interface design decisions for the libraries’ website.
Comparing Apples to Oranges? Doing UX Work Across Time and Space
Andrew Darby and Kineret Ben-Knaan (University of Miami)
Usability testing is, famously, an iterative process. You test something, you make changes based upon the results, you test again. The recent website redesign process for the University of Miami Libraries began with a “Discovery & Content Analysis” phase, where (among other things) we looked at earlier user research, and determined areas for future research. Starting with this data, the UX Team did a series of tests over the course of the next year: card sorts, one-on-one tests, focus groups, “mini design sprints,” first-click tests, and tree tests. While there was no static set of questions or tasks that appeared in all tests, those which performed well were removed from the next test, while those which performed less well continued on. With over a year’s worth of testing, it is interesting and informative to look at the ups and downs of specific tasks across time and different testing methodologies.
An example might make this more concrete. We know from our user research that hours are important. The hours page is the third most visited page on our site, came up as important in focus groups, and appeared prominently in user-created mockups during mini design sprints. So when we did a first-click test against three home page prototypes, one question asked users how they might find the Marine Library’s hours. The results weren’t great, with success rates varying between 50% and 63%. We hypothesized that the context (a static mockup without interactive menus) might be adversely affecting success, and so asked the question again as part of a Treejack test of the information architecture. Our success rate improved to 98%, and Treejack’s “pietree” visualization demonstrated the two main paths users followed to get to the Marine Library’s hours. Our next test includes a task about finding those hours in the finalized design, which has an hours widget on the home page in desktop, and an hours button in the sticky footer of our mobile view.
This paper/presentation, therefore, will expose the audience to UX techniques and technologies using concrete examples from one year’s worth of user research. We believe that focusing on common library tasks, tested with a variety of methodologies and tools, will provide an innovative way to show how user research can help stakeholders visualize results and make decisions. Along the way we will pass on methodological tips, and assess the value of select UX tools (notably Optimal Sort, Treejack, Chalkmark, InVision).
Redesigning Harvard Library’s Website with User Research at Every Step
Amy Deschenes (Harvard University)
Harvard Library’s website, library.harvard.edu, was due for a total reenvisioning. The library’s web experience did not accurately reflect the breadth of services and depth of resources that we make available to our community. Additionally, the website needed an overhaul of the technical infrastructure, content strategy, and design. When we kicked off our redesign project, the web team defined our main guiding principle: to put the user at the center of everything we do. In order to keep our decision-making process user-driven, we conducted research with users at every point in the site’s planning and development. In total, we carried out a variety of user research methods with over 200 members of the Harvard community throughout the project.
Even before the project began, we conducted in-depth user interviews with a variety of user types to help us better understand our audiences. Based on the interview findings, we created detailed personas that were used as guides throughout the project. The personas were integral to our web team’s decision-making process. We used them to focus our content strategy discussions, write user stories, and define page goals. The personas proved invaluable at every stage of the project and kept us aligned with user needs.
We used prototype testing at several points during the redesign to test ideas and designs before building them. There were a variety of prototypes we tested: from low-fidelity wireframes on paperboard, to interactive prototypes tested on iPads, to high fidelity design mock-ups. Prototype testing provided us with actionable feedback early enough in the process so that we could incorporate user perspectives into the final designs, before any code was written. The prototype testing was conducted both in on-the-spot testing and via online surveys.
The other significant user research track on our project was around building our navigation menu. We used card sorting and tree testing to create and evaluate our navigation. This helped our team better understand how users talk about our services and which labels were clearest. It was helpful to have user feedback on how users would group and organize content on the site as well.
What the web team at Harvard discovered throughout our redesign project is that working with users from the start helped clarify our thinking and prioritize features throughout the new site. The results we gathered include insights into more user-friendly language, recommendations on organization of content, and the benefits of incorporating user feedback into a project decision-making process. Using a variety of methods and making the user the center of our process helped us to build a more usable website from the start.
Holistically Evaluating a Room Booking Service’s Transition from Online Booking to In-Person Access
Ruby Warren (University of Manitoba)
Through usability tests and user experience interviews, the University of Manitoba Libraries holistically evaluated the user experience of our room booking system, from booking online to physically accessing the space. Participants were chosen to represent all library user groups eligible to book rooms at the University of Manitoba Libraries, including undergraduate students, graduate students, and faculty members. Participants independently navigated the libraries’ online room booking system and talked through their thoughts and decisions while achieving set goals (booking a room for a set time period, leaving the computer, locating the room, and using the swipe mechanism). Participants then gave open-ended interviews about their feelings regarding the process and experience, and transcripts of each interaction will be analyzed using iterative process coding to identify three to five final themes. These themes and identified usability pitfalls will be clearly communicated within the paper so that other libraries may take them under advisement in their own contexts. The practice of evaluating the usability of a library service from online beginning (the impulse to book a room via the website) to completion (physically accessing the space) is still uncommon enough in the library literature that this study design could provide an example and foundation for future service user experience evaluations.
Session 16: Measurement and Measures Indicators II
Beyond “Will You Make Me a Dashboard?”: Meaningful Collaboration on Visualization and Reporting
Maggie Faber (University of Washington), Jeremy Buhler (University of British Columbia), and Frankie Wilson (University of Oxford)
Purpose: How do assessment practitioners collaborate with others? What do different institutions do to engage stakeholders in project design, reporting, and visualization development? In this paper, the presenters will explore three different approaches that have helped facilitate and streamline collaboration in each of their contexts: an iterative consultation process, a user story framework, and quick reporting as outreach and awareness building. This session will share key questions, approaches, and lessons learned to help move participants from “this looks great but what do I do with it?” to “I never would have seen this without you!”
Approach: The University of Washington (UW) Libraries have developed an iterative, highly-consultative process. Projects are conceived as part of an ongoing conversation and prioritized for what is strategic, actionable, and responds to a clearly-articulated need. Working through semistandardized questions and a review of current data systems ensures project stakeholders are clear with each other, that the team(s) providing data and advising on methods understand the purpose of the project, and that the dashboards reflect the context and tasks the stakeholders are trying to accomplish.
At the University of British Columbia (UBC) Library, any employee may contact the assessment team for a report or visualization. Requests expected to have a greater impact receive more attention, but by working with all who express a need, the assessment team develops relationships and increases awareness of the potential for integrating data into decisions and workflows. The long-term goal is to improve data literacy and encourage the meaningful integration of evidence at all levels of the library, not only on high-profile projects or among managers.
The Bodleian Libraries at the University of Oxford have adopted a method from Agile software development to ensure that dashboards are created to meet a specific purpose. This means that they are used and therefore add value to the work of the libraries, rather than just being built for the sake of it. The User Story framework teases out who will use the dashboard, what the dashboard needs to do, and why this is important to the libraries through a fill-in-the-gaps approach.
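The fill-in-the-gaps pattern described above is the standard Agile user-story template ("As a …, I want …, so that …"). A minimal sketch of how such a story might be captured is below; the example role and goal are hypothetical illustrations, not drawn from the Bodleian's actual dashboards.

```python
# Minimal sketch of the Agile user-story template underlying the Bodleian
# approach. The example story below is hypothetical, for illustration only.

def user_story(who: str, what: str, why: str) -> str:
    """Fill the who/what/why gaps of the standard user-story template."""
    return f"As {who}, I want {what}, so that {why}."

story = user_story(
    who="a subject librarian",
    what="a dashboard of e-resource usage by department",
    why="I can target outreach where collections are underused",
)
print(story)
# prints: As a subject librarian, I want a dashboard of e-resource usage by department, so that I can target outreach where collections are underused.
```

Forcing each dashboard request through this template surfaces the "why" up front, which is what lets the team decline or reshape requests that would be built "for the sake of it."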
Findings: Each approach creates opportunities to collaborate in different ways, but each comes with its own costs: either the time associated with close collaboration, or the assessment team’s expertise not always being focused on the projects of greatest immediate importance. The presenters will discuss their experiences with both a ‘narrow and deep’ focus on addressing the needs of one user group and a ‘wide and shallow’ approach in which a simpler dashboard is produced for many user groups within the libraries.
Value: Assessment is largely collaborative; not only do projects require consultation and mutual support, but it is often the responsibility of staff in specific areas to implement changes, rather than assessment staff themselves. In drawing connections and distinctions between each of these processes, the presenters hope that their paper will offer attendees a perspective on different modes of collaboration and share the practical methods they use to work with stakeholders to figure out what is meaningful and important.
Library Continuous Improvement through Collaboration on an Institution-Wide Assessment Initiative
Michael Luther and Jen Wells (Kennesaw State University)
Purpose: In Fall 2016, Kennesaw State University’s (KSU) Office of Institutional Effectiveness (OIE) launched Improve KSU, a Continuous Improvement Initiative. Improve KSU calls for all academic, student affairs, operational, and administrative units from around the university to track data on selected outcomes and report findings at the end of the fiscal year. By measuring the same outcomes year to year, KSU has the opportunity to demonstrate improvement at the institutional, departmental, and unit levels.
This plan represents a meaningful collaboration between assessment professionals at the university and the academic library. This paper will elaborate on the value of these two perspectives in the pursuit of continuous improvement.
Design, Methodology, or Approach: Kennesaw State University Library System (KSULS) and its seven functional units were among 500 units throughout the university to participate in the full annual cycle of Improve KSU. The cycle included initial development of outcomes and measures, mapping outcomes to the university strategic plan, gathering data, reporting findings, and articulating strategies for improvement. Each unit developed three outcomes and a minimum of two measures for each. Unit representatives entered all outcomes, documented results, and contextual narratives into the CampusLabs assessment software.
Findings: In September 2017, the library reported findings on 67 distinct measures covering 24 outcomes. These outcomes related to a broad array of library functions, including faculty/staff engagement, professional development support, efficiency of workflows, customer satisfaction, collection relevance, seating, library wayfinding, availability of outlets, training, and exposure to library instruction. September 2018 will mark the second full year of Improve KSU and will be the first opportunity for the KSU Library System to show improvement on its outcomes.
Practical Implications or Value: A Continuous Improvement Plan affords the university and the library a broad grounding in assessment by tracking improvements to performance or learning outcomes over time and at multiple levels throughout the institution. Further, by mapping each outcome to the goals, objectives, and action steps of the university strategic plan, the library demonstrates not only continuous improvement, but also how these improvements provide direct support for the university mission. Evidence of this kind is essential in meeting the standards of accreditation bodies and the expectations of university administrators. It is also key to understanding library operations and users and advocating for their needs.
Is There a (Data) Point? Are All These Measures Useful?
Dawn McKinnon, Joseph Hafner, Martin Morris, and Andrew Senior (McGill University)
At McGill University Library, librarians collaborated to gain a deep understanding of how faculty and students use e-journal collections, to help inform collection development and promotion. With projects covering three subject areas (Music, Dentistry, and Social Work), data were gathered from multiple sources, including a yearlong ARL MINES for Libraries™ survey, 1Science reports comparing usage and faculty publications against the library’s holdings, results from faculty surveys on their preferred journals for teaching, and traditional vendor-supplied statistics. Using these four tools, the following research questions were examined for each subject area:
- Which e-journals were being used, and by whom?
- Are the journals that faculty cite and publish in the same journals being downloaded most often? What kind of access does the library provide to these e-journals?
- How do results of “priority” or “top” e-journals differ depending on the measurement tool used? Do some of the measurement tools provide more comprehensive information for different subject areas?
- With all of these tools (and more) available for collection analysis and evaluation, is one type of tool better for certain tasks or questions?
Not surprisingly, findings show that each tool has advantages and challenges. Collection evaluation is time consuming and often requires collaboration between collection librarians and liaison librarians. An understanding of the unique perspective each tool can provide, combined with practical examples of what can be done with the findings from various subject areas and lessons learned about how we collaborated, will provide sustainable solutions for any librarian assessing online collections.
Measuring Library Support for Institutional Research Endeavors Using a Return-On-Investment Model: Data Analysis Update
Jennifer Kluge, Douglas Varner, Nancy Woelfl, and Jett McCann (Georgetown University)
Library assessment is a growing specialty within the field of librarianship. Library assessment courses have been developed and integrated into library school curricula, and many academic libraries now offer positions devoted solely to the assessment and evaluation of services and resources in support of institutional research and educational initiatives. Trends in library assessment are moving toward supplementing qualitative measures of library operations with quantitative computations that lend themselves to statistical analysis and data aggregation. Increasingly, libraries are expected to assess the value of their services in the broader context of what matters to users, expanding beyond qualitative models to develop quantitative methodologies. This proposal provides an update to data presented at the 2016 Library Assessment Conference describing the application of a model, in a health sciences library setting, that calculates the institutional return on investment in library resources correlated with the generation of extramural grant income. The data point generated by this model is a monetary unit: for every dollar an institution invests in the library, the model estimates the dollar amount returned in grant income. A pilot study of this model was conducted and presented at the 2016 Library Assessment Conference to assess the model’s value. Results of the pilot study demonstrated that for every dollar the institution invests in library resources, $1.89 in grant income will be realized. The librarian and the Office of the Dean for Research staff at the Georgetown University Medical Center have refined the established return-on-investment model developed by Luther and Tenopir, and adapted for use in health sciences libraries by Woelfl, to enhance the robustness of the model. In addition, the librarian and the Office of the Dean for Research staff will analyze and incorporate additional data into the return-on-investment calculation.
The return-on-investment dollar amount is derived from a formula based on a compilation of discrete data points, including approved NIH grant dollars, the number of applications using citations in the reference section of the grant application, the number of citations in the grant application reference section available via the library collections, and additional quantitative values to be discussed in the presentation. Library staff and the Office of the Dean for Research staff collaborated with the consultant who developed the health sciences library model to arrive at a return-on-investment calculation that provides a quantitative measure of the impact of institutional investment in library resources and services on grant income. A pilot study was conducted, with the reference sections from 18 proposals analyzed. The citations were placed into one of five categories with respect to availability to the researcher submitting the grant proposal:
- Journal articles in electronic format in library collection: 742 (70.60%)
- Journal articles in print collection: 7 (0.67%)
- Book available in library print collection: 13 (1.24%)
- Open access articles available to everyone: 76 (7.23%)
- Citations not in the library collections: 213 (20.27%).
- Total citations in grant reference sections: 1,051.
The journal articles, books, and open access titles were aggregated into one data point representing 79.74% of citations available in some format to grant submitters. The results of the pilot study conducted at Georgetown University Medical Center demonstrated that for every $1.00 the institution invested in library resources, $1.89 in grant income was generated. This figure was derived from the analysis of the reference sections of 18 National Institutes of Health-funded grant proposals, with data input into the model formula. Additional research grant reference section data are currently being analyzed and will be incorporated into the model to enhance the robustness and application of the model in other settings. The additional data calculations and analysis will be described in this presentation.
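The category percentages and the aggregate availability figure reported above can be recomputed directly from the raw counts; a quick sketch (category labels shortened):

```python
# Recompute the pilot study's citation-category percentages from the raw
# counts reported above (1,051 total citations across 18 grant proposals).
counts = {
    "e-journal in library collection": 742,
    "print journal in library collection": 7,
    "print book in library collection": 13,
    "open access": 76,
    "not in library collections": 213,
}
total = sum(counts.values())
assert total == 1051

for category, n in counts.items():
    print(f"{category}: {n} ({n / total:.2%})")

# Aggregate share available to grant submitters in some format
# (all categories except "not in library collections").
available = total - counts["not in library collections"]
print(f"available in some format: {available}/{total} = {available / total:.2%}")
# Note: 838/1051 = 79.73% from raw counts; the paper's 79.74% matches the
# sum of the individually rounded per-category percentages.
```

This makes the provenance of the 79.74% figure explicit: it is the complement of the 20.27% of citations not held, summed from rounded category shares.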
Session 17: Teaching and Learning
Assessing Student Learning in Library Instruction: A Faculty Perspective
Doreen Bradley and Jo Angela Oehrli (University of Michigan)
Purpose: Assessing library impact on student learning is essential for demonstrating libraries’ integrated value and commitment to higher education. Librarians at the University of Michigan Library designed and conducted a study to assess student learning in one-time, course-integrated instruction sessions in order to investigate faculty perceptions of student learning in the library.
Methodology: Librarians designed a survey with three Likert scale questions to assess faculty perceptions of student learning and satisfaction with library instruction related to the specific goals of their course. Additionally, two open-ended questions focused on obtaining data about the specific concepts/skills students learned during the session and how students applied those concepts/skills to improve the quality of their coursework. Surveys were created in Qualtrics and individually sent to 380 faculty who requested instruction for their courses during the 2017–2018 academic year. This represents a total of 828 course-related instruction sessions, as many faculty incorporate library instruction into multiple courses they teach. 165 faculty completed the survey, for a 43% response rate.
Data from faculty responses are being analyzed in two ways.
- Quantitative data from the three Likert scale questions are being analyzed to identify trends based on course level and discipline.
- Qualitative data from the two open-ended questions are being coded for specific themes related to student learning. This study design is a sustainable model of assessment.
Findings: Data from the study are overwhelmingly positive: faculty strongly state that students learn valuable concepts and skills from library instruction sessions. They provide examples of specific learning goals and how they determined that students achieved these goals. 95% of faculty responding to the survey report that students were better able to complete coursework because of the session. 98.5% state the instruction session met their learning goals and expectations. 99.4% would recommend library instruction to other instructors. Further coding is being conducted to identify specific learning trends by discipline and course level. These results will be discussed in detail.
Practical implications and value: On a practical level, the data collected through this study are immensely useful in assessing our library instruction program. Library instructors learned that their teaching practices are effective and thoroughly meet the needs of faculty and students. Analyzing information on faculty learning goals and assessment of student work helps library instructors understand where faculty place emphasis in their courses, where students could increase knowledge, and the broad range of faculty expectations regarding student achievement. Through analysis of these data, library instructors are better positioned to discuss future student learning needs with faculty in their disciplines.
Moreover, this study clearly demonstrates the value that library instruction can bring to the student learning experience. The concepts and skills taught through library instruction are foundational and intrinsic to curricula throughout higher education. When faculty demonstrate the value of library instruction to student learning, other faculty and administrators listen. Sharing this sustainable methodology with other libraries to implement can advance campus conversations around student success and best practices at their institutions.
Information Literacy Assessment for Instruction Improvement and Demonstration of Library Value: Comparing Locally-Grown and Commercially-Created Tests
Kathy Clarke (James Madison University) and Carolyn Radcliff (Carrick Enterprises)
Information literacy assessment takes many forms. This paper will explore how two types of fixed-choice tests, one locally created and one nationally developed, can be used for program improvement and to demonstrate library value.
Locally-Grown: At James Madison University, first-year library instruction is handled via a tutorial-test model. The tutorial, Madison Research Essentials, is a combination of video tutorials followed by practice exercises. The test, Madison Research Essentials Skills Test (MREST), is a requirement that all students must successfully meet or face an administrative hold blocking them from sophomore registration.
6,000 first-year and transfer students complete the test annually. Data are analyzed for trends in student achievement and areas for improvement. We have identified student populations who have difficulty—international students, athletes, and transfer students—and designed interventions specifically for them. We also use MREST subscales to determine which objectives students are struggling with and we make tutorial changes accordingly.
A shorter version of the MREST, InfoCore, is one of a battery of tests given on JMU Assessment Days. Incoming first-year students take assessments as a part of their orientation prior to classes and again after completing three semesters. Comparing the same cohort of students from first-year to sophomore shows that we are making gains in each objective, some very significant.
These efforts rely on a collaborative model between JMU Libraries, the General Education Program, and the Center for Assessment Research Studies (CARS). JMU offers a PhD in assessment and measurement which affords access to psychometric experts.
Session attendees will hear strategies for learning improvement and see sample reports that have led to program improvement and been used for state reporting requirements.
Commercially-Created: On the national level, the recently-created Threshold Achievement Test for Information Literacy (TATIL) has many of the benefits and limitations traditionally associated with standardized tests.1 TATIL is based on the ACRL Framework for Information Literacy and in four test modules measures both information literacy knowledge and dispositions.
TATIL has been adopted by institutions that are using the test to achieve a variety of local assessment goals as will be described in this paper. For example, TATIL results, including peer institution comparisons, are being used by one university to inform preparations for institutional reaccreditation. A librarian at another university used TATIL in a two-credit hour information literacy class within the general education curriculum. Results revealed gaps in critical thinking and problem solving, resulting in an expansion of the course to three credit hours.
Comparison: This paper will explicate the decision-making process for implementing a locally-developed test like MREST or using a commercially-available test like TATIL. We will present attendees and readers with comparisons of the benefits and limitations of these fixed-choice tests and offer guidance on how the tests can be used to meet institutional needs for information literacy assessment.
1 Erlinger, Allison. “Outcomes Assessment in Undergraduate Information Literacy Instruction: A Systematic Review.” College & Research Libraries 79, no. 4 (2 May 2018).
Developing Library Learning Outcomes: Reflecting on Instruction across the Library
Ashley McMullin, Jennifer Schwartz, and Janice Scurio (DePaul University)
Like many academic libraries, the DePaul University Library supports teaching and instruction across a wide range of academic departments, introductory and advanced courses, and multiple face-to-face and online formats. We offer programmatic instruction in foundational courses, advanced subject-specific instruction through our liaison program, and specialized workshops and training in a variety of information, primary source, and digital literacy topics. Despite—or perhaps because of—this diversity in instruction, the library recognized a need for a set of holistic learning outcomes to provide strategic direction for our instruction and assessment efforts across the library.
During the 2017–18 academic year, DePaul University engaged in a campus-wide effort to review program-level learning outcomes in all academic and co-curricular areas. The goal of this project was to ensure all programs had a set of three–six learning outcomes that were clearly articulated, measurable and mapped to the curriculum. This provided the perfect opportunity for the university library to revise our learning outcomes with support from our university’s Office of Teaching, Learning and Assessment and buy-in from both library administration and library staff across departments. This process enabled all library staff to reflect on our learning objectives in every course we teach in the library, to incorporate professional standards and frameworks in our learning outcomes and to identify gaps in our outcomes and instruction. It also allowed us to articulate the value of the library to both academic and co-curricular areas throughout the university and demonstrate how the library contributes to student learning.
In this paper, we will discuss the process by which we mapped, reviewed, and revised our library learning outcomes. We started by surveying all library staff who regularly teach students in classes, workshops, one-on-one sessions, and other venues. We wanted to be as inclusive as possible while limiting our scope to recurring instruction with measurable learning goals, even if those goals were not previously articulated as outcomes. We gave our instructors one month to submit a first draft of learning outcomes for each class they teach that met those criteria. We mapped those outcomes to our existing library learning outcomes to identify strengths and gaps between course-level and library-level outcomes. We then organized the course-level outcomes into new themes. Throughout the process, we referred to the ACRL Framework for Information Literacy, the Guidelines for Primary Source Literacy, and several digital literacy frameworks. This paper will include the final library learning outcomes we developed, as well as specific examples we provided to help library staff identify how the outcomes apply to their own instruction. We will discuss our plans for managing the annual assessment of these learning outcomes through our Library Assessment and Research Committee. Last, we will share our experience communicating the impact of our new outcomes to stakeholders both within and outside the library.
What Could We Do, If Only We Knew? Libraries, Learning Analytics, & Student Success
Megan Oakleaf (Syracuse University), Malcolm Brown (Educause), Scott Walter (DePaul University), Dean Hendrix (University of Texas at San Antonio), and Joe Lucia (Temple University)
Purpose: Library Integration in Institutional Learning Analytics (LIILA) was an IMLS-funded grant project that ran from 2017 to 2018. Intended to increase academic library awareness and involvement in institutional learning analytics (LA) and develop a detailed plan to prepare academic libraries to engage in this emerging and important use of data to support student learning and success, the LIILA project brought together experts in academic library leadership, library systems, information technology, educational technology standards, and LA systems to discuss the inclusion of library data in institutional LA efforts. Learning analytics, the “collection and analysis of usage data associated with student learning…to observe and understand learning behaviors in order to enable appropriate interventions,” posits that understanding more about the behaviors and interactions that align with success will enable educators to support students more effectively as they endeavor to develop, learn, and attain success through their higher education experiences. Learning analytics approaches represent new territory for academic librarians and offer exciting opportunities for librarians to build upon previous assessment efforts and expand the ability of their institutions to support student learning and success.
Approach: LIILA participants engaged in rigorous dialogue and hands-on group exercises. At the first meeting, participants considered the potential impact of LA on student learning and success; developed a list of successful LA exemplars to study; prioritized “user stories” describing the integration of library data into institutional LA; ideated librarian roles in institutional LA initiatives; identified obstacles to those roles; and drafted a vision for libraries in institutional LA. At the second meeting, participants analyzed the feasibility of enacting the prioritized user stories; assessed the value the use cases could provide to student learning and success efforts; and considered strategies for ameliorating known challenges. During the final meeting, participants developed library use cases; discussed how to apply interoperability standards to facilitate data communication between systems; and engaged library partner organizations in discussion about library data ownership.
Findings: This approach resulted in numerous outputs including a vision for library integration in learning analytics, identification of librarian roles in LA efforts, numerous user stories to guide future library data efforts, prioritized use cases describing library and institutional data integrations, a list of existing exemplars of library involvement in LA, and recommended “next steps” for continuing work in this field.
Value: The integration of libraries and institutional LA is a nascent field, but LIILA provides a firm grounding for future work in this area by outlining existing examples of library data inclusion at the institutional level, articulating user stories and use cases to guide the exploration of next steps, surfacing both obstacles and facilitators for this work, and advancing a vision of library involvement in institutional efforts to use data to support student learning and success. This contribution enables librarians to reflect upon the unprecedented opportunity LA offers to determine what libraries could or should contribute to the larger picture of student success at their institutions, and envision the ways in which they could transform their services and resources to better meet student learning needs.
Library Participation in Learning Analytics Initiatives: Library and Student Perspectives
Michael Perry (Northwestern University), M. Brooke Robertshaw (Oregon State University), Andrew Asher (Indiana University), Kristin Briney (University of Wisconsin-Milwaukee), Abigail Goben (University of Illinois at Chicago), Kyle Jones (Indiana University), and Dorothea Salo (University of Wisconsin at Madison)
Academic libraries are increasingly building capacity for, and participating in, learning analytics (LA) initiatives. Defined, LA is the “measurement, collection, analysis, and reporting of [student and other data] for the purposes of understanding and optimizing learning and the environments in which it occurs.” (citation) With LA, institutions are better prepared to describe (what is happening?), diagnose (why did it happen?), and predict (what is likely to happen?) factors that influence or inhibit student learning, as well as prescribe (what should we do about it?) data-driven interventions. Libraries are pursuing LA insights to evaluate the impact of library services, collections, and spaces on student learning. The success of LA depends in part on an institution’s ability to connect campus information systems—including those under the purview of libraries—to aggregate and analyze student data. But as institutions continue to surface granular data and information about student life, the risk to student privacy grows, and it is unclear how libraries are changing to meet those risks.
Very little research has addressed LA and student privacy issues from a student perspective, and extant research suggests that the student voice is missing from LA conversations. At the time of this writing, little to no scholarship exists that specifically considers student perceptions of their privacy when libraries are actively leading or contributing to LA initiatives. Because of these indicators, the research team identified a need to study library learning analytics and its privacy issues from the perspective of students. Simultaneously, there is no clear precedent for how academic libraries should develop policies and procedures around LA initiatives.
This presentation will discuss the initial findings of a survey of Association of Research Libraries member libraries on policies and procedures around LA initiatives and data. It will also share initial findings from phase one of the Data Doubles three-year research project, which addresses questions about how LA initiatives align with, and run counter to, student expectations of privacy, and how libraries might maximize the benefits of learning analytics while respecting those expectations. Finally, the presentation will briefly outline the further goals of the Data Doubles project and how audience members can get involved and keep up with the team’s progress as it develops research findings.
Session 18: Services
Where Do We Grow from Here? Assessing the Impact of a Digital Media Commons on Student Success
Kathy Crowe and Armondo Collins (University of North Carolina at Greensboro)
Digital literacy skills are essential for 21st-century students to succeed in their academic work and future professions. Academic libraries around the country have implemented digital media services to assist students in developing videos, 3D objects, posters, and other media for course assignments and other projects. Staff in these services also work with faculty to create and assess effective assignments so that students learn these important skills. At a high-research university, a Digital Media Commons (DMC) was implemented in 2012, with technology and expertise in a separate service area, to focus on media production projects. Services have since grown to include a green screen lab, makerspace, virtual reality, drones, and a gaming lab.
In 2017–18 an assessment project was implemented to determine how the DMC contributes to student success. The project included three phases:
A faculty survey. Surveys were sent to faculty who worked with the DMC staff in fall 2017 and spring 2018 to provide classroom instruction and follow-up consultations. Faculty were asked to provide the learning goals for their multimedia assignment, how well the students met the learning goals and for feedback on the instruction session.
A customer service survey. Patrons who received assistance at the DMC service desk or used the DMC technology and/or spaces were asked to complete a survey in spring and fall 2018. Participants were asked what kind of assistance they received, what spaces and technology they used and to rate the quality of the interaction with staff.
Student focus groups. Focus groups were conducted in fall 2018 to delve deeper into the impact of the DMC on student success. Students were asked how the DMC supports their academic work, what specific aspects of the spaces and services they use and for what activities and what improvements they recommended for the DMC.
Results of the assessments will be used for a variety of purposes, including improving instruction, addressing service issues, and adding needed services or resources. They will be distributed to the libraries’ administration for planning future improvements and services and for addressing staffing issues. The data will also inform an upcoming major library renovation and addition. The libraries have engaged a consulting firm to provide a master building plan, and these assessment results will be especially valuable for planning an updated Digital Media Commons.
This paper and presentation will discuss the methodologies used for the assessments, the results and how the data was distributed and applied for improvements and future planning.
Benchmarking Reference Data Collection: The Results of a National Survey on Reference Transaction Instruments with Recommendations for Effective Practice
Rebecca Graff (Southern Methodist University), Paula Dempsey (University of Illinois at Chicago), and Adele Dobry (California State University, Los Angeles)
Purpose: This study provides a cross-institutional snapshot of current practices in reference data collection and analyzes recent changes in what, how, and why academic and public libraries document their reference interactions. There is no other recent, national examination of how reference data are collected and used. The results form a benchmark for institutions that seek to develop best practices in gathering data for needs analysis, documenting impact or value, and external reporting. The authors advocate collaboration on a consistent approach to allow for more accurate cross-institutional research.
Design, Methodology, or Approach: We created a survey using Qualtrics, including basic demographic information. Then, our team requested participation through library-related listservs (brass-l; libref-l; medref; pla; pla-eval; rss-eforum; rss-l; rusavr) and Facebook sites (ALA, RUSA, ALA Think Tank). By using a variety of posting forums, we sought to ensure participation from across the profession. For this exploratory study, we requested that librarians upload the forms they use for capturing reference interactions, inquired about changes they have made regarding what data they collect, and asked for them to indicate the most useful statistics gathered. In the next couple of months, we will complete the process of coding and evaluating to provide a descriptive analysis of the data.
Findings: We received 232 responses, primarily from academic (61%) and public (31%) libraries. Aspects of analysis include: data capture methods; whether the information recorded conforms to RUSA and ARL definitions of reference; and identifying standard practices as well as outstanding innovations. It has been 15 years since the Reference Service Statistics & Assessment SPEC Kit was published. At that time, nearly all academic libraries marked reference transactions with tick marks; now, over half of our respondents use other methods, including commercial software, freeware, and homegrown forms. We will detail the distribution of methods, as well as the different tools used to capture that data, such as Desk Tracker, Google Forms, and Springshare.
Practical Implications or Value: Our results contribute to a culture of data-driven decision making and fill a gap in the literature. After reading our paper, attendees will be equipped to reconsider what reference data they choose to gather, how to better align with normative practice, and options for improving reporting. With good reason, the overwhelming majority of libraries focus solely on quantitative data. To complement that information, we will proffer qualitative measures that further demonstrate the value of professional librarians to administrators. For example, we will explore how librarians can document use of the Framework for Information Literacy for Higher Education to enhance research interactions, place reference help more firmly along the reference-instruction continuum, and bolster our teaching profile on campus. Similarly, we believe it is possible for our data-collection instruments to reinforce professional competencies for research librarians. Our audience will discover trends, innovations, and tools for better reference service. The findings are a crucial foundation for moving toward more accurate and complete data gathering.
Implementing Standardized Statistical Measures and Metrics for Public Services in Archival Repositories and Special Collections Libraries
Amanda Hawk (Louisiana State University)
Developed by a three-year task force comprising members of ACRL’s Rare Books and Manuscripts Section and the Society of American Archivists, the Standardized Statistical Measures and Metrics for Public Services in Archival Repositories and Special Collections Libraries report provides institutions—for the first time—with commonly accepted guidelines for quantifying use and measuring impact. The report is divided into eight domains (e.g., user demographics, collection use, exhibitions) that outline basic and advanced reporting measures, plus recommended metrics where applicable.
In response to the report, Louisiana State University Libraries began efforts to apply the newly approved measures and metrics in the Special Collections unit. We first evaluated the statistical data collected in past years, moving away from paper-and-pencil tallies toward robust software solutions, and identified new areas of reporting to implement in 2018. We initiated the changes in a test phase from April 1 to June 30 to work out problems and receive staff feedback, then launched the final version of the reporting measurements on July 1, 2018, to coincide with the new fiscal year.
This paper describes LSU’s process for implementing new measures and metrics across Special Collections, primarily through two applications: Springshare’s LibApps platform and Aeon, a request and workflow management software for special collections. Specific examples of changes made to LibApps and Aeon are included. The paper also considers how the enhanced reporting data dovetails with unit goals, and it highlights some of the many ways the data can be used in future decision making.
Assessing Need and Evaluating Programs for a Health Science Center Library’s Wellness Initiative
Ariel Pomputius, Nina Stoyan-Rosenzweig, Margaret Ansell, Terry Selfe, Jane Morgan-Daniel, Michele R. Tennant (University of Florida)
Purpose: In response to a mental health crisis among healthcare providers, and to growing interest among accrediting bodies and curriculum committees in encouraging wellness in the health sciences, the University of Florida Health Science Center Libraries created a Wellness Team. Because the library serves all six colleges across the Health Science Center, the Wellness Team was uniquely positioned to offer programs and information resources supporting the wellness needs of students, staff, and faculty across the entire center. In the first year, the Wellness Team administered a needs assessment survey to identify wellness needs; in the second year, it released a program evaluation survey to determine successes, challenges, and opportunities for future wellness-related efforts.
Design: Both the needs assessment survey and the program evaluation survey were created in Qualtrics and distributed to students, staff, and faculty across the six colleges through the liaison librarians. Surveys were no more than 10 questions and contained both multiple choice and essay responses. Institutional Review Board exemption was sought and received for both versions of the survey.
Findings: The results from the needs assessment survey were used to develop a Wellness Initiative Proposal that determined the direction of the Wellness Team’s programming and resource collection efforts. This proposal included short-term activities for library users that could be immediately initiated, such as therapy dog visits, meditation classes, and coloring, as well as longer-term goals, such as yoga classes and dedicated wellness space in the library. The results also led to interesting insights into what students, staff, and faculty find most important for their personal health and well-being and what the HSC Libraries’ role could be in supporting wellness.
The program evaluation survey is currently underway for the summer, with data analysis planned for the early Fall.
Practical Implications: Wellness is a growing area of interest, in the health sciences and beyond. Surveys that assess need and evaluate wellness programs will be important in determining the role of libraries in supporting the wellness needs of their users.
One Year In: Using A Mission-Driven Assessment Plan to Enact Change in an Academic Library Makerspace
Krystal Wyatt-Baxter and Amber Welch (University of Texas)
Makerspaces are becoming more commonplace on college campuses and within academic libraries, but librarians are still determining the best ways to assess these new kinds of spaces and the services they provide. For this study, a learning technology librarian and an assessment librarian teamed up to create and execute an assessment plan to guide decision making and continual improvement in a new academic library makerspace.
The purpose of this research is to determine whether a new makerspace at a large research university has accomplished its goals in its first full year of operation. These goals include advancing undergraduate, graduate, and faculty understanding of makerspace technology and the application of innovative production methods in educational and professional environments; supporting interdisciplinary constructivist learning and cooperation through strategic campus partnerships; and developing and stewarding a safe, inclusive makerspace that represents the diverse population of faculty, students, and staff at the institution. Additional data are gathered to provide a holistic picture of how patrons are using the space. These measures inform decisions about operational hours, service levels, staffing, and equipment purchases.
This study employs a mixed methods design including user surveys, staff focus groups, and analysis of user demographic data. The approach was to follow a mission-driven assessment plan developed as part of a campus effort to achieve continuous improvement. The plan includes goals, outcomes, strategies for achieving outcomes, and assessment methods. This study will report on findings and next steps resulting from the first year of following the plan.
Methods are still being implemented, but preliminary findings show that the makerspace is serving a diverse constituency that mirrors campus percentages for gender and ethnicity. Departmental representation is also diverse, though not completely representative of departmental percentages as a whole. Usage is steady and slowly growing, and events continue to attract new users. Similarly, course-integrated use is slowly growing, drawing new classes each semester while retaining courses that previously used the space. Further data will help determine at what level students are achieving their goals in the space and whether they perceive the Foundry as welcoming. Focus groups with student workers have revealed that service models continue to improve and point to directions for future development.
This paper will detail practical strategies for assessing an academic library makerspace. Library literature in this area is growing, but is not yet robust. Librarians in the process of designing a makerspace as well as those currently operating a makerspace will benefit from learning about how a mission-driven assessment plan can be used to track progress and enact change.