Assessment Accelerators

Tuesday, November 1

11:30 a.m.–1:00 p.m.


CARL Library Impact Framework

Justine Wheeler (University of Calgary) and Mark Robertson (Brock University)
75 minutes, registration limit: 50

Audience: This presentation is suitable for anyone interested in using logic models; however, all of our examples will be from an academic library perspective.

Introduction
Focused on impact pathways, the CARL Library Impact Framework uses a modified logic model to visualize the arc of influence of libraries’ programs, resources, and services.

Exemplars of how logic models can be used will be provided, but these logic models are not intended to be exhaustive or complete in themselves. The value is primarily in the application of the models, the process of thinking through logic models for local needs, and how logic models can contribute to making impact pathways visible.
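
For instance, a hypothetical logic model for an information literacy program might trace a pathway of inputs (librarian time, instructional space) → activities (course-integrated instruction sessions) → outputs (sessions taught, students reached) → outcomes (improved source evaluation skills) → impact (stronger student learning and success).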

Structured Activity
The facilitators will work with the audience to create an assessment question and then work through a logic model. Polls and quizzes will be embedded into each step of the learning process.

Extension
Discussion will focus on potential uses of logic models, as well as possible challenges to their use.
Moving assessment from evidence to insight as one moves from inputs to outputs will also be discussed.

Summary
Participants will take away an understanding of the CARL Library Impact Framework, and how it might be applied in their assessment work. Links to online resource material will also be provided.

Learning Outcomes

  1. Understand the context and development of the CARL Library Impact Framework.
  2. Use the CARL Library Impact Framework.
  3. Discuss the benefits and limitations of logic models.

Justine Wheeler
Throughout my career, instruction has always been a component of my responsibilities. I have presented in both online and face-to-face environments. I have actively pursued opportunities to improve my pedagogical practice, including participating in workshops and attending presentations. Along with my MLIS, I hold a PhD in Educational Research. I served as the Co-Chair of the Impact Framework Working Group.

Mark Robertson
Mark serves on the CARL Assessment Committee and is the Chair of the CARL Library Impact Subcommittee. This committee was tasked with overseeing the Impact Framework Working Group. Mark was integral to the development of the CARL Library Impact Framework. He has extensive presentation and public speaking experience.


A Roadmap to Practical Strategic Planning

Maurini Strub (University of Rochester) and Starr Hoffman (UNLV Libraries)
75 minutes

Audience: Beginner/Intermediate

Organizations invest a great deal of resources into developing a strategic plan, only for it frequently to land on a shelf, in a file cabinet, or in an electronic archive. We will look at best practices for creating, implementing, and managing a strategic plan that remains at the forefront of every staff member’s mind. Throughout this session, participants will receive an overview, with opportunities for input and small-group discussion, on surmounting challenges and obstacles to implementation.

Learning Outcomes

  • Understand structures for creating and implementing a strategic plan.
  • Identify, manage, and realign cultural mismatches between operational and strategic work.
  • Build accountability into an implementation plan.

Detailed Outline
Participants will learn how to walk their organization through the process of creating a strategic plan. We will consider how to create and balance different kinds of goals (aspirational, strategic, operational, etc.) and how to bake assessment in from the beginning. We will particularly focus on constructing a planning process that is inclusive and involves all levels of the organization, and how that can increase buy-in. The end goal is a living document, presented in a digestible format that is flexible and outcomes-based. Participants will participate in small group discussion and brainstorm how to implement this process in their organizations.
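
For example (hypothetical illustrations): an aspirational goal might declare that the library will become the intellectual heart of campus; a strategic goal might commit to expanding digital scholarship support over three years; and an operational goal might schedule the launch of a data visualization consultation service this year.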

Next, the session will discuss how to apply an outcomes-based assessment framework to create a required (but flexible) structure that drives projects that advance strategic goals. We will also cover how to design an assessment plan for the implementation that keeps the strategic plan at the forefront of everyone’s minds. Takeaways will include identifying the type of plan and goals that fits your organization, how to plan an iterative planning process, and how to implement and assess your plan.

Starr Hoffman
Dr. Hoffman’s work includes leading strategic planning and library assessment to improve services and support decision-making, as well as contributing to the accreditation process. She has led workshops on strategic planning and assessment at ACRL, LAC, and the International Conference on Library Performance Measurement.

Maurini Strub
Maurini has been part of the team that led her current institution through the implementation of the first phase of its strategic plan, including the development of the mechanism for assessing its progress.


Present & Future Proficiency: Updating the ACRL Assessment Proficiencies to Reflect Current and Coming Realities

Rebecca Croxton (University of North Carolina at Charlotte), Megan Oakleaf (Syracuse University), and Jung Mi Scoulas (University of Illinois Chicago)

90 minutes, registration limit: 60

Audience: This session is intended for the full gamut of library assessment professionals, inclusive of both new and veteran practitioners, in order to gain maximum feedback for the Proficiencies revision process.

Detailed Outline
Session presenters, who are leading an ACRL task force to update the 2017 ACRL Proficiencies for Assessment Librarians and Coordinators (https://www.ala.org/acrl/standards/assessment_proficiencies), invite conference attendees to engage in active conversation and contribute input. Attendees will help identify gaps in the existing proficiencies and brainstorm new proficiencies that should be included to remedy the absence of Social Justice, Equity, Diversity, and Inclusion (SJEDI) content in the current document.

The Proficiencies for Assessment Librarians and Coordinators allow academic libraries to begin with a common definition of assessment librarian responsibilities. Proficiencies may be used to write job descriptions that define the duties of assessment librarians, assess performance and guide evaluation, and design professional development programs. These proficiencies are also used by graduate Library & Information Studies programs to prepare pre-professional librarians to take on assessment tasks across a range of library positions and roles.

Introduction: Presenters will begin the accelerator with a brief history of the development and use of the Proficiencies for Assessment Librarian and Coordinators, share the results of a recent evaluation of the presence of SJEDI content in library professional standards, describe the motivation behind the current effort to update and improve the Proficiencies, summarize the process undertaken thus far, and close with an overview of the current draft Proficiencies document.

Structured Activity: Participants will be divided into small groups to engage with sections of the draft Proficiencies. Following a “start, stop, continue” feedback approach, participants will evaluate content using a “keep, delete, revise” activity in which they use hands-on technology (e.g., Jamboard, Padlet) to recommend proficiencies that should be kept, removed, or altered. In the latter case, participant recommendations for changes will be elicited and captured. All feedback will be delivered to the Proficiencies writing team for inclusion in the draft produced for later public comment.

Extension: Following the small group exercise, participants will reconvene for a full-group discussion of strengths and weaknesses of the current draft using a Plus/Delta (strengths/changes) approach.

Summary: Presenters will summarize major takeaways from the feedback, describe next steps for the Proficiencies revision process, and recommend future uses of the final iteration.

Learning Outcomes
Participants will:

  • Actively revise the ACRL Proficiencies for Assessment Librarians and Coordinators through a lens of SJEDI.
  • Ensure inclusion of SJEDI content as well as innovative assessment approaches in this important professional document.
  • Ideate suggestions for future use and application of the Proficiencies.

Presenters are scholars who deliver professional development and LIS courses at the national level, undertake library assessment work in their practice, and are members of the task force engaged in this revision work.

1:30 p.m.–3:00 p.m.


Choosing Your Own Scholarly Communication Assessment Adventure: Applying and Reviewing a Draft Engagement Matrix to Evaluate Program Growth and Development

Emily Chan (San Jose State University), Suzanna Yaukey (Towson University), Nicole Lawson (California State University, Sacramento), and Daina Dickman (Network of the National Library of Medicine Region 5)
75 minutes, registration limit: 60

Audience: This accelerator is intended for assessment and scholarly communication practitioners across the spectrum of expertise.

Detailed Outline
Scholarly communication librarians frequently struggle to show the impact of their myriad services and programs. The goal of the Scholarly Communication Assessment Forum (SCAF) IMLS project was to address these gaps by developing reusable assessment tools such as the SCAF Engagement Matrix. The matrix assesses the level of embeddedness of a scholarly communication program by identifying the various resources allocated to the program and contextualizing the actions taken by the library (or campus) to support scholarly communication. Sharing this matrix with practitioners gives them an actionable tool to begin to think about program impact and planning.

The program will include an introductory presentation detailing the history of the project and the development of the SCAF Engagement Matrix. Participants will be given the matrix to work with and split into groups of 2–3 people, in which they will discuss how they would rate and rank their institution in terms of the maturity and embeddedness of its scholarly communication programs. Participants will also be encouraged to “choose their own adventure” and add options to the matrix to better suit their local configurations and services. Participants will be asked to provide feedback on the matrix, as well as on its suitability for ascertaining past achievements and future growth.

Learning Outcomes
By the end of the session, participants will be able to identify various qualitative criteria for assessing scholarly communication programs. Attendees will gain an appreciation of how other institutions may be assessing scholarly communication activities through small and large group discussion. Attendees will be able to interact with an engagement matrix for future application to their own libraries’ scholarly communication services.

Presenters Emily Chan and Suzanna Yaukey are co-principal investigators for the grant that produced the engagement matrix. Emily Chan oversees her library’s efforts to embed information literacy in the curriculum and is well-versed in best practices for library instruction. Suzanna Yaukey developed the matrix shared for the accelerator and, in her role at her library, oversees all areas of academic librarianship, including the relevant areas of instruction, assessment, and scholarly communication.


Demystifying Qualitative Coding

Alisa B. Rod, Marcela Y. Isuster, and Tara Mawhinney (McGill University)
90 minutes

Audience: This session is aimed toward library practitioners and professionals interested in learning about using qualitative coding in their own research and assessment initiatives. While participants are encouraged to have a general understanding of qualitative research, no previous knowledge/expertise is required.

Detailed Outline
Introduction: Evidence-based assessment practices and initiatives often use quantitative methods for information and data gathering. However, many practitioners rely on unstructured data types, such as open-ended text survey responses, transcripts, etc. In this session, participants will learn about tools, resources, strategies, and best practices for coding qualitative data in order to conduct meaningful content analysis. The presenters will discuss different types and levels of coding, codebook development, and available qualitative coding software.
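
As a minimal illustration of applying a codebook, the Python sketch below tags invented open-ended survey responses with hypothetical codes via keyword matching. Real projects would use dedicated qualitative analysis software (e.g., NVivo, Taguette) and human judgment rather than simple keyword rules:

    # Hypothetical mini-codebook: code name -> keywords that suggest it.
    # Real coding relies on human judgment and iterative codebook revision;
    # keyword matching here only illustrates how a codebook gets applied.
    CODEBOOK = {
        "space": ["quiet", "noise", "seating", "study room"],
        "service": ["staff", "librarian", "help"],
        "collections": ["book", "journal", "database"],
    }

    # Invented open-ended survey responses.
    responses = [
        "I wish there were more quiet study rooms.",
        "The librarians at the desk were a huge help.",
    ]

    for response in responses:
        # Assign every code whose keywords appear in the response.
        codes = sorted(
            code
            for code, keywords in CODEBOOK.items()
            if any(kw in response.lower() for kw in keywords)
        )
        print(f"{response!r} -> {codes}")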

Structured activity: Participants will practice and refine their qualitative coding skills by analyzing a sample of qualitative textual data. The first part of the activity involves analyzing an unstructured text excerpt from a dataset to brainstorm potential codes and schemas. The second part of the activity introduces a validated codebook for participants to apply to two additional excerpts.

Discussion: Participants will be encouraged to share their experiences completing the activity and how qualitative coding applies to their own projects. Discussion will aim to identify ways that coding can be implemented as part of a methodological toolkit by each participant at their own institution.

Summary: Participants will leave the session with a better understanding of best practices for qualitative coding, including planning and executing a coding project and using qualitative analysis software.

Learning Outcomes
By the end of the session, participants will be able to:
  • Define key concepts including qualitative coding, content analysis, codebooks, and unit of analysis and apply codes to a dataset using a sample codebook
  • Communicate context-specific examples of how qualitative research approaches can be applied to assess library programs and services
  • Explain ways that qualitative analysis software can facilitate coding

This session is adapted from a workshop developed in Winter 2022, which has since been delivered both virtually and in person to a variety of audiences, including faculty, hospital researchers, graduate students, and library practitioners. All three presenters have conducted and published qualitative research studies, including:

Gendron, H., & Rod, A. B. (2014). A mixed-methods approach to questionnaire development: Understanding students’ interpretations of library survey questions. Proceedings of the 2014 Library Assessment Conference, 63–74.

Mawhinney, T. (2020). User preferences related to virtual reference services in an academic library. The Journal of Academic Librarianship, 46(1).

Rod, A. B., Isuster, M. Y., & Chandler, M. (2021). Love Data Week in the time of COVID-19: A content analysis of Love Data Week 2021 events. The Journal of Academic Librarianship, 47(6).


Online Participatory Design: Activities and Approaches for User Engagement in the Remote Environment

Jackie Belanger, Maggie Faber, and Reed Garber-Pearson (University of Washington)

90 minutes, registration limit: 50

Audience: The session is appropriate for new professionals and experienced practitioners seeking new tools for engagement and learning about user experiences.

Learning Outcomes
After attending the session, participants will be able to:

  • Choose and facilitate participatory design activities to better understand user needs and experiences
  • Identify contexts where online participatory design can be employed for user engagement and service development/improvement
  • Describe how online participatory design can be used as part of an overall assessment/UX program toolkit

In 2020, the presenters co-facilitated an 11-week participatory design experience with six students from fully online programs. In preparation for these sessions, the presenters created an Activities Index that outlines dozens of activities for engagement and design, all suitable for online, hybrid, or in-person use. Following the success of the participatory design project, the presenters have used the design activities to facilitate learning experiences with faculty converting their courses to an online format at their university’s Center for Teaching & Learning, with a Faculty Fellowship in Teaching with Technology, and in a workshop on creating inclusive syllabi. The presenters have also used the activities with Libraries staff to foster engagement during remote work.

Detailed Outline
Introduction: Participatory design is a methodology centering equity, social justice, and expertise sharing between participants, focusing on including end users in the design process. In Spring 2020, we engaged in a fully online, 11-week participatory design project to understand the needs of six students in online programs. Presenters will explain the project, what we learned about this method, and the benefits of using participatory design in different contexts. Participants will experience facilitated design activities in small groups, with guided discussion to explore how to use activities for their own work.

Structured activities: Activities designed to surface student impressions of the library and identify areas for co-design: Library Is/Is Not and/or Build Your Vehicle.
Activity designed to disrupt typical thinking and gain inspiration: Remix-a-wish. The result is a list of “wishes” around a topic (such as a new or re-tooled library service).

Extension: In what contexts might you use participatory design in your work? What resources might you need to conduct a participatory design project at your own institution?

Summary: Participants will leave with an understanding of how participatory design works in the online environment and have a toolkit of activities they can use for their projects.

3:30 p.m.–4:45 p.m.


A Discussion on the Evolution of Assessment Work in Academic Libraries: Is this a fluke, or is this our future?

Kat Bell (George Mason University), Emily Guhde (Georgetown University), Steve Borrelli (Penn State University Libraries), and Maurini Strub (University of Rochester)
90 minutes, registration limit: 50

Audience: This session aims to engage mid-career and expert practitioners in the evolving roles of library assessment professionals. However, the forward-looking nature of the discussion should also appeal to professionals new to this sub-domain of academic librarianship.

Detailed Outline
This session will open a discussion on the evolution of assessment positions in academic libraries and explore a potential paradigm shift affecting current assessment professionals. The session will begin with a panel sharing experiences from their local university contexts, followed by facilitated small-group breakout sessions, and a wrap-up with the larger group to share and summarize the current status and experiences of other professionals with assessment responsibilities. The session seeks to connect attendees who share similar changes in their positions, raise new questions, and point to whether we are seeing a broader trend or just a series of coincidences. Identifying a broader trend would highlight the need to expand the discussion and research.

Learning Outcomes
Attendees will:

  1. Increase awareness of assessment position changes happening in individual institutions.
  2. Articulate how organizational needs and individual experiences relate to themes and patterns happening across the field.
  3. Explore what other discussions need to happen and what areas call for further research.

The presenters have a combined 36+ years of experience in library assessment, 21 of those years at the department head or director level.


Data Ethics and Learning Analytics: Putting Privacy into Practice

Lisa Hinchliffe (University of Illinois at Urbana-Champaign)
90 minutes, registration limit: 25

Audience: Any librarian who is interested in or working with learning analytics will find value in this session. It is highly pragmatic and action-oriented.

Detailed Outline
Introduction: The introductory lecture will provide a foundation in learning analytics in higher education and the ethical issues raised by this work, with particular attention to privacy questions and frameworks for privacy by design.

Structured Activity: Participants will complete Privacy Sourcebook components, specifically developing a library analytics privacy vision and/or stakeholder talking points.

Extension: Participants will share their drafts and provide feedback through a structured reflection/discussion process.

Summary: Participants will identify next steps using the action plan component of the Privacy Sourcebook.

Learning Outcomes
After the workshop, participants will be able to:

  • Describe the social, political, and technological elements of learning analytics in higher education, generally, and academic libraries, specifically, and analyze learning analytics principles, policies, practices, and recommendations and the ways in which they may create privacy harms.
  • Plan for ethical and evidence-based library learning analytics activities that are based in privacy by design.
  • Develop an action plan for engaging with learning analytics, information privacy, and ethical practice at one’s home institution.

The proposed workshop would be an offering of Prioritizing Privacy, a multi-faceted continuing education program that will train academic library practitioners to comprehensively address privacy and other related ethical implications of learning analytics projects. Prioritizing Privacy is funded by an IMLS National Leadership Grant and more information is available on the project website (http://prioritizingprivacy.org/).

The workshop would guide participants in exploring learning analytics, privacy theory, privacy-by-design principles, and data ethics, using the grant-developed Privacy Sourcebook, which can continue to guide their practice after the workshop. The workshop materials have been field-tested and developed through a design process informed by input from international experts in privacy.

Lisa is the PI of this project.


Introducing the Values-Sensitive Library Assessment Toolkit: A Practical Approach for Ethical Assessment

Scott Young (Montana State University)
90 minutes

Audience: This learning session will be useful for assessment practitioners at any stage of their career. The Values-Sensitive Library Assessment Toolkit is a flexible framework that can be used three ways: as a planning tool when designing a new assessment, as an evaluation tool in the middle of an assessment or when looking back on a completed assessment, and as a teaching tool when educating about the practice of assessment. I believe that this session would be relevant for the full range of conference attendees.

Detailed Outline
Introduction: This session introduces the Values-Sensitive Library Assessment Toolkit. The toolkit takes the form of a participatory design card deck, featuring 12 Value Cards and 3 Exercise Cards. Each Value Card features a value relevant for library assessment, such as Alignment, Validity, and Stewardship. The Exercise Cards include instructions for working with the Value Cards. The aim of the toolkit is to center values in library assessment, in support of an ethical practice. I developed this toolkit from a national study of library assessment practitioners (via survey and interviews). I will also report briefly on this research.

Structured Activity: The toolkit itself includes three exercises that will be completed during the session: Connect Two builds an understanding of the values, Must-Haves sorts the values by priority, and Anchors and Sails reveals the barriers and opportunities for implementing the values.

Extension: I will facilitate discussion before, during, and after the exercises so that participants are equipped to apply the toolkit in their local settings.

Summary: Participants will walk away from the session with fresh insights about the values relevant to their practice. Participants will also receive the toolkit itself as a takeaway in the form of a print-ready digital download.

Learning Outcomes

  • Articulate the values that matter to your assessment practice.
  • Prioritize different values for different contexts, and understand how values can complement or conflict with each other.
  • Identify and remove barriers to implementing values; identify and enhance supporting factors for implementing values.

In terms of topic expertise, I have 10 years of experience researching and practicing in user-centered design, and 5 years of experience in assessment and practical ethics. In terms of presenting and facilitating, I have given over 70 presentations within our LIS community, including a dozen or so workshops or other interactive sessions. Among those, I have delivered a number of remote or virtual presentations and workshops, so I feel prepared to facilitate an online workshop for the conference. As an assessment practitioner who is oriented toward user-centeredness, I am particularly sensitive to the need to create a learning session that is positive and meaningful for conference participants. I believe that I have both the topic expertise and the presenter experience to accomplish that for the LAC community.


Thursday, November 3

11:30 a.m.–1:00 p.m.


Developing and Implementing Library Inventories

Holt Zaugg (Brigham Young University)
75 minutes

If you are planning to attend this session, please download the Audit Description Template from this folder prior to the presentation: https://tinyurl.com/byuaudit

Audience: This presentation is geared for assessment practitioners and library administrators.

Detailed Outline
Participants will leave the session with insights on how to conduct current and future inventories unique to their library. Prior to and at the start of the presentation, participants will have the opportunity to download a workbook to guide them through the presentation and to use as a reference following the presentation. They will also be encouraged to bring or have access to specific items in relation to the presentation.

Learning Outcomes

  • Participants will be able to state at least three purposes for an inventory.
  • Participants will engage in and complete the initial steps in at least one inventory.
  • Participants will leave the session with insights on how to conduct current and future inventories unique to their library.

The presenter has 11 years’ experience presenting professional development workshops to teaching professionals. Since moving to an academic library setting, he has given more than two dozen conference presentations and presented three webinars on assessment topics. He oversees the assessment of library operations within the BYU Library. Over the past 10 years, he has coordinated or conducted almost 110 library assessments.


Learning Analytics: It’s Coming…Get Ready!

Ken Varnum (University of Michigan), Megan Oakleaf (Syracuse University), and Rebecca Croxton (University of North Carolina at Charlotte)
90 minutes, registration limit: 120

Audience: The audience for this session should include librarians new to learning analytics as well as those with more advanced knowledge who may be considering a pilot in the future.

Detailed Outline
Learning analytics is not new to higher education, but it presents novel (and sometimes controversial) questions, decisions, and practices for library assessment practitioners. The last several years have seen a rise in library conversations not just about whether libraries should participate in these campus efforts, but about how we can do so while keeping our professional ethics and practices front of mind. In this new context, assessment librarians are being asked to adopt new assessment strategies and practices that document the library’s role in student learning and success, in ways that will enable libraries and their overarching institutions to make data-informed decisions about how we can do better work with and on behalf of our students.

To take the next steps of library engagement in learning analytics, librarians must 1) identify problems and questions that merit a learning analytics approach; 2) frame these problems and topics as user stories to highlight the impact of the approach on their students, faculty, library colleagues, and institutions; 3) envision library data that can contribute to solving problems and answering questions; and 4) ideate pilot studies that move libraries forward in this important use of data to support student learning and success.
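
To illustrate step 2, a user story might read (a hypothetical example, not drawn from the cited projects): “As a first-generation student, I want to know whether attending library instruction relates to my course performance, so that I can decide where to invest my study time.”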

Prior to the session, a pre-assessment, readings, and recordings will be distributed to ensure the level of the accelerator matches the learning needs of participants.

Introduction: This accelerator will begin with a brief presentation describing recent work in understanding library participation in campus learning analytics efforts, including LIILA, CLLASS, and Data Doubles.

Structured Activity: In small groups, participants will engage in a card sort activity to identify the broad analytics questions, problems, or areas of growth related to student learning and success that are most relevant to their institution. Next, participants will complete a worksheet-guided activity to convert these questions, problems, or growth areas into user stories that frame a specific purpose and direction for learning analytics. Draft user stories will be viewable to all participants in a shared document to enable synergies and facilitate a group discussion.

Extension: Next, participants will merge into larger groups to share their user stories and brainstorm data sources that could be used to answer these needs. Finally, each group will report out their top 1–2 ideas to the full group.

Summary: Presenters will reconvene the full group to present guiding suggestions for library learning analytics pilots, share an action plan template for pilot planning, and gather and respond to participant questions and insights.

Learning Outcomes
Participants will:

  • Brainstorm questions, problems, or growth areas related to student learning and success that can be informed by learning analytics approaches and are relevant to their students, library colleagues, and partners within their institutions.
  • State questions, problems, and growth areas in a user story format suitable for informing learning analytics pilot design.
  • Identify library data relevant to user stories to use in a learning analytics pilot.


Six Dimensions: Evaluating and Planning Your Assessment Portfolio

Gregory A. Smith and Kory Quirion (Liberty University)
90 minutes, registration limit: 50

Audience: This class will be most useful for practitioners who meet two criteria:
  1. Experience in planning and implementing assessment activities in a library or similar setting
  2. Responsibility for a portfolio of assessments, whether related to a unit or function, a single library, or a system of libraries

Content will be appropriate for practitioners working in any geographic location and type of library.

Detailed Outline
Introduction
The presenters will introduce Six Dimensions as a useful tool for evaluating and planning a portfolio of library assessments. In brief, the tool calls for classifying assessments in six ways:
  • Quantitative vs. Qualitative
  • Solicitation/Perception vs. Observation/Behavior
  • National/Comparative vs. Local/Unique
  • Continuity/Longitudinal vs. Discontinuity/Exploration
  • Inputs/Outputs vs. Outcomes
  • System-Oriented vs. User-Centric

Each dimension will be introduced briefly via oral explanations, reference to guide documents, and class-wide work on low-stakes problems.
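
As a rough illustration of how such a classification might be recorded, the Python sketch below (with invented field names and a hypothetical example assessment) encodes one assessment’s position on each of the six dimensions:

    from dataclasses import dataclass

    # One field per dimension; each value is one pole of that dimension.
    # Field names are invented for this sketch.
    @dataclass
    class SixDimensions:
        method: str       # "quantitative" vs. "qualitative"
        evidence: str     # "solicitation/perception" vs. "observation/behavior"
        scope: str        # "national/comparative" vs. "local/unique"
        timeframe: str    # "continuity/longitudinal" vs. "discontinuity/exploration"
        measures: str     # "inputs/outputs" vs. "outcomes"
        orientation: str  # "system-oriented" vs. "user-centric"

    # Hypothetical classification of a recurring national user survey.
    survey = SixDimensions(
        method="quantitative",
        evidence="solicitation/perception",
        scope="national/comparative",
        timeframe="continuity/longitudinal",
        measures="outcomes",
        orientation="user-centric",
    )
    print(survey)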

Structured Activity
Participants will be given a list of assessment scenarios. Using the Six Dimensions taxonomy, they will practice classifying various assessments, preferably in small groups.

Extension
Participants will reconvene as a full group to discuss their classification efforts. Presenters will focus on moderating discussion about classifications that were difficult to assign. Following this, presenters will illustrate how they have used Six Dimensions to generate insights about their own library’s assessment portfolio.

Summary
Presenters will encapsulate key principles relevant to the use of Six Dimensions:

  • Assessments can be classified in a variety of useful ways.
  • A library benefits from planning and executing a diverse assessment portfolio.
  • Six Dimensions can help stimulate critical and creative thinking about assessment that fits a particular context.

Learning Outcomes
Participants in this accelerator will learn to…

  1. Recognize the various facets of assessment encompassed in the Six Dimensions tool.
  2. Use Six Dimensions to assign classifications to specific assessments.
  3. Use Six Dimensions to identify potential strengths and weaknesses of an assessment portfolio.

Gregory A. Smith has been active in academic library assessment for more than 20 years. He has presented and published on a variety of topics related to library assessment, planning, and strategy. Additionally, he has offered in-service assessment training to dozens of employees at his own library.

Kory R. T. Quirion began his journey with library-specific assessment in 2019. As Director of Finance & Assessment, he is responsible for planning and coordinating a diverse array of library assessment activities. Qualtrics, Python, and Power BI are among the tools that he uses to collect and analyze data.