The Importance of Design in Assessment
Selena Bryant, User Engagement and Inclusion Librarian, Mann Library, Cornell University
The 2018 Library Assessment Conference covered the importance of assessment in all aspects of librarianship, from organization-wide strategic planning to small-scale projects. I had assumed the conference would focus mostly on strategic planning and design, but it offered so much more. Several simple touches made it stand out from the start. The welcome included statistics on how registration money was spent, which showed a refreshing level of transparency. First-time attendees were also asked to stand, and we made up a sizable portion of the total participants; this gave everyone a good visual of how the field of assessment in librarianship is growing.
My initial interest in the conference stemmed from a space redesign project at my library, and I originally planned to attend mostly sessions related to space design. But there were so many interesting and varied sessions that I also went to ones on diversity, larger organizational issues, and collections. Hearing about these varied projects highlighted the importance of thorough design and strategic thinking when embarking on any plan. The poster sessions reinforced these themes. I presented a poster on my own work and enjoyed the chance to see what others were doing around space redesign.
The methods and tools sessions were my favorites. There are so many options for data visualization software. Seeing how others used different tools, and what those tools could do, was extremely valuable and will help when I consider the best way to visualize the data I have gathered for different projects. I left wanting to try several different programs and processes to explain outcomes more effectively.
I found the Value and Impact session "Impact and Ethics: A Meta-Analysis of Library Impact Studies" very thought-provoking. It was a good reminder to consider why you are collecting which data, and of the importance of patron privacy. For me, this session harkened back to the importance of properly designing and analyzing any project or paper.
The most emphasized point for me was the importance of creating a thorough process, conducting a needs assessment, and starting with a solid plan at the outset of design, which allows the implementation process to go more smoothly. I was inspired by the rigorous work many colleagues had done. To paraphrase Paul-Jervis Heath, designs are experiments that can constantly be improved and changed, and post-assessment is just as important as pre-assessment.
Attending the post-conference workshop "Creating an Assessment Plan & Data Inventory: Aligning and managing metrics of success," led by Starr Hoffman, was a great ending that wrapped up how data collected from small-scale projects can be applied to institutional strategic planning. We were given a great template to start the planning process. I left this conference with some great tools and more knowledge on how to implement and track the results of my current and future work.
Assessment as Process, Assessment as Culture
Alexa Carter, NCSU Libraries Fellow, North Carolina State University
As I walked into the 2018 Library Assessment Conference for the first time, I reflected on my ideas about assessment. I have always thought of assessment as a vital process that allows libraries to collect valuable data for evaluating performance. However, after attending the Library Assessment Conference with nearly 600 of my closest assessment professionals, I realized assessment is much more than an appraisal process. Assessment is a culture that ensures that our institutions are more inclusive and responsive to users’ needs in strategic decision-making.
The conference kicked off with its first keynote speaker, Paul-Jervis Heath, a designer from Modern Human who shared a unique perspective on library space and design research. As he talked about his work redesigning physical spaces with strategically placed potted plants, and prototyping spaces and services outside the physical realm, I found myself thinking about assessments past and future. Heath described the use of design ethnography in designing library environments, emphasizing the need to look beyond one's instincts and intentionally partner with users to imagine and construct a future space or service together. I learned that this can be achieved by challenging both institutional and personal assumptions; prototyping and testing new realities that allow one to observe and get to know users; and treating each iteration of the design process as an experiment to learn from and purposefully improve on. Sounds easy enough, right? Perhaps it can be, if we allow ourselves not to get hung up on complexities and instead flex our nimbleness.
I continued my conference journey with several sessions focusing on various aspects of assessment, including learning analytics, diversity, equity, inclusion, and space. Selena Killick, Richard Nurse, and Helen Clough’s presentation, “The Continuing Adventures of Library Learning Analytics: Exploring the Relationship between Library Skills Training and Student Success,” discussed the evaluation of student data and performance based on interactions with online library modules and training sessions. This presentation, along with others in the same session, stressed the importance of creating policies about the ethical use of student data for learning analytics and involving our users throughout the process. Another highlight of the conference was the diversity and inclusion discussion led by Mark Puente. This discussion emphasized the necessity to advocate for diversity and equity through assessment work to create inclusive library spaces and services. Methodologies for working towards more diverse and equitable assessments were explored in Scott W. H. Young and Hailley Fargo’s presentation, “Assessing and Improving the Experience of Underrepresented Populations.” It gave attendees insight into leveraging input from diverse user communities and co-creating library services through the participatory design process. I was reminded that as we continue to challenge how we seek, collect, frame, and utilize assessment data, we can begin to better understand our users and enhance our services with a sharpened focus on equity and accessibility.
Attending the Library Assessment Conference left me inspired, intrigued, and curious. What do library users say when we’re not around? What about non-users? How can we create sustainable workflows to capture and incorporate users’ thoughts and needs as we refresh library services and spaces? Although creating and implementing assessments is an experimental process that takes time to culminate and grow, it has become an essential key to ensure our libraries move purposefully without leaving our users behind. In this strengthening culture of assessment, we must continue to challenge our assumptions and collaborate with our users to make our libraries the best versions of themselves.
Reflection on the Library Assessment Conference 2018
Kimberly Lace Fama, Reference Librarian, The University of British Columbia
This year's Library Assessment Conference was unforgettable, beginning with a bang when I attended Paul-Jervis Heath's pre-conference workshop, "Service Design for Libraries."
Paul is the principal founder of Modern Human, and his team has worked with clients across industries such as banking, retail, education, and libraries. The workshop provided a glimpse of his work and the thought process that goes into designing services with the end goal of delighting users.
Having relied mostly on surveys to understand and assess user needs, I found that Paul's workshop opened up a world of opportunities. In fact, he noted that asking people what they want or need is not the most accurate measure, as more often than not they do not know what they want themselves.
This was a mind-blowing revelation for me, but it definitely makes sense. He shared the story of an improvement to the Pyrex measuring cup: no user would have been likely to say that parallax error in reading the markings contributes to baking mishaps, even though baking depends on accurate measurement; it took observation to uncover the problem.
Drawing on his experience, Paul shared several ways we can design for the future by getting to know people, and I too believe this is very important: our patrons are our end users, and they should be delighted with our services and resources.
The theme of the conference was all about building effective, sustainable, and practical assessment, and there was a great mix of those topics in the plenary and poster sessions. As a liaison librarian, I was especially interested in assessment practices focused on methods, tools, measuring value, and improving the user experience.
Learning analytics is an important topic in the assessment world, and I was glad to find several sessions focused on it. One such session discussed using learning analytics to measure the relationship between library skills training and student success. In this study, the authors from the Open University focused on their online students to see how attendance at either recorded or live library research training had positively impacted their grades.
While it is fantastic that learning analytics can provide such valuable insight, not every library has access to this powerful data about student learning. Megan Oakleaf thus pointed out in her session, "What Could We Do, If Only We Knew? Libraries, Learning Analytics, & Student Success," that perhaps it is time to think about how librarians of the future can contribute to this effort, possibly by providing library data.
Another highlight of the conference was seeing models and techniques from other disciplines applied to assessment. In "From Indifference to Delight: Gauging Users' Preferences Using the Kano Model," what stood out for me was how the authors used the Kano Model to assess students' preferences for study space at Cornell University and for the look of Penn State's digital signage. It showed that my colleagues are using very creative and effective ways of taking the needs and wants of students and other user audiences into consideration.
Coming from a business background where it is not only important to acquire data but also to know how to use it, the Library Assessment Conference provided very valuable insight into how we can improve the results that users derive from their library experience. Librarians play a vital role in helping users find exactly what they are looking for (even if they don’t know what it is from the start!) and having the proper data and analytics will help us do just that.
2018 Library Assessment Conference Reflection
Twanna Hodge, Academic/Research Librarian, Health Sciences Library, SUNY Upstate Medical University
This was my first time attending the Library Assessment Conference, and it was an eye-opening experience. I have always been interested in assessment and have used several quick-and-dirty assessment strategies in library instruction sessions, including one-minute papers, the jigsaw puzzle, think-pair-share, and more. I initially set out to learn about current strategies and best practices for assessing user needs. I also wanted to examine current tools, methods, and strategies for incorporating diversity, equity, and inclusion. Other aspects that interested me included gaining insight into how others are measuring their teaching, as well as exploring more about learning, reference services, and user experiences with staff, services, and resources. These topics and more were addressed by the conference's speakers in ways that helped me gain confidence in supporting graduate students and in participating on my library's assessment committee.
Paul-Jervis Heath taught me to treat everything as an experiment: tackle it by assuming ignorance and working from the ground up. He also advised continually involving users throughout the process and using a variety of methods, while understanding that purposefully making things better takes time and that the design process is self-correcting. A lot of these concepts seem simple, but sometimes we complicate things and make them more complex than they should be.
The Diversity, Equity, and Inclusion sessions covered a plethora of concepts: legitimacy, measuring what matters, how researchers' online presence differs by gender and how they deal with online threats, critically engaging power structures, land acknowledgment, and asking, "Who's on your team?" They delved further by focusing on understanding historical trauma and its impact on collaboration with certain populations and users, knowing that a trust exchange is key to moving forward, and recognizing that each design method has drawbacks. Kawanna Bright and Nikhat Ghouse's session, "Taking AIM: Integrating Organization Development into the Creation of a Diversity, Equity & Inclusion Audit," discussed using organizational development as an assessment tool. Feedback from those who participated in the audit at their institution raised questions about differentiating between the organization and individuals ("Who gets credit for the work?") and about moving the organization from audit to action.
Maggie Faber, Jackie Belanger, and Ebony Magnus' session, "A Consideration of Power Structures," urged us to define what is useful to the library, to challenge the status quo and our assumptions, to be reflective of our own practice, and to mobilize assessment for social justice. How is data collected? What strategies or frameworks are we using? Who were they initially created for? How do we include users in this work? Who are the research subjects typically used? Are there cultural or social implications that we haven't thought about or pursued? All these questions are integral to understanding our users better and to designing accessible and equitable spaces, services, collections, and a more equitable profession.
As I attended the different sessions and viewed the posters, there was something for everyone and I met and learned about people from various walks of life and what led them to attend this conference. The sessions were chock full of information, but it made for great discussions with others during the breaks and meal times. An unintended benefit was the workout I got due to the repeated raising of my arms to take pictures of slides with my iPad as well as hopping from room to room in an attempt to glean as much as I could from different presentations.
It's challenging for me to say what the future of assessment looks like, but I'll make an educated prediction based on what I garnered throughout the conference. The culture of assessment is here to stay. We must strive to stop every so often and conduct an audit of ourselves. Our future is bright because we strive to provide the highest level of service to all library users: appropriate and usefully organized resources, equitable service policies, equitable access, and accurate, unbiased, and courteous responses to all requests, in keeping with ALA's Code of Ethics.
This conference has greatly impressed upon me that assessment is truly a critical component of any work that we do. It should be embedded into everything we do from the beginning, and it doesn't have to be pricey, time-consuming, or complicated. It can be beneficial whether you are supporting your liaison areas or evaluating the continued feasibility and sustainability of programming and projects, and it can show how to tweak outreach efforts or the use of social media and measure their impact on services and resources. I will be reading many of the papers, especially those from the sessions I couldn't attend. The questions I will work on answering are: "What are we doing to help our users? How do we move from better to great? From equality to equity?" With this in mind, I believe that library assessment plays an integral part in improving not only library projects but the quality of libraries as a whole.
LAC 2018: A Long Way to Grow, but a Good Start
Zhehan Jiang, PhD, Assistant Professor, The University of Alabama
As a first-time attendee at LAC 2018, I was completely astonished by the swarm of library professionals in the house: it is hard to believe that such a great proportion of librarians dedicate themselves to assessment work, which tends to be overlooked among broader library research topics. The prolific presentations at the conference show the dedication of librarians, and the exchange of both research findings and operational practice among attendees creates great space for library assessment to improve.
The conference started with a great presentation by Paul-Jervis Heath, who shared observations from a project on renovating library space. The results were fascinating: he approached the problem from an unconventional perspective and, in the end, successfully attracted more students to use the space for learning. Similarly, presentation sessions such as "Measurement and Measures Indicators" and "Methods and Tools" showed innovative approaches to assessment-related tasks and research. In particular, many presentations adopted cutting-edge technologies such as Tableau, Google Analytics, and EZproxy log files. All of this indicates that librarians have a good sense of how to apply new toolkits to their work.
Problems, however, cannot be ignored, as they showed a systemic pattern across many presentations. These problems can be traced back to the fact that librarians generally do not have rigorous doctoral-level training; for example, substantial inferences were made without robust statistical tests, or without satisfying the assumptions of the models used. In other cases, the research design was inappropriate and therefore led to doubtful conclusions. Pre-conference workshops could have cultivated the scientific mindset of librarians who are responsible for conducting research and using its results to guide library functions, but unfortunately LAC 2018 did not offer relevant sessions.
As a quantitative methodologist, I see both challenges and opportunities in the conference. LAC provides a great platform for librarians to learn from each other, which will eventually benefit the entire library assessment ecosystem.
Assessment That Matters: Ensuring Informed Decision-Making for Maximum Impact
Jean Sarurai Kanengoni, PhD Student in Library and Information Science, School of Information Science, University of Illinois at Urbana-Champaign
Libraries in Africa tend to be poorly funded and are normally the first to be impacted when funds are cut or employee positions eliminated. As a PhD student, I am focusing on how libraries in Africa can demonstrate the value of their work and gain more support from their stakeholders, donors, and communities through library assessment. Attending LAC was an opportunity to learn from practising librarians.
Paul-Jervis Heath, in his opening keynote address, made me aware that beyond collecting data and stories about the community's use of the library, it is important to observe patrons actually using the library. In observing library patrons, I will come to understand their values, their motivations, and what is important to them, and thus be able to design library programs and services suited to the community's needs. Mr. Heath made me aware that library assessment that is not grounded in an appreciation of patrons (by getting to know them and loving them!) will not be successful. Librarians need to be patient in designing library assessment tools and be prepared to make adjustments as the need arises.
The second lesson I learnt was from the second keynote address. For my dissertation, I have been contemplating designing a tool to measure the impact of a children’s reading program in Zimbabwe. I was planning to compare the children who participate in the reading program against the children who do not participate in the reading program. I have since changed my mind, as I learnt the best way to measure impact is to assess the children participating in the reading program over a period of time. In this way it’s possible to trace the progress, or not, of the reading program.
LAC was also an opportunity to meet and connect with experienced librarians who could provide advice during my dissertation process. I met Liz Ozbum, the assessment coordinator from Utah State University, and learnt of the Performance Measurement in Libraries conference to be held in the United Kingdom. As Zimbabwe is a former British colony whose library practices are closely linked to the British system, that conference will be of interest for my dissertation, and I will apply to attend. I also met Susan Beatty from the University of Calgary, and we discussed how to separate the effectiveness of library services and programs from factors outside the library's influence. I obtained the cards and contact details of both librarians and have been in communication with Liz; I will be reaching out to Susan as I develop my dissertation proposal. I also collected cards from poster presenters, noting the poster numbers, as there were numerous posters presented.
LAC is vital as a forum to learn from experienced librarians and network for the development of programs relevant to the African context. The conference helped me understand the importance of assessing patrons over a period of time, and to be continuously assessing the tools of performance evaluation.
LAC 2018 Reflection
Leni Matthews, User Experience Librarian, University of Texas at Arlington
I enjoyed the variety of sessions offered at this year's conference. From distributing surveys to conducting hands-on user experience research, there were takeaways for all librarians. All sessions had students in mind and offered ideas for improving libraries as a whole.
The sessions also had the common thread of collaboration. Be it in the library, on campus or working with outside institutions, assessment should be done with partnerships in mind. As a whole, the sessions considered how collaboration strengthens the idea of making assessment on-going. I believe this distributes the responsibility while allowing others to take part in the process. This collaboration will also help us to see our community from different angles to better inform our assessment practices.
The posters were a great addition to the conference. They gave me an opportunity to speak more in depth with people about their work. I wish there were more opportunities for extended conversations during the conference. Sometimes when we return to our institutions, we resume our honey-bee work and lose touch with the conference's stimulation.
A lot of the assessment we do happens behind closed doors, away from the people we are assessing. We become disengaged from the people and more engaged with the data about the people. I hope user experience (UX) methodologies get us back in tune with the people we are serving. Paul-Jervis Heath demonstrated great examples of getting in tune with our patrons: observing bakers and then coming up with an improved design for the measuring cup, and observing students' use of seating space in a library and then adding plants to create privacy. These are great lessons for us to always be aware of, and part of, our library community. Sometimes we have to be where our patrons are and use the space the way they do to serve them better. The Space sessions, along with the User Experience sessions, discussed being more involved, or rather gathering data from being among patrons, and I think we need more of this.
There was so much to learn at this conference that it was overwhelming at times to take everything in and choose sessions to attend. However, it’s always good to see my colleagues making a difference. I look forward to reaching out to some of them to learn more.
LAC 2018 Reflection
Kyung-Im (Kim) Noh, Data Analyst, Harvard University
The conference program was organized in a way that let attendees plan and navigate sessions easily and understand the scope of library assessment more clearly. Many sessions, especially those on library spaces, diversity and inclusion, and building a community and culture of assessment, were inspiring and provided me with directions for the library. I gained practical knowledge and ideas from the sessions on library data modeling, mining, and visualization, and I learned about the value of the academic library from the sessions on library contributions to student success and scholarly productivity.
Changing careers from institutional research (IR) to academic libraries was a big transition and a learning curve that allowed me to learn about academic library systems and operations. The Library Assessment Conference was a great opportunity for me as a new library professional to learn about the trends and best practices of library assessment as well as to network with highly enthusiastic and skilled library professionals. I look forward to engaging more in the 2020 Library Assessment Conference.
A Call to Action: Reflecting on the Library Assessment Conference
Marisa Ramirez, Archival Processing Assistant, Loyola Marymount University
As a first-time attendee, an entire conference dedicated to library assessment was perplexing. What would it cover and what could I bring back to my workplace? Fortunately, the orientation for new attendees was welcoming and informative. I quickly realized that we were called to action, a sentiment expanded upon in the upcoming keynote speeches. An “action plan” handout at the orientation asked us our intent, how we know we’ve succeeded, as well as action steps, resources needed, responsible parties, and a timeframe. This would become a helpful guide when thinking of how I could connect what I was hearing in the panels with what I could achieve at my own institution.
In his keynote address, Paul-Jervis Heath of design consultancy Modern Human offered three simple yet effective pieces of advice that attendees could keep in mind when taking on the daunting task that is assessment: fall in love with your users, move purposefully and make things better, and treat everything you design as an experiment. Later that day I attended sessions on the topics of "Value and Impact" as well as "Digital Libraries." In these sessions, I found many examples of library professionals moving purposefully and making things better for their users, whether by assessing textbook affordability options or revising policy for digital repositories.
The following day featured a keynote panel that addressed diversity and inclusivity, a topic more easily discussed than implemented. The panel asked attendees, "How do we go from exclusive club to inclusive organization?" Sessions I attended that day attempted to address the issue, with topics including "Diversity, Equity, and Inclusion" and "Non-Traditional Users." The former looked at power structures, gender, and participatory design, while the latter looked at efforts to serve populations including first-generation students and undergraduate non-users, offering helpful links and tips.
The final day saw a panel on human subject-based research and asked questions such as “Have we reached a point where the drive for assessment and evidence-based decision making has created a need for a new professional code of ethics?” and “How should libraries prepare, support, and advise library professionals in an increasingly research-oriented environment?” These are important questions to consider not only for those in assessment or user experience positions but for all information professionals looking to improve their institution. I attended the session immediately following on “Measurement and Measures Indicators” which delved into specific tools used for thorough and effective assessment.
A highlight of my conference experience was the poster sessions, which allowed attendees to interact directly with presenters in a less formal manner. Poster topics were divided into six categories: Facilities & Space, Outreach & Services, Teaching & Learning, Collections, Methods & Data, and Organizational Issues. As a poster presenter myself, it was great to share the assessment work my institution is doing while also learning what works, and what doesn't, for other institutions.
My attendance at the Library Assessment Conference was meaningful and I am grateful for the opportunity to attend. It is an honor to be included in the impressive group of information professionals chosen as Travel Award Recipients, especially as a paraprofessional. The individuals I met and interacted with gave me much to aspire to and I look forward to continuing in this industry with a keen eye towards assessment.
LAC 2018 Reflection
Danielle Rapue, Systems & Assessment Librarian, Pasadena City College
The world of library assessment popped off the pages of the academic journals and online discussions of assessment practices that I browse, and sprang into 3D for me during the 2018 Library Assessment Conference. As a community college librarian (shout out to the rest of the 5% of us from community colleges present this year!), I am the only librarian at my institution with assessment duties as a point area. For me this means being a one-person department, seeking out studies, methodologies, and productive ways to approach assessment within my library. Coming together with other assessment-minded library and information professionals filled a gap of community within this area of librarianship that I didn't fully realize I had.
Hats off to the organizers for creating welcoming experiences and opportunities to build community with one another. The first-time attendee session the night before the conference kicked off was an excellent primer that helped me understand how to make the most of the conference and meet a few people before the droves of attendees piled into that same room the following day. On day 1, in the opening session, the organizers made time to share conference data and statistics. To me that was a refreshing difference from previous conferences I had attended, which I made sure to express on Twitter! While it might not have been intentional, I felt this also added to the sense of community. If you have ever had an assessment- or data-driven conversation with a non-assessment person and watched their eyes glaze over, this was exactly the opposite. By making space and time to share data reflecting the representation of library types, previous attendance, and dollar amounts spent, the organizers confirmed that I was among people who value data and statistics as I do. Seeing the nodding heads and hearing the murmurs of those impressed by this share-out further built the sense of community, and it was only day 1, hour 1.
A true highlight of LAC 2018 for me was the poster sessions. While I did learn and grow from the presenter sessions themselves, the poster sessions allowed me to take in information, methodologies, and ideas at my own pace. I appreciated being given permission to take photos of the posters I was interested in so I could revisit the information later. Some poster presenters provided handouts and business cards, ready to take when they were away from their poster, which was also helpful. Some of you may still be getting an email from me!
The poster “Tracking Relationships and Measuring Professional Development” by Rebecca Stuhr, director of liaison services at the University of Pennsylvania Libraries, was of particular interest to me, since my colleagues and I have been actively exploring ways to approach liaison services, which naturally means I am thinking about how we can track what we do to demonstrate the impact of our approach. Upon viewing this poster, I instantly had ideas for how we might adapt their assessment approaches at my own institution in a feasible way that could work for our library.
Back in library school, reading the work of assessment rockstar Megan Oakleaf helped form my own opinions and practice of how I would eventually approach information literacy instruction, which is also part of my current position. Little did I know that years later I would have the opportunity to talk with her about assessment face to face! The library world is small, but imagine my excited surprise when I found out that my colleague, Pasadena City College Library Technician Joshua Hughey, had been selected to collaborate with her on new work on the assessment of critical librarianship, or #critlib as it is known on Twitter. Prior to the conference I knew Josh was in library school, but we hadn’t connected over assessment or critical librarianship practices. His poster and post-conference session with Megan, “Critical Information Literacy & Outcomes Assessment: Mutually Supportive, Not Exclusive,” helped me see a new way of approaching critical librarianship in order to create learning outcomes that are measurable. I have felt the impact that critical librarianship practices in the classroom have had on students, but now I have more knowledge and methods I can use to demonstrate that impact in a way that holds more validity in academia. I also have another colleague I can talk to about assessment, critical librarianship, and data and stats, who was there all along; it took realizing we were both traveling to the same conference in Houston for us to connect on that level.
LAC 2018 was truly a space of community, ideas, and growth. The conference provided a place for us to inspire each other in some sessions and to professionally challenge methods and approaches we aren’t always on the same page about. My gratitude for being given the means to be a part of that is immeasurable.
2018 Library Assessment Conference Summary
Lamonica Sanford, Assessment Librarian, Georgia College
One of the first things I did upon my return from the 2018 Library Assessment Conference in Houston, TX was give an overview of the conference to the director of my library. This was not strictly required, as our library holds flash conferences each semester where staff members share their experiences at conferences they attended the previous semester; I simply could not contain my excitement about the presentations and poster sessions I attended. This conference was truly a learning experience.
It would take me time to unpack all that I had learned and experienced. The sessions I attended were both informative and pertinent to my position as an assessment librarian. I tried to select a broad range of sessions covering collections, methods and data, organizational issues, library spaces, teaching, and library services. I was relieved to find out that the presentations and poster sessions would be posted online, allowing me to review resources from sessions I could not attend.
Being an attendee afforded me the opportunity to hear from presenters and meet attendees from a diversity of institutions, regions, backgrounds, and research interests. I was fascinated by how many of the presenters began their research journeys, and the presentations that followed, with a simple question or set of questions stemming from their day-to-day work. For example (among many I could mention), Liz Bernal’s presentation “Library Impact with International Rankings – One Library’s Continuous Journey to Figure it Out” grew out of a question posed by her university librarian about how the library could improve the institution’s international rankings. Her journey to figure out how her library could affect her university’s ranking was quite interesting. And while international rankings may not be a priority for smaller institutions, I appreciated the effort and methodology she used to answer the original question, and I gained knowledge about how rankings work and how I could review our own systems and processes for accuracies and inaccuracies to improve efficiency and effectiveness.
As a new assessment librarian, I left the conference with a number of ideas of how my library could demonstrate its positive impact on my university and its stakeholders. I was impressed with how presenters utilized both quantitative and qualitative methods to explore library impact in diverse ways.