Stopwatch Session 1 Recording

Stopwatch Session 1

Wednesday, November 16, 2022 8:30 AM EST

Online Video and Relevance - How Users Choose Streaming Videos

Erin DeWitt-Miller (University of North Texas)

Keywords: streaming video, information behavior, relevance

We are inundated by online video – consumer platforms such as Netflix, Amazon, and Hulu, as well as platforms like TikTok and YouTube, are rapidly changing the information landscape. As online video becomes more central to information-seeking behavior, it is also increasingly important to the library world.

This poster is a visual representation of an extensive research project into how people use online video to find information. Based on an analysis of both interview and survey data, the findings are of practical interest to librarians who want to better understand their users, as well as to vendors and distributors of video content. This study of user information behavior can inform librarians doing information literacy instruction, making collection decisions, or designing events and outreach. It can also provide insight valuable to UX designers and others who build and license video platforms for public, school, and academic libraries.

The study specifically looks at relevance criteria to determine which characteristics of online video users rely on to decide whether a particular video is a valid information source. For example, if one searches YouTube for videos explaining how to poach an egg, there are thousands of results; how do our users decide which video is ideal? To understand our users and to provide information effectively, library professionals and vendors should be able to answer this question. This poster will provide insight that helps us do so.

---

Avoiding the Sunk Cost Fallacy: A Roadmap to Making Changes

Ellie Kohler (Virginia Tech Libraries) & Nitra Eastby (Virginia Tech Libraries)

Keywords: project management, strategy

What happens when you’ve spent a lot of time and money on something that just isn’t working? When do you start exploring other options? How do you decide to let go? This is an issue that every library has to face at some point. In Collections and Technical Services this is particularly relevant as we face shrinking budgets, personnel reductions, and increasingly complicated subscription contracts.

Recently, the University Libraries at Virginia Tech were facing several decision points, including whether to transition an in-house usage database to a hosted solution. As we had years of effort and resources in the in-house database, we wanted to be clear about exactly what was and wasn’t working. We created a roadmap to help us think through this process methodically.

Whether you are contemplating big changes like an ILS migration or have simply grown fed up with that one pesky spreadsheet the whole department uses, it’s important to have a framework in place as you move from contemplation to action. In this presentation we will walk through the decision-making process, from identification of a problem area, through evaluation of that area, to the ultimate decision and follow-through. We will provide a template and suggest how to adapt it for different project sizes. Using actual recent examples, we will also discuss how we came to view our decision-making processes as a framework, and lessons learned along the way.

---

Assessing the Assessment: An Examination of Electronic Resource Usage Data and Its Applications

Richard Wisneski (Miami University)

Keywords: Usage Statistics, Collection Assessment, Electronic Resources

When presenting collection managers and administrators with statistics on electronic resources, it is important to understand which usage data points are most useful, how frequently such data is needed, in what presentation formats, and for what purposes. This presentation examines usage data commonly collected in electronic resources librarianship, including COUNTER-compliant and non-COUNTER statistics. In particular, we will argue for scrutinizing the data collected: what stories is data collection ultimately trying to tell, and what are the ramifications of those stories? Examples of data points and their potential impact on collection management decisions will be included.

In April 2022, EBSCO published a white paper, "Analytics Play a Key Role in Campus Library Operations." The researchers surveyed academic librarians in fall 2021, receiving 196 responses, and state: "In many libraries, there are either too few analytics tools in use or else librarians aren’t happy with the current tools they have." This presentation expands on EBSCO's findings. Collection management librarians from Ohio colleges and universities were surveyed in summer 2022 on their use of e-resource statistics: specifically, how frequently they collect data, for what purposes, and in what formats. This presentation will discuss the survey results and propose revisiting what data e-resource librarians collect, how often this work should occur, and what purposes it serves beyond retention and cancellation decisions.

---

Why Haven’t Bibliometrics Measured Up in the Humanities?

Jeffrey Staiger (University of Oregon)

Keywords: humanities; bibliometrics; impact; research; knowledge

How do we conceptualize the nature of research in the humanities? What are its distinguishing characteristics? These age-old questions have recently returned with new force as bibliometricians have had to acknowledge the inadequacy of their methods with respect to work in the humanities. The resultant call for a clearer conceptualization of research in the humanities has led to “bottom-up” methods that seek to define quality indicators through interviews with humanists in different disciplines. This approach represents an important step in the effort to measure the impact of publications in the humanities, yet as long as it is informed by a conception of research derived from scientific practice, it will not be responsive to what makes humanistic inquiry distinctive: the centrality of interpretation.

This paper will offer a robust account of humanistic inquiry, illuminating the role played in it by interpretation, which to science, including library science, will naturally seem subjective, uncertain, and elusive. Rejecting this external view of interpretation, the paper will show why humanist interpretation is not strictly repeatable in the manner of an experiment and yet can achieve different levels of explanatory power and validity. This consideration will necessarily include a rethinking of such terms as “research,” “method,” “results,” and “knowledge” in a humanistic context. Attendees of this session will thus gain a deeper appreciation of the nature of humanist inquiry and the reasons it resists the bibliometric approaches viable in other sectors of academia. Such an appreciation is a prerequisite for developing duly sensitive methods for measuring contributions to knowledge in the humanities.

As Head of Library Data Analytics and Assessment for the University Libraries at Virginia Tech, I try to provide leadership for data collection and analysis efforts and promote data-informed decision-making. My current areas of research interest include developing interactive visualizations, ethical data collection, methods of data communication, and text and data mining.

Jeff Staiger holds a PhD in English from Berkeley and an MLIS from Rutgers. He publishes scholarly articles in the fields of library science and literary criticism, as well as critical and personal essays for literary reviews.

Richard Wisneski is the electronic resources librarian for Miami University in Oxford, OH. He has worked in technical services and collection management for other academic libraries throughout his career in librarianship.