Preface

Demand-driven acquisitions (DDA) describes any acquisitions process driven by the desires of patrons or their usage of materials rather than by predictive processes such as package purchases and librarian selection. DDA is most commonly used to refer to the catalog-integrated ebook programs that emerged from major vendors clustered around 2010–11, but the idea of DDA can be applied to many workflows that examine usage to inform purchasing across all formats, including digital and physical monographs, serials, media materials, and other resources. For clarity, this book will use the abbreviation DDA to refer to all of these processes, though patron-driven acquisitions (PDA), purchase on demand, patron-initiated purchasing, and customer-based collection development are also commonly used in the research on this topic.

This volume will begin with a short history of DDA programs in libraries and their place in the landscape of acquisitions today. We will go on to discuss options for beginning and adjusting DDA programs with an eye toward evaluating them. This volume is intended to assist librarians and library professionals in assessing existing DDA programs, expanding their DDA with new processes, and setting up DDA programs for the first time. We will accomplish this through an examination of the research, organized around several assessment criteria, and then discuss the implications of DDA for different types of libraries. There are many ways to assess acquisitions strategies, including cost, the immediate use of resources, the permanent value of the collection, and factors that affect processing workflows. DDA research has examined each of these factors in depth and across a variety of library types and user groups.

When the librarians at the University of Arizona were developing metrics for assessing their own DDA program, they used five categories to contextualize the data they drew from the DDA records: financial metrics, resource metrics, performance metrics, patron metrics, and usage metrics. This grouping helped them establish goals for their program and then assess progress toward those goals.

The financial metrics explored cost per use as well as other cost factors, such as the processing cost to the institution and cost per Library of Congress subject area. Patron metrics focused on both patron satisfaction and patron actions in the data. Performance metrics examined how well vendors met their own stated service standards. Usage metrics measured circulation and use. Resource metrics examined how well the collection met collection development standards and how likely it was to remain a healthy and well-used collection into the future (Dewland & See, 2015). This represents a comprehensive, if basic, framework for creating a DDA assessment program, and existing DDA research can provide context for beginning evaluation frameworks in each of these areas.

The examination of the research in this volume will loosely follow the University of Arizona model by thoroughly investigating the research on the financial aspects of DDA, collection standards and diversity issues, usage, and library issues such as preservation and workflow. These sections should provide an analysis of the research in each of these areas and serve as a foundation for the evaluation of individual DDA programs. Though there is not yet enough research to make conclusive judgments about how DDA programs should be set up and run, the case studies in these sections will be useful for comparison when evaluating existing DDA programs and as models for new programs.

Assessment is often not considered until DDA programs are already well underway, but goal setting and evaluation can be useful at any stage of the process, especially during planning. This volume will provide a good foundation for DDA assessment as well as for establishing new DDA programs using existing data and positioning them for later assessment.
