BIBFLOW was a two-year project (2013-2016) of the UC Davis Library and Zepheira, made possible by a grant from the Institute of Museum and Library Services (IMLS). This research project investigated the future of library technical services, i.e., cataloging and related workflows, in light of modern technology infrastructure such as the Web and new data models and formats such as Resource Description and Access (RDA) and BIBFRAME, the encoding and exchange format then in development by the Library of Congress.
BIBFLOW’s focus was on understanding both the feasibility and the impact of the adoption of Linked Data for library workers and workflows.
The project’s official title was “Reinventing Cataloging: Models for the Future of Library Operations.” Our hypothesis was that, while these new standards and technologies are sorely needed to help the library community leverage the benefits and efficiencies that the Web has afforded other industries, we cannot adopt them in an environment constrained by complex workflows and interdependencies within a large ecosystem of data, software, and service providers that are change-resistant and motivated to continue with the current library standards (e.g., the Anglo-American Cataloguing Rules (AACR) and MARC). Research was therefore needed on how research libraries should adapt our practices, workflows, software systems, and partnerships to support our evolution to new standards and technologies.
BIBFLOW comprised a research agenda and a set of activities to advance our community’s understanding of the resource description landscape – both its current state and the desired future state – and to begin developing a roadmap that the library community can reference when planning investments and changes over the coming years. The area of greatest focus for the project was academic library technical services processes, including acquisitions, licensing, cataloging, processing, and digitizing. But we also looked at the impact of the new standards and technologies on related operations that rely on the same library data, such as circulation, interlibrary loan, and public catalogs. In fact, it is this interdependency across library functions that makes changing any part of our local environment so difficult, for fear of damaging others: the many benefits achieved by consolidating on a single data format and software system (i.e., an Integrated Library System) have become a constraint on our flexibility in rapidly changing times, often requiring years of planning to replace a key software system or convert huge amounts of legacy data. And because technical services are the data engine that drives most other library functions and operations, understanding their future allows us to be more strategic about investments and planning for all of our activities.
As part of this research, we collaborated and communicated with partners across the library data ecosystem – key organizations like the Library of Congress and OCLC, library vendors, standards organizations like NISO, software tool vendors and commercial data providers, and other libraries planning for change, such as the BIBFRAME “early experimenters.” While the project did not convene meetings or conferences to bring these partners together, we leveraged existing projects (e.g., those run by NISO) to do so, coordinated with stakeholders virtually, and did extensive outreach to gather community feedback on what we were learning. Through this combination of research, collaboration, and outreach, the project created a roadmap for the community, and particularly for academic research libraries. The roadmap is designed so that, as new data models, standards, workflows, and practices emerge and evolve, it can be continuously updated with new roads and milestones.