This was my first time attending ALA’s Midwinter Conference. I had a great time rooming and socializing with Susan and Roz. During the conference, I ran into some old colleagues (Elizabeth Novicki, Jim Galbraith & Debbie Nolan), made new contacts with other librarians, and heard some interesting talks, which are described below.
Friday night, I attended the Anthropology and Sociology Section’s (ANSS) social at the Lansdowne Pub, which is in Boston’s Fenway Park neighborhood. I met and spoke with several librarians who work with or liaise to their respective universities’ anthropology departments. Two individuals with whom I spoke knew Lauren C. from her employment at Old Dominion and Emory, and another individual whom I sat beside at dinner knew Lauren P. from ALA committee work. Truly, the library world is a small one.
Saturday morning, Alasdair Ball (British Library), Ruth Fischer (R2 Consulting) and Brian Schottlaender (UC San Diego) spoke on redesigning Technical Services workflows with regard to libraries’ costs and the value delivered to libraries and patrons. As Head of Acquisition and Description, Mr. Ball reported that his department processes around 1 million items per year. He characterized the UK’s National Bibliographic framework as one with high duplication of effort, a fragmented network of stakeholders, multiple standards and formats, increasing demand for shelf-ready materials and records, and slowness to change. Within his organization, there is a focus on adding value to research and providing collaborative workspace and tools for researchers. Acquisition and Description is viewed as necessary, but as a back-office set of functions with a high cost. Among the operational challenges he sees: contributing to the library’s expanding agenda with no increase in resources (human or monetary); optimizing the balance among cost, quality, and speed of service; outsourcing of the CIP programme; redefining and streamlining workflows and process models; and determining where the British Library can add value.

Ruth Fischer spoke on the Study of the North American MARC Records Marketplace, which she and her partner were commissioned by the Library of Congress (LC) to research and write. They conducted two online surveys, one with libraries (972 participants from all types of libraries) and one with vendors (70 participants), to investigate the current MARC records marketplace.
Results from the libraries’ survey found:
- Everyone prefers LC records
- 80% of libraries edit records in OPAC, but only half upload edits to their bibliographic utility
- 78% of libraries are unaware of restrictions on MARC record use and distribution
- Backlogs exist everywhere and are increasing (Largest backlogs are videos and DVDs; second largest are English language monographs)
Ms. Fischer’s report estimated that there are 34,000 original catalogers holding the MLS and another 34,000 copy catalogers. If each MLS cataloger created just one new record each workday (200 in a year), 6.8 million original records could be created per year. It appears libraries have capacity in terms of the number of catalogers. The question, then, is why are there backlogs?
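Fischer’s back-of-the-envelope capacity estimate is easy to verify with a quick sketch (all figures are the report’s, as summarized above):

```python
# Back-of-the-envelope capacity estimate from Fischer's report.
original_catalogers = 34_000   # MLS-holding original catalogers (report's estimate)
records_per_day = 1            # one new original record per cataloger per workday
workdays_per_year = 200        # the report's working assumption

potential_records = original_catalogers * records_per_day * workdays_per_year
print(potential_records)  # 6800000, i.e. 6.8 million original records per year
```

Which makes the backlog question all the sharper: the bottleneck is evidently not headcount.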
Two hundred organizations create, sell or distribute MARC records to North American libraries, with the largest number of vendors providing MARC records for e-books.
Ms. Fischer’s interpretation of her findings include:
- There is confusion in the market about the real cost and/or value of MARC records. Each year LC catalogs many titles that are not retained in its collections (i.e. the CIP program). By law, LC may not recover its cataloging costs. In essence, LC subsidizes the market, which in turn causes the undervaluing of MARC records.
- The market provides insufficient incentives to contribute original cataloging. New commercial entrants are screen scraping LC’s and other libraries’ websites, and are not hiring MLS catalogers.
- Most libraries and catalogers must believe that they create more value by modifying existing records (e.g. including pagination, changing or removing subject headings, adjusting call numbers, etc.) than producing original records.
Questions raised from Fischer’s study:
- How long will libraries rely on MARC as the primary format for bibliographic data? We are trapped by the ILS.
- What would be required to correct the economic structure of the MARC record marketplace?
- What would happen if MARC record creators and creators of other descriptive metadata insisted on recovering their costs?
- Why have we (i.e. catalogers) disincentivized ourselves if we have the capacity to create?
Fischer closed by saying catalogers need to determine what the concept of “good enough” means and start believing and incorporating it into workflows.
Brian Schottlaender spoke on the University of California’s Next-Generation Technical Services initiative, which has grown out of the last five years of community thinking. He stated that his library is freeing up resources in order to focus cataloging and other metadata description on unique resources. He believes administration must make a commitment to its employees, who are moving into new positions with new responsibilities, by providing them with the education and training to ensure their success.
As a member of the Cataloging and Classification Section (CCS) Recruitment and Mentoring Committee, I attended my first ALA committee meeting ever. This is a newly formed committee, and our charge is to recruit cataloging mentors and pair them with interested new and seasoned catalogers, as well as persons interested in cataloging. We will contact library schools to see if they have any students interested in becoming mentees, and are planning to send out a survey questionnaire to listservs to garner interest from potential mentors and mentees.
Saturday evening, I attended a screening of Alexander Street Press’ new product Ethnographic Video Online. This product is a partnership with Documentary Educational Resources (DER), whose founder John Marshall was an anthropologist and documentary filmmaker. John Marshall is renowned for his films on the !Kung San (Bushmen) peoples of the Kalahari Desert in Namibia; his first film, The Hunters (1957), became an instant classic of ethnographic film. DER’s films will comprise over 60% of the films in Ethnographic Video Online, which launches next month with 200 films. Its collection will eventually comprise 1,000 titles (750 hours of film). The product lets users create clips, playlists, and annotations and search for specific words within a film; every film is fully transcribed, with scrolling synchronized transcripts. Alexander Street Press is meeting with individual ethnographers/filmmakers who have unpublished footage to try to get their films into this database. I feel this product would be very useful to Wake’s Anthropology department and perhaps even the new Documentary Film program. I hope we will be able to get a trial of this product and, if deemed valuable by faculty from Anthropology and the Documentary Film program, purchase it. Afterwards, I met Susan, Roz and Elizabeth Novicki for a wonderful dinner at Legal Seafood.
Bright and early Sunday morning, I went to an OCLC Update Breakfast and spoke with someone about the problems I was encountering entering information into MARC cataloging records for Wake ETDs, specifically complex mathematical equations and subscripts/superscripts that aren’t numerical. I was told that some character sets are not supported, but that there may be workarounds for the subscript/superscript problem. The OCLC rep asked me to email him some examples of my problems, and he would get back in touch. I also found out that in July 2010 OCLC will release its Digital Collections Gateway product to any OAI-compatible repository (of which DSpace is one), which will simplify the ETD cataloging process even more and allow for more visibility of these unique items. Hooray! I ran into Jim Galbraith, who is now working for OCLC, at the breakfast and also met a librarian from Brigham Young University who knows Derrik. Such a small world!
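For the curious: “OAI-compatible” means the repository speaks OAI-PMH, a simple HTTP protocol for harvesting metadata, which is what lets a tool like the Gateway pull ETD records out of DSpace. A minimal sketch of what such a harvest request looks like (the base URL and set identifier are hypothetical placeholders; the verb and parameters are standard OAI-PMH):

```python
from urllib.parse import urlencode

# Hypothetical DSpace OAI-PMH endpoint -- substitute a repository's real base URL.
BASE_URL = "https://repository.example.edu/oai/request"

# Standard OAI-PMH ListRecords request for Dublin Core metadata.
params = {
    "verb": "ListRecords",
    "metadataPrefix": "oai_dc",   # unqualified Dublin Core; supported by all OAI-PMH servers
    "set": "com_123456789_1",     # hypothetical set id scoping the harvest to an ETD collection
}
request_url = BASE_URL + "?" + urlencode(params)
print(request_url)
```

The server answers with an XML stream of records, which is exactly the kind of raw material a cataloging gateway can map into MARC.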
The rest of Sunday’s sessions included attending another session where Ruth Fischer spoke more in depth about R2’s report on MARC records, an Out Of Print (OOP) Discussion Group where the topic was digitization on demand (James Lee of Brandeis University spoke about the process and his school’s involvement with the Boston Library Consortium, a pioneer in the area of digitization on demand), and the Anthropology Librarians Discussion Group.
Before leaving for home on a snowy Monday, I attended two more sessions at the convention center. The first was the Publisher-Vendor-Library Relations Forum. Beth Bernhardt (UNCG) started off by saying that the NC legislature has mandated that by 2014 UNCG’s enrollment will be 24,000 students. UNCG’s library is utilizing patron-driven acquisitions in building its e-book collection; changes in user expectations, librarians’ roles, and researchers’ needs are some of the factors behind this new model of collection development. In April ’09, the library began this new model with MyiLibrary, choosing computer science as the subject area. 1,144 e-books that matched the library’s profile were loaded into the OPAC, and 70 e-books were ordered at a cost of $7,010. They are expanding their profile to include physics, chemistry, nursing and business, and plan to compare what professors and students purchase. The first access to a title incurs no cost; the second look triggers a purchase. With the Life Sciences Library e-collection, they pre-selected a set of 23 books but loaded all 750 titles’ MARC cataloging records into the catalog. These books are very pricey due to the subject areas (i.e. nursing, anatomy, anesthesiology, and nutrition). Important things to take into consideration when allowing patron-driven acquisitions include budget, deposit accounts, price limits, real-time invoicing, and when to cut off access/visibility.
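The “second look triggers a purchase” rule Bernhardt described boils down to simple counter logic, which can be sketched as follows (the title, price, and price cap are hypothetical; the free-look threshold is the rule as she described it):

```python
from collections import Counter

FREE_LOOKS = 1       # first access to a title is free; the next one triggers a purchase
PRICE_CAP = 150.00   # hypothetical per-title price limit from the library's profile

access_counts = Counter()
purchased = set()

def record_access(title, price):
    """Count one access; trigger a purchase on the second look if under the cap."""
    access_counts[title] += 1
    if title not in purchased and access_counts[title] > FREE_LOOKS and price <= PRICE_CAP:
        purchased.add(title)
        return f"PURCHASE {title} (${price:.2f})"
    return "free look"

print(record_access("Intro to Compilers", 95.00))  # free look
print(record_access("Intro to Compilers", 95.00))  # PURCHASE Intro to Compilers ($95.00)
```

The real systems layer deposit accounts and real-time invoicing on top of this, but the trigger itself is no more complicated than the counter above.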
Lindsey Schell (U of TX-Austin) spoke about their experience with EBL. Currently their patrons have access to 70,000+ titles, but have purchased 4,000. They too dumped all of EBL’s MARC records for their titles into their catalog, but this year began removing records for titles never viewed in the initial 12 months to reduce cost exposure. Her university also incorporates patron-driven print approval acquisitions: the library downloads MARC records for publishers and subjects on a refined approval plan into the OPAC and lets patrons and subject specialists decide which titles are actually purchased. Books are express shipped and arrive shelf-ready.
Next steps for this model of acquisitions involve analyzing patron purchasing and usage by LC call number and publisher to target specific areas for e-book and print delivery.
Due to patron-driven acquisitions, adjustments in the Technical Services department have occurred and include:
- Automate wherever possible
- Eliminate creating work elsewhere
- Free up staff to work on library’s priorities that can’t be automated or outsourced (i.e. e-resource management, digital content, unique collections)
- Eliminate serials check-in (some people are freaking out about this)
- Move monographic series standing orders to approval vendor
- Discontinue label production for periodicals (people can read the titles unless they are not in the Roman alphabet)
- Eliminate approval review shelves
- Reduce the number of gifts received
- Discontinue paper book plates for non-endowment donations
- Cut binding quotas; redirect funds toward digitization
Judy Luther talked about developing a common platform for university press e-book distribution. The Mellon Foundation has awarded a grant to four university presses (NYU, Rutgers, Temple and Penn), and these presses are working with consultants to help develop a business model suitable to a university press consortium. They are looking to establish a “university press” brand and achieve economies of scale through collaboration on technical, financial, and practical challenges. Twenty-nine librarians were interviewed and core markets were identified (ARL, other Ph.D./masters programs, Oberlin Group). These were exploratory conversations designed to frame library practices, expectations, concerns and trends. Key issues included pricing, functionality, digital rights management (DRM), and the ability to select print and e-book purchases. An online survey of 1,000 librarians (30% response rate) was conducted to test conclusions gathered from the interviews. Purchase models must be evaluated. Vendors’ platforms need assessment. How should a university press consortium operate? The challenge, according to Ms. Luther, is serving our users well: libraries want content for their users and then want presses to get out of their way. Right now platforms are not conducive to that; one can only print 10 pages at a time. The consultants’ report is due next month, and the presses will determine if they want to move forward. If so, further planning will be required.
My last session was on bibliographic mash-ups, and once again the concept of redundancy in our data and workflows was raised, this time by opening speaker Renee Register of OCLC. For libraries, most of the production work is performed at the end of the publication cycle, upon receipt of a published item. On the publishers’ end, bibliographic data evolves over time, beginning months before publication and sometimes ending years later, with many people contributing data. Inefficiencies and redundancy are common in metadata exchange, and different standards make it even harder to share. OCLC is currently creating authority control and a mapping between BISAC subject terms (seen in Amazon) and Dewey Decimal Classification. We need ways in our systems to allow metadata to grow over time; metadata records are living things, from the information supplied by publishers to end-user-applied headings.
Karen Coyle spoke about the Open Library, whose goal is to create one web page for every book ever published. It is not a library’s catalog, and it includes all the e-books in the Internet Archive (Open Content Alliance, Google, public domain). The head of the project came from Flickr. The database does not have records in the traditional sense; it uses semantic web concepts called types (e.g. author name, birth date), all of which are equally important. Each type has properties; one can add properties without disruption, nothing is required, and everything is repeatable. The database is based on wiki principles: all edits are saved and viewable, and anyone can edit and add types and new properties. Sources of data include LC, Amazon, ONIX (publisher data), numerous libraries, and individual users (people can add their own books, such as vanity press publications). There are some data problems, as this is an experiment in non-librarians taking library data and using it. Examples of problems are:
- Names: no inversion, no birth or death dates
- Inclusion of initial articles in titles (e.g. The Hobbit); alphabetical order is not important here
- Series need normalization
- Differences in publication product dimensions: L×W×H vs. the height in cm used by libraries
There are page views for books and authors (similar to WorldCat Identities). LC subject headings are not used as such; segments of LCSH are broken apart (i.e. no “dash dash” subdivisions), and each subject heading has its own page. The project is currently in beta but is coming out in February 2010.
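Coyle’s “types and properties” model, where nothing is required and everything is repeatable, maps naturally onto an open-ended key/value structure. A rough sketch of what such a record might look like (field names and values are illustrative, not the Open Library’s actual schema):

```python
# A "record" is just a bag of typed properties: any property may be absent,
# and any property may repeat (hence the list values).
record = {
    "title": ["The Hobbit"],              # initial article kept; no filing rules applied
    "author_name": ["J. R. R. Tolkien"],  # direct order, not inverted, no dates
    "subject": ["Fantasy fiction", "Hobbits (Fictitious characters)"],
    "publish_year": [1937],
}

# Wiki-style edit: introducing a brand-new property disrupts nothing.
record.setdefault("cover_artist", []).append("J. R. R. Tolkien")

# Repeatable: another subject is simply appended, not squeezed into a fixed field.
record["subject"].append("Middle Earth (Imaginary place)")
print(sorted(record))
```

Contrast this with a MARC record, where fields, indicators, and subfields are tightly specified; the flexibility above is exactly what makes the data problems in the list possible.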
Kurt Groetsch of Google closed the session by speaking on the challenges Google Books has encountered with metadata reuse and matching, and the challenges of working with multivolume works. I got a little sleepy during his segment so I don’t have many notes for this part of the program.
I then met up with Susan and Roz. We took the Gale sponsored shuttle (very nice service) to the airport, got an early flight to Newark, and then waited for several hours in the magical place that is the Newark airport before we caught our flight home to Greensboro. All in all, my experience at ALA’s Midwinter Conference was a good one.