Professional Development

Leslie at MLA 2015

Monday, March 23, 2015 8:26 pm

Lots of good presentations at this year’s meeting of the Music Library Association in Denver. As at ALA, winter weather prevented a number of colleagues from attending, but we were able to Skype presenters in most cases, and for the first time, selected sessions were live-streamed. The latter will be posted on the MLA website.


In a session on “digital musicology,” several exciting projects were described:

Contemporary Composers Web Archive (CCWA). A Northeastern consortium project in progress. They’re crawling and cataloging composers’ websites, and contributing the records to OCLC and the Internet Archive. The funding is temporary, so here’s hoping they find a way to continue this critical work of preserving the music and music culture of our times.

RISM OPAC. The Répertoire International des Sources Musicales is the oldest internationally organized music index (of manuscripts and early printed editions), but only a small portion has so far been made available online. The new online search interface they’re developing retrieves digital scores available on the websites of libraries, archives, composers, and others worldwide. They expect to have 2 million entries when national inventories are completed.

Music Treasures Consortium (MTC). A similar project hosted by the Library of Congress, it links to digitized manuscripts and early printed editions in conservatories, libraries, and archives in the US, UK, and Germany. It’s modeled on an earlier project, the Sheet Music Consortium (hosted by UCLA).

Blue Mountain Project. Named after a Kandinsky painting representing creativity, this Princeton project, funded by an NEH grant, aims to provide coverage of Modernism and the Avant-Garde in arts periodicals 1848-1923. References to music in these sources are often fleeting, so there is a need for enhanced “music discovery.” The presenter discussed the challenges of digitizing magazines: the mix of text, images, and ads; the multiple languages of periodicals in this project; and variations in the transcription/spelling of names (they plan to cross-index to VIAF, the international authority file).

In the Q & A period, discussion centered on the global importance of projects such as these, and the concomitant need for best-practices standards (including a requirement to link to VIAF) and multi-language capabilities in metadata schemas.


Now that the ACRL Framework has replaced learning objectives with “threshold concepts,” music librarians have begun taking first stabs at interpreting these for their discipline:

Scholarship as a conversation = performance as a conversation. Most music students enter college as performers, so this can serve as a base for scaffolding. One notable difference: performance lacks a tradition of formal citation — might some way be found to codify the teacher/student oral tradition by which the performing arts are transmitted?

Authority as constructed and contextual = performers as authorities (Performer X as a leading interpreter of Composer Y’s works); also, the changing of performance practices over time; learning to listen with a critical evaluative ear.

Information creation as process: understanding the editing process for scores, and also for recordings and video (vs. live performance).

Research as inquiry: every performing-arts student who spends long hours in practice and rehearsal is familiar with the concept of an iterative process — an excellent jumping-off point for understanding research as an iterative process.

Searching as strategic exploration: this has been related to musicians’ vexed relationship with library discovery interfaces that don’t work well for music retrieval! Resourcefulness and persistence are needed to meet performers’ information needs regarding specialized details such as instrumentation, key, range, etc.

Information has value = creative output has value. Understanding how the artist fits into the marketplace; the complexities of copyright as it applies to the arts.


The music library community has long been frustrated by issues surrounding music recordings released online but governed by EULAs (end-user license agreements) that prohibit institutional purchase. MLA and the University of Washington have recently received an IMLS grant to develop strategies for addressing these issues, “culminating in a summit with stakeholders and industry representatives.” On the agenda: EULA reform (developing standard language); preservation (given the industry’s appalling track record, perhaps the library community can create dark archives?); and public relations. Strategies being considered: developing an MLA best-practices document; creating a test case; and approaching either the smaller labels (who are generally more open to negotiation) or the big three (Sony, Warner, and Universal) directly, on the theory that if they agree, others will follow.

Another session on recordings and fair use discussed the best-practices movement. Noting that the courts, when confronted by new questions, have begun referring to community practice, many disciplines and professions are drafting best-practices documents. Unlike guidelines, whose specificity makes them prone to obsolescence, best-practices statements “reflect the fundamental values of a community” — which not only helps them better stand the test of time, but also results in more commonalities between communities, so that they reinforce each other, lending them more weight in the face of legal challenges. The NRPB (National Recording Preservation Board) recently completed a study that recommended such a document, and the ARSC (Association for Recorded Sound Collections) has a handbook forthcoming.


At a poster session, I learned about two surveys done at Kent State that queried the preferences of music and other performing-arts students regarding the materials they use. One survey noted the significant number of print resources that still occupied top places in a ranking of preferred materials: print scores were much preferred to e-scores (68% to 28%); ditto for books (80% print to 27% electronic); CDs were still used regularly. E-journals, however, were preferred to print (64% to 32%). The survey concluded that there was a “strong sentiment” in favor of a mix of print and electronic.

The other survey debated the relevance of audio reserves. It confirmed widespread use of extra-library resources by students for their listening assignments: YouTube, streaming services such as Spotify and Pandora, and MP3s they had purchased themselves. Reasons given for preferring these sources: the ability to listen on a smartphone or tablet (a preference also noted by commercial database vendors, who have begun developing mobile-device capabilities); personal comfort; and convenience. On the other hand, two encouraging reasons students give for using the library’s CD collection: the superior sound quality, and the availability of library staff for help.


I attended a half-day workshop on genre and medium terms for music. Historically, the Library of Congress subject headings have combined, in long pre-coordinated strings, many disparate aspects of the materials we catalog: topic (Buddhism), genre (drama, folk music), form (symphonies), medium (painting, piano), physical format (scores), publication type (textbooks, catalogs), intended audience (children’s books, textbooks for foreign speakers). Since these can be more effectively machine-manipulated as discrete data than in strings, there’s a project afoot to parse them into separate vocabularies, to be used in new RDA fields, for more precise search-and-sort capabilities in our discovery interfaces.

Three vocabularies are being developed:

  • Genre/form (LCGFT) — e.g., drama, folk music
  • Demographic groups (LCDGT) — author’s nationality, gender, etc.; intended audience
  • Medium of performance (LCMPT) — for music: instruments/voices
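To make the parsing concrete, here is a rough before-and-after sketch of a hypothetical record. The field tags are the actual MARC fields assigned to the new vocabularies (655 for LCGFT genre/form terms, 382 for LCMPT medium of performance), but the specific terms shown are illustrative rather than vetted vocabulary entries:

```
Before: one pre-coordinated LCSH string mixing genre, medium, and format

  650 _0 $a Quartets (Violins (2), viola, cello) $v Scores.

After: discrete facets, each machine-manipulable on its own

  655 _7 $a Chamber music. $2 lcgft
  382 01 $a violin $n 2 $a viola $n 1 $a cello $n 1 $s 4 $2 lcmpt
```

With the medium of performance in its own coded field, a discovery interface can offer facets like “instrument: viola” or “number of performers: 4” without having to parse a subject string.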

Given the many thousands of existing subject terms, this is clearly a challenging task, and I acquired a new appreciation for its complexities as I listened to the LC folks describe their struggles wrestling music terminology (as just one disciplinary example) to the ground. Problems debated included:

  • types of music that musicians have long regarded as genres in their own right (think string quartets) but are really just defined by their instrumentation or number of players; ditto for music set to certain texts (Magnificats, Te Deums)
  • bringing out the distinctions between art music, folk music, and popular music (an attempt to remedy the original classical-centrism of the LC terminology)
  • terms like “world music” that seem to have been invented mainly for marketing purposes
  • music for specific events or functions; stuff like master classes, concert tours, etc.
  • ethnomusicological (area studies) terms, which proved too numerous, and too inconsistently defined in reference sources, to be dealt with in the project’s initial phase
  • the tension between the need to build a logical hierarchy and the more fluid conventions practiced by user communities

While the new vocabularies are still under construction, we learned about the major changes, and how to encode the terms in RDA records.

In a session on Bibframe (a new encoding format designed to replace the aging MARC format), we heard about LD4L, a project conducted by Stanford, Cornell, Harvard, and LC to develop an open-source extensible ontology to aid in conversion of MARC to Bibframe; and another project at UC-Davis to develop a roadmap for Bibframe workflows, from acquisitions operations to cataloging and conversion, and even a prototype discovery layer.


A Friday-night treat was the screening of a silent film (The General, starring Buster Keaton) accompanied by the Mont Alto Motion Picture Orchestra (a 6-piece strings-and-winds band). The score was one they had compiled from music used by theater orchestras of the period, now archived in the University of Colorado’s American Music Research Center.


Friday, February 6, 2015 10:45 am

It was good to visit my home state of Illinois for ALA Midwinter 2015 in Chicago. I was able to get together with a few cousins with whom I was close growing up in Decatur, three hours south. And who doesn’t like 18 inches of snow? Somehow the weather didn’t actually interfere too much with the conference. If anything it brought attendees closer, I daresay.

At the meeting of the ALCTS Copy Cataloging Interest Group, Angela Kinney from the Library of Congress talked about restructuring at LC, specifically reductions in acquisitions and cataloging staff; this is a theme at many libraries, unfortunately. Roman Panchyshyn from Kent State (whom I’ve also seen present on an RDA-enrichment project similar to the one we’ve just undergone with Backstage) then talked about the considerable proliferation of e-resource bulk record loads in recent years and the need to build copy catalogers’ skills in this area (at their library this work has traditionally been done by professional catalogers and systems staff). Necessary skills include PC file management, FTP/data exchange, basic knowledge of RDA, comfort with secondary applications such as MarcEdit, and the ability to follow instructions and documentation. Here at ZSR, our copy catalogers, I must say, have these skills in spades, and I do not take for granted the fact that they are so sophisticated; nor should any of us. Not only are they able to follow workflows and documentation, but they create their own. Every record load is a little bit different, and these operations require attentiveness, diligence, and accuracy.

I also attended a session by the ALCTS MARC Formats Transitions Interest Group. The central topic was BIBFRAME, the new encoding format being developed by LC in collaboration with several libraries, which is eventually meant to replace MARC with a more linked-data/web-friendly format. Nancy Fallgren from the National Library of Medicine talked about the need for BIBFRAME (I think I’m going to get sick of typing that word before the end of this paragraph) to be flexible enough to work with the different descriptive languages of various sectors of the cultural heritage community – libraries, archives, museums, etc. She emphasized that BIBFRAME is not a descriptive vocabulary in and of itself and is built to accommodate RDA, not compete with it; it is a communication method, not the communication itself. Perhaps most importantly, this new format has to be extensible beyond library catalogs, as BIBFRAME-encoded data must go bravely off into the web to seek its fate, alone. Xiaoli Li from UC-Davis described her university’s two-year pilot project, BIBFLOW (BIBframe + workFLOW), in which they are actively experimenting with technical services workflows using the new format. She concluded that “Linked data means an evolutionary leap for libraries, not a simple migration.” This seems fair to say.

In July 2014 I started on two committees, and Midwinter was my first official meeting with both. On the ALCTS Acquisitions Section Organization and Management Committee, or, less conveniently, ALCTSASOAMC, we are planning a preconference for Annual in San Francisco entitled “Streaming Media, Gaming, and More: Emerging Issues in Acquisitions Management and Licensing.” The gaming component of this, in particular, is interesting to me, because I know absolutely nothing about it. I have high hopes for the program, which will comprise librarian presentations, a vendor panel, and guided group discussions. I am also on the ALCTS Planning Committee, which has been working on a fairly exhaustive inventory of all ALCTS committees’ and interest groups’ activities with an eye to how they support ALA’s initiatives of Advocacy, Information Policy, and Professional and Leadership Development. It’s been an interesting exercise; one gets a broad sense of the many and diverse efforts being made to support librarians and to advance the profession. In the end we will draft a new three-year strategic plan.

What exactly someone who decided to drive back to Winston-Salem from Chicago can really contribute to strategic planning is a question for another day. I’ll close with the dreary view from inside the hotel room I shared with Steve Kelley, who at the time seemed to be dying. Fortunately blue skies (see above) emerged.

Leslie at MLA 2014

Saturday, March 15, 2014 4:38 pm

This year’s Music Library Association conference was held in Atlanta. It was a very productive meeting for me: I got a lot of continuing education in RDA, the new cataloging standard; and an opportunity to renew contacts in the area of ethnomusicology (music area studies), having learned just before leaving for MLA that our Music Department plans to add an ethnomusicologist to their faculty.


The impact of RDA, one year after its adoption by the Library of Congress, was apparent in the number of sessions devoted to it during the general conference, not just the catalogers’ sessions sponsored by the Music OCLC Users Group. I learned about revisions made to the music rules in the RDA manual, in MLA’s “Best Practices” document, and in the various music thesauri we use. (So if you see a “Do Not Disturb” sign on my door, you’ll know I have a lot of re-reading to do, all over again!). One sign of the music library community’s clout: MLA’s Best Practices will be incorporated into the official RDA manual, with links integrated into the text alongside LC’s policy statements. In a session on RDA’s impact on public services, I was gratified to find that almost all the talking points presented by the speakers had been covered in my own presentation to our liaisons back in September.


LC gave a report on its National Recording Preservation Plan (NRPP), which began in February 2013. The group has developed 31 recommendations, which will be presented at hearings scheduled for this year by the US Copyright Office, covering the entire copyright code, including section 108, orphan works, and pre-1972 sound recordings (the ones not covered by federal law, leaving librarians to navigate a maze of state laws). Also to be presented: a proposed “digital right of first sale,” enabling libraries and archives to perform their roles of providing access and preservation for born-digital works whose licensing currently prohibits us from doing so. In the meantime, best-practices documents have been developed for orphan works (by UC Berkeley) and fair use for sound recordings (by the NRPP).


Perennial, and always interesting, sessions are held at MLA on the ongoing problem of musical works and recordings that are issued only online, with licensing that prohibits libraries and archives from acquiring them. An MLA grant proposal aims to develop alternative licensing language that we can use with recording labels, musicians, etc., allowing us to burn a CD of digital-only files. A lively brainstorming session produced additional potential solutions: an internet radio license, which would stream a label’s catalog to students while also generating revenue for the label; placing links to labels in our catalogs, similar to the Google links that many OPACs feature for books, offering a purchase option; raising awareness among musicians, many of whom are unaware of the threat to their legacies, by speaking at music festivals and asking the musicians themselves to raise public awareness, perhaps even by writing songs on the topic; capturing websites that aggregate music of specific genres, etc., in the Internet Archive or Archive-It; and collaborating with JSTOR, Portico, and similar projects to expand their archiving activities to media.


Digital humanities (DH) has begun to make its impact on the music library community, and MLA has established a new round table devoted to it. In a panel session, music librarians described the various ways they are providing support for, and collaborating with, their institutions’ DH centers. Many libraries are offering their liaisons workshops and other training opportunities to acquire the technical skills needed to engage with DH initiatives.


In a panel session on new technologies, we heard about several projects:

  • A colleague at the University of Music and Drama in Leipzig, Germany, led a project to add facets in their VuFind-based discovery layer for different types of music scores (study scores, performance scores, parts, etc.).
  • A colleague at Haverford used MEI, an XML encoding scheme designed for musical notation, to develop a GUI interface (which they named MerMEId) to produce a digital edition of a 16th-century French songbook, also reconstructing lost parts. (We’ve been hearing about MEI for some years — nice to see a concrete example of its application.)
  • Touch Press has developed an iPad app offering study aids for selected musical works (such as Beethoven’s 9th symphony), allowing you to compare multiple recordings while following along with a modern score or the original manuscript (which automatically scrolls with the audio), watch a visualization tool that shows who’s playing when in the orchestra, and read textual commentary, some in real time with the audio.
  • A consortium is using Amazon’s cloud service to host an instance of Avalon, an audio/video streaming product developed by Indiana University, to support music courses at their respective schools.
  • ProMusicDB is a project that aims to build an equivalent to IMDb for pop music.
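For those who have not seen MEI before, here is a minimal illustrative sketch of what the encoding looks like (simplified, and not taken from any of the projects above): a single staff and measure containing one note. Pitch, octave, and duration are recorded as discrete attributes, which is what makes the notation machine-processable:

```xml
<!-- Minimal, simplified MEI sketch: one staff, one measure, one note -->
<mei xmlns="http://www.music-encoding.org/ns/mei">
  <music>
    <body>
      <mdiv>
        <score>
          <scoreDef>
            <staffGrp>
              <!-- one five-line staff with a treble (G) clef -->
              <staffDef n="1" lines="5" clef.shape="G" clef.line="2"/>
            </staffGrp>
          </scoreDef>
          <section>
            <measure n="1">
              <staff n="1">
                <layer>
                  <!-- middle C, quarter note -->
                  <note pname="c" oct="4" dur="4"/>
                </layer>
              </staff>
            </measure>
          </section>
        </score>
      </mdiv>
    </body>
  </music>
</mei>
```

Because pitch and rhythm live in attributes rather than in a graphic, software can search, transform, or re-engrave the music — which is what makes a project like reconstructing lost parts feasible.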

Leslie at MLA 2013

Wednesday, March 6, 2013 7:48 pm

A welcome escape from the usual wintry rigors of traveling to a Music Library Association conference — mid-February this year found us in San Jose, soaking up sun, balmy breezes, and temps in the 70s. (Colleagues battered by the Midwest blizzards were especially appreciative.)

A plenary session yielded a number of high-level insights. For one, it was the first time I had heard the term “disintermediation” used to describe the phenomenon of librarians being displaced by Google et al. as the first place people go for information.

Henriette Hemmasi of Brown University characterized the MOOC trend as “Diva to DJ”: that is, the role of the instructor is shifting from lone classroom diva to the collaborative role played by a disc jockey — selecting and presenting material for team-produced courses, working with experts in web development, video, etc. Her conclusion: 21st-century competencies must include not just knowledge, but also synthesizing and systems-thinking skills.

David Fenske, one of the founding developers of Indiana’s iSchool, noted that the rapid evolution of technology has rendered it impossible to make projections more than 5 or 10 years out (his reply to a boss who asked for a 20-year vision statement: “A 20-year vision can’t be done without drugs!”). He also observed that digital preservation is in many ways more difficult than the traditional kind: the scientific community is beginning to lose the ability to replicate experiments, because in many cases the raw data has been lost due to obsolete digital storage media. Fenske envisions the “library as socio-technical system” — a system based on user demographics, designed around “communities of thought leaders” as well as experts. Tech-services people have long mooted the concept of “good-enough” cataloging, in the face of overwhelming publication output; public-services librarians, in Fenske’s view, should start talking about the “good-enough” answer. Fenske wants to look “beyond metadata”: how can we leverage our metadata for analytics? semantic tools? How can we scale our answers and services to compete with Google, Amazon, and others?

Some interesting findings from two studies on the library needs of performing faculty and students (as opposed to musicologists and other researchers in the historical/theoretical branches of the discipline):

One study addressed the pros and cons of e-scores. Performers, always on the go and pressed for time, like e-scores for their instant availability and shareability; the fact that they’re quick and easy to print out; their portability (no more cramming a paper score into an instrument case for travel); and easy page turns during performance (a pedal mechanism has been devised for this). Performers also like an e-score that can be annotated (i.e., not a PDF file) so they can insert their notes for performance, and the ability to get a lot of works quickly from one place (as from an online aggregator). On the other hand, academic users, who work with scholarly and critical editions, like the ability of the online versions to seamlessly integrate critical commentary with the musical text (print editions traditionally place the commentary in separate supplementary volumes). Third-party software can also be deployed to manipulate the musical text for analysis. But the limitations of the computer screen continue to pose viewability problems for purposes of analysis. Academic users regard e-scores as a complement to, not an alternative to, print scores.

Another study interviewed performing faculty to find out how they use their library’s online catalog. Typically, they come to the library wanting to find known items, use an advanced-search mode, and search by author, title, and opus number (the latter not very effectively handled by many discovery layers; VuFind does a reasonably good job). Performing faculty often are also looking for specific editions and/or publishers (aspects that many discovery interfaces don’t offer as search limits/facets). Performing faculty (and students) study a work by using a score to follow along with a sound recording, so come to the library hoping to obtain multiple formats for the same work — icons or other aids for quickly identifying physical format are important to them, as for film users and others. There is also a lot of descriptive detail that performers need to see in a catalog display: contents, duration, performers’ names.

Stuff a lot of music librarians have observed or suspected, but good to see it quantified and confirmed in some formal studies.

Collaborative collection development has generated much interest in the library community, and music librarians have been exploring collaborative options for acquiring the specialized materials of their field. Besides shared approval-plan profiles for books, and shared database subscriptions, music librarians have divvied up the collecting of composers’ collected editions, and of contemporary composers whose works they want to collect comprehensively. Because music materials are often acquired and housed in multiple locations on the same campus, internal collaboration is as important as external. One thing that does not seem to lend itself to collaborative collection: media (sound recordings and videos). Many libraries don’t lend these out via ILL, and faculty tend to want specific performances — making on-request firm orders a more suitable solution. One consortium of small Maine colleges (Colby, Bates, and Bowdoin) divided the processing labor of their staffs by setting up rotating shipments for their shared approval plan: one library gets this month’s shipment of books, another library receives the next month’s shipment, and so on.

There was a good bit of discussion concerning demand-driven e-book acquisitions among colleagues whose institutions had recently implemented DDA services. On two separate occasions, attendees raised the question of DDA’s impact on the humanities, given those disciplines’ traditional reliance on browsing the stacks as a discovery method.

It was a very busy conference for music catalogers, as over a hundred of us convened to get prepared for RDA. There was a full-day workshop; a cataloging “hot topics” session; a town-hall meeting with the Bibliographic Control Committee, which recently produced an “RDA Best Practices for Cataloging Music” document; and a plenary session on RDA’s impact across library services (the latter reprising a lot of material covered by Steve and others in ZSR presentations — stay tuned for more!).

A very special experience was a visit to the Ira F. Brilliant Center for Beethoven Studies (located on the San Jose State campus), the largest collection of Beethoveniana outside Europe. During a reception there, we got to play pianos dating from Beethoven’s time. Hearing the “Moonlight Sonata” up close on the model of instrument he wrote it for (Dulcken, a Flemish maker) was a true revelation.

Steve at ALA Midwinter 2013

Friday, February 8, 2013 2:10 pm

Although my trip to Seattle for the ALA Midwinter Conference had a rough start (flight delayed due to weather, nearly missed a connecting flight, my luggage didn’t arrive until a day later), I had a really good, productive experience. This Midwinter was heavy on committee work for me, and I was very focused on RDA, authority control and linked data. If you want a simple takeaway from this post, it’s that RDA, authority control and linked data are all tightly bound together and are important for the future of the catalog. If you want more detail, keep reading.
My biggest commitment at the conference was participating in two long meetings (over four hours on Saturday afternoon and three hours on Monday morning) of CC:DA (the Committee on Cataloging: Description and Access). I’m one of nine voting members of CC:DA, which is the committee responsible for developing ALA’s position on RDA. The final authority for making changes and additions to RDA is the JSC (Joint Steering Committee), which has representation from a number of cataloging constituencies, including ALA, the national library organizations of Canada, the UK, and Australia, as well as other organizations. ALA’s position on proposals brought to the JSC is voted on by CC:DA. Membership on this committee involves reading and evaluating a large number of proposals from a range of library constituencies. Much of the work of the committee has so far involved reviewing proposals regarding how to form headings in bibliographic records, which is, essentially, authority control work. We’ve also worked on proposals to make the rules consistent throughout RDA, to clarify the wording of rules, and to make sure that the rules fit with the basic principles of RDA. It has been fascinating to see how interconnected the various cataloging communities are, and how they relate to ALA and CC:DA. As I said, I am one of nine voting members of the committee, but there are about two dozen non-voting representatives from a variety of committees and organizations, including the Music Library Association, the Program for Cooperative Cataloging, and the Continuing Resources Cataloging Committee of ALCTS.
During our Monday meeting, we saw a presentation by Deborah Fritz of the company The MARC of Quality on a visualization tool called RIMMF (RDA In Many Metadata Formats). RIMMF shows how bibliographic data might be displayed when RDA is fully implemented. The tool is designed to take RDA data out of MARC, because it is hard to think about how data might relate in RDA without the restrictions of MARC. RIMMF shows how the FRBR concepts of work, expression, and manifestation (which are part of RDA) might be displayed by a public catalog interface. It’s still somewhat crude, but it gave me a clearer idea of the kinds of displays we might develop, as well as a better grasp on the eventual benefits to the user that will come from all our hard work of converting the cataloging world to RDA. RIMMF is free to download, and we’re planning to play around with it some here in Resource Services.
I also attended my first meeting of another committee of which I am a member, the Continuing Resources Cataloging Committee of the Continuing Resources Section (CRS) of ALCTS. Continuing resources include serials and web pages, so CRS is the successor to the old Serials Section. We discussed the program that we had arranged for that afternoon on the possibilities of using linked data to record serial holdings. Unfortunately, I had to miss the program due to another meeting, but I’m looking forward to seeing the recording. We also brainstormed ideas for our program at Annual in Chicago, and the committee’s representative to the PCC Standing Committee on Training gave us an update on RDA training initiatives.
The most interesting other meeting that I attended was the Bibframe Update Forum. Bibframe is the name for an initiative to develop a data exchange format to replace the MARC format(s). The Bibframe initiative hopes to develop a format that can make library data into linked data, that is, data that can be exchanged on the semantic web. Eric Miller, from the company Zepheira (one of the players in the development of Bibframe), explained that the semantic web is about linking data, not just documents (as a metaphor, think about old PDF files that could not be searched, but were flat documents: the only unit you could search for was the entire document, not the meaningful pieces of content within it). The idea is to create recombinant data, that is, small blocks of data that can be linked together. The basic architecture of the old web leaned toward linking full documents, rather than breaking statements down into meaningful units that could be related to each other. The semantic web emphasizes the relationships between pieces of data. Bibframe hopes to make it possible to record the relationships between pieces of data in bibliographic records, to expose library data on the web, and to make it sharable. At the forum, Beacher Wiggins told the audience about the six institutions that are experimenting with the earliest version of Bibframe: the British Library, the German National Library, George Washington University, the National Library of Medicine, OCLC, and Princeton University. Reinhold Heuvelmann of the German National Library said that the model is defined at a high level, but that more detail needs to be developed to allow for recording more granular data, which is absolutely necessary for fully recording the data required by RDA.
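As a rough sketch of this “recombinant data” idea, here is what a pair of Bibframe-style statements might look like in RDF Turtle. The URIs are hypothetical, and the property names are meant to illustrate the linked-data pattern rather than quote the actual Bibframe vocabulary:

```turtle
@prefix bf: <http://bibframe.org/vocab/> .

# Each statement is an independent, linkable piece of data,
# not a field locked inside a self-contained record.
<http://example.org/work/1>
    a bf:Work ;
    bf:title "Moby Dick" ;
    bf:creator <http://example.org/person/melville> .   # hypothetical identity URI

<http://example.org/instance/1>
    a bf:Instance ;                                     # a particular published embodiment
    bf:instanceOf <http://example.org/work/1> .
```

Because every node has a web-addressable URI, another institution (or a search engine) can link to or reuse any single statement without importing the whole record.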
Ted Fons of OCLC spoke of how Bibframe is an attempt to develop a format that can carry the data libraries need and allow library data to interact with the wider web. Fons said that Bibframe data has identifiers that are URIs, which can be web accessible. He also said that Bibframe renders bibliographic data as statements that are related to each other, rather than as self-contained records, as with MARC. Bibframe breaks free of the constraints of MARC, which basically rendered data as catalog cards in electronic format. Bibframe is still going through quite a bit of development, but it is moving quickly. Sally McCallum of the MARC Standards Office said that they hope to finalize aspects of the Bibframe framework by 2014, but acknowledged that, “The change is colossal and the unexpected will happen.”
Actually, I think that’s a good way to summarize my thoughts on the current state of the cataloging world after attending this year’s Midwinter, “The change is colossal and the unexpected will happen.”

Steve at NASIG 2012

Thursday, June 14, 2012 5:03 pm

Last Thursday, Chris, Derrik and I hopped in the library van and drove to Nashville for the NASIG Conference, returning on Sunday. It was a busy and informative conference, full of lots of information on serials and subscriptions. I will cover a few of the interesting sessions I attended in this post.
One such session was called “Everyone’s a Player: Creation of Standards in a Fast-Paced Shared World,” which discussed the work of NISO and the development of new standards and “best practices.” Marshall Breeding discussed the ongoing development of the Open Discovery Initiative (ODI), a project that seeks to identify the requirements of web-scale discovery tools, such as Summon. Breeding pointed out that it makes no sense for libraries to spend millions of dollars on subscriptions, if nobody can find anything. So, in this context, it makes sense for libraries to spend tens of thousands on discovery tools. But, since these tools are still so new, there are no standards for how these tools should function and operate with each other. ODI plans to develop a set of best practices for web-scale discovery tools, and is beginning this process by developing a standard vocabulary as well as a standard way to format and transfer data. The project is still in its earliest phases and will have its first work available for review this fall. Also at this session, Regina Reynolds from the Library of Congress discussed her work with the PIE-J initiative, which has developed a draft set of best practices that is ready for comment. PIE-J stands for the Presentation & Identification of E-Journals, and is a set of best practices that gives guidance to publishers on how to present title changes, issue numbering, dates, ISSN information, publishing statements, etc. on their e-journal websites. Currently, it’s pretty much the Wild West out there, with publishers following unique and puzzling practices. PIE-J hopes to help clean up the mess.
Another session that was quite useful was on “CONSER Serials RDA Workflow,” where Les Hawkins, Valerie Bross and Hien Nguyen from the Library of Congress discussed the development of RDA training materials at LC, including CONSER serials cataloging materials and general RDA training materials from the PCC (Program for Cooperative Cataloging). I haven’t had a chance yet to root around on the Library of Congress website, but these materials are available for free, and include a multi-part course called “Essentials for Effective RDA Learning” that includes 27 hours (yikes!) of instruction on RDA: a 9-hour training block on FRBR, a 3-hour block on the RDA Toolkit, and 15 hours on authority and description in RDA. This is for general cataloging, not specific to serials. Also, because LC is working to develop a replacement for the MARC formats, there is a visualization tool called RIMMF that allows for creating visual representations of records and record relationships in a post-MARC environment. It sounds promising, but I haven’t had a chance to play with it yet. Also, the CONSER training program, which focuses on serials cataloging, is developing a “bridge” training plan to transition serials catalogers from AACR2 to RDA, which will be available this fall.
Another interesting session I attended was “Automated Metadata Creation: Possibilities and Pitfalls” by Wilhelmina Randtke of Florida State University Law Research Center. She pointed out that computers like black and white decisions and are bad with discretion, while creating metadata is all about identifying and noting important information. Randtke said computers love keywords but are not good with “aboutness” or subjects. So, in her project, she tried to develop a method to use computers to generate metadata for graduate theses. Some of the computer talk got very technical and confusing for me, but her discussion of subject analysis was fascinating. Using certain computer programs for automated indexing, Randtke did a data scrape of the digitally-encoded theses and identified recurring keywords. This keyword data was run through ontologies/thesauruses to identify more accurate subject headings, which were applied to the records. A person needs to select the appropriate ontology/thesaurus for the item(s) and review the results, but the basic subject analysis can be performed by the computer. Randtke found that the results were cheap and fast, but incomplete. She said, “It’s better than a shuffled pile of 30,000 pages. But, it’s not as good as an organized pile of 30,000 pages.” So, her work showed some promise, but still needs some work.
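As a rough sketch of the workflow Randtke described — a program extracts recurring keywords, a thesaurus maps them to controlled subject headings, and a human reviews whatever the mapping missed — something like the following could work. The thesaurus entries, threshold, and sample text are invented for illustration; this is not Randtke’s actual code.

```python
# Sketch of automated subject suggestion: scrape keywords from text,
# run them through a (tiny, invented) thesaurus to get controlled
# headings, and hand the unmapped leftovers to a human reviewer.

from collections import Counter
import re

THESAURUS = {  # hypothetical keyword -> subject-heading mapping
    "copyright": "Copyright -- United States",
    "trademark": "Trademarks -- Law and legislation",
}

def suggest_subjects(text, min_count=2):
    """Return (mapped headings, unmapped keywords needing review)."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(words)
    keywords = [w for w, n in counts.items() if n >= min_count]
    mapped = sorted(THESAURUS[k] for k in keywords if k in THESAURUS)
    unmapped = sorted(k for k in keywords if k not in THESAURUS)
    return mapped, unmapped

thesis = ("Copyright law and copyright licensing shape digital archives. "
          "Archives rely on copyright exceptions.")
print(suggest_subjects(thesis))
```

The human-in-the-loop step is the point of the design: the machine is cheap and fast at counting keywords, while choosing the right thesaurus and judging “aboutness” stays with a person.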
Of course there were a number of other interesting presentations, but I have to leave something for Chris and Derrik to write about. One idea that particularly struck me came from Rick Anderson during his thought-provoking all-conference vision session on the final day: “To bring simplicity to our patrons means taking on an enormous level of complexity for us.” That basic idea has been something of an obsession of mine for the last few months while wrestling with authority control and RDA and considering the semantic web. To make our materials easily discoverable by the non-expert (and even the expert) user, we have to make sure our data is rigorously structured, and that requires a lot of work. It’s almost as if there’s a certain quantity of work that has to be done to find stuff, and we either push it off onto the patron or take it on ourselves. I’m in favor of taking it on ourselves.
The slides for all of the conference presentations are available on the NASIG website for anyone who is interested. You do not need to be a member of NASIG to check them out.

Leslie at NCLA 2011

Friday, October 7, 2011 2:50 pm

It was really nice to be able to attend an NCLA conference again — one of my music conferences, as it happens, has been held at the same time for years.

I attended a session on RDA, the new cataloging standard recently beta-tested by LC. Christee Pascale of NCSU gave a very helpful, concise reprise of that school’s experience as a test participant; the staff training program and materials they developed; and advice to others planning to implement RDA.

Presenters from UNCG and UNCC shared a session titled “Technical Services: Changing Workflows, Changing Processes, Personnel Restructuring — Oh My!” Both sites have recently undergone library-wide re-organizations, including the re-purposing of tech services staff to other areas, resulting in pressure to ruthlessly eliminate inefficiencies. Many of the specific steps they mentioned are ones we’ve already taken in ZSR, but some interesting additional measures include:

  • Eliminating the Browsing Collection in favor of a New Books display.
  • Reducing the funds structure (for instance, 1 fund per academic department — no subfunds for material formats)

There also seems to be a trend towards re-locating Tech Services catalogers to Special Collections, in order to devote more resources to the task of making the library’s unique holdings more discoverable; outsourcing or automating as many tech services functions as possible, including “shelf-ready” services, authority control, and electronic ordering; and training support staff (whose time has putatively been freed by the outsourcing/automation of their other tasks) to do whatever in-house cataloging remains. That’s the vision, at any rate — our presenters pointed out the problems they’ve encountered in practice. For instance, UNCC at one point had one person doing the receiving, invoicing, and cataloging: they quickly found they needed to devote more people to the still-significant volume of in-house cataloging that remained to be done even after optimizing use of outsourced services. They’re also feeling the loss of subject expertise (in areas like music, religion, etc.) and of experienced catalogers to make the big decisions (i.e., preparing for RDA).

NCLA plans to post all presentations on their website.



NISO RDA webinar

Wednesday, May 11, 2011 2:29 pm

Today, Erik, Carolyn, Lauren C, Derrik, Alan Keely, Leslie McCall, Steve Kelly, Beth Tedford, Mark McKone, Linda Ziglar, Jean-Paul and Chris B. attended the first of a two-part NISO seminar on RDA. The webinar gave a good overview of FRBR and how RDA facilitates the creation of records following the FRBR metadata model.

Robert Maxwell did the first part of the session. He did a good job of reviewing the FRBR model and discussing where the RDA standard is relevant. After a tour of the types of searching this approach would enable, the presenter said that the next steps are to implement RDA, design an entity-relationship database structure, and figure out how to handle legacy MARC data.
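For readers who haven’t worked with FRBR, here is a minimal sketch of the model’s group-1 entity hierarchy (Work, Expression, Manifestation, Item) as plain Python data classes. The entity names follow FRBR; the attributes and sample values are illustrative only, not drawn from any real system.

```python
# FRBR group-1 entities as a toy object model: a Work is realized by
# Expressions, embodied in Manifestations, exemplified by Items.
# Attributes here are simplified placeholders.

from dataclasses import dataclass, field

@dataclass
class Item:               # a single physical or digital copy
    barcode: str

@dataclass
class Manifestation:      # a published edition/format
    format: str
    items: list = field(default_factory=list)

@dataclass
class Expression:         # a realization (a translation, a performance)
    language: str
    manifestations: list = field(default_factory=list)

@dataclass
class Work:               # the abstract intellectual creation
    title: str
    expressions: list = field(default_factory=list)

work = Work("Hamlet",
            [Expression("eng",
                        [Manifestation("print", [Item("b1"), Item("b2")])])])
print(work.expressions[0].manifestations[0].format)
```

An entity-relationship database structure of the kind the presenters discussed would store each of these levels as its own table, linked by foreign keys, instead of flattening everything into one record.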

The second part of the session was by John Espley of VTLS, who covered some of these ‘next steps’ at a high level. He covered various data models, including flat files, ER models, and linked MARC records. I was curious to see that he did not discuss object databases or triple stores as options. He showed a sample FRBR ER diagram to illustrate how a database model would work.

John talked at length about the role of metadata encoding models and asserted that a common encoding model was not a necessary feature of next-generation systems; rather, interoperability between systems would be based on appropriate encoding crosswalks. John indicated that some areas of development included the added ability to use macros, easy access to the RDA Toolkit, and more sophisticated workforms.

Leslie at MLA 2011

Monday, February 14, 2011 2:08 am

I’m back from another Music Library Association conference, held this year in Philadelphia. Some highlights:

Libraries, music, and digital dissemination

Previous MLA plenary sessions have focused on a disturbing new trend involving the release of new music recordings as digital downloads only, with licenses restricting sale to end users, which effectively prevents libraries either from acquiring the recordings at all, or from distributing (i.e., circulating) them. This year’s plenary was a follow-up featuring a panel of three lawyers — a university counsel, an entertainment-law attorney, and a representative of the Electronic Frontier Foundation — who pronounced that the problem is only getting worse. It is affecting more formats now, such as videos and audiobooks — it’s not just the music librarian’s problem any more — and recent court decisions have tended to support restrictive licenses.

The panelists suggested two approaches libraries can take: building relationships, and advocacy. Regarding relationships, it was noted that there is no music equivalent of LOCKSS or Portico: Librarians should negotiate with vendors of audio/video streaming services for similar preservation rights. Also, libraries can remind their resident performers and composers that if their performances are released as digital downloads with end-user-only licenses, libraries cannot preserve their work for posterity. The panelists drew an analogy to the journal pricing crisis: libraries successfully raised awareness of the issue by convincing faculty and university administrators that exorbitant prices would mean smaller readerships for their publications. On the advocacy side, libraries can remind vendors that federal copyright law pre-empts non-negotiable licenses: a vendor can’t tell us not to make a preservation copy when Section 108 says we have the right to make a preservation copy. We can also lobby state legislatures, as contract law is governed by state law.

The entertainment-law attorney felt that asking artists to lobby their record labels was, realistically speaking, the least promising approach — the power differential is too great. Change, the panelists agreed, is most likely to come through either legislation or the courts. Legislation is the more difficult to affect (there are too many well-funded commercial interests ranged on the opposing side); there is a better chance of a precedent-setting court case tipping the balance in favor of libraries. Such a case is most likely to come from the 2nd or 9th Circuit, which have a record of liberal rulings on Fair Use issues. One interesting observation from the panel was that most of the cases brought so far have involved “unsympathetic figures” — individuals who blatantly abused Fair Use on a large scale, provoking draconian rulings. What’s needed is more cases involving “sympathetic figures” like libraries — the good guys who get caught in the cross-fire. Anybody want to be next? :-)

Music finally joins Digital Humanities

For a couple of decades now, humanities scholars have been digitizing literary, scriptural, and other texts, in order to exploit the capabilities of hypertext, markup, etc. to study those texts in new ways. The complexity of musical notation, however, has historically prevented music scholarship from doing the same for its texts. PDFs of musical scores have long been available, but they’re not searchable texts, and not encoded as digital data, so they can’t be manipulated in the same way. Now there’s a new project called the Music Encoding Initiative, jointly funded by the National Endowment for the Humanities and the Deutsche Forschungsgemeinschaft. MEI (yes, they’ve noticed it’s also a Chinese word for “beauty”) has just released a new digital encoding standard for Western classical musical notation, based on XML. It’s been adopted so far by several European institutions and by McGill University. If, as one colleague put it, it “has legs,” the potential is transformative for the discipline. Whereas critical editions in print force editors to make painful decisions between sources of comparable authority — the other readings get relegated to an appendix or supplementary volume — in a digital edition, all extant readings can be encoded in the same file, and displayed side by side. An even more intriguing application of this concept is the “user-generated edition”: a practicing musician could potentially approach a digital edition of a given work, and choose to output a piano reduction, or a set of parts, or modernized notation of a Renaissance work, for performance. Imagine the savings for libraries, which currently have to purchase separate editions for all the different versions of a work.
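To illustrate the “all readings in one file” idea: a critical apparatus in XML can record variant readings from several sources side by side, with a program choosing one at output time. The element names below are modeled loosely on TEI/MEI apparatus conventions (app/rdg) but are simplified for illustration and would not validate against the actual MEI schema.

```python
# A toy "digital critical edition": two variant readings of the same
# measure live side by side in one XML file, and a program realizes
# the reading from whichever source the user prefers.
# Element/attribute names are illustrative, not schema-valid MEI.

import xml.etree.ElementTree as ET

encoding = ET.fromstring("""
<measure n="1">
  <app>
    <rdg source="autograph"><note pitch="c" dur="4"/></rdg>
    <rdg source="first-edition"><note pitch="e" dur="4"/></rdg>
  </app>
</measure>
""")

def realize(measure, preferred_source):
    """Output one chosen reading -- the 'user-generated edition' idea."""
    for app in measure.iter("app"):
        for rdg in app.findall("rdg"):
            if rdg.get("source") == preferred_source:
                return [n.get("pitch") for n in rdg.iter("note")]

print(realize(encoding, "first-edition"))
```

A print edition has to pick one of these readings and banish the other to an appendix; here, switching editions is just a different argument to the output step.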

Music and metadata

In a session titled “Technical Metadata for Music,” two speakers, from SUNY and a commercial audio-visual preservation firm respectively, stressed the importance of embedded metadata in digital audio files. Certain information, such as recording date, is commonly included in filenames, but this is an inadequate measure from a long-term preservation standpoint: filenames are not integral to the file itself, and are typically associated with a specific operating system. One speaker cited a recent Rolling Stone article, “File not Found: the Recording Industry’s Storage Crisis” (December 2010), describing the record labels’ inability to retrieve their backfiles due to inadequate filenames and lack of embedded metadata. Metadata is now commonly embedded in many popular end-user consumer products, such as digital cameras and smartphones.

For music, embedded metadata can include not only technical specifications (bit depth, sample rate, and locations of peaks, which can be used to optimize playback) but also historical context (the date and place of performance, the performers, etc.) and copyright information. The Library of Congress has established sustainability factors for embedded metadata. One format that meets these requirements is Broadcast Wave Format (BWF), an extension of WAV: it can store metadata as plain text, and can include historical context-related data. The Technical Committee of ARSC (Association for Recorded Sound Collections) recently conducted a test wherein they added embedded metadata to some BWF-format audio files and tested them with a number of popular applications. The dismaying results showed that many apps not only failed to display the embedded metadata, but also deleted it completely. This, in the testers’ opinion, calls for an advocacy campaign to raise awareness of the importance of embedded metadata. ARSC plans to publish its test report on its website, and the software for embedded metadata that they developed for the test is also available as a free open-source app.
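The point about embedded metadata is that in Broadcast Wave, the context travels inside the file as RIFF chunks (such as the “bext” chunk), independent of the filename or operating system. Below is a toy sketch that builds a minimal RIFF-style container in memory and walks its chunks; a real “bext” chunk has a fixed binary layout that is simplified to plain text here.

```python
# Toy RIFF/WAV container: metadata lives in named chunks inside the
# file itself, so it survives renaming and moving between systems.
# The "bext" payload below is plain text for illustration; real BWF
# bext chunks use a fixed binary layout.

import struct, io

def chunk(cid, payload):
    """Encode one RIFF chunk: 4-byte id, little-endian length, data, pad."""
    data = struct.pack("<4sI", cid, len(payload)) + payload
    return data + (b"\x00" if len(payload) % 2 else b"")  # pad to even

body = (b"WAVE"
        + chunk(b"bext", b"Performed 2011-02-12, Philadelphia")
        + chunk(b"data", b"\x00\x00\x00\x00"))
riff = struct.pack("<4sI", b"RIFF", len(body)) + body

def list_chunks(buf):
    """Return (chunk_id, payload) pairs from a RIFF stream."""
    f = io.BytesIO(buf)
    hdr, size, form = struct.unpack("<4sI4s", f.read(12))  # RIFF header
    out = []
    while True:
        head = f.read(8)
        if len(head) < 8:
            break
        cid, clen = struct.unpack("<4sI", head)
        out.append((cid.decode(), f.read(clen)))
        if clen % 2:
            f.read(1)  # skip pad byte
    return out

print(list_chunks(riff))
```

Rename the file, copy it to another OS, and the “Performed 2011-02-12, Philadelphia” context is still there, which is exactly what a filename cannot guarantee.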

Music cataloging

A pre-conference session held by MOUG (Music OCLC Users Group) reported on an interesting longitudinal study that aimed to trace coverage of music materials in the OCLC database. The original study was conducted in 1981, when OCLC was relatively new. MOUG testers searched OCLC for newly published music books, scores, and sound recordings, as listed in journals and leading vendor catalogs, along with core repertoire as listed in ALA’s bibliography Basic Music Library, and assessed the quantity and quality of available cataloging copy. The study was replicated in 2010. Exact replication was rendered impossible by various developments over the intervening 30 years — changes in the nature of the OCLC database from a shared catalog to a utility; more foreign and vendor contributors; and the demise of some of the reference sources used for the first sample of searched materials, necessitating substitutions — but the study has nevertheless produced some useful statistics. Coverage of books, not surprisingly, increased over the 30 years to 95%; representation of sound recordings also increased, to around 75%; but oddly, scores have remained at only about 60%. As for quality of the cataloging, the 2010 results showed that about 20% of sound recordings have been cataloged as full-level records, about 50% as minimal records; about a quarter of scores get full-level treatment, about 50% minimal. The study thus provides some external corroboration of long-perceived music cataloging trends, and also a basis for workflow and staffing decisions in music cataloging operations.

A session titled “RDA: Kicking the Tires” was devoted to the new cataloging standard that the Library of Congress and a group of other libraries have just finished beta-testing. Music librarians from five of the testing institutions (LC, Stanford, Brigham Young, U North Texas, and U Minnesota) spoke about their experiences with the test and with adapting to the new rules.

All relied on LC’s documentation and training materials, recording local decisions on their internal websites (Stanford has posted theirs on their publicly-accessible departmental site). An audience member urged libraries to publish their workflows in the Toolkit, the online RDA manual. It was generally agreed that the next step needed is the development of guidelines and best practices.

None of the testers’ ILSs seem to have had any problems accommodating RDA records in MARC format. LC has had no problems with their Voyager system, corroborating our own experience here at WFU. Some testers reported problems with some discovery layers, including Primo (fortunately, we haven’t seen any glitches so far with VuFind). Stanford reported problems with their (unnamed) authorities vendor, mainly involving “flipped” (changed name order) entries. Most testers are still in the process of deciding which of the new RDA data elements they will display in their OPACs.

Asked what they liked about RDA, both the LC and Stanford speakers cited the flexibility of the new rules, especially in transcribing title information, and in the wider range of sources from which bib info can be drawn. Others welcomed the increased granularity, designed to enhance machine manipulation, and the chance this affords to “move beyond cataloging for cards” towards the semantic web and relation-based models. It was also noted that musicians are already used to thinking in FRBR fashion — they’ve long dealt with scores and recordings, for instance, as different manifestations of the same work.

Asked what they thought “needed fixing” with RDA, all the panelists cited access points for music (the LC speaker put up a slide displaying 13 possible treatments of Rachmaninoff’s Vocalise arranged for saxophone and piano). There are other areas — such as instrument names in headings — that the RDA folks haven’t yet thought about, and the music community will probably have to establish its own practice. Some catalogers expressed frustration with the number of matters the new rules leave to “cataloger’s judgment.” Others mentioned the difficulty of knowing just how one’s work will display in future FRBRized databases, and of trying to fit a relational structure into the flat files most of us currently have in our ILSs.

What was most striking about the session was the generally upbeat tone of the speakers — they saw more positives than negatives in the new standard, assured us it only took some patience to learn, and were convinced that it truly was a step forward in discoverability. One speaker, who trains student assistants to do copy-cataloging and tells them, “When in doubt, make your best guess, and I’ll correct it later,” observed that her students’ guesses consistently conformed to RDA practice — some anecdotal evidence suggesting that the new standard may actually be more intuitive for users, and that new catalogers will probably learn it more easily than those of us who’ve had to “unlearn” AACR2!


Our venue was the Loews Philadelphia Hotel, which I must say is the coolest place I’ve ever stayed in. The building was the first International Style high-rise built in the U.S., and its public spaces have been meticulously preserved and/or restored, to stunning effect. The first tenant was a bank, and so you come across huge steel vault doors and rows of safety-deposit boxes, left in situ, as you walk through the hotel. Definitely different!

Another treat was visiting the old Wanamaker department store (now a Macy’s) to hear the 1904 pipe organ that is reputed to be the world’s largest.

Diane Hillmann on collaborative opportunities

Tuesday, February 8, 2011 12:41 pm

The second keynote of the morning was Diane Hillmann, who talked about collaborations between programmers and catalogers.

Diane dated her career by showing us a few tools that I remember from my early time as a librarian (card catalog rods and a card filer)! I wonder what that says about the pace of change from the early 1970s to the early 1990s. For most of her talk, however, Diane focused on the emerging roles of catalogers in libraries and the potential collaborations between catalogers and programmers. Diane has published several pieces about RDA in the past few years, and it was interesting to hear her thoughts about the intersection between MARC, RDA, ISBD, AACR, RDF, XML and other ABTs (acronym-based technologies).

Her presentation focused on the need to re-shape the cataloging profession, and as such she spent a few minutes talking about the potential of RDA encoded in RDF to serve as a replacement for the MARC encoding and representation standards. She introduced some concepts from her recent publications, including metadata registries, the use of identifiers as opposed to literals in records, and the use of single record or vocabulary repositories as opposed to records replicated across thousands of databases.

The audience asked some interesting questions: 1) about the economics of migration (it is tough, but not changing is not an option); 2) about the future of cataloging in libraries (traditional cataloging is diminishing — copy cataloging is the current model, distributed cataloging/data-geeking is the future, and getting rid of all the catalogers first does not make sense — get them the skills to change); and 3) about what programmers could learn from the cataloging community (creativity in data representation and use, and an understanding of the complexities of library data).

The rest of the morning and early afternoon is devoted to short IT presentations & should be very interesting.
