Professional Development

In the 'Electronic Resources and Libraries' Category...

2014 ER&L virtual conference

Thursday, April 10, 2014 5:03 pm

For the second year running, I “attended” the Electronic Resources & Libraries conference by watching streamed sessions. I still plan on watching sessions as time permits throughout the year, since the group purchase that Derrik made runs until the next conference is held in 2015. (ZSR folks: Ask Derrik if you need the password.)

One trend that popped up in multiple presentations was Evidence-Based Acquisition (EBA). Like its close relative Demand-Driven (or Patron-Driven) Acquisition, it has two names and two initialisms. So you may also hear of Usage-Driven Acquisition (UDA). With EBA, you give a provider an up-front deposit, say, $5,000. Then the provider turns on their entire catalog of e-books or streaming films. After a set time, say, a year, you get a usage report and can choose $5,000 worth of products for permanent ownership. There are some pros and cons to this approach, especially vis-à-vis DDA. (What if you don’t get $5,000 worth of use? What if all the use is long tail with no “short head”?) However, since providers who use this model generally do not participate in DDA models, EBA may be the most cost-effective way to buy certain types of material.
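To make the EBA mechanics concrete, here is a rough sketch of the end-of-term selection step: given the usage report, spend the deposit on the most-used titles. This is my own illustration, not something from a presentation, and every title, price, and usage count below is made up.

    # Toy illustration of the EBA selection step: after the access period,
    # pick the most-used titles until the up-front deposit is exhausted.
    # Titles, prices, and usage counts are invented for the example.

    deposit = 5000.00

    usage_report = [
        # (title, list_price, uses)
        ("Handbook of Streaming Media", 180.00, 42),
        ("Intro to Data Curation", 95.00, 31),
        ("Rarely Viewed Monograph", 120.00, 1),
        # ...the provider's full catalog would appear here
    ]

    selected = []
    remaining = deposit
    for title, price, uses in sorted(usage_report, key=lambda r: r[2], reverse=True):
        if uses > 0 and price <= remaining:
            selected.append(title)
            remaining -= price

    print(f"Purchased {len(selected)} titles; ${remaining:.2f} of the deposit unspent")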

Another hot topic was the end-user experience with e-books and certain multimedia databases. Basically, it’s bad. Typical problems with e-books include not being able to print, not being able to use the book on certain devices, and not being able to store the book for later consultation. Multimedia has a different but related set of concerns. (I’m reminded of this comic and this infographic. They both claim that poor UX drives customers to piracy.) The presenters didn’t go as far as claiming that library resources drive folks to piracy, but they did claim that students will either download free alternatives or, if they are “haves,” buy individual copies, which could magnify the effects of economic disparities among students. The presenters insisted that libraries should put their collective foot down and refuse to buy user-hostile resources (even if the information contained is high quality). They called out one well-known database as particularly awful. A quick check of that library’s website established that they still subscribe to the bad product, so the force of their argument was somewhat undermined. I have hope, however, because I can remember a time in the 90s when e-journals and e-newspapers were just as bad as e-books are today. Printing from JSTOR used to be a nightmare, and you had to use certain specific computers if you wanted to use ProQuest. Then you had to use a different computer entirely for LexisNexis. These days, e-journals generally just work. Maybe e-books and multimedia sites will get there someday if we keep leaning on the vendors and at least occasionally refuse to buy the worst UX offenders.

Carol at ER&L

Thursday, April 26, 2012 12:09 pm

Impressions from the Electronic Resources & Libraries conference …

E-books and DDA

When CSU-Fullerton had a budget cut, they prioritized their DDA program and instead cut their approval plan. They skipped the intermediate step of an e-preferred approval profile.

In our own presentation, Derrik and I asserted that annual spending on DDA clusters around $4-$7 per FTE. Outrageous spending seen at other institutions might simply reflect a large FTE. With that thesis in mind (seeking confirmation bias?), we noted during other presentations that CSU-Fullerton is on track for $5/FTE. University of Denver spent $6/FTE.
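Since the per-FTE framing is doing the real work in that argument, here is a quick worked example (my numbers, purely hypothetical) showing how a large absolute spend can still be modest per capita.

    # Normalizing DDA spending by FTE: two very different budget lines can
    # represent the same per-student commitment. All figures are invented.
    programs = {
        "Small college":    (15_000, 3_000),    # (annual DDA spend in $, FTE)
        "Large university": (150_000, 30_000),
    }
    for name, (spend, fte) in programs.items():
        print(f"{name}: ${spend / fte:.2f} per FTE")
    # Both work out to $5.00 per FTE, inside the $4-$7 range mentioned above.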

An EBL rep reminded us to prepare for an increased percentage of triggered purchases each passing year as more infrequently-used books reach the trigger point.

A YBP rep mentioned that e-books now account for 10% of sales.

E-books vs. print books: The University of Denver examined usage in cases where they owned both the print and digital copies of the same book. High e-usage correlated with high print-usage (and vice versa), but without a clear causal link. Apparently, relevant content generates high use of both formats. About half of their presentation covered methodology – problems like separate ISBNs for each format made for a very time-consuming project.

E-journals and Big Deal alternatives

CSU-Fullerton used CCC’s Get It Now service to provide e-journals (with transactional payments) instead of ILL or subscribing. They did not anticipate that the same individual would sometimes download the same article multiple times. How to control for that in a patron-friendly way?
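One patron-friendly control I can imagine (purely my speculation, not anything CSU-Fullerton described) is to remember recent transactional requests per user and quietly re-serve the stored article instead of paying the fee again. A minimal sketch, with invented function and variable names:

    # Hypothetical approach to the duplicate-download problem: track which
    # articles each user has obtained recently and reuse the stored copy
    # rather than triggering another per-article charge.
    from datetime import datetime, timedelta

    REUSE_WINDOW = timedelta(days=30)
    _recent = {}  # (user_id, doi) -> time of the paid request

    def should_purchase(user_id, doi, now=None):
        """Return True only if this user hasn't already bought this article recently."""
        now = now or datetime.utcnow()
        key = (user_id, doi)
        if key in _recent and now - _recent[key] < REUSE_WINDOW:
            return False  # serve the copy we already paid for
        _recent[key] = now
        return True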

CUNY Graduate Center outlined how they eliminated a Big Deal. Essentially the content of that particular deal did not match current institutional strengths. By contrast, every time I’ve examined WFU use stats, the Big Deal for journals comes out ahead of the à-la-carte model.

Another presenter gave a sophisticated analysis of Big Deal journal usage for a consortium of libraries in the UK. He determined how much they would have to pay in Document Delivery or extra subscription charges if they left the Big Deal and returned to an à-la-carte model. In the end, the consortium renewed with both Big Deal publishers under consideration. The speaker’s model included a percentage use increase each year. He stated that use (i.e. journal article downloads) went up 14% each year. I’ve never thought to account for that before, but I could see whether that holds true for WFU. (If use does indeed go up, does it reflect enrollment growth or an increase in per-FTE consumption?)
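Here is a back-of-the-envelope version of that kind of model, purely my own sketch rather than the presenter’s actual analysis: project downloads forward at a fixed growth rate and compare the Big Deal fee against paying per article outside the deal. Every number is a placeholder, not real WFU or consortium data.

    # Grow annual downloads by a fixed rate (e.g. the 14% mentioned above)
    # and compare the Big Deal fee with what the same use would cost at a
    # per-article (document delivery) price. All figures are placeholders.

    def project_costs(downloads, growth, big_deal_fee, fee_inflation,
                      per_article_cost, years):
        rows = []
        for year in range(1, years + 1):
            downloads *= 1 + growth            # annual increase in use
            big_deal_fee *= 1 + fee_inflation  # typical annual price cap
            a_la_carte = downloads * per_article_cost
            rows.append((year, round(downloads), round(big_deal_fee), round(a_la_carte)))
        return rows

    for year, dls, deal, alc in project_costs(
            downloads=50_000, growth=0.14, big_deal_fee=400_000,
            fee_inflation=0.05, per_article_cost=30, years=5):
        print(f"Year {year}: {dls:>7} downloads | Big Deal ${deal:,} | per-article ${alc:,}")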

CLOCKSS

Libraries (including ZSR) pay for hosting of the CLOCKSS archive at multiple sites worldwide. A speaker noted that the Japanese CLOCKSS site went down due to electric grid malfunctions in the aftermath of the earthquake/tsunami. The site restored itself with data from the other CLOCKSS sites over the following several months.

Discovery Layer

A speaker from Oklahoma State University investigated a question that Lynn has asked me to look into: If you have a discovery service (like Summon), do you still need A&I databases? OSU examined one case where a low-use A&I database offered a huge price increase. Her methodology was:

  1. Find the overlap between the A&I database and Summon.
  2. For unique titles, determine whether the library has holdings, and whether the title is in English.

Her findings:

  • For the database at issue, OSU determined that about 92% of the titles were covered (at least partially) in Summon. Of the remaining 8%, OSU held 6% (or 0.48% of the entire list), and those holdings were generally both fragmentary and old.
  • About 75% of the unique titles were non-English. They also examined ILL requests for the unique titles, and discovered there had been none over the past two years.

Ultimately, they cancelled two A&I Databases using this methodology. At WFU, the true duds among our A&I databases have been cancelled already (unless bundled with something else). Therefore, I wouldn’t want to replicate this approach unless (as at OSU) the database is already low-use, budget pressures apply, and a faction protests the cancellation by playing the “unique content” card.
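For what it’s worth, the percentage math in OSU’s findings is easy to reproduce once you have the title lists. A minimal sketch with made-up sets (not OSU’s or WFU’s data):

    # What share of the A&I database's titles does the discovery service
    # already cover, and how much of the unique remainder do we hold anyway?
    ai_titles = {"Journal A", "Journal B", "Journal C", "Journal D"}
    summon_titles = {"Journal A", "Journal B", "Journal C"}
    library_holdings = {"Journal A", "Journal D"}

    unique = ai_titles - summon_titles          # titles found only in the A&I database
    held_unique = unique & library_holdings     # unique titles we already hold

    covered_pct = 100 * len(ai_titles & summon_titles) / len(ai_titles)
    held_pct_of_all = 100 * len(held_unique) / len(ai_titles)

    print(f"{covered_pct:.1f}% covered by Summon; "
          f"{held_pct_of_all:.1f}% of the full list is unique but already held")
    # OSU's figures: ~92% covered; 6% of the 8% unique titles held = 0.48% of the list.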

Copyright

One of the keynote addresses introduced the ARL Code of Best Practices in Fair Use for Academic and Research Libraries. This booklet covers scenarios like

  • reproducing portions of special collections items for exhibits
  • e-reserves
  • and many more.

Derrik at ER&L 2012

Wednesday, April 25, 2012 11:35 am

I had a very good conference experience with the 2012 Electronic Resources & Libraries (ER&L) conference. It’s almost overwhelming just to look at all the notes I took! ER&L really packs a lot into a 2.5-day conference, averaging 8 sessions a day. And if that’s not enough going on, you can follow even more sessions via Twitter.

My two main areas of focus for this conference were e-resource management systems (ERMS) and demand-driven acquisition (DDA).

ERMS. The first set of breakout sessions included a panel of 8 librarians representing a total of 5 ERM systems. I was one of two CORAL users on the panel. For those of you who are wondering, an ERMS helps Resource Services personnel keep track of databases and licenses: things like license terms, user limits, vendor contact information, etc. The panel discussion used a “buffet” metaphor, and the idea was for audience members to get a sampling of the different ERMS options. The format was fast-moving, even with a two-hour time slot. It was interesting how different sites use the same product differently, and see different strengths & weaknesses of that product. Common themes that emerged in the discussion included using the ERMS for internal communication, desires for better usage statistics management, and Interlibrary Loan permission as the only license term that anyone outside of e-resource management really cares about. And I discovered I’m not the only one who thinks CORAL should include subject headings for databases.

ERMS buffet

At the CORAL user group meeting (my first as an actual user), I learned more about the new CORAL Steering Committee. As I have described in previous blog posts, CORAL was developed by librarians at Notre Dame. But as adoption has increased, Notre Dame’s capacity to develop the product has been diminished. So they have formed a Steering Committee, with librarians from Texas A&M, Duke Medical Library, and the College of New Jersey. The committee will make product decisions and actively develop fixes and enhancements. As always, other libraries are also allowed to contribute code.

On a more general ERMS note, I attended a presentation by Tim Jewell, who has chaired a NISO working group on ERM Data Standards and Best Practices <http://www.niso.org/workrooms/ermreview>, a successor to the ERMI data initiative. Among other things, ERMI defined standards for what data elements should be tracked by an ERMS and has given direction to the development of other e-resource management standards such as SUSHI (usage statistics) and ONIX-PL (communication of license terms). The working group released a report in January (available at the website). The report (and Jewell’s presentation) recognizes that other standards initiatives, many of which have grown out of ERMI, provide greater granularity than ERMI. Thus the working group recommended that NISO not continue to develop the ERMI data dictionary, but instead continue to support these more targeted initiatives.

Sorry for the ERM geek-out; I hope I didn’t overwhelm you too much. Moving on…

DDA. Based on this conference, it seems like demand-driven acquisition is moving out of the pilot phase and toward becoming a more accepted practice. Carol and I presented stats and findings from ZSR’s first year of DDA. We also saw data from the University of Denver’s DDA program, and it appears that they spent about $6 per FTE during fiscal year 2011, close to ZSR’s per-FTE spend of $5. But librarians from Calif. State Fullerton said that their DDA expenditure increased significantly in the second year, something for us to keep an eye on. We also learned that NISO is reviewing a proposal to develop best practices for DDA.

One question about DDA that was brought up a couple of times was planning for removal of titles. As the number of available titles increases, is there a need to “weed” outdated ones? If so, how would this be accomplished? No one offered any answers, just raising the question.

Publishers and vendors are also coming to grips with DDA. DDA is forcing them to re-think their sales models, moving from the predictability of Approval sales to the unpredictable volume and timing of patron-driven sales. Oxford Univ. Press is investing more heavily in discoverability, trying to make all Oxford content cross-searchable. Matt Nauman, from YBP, described their DDA service, and said that YBP is seeing a need to develop an e-book collection management service rather than relying strictly on sales.

JSTOR. John Lenahan from ITHAKA described some of the results of JSTOR user data analysis, and some of the projects they are working on as a result. JSTOR has found that a major portion of their users are coming to JSTOR from outside the library (mostly via Google), resulting in a high number of unnecessary turnaways. So JSTOR is developing some really cool features to address this. First of all, JSTOR has made all journal content published prior to 1923 free to anyone. They are also working on a “Register to Read” function, where a user can “borrow” up to 3 articles at a time. What’s really cool, though, is the “Institutional Finder,” which will prompt the user saying “You are not logged in from an affiliated institution,” and will allow the user to select their university and log in via the proxy server. Finally, they are building an integration with discovery services, providing the user with a link to re-do their JSTOR search on their library’s Summon instance.

Turnover. I attended a session on reducing information loss when there’s staff turnover, thinking of all the information stored in an individual’s memory, e-mail account, hard drive, etc. Strategies suggested included using an ERMS, wikis &/or LibGuides, and project management tools. The speaker also suggested using a checklist for departing personnel. One tip I liked was to create a generic institutional e-mail account to list with vendors so that when a person leaves you can just redirect that account rather than having to contact all those vendors.

AR. I learned about a project at the University of Manchester, where they have developed Augmented Reality (AR) apps in conjunction with Special Collections exhibits. For example, a student might point their smartphone camera at a 200-year-old printing press, or a copy of Dante’s Inferno, and can tap certain areas of their screen to get more information. The externally-funded project represented cooperation among software developers, tech support, librarians, and academic departments. They found it to be most meaningful for 1st- and 2nd-year undergraduates, less so for experienced students and researchers. In case you’re wondering (like I was), their Special Collections dept. has iPads available for checkout for patrons who don’t have a smartphone. More about the project is available at http://teamscarlet.wordpress.com/.

ER&L is a great conference to follow on Twitter. There are quite a few attendees (including yours truly) who tweet during sessions, and with only three or four concurrent sessions, the conversations are fairly easy to follow. The conference organizers tried something new this year–in addition to the conference hashtag, they assigned a separate hashtag for each session. It was a good idea (IMHO), but apparently wasn’t publicized very well and had only moderate uptake. It will probably work better next year.

Finally, here are some miscellaneous sound bites either from my notes or from the conference Twitter stream:
@AnAnarchivist: “Accepting other people’s opinions is an expectation, we want other’s opinions, and expect our opinions to be welcome. #erl12 #millennials”
“Unlikely you’ll ever be down to 1 tool” for managing e-resources – Heidi Zuniga, University of Colorado medical campus
“IP addresses are not an identity” – Thomas Blood, Naval Postgraduate School
@library_chic: “print books were all shareable across consortia. ebooks are, in most cases, not shareable #consortia #erl12”
@annacreech: “What a cataloger thinks a title is and what a vendor thinks a title is are two different things. #ebookpbook #erl12”
@tmvogel: “UDenver: Going through data fast, but it looks like they saw higher per title usage for the titles in both formats #erl12 #ebookpbook”

Saving the World and Some $$ – Roz at ALA

Monday, January 18, 2010 12:19 pm

My Saturday afternoon at ALA Midwinter went from the sublime to the sublime. First I got to hear Al Gore speak about his new book Our Choice: A Plan to Solve the Climate Crisis – he was inspiring as always (if you find him inspiring, as I do). He made some interesting points about the information and misinformation that have influenced the debate on climate change and our responses to it. I won’t bore you with too many details, but one funny thing he said came in his discussion of the technologies that will bring about real solutions to the crisis. He went through the usual suspects: wind, solar, biomass, nuclear, and then he said ‘there’s one other technology we need, and that is one that removes CO2 from the atmosphere and converts it into something usable – and the good news is we have that technology now – it’s called a tree.’ And he went on to discuss reforestation.

The second half of my Saturday afternoon was spent in a focus group with Alexander Street Press. If you know their products, you know they are top quality and usually top dollar as well. I was asked to participate in a focus group on a new product called American History in Video. I had seen a preview of the product at ALA Annual in Chicago last summer and was eager to learn more about it. About 12 librarians from a variety of institutions were part of the focus group. Some had trialed the product, one had already purchased it, and others, like me, wanted more information. The editor of the product discussed its purpose, development, and content. She also showed us some of its amazing features: transcripts of all the spoken and subtitled words in the films, the ability to create playlists that include clips from any ASP product as well as links to other web content, and more. The development team had lots of questions for us about the kinds of content that might be appropriate to add to the product – unedited interviews, for example, that were later edited and included in PBS documentaries. We were also asked for our opinions on how they might move forward with a world history in video product. There were lots of interesting perspectives in the group, and it was great to work with a company that was so knowledgeable about their products and eager to hear from us. The best news of all, however, is that I won us a year’s subscription to the product!! Its content is so far-ranging that I think faculty from History, Political Science, Communication, Anthropology, American Ethnic Studies, and other departments will find it extremely exciting, and I look forward to what they do with it!

Leslie at MLA 2009

Monday, March 16, 2009 7:59 pm

I’m back from this year’s annual conference of the Music Library Association, held in Chicago (during a snowstorm) Feb. 17-21. This year I also attended the pre-conference hosted by MOUG (Music OCLC Users Group). Some highlights:

Sound Recordings and Copyright

Tim Brooks of the Association for Recorded Sound Collections described the ARSC’s work lobbying Congress to reform US copyright law on pre-1972 sound recordings. These recordings are not covered by federal law, but are often governed by state law, which tends to give copyright holders, in Tim’s words, “absolute control.” Tim cited some startling statistics: of all recordings made in the 1940s-70s, only 30% have been made available by the copyright holders; of recordings made in the 1920s-30s, only 10% are available; and of the enormous corpus of ethnic and traditional music from all over the world that was recorded by Columbia and Victor in the early years of the 20th century, only 1% is available. Because US copyright law for sound recordings is the most restrictive in the world, early recordings of American artists are currently legally available in other countries but not in the US — which means that American libraries and archives are unable to preserve this portion of our own heritage.

In response, the ARSC has made the following recommendations:

  • Place pre-1972 recordings under a single federal law.
  • Harmonize US copyright law with that of other countries.
  • Legalize use of “orphaned” works (whose copyright holders cannot be identified).
  • Permit use of “abandoned” works, with compensation to the copyright holders.
  • Permit “best practices” digitization for preservation. Libraries and archives are the most likely to preserve early recordings (they have a better track record on this than the recording companies themselves) and the least likely to re-issue recordings (so they’re no financial threat to copyright holders).

Of ARSC’s experiences lobbying Congress members, Tim reports that many were simply unaware of the situation, but were sympathetic when informed; that libraries are seen as non-partisan and a public good, “the guys in the white hats”; and that there is now much “soft” support in Congress. Other ARSC activities include a “white paper” for the Obama administration, and the establishment of an organization called the Historical Recording Coalition for Access and Preservation (HRCAP) to further lobbying efforts.

In another copyright session, attendees and speakers offered some good tips for approaching your legal counsel re digitization projects:

  • Present your own credentials (copyright workshops you’ve attended, etc.) pertaining to libraries and copyright.
  • Cite specific passages of the law (section 108, 110, etc.)
  • Show you’ve done due diligence (e.g., you’ve replaced LPs with CD re-issues where available; you’ve determined other LPs are in deteriorating condition, etc.)
  • Try to persuade counsel to adopt a “risk assessment” approach (i.e., just how likely is it that a copyright holder will challenge you in this case) versus the more typical “most conservative” approach.
  • File a “contemporaneous writing” — a memo or other document, written at the outset of a digitization project, in which you explain why you believe that you are acting in good faith. This will go a long way towards protecting you if you are in fact challenged by a copyright holder.

Is the Compact Disc Dead?

This was the question addressed by a very interesting panel of speakers, including a VP of Digital Product Strategy at Universal Music Group; the CEO of the Cedille recording label; a concert violinist (Rachel Barton Pine); a former president of the American Symphony Orchestra League; and a music librarian at Northwestern U.

The panel quickly cited a number of reasons to believe that the CD remains a viable format: among these, the universal human desire to own a physical artifact “to give and to show” and the ability to listen on room speakers, not just earbuds; violinist Pine noted that she sells and autographs some 40-70 of her CDs after each performance, that people enjoy the personal contact with the artist, and relish being able to take home a souvenir of the concert. Flaws of downloadable releases were cited in comparison: garbled indexing, making it difficult to identify and retrieve classical works; frequent lack of program notes to provide historical context; and the inferior audio quality of compressed files. Changes in student behavior were also noted: in online databases, students tend to retrieve only selected works, or excerpts of works; there doesn’t seem to be the same incentive to browse that physical albums offer, with the result that students don’t develop as much in-depth knowledge of a composer’s works. On the other hand, the reduced cost of digital distribution has enabled smaller orchestras and other groups to reach a larger audience.

Concern was expressed over an increasing trend among major labels to release performances only in the form of downloadable files, often with a license restricted to “end user only” — preventing libraries from purchasing these performances and making them available to their users. The panel proposed that performers and IAML (the International Association of Music Libraries) put pressure on the record companies. Alternative approaches? CDs-on-demand: Cedille’s boss sees this as a growing trend. Also, consortial deals with individual record companies: OhioLINK has recently done one with Naxos.

Finally, a concern was expressed about the aggregator model of audio-streaming databases: that these hamper libraries’ responsiveness to local user needs, and the building of the unique collections important for research. The music library community needs to negotiate for distribution models that enable individual selection for traditional collection development.

How Music Libraries are Using New Technologies

  • Videos demonstrating specific resources, such as composers’ thematic catalogs (similar to Lauren’s Research Toolkits).
  • “Un-associations,” in informal online forums like Yahoo or Google groups. There are currently groups for orchestra libraries, flutists, etc.
  • Use of Delicious to create user guides.
  • Meebo for virtual ref.
  • Twitter for virtual ref and for announcements/updates.
  • Widgets and gadgets to embed customized searches, other libraries’ searchboxes, and other web content into LibGuides, etc.
  • ChaCha (a cellphone question-answering service) for virtual ref. Indiana U is partnering with ChaCha in a beta test.

JSTOR

A JSTOR rep presented plans to add 20 more music journals to the database, including more area-studies and foreign-language titles. Attendees pointed out that popular music serials (Downbeat, Rolling Stone, etc.) are becoming primary source material for scholarly research — would JSTOR consider including them? The rep replied that JSTOR originally required that journals be peer-reviewed, but had recently begun to relax this rule. A debate ensued among attendees as to whether the pop publications were sufficiently relevant to JSTOR’s mission — some believed that JSTOR should stick to its original focus on scholarly literature, and that others could preserve the pop stuff.

Bibliographic Control and the LC Working Group (or: Music Catalogers Freak Out)

The MOUG plenary session gave catalogers a forum to discuss ramifications of the LC Working Group’s recommendations on bibliographic control (see my blog posting for RTSS 08). Concerns expressed:

If collaboration is properly defined as “doing something together for a purpose,” then the disparate (and sometimes opposing) purposes of publishers, vendors, and libraries means that LC’s vision of collective responsibility for metadata and bibliographic control will not constitute true collaboration, but merely exploitation.

The Working Group appears to some to harbor a naive faith in digital architecture to meet all discovery and retrieval needs (it reminded one attendee of predictions that microform would solve all our problems). This is perceived to cultivate a global, generalist, one-size-fits-all outlook divorced from existing patterns of scholarly communication and “communities of practice” (e.g., the subject specialist and the community of practitioners that he/she serves). Bibliographic control should be “a network of communication between communities of practice.” An MLA liaison to ALA’s RDA committee noted that the RDA folks expected local catalogers to help fill in the gaps in the currently-vague RDA code — but when specialist communities actually propose details (such as a list of genre terms for music), they’re “dissed.”

Others fear that if LC backs away from its historical role as national library, relying on the larger community of publishers, vendors, and libraries to collaborate in bibliographic control, the actual effect will be that library administrators will think: “If LC isn’t doing this work, then we don’t have to either” — and collaboration will disappear.

Yet others fear the “commodification of cataloging.” With the increasing availability of MARC records and other metadata from third-party sources, there seems to be a growing perception that all metadata is the same — and a concomitant decline in willingness to investigate its source and quality. Administrators increasingly speak of metadata as a commodity.

Remember Katrina?

I’ll close with an item from the business meeting of SEMLA (the Southeast chapter), which was a cause of great celebration: our colleagues from Tulane University in New Orleans, whose music collection was flooded in Hurricane Katrina, announced that 70% of their collection had successfully been restored, and that the last portion of it had recently been returned to them. They brought along a few representative items for show and tell — including a score dyed pink by its red paper covers. Recalling photos of the original damage, a 70% recovery rate seems a miracle!

Leslie at SEMLA ’08

Monday, October 13, 2008 6:25 pm

On Oct. 9, I drove down to East Carolina University in Greenville for the annual meeting of the Southeast Music Library Association. It was a very interesting and varied program this year:

Library “Infomercials”

Nathalie Hristov, Music Librarian at UT Knoxville, gave a presentation titled “The Music Library Infomercial: a Practical Guide for Creating the Most Powerful Marketing Tool You Will Ever Use.” Nathalie had noticed that certain materials in the Music Library — audio-streaming databases, directories, vocational literature (job ads, etc.) — seemed to be under-utilized. She contacted Alan Wallace, UT’s Education Librarian, who had made videos for the main library, about producing an infomercial on the Music Library’s resources and services, with a special focus on the under-used resources, to be shown at the music school’s fall convocation, which all students were required to attend.

The infomercial fulfilled all expectations: surveys conducted before and after showed increased student awareness of the Music Library’s services in general; an increase in the number of students who knew about the under-used materials and who had used or planned to use them; and a large majority who reported that they found the infomercial to be both entertaining and helpful.

Nathalie’s and Alan’s advice on the nuts-and-bolts of producing an infomercial:

Script:

  • Don’t overload your infomercial. Decide what you want to focus on (e.g. under-used resources), and cut your script to make it as concise as possible.
  • Keep narration to a minimum, or you’ll lose viewers’ attention.
  • Speak the students’ language (not librarianese).
  • Play on students’ strengths, wants, and needs (papers due, rehearsals to prepare for, finding a job after graduation).

Scheduling:

  • Create a timeline. Divide the project into sections, and set a deadline for each section’s completion.
  • Stay on schedule to avoid losing currency of information.

Cast:

  • Use local talent. (One option: drama students.)

Taping:

  • Survey your venue for aesthetics. Ugly objects like trash receptacles, signs taped up on walls, etc., are “forgiven” by the eye in real life, but jump out on the video screen.
  • Use cue cards, since your cast are likely not to be trained actors.
  • Use uniform clothing (a school T-shirt is good) for your cast. Otherwise, if you’re filming the same people in separate sessions, subsequent editing can create a comical impression of sudden costume changes (say, for warm and cold weather).
  • Go for interesting angles (from above, below, etc.). In cramped stacks spaces, the UT team shot through openings between shelves.

Editing:

  • The UT team used iMovie, Mac-based video-editing software. They also used Final Cut Pro, but warned that this product was expensive and involved a steep learning curve.
  • Screencasting tools like Snagit and Camtasia can be used.
  • The final step is exporting and burning to disk, which, depending on the application, can take anywhere from a couple of hours to fifteen hours.

Evaluation:

  • Solicit viewer feedback, as the UT folks did with before-and-after surveys.
  • Also important is cost/benefit analysis. Document everything: the UT team made daily records of time spent, tools used, etc.

Embedded Info-Lit

Sarah Manus, Music Librarian for Public Services at Vanderbilt, gave a presentation titled “Librarian in the Classroom: an Embedded Approach to Music Information Literacy for First-Year Students.” Vanderbilt’s music curriculum includes a “core” of four courses on music history and literature which all incoming music majors are required to take. Sarah took advantage of this opportunity to embed herself in all four courses, giving progressive instruction from the basics (the library’s catalog) in the introductory course to advanced research tools (composers’ thematic catalogs) in the fourth. Her original plan was to give two info-lit sessions per course, but faculty subsequently asked her to “front-load” her syllabus with more sessions in the first course.

Sarah’s participation included:

  • Attending all class sessions.
  • Participating in class discussion, when asked to by the instructor.
  • Answering students’ questions about their research.
  • Holding office hours twice a week.

Sarah warned that this degree of embedment required a huge time commitment, especially after the music school added a second section to the core, and she consequently found herself attending class five days a week. Sarah said she also had difficulty remembering which material she had given when to each section!

(It’s also worth noting that Vanderbilt has three music librarians — one for public services, one for cataloging, and a director of the music library — which enabled Sarah to make the necessary time commitment to an embedded project of this scale. As Sarah noted, where one person performs all three roles (like at Wake), or where a large program has several hundred students enrolled, this would not be the most feasible option.)

There were some other unanticipated difficulties with the embedded approach. Sarah’s familiar presence in the classroom led some students to draw the wrong conclusions. The inevitable procrastinators expected her to do their research for them, and others prevailed on her to pull strings on their behalf, such as having library fines forgiven. The instructor had to give the class a stern lecture to the effect that “Sarah is not your slave, and will not do your work for you!” Still, Sarah found that the opportunity to get to know the students and their needs, and to be more closely involved in the overall educational process, was well worth it.

Improvements Sarah plans:

  • Devote more time to the research process. Sarah found that many of the students were used to doing short critical essays, and had never done an extended research project before.
  • Use active learning techniques, such as small-group work.

Ethnological fieldwork

Holling Smith-Borne, also from Vanderbilt, gave a presentation on “Recording the Traditional Music of Uganda.” This was an update on the development of the Global Music Archive project, a website hosted by Vanderbilt that offers audio streaming of traditional music (so far, from Africa). Holling became acquainted with a prominent Ugandan musician who served as an adjudicator for Uganda’s annual national music festival. This man consequently knew all the best traditional musicians in the country, and had an extensive network of contacts with universities, government agencies, and other institutions interested in preserving Ugandan culture. Vanderbilt provided him with a salary, recording equipment, and training, and engaged him to travel the country supplying material for the Global Music Archive. Holling and his team hope to identify similar contacts in other African countries to expand on this work.

They next plan to add to the Archive:

  • Appalachian dulcimer music
  • Indigenous Mexican music
  • An existing Vanderbilt archive of tango music

http://www.globalmusicarchive.org/

Greenville being so near the coast, our guest speaker was retired ethnomusicologist Otto Henry, who shared wonderful reminiscences of his fieldwork on the Outer Banks, recording old-timers singing and playing folk music of the area. Many of his recordings were issued on the Folkways label.

Business meeting

We missed the company of a number of colleagues this year due to cutbacks in travel funding (Georgia’s state library system in fact announced the total elimination of travel funding just a day before the SEMLA meeting). We devoted some time in our business meeting to discussing how the general downturn in the economy was likely to make professional travel increasingly difficult for many for some time to come, and explored ways of compensating for this unfortunate trend, including screencasting future SEMLA meetings.

Also in the business meeting, a student member proposed creating a Facebook account for SEMLA, with the object of outreach to library-school students and of increasing awareness of music librarianship as a career. The idea was well received, and an exploratory committee was set up.

All in all, a very enjoyable and informative meeting this year — I’ve come back with lots of ideas for our LIB250 course and other endeavors!

NCLA RTSS Spring Workshop

Monday, May 26, 2008 3:56 pm

RTSS 2008 – The Future of Bibliographic Control

At NCLA’s Resources & Technical Services Section’s Spring workshop, held this year on May 22 in Raleigh, the keynote speaker was Jose-Marie Griffiths, Dean of the Library School at Chapel Hill, and also a member of a working group charged by the Library of Congress to:

(1) Explore how bibliographic control (formerly known as cataloging, also including related activities) can support access to library materials in the web environment;

(2) Advise the Library of Congress on its future roles and priorities.

The group published its report, titled “The Future of Bibliographic Control”, in January of this year. It’s available on LC’s website: http://www.loc.gov/bibliographic-future/

Concerning the web environment, Griffiths began by noting that many users nowadays turn first to Google or some other search engine for their information needs; that despite the number of web-based library catalogs, there are still many separate library databases that are not accessible by a web search; that, due to the web’s worldwide reach, our users are increasingly diverse, using multiple venues (vendors, databases, social networking, etc.); also, that bibliographic data now comes from increasingly diverse sources via the web; and that, as a result, bibliographic control must be thought of as “dynamic, not static”, and that the “bibliographic universe,” traditionally controlled by libraries, will in future involve “a vast field of players” (including vendors, publishers, users, even authors/creators themselves).

As for LC’s role, the report reminds us that LC’s official mandate is to support the work of Congress. It has never been given any official mandate — and most importantly, the funding — to be a national library, providing the kinds of services (cataloging, authority control, standards) for the nation’s other libraries that national libraries typically do. Of course, over the years LC has become a de facto national library, providing all the above services, upon which not only American libraries but libraries worldwide rely heavily. As this unfunded mandate is rapidly becoming unsustainable, pressures are building to “identify areas where LC is no longer the sole provider” and create partnerships to distribute the responsibility for creating and maintaining bibliographic data more widely (among other libraries, vendors, publishers, etc.); also, to review current LC services to other libraries with an eye to economic viability, or “return on investment.”

To achieve these aims (exploiting the web environment, and sharing responsibility), the working group offers 5 recommendations:

(1) Increase efficiency in producing and maintaining bibliographic data. Griffiths noted that duplicated effort persists not so much in creating bib records nowadays (thanks to OCLC and other shared databases), but in the subsequent editing and maintaining of these records: many libraries do these tasks individually offline. Proposed solutions: recruit more libraries into the PCC (Program for Cooperative Cataloging, the other large research libraries that contribute LC-quality records to OCLC). Convince OCLC to authorize more libraries to upgrade master records (the ones we see when we search) in the OCLC database. Also, exploit data from further upstream: publishers and vendors create bib data before libraries do. Find more ways to import vendor data directly into library systems, without library catalogers having to re-transcribe it all. (This may cause some of us who’ve seen certain vendor records in OCLC to blanch; however, the Working Group’s report adds: “Demonstrate to publishers the business advantages of supplying complete and accurate metadata”[!]). Similarly, recruit authors, publishers, abstracting-and-indexing services, and other communities that have an interest in more precisely identifying the people, places, and things in their files to collaborate in authority control. Team up with other national libraries to internationalize authority records.

(2/3) Position our technology, and the library community, for the (web-based) future. We need to “integrate library standards into the web environment.” Proposed solutions:

  • Ditch the 40-year-old MARC format (only libraries use it) and develop a “more flexible, extensible metadata carrier [format]” featuring “standard,” “non-language-specific” “data identifiers” (tags, etc.), which would allow libraries’ bib data to happily roam the World Wide Web and, in turn, enable libraries to import data from other web-based sources.
  • Relax standards like ISBD (the punctuation traditionally used in library bib records) to further the sharing of data from diverse sources. “Consistency of description within any single environment, such as the library catalog, is becoming less significant than the ability to make connections between environments, from Amazon to WorldCat to Google to PubMed to Wikipedia, with library holdings serving as but one node in this web of connectivity.”
  • Incorporate user-contributed data (like we see in Amazon, LibraryThing, etc.) that helps users evaluate library resources.
  • Take all those lists buried in library-standards documentation – language codes, geographical codes, format designators (GMDs), etc. – and put them out on the web for the rest of the world to use.
  • Break up the long strings of carefully coordinated subdivisions in LC subject headings (“Work — Social aspects — United States — History — 19th century”) so they’ll work in faceted systems (like NC State’s Endeca) that let users mix and match subdivisions on their own; see the sketch after this list. (This is already generating howls of protest from the cataloging community, with counter-arguments that the pre-coordinated strings provide a logical overview of the topic, including aspects the user didn’t think of on their own.)
  • The Working Group supports development of FRBR (Functional Requirements for Bibliographic Records, a proposed digital-friendly standard) but, like many in the library community, remains skeptical of RDA (Resource Description and Access, another proposed standard meant to bring the Anglo-American Cataloging Rules into the digital age) until a better business case can be made for it: “The financial implications … of RDA adoption … may prove considerable. Meanwhile, the promised benefits of RDA — such as better accommodation of electronic materials, easier navigation, and more straightforward application — have not been discernible in the drafts seen to date…. Indeed, many of the arguments received by the Working Group for continuing RDA development unabated took the form of ‘We’ve gone too far to stop’ or ‘That horse has already left the barn,’ while very few asserted either improvements that RDA may bring or our need for it.”
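One of those proposals, splitting pre-coordinated subject strings into facets, is easy to picture in code. Here’s a toy illustration of my own (not anything from the report); the heading text comes from the example quoted above, the “--” delimiter is my stand-in, and real MARC records actually keep subdivisions in separate subfields.

    # Toy illustration: break a pre-coordinated LC subject heading into the
    # individual facet values a system like Endeca could mix and match.
    def split_heading(heading):
        """Split a pre-coordinated subject string on its subdivision delimiter."""
        return [part.strip() for part in heading.split("--") if part.strip()]

    heading = "Work -- Social aspects -- United States -- History -- 19th century"
    print(split_heading(heading))
    # ['Work', 'Social aspects', 'United States', 'History', '19th century']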

(4) Strengthen the profession. Griffiths noted that in many areas we lack the comprehensive data we need for decision-making and for cost-benefit analysis. We need to build an evidence base, and “work to develop a stronger and more rigorous culture of formal evaluation, critique, and validation.”

(5) Finally, with the efficiencies gained from the above steps, LC and other libraries will be able to devote more resources to cataloging and digitizing their rare and unique materials. The Working Group feels that enhancing access to more of these “hidden materials” should be a priority.

Griffiths shared with us LC’s immediate reactions to the Working Group’s report. The concepts of shared responsibility, and of accepting data from multiple sources, were “expected.” More controversial were the shifting of priorities to rare materials; the relinquishing of the MARC format; and the focus on return on investment in assessing standards such as RDA.

LC’s final decisions regarding the Working Group’s recommendations are expected to be announced this summer.

Electronic Resources & Libraries Conference

Thursday, March 27, 2008 4:00 pm

TV Screens at Farmington PL (photo from Flickr)

Before I talk about the conference, here’s one idea from my vacation that might be worth stealing: the public library in Farmington, New Mexico, has a wall section devoted to TV screens. Some show TV news and others display library events and tips (like how to place a hold).

I saw WorldCat Identities for the first time. It uses WorldCat data to graph activity by and about an author over time.

This conference was also my first encounter with LibraryThing’s Unsuggester (Did you like…? Then you will not like…)

Workflow Ideas

  • One library created an e-book task force to look at the Tech Services options for dealing with them.
  • Another library assigned serials staff to manage e-journals based on publisher. Therefore one staff member became adept at the quirks associated with Blackwell and the next with ScienceDirect and so forth.
  • This library also used Gold Rush to evaluate some abstracting databases for overlap.
  • Planned Abandonment must be held in tension with New Initiatives. Any process you abandon will adversely affect a few users. The key is to strategically replace it with something new that will benefit many users.

Collaboration Ideas

OCLC revolutionized data sharing for printed books. How can libraries share data related to e-resources? We could share

  • E-journal title change and transfer data
  • Librarian reviews of databases similar to Amazon reviews of consumer products.
  • Troubleshooting information. Internally, we’ve begun documenting how to recognize and solve specific problems. What if that info were in a public wiki? IMHO, that would be more useful than digging through listserv archives.

SerialsSolutions Presentation

One time slot was devoted to vendor presentations. I chose SerialsSolutions and their 360 Counter usage statistics product.

  • So far it doesn’t download the stats for you (they are waiting for full SUSHI compliance first)
  • It normalizes titles using the SeSo knowledge base
  • It assigns (SeSo’s) subjects to journals
  • It assigns cost per use. (It’s unclear how much manual input we would need to supply to realize this; a rough sketch of the underlying math follows this list.)
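The cost-per-use figure itself is just annual cost divided by reported use; here is that arithmetic in miniature (my own illustration, with invented titles and numbers, not SerialsSolutions’ actual implementation):

    # Minimal sketch of the cost-per-use calculation a product like 360 Counter
    # automates: annual subscription cost divided by COUNTER-reported
    # full-text downloads. Titles and figures are invented.
    annual_stats = {
        # title: (annual_cost_usd, full_text_downloads)
        "Journal of Examples": (1200.00, 480),
        "Quarterly of Placeholders": (800.00, 16),
    }

    for title, (cost, downloads) in annual_stats.items():
        cpu = cost / downloads if downloads else float("inf")
        print(f"{title}: ${cpu:.2f} per use")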

Marketing Ideas

I also went to a session on marketing electronic resources. Very little of this presentation had to do with e-resources specifically, but there were plenty of ideas for library marketing in general. A few we might try…

  • Branded coffee sleeves (in our new coffee shop?)
  • Branded sticky notes inserted in our annual letter to faculty
  • They also mentioned linking to your digitized collections from Wikipedia, but Digital Forsyth has already done this.

Concluding Thoughts

Users don’t compare us to other libraries and universities. They compare us to other information providers like Google.

Finally I did a personal e-book experiment on my conference trip. I downloaded a book from Project Gutenberg to my PDA and read it on the subway and during other downtimes. I read the first few paragraphs about ten times before figuring out a good way to move a virtual bookmark. (I cut and pasted the word “BOOKMARK” every time I moved ahead in the book.) I finished the book on my last day. The book was merely OK, but I enjoyed the PDA format enough that I will try it again next time.

Electronic Resources and Libraries–Final thoughts

Monday, February 26, 2007 2:32 pm

Electronic Resources and Libraries is an excellent conference: focused, well-run, and invigorating.

I highly recommend it for folks who may want to attend next year.

The entire conference–schedule, PowerPoint presentations, conference blog, conference wiki, and audio from each presentation–is available online, and if you’d like more information, I’d be happy to help you find it. (That is, I’d be happy to log in and get the information to you.)

If you’d like, check out some photos taken by ERL staff or check out some photos uploaded by ERL attendees.

Electronic Resources and Libraries, Day 3, Continued

Monday, February 26, 2007 2:21 pm

On Saturday morning, I attended 2 sessions.

The Challenges and Opportunities for Cataloging in Today’s Changing Metadata Environment

Sandy Chen, New College of Florida.

Chen started with the question: Must MARC die? She concluded no, thanks only to XML, which is breathing new life into MARC.

She then provided a broad overview of several metadata schema and crosswalks necessary to work back and forth among them.

Next, she provided an overview of FRBR and examined what OPAC display might be like under FRBR.

And lastly, she looked at FAST (Faceted Application of Subject Terminology)–which I had never heard of. The easiest and quickest explanation I can provide is that it’s a new protocol which breaks LC subject headings into their constituent parts and treats each part as its own individual tag in order to facilitate the tagging of texts in an electronic environment.

I’d be happy to discuss this presentation in further detail with anyone who’s interested, but I imagine this will be limited to a small subset of Technical Services folks….

Mediawiki Open-Source Software as Infrastructure for Electronic Resource Outreach

Millie Jackson, Jonathan Blackburn, FSU

This presentation explained FSU’s nascent attempts to create wiki subject guides to replace their static subject guide web pages. The presenters explained why they chose MediaWiki and how they were providing wiki training to subject/outreach librarians at FSU. They also commented on their fast roll-out (accomplished in only a month) and on their need to develop a set of “best practices” to share among their colleagues in order to ensure a quality presence.

Furthermore, they discussed how their implementation doesn’t really fit the Wikipedia model in that only the subject/outreach librarians will have editing privileges. On the surface, this seems like a perfectly reasonable choice. But I have to admit that I had an epiphany during this presentation–but not necessarily from anything said during this one hour. Instead, I think it was a culmination of the entire conference which came to fruition during this final session.

And the epiphany took the form of a question–why not provide a true wiki for subject guides? Why not trust the user-generated content? Why position the librarian as the sole “expert” in this situation? As a profession, we are frequently lamenting the fact that we can’t seem to get professors to allow us to “partner” with them. However, in this instance, aren’t we being guilty of not being willing to partner with our users? Aren’t we constructing a barrier behind which we can claim a position of authority? Isn’t this what frustrates us about faculty?

Erik and I have been round-and-round about the idea of trust, and I’m finally ready to make that leap with Erik…. It’s time for us to open ourselves up–and trust our users.

It’s also time, on a larger scale, to stop doing things just because we’ve always done them.

