Professional Development

Author Archive

NASIG 2015

Friday, June 5, 2015 4:50 pm

Last week I attended the 30th annual NASIG conference, presided over by our own Steve Kelley, who looked more and more carefree as the conference progressed and he got closer to handing over the presidential gavel. Well, at NASIG the outgoing president actually receives a gavel; the new one usually gets a hat. This was also the first conference under the newly-official name “NASIG” (no longer the “North American Serials Interest Group”).

The presentations at this year’s NASIG conference (or at least, the presentations I attended) seemed to steer away from “how we done it good” and focused instead on “here’s what we learned from looking at the data.” The following synopses are taken primarily from my notes, so I apologize for any misrepresentation.

  • Marlene van Ballegooie, from the University of Toronto, spoke about the OCLC Knowledgebase (OCLC KB), which is designed to reduce the time librarians spend managing e-resource holdings. Rather than the library having to communicate to the KB provider which journals they subscribe to from which publishers, the publisher sends a holdings file directly to the knowledgebase. Van Ballegooie attempted to assess the effectiveness of the service by comparing each load against information she obtained directly from the publishers. Results of course varied by publisher, but common problems were irregular data loads, and a time lag between a title’s availability at the publisher site and its activation in the KB. Also, any local changes get overwritten in the next data load. But the presenter concluded that the method does have potential for saving time, especially with custom packages or aggregator platforms where manual selection is necessary (such as e-book providers).
  • Gabrielle Wiersma and Esta Tovstiadi, from Univ. of Colorado at Boulder, presented an analysis of approximately 100 randomly selected e-books published in 2014 across multiple platforms. Using a rubric based on a tool developed by the Center for Research Libraries, they assessed 16 aspects of the user experience, such as metadata, linking, and pagination. Some examples from their findings:
    Metadata – some platforms include subtitles, others do not; “date” may refer to date published, copyright date, or date posted online; editors are sometimes named as authors; etc. In none of the cases they examined did the platform-generated “MLA citation” actually match MLA format.
    Searching – different platforms may return search results at the word, page, or chapter level. Most (61%) were chapter-level, which is probably the least useful for searchers.
    Pagination – system page numbers often don’t match the page number displayed on the PDF (probably due to how front matter is counted); in EPUB format, page numbers are often missing altogether.
    The presenters showed examples of how search results may vary wildly from one platform to the next. This can be caused by search functionality, such as auto-stemming, or how the platform treats hyphenation, or whether it defaults to AND or OR searches. They also found problems caused by OCR spacing errors — e.g. “Japa nese” or “infl uential”, or words joinedtogether withouta space.
    See their slides on SlideShare for side-by-side examples.
  • Michael Matos of American University shared his analysis comparing library journal holdings to works referenced in faculty publications. The goal was to use the data to demonstrate the extent to which faculty rely on the library for their research. I confess that his complex methodology lost me. Next steps include looking more closely at the referenced materials that are not held by the library, then comparing that list to ILL data (thus demonstrating that the researcher also used the library for those materials).
  • In “Strategies for Expanding eJournal Preservation,” Shannon Regan, from Columbia University, described a Mellon Foundation Grant-funded project to identify e-journals that are not currently being preserved by a trusted 3rd-party repository, learn why they are not being preserved, and explore ways to get them preserved. I was kind of surprised, and kind of not-so-much, at the amount of content—even from major publishers—not being preserved. As for reasons why, I came away with the impression that the most prevalent reason is a question of rights/permissions. In some cases, a publisher may not have secured rights from the authors; in other cases, publishers (typically smaller ones) have no understanding of the need or the process for preserving content, or may fear a loss of control over the content (thinking, for example, that permitting an archiving agency to preserve the content would be equivalent to making the journal open access). Other times, the step of preservation may just slip through the cracks. Regan recommended that librarians should make preservation a part of the conversation with publishers, vendors, consortia, faculty, and other stakeholders.
  • In a fun presentation, Kristen Garlock of JSTOR and Eric Johnson of the Folger Shakespeare Library described some projects/products developed as an outgrowth of usage data. The first was JSTOR Classroom Readings (http://labs.jstor.org/readings), a free tool intended to give educators a list of articles for core courses. Developers had originally wanted to gather college syllabi and curate a list of articles from those, but there were too many obstacles. So instead they looked at usage data for signs of “teaching use” (short bursts of use at a single institution). Though not perfect, and not yet considered final, Garlock seemed pleased with the methodology and the resulting product. Johnson talked about (among other projects) a JSTOR tool called Understanding Shakespeare (http://labs.jstor.org/shakespeare). The user can select a play, then choose a line in that play and get a list of articles in JSTOR that quote that line. Again, not complete (only includes 12 plays so far), but a pretty nifty tool.
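The OCR spacing errors mentioned in the e-book findings above are more than cosmetic: “Japa nese” will never match an exact search for “Japanese.” As a purely illustrative sketch (the word list and heuristic are invented, not from the presentation), one way a platform could rejoin such splits:

```python
# Sketch: repair OCR word splits ("Japa nese" -> "Japanese") by rejoining
# adjacent tokens when neither half is a known word but the joined form is.
# KNOWN_WORDS is a stand-in for a real dictionary.

KNOWN_WORDS = {"japanese", "influential", "the", "a", "an", "study", "of", "poetry"}

def repair_splits(text):
    tokens = text.split()
    out = []
    i = 0
    while i < len(tokens):
        if i + 1 < len(tokens):
            joined = (tokens[i] + tokens[i + 1]).lower()
            halves_unknown = (tokens[i].lower() not in KNOWN_WORDS
                              and tokens[i + 1].lower() not in KNOWN_WORDS)
            # Join only when the halves look like fragments and the
            # concatenation is a real word.
            if halves_unknown and joined in KNOWN_WORDS:
                out.append(tokens[i] + tokens[i + 1])
                i += 2
                continue
        out.append(tokens[i])
        i += 1
    return " ".join(out)

print(repair_splits("a study of Japa nese poetry"))  # -> "a study of Japanese poetry"
```

A real platform would need a much larger dictionary and a way to handle the opposite error (words joined together without a space), but the sketch shows why indexing raw OCR text degrades search results.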

In other sessions, I learned a few new Excel functions to try out, plus a couple of things to try with CORAL. I was also pleased to hear EBSCO Chief Strategist Oliver Pesch say very plainly and repeatedly that EBSCO supports customer choice and is actively seeking ways to optimize customer choice. I felt encouraged when he said that “no one vendor can offer libraries all the resources” they need, and that if you want to use an EBSCO product for one part of your workflow and a competitor’s product for another part, the workflow should not only work, it “should be optimized.”

Finally, my favorite quotes from the conference:

Scott Vieira, Rice Univ., referring to typical functionality in e-resource management systems: “Forcing the acquisition of e-resources into a linear workflow is like trying to train tortoises to walk in a straight line.”

Marcella Lesher, St. Mary’s Univ., about a journal weeding project (I’m probably paraphrasing): “We’re talking here about the care and feeding of print resources … although at this point we’re probably starving them to death.”

Derrik at the NC Serials Conference, 2015 edition

Monday, April 20, 2015 2:03 pm

Well, the latest few PD blog posts have guilted me into finally writing about my trip to the 2015 NC Serials Conference. Now, if I can just find my notes …

Aha! Here we are.

Steve’s post already covered Katherine Skinner’s opening keynote address quite well. I’ll add an “Aha” moment I had. Do you know who invented the incandescent light bulb? Hint: it wasn’t Thomas Edison; he merely perfected the design. Skinner also said that the jukebox was not invented by the record industry. Lesson: innovation per se isn’t the only thing that matters, and positive changes can come from outside the places you’d normally think to look.

I presented a session on library-vendor negotiation, along with co-presenter Lesley Jackson, our EBSCO Account Manager. We presented nine different principles of negotiation, along with examples. There were things like “Be prepared,” “Don’t be afraid to ask,” and “Don’t take it personally.” We finished earlier than expected, but the audience participated and asked good questions. We had a number of vendor reps in the audience too, which made it more fun.

Another plenary session was a panel discussion about text and data mining. A fair amount of this was over my head, but one thing that was clear is that everybody’s still trying to figure it out. The vendor representative on the panel pointed out the difficulty vendors have with managing and licensing text mining because librarians can’t really articulate what “text mining” means. But it was also pointed out that (1) it means different things to different libraries and to different researchers; and (2) in many cases the researchers themselves don’t yet know where the research will take them, so it’s hard to know what permissions to ask for.

Derrik at ALA Midwinter 2015

Monday, February 9, 2015 11:49 am

Vendor highlights

Lauren and I had a really good dinner discussion with a VP of a database vendor, talking about what is and isn’t important for researchers and libraries. That VP and our regular sales rep have already scheduled a campus visit to continue the conversation.

I had a conversation with a publishing company’s VP of Sales regarding demand-driven acquisition (DDA). I described the DDA usage and spending patterns we have seen here, and we talked about the difficulties of finding a sustainable balance for publishers and libraries. We also talked about “evidence-based acquisition” (EBA), where the customer pays first for access, then at the end of the access period can select content for perpetual access, up to the amount paid. I told the VP that the entry cost for EBA is typically too high. He immediately understood—the up-front price that a large library could afford would be cost-prohibitive for smaller libraries. He seemed to like my suggestion that they base the entry cost on the customer’s historic spend.

I had a good meeting getting to know our e-book vendor’s new rep, and his supervisor sat in on part of our meeting so I was able to bend her ear too, mainly about DDA. I learned that there is talk of developing a variation on the short-term-loan DDA model, though nothing concrete yet as far as I know. I don’t want to divulge any secrets here, but I am cautiously optimistic about what they told me.

There were lots of other productive conversations; in all I spoke with at least 17 vendors (that I kept track of). It feels weird to keep this section of my report so brief, but I fear the rest of the vendor stories would get tedious.

Sessions

A speaker from a large university library described how they collect and analyze data about e-resource outages. Staff enter and track e-resource problem reports in a commercial incident-tracking system. They record the cause (e.g. metadata error, simultaneous-user limit, user error, etc.), the time it took to resolve, and other data. Tracking outages allows them to become aware of trends. One benefit is that they can present a record of incidents to vendors, with actual numbers instead of “your site goes down a lot.” In the first year of collecting data, proxy problems accounted for a small fraction of the total errors, while 25% of errors occurred because the target content was missing from the vendor’s site (i.e. an article or issue missing from a database).
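The kind of incident summary the speaker described (counts and resolution times broken down by cause) takes very little tooling. A minimal sketch, with invented field names and numbers rather than figures from the talk:

```python
# Sketch: summarize e-resource incident reports by cause, with average
# resolution time, so trends can be shown to vendors with actual numbers.
# All records below are illustrative, not data from the presentation.
from collections import defaultdict

# (cause, hours_to_resolve)
incidents = [
    ("missing content", 48), ("missing content", 24), ("missing content", 12),
    ("metadata error", 6), ("metadata error", 10),
    ("simultaneous-user limit", 1), ("proxy", 2), ("user error", 1),
]

def summarize(records):
    by_cause = defaultdict(list)
    for cause, hours in records:
        by_cause[cause].append(hours)
    total = len(records)
    return {
        cause: {
            "count": len(hours),
            "pct": 100 * len(hours) / total,
            "avg_hours": sum(hours) / len(hours),
        }
        for cause, hours in by_cause.items()
    }

report = summarize(incidents)
```

With a summary like this in hand, “your site goes down a lot” becomes “missing content caused 3 of our 8 incidents and averaged 28 hours to resolve.”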

In another session, a representative from a large university press spoke about how usage-based acquisition is affecting the Press. She acknowledged that DDA is scary because they know that not every book will get used, but the only way to know which books will get used is to publish them. She said it will take a while for them to evaluate DDA because they don’t know yet when the revenue for a book will come in and it is difficult to assess which marketing efforts are working. She also expressed a concern that was a new idea to me—she wondered whether access to a large pool of DDA titles might actually obscure the fact that libraries are underfunded.

I attended a presentation by Len Vlahos, Executive Director of the Book Industry Study Group (BISG). Vlahos said wholesale book revenue has remained fairly flat over the past five or six years. The rate of growth of e-book sales has slowed (i.e. still growing, but the curve has flattened); hardcover revenue dipped in 2010 but has since recovered. Sales of print textbooks are declining, but that trend is publisher-driven, unlike the consumer-driven trade market. Publishers are developing online interactive learning systems as a replacement for printed textbooks, since textbooks that are simply digitized versions of the print are not well received. Vlahos predicted that the next big disruption in the book industry will be a business-model innovation (like retail discounting in the 1970s or e-commerce in the 1990s) rather than a technological one (like the printing press or the Kindle). He noted the growth of a subscription economy, in which consumers are being trained that it’s ok not to own content (Netflix, Spotify, Pandora, etc.), and even beyond content (ZipCar, bikeshare), and suggested that publishers expect the subscription model to have a positive effect on revenue within the next 5 years.

The Continuing Resources Standards Forum included an overview of the NISO Recommended Practice for Demand-Driven Acquisition of Monographs. The standard was published last June and to me it already felt a little out of date because it doesn’t address some of the more recent tensions in the DDA market. The Forum also included a review of the very new (published last month) NISO Recommended Practice on Access and License Indicators. This is a simple standard for encoding at the article level whether or not that article is “free to read,” plus a link to the article’s license information.
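Since the Access and License Indicators practice reduces to two article-level fields, a small sketch shows what consuming them might look like. The XML fragment below only approximates ALI-style tagging; the actual element names and namespace should be checked against the published recommended practice before relying on them:

```python
# Sketch: read the two ALI indicators (free_to_read, license_ref) from an
# article metadata fragment. The fragment and namespace are approximations
# of ALI tagging, not an authoritative encoding.
import xml.etree.ElementTree as ET

ALI = "http://www.niso.org/schemas/ali/1.0/"
fragment = f"""
<article-meta xmlns:ali="{ALI}">
  <ali:free_to_read/>
  <ali:license_ref>https://creativecommons.org/licenses/by/4.0/</ali:license_ref>
</article-meta>
"""

root = ET.fromstring(fragment)
# Presence/absence of the element encodes the yes/no "free to read" status.
free_to_read = root.find(f"{{{ALI}}}free_to_read") is not None
license_ref = root.findtext(f"{{{ALI}}}license_ref")

print(free_to_read, license_ref)
```

The appeal of the standard is exactly this simplicity: a discovery system only needs to check for one element and one link.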

In an excellent overview of linked data, the presenter described the evolution of the Web from a web of static documents to a web of data. In the web of data, instead of describing an entity with a record (i.e. a surrogate for the entity), an entity has its own unique identifier, and that’s where you go for information about that entity. Note that BIBFRAME is about identifying bibliographic entities. The presenter said that libraries have been very involved in the web of documents, but cautioned about the danger of a “library-shaped black hole” in the web of data. Library projects have tended to use library vocabulary instead of the vocabulary of the larger web, so it is difficult for web searches to find and link to them. The presenter said that the reason libraries should share linked data on the web is the same as the historical reason for cataloging – “So people can find our stuff.”
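The shift the presenter described, from records that act as surrogates to identifiers you go to for information, can be illustrated with a toy triple store (all identifiers and data below are invented for illustration):

```python
# Sketch: in the "web of data," statements are (subject, predicate, object)
# triples whose subject is a global identifier, not a local record.
# Identifiers and data are invented, not from any real dataset.
triples = [
    ("http://example.org/person/jane-austen", "name", "Jane Austen"),
    ("http://example.org/work/pride-and-prejudice", "author",
     "http://example.org/person/jane-austen"),
    ("http://example.org/work/pride-and-prejudice", "title", "Pride and Prejudice"),
]

def about(entity, data):
    """Collect everything asserted about one identifier: you 'go to the
    identifier' for information, instead of retrieving a record."""
    return {pred: obj for subj, pred, obj in data if subj == entity}

info = about("http://example.org/work/pride-and-prejudice", triples)
```

Note that the object of the "author" statement is itself an identifier, which is what makes the statements linkable; this is also why vocabulary choice matters, since a web search can only follow links expressed in terms it recognizes.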

Derrik’s ALA roundup

Wednesday, July 9, 2014 4:43 pm

I’ve sorted my 2014 ALA Annual Conference experience into 3 categories–Committee work, Vendor chats, and Sessions.

Committee work

The ALCTS Standards Committee was formed last fall to promote member involvement in and education about the development of information standards. Part of my assignment on that committee is to act as liaison to the ALCTS Continuing Resources Section (CRS), which means I got to go to two sets of committee meetings for the price of one! Saturday morning I traveled to Paris (i.e. Paris Las Vegas) where CRS was holding its “all committees” meeting. I met with the CRS Standards Committee, Cataloging Committee, Committee on Holdings Information, and the CRS Executive Committee to discuss the division-level committee’s charge and the best way for me to liaise with CRS. Then Sunday afternoon at the ALCTS division-level “all committees” meeting, we discussed reports from the 5 sections of ALCTS. We are still trying to pin down the best ways to carry out our charge. We are looking for ways to foster collaboration between the sections, and trying to determine the best way to interact with external standards organizations.

Vendor chats

With all the committee meetings and sessions I needed to attend, I wasn’t sure I would have enough time in the vendor exhibit hall. As I look at my results, I’m still not sure how I packed all this in. I won’t give details here, but feel free to follow up with me if you want to know more about any of these.

I had some fairly long, productive discussions with
JSTOR – ebooks & DDA
YBP – DDA profile management (deletions) & more
NYTimes – academic site license
EBSCO – Usage Consolidation questions
Kanopy – DDA
ProQuest/EBL – STL pricing, e-books in Summon, Academic Complete & DDA, etc.

Lauren, Jeff, and I attended a ProQuest-sponsored discussion about DDA, with librarians and publishers participating. Basically, everybody is struggling to adapt.

I had shorter discussions with
Wiley – new article interface coming
Data-Planet – new Java-free interface for Statistical Datasets coming
CLCD – we’re trying to get their author-title catalog lookup to work
Project MUSE – e-books, DDA (which they aren’t doing), & evidence-based acquisition (which they’re working on)
Taylor & Francis – e-book STL pricing
ProQuest – brief Intota demo
McFarland – thanked them for participating in NC LIVE’s Home Grown e-books pilot
BrowZine – now has an iPhone app
and the New York Philharmonic Archives, a free resource I had been unaware of – do you know what they played at their first concert?

And several others, of course. I met some sales reps that I had previously only corresponded with by e-mail (Alexander Street, SAGE, Taylor & Francis, and Euromonitor). I even managed to find a few book signings with no lines!

Sessions

Michael Levine-Clark gave an e-book usage report, very similar to the one I attended at Midwinter last January. The basic (unanswered) question is “What constitutes meaningful use of an e-book?” (Or from a more practical standpoint, what type(s) of e-book use should we be measuring?) At one point, Levine-Clark suggested that an e-book being downloaded may be an indicator of significant use, but ZSR’s early data seemed to indicate that a download was usually an indicator of the user’s unfamiliarity with the platform. Answering an audience question about user preference, Levine-Clark said that if you ask users “Do you prefer print books or e-books?” most of them will select a preference, but if your questions are more nuanced (Which do you prefer for looking up a fact? Which do you prefer for immersive reading? If you could get an e-book immediately but had to wait 5 minutes for a print book, which would you prefer?), then no clear preference emerges, at least in the research he has done. Levine-Clark’s presentation slides are available at www.slideshare.net/MichaelLevineClark.

In the CRS Standards Forum, presenter Aron Wolf, a ProQuest software developer, spoke about the NISO Recommended Practice called IOTA (“Improving OpenURL Through Analytics”). IOTA, released in 2013, came out of a 4-year research project to produce a standard way for link resolver vendors (think of the WFU Full Text Options button) to measure & describe how well their product works. Wolf, who was part of the working group for IOTA, said they concluded that there was not an objective, cross-vendor metric, so IOTA instead recommends a methodology for testing a link resolver against itself. At that point he either lost me in the technical details or else I understood it so well I didn’t think I needed to take any notes. One future possibility he suggested was that it may become possible for reports to check accuracy within a matter of hours, enabling link resolvers to respond much more quickly when publishers change linking formats.

The last session I attended at ALA was about “articles on demand,” also called pay-per-view (PPV), which is essentially DDA for journal articles. First, Beth Bernhardt from UNCG talked about their experience dropping PPV about 9 years ago in favor of “big deal” journal subscription bundles, and now having to reconsider PPV in light of ongoing budget cuts. Susanna Bossenga from Northeastern Illinois University explained her library’s on-demand article delivery, which they currently provide using the Copyright Clearance Center’s Get It Now service. Article requests are mediated and processed by their ILL department. Finally, Mark England from the University of Utah described their implementation of ReadCube Access. When authenticated users come across un-owned articles, ReadCube Access presents them with options to rent for 48 hours, download, or get a “cloud” copy (online reading only, with no time limit). The library pays, of course, with each access option costing a different amount–$4 to rent, $10 for the cloud option, or $25 for a downloadable PDF.

 

That summarizes my conference, and may help explain why I still feel jet-lagged a week later. Speaking of jets, my flight home left Las Vegas 10 minutes before a National Weather Service Excessive Heat Warning went into effect. I must say that if the heat over the weekend wasn’t “excessive” (Monday’s high was 111°), I’m really glad I got out when I did.

 

Derrik’s takeaways from NASIG 2014

Tuesday, May 20, 2014 4:45 pm

The 2014 conference of the North American Serials Interest Group (NASIG) was a good one, from my point of view. A wide variety of topics and some very good keynote addresses gave me lots to chew on.

Since I started attending NASIG conferences 12 years ago, one of my favorite aspects has been the vendor involvement. NASIG is not just a librarians’ organization; subscription vendors and publishers are also encouraged to join, serve on committees, and be otherwise involved in the organization. Each conference usually includes sessions in which a vendor/publisher perspective is offered. I attended four such sessions this year, including “Vendor Lightning Talks,” a new conference feature in which six different content providers each took 5-7 minutes to tell what’s new with their products. I heard a panel of publishers (Nature, American Chemical Society, and IEEE) describe what they are doing in the Open Access arena, and a subscription vendor and a library consortium officer described their negotiation processes. My favorite vendor-related session was one in which a vendor sales rep, a consortium officer (the same from the previous presentation), and a librarian sat together on the stage and discussed a set of ethical questions (e.g., “Is it fair for a library to write an RFP so narrow that it is obviously customized to a specific vendor?” or “Is it fair for a vendor to go over the head of an acquisitions librarian if he/she says no?”). The probing into gray areas was a good exercise in seeing the other side’s point of view.

In other practical areas, I heard presentations about

  • results of an availability/usability study, in which students were able to successfully locate full text only 41% of the time, sometimes due to system errors, but in many cases, the students simply did not click on the right link, or missed key information that was presented on the screen;
  • survey results regarding methods of tracking perpetual access to online journals, which reminded me of the need to distinguish between post-cancellation access (usually on the publisher’s website) and archival access, meaning access when the publisher no longer provides it (often via Portico or LOCKSS);
  • updates on some emerging NISO standards – PESC, a communication standard for transmitting serial content; KBART, related to vendor knowledge bases; PIE-J, which has to do with how e-journals, especially title changes, are presented on vendor websites; ODI, for sharing metadata for discovery systems; and OAMI, a new metadata standard for open-access content, which includes the wonderful (IMO) feature of not referring to an item as “open access” but rather as “free-to-read” (yes/no).

What really made this year’s conference stand out for me was the amazing slate of “Vision session” (i.e. keynote) speakers. Katherine Skinner’s opening address on “Chance, Choice, & Change” made two particular impressions on me: (1) “Frontier” depends on your viewpoint: you may see empty space ready for development, but “empty” space is never truly empty, and there will often be people who see your pioneering as encroaching on their territory; (2) the cultural processes of production, distribution, and reception “always, always, always” depend on networks of people, not the lone genius.

Chris described Herbert Van de Sompel’s thought-provoking address very well. This address got me wondering how to get the broader scholarly communication world to see the problem of “reference rot.”

In the closing keynote address, Jenica Rogers talked about how often she hears people say “I could never do what you did” (i.e. cancel the library’s ACS package), but she said she believes what they really mean is they would like to, but … (“but our faculty would riot”; “but I don’t want to rock the boat”; etc.). So Rogers presented a list of actions/habits that would help prepare us to make life’s tough decisions. Here is the list as I captured it:

  • Know thyself – know why you do the things you do
  • Claim and demonstrate your expertise & authority – know your reputation and how to leverage it
  • Gather data – evidence can shout when you can only whisper
  • Make friends
  • Start now, immediately
  • Find common ground – insisting that everybody else thinks X is important will only frustrate & annoy the people who don’t
  • Communicate effectively
  • Embrace serendipity
  • Evolve, even when it’s uncomfortable
  • Release fear – “Fear doesn’t make smart decisions, fear makes safe decisions.”

 

 

ALA Midwinter according to Derrik

Friday, February 7, 2014 12:09 pm

Vendor meetings

As usual, I spent a large part of this conference in the vendor exhibit hall.

I learned that Alexander Street Press is close to signing a deal to offer a certain film collection that I’ve heard people here express specific interest in (I don’t want to jinx the deal by naming the collection before it’s finalized). Bad news: it will be by subscription only, at least at the start. I also had an interesting after-dinner conversation with the President of Alexander Street Press, about how hard it is to come up with a good short-term loan model for streaming media (do you charge by the minute? what’s the appropriate price point?), and the difficulty of getting rights holders on board with it.

Data-Planet (provider of the Statistical Datasets database) is targeting June/July for the release of a Java-free user interface, and is also contemplating offering a one-time purchase option for their Statistical Data Sheets.

Elsevier‘s main development focus seems to be on Mendeley right now.

I learned more about the respective e-book models of both JSTOR and Project MUSE. Both e-book collections primarily feature university presses. A Project MUSE presenter said they try to select their e-books to match their existing subject strengths, so that the journal and book collections will complement each other. MUSE offers single-title purchasing via YBP, and JSTOR expects to offer it “in a few months.” JSTOR also offers a DDA model for e-books.

Oxford University Press now offers individual title purchasing for their Oxford Scholarship Online books. They are also now offering journal backfiles for title-by-title purchase.

A week before ALA, ZSR was asked/invited to become a Beta test site for the new EBL administrative module. So I spent half an hour at the ProQuest booth with Alison Bobal of EBL, getting a sneak preview. The new module seems much easier to use, and includes some functionality that up until now could only be handled by contacting EBL support, so I’m excited to be an early adopter. I also appreciate the opportunity to help shape the product. We went live on the new admin module today!

At the Third Iron booth, I learned more about a new feature of BrowZine, a product we subscribed to last August that allows WFU users to create a personalized collection of library-subscribed journals on their mobile devices. BrowZine can now include journals from ProQuest, EBSCOhost, and Ovid aggregator databases (initially it only included journals on publisher websites). We have, of course, turned on this new feature.

I learned about a couple of new e-book providers, and also had one-on-one meetings with our sales reps from APA Publishing, SAGE, Springer, Taylor & Francis, Thomson, and Wiley.

 

Committee meetings

I am in my third year of serving on the ALCTS Transforming Collections Task Force. The Task Force manages an ALCTS microgrant program to fund projects in support of the ALCTS goal of transforming collections. The first year we received a good number of applications, but the second year (last year) we only got a few, so a good portion of this committee meeting was spent discussing whether or not to continue the microgrants (we decided yes, for at least one more year) and how to drum up applications. We talked about the many different ways in which collections and collecting are being transformed (e.g. shared collections, DDA, digitized local collections, open-access journals, user-generated content, etc.) and brainstormed ways to promote the theme of transforming collections.

I am also a member of the newly-formed ALCTS Standards Committee. The purposes of the committee are to educate ALCTS members about and encourage their involvement in the development of relevant standards and to support ALA’s voting representative to NISO. As this was the committee’s first meeting, it was mostly about getting organized and discussing how best to fulfill the committee’s role.

At my previous ALA conferences, I have enjoyed attending presentations sponsored by the Publisher-Vendor-Library Relations Interest Group (PVLR). At last month’s conference, I had (or made) time available to attend the PVLR business meeting. The group is made up of 3 co-chairs and anybody else who wants to attend. There were about 15-20 people there, and the meeting was simply an open discussion of possible topics for future PVLR presentations. Ideas included security/hacking; data mining & analysis (how to explain legitimate uses to publishers, how to explain rights holders’ concerns to the end-user); self-publishing; hybrid Open Access; and the future of society publishing.

 

Presentations

I did manage to attend a few presentations. In one, Rick Anderson, Associate Dean for Scholarly Resources & Collections at the University of Utah, discussed “predatory publishing,” and what makes a publisher “predatory.” Anderson admitted that there could be several kinds of predation, but he focused on two broad categories: misrepresentation (e.g. deliberately misleading journal titles, publisher name mimicking a legitimate-sounding organization, fictitious editorial board or real people’s names used without permission) and selling false prestige (e.g. false claims of peer review or impact factor). Anderson encouraged getting the word out to scholars about predatory publishers, but emphasized the need to do it delicately, lest we inadvertently send the message that all open-access, third-world, small, or new publishers are bad. Following Anderson’s remarks, Regina Reynolds of the U.S. ISSN Center discussed how the ISSN network (U.S. and internationally) can and cannot help. Reynolds stressed that the ISSN is not a stamp of legitimacy; it’s just a “dumb number.” On the other hand, the ISSN network recognizes that they cannot be perceived to enable fraudulent publishing, so they have established some guidelines, such as no longer assigning an ISSN prior to the publication of the first issue, and being more careful about publishers requesting a large block of ISSNs.

In a session sponsored by ProQuest, Michael Levine-Clark presented results of his analysis of e-book usage over multiple years on the ebrary and EBL platforms. ProQuest had provided him with usage data from both providers, covering 4 years and 750,000 titles. It’s hard to pick out the salient points when there was so much information presented, but here’s my version of the highlights:

  • University press titles consistently got higher use (sessions, page views, printing, etc.) than the overall collection average. BUT this might simply be because university press titles are available in more libraries and therefore to more users. Levine-Clark was working with aggregate data, and did not have information about individual library usage or holdings.
  • Books in the Social Sciences seem to be used at a slightly higher rate than the Humanities or STM. BUT page views and printing per user session were highest in STM. In other words, even though a lower percentage of STM books got used, they seem to get used more intensely.
  • Question: Do more page views per session mean more time spent in the book, or just rapid "flipping" through? Levine-Clark did not have data on the amount of time spent in the book.
  • Question: What constitutes a meaningful use of an e-book? Levine-Clark suggested that copying may be the best measure (indicates the user found something they wanted to save). Printing or time in the book might be other possibilities, though it is not uncommon for a user to print something just for offline reading.
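As a rough illustration of the kind of per-session "intensity" comparison described above, here is a minimal sketch; the numbers are entirely hypothetical, not Levine-Clark's data:

```python
# Hypothetical COUNTER-style usage figures by subject area
# (illustrative numbers only, not the actual study data).
usage = {
    "Social Sciences": {"sessions": 5000, "page_views": 40000, "prints": 2500},
    "Humanities":      {"sessions": 4200, "page_views": 29000, "prints": 1800},
    "STM":             {"sessions": 3100, "page_views": 46500, "prints": 4000},
}

def intensity(stats):
    """Page views and prints per session -- a rough proxy for how
    intensely each session engages with the book."""
    return {
        "views_per_session": stats["page_views"] / stats["sessions"],
        "prints_per_session": stats["prints"] / stats["sessions"],
    }

for subject, stats in usage.items():
    m = intensity(stats)
    print(f"{subject}: {m['views_per_session']:.1f} views/session, "
          f"{m['prints_per_session']:.2f} prints/session")
```

With numbers like these, STM would show the fewest sessions but the highest per-session views and printing, matching the "used less, but more intensely" pattern.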

The presentation slides are available at http://www.slideshare.net/michaellevineclark.

One of the authors who spoke at the conference was David Baldacci. I credit Baldacci with rekindling my interest in reading for pleasure after I heard him speak back in 2002 or 2003, so of course I had to go hear him again. He spoke for only about 20 minutes, but made up for it with an unannounced signing of proofs of his forthcoming young-adult novel The Finisher.

Charleston Conference online

Thursday, November 14, 2013 9:49 am

I have never actually attended the Charleston Conference, but this year they broadcast a small number of sessions live over the Internet. I tuned in to watch two of those sessions.

In a pre-conference segment, Judy Ruttenberg from the Association of Research Libraries spoke about legal issues in providing online resource access for print-disabled patrons. I learned that Section 508 of the Rehabilitation Act, which requires accessible electronic technology, applies to institutions receiving certain federal funding (and Ruttenberg made it sound like it applies to virtually all universities in the U.S.), but it does not apply to the private sector. So while it is illegal for a school or university to require the use of an inaccessible device, it is not illegal for Amazon or B&N (for example) to produce an inaccessible e-reader. As a matter not just of legality but of providing good service, Ruttenberg encouraged compliance with standards, especially WCAG 2.0 (Web Content Accessibility Guidelines; I had to look it up). She also suggested that libraries could partner with campus offices for students with disabilities, and with professors, to advocate for technology and service standards and to help make sure content is accessible. Finally, Ruttenberg addressed the challenge of getting e-resource licenses in line with accessibility needs, especially given that content providers are not liable. As with the technology, model license language is a moving target, but she recommended pointing to standards (such as WCAG 2.0), as well as asking for the right to make the content usable. She closed by quoting someone (sorry, I didn't catch who) asking why we don't push for indemnification against third-party lawsuits for inaccessibility. In the Q&A, a discussion arose around whether an institution would be within its rights to make content accessible even if the license doesn't permit it; Kevin Smith (Duke's Scholarly Communications Officer), who was in the audience, asked which lawsuit you would rather defend: a content provider alleging you didn't have the right to do that, or a disabled student who couldn't access course material.
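As a toy illustration of what one small WCAG 2.0 check can look like in practice, here is a sketch that flags images lacking the text alternatives required by success criterion 1.1.1; a real accessibility audit covers far more than this:

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Flag <img> tags with no alt attribute -- one small piece of
    WCAG 2.0's 'text alternatives' requirement (criterion 1.1.1)."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "img" and "alt" not in attr_map:
            self.missing.append(attr_map.get("src", "(no src)"))

checker = AltTextChecker()
checker.feed('<p><img src="chart.png">'
             '<img src="logo.png" alt="Library logo"></p>')
print("Images missing alt text:", checker.missing)  # → ['chart.png']
```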

The other session I watched was a presentation of research on the effects of discovery systems on e-journal usage. The researchers (Michael Levine-Clark, U. of Denver; Jason Price, SCELC; John McDonald, U. of Southern California) looked at the usage of journals from six major publishers at 24 libraries, six for each of the four major discovery systems (Summon, Primo, EBSCO Discovery Service [EDS], and WorldCat Local [WCL]). The presentation went fast and I had a hard time keeping up, but the methodology seemed logical and the results interesting. Results of course varied, especially in how the discovery system affected different publishers' content, but overall there did appear to be an increase in journal usage, with Primo and Summon affecting usage more than EDS and WCL. The main purpose of the current study was to see if they could detect a difference, which they did. Their next step will be to try to determine what factors are causing the differences.
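A before/after usage comparison of the sort the researchers described might be computed like this; the download counts below are invented for illustration and are not the study's data:

```python
# Hypothetical full-text download totals before and after
# discovery-system implementation, grouped by system
# (illustrative numbers only).
downloads = {
    "Summon": {"before": 120000, "after": 150000},
    "Primo":  {"before":  90000, "after": 110000},
    "EDS":    {"before": 100000, "after": 108000},
    "WCL":    {"before":  80000, "after":  84000},
}

def pct_change(before, after):
    """Percent change in usage from the pre- to post-discovery period."""
    return 100.0 * (after - before) / before

for system, d in downloads.items():
    print(f"{system}: {pct_change(d['before'], d['after']):+.1f}% change")
```

Real studies also have to control for confounders (collection size, enrollment changes, publisher mix) before attributing any change to the discovery layer itself.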

Derrik at NCLA 2013

Monday, October 21, 2013 12:13 pm

Here’s my summary of last week’s North Carolina Library Association conference. Overall, I thought it was a great conference, and I was glad I attended.

E-books

Christopher Harris, editor for the American Libraries e-content blog, gave a very good update on the e-book industry, although it was mostly geared toward public libraries. Some of my favorite sound bites and key concepts:

  • Don’t stress out about change. “Stuff is constantly changing; let it flow.”
  • The last disruptive technology we saw was the iPod and MP3s. Experts (audiophiles) hate MP3s because of the lower sound quality, but for the average user, an iPod and earbuds sure beat walking around with a phonograph or boombox. Librarians need to avoid being the nay-saying experts.
  • If all we’re doing is providing e-books, we’re in trouble because it can be outsourced at a much lower cost. Libraries can be filters and help users avoid “analysis paralysis,” like shopping at Trader Joe’s, where much of the selection has already been done for you.

Harris encouraged us to be willing to experiment with new models of purchase and access, and to think with our "math brains" instead of our "emotional" brains. For example, we all got up in arms when HarperCollins announced a 26-loan maximum, but Harris pointed out that for a $20 book that works out to about $0.77 per loan. "How much per loan does a print book cost?" (in labor and building/shelving costs), he asked. Harris reviewed the current license models used by some of the "Big 6" publishers. He pointed out that Macmillan does not sell to library consortia, and said (almost angrily), "That's where we should plant our flag!" because resource sharing is much more important than a 26- or 52-loan limit.
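Harris's "math brain" arithmetic can be sketched as follows; the $20 price and 26-loan cap come from his example:

```python
def cost_per_loan(price, max_loans):
    """Cost per circulation under a capped-loan e-book license."""
    return price / max_loans

# HarperCollins example: a $20 e-book capped at 26 loans
print(f"${cost_per_loan(20.00, 26):.2f} per loan")  # → $0.77 per loan
```

The same function invites the comparison Harris posed: plug in an estimate of a print book's lifetime labor and shelving cost, and its expected circulations, to see which side of the ledger looks better.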

Harris’ parting advice:

The next day, I attended a panel discussion and found out that NC LIVE is already working on a new model for shared e-books. I confess I didn’t understand all this very well, and it’s all still in Beta, but I’ll try to keep this general in hopes that I won’t go too far off track. NC LIVE has been working with Wake County Public Library to develop a shared platform for library e-books. Note that it will be the platform technology that is shared, not necessarily the e-books. It will be up to individual libraries to implement the platform (developed by NC LIVE) on their own websites. The vision is that each member library will be able to purchase e-books and place them on the NC LIVE platform, either shareable or private to the purchasing library. NC LIVE has started negotiating with several NC publishers to make their e-books available on the platform. It wasn’t clear to me whether those are e-books that NC LIVE will purchase, or if they’ll simply be available for member libraries to purchase. Target launch date for the platform is January 2014. There will be some content from one publisher (John Blair, based in Winston-Salem) available at that time, and NC LIVE hopes to have additional content from other publishers available by July. For now, the only access model for these e-books will be single concurrent user.


Digital/Digitized Library Collections

I went to a couple of presentations on digital collections available via the State Library. See http://digital.ncdcr.gov/. There’s a lot of good stuff available for NC historical research, such as family bibles, wills, property records, cemetery photographs, a Civil War Roster index, an index of the Raleigh News & Observer covering 1926-1992, and an archive of all NC government websites. I also went to a session that gave an update on NC ECHO [http://ncecho.org/], which searches across the digital collections of various libraries, museums, and archives in North Carolina (including Digital Forsyth, for example). NC ECHO uses the OAI-PMH standard to gather metadata from the various collections, then builds a searchable index of all these collections.
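The OAI-PMH harvesting step NC ECHO uses can be sketched roughly as follows. The endpoint URL is hypothetical, and a production harvester (unlike this sketch) would also follow resumptionToken paging to walk through large collections:

```python
import urllib.request
import xml.etree.ElementTree as ET

# Fully qualified Dublin Core title element, as used in oai_dc records
DC_TITLE = "{http://purl.org/dc/elements/1.1/}title"

def parse_titles(xml_bytes):
    """Pull every dc:title out of an OAI-PMH ListRecords response."""
    root = ET.fromstring(xml_bytes)
    return [el.text for el in root.iter(DC_TITLE)]

def harvest_titles(base_url):
    """Fetch one page of Dublin Core records from an OAI-PMH endpoint.
    (A real harvester would also handle resumptionToken paging.)"""
    url = base_url + "?verb=ListRecords&metadataPrefix=oai_dc"
    with urllib.request.urlopen(url) as resp:
        return parse_titles(resp.read())

# The base URL below is hypothetical -- substitute a real repository's
# OAI endpoint before running.
# print(harvest_titles("https://example.org/oai"))
```

After harvesting, an aggregator like NC ECHO indexes the collected metadata so one search runs across every contributing library, museum, and archive.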


Electronic Resource Management Systems

I formed and participated in a panel discussion about E-Resource Management Systems (ERMS). Our panel included librarians using an open-source ERMS (me, talking about CORAL), an ILS vendor's ERMS, and a content vendor's ERMS. It was fun (in an e-resource-managing-geeky sort of way) to see how the strengths of the systems varied according to provider. The presentation was well attended, and I received some positive feedback afterward.


Keynotes

I won’t try to summarize the keynote addresses, but here are a couple of my favorite highlights:

In speaking of our responsibility to present readers with all sides of a controversial topic, ALA President Barbara Stripling pointed out that in a print environment, libraries could place all the relevant resources together on the shelf, so readers have to “at least trip over” other points of view on their way to the books they’re looking for. But in an online environment, it is too easy to limit yourself to resources that you already agree with, so libraries have a responsibility to teach users to look for those other points of view.

I’m sure others will offer a better description of ACRL President Trevor Dawes’ address, but the point that stood out the most to me was his explanation of why Financial Literacy is one of his main areas of focus. Dawes said that student loan debt has now surpassed credit card debt in the United States. (Actually, that happened in 2010, but Yikes!)


Vendors

If you’ve read my past conference summaries, you won’t be surprised that I had some productive conversations with vendors in the exhibit hall. I talked with the Gale rep about the Cengage bankruptcy, and was again assured that it’s “business as usual” for Gale; she compared the bankruptcy to refinancing a mortgage (yeah, I know it’s more complicated than that, but I still thought it was a good analogy). The Reference USA rep gave me a heads-up on a new data visualization feature, and told me to contact our sales rep about it (I think it’s available at no additional cost; I'm waiting to hear back). I got an update on the new Alexander Street Press platform for streaming music & video, which is scheduled to be released later this week (but they’ve already had to push it back once). And I had another license-unjamming conversation with a publisher (as happened at ALA earlier this year). I had gone months without hearing a reply, then talked to the sales rep at the conference on Thursday, and heard back from the license contact within a day!


A belated ALA report

Thursday, August 22, 2013 4:52 pm

Somehow, writing a blog post about my ALA 2013 experience seems to have slipped through the cracks. Could have something to do with the 5 licenses I currently have up in the air, I suppose. So here’s my report, to the best of my (and my notes’) memory. I thought I had some pictures to add, but alas, I can’t find them now, so this may be a boring post.

E-book Data Evaluation

Two presenters, one from a public library system and one from a university library, talked about how they use e-book usage data. The public librarian said that it is difficult, and perhaps invalid, to compare usage of e-books to usage of print books. She pointed out such differences as different loan periods; wait time for holds (much longer for print); overdues (none for e); different user base; e-book collection has more current, frontlist titles, and very few children’s e-books. The university librarian spoke a fair bit about demand-driven acquisition (DDA), but it didn’t sound like his library had any better grasp of things than we do. The bottom lines: be skeptical of the data, and so far no clear patterns are emerging.

Electronic Resource Management Interest Group

Two presenters from university libraries spoke about electronic resource management in the context of multiple user access models. That is, our users are presented with multiple means of accessing data; in our context, we’re talking VuFind, Summon, LibGuides, individual databases, library website, etc. The first speaker pointed out how difficult it is for the average user to navigate between those multiple avenues: If a user links from Summon into VuFind, how easy is it to get back to where they were in Summon? Is it confusing when they suddenly find themselves in a different UI? She challenged us to think about ways to make this environment more user-friendly. The second presenter pointed out that in many cases, we are managing similar data in multiple places. He also encouraged everyone to study information architecture to better understand the searchers’ perspective. Quotable quotes from this session: “We don’t call it cataloging any more; now it’s ‘discovery enhancement’,” and “There is not enough time or resources in the universe to be fully on top of e-resources maintenance.”

BIBFRAME Update

Steve gave a good report of this session in his ALA post [http://cloud.lib.wfu.edu/blog/pd/2013/07/12/steve-at-ala-annual-2013-and-rda-training-at-winthrop-university/]. As a reminder, BIBFRAME (short for “bibliographic framework”) is being developed as a way of encoding bibliographic data (simplified version: a replacement for MARC). As Steve said, BIBFRAME is still a long way from taking any recognizable form, but Eric Miller, co-founder and president of Zepheira (the company working on BIBFRAME), described what I would call the theory behind BIBFRAME. According to Miller, the goal is to “make interconnectedness commonplace.” He compared it to Legos: you can buy them in different sets, but all are interoperable, allowing small bits of data to be joined in interesting ways. They don’t have to tell you in advance what the building blocks will form; just give communities the blocks and allow them to recombine them in ways meaningful to them. Beyond that, most of this session got very technical and was pretty much over my head.

Meeting with publishers & vendors

As usual, a very valuable aspect of ALA is the opportunity to meet with various vendors and publishers and either learn more about what they’re planning, tell them what we want them to plan, or both.

At the Project MUSE User Group breakfast, I learned that Project MUSE will have Highwire manage their computer operations (or something like that) beginning sometime next year. They assured us that they don’t plan to change the user interface; it will stay the same, but with Highwire “under the hood.” The MUSE folks said they are also looking at altmetrics and trying to find ways to measure the “impact” of humanities content. Project MUSE has been offering e-books for a year or two now (from 83 university presses & rising). Their e-books are now available for single-title purchase via YBP. In the Q&A, I asked if they are planning to stick with PDF format, or if they’re thinking of branching out into EPUB or other e-book formats. Answer: PDF for now, but EPUB and “other formats” are “on the radar” with the transition to Highwire. (My translation: don’t hold your breath.)

I also attended ProQuest’s sponsored breakfast, where speaker Megan Oakleaf gave essentially the same talk she gave at NASIG earlier that month [http://cloud.lib.wfu.edu/blog/pd/2013/06/26/nasig-2013/], on using data to demonstrate the library’s value, based on things the larger institution values. I did like one example she gave, suggesting we look at course readings listed in Sakai/course syllabi and try to determine how much those readings would cost the students if they had to purchase each article individually. We need to explicitly connect the dots. Following Dr. Oakleaf, a Summon representative talked about the upcoming Summon 2.0. Then Kari Paulson, formerly President of EBL and now head of ProQuest’s combined EBL/ebrary division, talked about her vision for their new e-book venture. I mostly like what she said—striving to give customers more options (i.e., various acquisition models), integration with other ProQuest products, basically taking the best of both EBL and ebrary—but it’s difficult to tell at this point how much of that is marketing-speak. But I at least like the overall vision. In a lighter moment, as Paulson began her portion, she quipped, “I no longer have sleepless nights worrying about what ebrary is up to.”
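Oakleaf's cost-of-readings exercise might be sketched like this; the titles, prices, and enrollment are invented purely for illustration:

```python
# Hypothetical list-price lookups for items on a course reading list
# (illustrative prices; real per-article prices vary by publisher).
reading_list = {
    "Journal article A": 31.50,
    "Journal article B": 24.00,
    "Review article C":  39.95,
}

def out_of_pocket(prices, students_enrolled):
    """What the readings would cost if every enrolled student had to
    buy each item individually instead of using library access."""
    per_student = sum(prices.values())
    return per_student, per_student * students_enrolled

per_student, course_total = out_of_pocket(reading_list, students_enrolled=35)
print(f"${per_student:.2f} per student, ${course_total:.2f} for the course")
```

Summed across every course using library-licensed readings, figures like these "connect the dots" between the library budget and money students did not have to spend.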

In other vendor interactions, I had a good discussion over lunch with Gale sales rep Matt Hancox, who picked my brain about DDA (and how Gale might enter that arena), and who also gave me a heads-up about their parent company Cengage filing for bankruptcy (they’re calling it “debt restructuring,” but it’s business as usual for Gale customers). I also got a chance to meet a couple of vendor e-mail contacts face-to-face. My notes say something about JSTOR’s e-books and DDA, but I don’t remember anything beyond that. And finally (not just last in my report but also last in my conference), I dropped by the Palgrave booth to complain about our stalled license negotiation. We had sent in our request for some changes to the license back in December, and all we had heard back since then was that it was in their lawyer’s queue. I mentioned this to the person standing in the Palgrave booth at ALA, and said that it gives the impression that they don’t care about our business. Well, it turns out that the person I was speaking to was in their Marketing department, and she took me very seriously. She said she had a meeting with their Legal department in a couple of weeks and would bring up our conversation. A good way to end the conference, eh? About 3 weeks later I got an e-mail from our Palgrave contact saying that our license was being reviewed by Legal. Nice!


NASIG 2013

Wednesday, June 26, 2013 2:24 pm

This year’s conference of the North American Serials Interest Group (NASIG) was held June 6-9 in Buffalo, NY. After a bumpy plane ride, Chris and I arrived safely in Buffalo. (NASIG VP/Pres-elect Steve Kelley got there a day ahead of us.)

The opening session was presented by Dr. Bryan Alexander, from the National Institute for Technology in Liberal Education (NITLE). His address was primarily an overview of technology trends, especially the use of mobile devices. The audience seemed engaged, but the audio was terrible where I sat, and then I lost most of my notes on his talk due, ironically, to a technology error, so Steve or Chris will have to fill you in.

I thought Saturday’s plenary address was a good combination of “Libraries are important” and “Libraries must change” [nod to Carol]. The speaker, Megan Oakleaf of Syracuse University, focused on how to communicate the importance of libraries to stakeholders. She said there is a lot of information on the value of libraries in general, but not much on their value to the sponsoring organization. The usual value metrics, such as user satisfaction, service quality, collection counts, and usage (“a lot of people downloaded a lot of things”), don’t communicate a compelling message. Oakleaf said we should identify what the institution or community values, then tie the message of library outcomes to those values. How and what does the library contribute toward student recruitment? Student success? Faculty recruitment, tenure, and promotion? Research funding? The local economy? Do students who use more library resources ultimately get better grades? She encouraged us to think about what data we need to collect in order to answer such questions.

Sunday’s plenary session featured Siva Vaidhyanathan, author of The Googlization of Everything (and Why We Should Worry). He was scheduled to talk about “the Challenge of Big Data,” and it so happened that this session came just a few days after news of the NSA’s Prism surveillance program broke. I found his presentation fascinating. He pointed out that Google (along with other big-data endeavors) is in the prediction business, using massive amounts of data on past user behavior to read our minds. He wondered aloud about the NSA’s ability to read and misread our information, and about how statistical correlation could kill the scientific method. This new system means no second chances; the stupid mistakes of our youth never go away. Yet most of us continue to carry GPS devices (aka cell phones) with us wherever we go, and we continue to use Google and grocery store “loyalty” cards. My favorite take-away was Vaidhyanathan’s explanation of privacy. He said that privacy is not about hiding all information, but is the ability to influence your reputation within certain contexts. There are some things you want your brother to know but not your sister, or your clergy but not your coach. When the defaults lock the flow of information open, we lose that control and have no privacy. He said he sees a new digital divide, between those who are savvy enough to shape their digital profile and those who are victims of the system, who don’t understand, for instance, the connection between their poor credit score and their difficulty finding a job. He urged those of us on the savvier side to fight for those who don’t know how to protect their rights.

And that’s the short version of my notes on that session!

EBSCO Usage Consolidation
You probably didn’t know this, but ZSR recently subscribed to EBSCO Usage Consolidation (UC), an online service for aggregating journal usage statistics. So I went to a session in which librarians from two universities described their experience with the product. Their review was mixed. Specific problems noted included (1) a lot of up-front effort to reconcile title differences; (2) a difficult user interface; and (3) a default cost-per-use display that includes usage from aggregator databases without factoring in their cost. They liked having cost-per-use data and the number of available reports, but librarians at one university found the product too cumbersome for title-by-title review.
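Problem (3) is worth spelling out: if downloads from an aggregator database (which carries no per-title subscription cost) are pooled with downloads of the subscribed title, cost-per-use looks artificially low. A minimal sketch, with invented figures:

```python
# Sketch of the cost-per-use pitfall: pooling free aggregator usage
# with subscribed-title usage deflates the metric. Figures are invented.

subscription_cost = 1200.00  # hypothetical annual cost of the journal
subscribed_uses = 150        # downloads on the publisher platform
aggregator_uses = 450        # downloads via an aggregator database

naive_cpu = subscription_cost / (subscribed_uses + aggregator_uses)
adjusted_cpu = subscription_cost / subscribed_uses

print(f"Naive cost-per-use:    ${naive_cpu:.2f}")     # $2.00
print(f"Adjusted cost-per-use: ${adjusted_cpu:.2f}")  # $8.00
```

A title that looks like a bargain at $2.00 per use is really costing $8.00 per use of the content you actually pay for, which is why the presenters flagged the default display as misleading.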

Designing User-Centered Services for Virtual Users
The main take-away from this session was how nice it is to work at ZSR. The presenters made a big deal out of public services and technical services working together, like it was something novel to get public services’ “endorsement” for customizing the EBSCOhost interface. Steve and I talked later about how nice it is that everybody here is focused on what is best for our users.

Aggregator Databases: Cornerstone or Annex?
The presenters in this session described their efforts to assess the value of full-text content in aggregator databases. Rather than looking just at title counts, they compared full-text aggregator titles against ISI’s top-ranked journals (i.e., those with the highest impact factors) in various subjects. For example, they determined that Academic Search Premier contained 11 of ISI’s top 25 journals in Education. Not surprisingly, they said they ended up with more questions than answers, such as the value of the aggregators’ indexing for article discovery. It was an interesting (if tedious) methodology, but it ultimately doesn’t apply much to us, given that NC LIVE provides much of our access to aggregator databases.
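The core of the presenters’ method is just a set intersection between a ranked journal list and an aggregator’s full-text title list. A minimal sketch, with invented title lists (the session’s actual Education example found 11 of ISI’s top 25 journals in Academic Search Premier):

```python
# Sketch of the overlap methodology: how many of a ranked list's
# journals appear in an aggregator's full-text holdings?
# Title lists below are placeholders, not real journal names.

def coverage(top_journals, aggregator_titles):
    """Return the matched titles and the fraction of the ranked list covered."""
    matched = set(top_journals) & set(aggregator_titles)
    return matched, len(matched) / len(top_journals)

top_25 = {f"Journal {i}" for i in range(1, 26)}                 # ranked list
aggregator = {f"Journal {i}" for i in range(1, 12)} | {"Other Title"}

matched, fraction = coverage(top_25, aggregator)
print(f"{len(matched)} of {len(top_25)} top journals covered ({fraction:.0%})")
```

The tedium the presenters described comes from the step this sketch skips: reconciling title variants and ISSNs so the string matching is actually reliable.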

FRBR, Linked Data, & New Possibilities for Serials Cataloging
This was a very good presentation about the potential of linked data to bring together catalog records for related resources. The presenters described a scenario of a patron looking for the English translation of an Einstein paper. The original German work was published as a journal article in 1903. After much digging, they discovered the English translation within a 1989 monograph, but there was nothing in the respective catalog records directly linking the two manifestations together. The principles of FRBR and linked data can overcome MARC’s weakness in showing such relationships between items. Journals, articles, authors, and even subject headings are all described as individual entities, and the coding describes their relationships to each other. The presenters talked about BIBFRAME, the Library of Congress’ “Bibliographic Framework Initiative” that is working toward replacing MARC. I admit I didn’t understand all of this very well, and I’ll definitely be looking to learn more about it.
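The Einstein scenario can be sketched in miniature. The idea behind linked data is that each resource is an entity with an identifier, and relationships between entities are explicit triples (subject, predicate, object). The identifiers and predicate names below are invented for illustration; they are not actual BIBFRAME vocabulary.

```python
# Minimal sketch of linked-data triples expressing the relationship
# that the MARC records left implicit: a 1989 monograph contains the
# English translation of a 1903 German journal article.
# Identifiers and predicates are hypothetical, not real BIBFRAME terms.

triples = [
    ("article:de-1903", "hasTitle", "Original German article (1903)"),
    ("article:en-1989", "hasTitle", "English translation"),
    ("book:1989", "hasTitle", "1989 monograph"),
    ("article:en-1989", "isTranslationOf", "article:de-1903"),
    ("book:1989", "contains", "article:en-1989"),
]

def related(entity, triples):
    """List every triple in which the entity appears as subject or object."""
    return [t for t in triples if entity in (t[0], t[2])]

# Starting from the German original, a system can traverse the graph
# to the translation, and from there to the monograph containing it.
for s, p, o in related("article:de-1903", triples):
    print(s, p, o)
```

With separate MARC records, the patron had to do that traversal by hand; with explicit relationship statements, the catalog can do it automatically.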

ONIX-PL
Finally, I had the opportunity at NASIG to pick Selden Lamoureux’s brain to learn more about ONIX-PL. What’s that? I’m glad you asked. ONIX-PL is a NISO standard, kind of like MARC for e-resource licenses. The standard was released in 2009, but uptake has been slow (practically nil in the US). Learning to encode licenses in ONIX-PL isn’t easy, so there hasn’t been much incentive for publishers to start using the standard. NISO recently received a Mellon grant to encode a collection of license templates, to give publishers and libraries a starting point, and NISO has contracted with Selden Lamoureux to do the encoding. So it was a great opportunity for me to meet up with her and learn more about ONIX-PL and the encoding project (which I plan to write more about in an upcoming article).

