Professional Development

Charleston 2014 According to Carol: Kanopy and E-Books

Thursday, November 13, 2014 4:56 pm

Presenters from Illinois State University spoke about their experience with Kanopy. Two key observations about its impact:

  • After starting DDA, they saw an increased number of requests to license non-DDA Kanopy titles – suggesting that some percentage of faculty users treat Kanopy as a standalone database.
  • ISU had previously bought streaming rights to some individual titles, which they hosted locally. When these titles were duplicated in the Kanopy DDA set, the Kanopy version generally had more use. This implies that the Kanopy versions are either more useful or more easily discoverable.

At Wake Forest, two Kanopy DDA films have already been used enough to trigger a purchase, and this is before loading the MARC records or doing any promotion beyond a single ZSReads article.

Two librarians from Wesleyan University conducted both qualitative (ethnographic and usability) and quantitative (survey) studies of student attitudes and behaviors regarding e-books. Their observations:

  • Having personal control over a copy was most important, e.g. being able to print it or make a PDF.
  • E-books work best for discovery. Print is better for deep reading.
  • Students read just what they need to write the paper. This holds true for print books and e-books.
  • Students are not interested in pirating per se, but they prioritize easy over legitimate.
  • Indexes to e-books are still exact reproductions of the paper format. The index terms are not hyperlinked; therefore, the index does not get used.

I saw two presentations on e-books featuring the always interesting Michael Levine-Clark from Denver. In the first presentation, he was on a panel that included reps from Wiley, OUP and YBP. They focused on the rapidly increasing costs of short-term loans, i.e. the one-day rental fees paid for the DDA books. Rebecca Seger from OUP presented on the economics of publishing a book. In a nutshell, OUP could predict the revenue streams for print but not for DDA. However, Levine-Clark pointed out that in the aggregate Denver spends the same amount on book content regardless of the existence of DDA. It’s just spread around differently. (At WFU, ZSR is actually spending more on monographs since the advent of DDA.) Any total reduction in monographs spending (at Denver or nationally) is due to journal inflation, which both Oxford and Wiley engage in. Since Denver is facing a flat budget, if current trends continue, their monograph spending (print or e) will be $0 by 2020. The panel did not offer any concrete suggestions on resolving the crisis beyond general statements about publishers and librarians working together.

The second presentation explored e-book usage in the Humanities. Levine-Clark had a national data set, and he compared usage in Humanities vs. Social Sciences vs. STEM. Then he compared the disciplines within Humanities to each other. I quickly realized that – based on usage patterns – Linguistics & Communication act more like the Social Sciences than Humanities. One interesting thing that he noted: The number of use sessions per 100 books available is lower in the Humanities than in Social Sciences or STEM. He did not speculate on a reason, but personally, I wonder if this reflects an oversupply of Humanities research compared to the demand for consuming Humanities research – especially since Humanities faculty are often specifically evaluated by whether they have published a book.

Imagine for a moment that ZSR cancelled its DDA plan: What might take its place? The two main alternative purchasing models are subscriptions (e.g. ebrary) and the Big Deal. I attended two sessions that probed different aspects of the Big Deal model. For e-books, Big Deal purchases are usually brokered directly by publishers (instead of by aggregators like EBL and ebrary). They generally do not have any DRM, and the books can be used by unlimited users. After UNC-Charlotte serendipitously discovered that they had 30 course adoption books within their Big Deal packages, they began deliberately promoting this idea with the faculty. They ultimately paid $14K for 117 additional titles. (They purchased some books one-by-one in addition to the Big Deals.) The bookstore was a good partner. A faculty member who used this program for his Film Studies course talked about how this program positively impacted his teaching.

Examples:

  • He did not feel morally obligated to use every single chapter in the textbook, since the students were not required to pay out-of-pocket for it.
  • A corollary: he felt free to use single chapters from various books.
  • He likes a tech-free classroom, yet he still found ways to use the text within the class session.

Sidebar: This generally works for “course adoption” books. Rebecca Seger had helpfully explained the distinction between a “course adoption” book and a textbook. A textbook is something like Intro to Statistics, 18th edition. A “course adoption” book is something like The Kirghiz and Wakhi of Afghanistan: adaptation to closed frontiers and war, which was not expressly designed as a textbook, but was indeed adopted for course use by a faculty member at WFU. Publishers do not know in advance which general monographs will become course adoption books. Generally, publishers do not sell multi-user textbooks to libraries, since that harms their lucrative (extortionate?) textbook revenue stream.

The last presentation I attended painted a less rosy picture of the Big Deal. Miami University thoroughly analyzed 2.5 years of usage statistics for Big Deal e-books purchased in 2012. Only 19% of titles had a use. Just three books (by their titles, clearly textbooks) accounted for 17% of downloads. Miami’s FTE is roughly 15K, or twice that of WFU. Therefore, I speculate that WFU would see only 10% usage if ZSR were to purchase this kind of package. Every time I have investigated the pricing of one of these packages, I have noted that the discount for buying in bulk does not even come close to accounting for the nearly inevitable low usage rates. While packages differ as to subject coverage, the ones that cover everything published by Publisher X in a given year are the worst deal, as there is no price break for the large swaths of content (e.g. agriculture) that would see virtually no use at a school like WFU.

While the Big Deal for journals is frequently (and sometimes with justice) maligned among librarians, the extra you pay for the journals without any previous subscription (i.e. likely low-use journals) rarely exceeds 10% of prior spend. I would not advocate for pursuing the Big Deal model for monographs unless publishers begin offering much steeper discounts.

Carol at the International Medieval Congress

Sunday, May 11, 2014 7:47 pm

Thanks to a fortunate alignment of events, I got to go on an all-expenses-paid (by me) trip to the International Medieval Congress in Kalamazoo, MI at Western Michigan University.

Most sessions were 1½ hours long and included three presentations around a common theme. I attended the following sessions (WFU faculty who presented are listed in parentheses):

  • In Honor of Dolores Warwick Frese I: Medieval Mothers and the Mother Tongue
  • New Research in Old High German Literature and Linguistics (Tina Boyer, German & Russian)
  • Gothic Language and Linguistics
  • In a Word, Philology: Etymology, Lexicography, Semantics and More in Germanic (Heiko Wiggers, German & Russian)
  • Crusades
  • Hanse Realm: Trade, Culture, and Exchange
  • Late Antiquity II: Late Antique Italy
  • Rethinking Reform II: Councils as Context, Catalyst, and Communicator of Reform
  • Philosophical Texts and Traditions (Michael Sloan, Classical Languages)

Since several of the papers were about the history of translating certain texts, I managed to touch on all five of my liaison areas in a single conference. Gale Sigal (English, co-chair of the WFU Medieval Studies program) invited me to attend dinner with her and four WFU students who were there. Lunches were in the school cafeteria. Mealtime conversations with WFU faculty led to five discrete requests from three different WFU faculty for me to buy a book, check up on a standing order possibility, etc. At one lunch, Michael and Tina discussed how they’ve used my instruction services, and they each pledged to use them more often. I almost never get PRS’s unless I’ve visited the class, so I mentioned the possibility of a 10-minute class drop-in, which is mainly a commercial for the PRS service and the relevant Research Guide. (Fortunately I give Classical Languages and German fairly even attention – if I didn’t I would’ve been busted!)

A few observations on this type of conference in comparison to librarian conferences:

The exhibit hall. My stereotype…

Ask Your Library to Subscribe Today!

There was only one booth like this. This booth was more typical…

One of many booths selling individual books to attendees

There was basically no vendor swag. The vendors were more focused on selling a single book today as opposed to selling $50K worth of product six months from now. This conference also featured sellers of “medieval sundries.”

Drinking Horns for Sale

They didn’t say, but I’m assuming these drinking horns are not dishwasher safe.

During the presentations, PowerPoints were relatively uncommon, and Tina was the only presenter I saw using Prezi. Much more typical was a paper handout. Usually the handout had a few paragraphs of medieval text that the presenter was going to analyze. In one case, the speaker handed out his entire paper! It was also not uncommon for a presenter to read a paper verbatim – something I almost never see at a library conference.

The sheer number of sessions was overwhelming. There were 565 sessions total, including as many as 54 in the same time slot!

The attendees seemed to be much more international (or at least European) than the crowd at library conferences. I either saw a presentation by or ate a meal with someone from South Africa, Germany, Italy, Finland, Hungary, Switzerland, Canada and the UK. Since church history is a significant aspect of medieval culture, there were several monks, nuns and priests in the crowd and among the presenters.

One similarity with the Charleston Conference that I attend annually: This conference is held in the same place every year, and a significant number of attendees go year after year. This situation leads to social groups forming and re-forming each year, as well as certain annual rituals like visiting the same restaurants.

Come talk to me if you’d like to hear more details about all the presentations I saw. Be forewarned: I might go on and on about two Old Norse words for “word” if you do.

2014 ER&L virtual conference

Thursday, April 10, 2014 5:03 pm

For the second year running, I “attended” the Electronic Resources & Libraries conference by watching streamed sessions. I still plan on watching sessions as time permits throughout the year, since the group purchase that Derrik made runs until the next conference is held in 2015. (ZSR folks: Ask Derrik if you need the password.)

One trend that popped up in multiple presentations was Evidence-Based Acquisition (EBA). Like its close relative Demand-Driven (or Patron-Driven) Acquisition, it has two names and two initialisms. So you may also hear of Usage-Driven Acquisition (UDA). With EBA, you give a provider an up-front deposit, say, $5,000. Then the provider turns on their entire catalog of e-books or streaming films. After a set time, say, a year, you get a usage report and can choose $5,000-worth of products for permanent ownership. There are some pros and cons to this approach, especially vis-à-vis DDA. (What if you don’t get $5,000 worth of use? What if all the use is long tail with no “short head”?) However, since providers who use this model generally do not participate in DDA models, EBA may be the most cost-effective way to buy certain types of material.
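Purely as a thought experiment, here is a minimal sketch of that year-end step, assuming a simple "spend the deposit on the most-used titles first" rule. The titles, prices and the greedy rule are all my own invention for illustration; a provider's actual selection happens in their admin interface, not in code.

```python
# Toy model of an EBA year-end selection: spend the deposit on the
# most-used titles first. All data and the greedy rule are invented
# for illustration.

DEPOSIT = 5000.00

usage_report = [
    # (title, list price, sessions during the EBA year)
    ("Title A", 120.00, 42),
    ("Title B", 95.00, 17),
    ("Title C", 300.00, 9),
    ("Title D", 60.00, 0),  # never used, so never selected
]

def pick_purchases(report, deposit):
    """Greedily keep the most-used titles until the deposit runs out."""
    selections, remaining = [], deposit
    for title, price, sessions in sorted(report, key=lambda r: r[2], reverse=True):
        if sessions > 0 and price <= remaining:
            selections.append(title)
            remaining -= price
    return selections, remaining

chosen, leftover = pick_purchases(usage_report, DEPOSIT)
print(chosen, f"(unspent: ${leftover:,.2f})")
```

The "long tail" worry above shows up even in this toy version: if use is spread thinly across hundreds of titles, no $5,000 worth of selections captures very much of the total use.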

Another hot topic was the end-user experience with e-books and certain multimedia databases. Basically, it’s bad. Typical problems with e-books include not being able to print, not being able to use the book on certain devices, and not being able to store the book for later consultation. Multimedia has a different but related set of concerns. (I’m reminded of this comic and this infographic. They both claim that poor UX drives customers to piracy.) The presenters didn’t go as far as claiming that library resources drive folks to piracy, but they did claim that students will either download free alternatives or, if they are among the “haves,” buy individual copies instead, which could magnify the effects of economic disparities among students. The presenters insisted that libraries should put their collective foot down and refuse to buy user-hostile resources (even if the information contained is high quality). They called out one well-known database as particularly awful. A quick check of their own library’s website established that they still subscribe to the bad product, so the force of their argument was somewhat undermined.

I have hope, however, because I can remember a time in the 90s when e-journals and e-newspapers were just as bad as e-books are today. Printing from JSTOR used to be a nightmare, and you had to use certain specific computers if you wanted to use ProQuest. Then you had to use a different computer entirely for LexisNexis. These days, e-journals generally just work. Maybe e-books and multimedia sites will get there someday if we keep leaning on the vendors and if we at least occasionally refuse to buy products that are the worst UX offenders.

Two Thoughts from the NC Serials Conference

Monday, April 7, 2014 1:59 pm

I also attended the North Carolina Serials Conference last month. Since several other ZSR bloggers have already reported, I will focus on two ideas put forward during a late-morning plenary session, which featured David Crotty again.

Crotty remarked that the paper announcing the cure is not as important as the actual cure. We might make the paper available via Open Access while the cure itself (say, a drug) might be protected by patent law.

Crotty also asserted that, contrary to popular belief, Humanities often runs at a profit while Physical Science runs a deficit within a university budget. He claimed that’s because a lot of tuition money is paid by Humanities majors, which subsidizes expensive lab space in Physical Science. (I’m carefully noting that he didn’t say Life Sciences, which probably attracts the most grant money of all and is also a popular undergraduate major.) He cited a recent NPR story about Duke that focused on where all the money goes. I listened to that story today, but didn’t hear the same interdepartmental subsidy message that Crotty asserted. So, I don’t know if he cited some other evidence for his claim (that I didn’t write down) or what. Nevertheless, I would be very interested in whether this is true at Wake Forest or not.

I have often thought about this issue on a smaller scale when we allocate the collections budget. Even if you just look within a broad discipline group like Humanities, it appears that larger, more popular majors subsidize smaller ones. I have two defenses to offer. The first is that Demand Driven Acquisition serves as a correction to this tendency. The second is that a certain amount of inter-departmental subsidizing is necessary. Students are attracted to Wake Forest because they like the idea that they have over 40 choices for their major. Once they actually get here, over half of the students cluster in just a few majors. However, many students would not choose WFU at all if we only offered, say, ten majors. Crotty’s broader point, and the point of the NPR story, is to ask whether it’s a good idea for student tuition dollars to go towards research, especially when the tuition comes in the form of a loan that must be repaid with interest.

2013 Charleston according to Carol

Wednesday, November 20, 2013 10:12 am

Here are the highlights of the most important sessions I attended at Charleston:

Derrik has already covered the first session on discovery services. I won’t repeat what he said, except to link to the slides. I’ll also point out that we were one of the 149 libraries that gave approval to be studied (slide 10), but I don’t know if we were ultimately selected. In a related presentation on Friday, Bruce Heterick from JSTOR discussed efforts in getting their content to show appropriately in discovery services. JSTOR found that usage plummeted after certain schools implemented certain discovery layers. (My opinion: Students will frequently use JSTOR on name recognition alone – even when it’s not the optimal source for their topic. If the discovery service delivers more appropriate up-to-date content, so much the better.) Heterick said that many discovery services depend heavily on subject metadata for relevancy ranking. JSTOR does not include that metadata, and it would be expensive to produce. (Just a thought – many JSTOR articles are indexed with subject metadata in A&I places like MLA, which are sometimes included in the discovery service as well. How can that be harvested appropriately?)

Librarians from Ferris State reported on how they processed titles that they committed to retain within their Michigan consortium. They used a 912 field in the MARC record to indicate reasons for retention. Missing books and those in poor condition took extra time to process since they needed to find another consortium member who would take responsibility for keeping the title.

Kristin Calvert from Western Carolina reported on a project to move all their usage stats to EBSCO Usage Consolidation (hence: EUC). Before implementing this project, it took them four full working days each year to collect e-journal stats. I know Derrik would identify with some of the frustrations that Calvert expressed. After the decision to use EUC, it took…

  • 2-3 weeks to set up (I’m not sure if non-stop work is implied here.)
  • 8 hours for initial cleanup
  • 4-6 hours for quarterly loads (could do this annually to save time)
  • <1 hour/month for cleanup

The product includes an “Exceptions” list of journals that had some kind of mismatch in the system. WCU staff had to reconcile the exceptions, but once they did, EUC remembered the fix so the same exception wouldn’t pop up again. The screenshot that Calvert showed had zero exceptions. Calvert concluded that she found this project worthwhile given the efficiencies gained at the end.

On Saturday, two librarians from Bucknell discussed how they dropped their approval plan and went with print DDA for everything. They use WorldCat/WorldShare for their catalog and discovery layer, so they could accomplish this without any loading (or deactivation) of records in their system. Patrons click on a ‘Get It’ button (powered by GIST), and a librarian decides whether to fulfill the request by purchase or by ILL. In the end, they ordered 1/3 fewer titles, spent 50% less, and ILL decreased. Bucknell took this path because their approval books circulate at a low rate. They also weed aggressively (12K new books/year and 6K deletions/year), so their collection was a revolving door. They pointed out that their library focuses on undergraduate curriculum, not research, so WFU may not want to pursue this idea. One point that resonates with me though: they reminded us that ‘efficient’ does not necessarily mean ‘effective.’ Approval plan ordering is the most efficient way to get books, and e-book DDA is even more efficient at delivery. However, are they as effective in getting users to the content they need in the format they want?

Carol at NCLA 2013

Thursday, October 24, 2013 10:02 am

Since I live very close to the Convention Center, I volunteered for the Local Arrangements Committee. In addition to managing the bag-stuffing operation, I spent several hours staffing the Local Information Booth, from which I gave opinionated advice about local restaurants (and handed out restaurant guides prepared by Hu!). I was thrilled to leave my car at home for three days straight, but was mildly disappointed to discover that I didn’t win the short distance award. To my knowledge, that honor belongs to another ZSR librarian (ask around offline if you want to know who!) and a librarian from High Point U. who lives downtown.

Local Information Table
I still had time to attend some of the sessions. I’ll skip talking about sessions already discussed by other ZSR bloggers, and a few others where my main takeaway was confirming that I am already up on current trends. Here are more details on three sessions where I learned a lot of valuable new-to-me information.

Demystifying Fund Formulas in an Academic Library Setting

Lisa Barricella & Cindy Shirkey, ECU
ECU was looking for a different way to allocate the monograph portion of their budget. Their previous formula – based on factors like credit hours, faculty headcount, grad students, etc. – had several flaws. For instance, using credit hours earned in a subject would overfund areas like foreign languages where there is a lot of enrollment at the lower levels, but not a lot of need for library materials. Also, their old formula – just for monographs – didn’t account for the journals v. books breakdown, which is unique to each discipline. (There was also the procedural issue that the data, which came from sources external to the library, was sometimes very difficult to collect.) Their new formula relied heavily on two factors: how much the collection in, say, Art was used as a proportion of use of the entire collection, and how many ILL requests Art generated in proportion to its holdings. Both of these criteria map more closely to the actual demand for monographic materials in that subject. (The ILL part was not fully implemented due to specific failures in ILLiad reporting.) Finally, the average price of books was considered. While I’m not looking to redo all the monograph budgets anytime soon, I will keep these ideas in mind in case we ever need to overhaul our monograph budget structure.
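For my own future reference, here is a toy version of that kind of use-based formula. The weights, numbers and normalization are mine, not ECU's; the point is just that each subject's allocation follows its share of circulation and ILL demand, scaled by the average book price in that field.

```python
# Back-of-the-envelope use-based fund formula. All figures and weights
# are invented; a real version would pull circulation, ILLiad and
# average-price data from actual reports.

BUDGET = 100_000

subjects = {
    # subject: (circulation, ILL requests, average book price)
    "Art":       (4_000, 150, 65.00),
    "History":   (9_000, 300, 45.00),
    "Chemistry": (1_500, 80, 110.00),
}

W_CIRC, W_ILL = 0.7, 0.3  # arbitrary weights on the two demand signals

total_circ = sum(c for c, _, _ in subjects.values())
total_ill = sum(i for _, i, _ in subjects.values())

# Blend the two demand shares, then scale by average price so that
# fields with expensive books aren't shortchanged.
raw = {
    name: (W_CIRC * circ / total_circ + W_ILL * ill / total_ill) * price
    for name, (circ, ill, price) in subjects.items()
}
scale = BUDGET / sum(raw.values())

for name, score in raw.items():
    print(f"{name:10s} ${score * scale:>10,.2f}")
```

Collecting clean inputs would be the hard part; the arithmetic is the easy part.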

Taming the Hydra: A Strategic Approach

Kim Vassiliadis, Emily King & Chad Haefele, UNC
This presentation was about how UNC corralled a whole bunch of subject guide thingies all over their website, deleted about half of them, and got all the rest into LibGuides with an updated (and consistent) look-and-feel. Then they initiated a plan to make sure that each LibGuide gets some maintenance at least once a year. Guides that are not updated are given “unpublished” status (i.e. suppressed from public view) in LibGuides. I’m impressed that they were able to pull this off in the decentralized environment at UNC. One rule they implemented was that you can’t have more than one row of tabs. Also, every guide has to have an intro paragraph that lists all the tabs. I actually disagree with the intro paragraph idea. More on that in a minute.

I Honestly Had No Idea: LibGuides Usability Assessment in an Academic Library

Randall Bowman, Teresa LePors & Shannon Tennant, Elon
LibGuides best practices is an area where a lot of folks (including yours truly) have lots of ideas but very little evidence. Elon conducted a usability test with some undergraduates to fill the evidence gap. In addition to asking students to perform tasks, they asked some subjective questions at the end.
Some of their conclusions:

  • Students go straight for the search box, any search box. That’s bad news on my guides since the only embedded search box is for “Search this Guide.” That’s also bad if the source with a search box is not the best place to go for that topic. For one task, the relevant guide had a JSTOR search box embedded (also with the pretty “J” logo). However, JSTOR did not contain the particular article that students needed to find.
  • Students don’t read the text on the page. They quickly scan for something that looks familiar.
  • Students ignore the tabs. (Paraphrased comment from audience: I’ve been to three presentations on LibGuides, and they all say that students don’t use tabs! However, LibGuides is built around tabs!)
  • Students were split (on both the tasks and the subjective questions) as to whether “Articles” or “Databases” was the best word for leading students to databases that contain articles. (My own guides hedge on this one by saying e.g. “Linguistics Databases for Finding Journal Articles”)
  • Students don’t scroll, which is bad news if you’re also not using tabs
  • Elon’s main LibGuides page prominently featured the tag cloud. Students didn’t use it, and on the subjective questions they Xed it out as an unnecessary element.
  • Students liked the librarian profiles, which include an embedded chat window. A significant percentage of their chat questions are referred by the research guide pages.

Based on what they learned, Elon is going to lose the tag cloud and have the front page of each LibGuide list all the tabs (like UNC). I disagree with this “intro paragraph” approach since it was also established that students don’t read the text! When I have time, I’m going to edit my LibGuides so that the #1 resource is a search widget, preferably with a pretty logo. If there is no pretty logo, then maybe I’ll add the “Best Bet” star like we use on the database pages.

Carol “at” ER&L

Friday, April 5, 2013 4:45 pm

Local Scenery "at" the conference

No, my post title isn’t a candidate for the “Blog” of “Unnecessary” Quotation Marks. I was one of the virtual attendees at the conference. Since Derrik and Chris have already blogged, I’ll focus my reflections on some of the topics they haven’t covered yet.

What would Google Do?

Elizabeth German from the University of Houston reported on a transaction log analysis. Like us, their home page had several tabs roughly equating to Summon, catalog, e-journals, etc. Forty-three percent of searches were for known items vs. 53% for unknown item searching.

DDA as a Game Changer

Michael Levine-Clark (University of Denver) and Barb Kawecki (YBP)

This presentation was partly an overview of what DDA is. The new part for me is that NISO is working on some Best Practices on how to do DDA. For instance, publishers should keep titles available indefinitely in case a book languishes unused for decades but eventually gets a use. Another key thing they’re working on is how to get titles out of the consideration pool. (At WFU, we are particularly worried about superseded editions staying in the DDA pool alongside the newer editions.)

E or P: A Comparative Analysis of Electronic and Print Book Usage

Christopher C. Brown and Michael Levine-Clark from the University of Denver updated a study that I reported on last year about print and electronic use of the same book. This time, they analyzed Duke University Press books they had bought in print and from ebrary. When both formats were available, print was more likely to be used than e. (619 different print titles were used vs. 451 e-titles. This is from a universe of 1150 titles available in both formats.) However, books that were used in both formats tended to have the highest amount of total usage, and the type of e-usage tended to be more significant (e.g. higher number of pages used, higher number of pages printed). Once again, use in print correlated with use of e. Maybe patrons are using the e-book for discovery, which leads to a checkout in print. (Note this is just a speculated pattern; the data aren’t granular (or invasive) enough to prove that the same person uses a book electronically and then subsequently checks out the print.)

OpenURL Link Resolvers: Tracking Effectiveness over Time

Kenyon Stuart from the University of Michigan studied how SFX and 360 Link and Summon’s Direct Linking succeeded in linking users to full text. After all this time, there’s still a lot of room for improvement in this linking. Supplements, book reviews and newspapers are particular problems. He did find ways to reduce bad links for the most frequently used journals. For instance, he moved better-performing providers up in the 360 Link results list. (We’ve done that too, for instance ranking publisher-based sites generally higher than aggregators and putting print last.)

Reflections on virtual attendance

ER&L is a relevant conference for me, but not my #1 conference for the year. The group virtual attendance option was affordable. It also brought in folks for single sessions who never would have traveled to Texas. Thanks to Derrik for arranging this!

Carol at ASERL Journal Retention Meeting

Thursday, February 21, 2013 12:29 pm

On February 13, I arose ere the dawn to attend the ASERL Journal Retention Steering Committee meeting on the Georgia Tech campus. I don’t normally drink coffee, but I downed two cups once I arrived. (OK, really two cups of coffee-flavored sugared cream.) The opening session reviewed the project (http://www.aserl.org/programs/j-retain/ ) and introduced the WRLC (Washington [i.e., DC] Research Library Consortium) print retention project. Each library representative mentioned what they’re retaining for the group. For WFU, that’s mostly the Wiley-Blackwell journals. Most other libraries are following a subject-based approach or are archiving JSTOR. You can find our commitments by searching for ‘ASERL’ in VuFind.

Cheryle Cole-Bennett covered “How to Document this Retention Agreement within the MARC 583 field.” ZSR currently uses a prescribed basic 583a statement that “This title is in the ASERL Print Journal Archive.” In VuFind and Classic, this note appears only in the staff view (good, since only staff care). However, this data does not feed into the master OCLC record, where an audience of librarians across the country may care to read this data. Also, the minimal data in the 583 does not specify which years of the journal are committed for retention, how long ZSR has committed to retain it, or the conditions of retention (e.g. in a closed-stack facility). For instance, in the case of Psychological Reports, ZSR has newer volumes that aren’t part of the commitment yet. OCLC and ASERL have developed some complicated recommendations for expanding the data in the 583 field, as well as a recommendation to include holdings-level data in OCLC for these titles. (At this point, the presentation got technical, and I hope the catalogers can make sense of the PPT slides.) Questions from the audience included: Can you enhance the 583 by batch? How can you communicate that a title is part of two or more retention plans (e.g. ASERL & TRLN)? Can you have multiple 583 fields or repeated subfields within 583? Apparently, the enhanced 583 does not completely erase the problems when only part of the run is officially retained. ASERL has not yet officially decided what members of the group should do with their 583s, so no action is required yet.
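To picture the difference between our current note and an enhanced one, here is a rough sketch (Python used only for the string formatting). The subfield choices reflect my reading of the MARC 583 definition ($3 materials specified, $a action, $c date of action, $d action interval, $f authorization, $5 institution); the actual ASERL/OCLC recommendation may assign things differently, so treat this as an illustration rather than cataloging guidance.

```python
# Illustration only: our current bare 583 vs. a fuller retention note.
# Volume range, dates and the institution symbol are placeholders.

def format_583(subfields):
    """Render an ordered list of (code, value) pairs for display."""
    return "583 1  " + " ".join(f"${code} {value}" for code, value in subfields)

current = [("a", "This title is in the ASERL Print Journal Archive.")]

enhanced = [
    ("3", "v.1 (1950)-v.100 (2007)"),           # which holdings are committed
    ("a", "committed to retain"),
    ("c", "20130213"),                          # date of the commitment
    ("d", "in perpetuity"),                     # or a specific end date
    ("f", "ASERL Cooperative Journal Retention"),
    ("5", "XXX"),                               # placeholder for our OCLC symbol
]

print(format_583(current))
print(format_583(enhanced))
```

Even an enhanced note only helps with the Psychological Reports situation if someone keeps the $3 materials-specified data current as newer volumes come under the commitment.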

Next, we discussed adding subject categories to the retained journals. Apparently the Deans want this (that so, Lynn?). The group agreed to use Ulrich’s to assign subjects, with subscribers pitching in to provide the headings for the non-Ulrich’s-subscribers (like WFU). I noted that they didn’t specifically assign a library to cover our titles, which might lead to a ‘diffusion of responsibility’ effect. Maybe the deans that care about this the most will step forward with the labor to cover this effort.

Next, Winston Harris from UF demonstrated the Journal Retention and Needs Listing (JRNL) tool that they developed for the group. It answers two questions:

  1. What’s in ASERL?
  2. If I’m weeding, can I fill in a gap for someone else who’s retaining this title?

During lunch, the branding subcommittee (that’s me and Steve Knowlton from U. Memphis) took feedback from the group on potential guidelines for a catchy name. (What?!? “ASERL Cooperative Journal Retention” isn’t snappy enough for you??) I took down some notes, but I don’t want to reveal too much too soon….

Amy Wood from CRL demonstrated the PAPR (Print Archives Preservation Registry) system, which tries to track nationally where journals are being retained. One use case: Say you’re weeding, but you want to make sure someone’s retaining the title. Maybe you’re not comfortable unless several institutions are retaining the title. WorldCat can give you a raw holdings count, but not who’s committed to permanent retention. PAPR fills that need.

Finally, we discussed targeted retention. First, they discussed agriculture titles (Zzzz). Next, they discussed retention of certain big sets. I hoped to turn the discussion to humongous physics runs, but the group was more intent on retaining paper indexes that WFU has long since weeded (like Chem Abstracts and NUC). Most surprising to me was the claim that the pre-1956 NUC is a high-use item. Ah, the world of an ARL….

The meeting ended early, so I took MARTA to the airport, where I played out a round of “Grande Tea vs. Dramamine.” I think the tea won, since I stayed awake all the way home.

Carol in Charleston, with Random Linguistic Side Notes

Wednesday, November 21, 2012 10:47 am

A keynote speaker used ‘gatekept’ as a past participle verb. The OED hasn’t caught on to that yet, but the Google Ngram shows a small but steady increase in the word since 1970.

In “The Changing World of eBooks,” Mike Shatzkin focused on the viewpoint of trade publishers. They’ve discovered that most readers just want to be alone with their books. They don’t care about enhanced content. (He pointed out that this applies to immersive reading for adults. It does not apply to children’s books, how-to, cookbooks and a few other categories.)

In “Ebook Availability Revisited” (the session I saw with Lauren C.), the authors advocated against buying (as opposed to renting) any e-books. They assume that the legal issues surrounding Hathi Trust and Google Books will resolve in a few years. Then we can just buy/lease from them. They promoted subscription over DDA, and came down strongly against doing e-book approvals when DDA is available.

Later that afternoon, I attended the “TRLN Oxford University Press Consortial E-Books Pilot” session, in which representatives from Duke, UNC-CH, NCSU, YBP and OUP described how they initiated a shared-cost model for the entire output of Oxford’s UPSO product. (BTW, the Charleston program copyeditors need to decide among ‘e-book,’ ‘eBook’ or ‘ebook.’) I’m skeptical about the ‘Big Deal’ model spreading to e-monographs, but I nonetheless heard this session with great interest. The schools shared costs based on what they thought fair, e.g. accounting for size of school, nature of school, etc. They also purchased one print copy of all non-STEM books. They placed the print in OSS. Records processing work was shared out, with one school processing all the print and another all the electronic. They didn’t go into this hoping to save money – their hope was to expand access for the same money. Access for alumni was also included. I wrote that down calmly in my notes, and then I got excited since that includes me! The speakers noted one significant challenge: OUP excludes some books from UPSO and releases others online after a delay. Therefore, if a selector sees an OUP book of interest, should they buy the print or not?

I ended the day by giving a “shotgun” presentation on the incentive program we ran last year. See my slides on slideshare.

On Friday, I attended “Overview of the Altmetrics Landscape.” The presenters outlined at least five alternatives to traditional journal-level metrics: Impact Story, Altmetric, Plum Analytics, Science Card and Mendeley. They also mentioned the attributes of an ideal altmetric system: free, API available, relevant, and immune to rigging/gaming. The next steps are to explore use cases, give context to numbers and continue to combat gaming.

My final Friday session was “Changing the DNA of Scholarly Publishing – The Impact of the Digital Leap.” Damon Zucca from OUP discussed how the Oxford Handbooks series changed when it became an online product. From the print world, they knew that authors who met the deadline didn’t want their chapters held hostage by those who didn’t meet deadline. They also learned that users often sought out specific essays. Therefore, the obvious decision was to make chapters available online as soon as possible and not worry as much about the container. Lisa Jones from Georgia Gwinnett College had the privilege of starting a brand-new collection when her college opened in 2006. She had an e-book subscription (i.e. the approach advocated by the Thursday presenter), but dropped it due to insufficient use. (I also heard a speaker in this session say ‘editors-in-chief.’ Google Ngrams reveals this is indeed the popular usage. Can you tell how much I love Google Ngrams?)

Finally on Saturday, various vendors hosted 30-minute sessions on new products. As Classics and Linguistics liaison, my obvious choice was a presentation on the Dictionary of American Regional English (DARE) and the Loeb Classical Library. Both online products are still in development, but I’ve signed up to beta-test DARE.

Carol at MSU LEETS, Part II

Tuesday, August 14, 2012 3:32 pm

The second day of the MSU LEETS conference focused on emerging technologies. These presentations overlapped more with each other so I’ll just give some general impressions. The main speaker was Nicole Hennig from MIT.

  • NUIs (Natural User Interfaces) to replace GUIs
  • Libraries creating “hackerspaces” or “makerspaces” which feature 3-D printers. Our own Dr. Atala got a shout-out in the context of 3-D printers (look at 11:05).

We watched the video “What is a MOOC?” by Dave Cormier. The narrator highlighted “distribution” as a key component of MOOCs (starting at 2:50). He mentioned “pockets and clusters” of information like blogs, tweets, tags, discussion posts, etc. Later, in the context of user experience studies, a major theme was “Fragmentation Hurts.” What is “fragmentation” but a negative way to say “distribution”? Hennig mentioned another of her presentations on this topic, and I followed it up more thoroughly. I learned that “fragmentation” was used in several contexts, such as the annoyances of e-books (e.g. platform proliferation; some work on certain devices but not others). Fragmentation was also mentioned in light of the cloud and one’s personal cache of information. I know I have work information on Google Docs, Evernote, Gmail, acad1, my hard drive and the wiki. I feel the pain when a needed piece of information isn’t in the first (or second) place I look for it.

I also think about distribution/fragmentation in light of the library sharing information with patrons. We currently use Facebook, Twitter, Flickr, (multiple) blogs, ZSReads, Pinterest and maybe others. In some cases, the various outlets point back and forth to each other. I understand that some patrons are on Facebook but not Twitter and vice versa, and we certainly want to reach them all. However, it’s a constant challenge to make sure that our information sharing is aptly described as “distribution” instead of “fragmentation.” What I’m describing is just “events” type info. Don’t get me started on the fragmented sources for research, a problem only partially solved by Summon.

Resources to follow up on:

Augmented Reality Apps that I might consider:

  • Layar
  • Tagwhat
  • Google Goggles
  • LeafSnap

Apparently, MSU LEETS was the first library conference to have an official Instagram feed.

MSU On-Campus Guest House, Dwarfed by Football Stadium

The folks at Mississippi State practiced pleasant hospitality and treated their speakers royally. The MSU community clearly loves its football.

The stadium backed right up against the on-campus hotel where I stayed. (By contrast, I never found the basketball arena.) They also got me a guest pass to their fabulous fitness center. I thoroughly enjoyed my visit and would recommend this conference to others.

