Professional Development

NCICU Assessment Conference

Thursday, May 28, 2009 9:38 pm

North Carolina Independent Colleges and Universities is a consortium of private schools that lobbies in the North Carolina legislature and provides professional development programming for various university units, including libraries. Besides the fact that you just can’t get too much information on assessment, I was interested in this conference because I have just been added to the list of potential reviewers for SACS. In the small-world department, I started talking to the man next to me at lunch and learned that he grew up in the Detroit area and worked at Wayne State in the Center for Urban Studies at the same time I worked there. And he is also a big Red Wings fan (but who isn’t at this time of year)!

Keynote, Steven Sheeley, Southern Association of Colleges and Schools (SACS)

Sheeley talked about accountability in higher education increasing under pressure from a more vocal and demanding public. In this economic downturn, all institutions of higher education have been hit, but public institutions may have been hit the hardest. The book Turn Around (Johns Hopkins, 2009) is prescient in examining fragile institutions that may not survive additional financial stress. There will be a focus on efficiencies across the campus. The recession will affect enrollment in both positive and negative ways (community college enrollment is expected to go through the roof). Strategic decision making, informed by data and analysis, becomes even more important in times of financial stress.

Navigating the SACS Accreditation Process, Steven Sheeley

Institutions are equally responsible for meeting Standards and Policies, but Guidelines (such as faculty qualifications) are informative, not normative. Some standards require a policy and also require that the policy be followed. A decennial review leads to reaffirmation of accreditation every ten years.

Track A (baccalaureate only) or Track B (master’s and above)

Off-site Committee: reviews the Compliance Certification document; each “cluster” reviews 3 or 4 institutions during a two-day meeting in Atlanta. The committee report goes to the institutions and forms the basis of the On-site Committee report. Findings are either compliant or non-compliant.

On-Site Committee: the Focused Report and QEP document are sent to the committee 6 weeks before the visit. The final report is a narrative, and the institution has a chance to respond within 5 months. The Committee on Compliance and Reports (C&R) reviews the on-site report, the response, and the chair’s evaluation of the response to make a decision.

Quality Enhancement Plan (QEP): should still be in the planning stage until it is approved as part of reaffirmation. The QEP should come out of assessment activities, NOT just brainstorming, and needs to focus on student learning outcomes. The lead QEP evaluator can come from outside, even from outside the SACS region.

Common areas of off-site non-compliance: faculty competence (not sending in enough documentation), college-level competencies (Gen Ed), institutional effectiveness, administrative staff evaluations.

Common areas of on-site non-compliance: QEP, college-level competencies, faculty competencies

Common areas of C&R (in monitoring) non-compliance: institutional effectiveness, college-level competencies, QEP, library/learning resources, financial stability

Danger zones: institutional effectiveness

Some sound practices:

  • Think like the reviewer (get off your own campus)
  • Begin early
  • Clear documentation is key
  • The burden of persuasion is on the institution
  • READ the standards carefully
  • Weave assessment throughout
  • Ask if you don’t understand

Fifth-Year Report: a mini compliance report on progress

Use what you’ve got and get what you need: Strengthening your library’s assessment program, Yvonne Belanger and Diane Harvey, Perkins Library, Duke

I met Yvonne when we toured the Center for Instructional Technology at Duke a few months ago. She does assessment for CIT and is a resource for the libraries as well. They presented a very practical program on the basics of library assessment. My favorite quote was “Culture eats strategy for breakfast,” credited to Ford Motor Company in 2006. That rang true, because I once heard an ARL consultant say that it takes 15 years (give or take) to change a culture. So that got me thinking about how I would describe the predominant culture at ZSR, and I think I’d say intensely personal service. But I digress…

The growth in assessment programs in libraries mirrors the growth in assessment in higher education. On many campuses, SACS accreditors say that the library does a better job at assessment than most campus units (and probably a third of the attendees here today are librarians). Libraries singled out for excellence in assessment efforts include the University of Pennsylvania, the University of Washington, and the University of Virginia. A key in library assessment is demonstrating impact on institutional goals. The most successful library assessment programs are those that are infused throughout the organization, rather than being the responsibility of one coordinator or one committee; hence the “culture of assessment” that we hear about. A good rule of thumb, attributed to Susan Gibbons of the University of Rochester, is “don’t guess, just ask.” With the availability of easy web survey tools and built-in focus groups of student employees or Lib100 classes, this is good advice to follow. So, for example, when I see our virtual reference statistics declining, which seems counterintuitive given all other prevailing trends, a good approach would be to get some focus groups of students together and ask them what is going on.

Other nuggets that I picked up and will bring to various people when I get home:

  • Bring together all assessment data in one place on the website so all can access and use it
  • Look into LibStats, a free, open source resource
  • Build evaluative thinking by linking assessment to staff development
  • Give data back: e.g., analyze instruction sections by academic department and report back to department chairs and liaisons (a rough sketch of this follows the list)
  • Our OCLC replication study given at ACRL was cited here as an example of how data tends to be local!
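
To make the “give data back” idea concrete, here is a minimal sketch of how instruction-session data might be summarized by department before reporting back to chairs and liaisons. The file name and column names are hypothetical placeholders, not from the talk:

```python
# Summarize library instruction sessions by academic department so counts
# can be reported back to chairs and liaisons. File and column names are
# hypothetical placeholders.
import pandas as pd

sessions = pd.read_csv("instruction_sessions.csv")
# expected columns: department, course, date, students_attended

summary = (
    sessions.groupby("department")
    .agg(sessions=("course", "count"), students=("students_attended", "sum"))
    .sort_values("sessions", ascending=False)
)
print(summary)

# One file per department, ready to send to its chair or liaison
for dept, rows in sessions.groupby("department"):
    rows.to_csv(f"report_{dept}.csv", index=False)
```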

Better Assessment: The I-E-O Model Revisited, Libby Joyce and Rob Springer, Elon

They used the National Survey of Student Engagement (NSSE, given in the spring semester of freshman year) and the Beginning College Survey of Student Engagement (BCSSE, given to entering freshmen before they get to campus). They did a study of 331 matched pairs using the I-E-O model as a framework: Input (student profile), Environment (engagement), Outcome.

Impressively, they performed an ANOVA (analysis of variance; with covariates included, this design is usually called an ANCOVA) with:

  • Dependent variable: retention
  • Fixed variables: NSSE cognitive variables
  • Covariates: BCSSE cognitive variables

They found very strong statistical significance in their outcomes by looking at the complete picture of the student profile coming in (BCSSE), the environmental intervention, and then the outcome as self-reported in the NSSE survey.
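
As a rough illustration of this kind of analysis (not the presenters’ actual code; the dataset and variable names below are hypothetical stand-ins), an ANOVA with covariates can be run in Python with statsmodels:

```python
# A minimal ANCOVA-style sketch with statsmodels; column names are
# hypothetical stand-ins for the NSSE/BCSSE measures described above.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# One row per matched pair of surveys:
#   retained  - 1 if the student returned, 0 if not (the outcome)
#   nsse_cog  - NSSE cognitive engagement level (the environment factor)
#   bcsse_cog - BCSSE cognitive expectation score (the input covariate)
df = pd.read_csv("matched_pairs.csv")

# Treat the NSSE measure as a categorical fixed factor and adjust for
# the BCSSE covariate.
model = ols("retained ~ C(nsse_cog) + bcsse_cog", data=df).fit()

# Type II ANOVA table: tests the NSSE effect after the covariate.
print(sm.stats.anova_lm(model, typ=2))
```

Since retention is a yes/no outcome, a logistic regression would be the more conventional modeling choice today, but the sketch mirrors the ANOVA-with-covariates design as described.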

You always have to ask: how much data can you gather before it becomes a burden?

What does this mean and where do we go from here? Assessing an information literacy program, Jennifer Hanft and Susan McClintock, Meredith College

Assessment is hard to define, but it has elements of accountability, focus, outcomes alignment, measurement, and acknowledgement of professional knowledge. It doesn’t have to be comprehensive, unchanging, intimidating, exceptional, self-sufficient, or expensive. You are already assessing your program if you are meeting regularly with instruction faculty to discuss best practices, conducting regular student evaluations, grading assignments, conducting pre-tests, or partnering with faculty on assignments.

ACRL Information Literacy Competency Standards for Higher Education: Meredith has a three-tiered program (English 111, English 200, IL Thread) that is incremental and developmental. Where to go from here: continue as part of the Gen Ed program (it survived the revision), extend to graduate programs, and continue to assess.

How they tackle assessment in Information Literacy:

  • Identify a skill
  • Find the applicable ACRL standard
  • Identify appropriate level(s) of program
  • Align with program’s defined outcome
  • Decide how best to measure

One Response to “NCICU Assessment Conference”

  1. Interesting to see some discussion of open source options for assessment!

