1
Evaluation of Library Services
  • Module 2 Min-Yen KAN
  • Fundamentals of LIS
2
Why Evaluation?
  • Libraries run like businesses: costs and expenditures must be justified
  • Quantitative data analysis necessitated by evolution into automated and digital libraries


  • Need benchmarks to evaluate effectiveness of library
3
Quantitative metrics
  • Circulation per capita
  • Library visits per capita
  • Program attendance per capita
  • ________________
  • ________________


  • From Output Measures for Public Libraries, Zweizig and Rodger (1982)
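These per-capita measures are simple ratios of an annual total to the service-area population. A minimal sketch, with all figures hypothetical:

```python
def per_capita(total: float, population: int) -> float:
    """Normalize an annual total by the service-area population."""
    return total / population

# Hypothetical annual figures for one public library
population = 48_000          # residents in the service area
circulation = 312_000        # items loaned per year
visits = 156_000             # gate count per year
program_attendance = 9_600   # program attendees per year

print(f"Circulation per capita:        {per_capita(circulation, population):.2f}")
print(f"Library visits per capita:     {per_capita(visits, population):.2f}")
print(f"Program attendance per capita: {per_capita(program_attendance, population):.2f}")
```

Normalizing by population is what makes the figures comparable across libraries of different sizes, which is the point of a benchmark.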
4
Evaluation types
  • Macroevaluation
    • Quantitative
    • Degree of exposure


  • Microevaluation
    • Diagnostic
    • Gives rationale for performance
5
Macroevaluation
  • Axiom
    • The more a book in a library is exposed, the more effective the library.

  • Defining “an exposure” as a simple count
    • Pros
      • Easy; can be applied at different levels of granularity
    • Cons
      • Five 1-day borrowings count as five times the exposure of one 5-day borrowing, even though total loan time is the same
      • __________________________
6
More exact ways to quantify exposure
  • Item-use days: Meier (61)
    • A book borrowed for five days may not be used at all

  • Effective user hours: De Prospo et al. (73)
    • Sample users in library
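The contrast between the two counting schemes can be sketched with hypothetical loan records: a simple borrowing count treats five 1-day loans as five exposures, while item-use days (a rough reading of Meier's measure, here computed as total days on loan) treats them the same as one 5-day loan.

```python
from datetime import date

# Hypothetical loan records for one book: (checkout date, return date)
loans_a = [(date(2024, 1, d), date(2024, 1, d + 1)) for d in range(1, 10, 2)]  # five 1-day loans
loans_b = [(date(2024, 2, 1), date(2024, 2, 6))]                               # one 5-day loan

def simple_count(loans):
    """Exposure as a simple count: each borrowing is one exposure."""
    return len(loans)

def item_use_days(loans):
    """Exposure as item-use days: total days the item was on loan."""
    return sum((returned - borrowed).days for borrowed, returned in loans)

print(simple_count(loans_a), simple_count(loans_b))    # 5 vs 1
print(item_use_days(loans_a), item_use_days(loans_b))  # 5 vs 5
```

Note that item-use days still overstates exposure in the case the slide raises: a book on loan for five days may never be opened, which is what motivates sampling actual users (De Prospo et al.).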
7
Bang for the buck?



  • ___________________________, the greater the exposure.



8
Macroevaluation - Conclusions
  • In general, more exact measures require sampling and tend towards microevaluation
    • So it’s a continuum after all

  • Administrators use a battery of measures, not a single one, to measure effectiveness – Spray (76)
9
Microevaluation Axes
  • Quality
  • Time
  • Costs (including human effort)
  • User satisfaction (ultimately, they are bearing the library’s operating costs)
10
Microevaluation Trends
  • The more concrete the need, the easier to evaluate
  • Failure is harder to measure than success
    • Case 1: Got a sub-optimal resource
    • Case 2: Got some material but not all
11
"Technical Services"
  • Technical Services vs. Public Services


  • Quality
    • Technical Services
      1. Selection and acquisition: size, appropriateness, and balance of collection
      2. Cataloging and indexing: accuracy, consistency, and completeness
    • Public Services
      1. Range of services offered
      2. Helpfulness of shelf order and guidance
      3. Catalog: completeness, accuracy, and ease of use
      4. Reference and retrieval: completeness, accuracy, and percentage success
      5. Document delivery: percentage success


  • Time
    • Technical Services
      1. Delays in acquisition
      2. Delays in cataloging
      3. Productivity of staff
    • Public Services
      1. Hours of service
      2. Response time
      3. Loan periods


  • Cost
    • Technical Services
      1. Unit cost to purchase
      2. Unit cost to process: accession, classify, catalog
    • Public Services
      1. Effort of use: location of library, physical accessibility of collection, assistance from staff
      2. Charges levied
12
Principle of Least Effort
  • Zipf’s Law (49)
    • “Least Effort”

  • A corollary:
    • Mooers’ Law (60):
      • “An information retrieval system will tend not to be used whenever it is more troublesome for a customer to have information than for him not to have it.”
13
More on accessibility and convenience
  • Expanding on this, Allen and Gerstberger (67) note:


    • Perceived accessibility is the most important determinant of the overall extent to which an information channel is used.


    • The more experience a user has with a channel, the more accessible he or she will perceive it to be.


    • After the user finds an accessible source of information, he or she will screen it on the basis of other factors (e.g., technical quality)


    • High motivation to find specific information may prompt users to seek out less-accessible sources of information
14
Accessibility versus Motivation
  • A supply and demand relationship
15
Materials-centered Collection Evaluation
  • What’s the purpose…


    • … of the collection
      • Who’s the readership – academic, public?


    • … of the evaluation
      • Document change in demand?
      • Justify funding?
      • ___________________
      • ___________________
16
Principled methods for
materials-based evaluations
  • Checklist
    • Use standard reference bibliographies to check against
  • Citation
    • Use an initial seed of resources to search for resources that cite and are cited by them


  • Are these methods really distinct?
    • How do people compile bibliographies in the first place?
17
Use-centered Collection Evaluation Methodologies
  • Circulation
  • General
  • Interlibrary Loan (ILL)


  • In-house uses
  • Stack
  • Catalog
18
Effectiveness as Circulation
19
Collection Mapping
  • Idea: Build the collection in parts
    • Prioritize and budget specific subjects
      • Shrink, grow, keep constant
    • Evaluate subjects according to specific use
      • Which courses it serves, and what each course’s needs are

20
Use Factors
  • Age
  • Language
  • Subject
  • Shelf Arrangement
  • Quality
  • Expected Use
    • Popularity
    • Information Chain placement

21
In-House Use Evaluation Methods
  • Mostly done by sampling
  • Table Counting
  • Slip
  • Interviews
  • Observation
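Since counting every in-house use all year is impractical, these methods estimate an annual figure from sampled days. A rough sketch of the table-counting variant, using invented daily counts of items left on tables:

```python
import random
random.seed(0)

# Hypothetical "true" daily in-house use counts for 300 open days
# (in practice the full year is unknown; that is why we sample)
daily_counts = [random.randint(40, 120) for _ in range(300)]

# Table-count method: count items left on tables on a few sample days,
# then scale the sample mean up to the whole year
sample_days = random.sample(daily_counts, 15)
estimate = sum(sample_days) / len(sample_days) * len(daily_counts)

print(f"Estimated annual in-house uses: {estimate:.0f}")
print(f"Actual annual in-house uses:    {sum(daily_counts)}")
```

The sketch also shows the method's weakness: the estimate's accuracy depends on how representative the sampled days are (term time vs. vacation, weekdays vs. weekends).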
22
Material Availability
  • The myth: If we have it, you can get it.


  • The reality: If we have it, you have a chance of getting it.
23
 
24
Evaluating Catalog Use
  • Usability Evaluation
    • Does the interface allow you to find things by the way you want?
    • Experiment on finding a set of resources
    • Return to this issue in UI Module


  • Analysis of Transaction Logs
    • Different types of searches: known-item, by subject
    • Return to this issue in Bibliometrics Module
25
References
  • Baker, S. L. and Lancaster, F. W. (1991). The Measurement and Evaluation of Library Services. Information Resources Press. (On Reserve)