
Evaluation of Digital Curation Functions and Repository Services, Part I: Introduction to Evaluation of Systems and Services


Digital Curation Curriculum (DigCCurr) Project

Digital Curation Education Module - Evaluation of Digital Curation Functions and Repository Services, Part I: Introduction to Evaluation of Systems and Services
[Scope | Date and Version | Matrix Elements | Keywords | Learning Objectives | Effort Level | Relationships | Prerequisites | Body of Knowledge |
Supporting Materials | Required Readings | Other Related Resources | Activities | Glossary | Contributors | Change History | Copyright]

Scope

This module discusses the Archival Metrics project, which encourages archivists to assess their websites and how they are used through web analytics, and it introduces the components of effective evaluation.

Date and Version

Elements of Matrix of Digital Curation Knowledge and Competencies Addressed

Mandates, Values and Principles

  • Stakeholders
  • Standardization

Functions and Skills

  • Reference and User Support Services
  • Evaluation and Audit of Curation Functions
  • Research and Development to Support Curation Functions

Professional, Disciplinary, Institutional, or Organizational Context

  • Institutional/Organizational Context

Type of Resource

N/A

Prerequisite Knowledge

N/A

Transition Point in Information Continuum

N/A

Keywords

Web analytics; evaluation; digital repositories; archival metrics; user services

Learning Objectives

Upon completion of the module, the student will be able to:

  • Define web analytics and explain the use of web analytics as a tool for the evaluation of digital repositories.
  • Identify what data digital curators can collect about usage and explain why the data should be collected.
  • Identify the components of effective evaluation of projects and the necessity of each component to useful evaluation.

Level of Effort Required

In-class and out-of-class time is required for students.

Relationships with Other Modules

Evaluation of Digital Curation Functions and Repository Services, Part II

Prerequisite Knowledge or Skills Required

N/A

Body of Knowledge

  1. The need for web analytics
    • new technology
    • users now access many archival collections online, through a repository website
    • lack of archival intelligence
  2. Web Analytics
    • definition
    • importance of understanding website use
    • why web analytics is useful for digital curators
  3. Useful data for digital curators
    • characteristics of useful data
    • approaches to collecting data (see the log-parsing sketch following this outline)
    • literature about web analytics
    • Google Analytics
  4. Introduction to Evaluation
    • difference between evaluation and research
    • stakeholders and audience
    • political context
  5. Evaluation Measurement
    • metrics vs. benchmarks
    • outcome vs. outputs
    • weighing the library's mission, vision, and values
    • efficiency vs. effectiveness
    • Duke LibQUAL+ example
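
The following sketch is not part of the original module; it illustrates, under stated assumptions, the kind of usage data a digital curator can collect directly from a repository's web server access log when a hosted service such as Google Analytics is not in use. It assumes a hypothetical file named access.log written in the common Combined Log Format; the path and field layout will differ on a real server.

# A minimal sketch of server-log-based web analytics for a repository website.
# Assumptions: a hypothetical log file "access.log" in the common Combined
# Log Format; adjust the file name and pattern for the actual server.
from collections import Counter
import re

LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

page_views = Counter()   # requests per page: a basic output measure
visitors = set()         # distinct client IPs: a rough proxy for unique visitors
referrers = Counter()    # where visitors arrived from

with open("access.log") as log:              # hypothetical file name
    for line in log:
        match = LOG_LINE.match(line)
        if not match or match["status"] != "200":
            continue                          # skip malformed lines and error responses
        page_views[match["path"]] += 1
        visitors.add(match["ip"])
        if match["referrer"] not in ("", "-"):
            referrers[match["referrer"]] += 1

print("Total page views:", sum(page_views.values()))
print("Approximate unique visitors:", len(visitors))
print("Most-viewed pages/finding aids:", page_views.most_common(5))
print("Top referrers:", referrers.most_common(5))

Counts of this kind are outputs rather than outcomes; relating them to the repository's mission, vision, and values is the evaluative step introduced in items 4 and 5 above.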

Supporting Materials for Teaching Module

N/A

Required Readings

Duff, Wendy, Sarah Carson, and Jean Dryden. "Developing Standardized Metrics for Assessing Use and User Services for Primary Sources: A Literature Review for the AX-SNet Project."

Perspectives on Outcome Based Evaluation for Libraries and Museums. Washington, DC: Institute of Museum and Library Services, 2000. http://www.imls.gov/pdf/pubobe.pdf

Westell, Mary. "Institutional Repositories: Proposed Indicators of Success." Library Hi Tech 24, no. 2 (2006): 211-26.

Yakel, Elizabeth, and Elizabeth Goldman. "Archival Metrics: An Overview of Current Practices."

Other Related Resources

Yakel, Elizabeth, and Deborah A. Torres. "AI: Archival Intelligence and User Expertise." American Archivist 66, no. 1 (Spring/Summer 2003): 51-78.

Activities and Exercises

Archival Intelligence Exercise: How would you assess the archival intelligence of a repository website's users? After reading "AI: Archival Intelligence and User Expertise," visit a repository website and describe how you might measure archival intelligence using web analytics.

Internet Public Library Exercise (20-25 minutes)

  1. Go to http://ipl.org
  2. Navigate to About the IPL > Mission and Vision Statements of the Internet Public Library
  3. Browse the page

Q: If you were managing the IPL, what evaluative questions (EQs) would you ask to determine if the IPL is accomplishing its goals?

Q: If you were managing the IPL, what sort of data would you want in order to answer these EQs?

Q: Given what data is available, what data would you use to answer these EQs?

  • Who is the audience for an evaluation of IPL? Who are the stakeholders?
  • Given that, how would you present a report?

Glossary

DCC Glossary

Contributors

Amber Cushing
Jeffery Pomerantz
Helen Tibbo

Change History

2009-10-01
Initial conversion to HTML format

Copyright

This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 Unported License.

[http://creativecommons.org/licenses/by-nc-sa/3.0/]
