
Digitizing - User Assessment



  • Center for Research Libraries.  "Trustworthy Repositories Audit & Certification: Criteria and Checklist."  Version 1.0.  Chicago: Center for Research Libraries, February 2007.
    Envisioned uses of this document include repository planning guidance, planning and development of a certified repository, periodic internal assessment of a repository, analysis of services that hold critical digital content on which institutions rely, and objective third-party evaluation of any repository or archiving service.
  • Ross, Seamus and Andrew McHugh.  “The Role of Evidence in Establishing Trust in Repositories.”  D-Lib Magazine 12 no. 7/8 (July/August 2006).  doi:10.1045/july2006-ross.
    This article arises from work by the Digital Curation Centre (DCC) Working Group examining mechanisms to roll out audit and certification services for digital repositories in the United Kingdom.  It explores the role of evidence within the certification process and identifies examples of the types of evidence (e.g., documentary, observational, and testimonial) that might be desirable during the course of a repository audit.
  • Digital Curation Centre and DigitalPreservationEurope.  "DRAMBORA."  Last updated February 1, 2008.
    Digital Repository Audit Method Based On Risk Assessment -- provides information about and access to a toolkit that is intended to facilitate internal audit by providing repository administrators with a means to assess their capabilities, identify their weaknesses, and recognise their strengths.
  • Nestor Working Group on Trusted Repository Certification.  "Catalog of Criteria for Trusted Digital Repositories."  Version 1.0, June 2006.
    The Network of Expertise in long-term STORage produced this draft that identifies criteria which facilitate the evaluation of digital repository trustworthiness, both at organisational and technical levels.
  • Hedstrom, Margaret L., Christopher A. Lee, Judith S. Olson, and Clifford Lampe.  “'The Old Version Flickers More': Digital Preservation from the User’s Perspective.”  American Archivist 69 no. 1 (Spring/Summer 2006): 159-87.
    Presents the results of two experiments in the CAMiLEON Project that used human subjects to learn about user preferences for different formats of preserved digital objects by testing subjects’ reactions to digital materials that were preserved using three common methods: 1) conversion to a “software-independent” format; 2) migration; and 3) presenting the original bitstream using emulation.
  • Bearman, David and Jennifer Trant.  “Authenticity of Digital Resources: Towards a Statement of Requirements in the Research Process.”  D-Lib Magazine (June 1998).
    Calls for further definition of requirements for digital authenticity in the realm of scholarly digital documentation as well as evaluation of the mechanisms being offered in order to hasten the development of trusted and widely adopted solutions.
  • Lynch, Clifford.  "Authenticity and Integrity in the Digital Environment: An Exploratory Analysis of the Central Role of Trust."  In Authenticity in a Digital Environment, 32-51.  Washington, DC: CLIR, May 2000.
    One goal of this paper is to help distinguish between what can be done in code and what must be left for human and social judgment in areas related to authenticity and integrity.  (The other papers in this report also warrant scanning.)
  • "Integrity and Authenticity of Digital Cultural Heritage Objects."  DigiCULT Thematic Issue 1 (August 2002).
    Includes a position paper, interviews, feedback from an expert roundtable, and a case study.  Focus especially on the Seamus Ross position paper on pp. 7-8.
  • InterPARES Project.  “Authenticity Task Force Report.”  The Long Term Preservation of Authentic Electronic Records: Findings of the InterPARES Project.  2002.
    The goal of the Authenticity Task Force was to identify conceptual requirements for assessing and maintaining the authenticity of electronic records.  Under the original InterPARES research plan, five questions were to be addressed within the domain of investigation assigned to this task force: (1) What are the elements that all electronic records share?, (2) What are the elements that allow us to differentiate between different types of electronic records?, (3) Of those elements, which will permit us to verify their authenticity over time?, (4) Are the elements for verifying authenticity over time the same as those that permit us to verify their authenticity in time, that is, at the point at which they are originally created and transmitted?, and (5) Is it possible to move the elements from their current position to a place where they can more easily be preserved, without affecting validity?


Last updated on 10/03/13, 2:24 pm by emilykader
