An ongoing multiple volume set with updated index.
Publisher
West Group
Publication Location
St. Paul, MN
Critical Arguments
CA "Contains over 400 separate titles on a broad range of legal topics which, taken together, systematically describe the entire field of American legal doctrine. Documents available for each topic may include a summary, topic contents, each (TRUNCATED)
Type
Conference Proceedings
Title
Interoperability between Multimedia Collections for Content and Metadata-Based Searching
Artiste is a European project developing a cross-collection search system for art galleries and museums. It combines image content retrieval with text-based retrieval and uses RDF mappings in order to integrate diverse databases. The test sites of the Louvre, Victoria and Albert Museum, Uffizi Gallery and National Gallery London provide their own database schema for existing metadata, avoiding the need for migration to a common schema. The system will accept a query based on one museum's fields and convert them, through an RDF mapping, into a form suitable for querying the other collections. The nature of some of the image processing algorithms means that the system can be slow for some computations, so the system is session-based to allow the user to return to the results later. The system has been built within a J2EE/EJB framework, using the JBoss Enterprise Application Server.
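The query-translation step described in the abstract can be sketched in outline. This is a minimal illustration only: the field names and mapping triples below are invented for the example, not taken from the actual Artiste schemas or its RDF vocabulary.

```python
# Hypothetical sketch of cross-collection query translation via an
# RDF-style mapping, in the spirit of the Artiste approach described
# above. All field names and mappings are invented for illustration.

# Mapping triples: (source_field, predicate, target_field), standing in
# for the RDF mappings between each museum's native schema.
MAPPING = [
    ("louvre:titre", "maps_to", "vam:title"),
    ("louvre:auteur", "maps_to", "vam:creator"),
]

def translate_query(query: dict, mapping=MAPPING) -> dict:
    """Rewrite a query expressed in one museum's fields into another's."""
    lookup = {src: dst for src, _, dst in mapping}
    # Fields with no mapping are dropped, since the target collection
    # cannot interpret them.
    return {lookup[field]: value for field, value in query.items()
            if field in lookup}

print(translate_query({"louvre:titre": "La Joconde"}))
# → {'vam:title': 'La Joconde'}
```

In the real system the mapping would be expressed in RDF and applied per collection, so each museum keeps its native schema while still answering queries phrased in another museum's terms.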
Secondary Title
WWW2002: The Eleventh International World Wide Web Conference
Publisher
International World Wide Web Conference Committee
ISBN
1-880672-20-0
Critical Arguments
CA "A key aim is to make a unified retrieval system which is targeted to users' real requirements and which is usable with integrated cross-collection searching. Museums and Galleries often have several digital collections ranging from public access images to specialised scientific images used for conservation purposes. Access from one gallery to another was not common in terms of textual data and not done at all in terms of image-based queries. However the value of cross-collection access is recognised as important for example in comparing treatments and conditions of paintings. While ARTISTE is primarily designed for inter-museum searching it could equally be applied to museum intranets. Within a Museum's intranet there may be systems which are not interlinked due to local management issues."
Conclusions
RQ "The query language for this type of system is not yet standardised but we hope that an emerging standard will provide the session-based connectivity this application seems to require due to the possibility of long query times." ... "In the near future, the project will be introducing controlled vocabulary support for some of the metadata fields. This will not only make retrieval more robust but will also facilitate query expansion. The Louvre's multilingual thesaurus will be used in order to ensure greater interoperability. The system is easily extensible to other multimedia types such as audio and video (eg by adding additional query items such as "dialog" and "video sequence" with appropriate analysers). A follow-up project is scheduled to explore this further. There is some scope for relating our RDF query format to the emerging query standards such as XQuery and we also plan to feed our experience into standards such as the ZNG initiative."
SOW
DC "The Artiste project is a European Commission funded collaboration, investigating the use of integrated content and metadata-based image retrieval across disparate databases in several major art galleries across Europe. Collaborating galleries include the Louvre in Paris, the Victoria and Albert Museum in London, the Uffizi Gallery in Florence and the National Gallery in London." ... "Artiste is funded by the European Community's Framework 5 programme. The partners are: NCR, The University of Southampton, IT Innovation, Giunti Multimedia, The Victoria and Albert Museum, The National Gallery, The research laboratory of the museums of France (C2RMF) and the Uffizi Gallery. We would particularly like to thank our collaborators Christian Lahanier, James Stevenson, Marco Cappellini, John Cupitt, Raphaela Rimabosci, Gert Presutti, Warren Stirling, Fabrizio Giorgini and Roberto Vacaro."
CA "Ironically, electronic records systems make it both possible to more fully capture provenance than paper records systems did and at the same time make it more likely that provenance will be lost and that archives, even if they are preserved, will therefore lack evidential value. This paper explores the relationship between provenance and evidence and its implications for management of paper or electronic information systems." (p. 177)
Conclusions
"Electronic information systems, therefore, present at least two challenges to archivists. The first is that the designers of these systems may have chosen to document less contextual information than may be of interest to archivists when they designed the system. The second is that the data recorded in any given information system will, someday, need to be transferred to another system. ... [A]rchivists will need to return to fundamental archival principles to determine just what they really wanted to save anyway. ... It may be that archivists will be satisfied with the degree of evidential historicity they were able to achieve in paper based record systems, in which case there are very few barriers to implementing successful electronic based archival environments. Or archivists may decide that the fuller capability of tracking the actual participation of electronic data objects in organizational activities needs to be documented by archivally satisfactory information systems, in which case they will need to define those levels of evidential historicity that must be attained, and specify the systems requirements for such environments. ... At a meeting on electronic records management research issues sponsored by the National Historical Publications and Records Commission in January 1991, participants identified the concept of technological and economic plateaux in electronic data capture and archiving as an important arena for research ... Hopefully this research will produce information to help archivists make decisions regarding the amount of contextual information they can afford to capture and the requirements of systems designed to document context along with managing data content. ... I will not be surprised as we refine our concepts of evidential historicity to discover that the concept of provenance takes on even greater granularity." (p. 192-193)
CA Makes a distinction between archival description of the record at hand and documentation of the context of its creation. Argues the importance of the latter in establishing the evidentiary value of records, and criticizes ISAD(G) for its failure to account for context. "(1) The subject of documentation is, first and foremost, the activity that generated the records, the organizations and individuals who used the records, and the purposes to which the records were put. (2). The content of the documentation must support requirements for the archival management of records, and the representations of data should support life cycle management of records. (3) The requirements of users of archives, especially their personal methods of inquiry, should determine the data values in documentation systems and guide archivists in presenting abstract models of their systems to users." (p. 45-46)
Phrases
<P1> [T]he ICA Principles rationalize existing practice -- which the author believes as a practical matter we cannot afford; which fail to provide direct access for most archives users; and which do not support the day-to-day information requirements of archivists themselves. These alternatives are also advanced because of three, more theoretical, differences with the ICA Principles: (1) In focusing on description rather than documentation, they overlook the most salient characteristic of archival records: their status as evidence. (2) In proposing specific content, they are informed by the bibliographic tradition rather than by concrete analysis of the way in which information is used in archives. (3) In promoting data value standardization without identifying criteria or principles by which to identify appropriate language or structural links between the objects represented by such terms, they fail adequately to recognize that the data representation rules they propose reflect only one particular, and a limiting, implementation. (p. 33-34) <P2> Archives are themselves documentation; hence I speak here of "documenting documentation" as a process the objective of which is to construct a value-added representation of archives, by means of strategic information capture and recording into carefully structured data and information access systems, as a mechanism to satisfy the information needs of users including archivists. Documentation principles lead to methods and practices which involve archivists at the point, and often at the time, of records creation. In contrast, archival description, as described in the ICA Principles[,] is "concerned with the formal process of description after the archival material has been arranged and the units or entities to be described have been determined." 
(1.7) I believe documentation principles will be more effective, more efficient and provide archivists with a higher stature in their organizations than the post accessioning description principles proposed by the ICA. <warrant> (p. 34) <P3> In the United States, in any case, there is still no truly theoretical formulation of archival description principles that enjoys a widespread adherence, in spite of the acceptance of rules for description in certain concrete application contexts. (p. 37) <P4> [T]he MARC-AMC format and library bibliographic practices did not adequately reflect the importance of information concerning the people, corporate bodies and functions that generated records, and the MARC Authority format did not support appropriate recording of such contexts and relations. <warrant> (p. 37) <P5> The United States National Archives, even though it had contributed to the data dictionary which led to the MARC content designation, all the data which it believed in 1983 that it would want to interchange, rejected the use of MARC two years later because it did not contain elements of information required by NARA for interchange within its own information systems. <warrant> (p. 37) <P6> [A]rchivists failed to understand then, just as the ISAD(G) standard fails to do now, that rules for content and data representation make sense in the context of the purposes of actual exchanges or implementation, not in the abstract, and that different rules or standards for end-products may derive from the same principles. (p. 
38) <P7> After the Committee on Archival Information Exchange of the Society of American Archivists was confronted with proposals to adopt many different vocabularies for a variety of different data elements, a group of archivists who were deeply involved in standards and description efforts within the SAA formed an Ad Hoc Working Group on Standards for Archival Description (WGSAD) to identify what types of standards were needed in order to promote better description practices. WGSAD concluded that existing standards were especially inadequate to guide practice in documenting contexts of creation. Since then, considerable progress has been made in developing frameworks for documentation, archival information systems architecture and user requirements analysis, which have been identified as the three legs on which the documenting documentation platform rests. <warrant> (p. 38) <P8> Documentation of organizational activity ought to begin long before records are transferred to archives, and may take place even before any records are created -- at the time when new functions are assigned to an organization. (p. 39) <P9> It is possible to identify records which will be created and their retention requirements before they are created, because their evidential value and informational content are essentially predetermined. (p. 39) <P10> Archivists can actively intervene through regulation and guidance to ensure that the data content and values depicting activities and functions are represented in such a way that will make them useful for subsequent management and retrieval of the records resulting from these activities. This information, together with systems documentation, defines the immediate information system context out of which the records were generated, in which they are stored, and from which they were retrieved during their active life. (p.
39) <P11> Documentation of the link between data content and the context of creation and use of the records is essential if records (archives or manuscripts) are to have value as evidence. (p. 39) <P12> [C]ontextual documentation capabilities can be dramatically improved by having records managers actively intervene in systems design and implementation.  The benefits of proactive documentation of the context of records creation, however, are not limited to electronic records; the National Archives of Canada has recently revised its methods of scheduling to ensure that such information about important records systems and contexts of records creation will be documented earlier. <warrant> (p. 39) <P13> Documentation of functions and of information systems can be conducted using information created by the organization in the course of its own activity, and can be used to ensure the transfer of records to archives and/or their destruction at appropriate times. It ensures that data about records which were destroyed as well as those which were preserved will be kept, and it takes advantage of the greater knowledge of records and the purposes and methods of day-to-day activity that exist closer to the events. (p. 40) <P14> The facts of processing, exhibiting, citing, publishing and otherwise managing records becomes significant for their meaning as records, which is not true of library materials. (p. 41) <P15> [C]ontent and data representation requirements ought to be derived from analysis of the uses to which such systems must be put, and should satisfy the day to day information requirements of archivists who are the primary users of archives, and of researchers using archives for primary evidential purposes. (p. 41) <P16> The ICA Commission proposes a principle by which archivists would select data content for archival descriptions, which is that "the structure and content of representations of archival material should facilitate information retrieval." 
(5.1) Unfortunately, it does not help us to understand how the Commission selected the twenty-five elements of information identified as its standard, or how we could apply the principle to the selection of additional data content. It does, however, serve as a prelude to the question of which principles should guide archivists in choosing data values in their representations. (p. 42) <P17> Libraries have found that subject access based on titles, tables of contents, abstracts, indexes and similar formal subject analysis by-products of publishing can support most bibliographic research, but the perspectives brought to materials by archival researchers are both more varied and likely to differ from those of the records creators. (p. 43) <P18> The user should not only be able to employ a terminology and a perspective which are natural, but also should be able to enter the system with a knowledge of the world being documented, without knowing about the world of documentation. (p. 44) <P19> Users need to be able to enter the system through the historical context of activity, construct relations in that context, and then seek avenues down into the documentation. This frees them from trying to imagine what records might have survived -- documentation assists the user to establish the non-existence of records as well as their existence -- or to fathom how archivists might have described records which did survive. (p. 44) <P20> When they departed from the practices of Brooks and Schellenberg in order to develop means for the construction of union catalogues of archival holdings, American archivists were not defining new principles, but inventing a simple experiment. After several years of experience with the new system, serious criticisms of it were being leveled by the very people who had first devised it. (p. 45)
Conclusions
RQ "In short, documentation of the three aspects of records creation contexts (activities, organizations and their functions, and information systems), together with representation of their relations, is essential to the concept of archives as evidence and is therefore a fundamental theoretical principle for documenting documentation. Documentation is a process that captures information about an activity which is relevant to locating evidence of that activity, and captures information about records that are useful to their ongoing management by the archival repository. The primary source of information is the functions and information systems giving rise to the records, and the principal activity of the archivist is the manipulation of data for reference files that create richly-linked structures among attributes of the records-generating context, and which point to the underlying evidence or record." (p. 46)
Type
Journal
Title
Warrant and the Definition of Electronic Records: Questions Arising from the Pittsburgh Project
The University of Pittsburgh Electronic Recordkeeping Research Project established a model for developing functional requirements and metadata specifications based on warrant, defined as the laws, regulations, best practices, and customs that regulate recordkeeping. Research has shown that warrant can also increase the acceptance by records creators and others of functional requirements for recordkeeping. This article identifies areas related to warrant that require future study. The authors conclude by suggesting that requirements for recordkeeping may vary from country to country and industry to industry because of differing warrant.
Publisher
Kluwer Academic Publishers
Publication Location
The Netherlands
Critical Arguments
CA Poses a long series of questions and issues concerning warrant and its ability to increase the acceptance of recordkeeping requirements. Proposes that research be done to answer these questions. Discusses two different views about whether warrant can be universal and/or international.
Phrases
<P1> As we proceeded with the project [the University of Pittsburgh Electronic Recordkeeping Research Project] we ultimately turned our attention to the idea of the literary warrant -- defined as the mandate from law, professional best practices, and other social sources requiring the creation and continued maintenance of records. Wendy Duff's doctoral research found that warrant can increase the acceptance of some recordkeeping functional requirements, and therefore it has the potential to build bridges between archival professionals and others concerned with or responsible for recordkeeping. We did not anticipate the value of the literary warrant and, in the hindsight now available to us, the concept of the warrant may turn out to be the most important outcome of the project. <P2> In Wendy Duff's dissertation, legal, auditing and information science experts evaluated the authority of the sources of warrant for recordkeeping. This part of the study provided evidence that information technology standards may lack authority, but this finding requires further study. Moreover, the number of individuals who evaluated the sources of warrant was extremely small. A much larger number of standards should be included in a subsequent study and a greater number of subjects are needed to evaluate these standards. <P3> We found a strong relationship between warrant and the functional requirements for electronic recordkeeping systems. Research that studies this relationship and determines the different facets that may affect it might provide more insights into the relationship between the warrant and the functional requirements. <P4> [W]e need to develop a better understanding of the degree to which the warrant for recordkeeping operates in various industries, disciplines, and other venues.
Some institutions operate in a much more regulated environment than others, suggesting that the importance of records and the understanding of records may vary considerably between institutional types, across disciplines and from country to country. <P5> We need to consider whether the recordkeeping functional requirements for evidence hold up or need to be revised for recordkeeping requirements for corporate memory, accountability, and cultural value -- the three broad realms now being used to discuss records and recordkeeping. <P6> The warrant gathered to date has primarily focused on technical, legal or the administrative value of records. A study that tested the effectiveness of warrant that supported the cultural or historical mandate of archives might help archivists gain support for their archival programs. <P7> This concern leads us to a need for more research about the understanding of records and recordkeeping in particular institutions, disciplines, and societies. <P8> A broader, and perhaps equally important question, is whether individual professionals and workers are even aware of their regulatory environment. <P9> How do the notion of the warrant and the recordkeeping functional requirements relate to the ways in which organizations work and the management tools they use, such as business process reengineering and data warehousing? <P10> What are the economic implications for organizations to comply with the functional requirements for recordkeeping in evidence? <P11> Is there a warrant and separate recordkeeping functional requirements for individual or personal recordkeeping? <P12> As more individuals, especially writers, financial leaders, and corporate and societal innovators, adopt electronic information technologies for the creation of their records, an understanding of the degree of warrant for such activity and our ability to use this warrant to manage these recordkeeping systems must be developed.
<P13> We believe that archivists and records managers can improve their image if they become experts in all aspects of recordkeeping. This will require a thorough knowledge of the legal, auditing, information technology, and management warrant for recordkeeping. <P14> The medical profession emphasizes that [sic] need to practice evidence-based medicine. We need to find out what would happen if records managers followed suit, and emphasized and practiced warrant-based recordkeeping. Would this require a major change in what we do, or would it simply be a new way to describe what we have always done? <P15> More work also has to be done on the implications of warrant and the functional requirements for the development of viable archives and records management programs. <P16> The warrant concept, along with the recordkeeping functional requirements, seems to possess immense pedagogical implications for what future archivists or practicing archivists, seeking to update their skills, should or would be taught. <P17> We need to determine the effectiveness of using the warrant and recordkeeping functional requirements as a basis for graduate archival and records management education and for developing needed topics for research by masters and doctoral students. <P18> The next generation of educational programs might be those located in other professional schools, focusing on the particular requirements for records in such institutions as corporations, hospitals, and the courts. <P19> We also need to determine the effectiveness of using the warrant and recordkeeping functional requirements in continuing education, public outreach, and advocacy for helping policy makers, resource allocators, administrators, and others to understand the importance of archives and records.
Can the warrant and recordkeeping functional requirements support or foster stronger partnerships with other professions, citizen action groups, and other bodies interested in accountability in public organizations and government? <P20> Focusing on the mandate to keep and manage records, instead of the records as artifacts or interesting stuff, seems much more relevant in late twentieth century society. <P21> We need to investigate the degree to which records managers and archivists can develop a universal method for recordkeeping. ... Our laws, regulations, and best practices are usually different from country to country. Therefore, must any initiative to develop warrant also be bounded by our borders? <P22> A fundamental difference between the Pittsburgh Project and the UBC project is that UBC wishes to develop a method for managing and preserving electronic records that is applicable across all juridical systems and cultures, while the Pittsburgh Project is proposing a model that enables recordkeeping to be both universal and local at the same time. <P23> We now have a records management standard from Australia which is relevant for most North American records programs. It has been proposed as an international standard, although it is facing opposition from some European countries. Can there be an international standard for recordkeeping and can we develop one set of procedures which will be accepted across nations? Or must methods of recordkeeping be adapted to suit specific cultures, juridical systems, or industries?
Over the last decade a number of writers have encouraged archivists to develop strategies and tactics to redefine their role and to insert themselves into the process of designing recordkeeping systems. This paper urges archivists to exploit the authority inherent in the laws, regulations, standards, and professional best practices that dictate recordkeeping specifications to gain greater acceptance for the requirements for electronic evidence. Furthermore, it postulates that this proactive approach could assist in gaining greater respect for the archival profession.
Critical Arguments
CA The use of authoritative sources of warrant would improve acceptance of electronic records as evidence and create greater respect for the archival profession.
Phrases
<P1> The legal, administrative, fiscal, or information value of records is dependent upon the degree of trust society places in records as reliable testimony or evidence of the acts they purport to document. In turn, this trust is dependent on society's faith in the procedures that control the creation and maintenance of the record. <P2> [S]ociety bestows some methods of recordkeeping and record creating with an authority or 'warrant' for generating reliable records. <P3> David Bearman first proposed the idea of "literary warrant." <P4> [S]tatements of warrant provide clear instructions on how records should be kept and delineate elements needed for the records to be complete. <P5> The information technology field promulgates standards, but in North America adherence to them is voluntary rather than obligatory. <P6> The University of Pittsburgh Electronic Recordkeeping Project suggested that requirements for electronic recordkeeping should derive from authoritative sources, such as the law, customs, standards, and professional best practices accepted by society and codified in the literature of different professions concerned with records and recordkeeping rather than developed in isolation. <P7> On their own, archival requirements for recordkeeping have very little authority as no authoritative agencies such as standards boards or professional associations have yet to endorse them [sic] and few archivists have the authority to insist that their organizations follow them. <P8> An NHPRC study suggested that archivists have not been involved in the process of meeting the challenges of electronic records because they are undervalued by their colleagues, or, in other words, are not viewed as a credible source.
Conclusions
RQ "By highlighting the similarity between recordkeeping requirements and the requirements delineated in authoritative statements in the law, auditing standards, and professional best practices, archivists will increase the power of their message. ... If archivists are to take their rightful place as regulators of an organization's documentary requirements, they will have to reach beyond their own professional literature and understand the requirements for recordkeeping imposed by other professions and society in general. Furthermore, they will have to study methods of increasing the acceptance of their message and the impact and power of warrant."
Type
Journal
Title
Accessing essential evidence on the web: Towards an Australian recordkeeping metadata standard
CA Standardized recordkeeping metadata allows for access to essential evidence of business activities and promotes reliability and authenticity. The Australian records and metadata community has been working hard to define standards and identify requirements, as well as to support interoperability.
Phrases
<P1> But records, as accountability traces and evidence of business activity, have additional metadata requirements. Authoritative, well-structured metadata which specifies their content, structure, context, and essential management needs must be embedded in, wrapped around and otherwise persistently linked to them from the moment they are created if they are to continue to function as evidence. (p.2) <P2> People do business in social and organizational contexts that are governed by external mandates (e.g. social mores, laws) and internal mandates (e.g. policies, business rules). Mandates establish who is responsible for what, and govern social and organizational activity, including the creation of full and accurate records. <warrant> (p.3)
Type
Journal
Title
The role of standards in the archival management of electronic records
CA Technical standards, developed by national and international organizations, are increasingly important in electronic recordkeeping. Thirteen standards are summarized and their sponsoring organizations described.
Phrases
<P1> The challenge to archivists is to make sure that the standards being applied to electronic records systems today are adequate to ensure the long-term preservation and use of information contained in the systems. (p.31) <P2> While consensus can easily be established that data exchange standards offer a wealth of potential benefits, there are also a number of real barriers to implementation that make the road ahead for archivists a very bumpy one. (p.41)
Conclusions
RQ What is the current state of standardization in the archival management of electronic records, and what are the issues involved?
Type
Journal
Title
Managing the Present: Metadata as Archival Description
Traditional archival description undertaken at the terminal stages of the life cycle has had two deleterious effects on the archival profession. First, it has resulted in enormous, and in some cases insurmountable, processing backlogs. Second, it has limited our ability to capture crucial contextual and structural information throughout the life cycle of record-keeping systems that are essential for fully understanding the fonds in our institutions. This shortcoming has resulted in an inadequate knowledge base for appraisal and access provision. Such complications will only become more magnified as distributed computing and complex software applications continue to expand throughout organizations. A metadata strategy for archival description will help mitigate these problems and enhance the organizational profile of archivists who will come to be seen as valuable organizational knowledge and accountability managers.
Critical Arguements
CA "This essay affirms this call for evaluation and asserts that the archival profession must embrace a metadata systems approach to archival description and management." ... "It is held here that the requirements for records capture and description are the requirements for metadata."
Phrases
<P1> New archival organizational structures must be created to ensure that records can be maintained in a usable form. <warrant> <P2> The recent report of the Society of American Archivists (SAA) Committee on Automated Records and Techniques (CART) on curriculum development has argued that archivists need to "understand the nature and utility of metadata and how to interpret and use metadata for archival purposes." <warrant> <P3> The report advises archivists to acquire knowledge on the meanings of metadata, its structures, standards, and uses for the management of electronic records. Interestingly, the requirements for archival description immediately follow this section and note that archivists need to isolate the descriptive requirements, standards, documentation, and practices needed for managing electronic records. <warrant> <P4> Clearly, archivists need to identify what types of metadata will best suit their descriptive needs, underscoring the need for the profession to develop strategies and tactics to satisfy these requirements within active software environments. <warrant> <P5> Underlying the metadata systems strategy for describing and managing electronic information technologies is the seemingly universal agreement amongst electronic records archivists on the requirement to intervene earlier in the life cycle of electronic information systems. <warrant> <P6> Metadata has loomed over the archival management of electronic records for over five years now and is increasingly being promised as a basic control strategy for managing these records. <warrant> <P7> However, she [Margaret Hedstrom] also warns that as descriptive practices shift from creating descriptive information to capturing description along with the records, archivists may discover that managing the metadata is a much greater challenge than managing the records themselves. 
<P8> Archivists must seek to influence the creation of record-keeping systems within organizations by connecting the transaction that created the data to the data itself. Such a connection will link informational content, structure, and the context of transactions. Only when these conditions are met will we have records and an appropriate infrastructure for archival description. <warrant> <P9> Charles Dollar has argued that archivists increasingly will have to rely upon and shape the metadata associated with electronic records in order to fully capture provenance information about them. <warrant> <P10> Bearman proposes a metadata systems strategy, which would focus more explicitly on the context out of which records arise, as opposed to concentrating on their content. This axiom is premised on the assumption that "lifecycle records systems control should drive provenance-based description and link to top-down definitions of holdings." <warrant> <P11> Bearman and Margaret Hedstrom have built upon this model and contend that properly specified metadata capture could fully describe systems while they are still active and eliminate the need for post-hoc description. The fundamental change wrought in this approach is the shift from doing things to records (surveying, scheduling, appraising, disposing/accessioning, describing, preserving, and accessing) to providing policy direction for adequate documentation through management of organizational behavior (analyzing organizational functions, defining business transactions, defining record metadata, identifying control tactics, and establishing the record-keeping regime). Within this model archivists focus on steering how records will be captured (and that they will be captured) and how they will be managed and described within record-keeping systems while they are still actively serving their parent organization. 
<P12> Through the provision of policy guidance and oversight, organizational record-keeping is managed in order to ensure that the "documentation of organizational missions, functions, and responsibilities ... and reporting relationships within the organization, will be undertaken by the organizations themselves in their administrative control systems." <warrant> <P13> Through a metadata systems approach, archivists can realign themselves strategically as managers of authoritative information about organizational record-keeping systems, providing for the capture of information about each system, its contextual attributes, its users, its hardware configurations, its software configurations, and its data configurations. <warrant> <P14> The University of Pittsburgh's functional requirements for record-keeping provides a framework for such information management structure. These functional requirements are appropriately viewed as an absolute ideal, requiring testing within live systems and organizations. If properly implemented, however, they can provide a concrete model for metadata capture that can automatically supply many of the types of descriptive information both desired by archivists and required for elucidating the context out of which records arise. <P15> It is possible that satisfying these requirements will contribute to the development of a robust archival description process integrating "preservation of meaning, exercise of control, and provision of access'" within "one principal, multipurpose descriptive instrument" hinted at by Luciana Duranti as a possible outcome of the electronic era. <P16> However, since electronic records are logical and not physical entities, there is no physical effort required to access and process them, just mental modelling. 
<P17> Depending on the type of metadata that is built into and linked to electronic information systems, it is possible that users can identify individual records at the lowest level of granularity and still see the top-level process it is related to. Furthermore, records can be reaggregated based upon user-defined criteria though metadata links that track every instance of their use, their relations to other records, and the actions that led to their creation. <P18> A metadata strategy for archival description will help to mitigate these problems and enhance the organizational profile of archivists, who will come to be seen as valuable organizational knowledge and accountability managers. <warrant>
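The metadata-systems approach in P8 and P17 — binding each record to the transaction that produced it and letting users reaggregate records by any contextual attribute — can be sketched in code. This is a minimal illustration only, not any model proposed by Wallace, Bearman, or the Pittsburgh project; all field names and sample records are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class RecordMetadata:
    """Metadata captured at creation time: content identifier, structure,
    and the transactional context are bound together (cf. P8)."""
    record_id: str
    transaction: str           # the business transaction that produced the record
    creator: str
    business_function: str     # the top-level process the record relates to (cf. P17)
    structure: Dict[str, str]  # e.g. file format, encoding
    use_history: List[str] = field(default_factory=list)

def reaggregate(records, key):
    """Regroup records by any captured contextual attribute (cf. P17's
    user-defined reaggregation through metadata links)."""
    groups = {}
    for r in records:
        groups.setdefault(getattr(r, key), []).append(r.record_id)
    return groups

# Hypothetical records from two business processes.
records = [
    RecordMetadata("r1", "approve-grant", "jdoe", "funding", {"format": "text/plain"}),
    RecordMetadata("r2", "approve-grant", "asmith", "funding", {"format": "text/plain"}),
    RecordMetadata("r3", "hire-staff", "jdoe", "personnel", {"format": "text/plain"}),
]
by_function = reaggregate(records, "business_function")
```

Because context is captured as structured fields rather than narrative description, the same records can equally be regrouped by `creator` or `transaction` without any post-hoc processing.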
Conclusions
RQ "First and foremost, the promise of metadata for archival description is contingent upon the creation of electronic record-keeping systems as opposed to a continuation of the data management orientation that seems to dominate most computer applications within organizations." ... "As with so many other aspects of the archival endeavour, these requirements and the larger metadata model for description that they are premised upon necessitate further exploration through basic research."
SOW
DC "In addition to New York State, recognition of the failure of existing software applications to capture a full complement of metadata required for record-keeping and the need for such records management control has also been acknowledged in Canada, the Netherlands, and the World Bank." ... "In conjunction with experts in electronic records management, an ongoing research project at the University of Pittsburgh has developed a set of thirteen functional requirements for record-keeping. These requirements provide a concrete metadata tool sought by archivists for managing and describing electronic records and electronic record-keeping systems." ... David A. Wallace is an Assistant Professor at the School of Information, University of Michigan, where he teaches in the areas of archives and records management. He holds a B.A. from Binghamton University, a Masters of Library Science from the University at Albany, and a doctorate from the University of Pittsburgh. Between 1988 and 1992, he served as Records/Systems/Database Manager at the National Security Archive in Washington, D.C., a non-profit research library of declassified U.S. government records. While at the NSA he also served as Technical Editor to their "The Making of U.S. Foreign Policy" series. From 1993-1994, he served as a research assistant to the University of Pittsburgh's project on Functional Requirements for Evidence in Recordkeeping, and as a Contributing Editor to Archives and Museum Informatics: Cultural Heritage Informatics Quarterly. From 1994 to 1996, he served as a staff member to the U.S. Advisory Council on the National Information Infrastructure. In 1997, he completed a dissertation analyzing the White House email "PROFS" case. 
Since arriving at the School of Information in late 1997, he has served as Co-PI on an NHPRC funded grant assessing strategies for preserving electronic records of collaborative processes, as PI on an NSF Digital Government Program funded planning grant investigating the incorporation of born digital records into a FOIA processing system, co-edited Archives and the Public Good: Accountability and Records in Modern Society (Quorum, 2002), and was awarded ARMA International's Britt Literary Award for an article on email policy. He also serves as a consultant to the South African History Archives Freedom of Information Program and is exploring the development of a massive digital library of declassified imaged/digitized U.S. government documents charting U.S. foreign policy.
Type
Electronic Journal
Title
Keeping Memory Alive: Practices for Preserving Digital Content at the National Digital Library Program of the Library of Congress
CA An overview of the major issues and initiatives in digital preservation at the Library of Congress. "In the medium term, the National Digital Library Program is focusing on two operational approaches. First, steps are taken during conversion that are likely to make migration or emulation less costly when they are needed. Second, the bit streams generated by the conversion process are kept alive through replication and routine refreshing supported by integrity checks. The practices described here provide examples of how those steps are implemented to keep the content of American Memory alive."
Phrases
<P1> The practices described here should not be seen as policies of the Library of Congress; nor are they suggested as best practices in any absolute sense. NDLP regards them as appropriate practices based on real experience, the nature and content of the originals, the primary purposes of the digitization, the state of technology, the availability of resources, the scale of the American Memory digital collection, and the goals of the program. They cover not just the storage of content and associated metadata, but also aspects of initial capture and quality review that support the long-term retention of content digitized from analog sources. <P2> The Library recognizes that digital information resources, whether born digital or converted from analog forms, should be acquired, used, and served alongside traditional resources in the same format or subject area. Such responsibility will include ensuring that effective access is maintained to the digital content through American Memory and via the Library's main catalog and, in coordination with the units responsible for the technical infrastructure, planning migration to new technology when needed. <P3> Refreshing can be carried out in a largely automated fashion on an ongoing basis. Migration, however, will require substantial resources, in a combination of processing time, out-sourced contracts, and staff time. Choice of appropriate formats for digital masters will defer the need for large-scale migration. Integrity checks and appropriate capture of metadata during the initial capture and production process will reduce the resource requirements for future migration steps. <warrant> We can be certain that migration of content to new data formats will be necessary at some point. The future will see industrywide adoption of new data formats with functional advantages over current standards. 
However, it will be difficult to predict exactly which metadata will be useful to support migration, when migration of master formats will be needed, and the nature and extent of resource needs. Human experts will need to decide when to undertake migration and develop tools for each migration step. <P4> Effective preservation of resources in digital form requires (a) attention early in the life-cycle, at the moment of creation, publication, or acquisition and (b) ongoing management (with attendant costs) to ensure continuing usability. <P5> The National Digital Library Program has identified several categories of metadata needed to support access and management for digital content. Descriptive metadata supports discovery through search and browse functions. Structural metadata supports presentation of complex objects by representing relationships between components, such as sequences of images. In addition, administrative metadata is needed to support management tasks, such as access control, archiving, and migration. Individual metadata elements may support more than one function, but the categorization of elements by function has proved useful. <P6> It has been recognized that metadata representations appropriate for manipulation and long-term retention may not always be appropriate for real-time delivery. <P7> It has also been realized that some basic descriptive metadata (at the very least a title or brief description) should be associated with the structural and administrative metadata. <P8> During 1999, an internal working group reviewed past experience and prototype exercises and compiled a core set of metadata elements that will serve the different functions identified. This set will be tested and refined as part of pilot activities during 2000. <P9> Master formats are well documented and widely deployed, preferably formal standards and preferably non-proprietary. 
Such choices should minimize the need for future migration or ensure that appropriate and affordable tools for migration will be developed by the industry. <warrant>
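The NDLP practice described above — keeping bit streams alive "through replication and routine refreshing supported by integrity checks" — reduces, in code, to recording a fixity value at capture and verifying it before each refresh cycle. The sketch below assumes SHA-256; the article does not specify which checksum algorithm the Library uses.

```python
import hashlib

def checksum(data: bytes) -> str:
    """Fixity value recorded when a digital master is created."""
    return hashlib.sha256(data).hexdigest()

def verify(data: bytes, recorded: str) -> bool:
    """Integrity check run during routine refreshing: the bit stream is
    recopied to new media only if its checksum matches the recorded value."""
    return checksum(data) == recorded

# Hypothetical master bit stream and its recorded fixity value.
master = b"TIFF master image bit stream"
recorded = checksum(master)

intact = verify(master, recorded)               # True: safe to refresh
corrupted = verify(master + b"\x00", recorded)  # False: restore from a replica
```

Replication is what makes a failed check recoverable: when `verify` returns `False`, the damaged copy is replaced from a replica whose checksum still matches.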
Conclusions
RQ "Developing long-term strategies for preserving digital resources presents challenges associated with the uncertainties of technological change. There is currently little experience on which to base predictions of how often migration to new formats will be necessary or desirable or whether emulation will prove cost-effective for certain categories of resources. ... Technological advances, while sure to present new challenges, will also provide new solutions for preserving digital content."
Type
Electronic Journal
Title
A Spectrum of Interoperability: The Site for Science Prototype for the NSDL
"Currently, NSF is funding 64 projects, each making its own contribution to the library, with a total annual budget of about $24 million. Many projects are building collections; others are developing services; a few are carrying out targeted research. The NSDL is a broad program to build a digital library for education in science, mathematics, engineering and technology. It is funded by the National Science Foundation (NSF) Division of Undergraduate Education. . . . The Core Integration task is to ensure that the NSDL is a single coherent library, not simply a set of unrelated activities. In summer 2000, the NSF funded six Core Integration demonstration projects, each lasting a year. One of these grants was to Cornell University and our demonstration is known as Site for Science. It is at http://www.siteforscience.org/ [Site for Science]. In late 2001, the NSF consolidated the Core Integration funding into a single grant for the production release of the NSDL. This grant was made to a collaboration of the University Corporation for Atmospheric Research (UCAR), Columbia University and Cornell University. The technical approach being followed is based heavily on our experience with Site for Science. Therefore this article is both a description of the strategy for interoperability that was developed for Site for Science and an introduction to the architecture being used by the NSDL production team."
ISBN
1082-9873
Critical Arguements
CA "[T]his article is both a description of the strategy for interoperability that was developed for the [Cornell University's NSF-funded] Site for Science and an introduction to the architecture being used by the NSDL production team."
Phrases
<P1> The grand vision is that the NSDL become a comprehensive library of every digital resource that could conceivably be of value to any aspect of education in any branch of science and engineering, both defined very broadly. <P2> Interoperability among heterogeneous collections is a central theme of the Core Integration. The potential collections have a wide variety of data types, metadata standards, protocols, authentication schemes, and business models. <P3> The goal of interoperability is to build coherent services for users, from components that are technically different and managed by different organizations. This requires agreements to cooperate at three levels: technical, content and organizational. <P4> Much of the research of the authors of this paper aims at . . . looking for approaches to interoperability that have low cost of adoption, yet provide substantial functionality. One of these approaches is the metadata harvesting protocol of the Open Archives Initiative (OAI) . . . <P5> For Site for Science, we identified three levels of digital library interoperability: Federation; Harvesting; Gathering. In this list, the top level provides the strongest form of interoperability, but places the greatest burden on participants. The bottom level requires essentially no effort by the participants, but provides a poorer level of interoperability. The Site for Science demonstration concentrated on the harvesting and gathering, because other projects were exploring federation. <P6> In an ideal world all the collections and services that the NSDL wishes to encompass would support an agreed set of standard metadata. The real world is less simple. . . . However, the NSDL does have influence. We can attempt to persuade collections to move along the interoperability curve. <warrant> <P7> The Site for Science metadata strategy is based on two principles. The first is that metadata is too expensive for the Core Integration team to create much of it. 
Hence, the NSDL has to rely on existing metadata or metadata that can be generated automatically. The second is to make use of as much of the metadata available from collections as possible, knowing that it varies greatly from none to extensive. Based on these principles, Site for Science, and subsequently the entire NSDL, developed the following metadata strategy: Support eight standard formats; Collect all existing metadata in these formats; Provide crosswalks to Dublin Core; Assemble all metadata in a central metadata repository; Expose all metadata records in the repository for service providers to harvest; Concentrate limited human effort on collection-level metadata; Use automatic generation to augment item-level metadata. <P8> The strategy developed by Site for Science and now adopted by the NSDL is to accumulate metadata in the native formats provided by the collections . . . If a collection supports the protocols of the Open Archives Initiative, it must be able to supply unqualified Dublin Core (which is required by the OAI) as well as the native metadata format. <P9> From a computing viewpoint, the metadata repository is the key component of the Site for Science system. The repository can be thought of as a modern variant of the traditional library union catalog, a catalog that holds comprehensive catalog records from a group of libraries. . . . Metadata from all the collections is stored in the repository and made available to providers of NSDL service.
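Two steps of the Site for Science strategy quoted above — "Provide crosswalks to Dublin Core" and "Assemble all metadata in a central metadata repository" — lend themselves to a short sketch. This is an illustration under assumed field names, not the actual NSDL crosswalk tables: the MARC-style tags below (245a title, 100a creator, 260c date) are used loosely, and the sample record is invented.

```python
# Illustrative crosswalk from a native format to unqualified Dublin Core.
# These mappings are assumptions for the example, not an official table.
NATIVE_TO_DC = {
    "245a": "title",
    "100a": "creator",
    "260c": "date",
}

def crosswalk(native_record: dict, mapping: dict) -> dict:
    """Convert a native metadata record to unqualified Dublin Core.
    Fields the mapping does not know about are simply not carried over,
    reflecting that DC is the lowest common denominator."""
    dc = {}
    for native_field, dc_element in mapping.items():
        if native_field in native_record:
            dc.setdefault(dc_element, []).append(native_record[native_field])
    return dc

# Central metadata repository: stores both the native and the DC form,
# so service providers can harvest whichever they understand.
repository = []

native = {"245a": "Primate Behavior Dataset", "100a": "Goodall, J.", "999x": "internal note"}
dc = crosswalk(native, NATIVE_TO_DC)
repository.append({"native": native, "dc": dc})
```

Storing both forms mirrors the strategy in P8: a collection supplies its native format, the repository derives unqualified Dublin Core (the format the OAI requires every data provider to expose), and nothing in the native record is lost.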
Conclusions
RQ 1 "Can a small team of librarians manage the collection development and metadata strategies for a very large library?" RQ 2 "Can the NSDL actually build services that are significantly more useful than the general web search services?"
Type
Electronic Journal
Title
Computer Records and the Federal Rules of Evidence
See also U.S. Federal Rules of Evidence. Rule 803. Hearsay Exceptions; Availability of Declarant Immaterial.
Publisher
U.S. Department of Justice Executive Office for United States Attorneys
Critical Arguements
CA "This article explains some of the important issues that can arise when the government seeks the admission of computer records under the Federal Rules of Evidence. It is an excerpt of a larger DOJ manual entitled 'Searching and Seizing Computers and Obtaining Electronic Evidence in Criminal Investigations,' which is available on the internet at www.cybercrime.gov/searchmanual.htm." Cites cases dealing with Fed. R. Evid. 803(6).
Phrases
<P1> Most federal courts that have evaluated the admissibility of computer records have focused on computer records as potential hearsay. The courts generally have admitted computer records upon a showing that the records fall within the business records exception, Fed. R. Evid. 803(6). <P2> See, e.g., United States v. Cestnik, 36 F.3d 904, 909-10 (10th Cir. 1994); United States v. Moore, 923 F.2d 910, 914 (1st Cir. 1991); United States v. Briscoe, 896 F.2d 1476, 1494 (7th Cir. 1990); United States v. Catabran, 836 F.2d 453, 457 (9th Cir. 1988); Capital Marine Supply v. M/V Roland Thomas II, 719 F.2d 104, 106 (5th Cir. 1983). <P3> Applying this test, the courts have indicated that computer records generally can be admitted as business records if they were kept pursuant to a routine procedure for motives that tend to assure their accuracy. <warrant>
Conclusions
RQ "The federal courts are likely to move away from this 'one size fits all' approach as they become more comfortable and familiar with computer records. Like paper records, computer records are not monolithic: the evidentiary issues raised by their admission should depend on what kind of computer records a proponent seeks to have admitted. For example, computer records that contain text often can be divided into two categories: computer-generated records, and records that are merely computer-stored. See People v. Holowko, 486 N.E.2d 877, 878-79 (Ill. 1985). The difference hinges upon whether a person or a machine created the records' contents. ... As the federal courts develop a more nuanced appreciation of the distinctions to be made between different kinds of computer records, they are likely to see that the admission of computer records generally raises two distinct issues. First, the government must establish the authenticity of all computer records by providing 'evidence sufficient to support a finding that the matter in question is what its proponent claims.' Fed. R. Evid. 901(a). Second, if the computer records are computer-stored records that contain human statements, the government must show that those human statements are not inadmissible hearsay."
Type
Electronic Journal
Title
The Warwick Framework: A container architecture for diverse sets of metadata
This paper is an abbreviated version of The Warwick Framework: A Container Architecture for Aggregating Sets of Metadata. It describes a container architecture for aggregating logically, and perhaps physically, distinct packages of metadata. This "Warwick Framework" is the result of the April 1996 Metadata II Workshop in Warwick U.K.
ISBN
1082-9873
Critical Arguements
CA Describes the Warwick Framework, a proposal for linking together the various metadata schemes that may be attached to a given information object by using a system of "packages" and "containers." "[Warwick Workshop] attendees concluded that ... the route to progress on the metadata issue lay in the formulation of a higher-level context for the Dublin Core. This context should define how the Core can be combined with other sets of metadata in a manner that addresses the individual integrity, distinct audiences, and separate realms of responsibility of these distinct metadata sets. The result of the Warwick Workshop is a container architecture, known as the Warwick Framework. The framework is a mechanism for aggregating logically, and perhaps physically, distinct packages of metadata. This is a modularization of the metadata issue with a number of notable characteristics. It allows the designers of individual metadata sets to focus on their specific requirements, without concerns for generalization to ultimately unbounded scope. It allows the syntax of metadata sets to vary in conformance with semantic requirements, community practices, and functional (processing) requirements for the kind of metadata in question. It separates management of and responsibility for specific metadata sets among their respective "communities of expertise." It promotes interoperability by allowing tools and agents to selectively access and manipulate individual packages and ignore others. It permits access to the different metadata sets that are related to the same object to be separately controlled. It flexibly accommodates future metadata sets by not requiring changes to existing sets or the programs that make use of them."
Phrases
<P1> The range of metadata needed to describe and manage objects is likely to continue to expand as we become more sophisticated in the ways in which we characterize and retrieve objects and also more demanding in our requirements to control the use of networked information objects. The architecture must be sufficiently flexible to incorporate new semantics without requiring a rewrite of existing metadata sets. <warrant> <P2> Each logically distinct metadata set may represent the interests of and domain of expertise of a specific community. <P3> Just as there are disparate sources of metadata, different metadata sets are used by and may be restricted to distinct communities of users and agents. <P4> Strictly partitioning the information universe into data and metadata is misleading. <P5> If we allow for the fact that metadata for an object consists of logically distinct and separately administered components, then we should also provide for the distribution of these components among several servers or repositories. The references to distributed components should be via a reliable persistent name scheme, such as that proposed for Universal Resources Names (URNs) and Handles. <P6> [W]e emphasize that the existence of a reliable URN implementation is necessary to avoid the problems of dangling references that plague the Web. <warrant> <P7> Anyone can, in fact, create descriptive data for a networked resource, without permission or knowledge of the owner or manager of that resource. This metadata is fundamentally different from that metadata that the owner of a resource chooses to link or embed with the resource. We, therefore, informally distinguish between two categories of metadata containers, which both have the same implementation [internally referenced and externally referenced metadata containers].
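The container-and-package architecture described in the annotation can be sketched as a small object model. This is a loose illustration of the idea, not the Framework's actual specification: the type names, the `select` operation, and the sample handle are all invented for the example.

```python
class Package:
    """One logically distinct metadata set. Each package carries a type so
    that tools and agents can selectively process the types they understand
    and ignore the rest."""
    def __init__(self, pkg_type: str, payload):
        self.pkg_type = pkg_type   # e.g. "dublin-core", "rights", "terms-and-conditions"
        self.payload = payload

class IndirectPackage(Package):
    """A package held elsewhere, referenced by a persistent name (URN or
    Handle) rather than embedded bytes, so it can be separately administered
    and separately access-controlled."""
    def __init__(self, pkg_type: str, urn: str):
        super().__init__(pkg_type, None)
        self.urn = urn

class Container:
    """Aggregates packages; in the full Framework a package may itself be
    another container, allowing nesting."""
    def __init__(self):
        self.packages = []
    def add(self, pkg: Package):
        self.packages.append(pkg)
    def select(self, pkg_type: str):
        """An agent reads only the package types it knows how to interpret."""
        return [p for p in self.packages if p.pkg_type == pkg_type]

c = Container()
c.add(Package("dublin-core", {"title": "Some Networked Resource"}))
c.add(IndirectPackage("rights", "urn:hdl:10.1000/example"))  # hypothetical handle
dc_packages = c.select("dublin-core")
```

The key property is in `select`: a new package type can be added to a container without any change to clients that only ask for the types they already understand, which is the extensibility claim the annotation highlights.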
Conclusions
RQ "We run the danger, with the full expressiveness of the Warwick Framework, of creating such complexity that the metadata is effectively useless. Finding the appropriate balance is a central design problem. ... Definers of specific metadata sets should ensure that the set of operations and semantics of those operations will be strictly defined for a package of a given type. We expect that a limited set of metadata types will be widely used and 'understood' by browsers and agents. However, the type system must be extensible, and some method that allows existing clients and agents to process new types must be a part of a full implementation of the Framework. ... There is a need to agree on one or more syntaxes for the various metadata sets. Even in the context of the relatively simple World Wide Web, the Internet is often unbearably slow and unreliable. Connections often fail or time out due to high load, server failure, and the like. In a full implementation of the Warwick Framework, access to a "document" might require negotiation across distributed repositories. The performance of this distributed architecture is difficult to predict and is prone to multiple points of failure. ... It is clear that some protocol work will need to be done to support container and package interchange and retrieval. ... Some examination of the relationship between the Warwick Framework and ongoing work in repository architectures would likely be fruitful."
Type
Electronic Journal
Title
Metadata: The right approach, An integrated model for descriptive and rights metadata in E-commerce
If you've ever completed a large and difficult jigsaw puzzle, you'll be familiar with that particular moment of grateful revelation when you find that two sections you've been working on separately actually fit together. The overall picture becomes coherent, and the task at last seems achievable. Something like this seems to be happening in the puzzle of "content metadata." Two communities -- rights owners on one hand, libraries and cataloguers on the other -- are staring at their unfolding data models and systems, knowing that somehow together they make up a whole picture. This paper aims to show how and where they fit.
ISBN
1082-9873
Critical Arguements
CA "This paper looks at metadata developments from this standpoint -- hence the "right" approach -- but does so recognising that in the digital world many Chinese walls that appear to separate the bibliographic and commercial communities are going to collapse." ... "This paper examines three propositions which support the need for radical integration of metadata and rights management concerns for disparate and heterogeneous materials, and sets out a possible framework for an integrated approach. It draws on models developed in the CIS plan and the DOI Rights Metadata group, and work on the ISRC, ISAN, and ISWC standards and proposals. The three propositions are: DOI metadata must support all types of creation; The secure transaction of requests and offers data depends on maintaining an integrated structure for documenting rights ownership agreements; All elements of descriptive metadata (except titles) may also be elements of agreements. The main consequences of these propositions are: A cross-sector vocabulary is essential; Non-confidential terms of rights ownership agreements must be generally accessible in a standard form. (In its purest form, the e-commerce network must be able to automatically determine the current owner of any right in any creation for any territory.); All descriptive metadata values (except titles) must be stored as unique, coded values. If correct, the implications of these propositions on the behaviour, and future inter-dependency, of the rights-owning and bibliographic communities are considerable."
Phrases
<P1> Historically, metadata -- "data about data" -- has been largely treated as an afterthought in the commercial world, even among rights owners. Descriptive metadata has often been regarded as the proper province of libraries, a battlefield of competing systems of tags and classification and an invaluable tool for the discovery of resources, while "business" metadata lurked, ugly but necessary, in distribution systems and EDI message formats. Rights metadata, whatever it may be, may seem to have barely existed in a coherent form at all. <P2> E-commerce offers the opportunity to integrate the functions of discovery, access, licensing and accounting into single point-and-click actions in which metadata is a critical agent, a glue which holds the pieces together. <warrant> <P3> E-commerce in rights will generate global networks of metadata every bit as vital as the networks of optical fibre -- and with the same requirements for security and unbroken connectivity. <warrant> <P4> The sheer volume and complexity of future rights trading in the digital environment will mean that any but the most sporadic level of human intervention will be prohibitively expensive. Standardised metadata is an essential component. <warrant> <P5> Just as the creators and rights holders are the sources of the content for the bibliographic world, so it seems inevitable they will become the principal source of core metadata in the web environment, and that metadata will be generated simultaneously and at source to meet the requirements of discovery, access, protection, and reward. <P6> However, under the analysis being carried out within the communities identified above and by those who are developing technology and languages for rights-based e-commerce, it is becoming clear that "functional" metadata is also a critical component. 
It is metadata (including identifiers) which defines a creation and its relationship to other creations and to the parties who created and variously own it; without a coherent metadata infrastructure e-commerce cannot properly flow. Securing the metadata network is every bit as important as securing the content, and there is little doubt which poses the greater problem. <warrant> <P7> Because creations can be nested and modified at an unprecedented level, and because online availability is continuous, not a series of time-limited events like publishing books or selling records, dynamic and structured maintenance of rights ownership is essential if the currency and validity of offers is to be maintained. <warrant> <P8> Rights metadata must be maintained and linked dynamically to all of its related content. <P9> A single, even partial, change to rights ownership in the original creation needs to be communicated through this chain to preserve the currency of permissions and royalty flow. There are many options for doing this, but they all depend, among other things, on the security of the metadata network. <warrant> <P10>As digital media causes copyright frameworks to be rewritten on both sides of the Atlantic, we can expect measures of similar and greater impact at regular intervals affecting any and all creation types: yet such changes can be relatively simple to implement if metadata is held in the right way in the right place to begin with. <warrant> <P11> The disturbing but inescapable consequence is that it is not only desirable but essential for all elements of descriptive metadata, except for titles, to be expressed at the outset as structured and standardised values to preserve the integrity of the rights chain. <P12> Within the DOI community, which embraces commercial and library interests, the integration of rights and descriptive metadata has become a matter of priority. 
<P13> What is required is that the establishment of a creation description (for example, the registration of details of a new article or audio recording) or of change of rights control (for example, notification of the acquisition of a work or a catalogue of works) can be done in a standardised and fully structured way. <warrant> <P14> Unless the chain is well maintained at source, all downstream transactions will be jeopardised, for in the web environment the CIS principle of "do it once, do it right" is seen at its ultimate. A single occurrence of a creation on the web, and its supporting metadata, can be the source for all uses. <P15> One of the tools to support this development is the RDF (Resource Description Framework). RDF provides a means of structuring metadata for anything, and it can be expressed in XML. <P16> Although formal metadata standards hardly exist within ISO, they are appearing through the "back door" in the form of mandatory supporting data for identifier standards such as ISRC, ISAN and ISWC. A major function of the INDECS project will be to ensure the harmonisation of these standards within a single framework. <P17> In an automated, protected environment, this requires that the rights transaction is able to generate automatically a new descriptive metadata set through the interaction of the agreement terms with the original creation metadata. This can only happen (and it will be required on a massive scale) if rights and descriptive metadata terminology is integrated and standardised. <warrant> <P18>As resources become available virtually, it becomes as important that the core metadata itself is not tampered with as it is that the object itself is protected. Persistence is now not only a necessary characteristic of identifiers but also of the structured metadata that attends them. <P19> This leads us also to the conclusion that, ideally, standardised descriptive metadata should be embedded into objects for its own protection. 
<P20> It also leads us to the possibility of metadata registration authorities, such as the numbering agencies, taking wider responsibilities. <P21>If this paper is correct in its propositions, then rights metadata will have to rewrite half of Dublin Core or else ignore it entirely. <P22> The web environment with its once-for-all means of access provides us with the opportunity to eliminate duplication and fragmentation of core metadata; and at this moment, there are no legacy metadata standards to shackle the information community. We have the opportunity to go in with our eyes open with standards that are constructed to make the best of the characteristics of the new digital medium. <warrant>
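The RDF mechanism cited in P15 can be pictured with a short sketch. The namespace URI and property names below (rightsOwner, territory, validFrom) are hypothetical illustrations, not the actual INDECS or DOI vocabularies; the point is only that rights statements about an identified creation can be serialized as RDF in XML with standard tooling.

```python
# Minimal sketch of rights metadata expressed as RDF/XML (cf. P15).
# The "rights" vocabulary here is an illustrative assumption, not a
# real INDECS/DOI element set.
import xml.etree.ElementTree as ET

RDF = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
EX = "http://example.org/rights#"  # hypothetical rights vocabulary

ET.register_namespace("rdf", RDF)
ET.register_namespace("rights", EX)

root = ET.Element(f"{{{RDF}}}RDF")
desc = ET.SubElement(root, f"{{{RDF}}}Description",
                     {f"{{{RDF}}}about": "doi:10.1000/example-creation"})
# Each child element is one metadata statement about the creation.
ET.SubElement(desc, f"{{{EX}}}rightsOwner").text = "Example Music Publishing"
ET.SubElement(desc, f"{{{EX}}}territory").text = "World"
ET.SubElement(desc, f"{{{EX}}}validFrom").text = "1999-01-01"

xml_text = ET.tostring(root, encoding="unicode")
print(xml_text)
```

Because every property value is a structured statement attached to a persistent identifier, a change of rights ownership can be propagated by rewriting one element rather than re-cataloguing the creation, which is the "do it once, do it right" principle in P14.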
Conclusions
RQ "The INDECS project (assuming its formal adoption next month), in which the four major communities are active, and with strong links to ISO TC46 and MPEG, will provide a cross-sector framework for this work in the short-term. The DOI Foundation itself may be an appropriate umbrella body in the future. We may also consider that perhaps the main function of the DOI itself may not be, as originally envisaged, to link user to content -- which is a relatively trivial task -- but to provide the glue to link together creation, party, and agreement metadata. The model that rights owners may be wise to follow in this process is that of MPEG, where the technology industry has tenaciously embraced a highly-regimented, rolling standardisation programme, the results of which are fundamental to the success of each new generation of products. Metadata standardisation now requires the same technical rigour and commercial commitment. However, in the meantime the bibliographic world, working on what it has always seen as its own part of the jigsaw puzzle, is actively addressing many of these issues in an almost parallel universe. The question remains as to how in practical terms the two worlds, rights and bibliographic, can connect, and what may be the consequences of a prolonged delay in doing so." ... "The former I encourage to make a case for continued support and standardisation of a flawed Dublin Core in the light of the propositions I have set out in this paper, or else engage with the DOI and rights owner communities in its revision to meet the real requirements of digital commerce in its fullest sense."
SOW
DC "There are currently four major active communities of rights-holders directly confronting these questions: the DOI community, at present based in the book and electronic publishing sector; the IFPI community of record companies; the ISAN community embracing producers, users, and rights owners of audiovisuals; and the CISAC community of collecting societies for composers and publishers of music, but also extending into other areas of authors' rights, including literary, visual, and plastic arts." ... "There are related rights-driven projects in the graphic, photographic, and performers' communities. E-commerce means that metadata solutions from each of these sectors (and others) require a high level of interoperability. As the trading environment becomes common, traditional genre distinctions between creation-types become meaningless and commercially destructive."
Type
Report
Title
Mapping of the Encoded Archival Description DTD Element Set to the CIDOC CRM
The CIDOC CRM is the first ontology designed to mediate contents in the area of material cultural heritage and beyond, and has been accepted by ISO TC46 as work item for an international standard. The EAD Document Type Definition (DTD) is a standard for encoding archival finding aids using the Standard Generalized Markup Language (SGML). Archival finding aids are detailed guides to primary source material which provide fuller information than that normally contained within cataloging records. 
Publisher
Institute of Computer Science, Foundation for Research and Technology - Hellas
Publication Location
Heraklion, Crete, Greece
Language
English
Critical Arguements
CA "This report describes the semantic mapping of the current EAD DTD Version 1.0 Element Set to the CIDOC CRM and its latest extension. This work represents a proof of concept for the functionality the CIDOC CRM is designed for." 
Conclusions
RQ "Actually, the CRM seems to do the job quite well – problems in the mapping arise more from underspecification in the EAD rather than from too domain-specific notions." ... "To our opinion, the archival community could benefit from the conceptualizations of the CRM to motivate more powerful metadata standards with wide interoperability in the future, to the benefit of museums and other disciplines as well."
SOW
DC "As a potential international standard, the EAD DTD is maintained in the Network Development and MARC Standards Office of the Library of Congress in partnership with the Society of American Archivists." ... "The CIDOC Conceptual Reference Model (see [CRM1999], [Doerr99]), in the following only referred to as «CRM», is outcome of an effort of the Documentation Standards Group of the CIDOC Committee (see «http://www.cidoc.icom.org», "http://cidoc.ics.forth.gr") of ICOM, the International Council of Museums beginning in 1996."
Type
Web Page
Title
U.S. Federal Rules of Evidence. Rule 803. Hearsay Exceptions; Availability of Declarant Immaterial.
"This briefing paper summarizes the results of a cooperative project sponsored in part, by a research grant from the National Historical Publications and Records Commission. The project, called "Models for Action: Practical Approaches to Electronic Records Management and Preservation," focused on the development of practical tools to support the integration of essential electronic records management requirements into the design of new information systems. The project was conducted from 1996 to 1998 through a partnership between the New York State Archives and Records Administration and the Center for Technology in Government. The project team also included staff from the NYS Adirondack Park Agency, eight corporate partners led by Intergraph Corporation, and University at Albany faculty and graduate students."
Publisher
Center for Technology in Government
Critical Arguements
CA "This briefing paper bridges the gap between theory and practice by presenting generalizable tools that link records management practices to business objectives."
The CDISC Submission Metadata Model was created to help ensure that the supporting metadata for these submission datasets meets the following objectives: Provide FDA reviewers with clear descriptions of the usage, structure, contents, and attributes of all datasets and variables; Allow reviewers to replicate most analyses, tables, graphs, and listings with minimal or no transformations; Enable reviewers to easily view and subset the data used to generate any analysis, table, graph, or listing without complex programming. ... The CDISC Submission Metadata Model has been defined to guide sponsors in the preparation of data that is to be submitted to the FDA. By following the principles of this model, sponsors will help reviewers to accurately interpret the contents of submitted data and work with it more effectively, without sacrificing the scientific objectives of clinical development.
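The dataset-level metadata the model calls for can be pictured as a simple machine-readable record. The attribute names and values below paraphrase the kinds of attributes the model describes (dataset name, description, structure, keys, file location); they are illustrative assumptions, not the normative CDISC attribute list.

```python
# Sketch of dataset-level submission metadata in the spirit of the CDISC
# model. Attribute names and sample values are illustrative assumptions,
# not the normative CDISC element set.
from dataclasses import dataclass, asdict

@dataclass
class DatasetMetadata:
    name: str         # dataset name as submitted
    description: str  # what the dataset contains
    structure: str    # e.g. one record per event per subject
    keys: tuple       # variables that uniquely identify a record
    location: str     # file name within the submission

ae = DatasetMetadata(
    name="AE",
    description="Adverse events reported during the study",
    structure="One record per adverse event per subject",
    keys=("STUDYID", "USUBJID", "AESEQ"),
    location="ae.xpt",
)

# A reviewer-facing index could simply list such records per domain.
print(asdict(ae))
```

Carrying this record alongside the data is what lets a reviewer subset or replicate an analysis without first reverse-engineering the dataset's structure.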
Publisher
The Clinical Data Interchange Standards Consortium
Critical Arguements
CA "The CDISC Submission Data Model has focused on the use of effective metadata as the most practical way of establishing meaningful standards applicable to electronic data submitted for FDA review."
Conclusions
RQ "Metadata prepared for a domain (such as an efficacy domain) which has not been described in a CDISC model should follow the general format of the safety domains, including the same set of core selection variables and all of the metadata attributes specified for the safety domains. Additional examples and usage guidelines are available on the CDISC web site at www.cdisc.org." ... "The CDISC Metadata Model describes the structure and form of data, not the content. However, the varying nature of clinical data in general will require the sponsor to make some decisions about how to represent certain real-world conditions in the dataset. Therefore, it is useful for a metadata document to give the reviewer an indication of how the datasets handle certain special cases."
SOW
DC CDISC is an open, multidisciplinary, non-profit organization committed to the development of worldwide standards to support the electronic acquisition, exchange, submission and archiving of clinical trials data and metadata for medical and biopharmaceutical product development. CDISC members work together to establish universally accepted data standards in the pharmaceutical, biotechnology and device industries, as well as in regulatory agencies worldwide. CDISC currently has more than 90 members, including the majority of the major global pharmaceutical companies.
Type
Web Page
Title
CDISC Achieves Two Significant Milestones in the Development of Models for Data Interchange
CA "The Clinical Data Interchange Standards Consortium has achieved two significant milestones towards its goal of standard data models to streamline drug development and regulatory review processes. CDISC participants have completed metadata models for the 12 safety domains listed in the FDA Guidance regarding Electronic Submissions and have produced a revised XML-based data model to support data acquisition and archive."
Conclusions
RQ "The goal of the CDISC XML Document Type Definition (DTD) Version 1.0 is to make available a first release of the definition of this CDISC model, in order to support sponsors, vendors and CROs in the design of systems and processes around a standard interchange format."
SOW
DC "This team, under the leadership of Wayne Kubick of Lincoln Technologies, and Dave Christiansen of Genentech, presented their metadata models to a group of representatives at the FDA on Oct. 10, and discussed future cooperative efforts with Agency reviewers."... "CDISC is a non-profit organization with a mission to lead the development of standard, vendor-neutral, platform-independent data models that improve process efficiency while supporting the scientific nature of clinical research in the biopharmaceutical and healthcare industries"
Type
Web Page
Title
Schema Registry: activityreports: Recordkeeping Metadata Standard for Commonwealth Agencies
CA "The Australian SPIRT Recordkeeping Metadata Project was initially a project funded under a programme known as the Strategic Partnership with Industry -- Research and Training (SPIRT) Support Grant -- partly funded by the Australian Research Council. The project was concerned with developing a framework for standardising and defining recordkeeping metadata and produced a metadata element set eventually known as the Australian Recordkeeping Metadata Schema (RKMS). The conceptual frame of reference in the project was based in Australian archival practice, including the Records Continuum Model and the Australian Series System. The RKMS also inherits part of the Australian Government Locator Service (AGLS) metadata set."
This paper discusses how metadata standards can help organizations comply with the ISO 9000 standards for quality systems. It provides a brief overview of metadata, ISO 9000 and related records management standards. It then analyses in some depth the ISO 9000 requirements for quality records, and outlines the problems that some organizations have in complying with them. It also describes the metadata specifications developed by the University of Pittsburgh Electronic Recordkeeping project and the SPIRT Recordkeeping Metadata project in Australia and discusses the role of metadata in meeting ISO 9000 requirements for the creation and preservation of reliable, authentic and accessible records.
Publisher
Records Continuum Research Group
Critical Arguements
CA "During the last few years a number of research projects have studied the types of metadata needed to create, manage and make accessible quality records, i.e. reliable, authentic and useable records. This paper will briefly discuss the purposes of recordkeeping metadata, with reference to emerging records management standards, and the models presented by two projects, one in the United States and one in Australia. It will also briefly review the ISO 9000 requirements for records and illustrate how metadata can help an organization meet these requirements."
Conclusions
RQ "Quality records provide many advantages for organizations and can help companies meet the ISO 9000 certification. However, systems must be designed to create the appropriate metadata to ensure they comply with recordkeeping requirements, particularly those identified by records management standards like AS 4390 and the proposed international standard, which provide benchmarks for recordkeeping best practice. The Pittsburgh metadata model and the SPIRT framework provide organizations with standardized sets of metadata that would ensure the creation, preservation and accessibility of reliable, authentic and meaningful records for as long as they are of use. In deciding what metadata to capture, organisations should consider the cost of meeting the requirements of the ISO 9000 guidelines and any related records management best practice standards, and the possible risk of not meeting these requirements."
Type
Web Page
Title
Report of the Ad Hoc Committee for Development of a Standardized Tool for Encoding Finding Aids
This report focuses on the development of tools for the description and intellectual control of archives and the discovery of relevant resources by users. Other archival functions, such as appraisal, acquisition, preservation, and physical control, are beyond the scope of this project. The system developed as a result of this report should be usable on stand-alone computers in small institutions, by multiple users in larger organisations, and by local, regional, national, and international networks. The development of such a system should take into account the strategies, experiences, and results of other initiatives such as the European Union Archival Network (EUAN), the Linking and Exploring Authority Files (LEAF) initiative, the European Visual Archives (EVA) project, and the Canadian Archival Information Network (CAIN). This report is divided into five sections. A description of the conceptual structure of an archival information system, described as six layers of services and protocols, follows this introduction. Section three details the functional requirements for the software tool and is followed by a discussion of the relationship of these requirements to existing archival software applications. The report concludes with a series of recommendations that provide a strategy for the successful development, deployment, and maintenance of an Open Source Archival Resource Information System (OSARIS). There are two appendices: a data model and a comparison of the functional requirements statements to several existing archival systems.
Notes
3. Functional Requirements Requirements for Information Interchange 3.2: The system must support the current archival standards for machine-readable data communication, Encoded Archival Description (EAD) and Encoded Archival Context (EAC). A subset of elements found in EAD may be used to exchange descriptions based on ISAD(G) while elements in EAC may be used to exchange ISAAR(CPF)-based authority data.
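The interchange requirement in 3.2 can be sketched as follows. The EAD element names used here (did, unitid, unittitle, unitdate) are genuine EAD 1.0 elements, but the fragment itself and the mapping table are illustrative assumptions, not part of the OSARIS specification.

```python
import xml.etree.ElementTree as ET

# Minimal EAD 1.0-style fragment (illustrative, not a full finding aid).
ead_fragment = """
<ead>
  <archdesc level="fonds">
    <did>
      <unitid>GB 0001 ABC</unitid>
      <unittitle>Papers of an Example Family</unittitle>
      <unitdate>1850-1920</unitdate>
    </did>
  </archdesc>
</ead>
"""

# Assumed mapping from EAD <did> children to ISAD(G) descriptive elements.
EAD_TO_ISADG = {
    "unitid": "3.1.1 Reference code",
    "unittitle": "3.1.2 Title",
    "unitdate": "3.1.3 Date(s)",
}

def isadg_subset(ead_xml: str) -> dict:
    """Pull the ISAD(G)-equivalent subset out of an EAD description."""
    did = ET.fromstring(ead_xml).find(".//did")
    return {EAD_TO_ISADG[el.tag]: el.text
            for el in did if el.tag in EAD_TO_ISADG}

print(isadg_subset(ead_fragment))
```

Exchanging only this standards-aligned subset is what would let descriptions move between OSARIS installations without requiring every archive to adopt an identical internal schema.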
Publisher
International Council on Archives Committee on Descriptive Standards
Critical Arguements
CA The Ad Hoc Committee agrees that it would be highly desirable to develop a modular, open source software tool that could be used by archives worldwide to manage the intellectual control of their holdings through the recording of standardized descriptive data. Individual archives could combine their data with that of other institutions in regional, national or international networks. Researchers could access this data either via a stand-alone computerized system or over the Internet. The model for this software would be the successful UNESCO-sponsored free library program, ISIS, which has been in widespread use around the developing world for many years. The software, with appropriate supporting documentation, would be freely available via an ICA or UNESCO web site or on CD-ROM. Unlike ISIS, however, the source code and not just the software should be freely available.
Conclusions
RQ "1. That the ICA endorses the functional requirements presented in this document as the basis for moving the initiative forward. 2. That the functional desiderata and technical specifications for the software applications, such as user requirements, business rules, and detailed data models, should be developed further by a team of experts from both ICA/CDS and ICA/ITC as the next stage of this project. 3. That following the finalization of the technical specifications for OSARIS, the requirements should be compared to existing systems and a decision made to adopt or adapt existing software or to build new applications. At that point in time, it will then be possible to estimate project costs. 4. That a solution that incorporates the functional requirements result in the development of several modular software applications. 5. That the implementation of the system should follow a modular strategy. 6. That the development of software applications must include a thorough investigation and assessment of existing solutions beginning with those identified in section four and Appendix B of this document. 7. That the ICA develop a strategy for communicating the progress of this project to members of the international archival community on a regular basis. This would include the distribution of progress reports in multiple languages. The communication strategy must include a two-way exchange of ideas. The project will benefit strongly from the ongoing comments, suggestions, and input of the members of the international archival community. 8. That a test-bed be developed to allow the testing of software solutions in a realistic archival environment. 9. That the system specifications, its documentation, and the source codes for the applications be freely available. 10. That training courses for new users, ongoing education, and web-based support groups be established. 11.
That promotion of the software be carried out through the existing regional infrastructure of ICA and through UNESCO. 12. That an infrastructure for ongoing maintenance, distribution, and technical support be developed. This should include a web site to download software and supporting documentation. The ICA should also establish and maintain a mechanism for end-users to recommend changes and enhancements to the software. 13. That the ICA establishes and maintains an official mechanism for regular review of the software by an advisory committee that includes technical and archival experts. "
SOW
DC "The development of such a system should take into account the strategies, experiences, and results of other initiatives such as the European Union Archival Network (EUAN), the Linking and Exploring Authority Files (LEAF) initiative, the European Visual Archives (EVA) project, and the Canadian Archival Information Network (CAIN)."
Just like other memory institutions, libraries will have to play an important part in the Semantic Web. In that context, ontologies and conceptual models in the field of cultural heritage information are crucial, and the interoperability between these ontologies and models perhaps even more crucial. This document reviews four projects and models that the FRBR Review Group recommends for consideration as to interoperability with FRBR.
Publisher
International Federation of Library Associations and Institutions
Critical Arguements
CA "Just like other memory institutions, libraries will have to play an important part in the Semantic Web. In that context, ontologies and conceptual models in the field of cultural heritage information are crucial, and the interoperability between these ontologies and models perhaps even more crucial."
Conclusions
RQ 
SOW
DC "Some members of the CRM-SIG, including Martin Doerr himself, also are subscribers to the FRBR listserv, and Patrick Le Boeuf, chair of the FRBR Review Group, also is a member of the CRM-SIG and ISO TC46/SC4/WG9 (the ISO Group on CRM). A FRBR to CRM mapping is available from the CIDOC CRM-SIG listserv archive." ... This report was produced by the Cataloguing Section of IFLA, the International Federation of Library Associations and Institutions. 
This document is a draft version 1.0 of requirements for a metadata framework to be used by the International Press Telecommunications Council for all new and revised IPTC standards. It was worked on and agreed to by members of the IPTC Standards Committee, who represented a variety of newspaper, wire agencies, and other interested members of the IPTC.
Notes
Misha Wolf is also listed as author.
Publisher
International Press Telecommunications Council (IPTC)
Critical Arguements
CA "This Requirements document forms part of the programme of work called IPTC Roadmap 2005. The Specification resulting from these Requirements will define the use of metadata by all new IPTC standards and by new major versions of existing IPTC standards." (p. 1) ... "The purpose of the News Metadata Framework (NMDF) WG is to specify how metadata will be expressed, referenced, and managed in all new major versions of IPTC standards. The NMDF WG will: Gather, discuss, agree and document functional requirements for the ways in which metadata will be expressed, referenced and managed in all new major versions of IPTC standards; Discuss, agree and document a model, satisfying these requirements; Discuss, agree and document possible approaches to expressing this model in XML, and select those most suited to the tasks. In doing so, the NMDF WG will, where possible, make use of the work of other standards bodies." (p. 2)
Conclusions
RQ "Open issues include: The versioning of schemes, including major and minor versions, and backward compatibility; the versioning of TopicItems; The design of URIs for TopicItem schemes and TopicItem collections, including the issues of: versions (relating to TopicItems, schemes, and collections); representations (relating to TopicItems and collections); The relationship between a [scheme, code] pair, the corresponding URI and the scheme URI." (p. 17)
SOW
DC The development of this framework came out of the 2003 News Standards Summit, which was attended by representatives from over 80 international press and information agencies ... "The News Standards Summit brings together major players--experts on news metadata standards as well as commercial news providers, users, and aggregators. Together, they will analyze the current state and future expectations for news and publishing XML and metadata efforts from both the content and processing model perspectives. The goal is to increase understanding and to drive practical, productive convergence." ... This is a draft version of the standard.
Type
Web Page
Title
NHPRC: Minnesota State Archives Strategic Plan: Electronic Records Consultant Project
National Historical Publications and Records Commission Grant No. 95-030
Critical Arguements
CA "The Electronic Records Consultant Project grant was carried out in conjunction with the strategic planning effort for the Minnesota Historical Society's State Archives program. The objective was to develop a plan for a program that will be responsive to the changing nature of government records." ... "The strategic plan that was developed calls for specific actions to meet five goals: 1) strengthening partnerships, 2) facilitating the identification of historically valuable records, 3) integrating electronic records into the existing program, 4) providing quality public service, and 5) structuring the State Archives Department to meet the demands of this plan."
Type
Web Page
Title
Minnesota Recordkeeping Metadata Standard (IRM Standard 20, Version 1.2)
<P1> The Minnesota Recordkeeping Metadata Standard is referenced as a "current standard" in the Minnesota Enterprise Technical Architecture under Chapter 4, "Data and Records Management Architecture." State agencies bound by the Architecture should reference that document for compliance requirements. <P2> The Minnesota Recordkeeping Metadata Standard is directly based upon the one developed by the National Archives of Australia (NAA), the Recordkeeping Metadata Standard for Commonwealth Agencies, version 1.0, May 1999. (p. 7) <warrant> <P3> The Minnesota Recordkeeping Metadata Standard (Minnesota Office of Technology standard IRM 20) was developed to facilitate records management by government entities at any level of government.
"The ERMS Metadata Standard forms Part 2 of the National Archives' 'Requirements for Electronic Records Management Systems' (commonly known as the '2002 Requirements'). It is specified in a technology independent manner, and is aligned with the e-Government Metadata Standard (e-GMS) version 2, April 2003. A version of e-GMS v2 including XML examples was published in the autumn of 2003. This Guide should be read in conjunction with the ERMS Metadata Standard. Readers may find the GovTalk Schema Guidelines (available via http://www.govtalk.gov.uk ) helpful regarding design rules used in building the schemas."
Conclusions
RQ Electronically enabled processes need to generate appropriate records, according to established records management principles. These records need to reach the ERMS that captures them with enough information to enable the ERMS to classify them appropriately, allocate an appropriate retention policy, etc.
SOW
DC This document is a draft.
Type
Web Page
Title
Recordkeeping Metadata Standard for Commonwealth Agencies
This standard describes the metadata that the National Archives of Australia recommends should be captured in the recordkeeping systems used by Commonwealth government agencies. ... Part One of the standard explains the purpose and importance of standardised recordkeeping metadata and details the scope, intended application and features of the standard. Features include: flexibility of application; repeatability of data elements; extensibility to allow for the management of agency-specific recordkeeping requirements; interoperability across systems environments; compatibility with related metadata standards, including the Australian Government Locator Service (AGLS) standard; and interdependency of metadata at the sub-element level.
Critical Arguements
CA Compliance with the Recordkeeping Metadata Standard for Commonwealth Agencies will help agencies to identify, authenticate, describe and manage their electronic records in a systematic and consistent way to meet business, accountability and archival requirements. In this respect the metadata is an electronic recordkeeping aid, similar to the descriptive information captured in file registers, file covers, movement cards, indexes and other registry tools used in the paper-based environment to apply intellectual and physical controls to records.
Conclusions
RQ "The National Archives intends to consult with agencies, vendors and other interested parties on the implementation and continuing evolution of the Recordkeeping Metadata Standard for Commonwealth Agencies." ... "The National Archives expects to re-examine and reissue the standard in response to broad agency feedback and relevant advances in theory and methodology." ... "The development of public key technology is one area the National Archives will monitor closely, in consultation with the Office for Government Online, for possible additions to a future version of the standard."
SOW
DC "This standard has been developed in consultation with recordkeeping software vendors endorsed by the Office for Government Online's Shared Systems Initiative, as well as selected Commonwealth agencies." ... "The standard has also been developed with reference to other metadata standards emerging in Australia and overseas to ensure compatibility, as far as practicable, between related resource management tools, including: the Dublin Core-derived Australian Government Locator Service (AGLS) metadata standard for discovery and retrieval of government services and information in web-based environments, co-ordinated by the National Archives of Australia; and the non-sector-specific Recordkeeping Metadata Standards for Managing and Accessing Information Resources in Networked Environments Over Time for Government, Social and Cultural Purposes, co-ordinated by Monash University using an Australian Research Council Strategic Partnership with Industry Research and Training (SPIRT) Support Grant."
This document is a revision and expansion of "Metadata Made Simpler: A guide for libraries," published by NISO Press in 2001.
Publisher
NISO Press
Critical Arguments
CA An overview of what metadata is and does, aimed at librarians and other information professionals. Describes various metadata schemas. Concludes with a bibliography and glossary.
Joined-up government needs joined-up information systems. The e-Government Metadata Standard (e-GMS) lays down the elements, refinements and encoding schemes to be used by government officers when creating metadata for their information resources or designing search interfaces for information systems. The e-GMS is needed to ensure maximum consistency of metadata across public sector organisations.
Publisher
Office of the e-Envoy, Cabinet Office, UK.
Critical Arguments
CA "The e-GMS is concerned with the particular facets of metadata intended to support resource discovery and records management. The Standard covers the core set of 'elements' that contain data needed for the effective retrieval and management of official information. Each element contains information relating to a particular aspect of the information resource, e.g. 'title' or 'creator'. Further details on the terminology being used in this standard can be found in Dublin Core and Part Two of the e-GIF."
Conclusions
RQ "The e-GMS will need to evolve, to ensure it remains comprehensive and consistent with changes in international standards, and to cater for changes in use and technology. Some of the elements listed here are already marked for further development, needing additional refinements or encoding schemes. To limit disruption and cost to users, all effort will be made to future-proof the e-GMS. In particular we will endeavour: not to remove any elements or refinements; not to rename any elements or refinements; not to add new elements that could contain values contained in the existing elements."
SOW
DC The e-GMS is promulgated by the British government as part of its e-government initiative. It is the technical cornerstone of the e-government policy for joining up the public sector electronically and providing modern, improved public services.
During the past decade, recordkeeping practices in public and private organizations have been revolutionized. New information technologies, from mainframes to PCs to local area networks and the Internet, have transformed the way state agencies create, use, disseminate, and store information. These new technologies offer a vastly enhanced means of collecting information for and about citizens, communicating within state government and between state agencies and the public, and documenting the business of government. Like other modern organizations, Ohio state agencies face challenges in managing and preserving their records because records are increasingly generated and stored in computer-based information systems. The Ohio Historical Society serves as the official State Archives, with responsibility for assisting state and local agencies in the preservation of records with enduring value. The Office of the State Records Administrator within the Department of Administrative Services (DAS) advises state agencies on the proper management and disposition of government records. Out of concern over its ability to preserve electronic records with enduring value and to assist agencies with electronic records issues, the State Archives has adapted these guidelines from guidelines created by the Kansas State Historical Society. The Kansas State Historical Society, through the Kansas State Historical Records Advisory Board, requested a program development grant from the National Historical Publications and Records Commission to develop policies and guidelines for electronic records management in the state of Kansas. With grant funds, the KSHS hired a consultant, Dr. Margaret Hedstrom, an Associate Professor in the School of Information, University of Michigan, and formerly Chief of State Records Advisory Services at the New York State Archives and Records Administration, to draft guidelines that could be tested, revised, and then implemented in Kansas state government.
Notes
These guidelines are part of the ongoing effort to address the electronic records management needs of Ohio state government. As a result, this document continues to undergo changes. The first draft, written by Dr. Margaret Hedstrom, was completed in November 1997 for the Kansas State Historical Society. That version was reorganized, updated, and posted to the KSHS Web site on August 18, 1999. The Kansas Guidelines were modified for use in Ohio during September 2000.
Critical Arguments
CA "This publication is about maintaining accountability and preserving important historical records in the electronic age. It is designed to provide guidance to users and managers of computer systems in Ohio government about: the problems associated with managing electronic records, special recordkeeping and accountability concerns that arise in the context of electronic government; archival strategies for the identification, management and preservation of electronic records with enduring value; identification and appropriate disposition of electronic records with short-term value, and
Type
Web Page
Title
Requirements for Electronic Records Management Systems: (2) Metadata Standard
Requirements for Electronic Records Management Systems includes: (1) "Functional Requirements" (http://www.nationalarchives.gov.uk/electronicrecords/reqs2002/pdf/requirementsfinal.pdf); (2) "Metadata Standard" (the subject of this record); (3) Reference Document (http://www.nationalarchives.gov.uk/electronicrecords/reqs2002/pdf/referencefinal.pdf); and (4) "Implementation Guidance: Configuration and Metadata Issues" (http://www.nationalarchives.gov.uk/electronicrecords/reqs2002/pdf/implementation.pdf)
Publisher
Public Records Office, [British] National Archives
Critical Arguments
CA Sets out the implications for records management metadata in compliant systems. It has been agreed with the Office of the e-Envoy that this document will form the basis for an XML schema to support the exchange of records metadata and promote interoperability between ERMS and other systems.
SOW
DC The National Archives updated the functional requirements for electronic records management systems (ERMS) in collaboration with the central government records management community during 2002. The revision takes account of developments in cross-government and international standards since 1999.
Type
Web Page
Title
Record Keeping Metadata Requirements for the Government of Canada
This document describes the metadata elements utilized by the Canadian Government as of January 2001.
Critical Arguments
CA "The Record Keeping Metadata is defined broadly to include the type of information Departments are required to capture to describe the identity, authenticity, content, context, structure and management requirements of records created in the context of a business activity. The Metadata model consists of elements, which are the attributes of a record that are comparable to fields in a database. The model is modular in nature. It permits Departments to use a core set of elements that will meet the minimum requirements for describing and sharing information, while facilitating interoperability between government Departments. It also allows Departments with specialized needs or the need for more detailed descriptions to add new elements and/or sub-elements to the basic metadata in order to satisfy their particular business requirements."
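The modular core-plus-extension model described above can be sketched in a few lines of code. This is a hypothetical illustration only: the element names below are invented for the example and are not the official Government of Canada element set, and the validation logic is an assumption about how a minimal compliance check might work.

```python
# Hypothetical sketch of a modular record-keeping metadata model: a core set
# of elements every department must supply, plus optional department-specific
# extension elements. Element names are illustrative, not official.

CORE_ELEMENTS = {
    "identifier",
    "title",
    "creator",
    "date_created",
    "business_activity",
}

def validate_record(record: dict) -> list:
    """Return the sorted list of missing core elements (empty means valid)."""
    return sorted(CORE_ELEMENTS - record.keys())

record = {
    "identifier": "DOC-2001-0042",
    "title": "Briefing note on records policy",
    "creator": "Policy Branch",
    "date_created": "2001-01-15",
    "business_activity": "Policy development",
    # A department-specific extension element, added without affecting the core:
    "x_security_classification": "Protected A",
}

print(validate_record(record))  # -> []
```

Because extensions are simply additional keys, a department with specialized needs can enrich its records while any other department can still read and validate the shared core, which is the interoperability property the model aims for.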
This standard sets out principles for making and keeping full and accurate records as required under section 12(1) of the State Records Act 1998. The principles are: Records must be made; Records must be accurate; Records must be authentic; Records must have integrity; Records must be useable. Each principle is supported by mandatory compliance requirements.
Critical Arguments
CA "Section 21(1) of the State Records Act 1998 requires public offices to 'make and keep full and accurate records'. The purpose of this standard is to assist public offices to meet this obligation and to provide a benchmark against which a public office's compliance may be measured."
Conclusions
RQ None
SOW
DC This standard is promulgated by the State Records Agency of New South Wales, Australia, as required under section 12(1) of the State Records Act 1998.
Type
Web Page
Title
Archiving of Electronic Digital Data and Records in the Swiss Federal Archives (ARELDA): e-government project ARELDA - Management Summary
The goal of the ARELDA project is to find long-term solutions for the archiving of digital records in the Swiss Federal Archives. This includes the accession, long-term storage, preservation of data, description, and access for the users of the Swiss Federal Archives. It is also coordinated with the Federal Archives' basic efforts to realize a uniform records management solution in the federal administration, and therefore to support the pre-archival creation of documents of archival value, for the benefit of the administration as well as of the Federal Archives. The project is indispensable for the long-term execution of the Federal Archives Act: older IT systems are being replaced by newer ones, and a complete migration of the data is sometimes not possible or too expensive; small database applications, built and maintained by people with no IT background, are constantly increasing in number; and more and more administrative bodies are introducing records and document management systems.
Publisher
Swiss Federal Archives
Publication Location
Bern
Critical Arguments
CA "Archiving in general is a necessary prerequisite for the reconstruction of governmental activities as well as for the principle of legal certainty. It enables citizens to understand governmental activities and ensures democratic control of the federal administration. Finally, archives are a prerequisite for scientific research, especially in the social and historical fields, and ensure the preservation of our cultural heritage. Archiving also plays a vital role in ongoing and efficient records management. A necessary prerequisite for the Federal Archives in the era of the information society will be the system ARELDA (Archiving of Electronic Data and Records)."
Conclusions
RQ "Because of the lack of standard solutions and limited or lacking personnel resources for an internal development effort, the realisation of ARELDA will have to be outsourced, and the cooperation with the IT division and the Federal Office for Information Technology, Systems and Telecommunication must be intensified. The guidelines for the project are as follows:
SOW
DC ARELDA is one of the five key projects in the Swiss government's e-government strategy.