The CDISC Submission Metadata Model was created to help ensure that the supporting metadata for submission datasets meets the following objectives: provide FDA reviewers with clear descriptions of the usage, structure, contents, and attributes of all datasets and variables; allow reviewers to replicate most analyses, tables, graphs, and listings with minimal or no transformations; and enable reviewers to easily view and subset the data used to generate any analysis, table, graph, or listing without complex programming. ... The CDISC Submission Metadata Model has been defined to guide sponsors in the preparation of data submitted to the FDA. By following the principles of this model, sponsors will help reviewers interpret the contents of submitted data accurately and work with it more effectively, without sacrificing the scientific objectives of clinical development.
Publisher
The Clinical Data Interchange Standards Consortium
Critical Arguments
CA "The CDISC Submission Data Model has focused on the use of effective metadata as the most practical way of establishing meaningful standards applicable to electronic data submitted for FDA review."
Conclusions
RQ "Metadata prepared for a domain (such as an efficacy domain) which has not been described in a CDISC model should follow the general format of the safety domains, including the same set of core selection variables and all of the metadata attributes specified for the safety domains. Additional examples and usage guidelines are available on the CDISC web site at www.cdisc.org." ... "The CDISC Metadata Model describes the structure and form of data, not the content. However, the varying nature of clinical data in general will require the sponsor to make some decisions about how to represent certain real-world conditions in the dataset. Therefore, it is useful for a metadata document to give the reviewer an indication of how the datasets handle certain special cases."
SOW
DC CDISC is an open, multidisciplinary, non-profit organization committed to the development of worldwide standards to support the electronic acquisition, exchange, submission and archiving of clinical trials data and metadata for medical and biopharmaceutical product development. CDISC members work together to establish universally accepted data standards in the pharmaceutical, biotechnology and device industries, as well as in regulatory agencies worldwide. CDISC currently has more than 90 members, including the majority of the major global pharmaceutical companies.
Type
Web Page
Title
CDISC Achieves Two Significant Milestones in the Development of Models for Data Interchange
CA "The Clinical Data Interchange Standards Consortium has achieved two significant milestones towards its goal of standard data models to streamline drug development and regulatory review processes. CDISC participants have completed metadata models for the 12 safety domains listed in the FDA Guidance regarding Electronic Submissions and have produced a revised XML-based data model to support data acquisition and archive."
Conclusions
RQ "The goal of the CDISC XML Document Type Definition (DTD) Version 1.0 is to make available a first release of the definition of this CDISC model, in order to support sponsors, vendors and CROs in the design of systems and processes around a standard interchange format."
SOW
DC "This team, under the leadership of Wayne Kubick of Lincoln Technologies, and Dave Christiansen of Genentech, presented their metadata models to a group of representatives at the FDA on Oct. 10, and discussed future cooperative efforts with Agency reviewers."... "CDISC is a non-profit organization with a mission to lead the development of standard, vendor-neutral, platform-independent data models that improve process efficiency while supporting the scientific nature of clinical research in the biopharmaceutical and healthcare industries"
Type
Web Page
Title
PBCore: Public Broadcasting Metadata Dictionary Project
CA "PBCore is designed to provide -- for television, radio and Web activities -- a standard way of describing and using media (video, audio, text, images, rich interactive learning objects). It allows content to be more easily retrieved and shared among colleagues, software systems, institutions, community and production partners, private citizens, and educators. It can also be used as a guide for the onset of an archival or asset management process at an individual station or institution. ... The Public Broadcasting Metadata Dictionary (PBCore) is: a core set of terms and descriptors (elements) used to create information (metadata) that categorizes or describes media items (sometimes called assets or resources)."
Conclusions
RQ "The PBCore Metadata Elements are currently in their first published edition, Version 1.0. Over two years of research and lively discussions have generated this version. ... As various users and communities begin to implement the PBCore, updates and refinements to the PBCore are likely to occur. Any changes will be clearly identified, ramifications outlined, and published to our constituents."
SOW
DC "Initial development funding for PBCore was provided by the Corporation for Public Broadcasting. The PBCore is built on the foundation of the Dublin Core (ISO 15836) ... and has been reviewed by the Dublin Core Metadata Initiative Usage Board. ... PBCore was successfully deployed in a number of test implementations in May 2004 in coordination with WGBH, Minnesota Public Radio, PBS, National Public Radio, Kentucky Educational Television, and recognized metadata expert Grace Agnew. As of July 2004 in response to consistent feedback to make metadata standards easy to use, the number of metadata elements was reduced to 48 from the original set of 58 developed by the Metadata Dictionary Team. Also, efforts are ongoing to provide more focused metadata examples that are specific to TV and radio. ... Available free of charge to public broadcasting stations, distributors, vendors, and partners, version 1.0 of PBCore was launched in the first quarter of 2005. See our Licensing Agreement via the Creative Commons for further information. ... Plans are under way to designate an Authority/Maintenance Organization."
This document is a draft version 1.0 of requirements for a metadata framework to be used by the International Press Telecommunications Council for all new and revised IPTC standards. It was developed and agreed to by members of the IPTC Standards Committee, who represented a variety of newspapers, wire agencies, and other interested members of the IPTC.
Notes
Misha Wolf is also listed as author.
Publisher
International Press Telecommunications Council (IPTC)
Critical Arguments
CA "This Requirements document forms part of the programme of work called ITPC Roadmap 2005. The Specification resulting from these Requirements will define the use of metadata by all new IPTC standards and by new major versions of existing IPTC standards." (p. 1) ... "The purpose of the News Metadata Framework (NMDF) WG is to specify how metadata will be expressed, referenced, and managed in all new major versions of IPTC standards. The NMF WG will: Gather, discuss, agree and document functional requirements for the ways in which metadata will be expressed, referenced and managed in all new major versions of IPTC standards; Discuss, agree and document a model, satisfying these requirements; Discuss, agree and document possible approaches to expressing this model in XML, and select those most suited to the tasks. In doing so, the NMDF WG will, where possible, make use of the work of other standards bodies. (p. 2)
Conclusions
RQ "Open issues include: The versioning of schemes, including major and minor versions, and backward compatibility; the versioning of TopicItems; The design of URIs for TopicItem schemes and TopicItem collections, including the issues of: versions (relating to TopicItems, schemes, and collections); representations (relating to TopicItems and collections); The relationship between a [scheme, code] pair, the corresponding URI and the scheme URI." (p. 17)
SOW
DC The development of this framework came out of the 2003 News Standards Summit, which was attended by representatives from over 80 international press and information agencies ... "The News Standards Summit brings together major players--experts on news metadata standards as well as commercial news providers, users, and aggregators. Together, they will analyze the current state and future expectations for news and publishing XML and metadata efforts from both the content and processing model perspectives. The goal is to increase understanding and to drive practical, productive convergence." ... This is a draft version of the standard.
During the past decade, the recordkeeping practices in public and private organizations have been revolutionized. New information technologies, from mainframes to PCs to local area networks and the Internet, have transformed the way state agencies create, use, disseminate, and store information. These new technologies offer a vastly enhanced means of collecting information for and about citizens, communicating within state government and between state agencies and the public, and documenting the business of government. Like other modern organizations, Ohio state agencies face challenges in managing and preserving their records because records are increasingly generated and stored in computer-based information systems. The Ohio Historical Society serves as the official State Archives with responsibility to assist state and local agencies in the preservation of records with enduring value. The Office of the State Records Administrator within the Department of Administrative Services (DAS) provides advice to state agencies on the proper management and disposition of government records. Out of concern over its ability to preserve electronic records with enduring value and assist agencies with electronic records issues, the State Archives has adapted these guidelines from guidelines created by the Kansas State Historical Society. The Kansas State Historical Society, through the Kansas State Historical Records Advisory Board, requested a program development grant from the National Historical Publications and Records Commission to develop policies and guidelines for electronic records management in the state of Kansas. With grant funds, the KSHS hired a consultant, Dr. Margaret Hedstrom, an Associate Professor in the School of Information, University of Michigan, and formerly Chief of State Records Advisory Services at the New York State Archives and Records Administration, to draft guidelines that could be tested, revised, and then implemented in Kansas state government.
Notes
These guidelines are part of the ongoing effort to address the electronic records management needs of Ohio state government; as a result, this document continues to undergo changes. The first draft, written by Dr. Margaret Hedstrom, was completed in November of 1997 for the Kansas State Historical Society. That version was reorganized, updated, and posted to the KSHS Web site on August 18, 1999. The Kansas Guidelines were modified for use in Ohio during September 2000.
Critical Arguments
CA "This publication is about maintaining accountability and preserving important historical records in the electronic age. It is designed to provide guidance to users and managers of computer systems in Ohio government about: the problems associated with managing electronic records, special recordkeeping and accountability concerns that arise in the context of electronic government; archival strategies for the identification, management and preservation of electronic records with enduring value; identification and appropriate disposition of electronic records with short-term value, and