Metadata and metacontent | Metadata online introduction | The Dublin Core | Open Archives Initiative (OAI) | Meta-content | Resources | Metadata mailing lists | Metadata information clearinghouse | Papers and articles | Conference reports and proceedings | Z39.50 information retrieval resources | Interoperability of e-commerce data (indecs) | Education sector metadata spec. | Metadata for Education | Metadata for the web | RDF | PICS | DOI | Metadata and databases | Standardisation | General resources
There is a very large body of knowledge on the subject of metadata, much of which pre-dates the development of information webs. Metadata is essentially data about data; a familiar example is the card index catalogue in a library, where the information on each card is metadata about a particular publication. The application of solutions based on metadata may solve many of the problems associated with content identification and delivery on information webs. Much of the relevant research starts from the standpoint that metadata can be used by anyone to make their material more accessible.
The Getty Institute has posted an introductory study of metadata - Introduction to Metadata: Pathways to Digital Information on the web. The introduction says: "This publication is intended as a primer for an important but misunderstood - and still evolving - aspect of the age of information: metadata. Professionals who are deeply involved in the development and implementation of information standards have contributed to this publication...". The work is oriented to cultural heritage and web resource discovery applications and includes a table showing the detailed relation between different metadata standards. 26/04/00
The Dublin Core is likely to play a central role in the process of establishing agreed forms of metadata. There are links to a number of resources concerned with Dublin Core developments on this page.
The Open Archives Initiative (OAI) develops and promotes interoperability standards that aim to facilitate the efficient dissemination of content. The goal of the specifications is to provide an easy way for data providers to expose their metadata and for service providers to access that metadata and use it as input to value-added services. Full details of the meeting, including an agenda, are available from the web link below. 01/12/00
URL: Open Archives Initiative http://www.openarchives.org/
In a paper entitled: "Towards a theory of meta-content", R.V. Guha argues for the use of a single expressive language for encoding meta-content irrespective of the source, location or format of the content itself. It seems from the thread of Guha's arguments that his use of the term meta-content is in fact a synonym for metadata.
Meta-content is defined broadly in the paper as "anything about content": it is descriptive of content, where content can include anything from documents on hard disks and pages on the WWW to messages in email folders. Whilst the paper recognises there is no clear line between content and meta-content, it argues that the application of a more principled approach to the definition and use of meta-content will ensure far more effective management and manipulation of information.
Guha postulates a potential meta-content language and characterises some of its benefits:
- enhanced ability to search on the context of information;
- richly structured meta-content enables searching to be transformed into structured query processing and inferencing;
- meta-content could be used to provide background context maps, particularly in hypertext environments (eg. the Web) where context can sometimes be lost through hyperlinking.
This paper led to a proposed standard language for describing meta-content, the Meta Content Framework (MCF).
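The benefit Guha describes - turning free-text search into structured query processing over meta-content - can be illustrated with a minimal sketch in Python. The record fields and values below are invented for illustration, not drawn from the paper:

```python
# Minimal illustration of querying structured meta-content rather than
# raw content. The records and field names here are hypothetical.
records = [
    {"title": "Network Metadata", "author": "Smith", "format": "text/html", "year": 1998},
    {"title": "Indexing the Web", "author": "Jones", "format": "application/pdf", "year": 1997},
    {"title": "Metadata Models", "author": "Smith", "format": "text/html", "year": 1997},
]

def query(records, **criteria):
    """Return records whose meta-content matches every criterion exactly."""
    return [r for r in records
            if all(r.get(field) == value for field, value in criteria.items())]

# A structured query ("author is Smith AND year is 1997") is unambiguous,
# whereas a full-text search for "Smith 1997" could match either term anywhere.
matches = query(records, author="Smith", year=1997)
print([r["title"] for r in matches])
```

The point of the sketch is simply that, once meta-content is richly structured, searching becomes query evaluation over named fields rather than pattern matching over undifferentiated text.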
An introductory tutorial by UKOLN which introduces the concept of metadata and specifically the Dublin Core is available on-line.
- The Dublin Core home page.
- A summary of the Dublin Core workplan for 1999, announced at the Sixth Annual Dublin Core Workshop, has been published.
- A starting point providing a description of the form and content of resources concerned with metadata.
- The Dublin Core French translation.
- The Warwick Framework, along with the Dublin Core, should: "provide a comprehensive infrastructure for network resource description". Reports from the workshop held in Warwick, UK, to which the framework refers, and an explanation of the framework itself are available.
- NIST: Metadata for Multimedia Objects in the Learning Domain - National Learning Object Standard Project.
- UKOLN have developed a Web-based tool for creating Dublin Core <META> tags, called DC-dot, and are currently working on a Java-based implementation. DC-dot has recently been enhanced: it now optionally generates a Resource Description Framework (RDF) view of DC (the RDF generated conforms to the most recent draft of the W3C's RDF Model and Syntax Specification); extracts metadata from Microsoft Office files (eg. MS Word, PowerPoint) and HTML web pages; and performs some simple validation of any existing DC META tags embedded in the web page being described.
- A major re-write of DC-dot is planned, primarily to provide support for repeated elements and "qualified" Dublin Core. The DC-dot software is available for download from the URL below, which also offers a simple set of DC-dot exercises, intended both as an introduction to the functionality of DC-dot and to the Dublin Core in general. These exercises form part of a larger set of material developed for a half-day session on metadata at the Institutional Web Management workshop held at Newcastle University, UK in September 1998. The full set of materials, including slides, exercises on metadata tools/applications and ideas for group work, is also available.
URL: DC-dot http://www.ukoln.ac.uk/metadata/dcdot/
URL: Institutional Web Management http://www.ukoln.ac.uk/web-focus/metadata/seminar-materials/
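As a rough illustration of the kind of markup such a tool produces, Dublin Core metadata is commonly embedded in an HTML <head> as <META> tags whose names carry a "DC." prefix. The Python sketch below builds such tags from a simple record; the resource described and the helper function are invented for illustration:

```python
from html import escape

# Illustrative only: build Dublin Core <META> tags of the kind a tool
# like DC-dot emits for embedding in an HTML <head>. The resource
# described here is invented.
dc_record = {
    "Title": "UKOLN Metadata Resources",
    "Creator": "A. N. Author",
    "Date": "1999-07-31",
    "Format": "text/html",
}

def dc_meta_tags(record):
    """Render a dict of Dublin Core elements as HTML <META> tags."""
    return "\n".join(
        '<META NAME="DC.%s" CONTENT="%s">' % (element, escape(value, quote=True))
        for element, value in record.items()
    )

print(dc_meta_tags(dc_record))
```

Because the metadata travels in standard <META> tags, browsers ignore it while indexing robots and validators can harvest it unchanged.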
The Dublin Core Directorate has announced that a set of revised element definitions (Dublin Core Elements, Version 1.1) has been completed and is available for public review and comment as a Proposed Recommendation of the Dublin Core Metadata Initiative. This revision process is an important component of the Dublin Core workplan that emerged from the Sixth Dublin Core Workshop in November 1998.
The goal of the process was to review and modify Dublin Core element definitions to improve their clarity and to express them in a standard format for data dictionaries (the ISO/IEC 11179 standard), to facilitate interoperability and mapping to other element sets.
Following a period of public review, these modified definitions will replace those of RFC 2413 as the official Dublin Core element definitions. They will also serve as the base document to be submitted for standardisation by CEN, the European standards organisation, and NISO, the US National Information Standards Organization. The deadline for public comments is July 31, 1999, after which the Dublin Core Advisory Committee will, in conjunction with the editors, issue the final version as a Dublin Core Metadata Initiative Recommendation.
URL: Dublin Core Elements, Version 1.1 http://purl.org/dc/elements/1.1
URL: comments on DC 1.1 http://archive.dstc.edu.au/RDU/DCAC/PR-DCV11.html
URL: RFC 2413 http://www.ietf.org/rfc/rfc2413.txt
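For reference, the element set under revision comprises fifteen elements. The Python sketch below checks a record against that set; the record and helper function are illustrative only:

```python
# The fifteen elements of the Dublin Core Metadata Element Set.
DC_ELEMENTS = {
    "Title", "Creator", "Subject", "Description", "Publisher",
    "Contributor", "Date", "Type", "Format", "Identifier",
    "Source", "Language", "Relation", "Coverage", "Rights",
}

def unknown_elements(record):
    """Return any field names in a record that are not Dublin Core elements."""
    return sorted(set(record) - DC_ELEMENTS)

# "Pages" is not a Dublin Core element, so a simple validator flags it.
record = {"Title": "An Example Resource", "Creator": "A. N. Author", "Pages": "12"}
print(unknown_elements(record))
```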
The Dublin Core Metadata Initiative (DCMI), an organisation leading the development of international standards to improve electronic resource management and information discovery, has announced the formal recommendation of the Dublin Core (DC) Qualifiers. The addition of the DC Qualifiers enhances the semantic precision of the existing DC Metadata Element Set.
"Think of Legos. The close tolerances of these simple toys ensure all the different Lego themes, built at different times, can work together smoothly. Dublin Core is the basic Lego block for promoting discovery of resources on the Web: a simple and interoperable foundation upon which many information solutions can be built. The introduction of Dublin Core Qualifiers is like adding color and themes to the Legos - it helps enrich the description of information resources on the Internet", said Stuart Weibel, DCMI Director.
More information about the new recommendation can be found on the web. 25/07/00
URL: DC Qualifiers http://purl.org/dc/documents/rec/dcmes-qualifiers-20000711.htm
URL: Dublin Core http://purl.org/dc/
URL: press release http://www.zotgroup.com/development/dcmi/dcqualifiers.html
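A rough illustration of how qualifiers add precision: a refinement such as Date.Created narrows the meaning of the base Date element, and an application that understands only unqualified Dublin Core can "dumb down" the qualified name to its base element. In the Python sketch below, the Date refinements are taken from the 2000 recommendation, but the record values and helper names are invented:

```python
# Element refinements for Date in the Dublin Core Qualifiers
# recommendation of July 2000.
DATE_REFINEMENTS = {"Created", "Valid", "Available", "Issued", "Modified"}

def dumb_down(qualified_name):
    """Reduce a qualified element name (eg. 'Date.Created') to its base
    element, so consumers of plain Dublin Core can still use the record."""
    return qualified_name.split(".")[0]

record = {"Title": "An Example Resource", "Date.Created": "1999-11-01"}
bases = {dumb_down(name) for name in record}
print(sorted(bases))
```

This "dumb-down" behaviour is what keeps qualified records interoperable with the simple element set, much as the Lego analogy above suggests.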
The email@example.com list is the mailing list for discussion of all issues relevant to the development, deployment, and use of Dublin Core metadata.
The firstname.lastname@example.org list is a general mailing list for developers who are deploying systems that use RDF. The list focusses especially on the application of RDF to Internet resource discovery issues.
The Metadata Subcommittee of the Association of American Publishers (AAP) Enabling Technologies Committee has launched the "Metadata Information Clearinghouse Interactive" (MICI). The subcommittee hopes that the site will serve as a clearinghouse of information about metadata projects, standards, and initiatives worldwide. It allows the reader to look up projects from whatever perspective the reader may have, for example, by:
- "Type of application" (eg. sales, rights management, archive/reuse, content management);
- "Type of publisher" (eg. books, journals, etc.);
- "Subject" (eg. chemistry, accounting, earth science);
- and also by "Project name".
MICI is designed to be interactive, enabling users to post questions or comments about the projects listed and to conduct threaded discussions with other readers. Readers who know of a project which is not yet in the database but should be are requested to create a record for it. Readers who are involved in any of the projects already listed, and feel that the existing description should be changed in any way, are requested to post comments.
- A paper by Ann Apps of the University of Manchester, UK entitled, "Dublin Core Metadata for Electronic Journals", which was presented at ECDL 2000 in Lisbon in September, 2000 is available on the web. The draft preprint version, in PDF format, is accessible via the URL below. 06/07/00
- According to the UK research company Ovum, the untapped market for integrated metadata management is worth around US$ 10bn. The report, "Repositories and XML: Technology choices for metadata management" (published December 1999), contends that the requirement for companies to manage metadata is becoming more pressing, and represents a growing market for developers.
The report analyses three types of metadata management software: mature database-based repository products, "stovepipe" solutions applying to just one application context, and XML-based solutions. The report is available from Ovum at US$ 3495.
- "Metadata: The Right Approach" addresses the often puzzling developments in the area of "content metadata". The introduction to the article states that: "Two communities - rights owners on one hand, libraries and cataloguers on the other - are staring at their unfolding data models and systems, knowing that somehow together they make up a whole picture. This paper aims to show how and where they fit.".
- "Metadata: The Right Approach" is based on the author's previous piece: "The Fire and the Rose: an integrated model for rights and descriptive metadata".
See also the related topic page of Intellectual property rights and their protection on this site.
- Proceedings of the OCLC Internet Cataloging Colloquium, San Antonio, Texas January 19, 1996.
- Report from the Metadata Workshop II, Warwick, UK, April 1-3, 1996.
- Report from the Distributed Indexing/Searching Workshop, Cambridge, Massachusetts, May 28-29, 1996.
- Conference proceedings from the Second IEEE Metadata Conference, Silver Spring, Maryland, USA, September 16-17, 1997.
- Official report and proceedings of the Libraries Sector of the European Commission's DG XIII/E-4 Metadata Workshop I, December 1997, with particular emphasis on its role in the library sector.
The Library of Congress Maintenance Agency publishes a page for the International Standard Z39.50, "Information Retrieval (Z39.50): Application Service Definition and Protocol Specification". This page provides information pertaining to the development and maintenance of Z39.50 (existing as well as future versions) and the implementation and use of the Z39.50 protocol.
There is also a presentation on Z39.50 in the form of slides online, or downloadable as a PowerPoint presentation. The presentation also includes bibliographies of further reading on the web and printed publications, and links to Z39.50 software.
URL: Z39.50 Maintenance Agency http://lcweb.loc.gov/z3950/agency/
URL: Z39.50 presentation http://www.musiconline.ac.uk/z3950
Interoperability of Data in E-commerce Systems (indecs) is an international initiative of rights owners which seeks to develop a framework of metadata standards to support network commerce based on intellectual property. The initiative is supported under the European Commission's Info2000 programme embracing multimedia rights clearance systems (MMRCS). The project site claims that indecs will deliver, by the end of 1999:
- a completed generic data model for intellectual property trading in a network environment;
- the mapping of other metadata initiatives to this common model;
- a specification for the development of a "metadata registry" which will make it possible for applications to make use of this mapping to make different metadata schemes interoperable;
- specification for the linking of "person identifiers", an essential part of the infrastructure;
- a Resource Description Framework (RDF) model of the generic data model;
- implementation guides (managerial and technical) for those who need to work with the model;
- proposals to appropriate standards bodies for formal standardisation.
indecs-link is a free monthly online newsletter detailing new areas and updates added to the indecs website. It is aimed at policy-makers, librarians, the creative industries and anyone interested in developing standards to support the trade in intellectual property.
The second issue, dated March 1999, includes details of:
- four recently published project documents which explain the principles behind the creation of the generic data model, detailed above;
- work to exchange information between indecs and the Moving Picture Expert Group (MPEG);
- a report on the 47th meeting of Moving Picture Expert Group (ISO/IEC JTC1 SC29 WG11) in Seoul, South Korea, particularly concerning the development of MPEG-7.
A report on the indecs project, Evaluation Conference, held in London on 7-8 July, 1999, along with 11 presentations and workshop reports from the meeting, has been published on the web. In conclusion, the meeting generally agreed that when the project moved into an implementation phase it would need to "secure the active participation of the stakeholder communities in developing the tools, services and standards which were being specified or proposed".
One hundred and twenty delegates attended the second indecs conference, entitled: "Names, Numbers and Networks", held in Washington, USA on 15 November and jointly sponsored by the indecs partners, the US Copyright Office and the US Patent and Trademark Office. The full report is published on the indecs site, along with eleven conference papers available for download as .pdf files.
Educom, a nonprofit US-based consortium of higher education institutions which concentrates on the synthesis of education and information technology, together with a coalition of US academic, industry and government organisations, is developing a metadata specification for materials used in higher education, corporate and government training programs.
Through the proposed metadata specification the project aims to provide a common vocabulary for educational resources, making them easier to find on the Web. Educom is also developing a Java-based tool that will assist content developers in applying the metadata labels to their materials.
URL: Educom http://www.educom.edu/
URL: Metadata specification http://www.imsproject.org/metadata/
IMS has released version 1 of its learning resources metadata specification, which consists of the information model, the XML binding, and the best practices and implementation guide. In 1997, the IMS Project, part of the non-profit EDUCOM consortium (now EDUCAUSE) of US institutions of higher education and their vendor partners, established an effort to develop open, market-based standards for online learning, including specifications for learning content metadata. Also in 1997, groups within the National Institute for Standards and Technology (NIST) and the IEEE P.1484 study group (now the IEEE Learning Technology Standards Committee - LTSC) began similar efforts.
The NIST effort merged with the IMS effort, and the IMS began collaborating with the ARIADNE Project, a European Project with an active metadata definition effort. In 1998, IMS and ARIADNE submitted a joint proposal and specification to IEEE, which formed the basis for the current IEEE Learning Object Metadata (LOM) base document, which is a classification for a pre-draft IEEE Specification. IMS publicised the IEEE work through the IMS community in the US, UK, Europe, Australia, and Singapore during 1999 and brought the resulting feedback into the ongoing specification development process.
Following a meeting in Edinburgh, UK, the UK's Open Metadata for Education Group (MEG) has released its first deliverable - the MEG Concord. This Concord seeks to capture the key principles that MEG members feel should lie behind current and future work in this area. Current signatories to the Concord include the University for Industry, the Scottish University for Industry, SCRAN, mda, the Gateway to Educational Materials (GEM), the Archaeology Data Service, UKOLN, Resource, the Centre for Digital Library Research (CDLR) at the University of Strathclyde, and the Library Information Technology Centre (LITC) at South Bank University.
The work of the group is progressed on the uk-meg JISCMail list and at open face-to-face meetings. If this work is of interest to you, please join the mailing list, where announcements of the next meeting will be made in due course. 12/12/00
URL: MEG http://www.ukoln.ac.uk/metadata/education/
URL: MEG Concord http://www.ukoln.ac.uk/metadata/education/documents/concord.html
URL: mailing list http://www.jiscmail.ac.uk/lists/uk-meg.html
The World Wide Web Consortium coordinates a number of disparate metadata activities through a working group - the W3C Metadata Activity - the principal strand of which is the RDF work.
Resource Description Framework (RDF) is billed as a vendor-neutral and operating system-independent system for metadata on the Web. RDF is an extension of the work on PICS content description technology (see below), drawing on XML technology along with submissions to the W3C by Microsoft (XML Web Collections and XML-Data) and Netscape (XML/MCF). The design of RDF has also been influenced by the Dublin Core/Warwick Framework - see information above.
RDF will allow different application communities to define the metadata property set that best serves the needs of each community. A variety of application areas are envisaged including:
- in resource discovery to provide better search engine capabilities;
- in cataloging for describing the content and content relationships available at a particular Web site, page, or digital library;
- by intelligent software agents to facilitate knowledge sharing and exchange;
- in content rating for child protection and privacy protection;
- in describing collections of pages that represent a single logical "document";
- for describing intellectual property rights of Web pages.
The W3C believes that, implemented with digital signatures: "RDF will aid in building the 'Web of Trust' for electronic commerce, collaboration, and other applications". RDF development has seven major areas of focus:
- A metadata model and syntax specification, RDF;
- A language for writing RDF schemas;
- A language for expressing processing rules (sometimes called "filters", "preferences", or "profiles" in various applications of metadata) for the use of RDF statements;
- A language for expressing a general query for RDF information;
- An algorithm for canonicalizing RDF for digital signature;
- A syntax for digitally signing RDF;
- A vocabulary for expressing PICS labels in RDF, and a conversion algorithm from PICS 1.1.
URL: RDF Recommendation http://www.w3.org/RDF
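To give a feel for the model, an RDF description is at bottom a set of statements - subject, predicate, object - about a resource, commonly serialised in XML. The Python sketch below parses a minimal, invented RDF/XML fragment back into statements using only the standard library; it illustrates the data model and is not a conforming RDF parser:

```python
import xml.etree.ElementTree as ET

# A minimal RDF/XML description combining the RDF syntax with Dublin Core
# properties; the resource URI and values are invented for illustration.
RDF_NS = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
rdf_xml = """<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
                      xmlns:dc="http://purl.org/dc/elements/1.1/">
  <rdf:Description rdf:about="http://example.org/page">
    <dc:title>An Example Page</dc:title>
    <dc:creator>A. N. Author</dc:creator>
  </rdf:Description>
</rdf:RDF>"""

def triples(document):
    """Read RDF/XML into (subject, predicate, object) statements - the
    underlying model on which RDF queries and inference operate."""
    root = ET.fromstring(document)
    statements = []
    for desc in root.findall("{%s}Description" % RDF_NS):
        subject = desc.get("{%s}about" % RDF_NS)
        for prop in desc:
            statements.append((subject, prop.tag, prop.text))
    return statements

for s, p, o in triples(rdf_xml):
    print(s, p, o)
```

Once metadata is reduced to statements like these, the "structured query processing" envisaged above becomes a matter of matching patterns over triples rather than searching text.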
- The Resource Description Framework (RDF) is a technology that can be difficult to understand, especially for those who are not on the working group. Dave Beckett's RDF Resources provides a hub of links to as many RDF web pages as the author could find, containing examples, documents, papers and software. Links also provide further information on related applications and work. 11/07/00
- A tutorial on Resource Description Framework (RDF), covering syntax, semantics, concepts and vocabulary, is available on the web. 07/04/00
- A note authored by Tim Berners-Lee in September 1998 which attempts to answer the question, "Why should I use RDF - why not just XML?", a question which has been around ever since RDF started. It takes as its starting point the premise that there is a clear difference of view between those who want to query documents and those who want to extract the "meaning" in some form and query that.
- Introduction to RDF
- RDF resources for developers
- "RDF Made (Fairly) Easy" is based on the author's wish to test his own understanding of the current RDF draft. It is intended to give a practitioner's introduction to expressing RDF concepts in XML, starting from the simple and working towards the more complex (instead of starting from the general and working towards the specific, as the draft does).
- A site which provides the "guts" of the RDF architecture for libraries (primarily).
Meta Content Framework (MCF) - a data model for describing metadata for collections of networked information, it also provides a syntax for the representation of instances of this data model using XML.
URL: Latest specification http://www.textuality.com/mcf/NOTE-MCF-XML.html
URL: Introductory tutorial http://www.textuality.com/mcf/MCF-tutorial.html
XML-Data - Microsoft have proposed an alternative, called XML-Data. The position paper from Microsoft which describes their approach, along with a white paper on XML, is available.
IDML - Proposed by Identify Systems, IDML is a language that extends HTML to represent semantic information. There is a discussion document on why IDML should be implemented rather than using META tags.
URL: Discussion document http://www.identify.com/welcome/idml-faq.html#meta
URL: Identify Systems http://www.identify.com/
"A Proposed Convention for Embedding Metadata in HTML" attempts to identify a simple means of embedding metadata within HTML documents without requiring additional tags or changes to browser software, and without unnecessarily compromising current practices for robot collection of data.
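The convention can be illustrated as follows: the metadata travels in ordinary <META NAME="..." CONTENT="..."> tags, which any robot can collect with a standard HTML parser and which browsers simply ignore. The page and collector class in this Python sketch are invented for illustration:

```python
from html.parser import HTMLParser

# Sketch of the convention the proposal relies on: metadata carried in
# standard <META NAME="..." CONTENT="..."> tags, collectable without any
# change to browser software. The page below is invented.
page = """<html><head>
<title>An Example Page</title>
<META NAME="DC.Title" CONTENT="An Example Page">
<META NAME="DC.Creator" CONTENT="A. N. Author">
</head><body>...</body></html>"""

class MetaCollector(HTMLParser):
    """Collect NAME/CONTENT pairs from <META> tags."""
    def __init__(self):
        super().__init__()
        self.metadata = {}

    def handle_starttag(self, tag, attrs):
        # Attribute names arrive lowercased; values keep their case.
        if tag.lower() == "meta":
            attrs = dict(attrs)
            if "name" in attrs and "content" in attrs:
                self.metadata[attrs["name"]] = attrs["content"]

collector = MetaCollector()
collector.feed(page)
print(collector.metadata)
```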
PICS is an initiative from the W3C to create a means of controlling access to specific topics on the Web. The PICS technology is based on the labelling of content much like the rating systems used for other consumer products and media. Good labelling systems for Internet resources, using the PICS standard, could help in the selection of interesting, high-quality materials and at the same time can be used to help "supervisors" (systems administrators, parents) block access to inappropriate materials.
Whilst PICS addresses the immediate problem of access to inappropriate material, as a labelling system its potential application is far wider. The ability to restrict the delivery of information to particular browsers, for example, could be very useful on intranets for controlling employees' access to information.
The PICS initiative addresses the dual problems of labelling documents on web servers and giving browsers the ability to interpret these labels, displaying a document only if it has a valid label attached. PICS purposely does not define the labels or give them attributes; for the system to work, content must therefore be classified, and it is this classification that gives the labels their "attributes". The classification of content is provided by a rating system.
Whilst a number of rating systems are available (RSACi and SafeSurf are two of the better known), most members of the PICS Consortium support the Recreational Software Advisory Council's (RSAC) content-labelling advisory system for the Internet (RSACi). The system aims to be objective and is based on the non-profit organisation's experience in developing a content rating system for the computer games industry. The RSACi rating system is available at no charge from RSAC's web site.
URL: RSACi http://www.rsac.org/
URL: SafeSurf http://www.SafeSurf.com/
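To make the mechanics concrete, a PICS 1.1 label names a rating service and then lists category/value pairs in a ratings clause. The Python sketch below parses a simplified, invented label using RSACi-style category letters (v, s, n, l) and shows how a "supervisor" profile could use the values to block a document; it is illustrative only, not a conforming PICS parser:

```python
import re

# Illustrative shape of a PICS 1.1 label: a rating service URL followed
# by category/value pairs. The label text is a simplified, invented
# example using RSACi-style categories (violence, sex, nudity, language).
label = '(PICS-1.1 "http://www.rsac.org/ratingsv01.html" labels ratings (v 1 s 0 n 0 l 2))'

def ratings(pics_label):
    """Extract the category/value pairs from the ratings clause of a label."""
    clause = re.search(r"ratings \(([^)]*)\)", pics_label).group(1)
    tokens = clause.split()
    return {tokens[i]: int(tokens[i + 1]) for i in range(0, len(tokens), 2)}

# A supervisor profile blocks a document whose ratings exceed its limits
# in any category (the limit values here are arbitrary).
limits = {"v": 2, "s": 1, "n": 1, "l": 1}
r = ratings(label)
blocked = any(r[c] > limits[c] for c in r)
print(r, "blocked:", blocked)
```

Note how the label itself carries no policy: the same label can be admitted by one supervisor profile and blocked by another, which is exactly the separation PICS intends.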
The labelling ability presented by PICS can be used at multiple levels and with the addition of digital signatures could ensure that the "groupware" aspects of information webs can be fully realised.
The labelling scheme adopted by the PICS initiative differs significantly in approach from "blocking" software (from such companies as SurfWatch and NetNanny), which acts at the client end by blocking the user's access. A good overview paper, "Rating the Net", gives a number of interesting trails to follow, although its evaluation of PICS and rating schemes is rather sketchy - it is best to read about these at the PICS and RSACi sites.
URL: "Rating the Net" http://www.msen.com/~weinberg/rating.htm
URL: SurfWatch http://www.SurfWatch.com/
URL: NetNanny http://www.NetNanny.com/
Publishers will engage in substantial commercial activity over electronic networks. To facilitate this, publications which are bought, sold or accessed must be identified according to an internationally accepted standard system. Such a system will enable multiple applications, such as the development of electronic copyright management systems, ordering and fulfilment, tracking, billing and payment schemes, bibliographic control and enforcement systems.
The Digital Object Identifier (DOI) project will aid in achieving these objectives, for further information visit the International Publishers Association, Information Identifier Committee pages, and the recently established DOI Web site. DOI has been implemented by a number of publishers, for an example of an implementation visit the John Wiley & Son DOI web page.
- The May 1999 issue of D-Lib Magazine features an article entitled: "Digital Object Identifier (DOI): Current Status and Outlook"
- Information Identifier Committee pages of the International Publishers Association.
- DOI Web site.
- John Wiley & Son DOI web page.
- Presentations (in pdf) from the Digital Object Identifier (DOI) Technology Forum, New York, 10 December.
- Report published on NISO/International DOI Foundation Joint Workshop, 7 May 1998, in Washington, USA, published on the IMPRIMATUR site.
Working under the umbrella of the DOI initiative, 12 scientific and scholarly publishers are collaborating on a reference-linking initiative that they believe will "change the way scientists use the Internet to conduct online research". Expected to launch during the first quarter of 2000, the service should enable researchers to link from references in a journal article to the content of a cited journal article, typically located on a different server and published by a different publisher.
At the outset, approximately three million articles across thousands of journals will be linked through this service, and more than half a million more articles will be linked each year thereafter. The reference-linking service will be run from a central facility which will be managed by an elected Board and will operate in cooperation with the International Digital Object Identifier (DOI) Foundation. It will contain a limited set of metadata, allowing the journal content and links to remain distributed at publishers' sites. Each publisher will set its own access standards, determining what content is available to the researcher following a link (such as access to the abstract or to the full text of an article, by subscription, document delivery, or pay-per-view, etc.).
The service is being organized as a not-for-profit entity to safeguard the independence of each participating publisher to set their own access standards and conditions. The service, which is based on a prototype developed by Wiley and Academic Press, takes advantage of the DOI standard.
Representatives of the participating publishers and the International DOI Foundation are in active discussions with other scientific and scholarly primary journal publishers to make this a broad-based, industry-wide initiative.
URL: DOI http://www.doi.org/
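In practical terms, a DOI is an identifier of the form prefix/suffix which is resolved by handing it to a resolver service such as the one at dx.doi.org. The Python sketch below simply forms the resolver URL; the DOI shown is invented for illustration:

```python
# The public DOI resolver of the era; appending a DOI to it redirects
# to the current location of the identified object.
RESOLVER = "http://dx.doi.org/"

def doi_url(doi):
    """Form the resolver URL for a DOI (a prefix/suffix pair)."""
    prefix, _, suffix = doi.partition("/")
    if not (prefix and suffix):
        raise ValueError("a DOI has the form prefix/suffix: %r" % doi)
    return RESOLVER + doi

# The DOI below is invented for illustration.
print(doi_url("10.1002/example.123"))
```

Because the resolver indirection is stable while publishers' sites are not, a citation carrying a DOI keeps working even when the cited article moves, which is what makes the reference-linking service described above feasible.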
The Meta Data Coalition is a consortium of software vendors and users with a common purpose of driving forward the definition, implementation and ongoing evolution of a metadata interchange format standard and its support mechanisms. The coalition is primarily interested in the use of metadata and its exchange between enterprise applications. It tends to concentrate on the key components of such applications, including: data warehousing, distributed client/server computing, databases (eg. relational, OLAP, OLTP), integrated enterprise-wide applications.
The Meta Data Coalition announced several new initiatives as part of its technical meeting held on November 11, 1999. In July, 1999, the membership ratified the MDC-OIM 1.0, which provides the basic meta-model for representing databases and the interrelationships between them. The new initiatives (further information on the MDC web site) will extend the model into several key areas such as business models and information portals. This will enable the integration of an even larger set of tools and business applications using the MDC-OIM and its XML interchange format.
Microsoft has joined the Meta Data Coalition (MDC), and: "transferred to the organisation the rights to maintain and evolve the Microsoft Open Information Model (OIM)", a specification for representing metadata. As part of this agreement, the coalition will integrate the OIM and MDC's Meta Data Interface Standard (MDIS) 1.1. The intention is to ensure interoperability of tools, applications and repositories implementing the standard.
The OIM specification, based on SQL, COM and Java, is part of the Microsoft Data Warehousing Framework, an architecture for integrating all aspects of decision support, including the building, managing and use of data warehouses and data marts. The MDC plans to make the unified specification available to its members for review in the first quarter of 1999; availability for implementation is expected in the third quarter of 1999.
The Mozilla RDF / Z39.50 Integration project, hosted by the Mozilla Organisation, is looking for participants. The organisers believe that the work will be of interest to all groups investing in any combination of Z39.50, Dublin Core and RDF technologies for the creation of resource discovery systems.
The project can loosely be characterised as an effort to incorporate the Z39.50 search and retrieval protocol into the next-generation Mozilla/Netscape web browser. Further details on the relationship between Netscape and its now open source browser software, Mozilla, are available on the web. For a project overview, further technical details, open issues and current status, see the project home page.
The development team wish to find a way of integrating Z39.50 search into the browser in such a way as to have search results show up in the user interface, quite possibly as a set of simple Dublin Core records. The solution is likely to draw heavily on work being carried out within the W3C on the Resource Description Framework (RDF).
URL: Mozilla and Netscape relationship http://www.mozilla.org/mission.html
URL: project description http://www.mozilla.org/rdf/doc/z3950.html
URL: RDF resources http://www.mozilla.org/rdf/doc/
URL: RDF resources (W3C) http://www.w3.org/RDF/
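As a sketch of what such integration might produce, the snippet below shows how a single search result could be expressed as a minimal Dublin Core record in RDF/XML, built here with Python's standard library. The document URL and field values are invented for illustration; the namespaces are the standard RDF and Dublin Core ones.

```python
# Sketch: serialising one search result as a minimal Dublin Core record
# in RDF/XML. Record fields and the rdf:about URL are invented for
# illustration; the namespaces are the standard RDF and DC ones.
import xml.etree.ElementTree as ET

RDF_NS = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
DC_NS = "http://purl.org/dc/elements/1.1/"

def dc_record(url, fields):
    """Build an rdf:Description carrying Dublin Core elements."""
    ET.register_namespace("rdf", RDF_NS)
    ET.register_namespace("dc", DC_NS)
    rdf = ET.Element("{%s}RDF" % RDF_NS)
    desc = ET.SubElement(rdf, "{%s}Description" % RDF_NS,
                         {"{%s}about" % RDF_NS: url})
    for name, value in fields.items():
        elem = ET.SubElement(desc, "{%s}%s" % (DC_NS, name))
        elem.text = value
    return ET.tostring(rdf, encoding="unicode")

xml = dc_record("http://example.org/doc/42",
                {"title": "Metadata Matters",
                 "creator": "A. N. Author",
                 "date": "1999-03-01"})
print(xml)
```

A browser could render a set of such records directly in its user interface, which is essentially what the project proposes.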
The Object Management Group (OMG) has extended its support for distributed metadata standards by publishing a Common Warehouse Metamodel (CWM) Specification. Metadata management, and reconciliation of inconsistent metadata when data from different sources are merged, are the biggest problems facing enterprises working with data warehousing today.
The OMG's CWM provides a standard solution to this problem. Building on three existing industry standards, namely the OMG's Unified Modeling Language (UML), XML, and the OMG's XML Metadata Interchange (XMI), the CWM starts by establishing a common metamodel for warehousing, then goes beyond this to also standardise the syntax and semantics needed for import, export, and other dynamic data warehousing operations.
The OMG's CWM Specification documents are available for viewing or downloading at the URLs given below. 11/07/00
URL: OMG http://www.omg.org/
URL: Corba http://www.corba.org/
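To make the idea of a standard XML interchange syntax concrete, the fragment below sketches how a tool might export simple warehouse table metadata as XML. The element names here are invented for illustration only; they are not the actual CWM or XMI vocabulary, which defines a far richer metamodel.

```python
# Illustrative sketch only: exchanging warehouse metadata as XML in the
# spirit of XMI-style interchange. The element names are invented and
# are NOT the actual CWM/XMI vocabulary.
import xml.etree.ElementTree as ET

def describe_table(name, columns):
    """Emit a small XML description of a warehouse table."""
    table = ET.Element("Table", {"name": name})
    for col_name, col_type in columns:
        ET.SubElement(table, "Column", {"name": col_name, "type": col_type})
    return ET.tostring(table, encoding="unicode")

doc = describe_table("sales", [("id", "INTEGER"), ("amount", "DECIMAL")])
print(doc)
```

The point of a shared metamodel is that any compliant tool can parse such a document back into the same structure, so metadata survives the trip between products from different vendors.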
The Meta Data Coalition (MDC) and the Object Management Group (OMG), two industry organisations with competing data warehousing standards, are to work together on a single standard for metadata and modeling in the areas of data warehousing and component-based development.
The merger of MDC into the OMG marks an agreement of the major data warehousing and metadata vendors to converge on one standard, incorporating the best of the MDC's Open Information Model (OIM) with the best of the OMG's Common Warehouse Metamodel (CWM).
When the work is complete, the resulting specification will be issued by the OMG as the next version of the CWM, a single standard allowing users to exchange metadata freely between different vendors' products. 02/10/00
URL: OMG http://www.omg.org/
URL: Corba http://www.corba.org/
URL: MDC http://www.MDCinfo.com/
Recognising the industrial significance of establishing agreements on metadata, the Metadata for Multimedia Workshop aims to provide "a forum for industrial players and research projects in Europe to form a consensus, by:
- gathering information on metadata activities of European projects and industry, plus international activities
- analysing current work including identification of overlaps and gaps
- disseminating information to European industry, projects and programmes
- generating recommendations, advice and workshop agreements."
The Secretariat responsible for the CEN/ISSS Workshop on Metadata for Multimedia Information (MMI) - Dublin Core publishes details of its meetings on its website.
URL: CEN/ISSS http://www.cenorm.be/isss/
The CEN/ISSS Workshop on Metadata for Multimedia Information - Dublin Core (MMI-DC) is turning its attention to the complex subject of defining management information for metadata. Projects based on the Dublin Core have a requirement for metadata, not only concerning the content of the described resource, but also about the metadata itself, e.g. information about the person or corporate body sending metadata.
The Dublin Core Metadata Initiative (DCMI) Administration Working Group has been established to propose an element set for the management of metadata based on the work of A-Core. The Working Group will research existing practice by: collecting examples of current use of administrative metadata; working to discover existing, defined element sets for the management of metadata; analysing these; and using the results to propose an element set for the management of metadata. 01/12/00
URL: MMI-DC http://www.cenorm.be/isss/Workshop/MMI-DC/
URL: DCMI Administration Working Group http://purl.org/dc/groups/admin.htm
URL: "The A-Core: Metadata about Content Metadata" http://metadata.net/admin/draft-iannella-admin-01.txt
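As an illustration of the distinction, the sketch below pairs a descriptive record with administrative fields about the metadata itself. The field names are hypothetical, loosely inspired by A-Core ideas; they do not represent the element set the Working Group will eventually propose.

```python
# Sketch: administrative metadata describes the metadata record itself
# (who created it, when, in what language), not the resource it
# describes. Field names are hypothetical, loosely following A-Core.

def with_admin(record, metadata_creator, date_created):
    """Wrap a descriptive record with administrative metadata."""
    return {
        "descriptive": record,   # e.g. a Dublin Core record about a resource
        "admin": {
            "metadata-creator": metadata_creator,  # who produced the metadata
            "date-created": date_created,          # when it was produced
            "language": "en",                      # language of the metadata
        },
    }

entry = with_admin({"title": "Metadata in preservation"},
                   "cataloguer@example.org", "2000-12-01")
```

A receiving service can then judge the provenance and currency of incoming metadata independently of the resource it describes.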
ISO/IEC JTC1 Subcommittee 32, Data Management and Interchange, has approved a Metadata Working Group (WG2). Eliot Christian (mailto:email@example.com) of the US Geological Survey will be Convenor of the group, whose scope was officially defined as follows:
"WG2 is responsible for standards that facilitate specification and management of metadata. Use of these standards will enhance the understanding and sharing of data, information, and processes to support, for example, interoperability, electronic commerce, and component-based development. The scope shall include:
The Schemas project is a two-year accompanying measure to the EC's 5th Framework Programme. The project's aim is to provide information regarding the status and use of new and emerging metadata standards, and to promote good practice guidelines for adapting metadata standards for local use in customised, implementation-specific schemas. The project website contains a growing list of resources and is maintained by UKOLN. 22/12/00
The results of the ETB (European Treasury Browser) survey run by the European Schoolnet initiative have been published, giving an idea of the current Europe-wide situation regarding educational servers and their use of metadata. The entire report, called Survey on School Educational Repositories, is downloadable from the web. 02/10/00
Cedars (CURL Exemplars in Digital Archives), a project in the UK's eLib research programme, has carried out a review of metadata in the area of digital preservation and published it as a report entitled: "Metadata in preservation". Preservation metadata, a specialised form of administrative metadata, can be used to store technical information that supports the preservation of digital objects, and at the same time can be used for rights and collection management activities.
The British Computer Society's electronic & multimedia publishing specialist group held a one-day seminar entitled: "Metadata Matters" during March 1999, and has published copies of most of the presentation slides on the web. Two presentations entitled: "Why metadata matters for libraries" and "Metadata and the Web", are available online at separate sites.
URL: seminar http://www.kcl.ac.uk/kis/support/cc/staff/malcolm/metadata.htm
URL: Why metadata matters http://www.ukoln.ac.uk/metadata/presentations/bcs/
URL: Metadata and the Web http://www.cs.ukc.ac.uk/people/staff/djb1/talks/bcs-metadata/
SiteMetrics carries out a quarterly "Web Content Survey" of commercial US Web sites. The second survey, released in early June 1998, examined content on the home pages of Web sites owned by 31,000 businesses in 14 different industries, with annual revenues from US$10 million to more than US$1 billion.
Overall, the survey found that only a third of the sites surveyed included the META keyword or description tags. Among Web sites revised within the previous two weeks, META keyword usage climbed to 39%. The study also found significant variation among the 14 industries surveyed: the Industrial Tech, Travel and Computer industries led with nearly 35% of Web sites using META keywords, while the lowest adoption rates were found in the Education and Utilities industries, at just 24%.
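A check of this kind is straightforward to automate. The sketch below scans an HTML page for META keyword and description tags using only Python's standard library; the sample page content is invented for illustration.

```python
# Sketch: detect <meta name="keywords"> and <meta name="description">
# tags in a page, the tags counted by surveys such as SiteMetrics'.
from html.parser import HTMLParser

class MetaTagFinder(HTMLParser):
    """Collect the names of META keyword/description tags seen."""
    def __init__(self):
        super().__init__()
        self.found = set()

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            name = (attrs.get("name") or "").lower()
            if name in ("keywords", "description"):
                self.found.add(name)

# Invented sample page for illustration.
page = """<html><head>
<meta name="keywords" content="metadata, Dublin Core">
<meta name="description" content="Metadata news and resources">
</head><body></body></html>"""

finder = MetaTagFinder()
finder.feed(page)
print(sorted(finder.found))   # prints ['description', 'keywords']
```

Run over a crawl of home pages, a counter of which names appear per site would reproduce the survey's adoption figures.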
El.pub - Interactive
Electronic Publishing R & D News and Resources
Edited by: Logical Events Limited, London, UK (www.logicalevents.co.uk)
Last updated: 1 December 2016
© 2016 Copyright and disclaimer El.pub and www.elpub.org are brand names owned by Logical Events Limited - no unauthorised use of them or the contents of this website is permitted without prior permission.