MPEG-7

MPEG-7 is a multimedia content description standard, standardized in ISO/IEC 15938 (Multimedia Content Description Interface).[1][2][3][4] The description is associated with the content itself, to allow fast and efficient searching for material that is of interest to the user.

It uses XML to store metadata and can be attached to timecode in order to tag particular events or to synchronise lyrics to a song, for example.
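The sketch below gives an outline of what such a timecoded annotation can look like. It is a minimal illustration built with Python's standard library; element names such as AudioSegment, MediaTimePoint and FreeTextAnnotation echo the flavour of MPEG-7's description tools, but the snippet is simplified and is not a schema-valid MPEG-7 document.

```python
# Minimal sketch of attaching a lyric line to a timecode, MPEG-7 style.
# Element names are simplified for illustration, not schema-valid MPEG-7.
import xml.etree.ElementTree as ET

description = ET.Element("Mpeg7")
segment = ET.SubElement(description, "AudioSegment")

# The timecode the annotation refers to (hours:minutes:seconds:frame/fps).
media_time = ET.SubElement(segment, "MediaTime")
ET.SubElement(media_time, "MediaTimePoint").text = "T00:01:23:0F25"

# The lyric line synchronised to that point in the song.
annotation = ET.SubElement(segment, "TextAnnotation")
ET.SubElement(annotation, "FreeTextAnnotation").text = "First line of the chorus"

print(ET.tostring(description, encoding="unicode"))
```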

A few application examples are digital libraries (e.g. image catalogues or music dictionaries), multimedia directory services, broadcast media selection, and multimedia editing.

The MPEG-7 standard was originally written in XML Schema (XSD), which constitutes semi-structured data.

For example, the running time of a movie annotated using MPEG-7 in XML is machine-readable data: software agents will know that the number expressing the running time is a positive integer. Such data is not machine-interpretable (it cannot be understood by agents), however, because it does not convey semantics (meaning), a problem known as the "Semantic Gap."
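The following toy example illustrates this distinction. The XML and its element names (Movie, RunningTime) are hypothetical, not taken from the MPEG-7 schema: an agent can parse the value and verify that it is a positive integer, but nothing in the data states that the value means a duration in minutes.

```python
# Toy illustration of the "Semantic Gap": the running time is machine-readable,
# but its meaning (a duration in minutes) is not expressed in the data itself.
# Element names are hypothetical, not taken from the MPEG-7 schema.
import xml.etree.ElementTree as ET

xml_doc = """<Movie>
  <Title>Example Film</Title>
  <RunningTime>118</RunningTime>
</Movie>"""

root = ET.fromstring(xml_doc)
running_time = int(root.findtext("RunningTime"))

# Syntactic checks are possible ...
assert running_time > 0

# ... but the fact that 118 denotes minutes of playback lives only in
# human-readable documentation, not in the XML the agent processes.
print(f"Parsed running time: {running_time}")
```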

To address this issue, there have been many attempts to map the MPEG-7 XML Schema to the Web Ontology Language (OWL), producing structured-data equivalents of the terms of the MPEG-7 standard (MPEG-7Ontos, COMM, SWIntO, etc.).
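The sketch below gives a rough idea of what such a mapping produces, using the third-party rdflib package. The namespace and the class and property names are hypothetical stand-ins, not terms defined by MPEG-7Ontos, COMM, or SWIntO.

```python
# Rough sketch of the output of an XSD-to-OWL mapping. The namespace and the
# class/property names are hypothetical and not taken from any published
# MPEG-7 ontology. Requires the third-party rdflib package.
from rdflib import Graph, Literal, Namespace, RDF, URIRef
from rdflib.namespace import OWL, XSD

MPEG7 = Namespace("http://example.org/mpeg7-owl#")  # hypothetical namespace

g = Graph()
g.bind("mpeg7", MPEG7)

# Declare an OWL class and a datatype property mirroring XSD terms.
g.add((MPEG7.AudioVisualSegment, RDF.type, OWL.Class))
g.add((MPEG7.runningTime, RDF.type, OWL.DatatypeProperty))

# Describe one movie instance; typing the literal as xsd:positiveInteger makes
# the "positive integer" constraint explicit and machine-checkable.
movie = URIRef("http://example.org/movies/example-film")
g.add((movie, RDF.type, MPEG7.AudioVisualSegment))
g.add((movie, MPEG7.runningTime, Literal(118, datatype=XSD.positiveInteger)))

print(g.serialize(format="turtle"))
```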

However, such a mapping does not in itself bridge the semantic gap: annotating an automatically extracted video feature, such as color distribution, still does not provide the meaning of the actual visual content.[9]

[Figure: Independence between description and content]
[Figure: Relation between different tools and elaboration process of MPEG-7]