Surveillance Application Metadata Format
Metadata interoperability is crucial for various kinds of surveillance applications and systems, e.g. metadata mining in multi-sensor environments, metadata exchange in networked camera systems, or information fusion in multi-sensor and multi-detector environments.
Different metadata formats have been proposed to foster metadata interoperability, but they show significant limitations: some support only the visual modality, some represent metadata only in a frame-based approach, and others are too complex.
To overcome these limitations of existing formats we developed the Surveillance Application Metadata (SAM) model. It is capable of describing online and offline analysis results as a set of timelines containing events. A set of sensors, detectors, recorded media items and object instances is described centrally and linked from the event descriptions. The timelines can be related to a subset of sensors and detectors for any modality and different levels of abstraction. The model supports efficient representation of dense spatio-temporal information such as object trajectories.
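The structure described above (central descriptions of sensors and objects, plus timelines of events that link into them) can be sketched roughly as follows. All class and field names here are illustrative assumptions, not the actual SAM schema.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the SAM structure; names are illustrative only.

@dataclass
class Sensor:
    sensor_id: str
    modality: str          # e.g. "video", "audio"

@dataclass
class ObjectInstance:
    object_id: str
    object_class: str      # e.g. "person", "vehicle"

@dataclass
class Event:
    start: float           # position on the timeline, in seconds
    end: float
    event_type: str
    # Events link by id into the central descriptions rather than
    # embedding sensor/object data redundantly.
    sensor_ids: list = field(default_factory=list)
    object_ids: list = field(default_factory=list)

@dataclass
class Timeline:
    name: str
    sensor_ids: list       # the subset of sensors this timeline relates to
    events: list = field(default_factory=list)

@dataclass
class SamDocument:
    # Central, de-duplicated descriptions, keyed by id.
    sensors: dict = field(default_factory=dict)
    objects: dict = field(default_factory=dict)
    timelines: list = field(default_factory=list)

# Usage: one camera, one tracked object, one event on a timeline.
doc = SamDocument()
doc.sensors["cam1"] = Sensor("cam1", "video")
doc.objects["obj1"] = ObjectInstance("obj1", "person")
tl = Timeline("cam1-events", ["cam1"])
tl.events.append(Event(0.0, 2.5, "object_appeared", ["cam1"], ["obj1"]))
doc.timelines.append(tl)
```

The design choice worth noting is the indirection: because events reference sensors and objects by id, the same object instance can appear in events on many timelines without duplication.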
A more detailed description of the SAM data model can be found at
- SAM - An Interoperable Metadata Model for Multimodal Surveillance Applications. SPIE conference on Data Mining, Intrusion Detection, Information Security and Assurance, and Data Networks Security, 2009.
SAM is not bound to a specific serialization but can be mapped to different existing formats within the limitations imposed by the target format. A native XML-based serialization format (SAM.F) has been defined.
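As a rough illustration of what an XML serialization of such a model might look like, the snippet below emits a minimal document with Python's standard-library `ElementTree`. The element and attribute names are invented for this sketch and do not reflect the actual SAM.F schema.

```python
import xml.etree.ElementTree as ET

# Build a small, hypothetical SAM-like XML document.
# Element/attribute names are assumptions, not the real SAM.F schema.
root = ET.Element("sam")

sensors = ET.SubElement(root, "sensors")
ET.SubElement(sensors, "sensor", id="cam1", modality="video")

# A timeline that relates to a subset of sensors via an id reference.
timeline = ET.SubElement(root, "timeline", sensorRefs="cam1")
ET.SubElement(timeline, "event", type="object_appeared",
              start="0.0", end="2.5")

xml_text = ET.tostring(root, encoding="unicode")
print(xml_text)
```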
In SAM, hierarchical classification schemes are used for many purposes, such as types of properties and their values, event types, object classes, and coordinate systems, in order to allow for application-specific adaptations without modifying the data model while ensuring the controlled use of terms, e.g.
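The idea of a hierarchical classification scheme acting as a controlled vocabulary can be sketched as below. This is a minimal assumption-based illustration, not the SAM specification: terms form a tree, applications may register their own terms, and only registered terms are considered valid.

```python
# Hypothetical sketch of a hierarchical classification scheme:
# a tree of terms used as a controlled vocabulary.

class ClassificationScheme:
    def __init__(self, name):
        self.name = name
        self.parent_of = {}  # term -> parent term (None for root terms)

    def add_term(self, term, parent=None):
        # Application-specific adaptation: new terms can be registered
        # without changing the data model itself.
        if parent is not None and parent not in self.parent_of:
            raise ValueError(f"unknown parent term: {parent}")
        self.parent_of[term] = parent

    def is_valid(self, term):
        # Controlled use of terms: only registered terms are accepted.
        return term in self.parent_of

    def ancestors(self, term):
        # Walk up the hierarchy, e.g. "truck" -> ["vehicle"].
        chain = []
        parent = self.parent_of.get(term)
        while parent is not None:
            chain.append(parent)
            parent = self.parent_of[parent]
        return chain

# An application-specific scheme for object classes (example terms only):
objects = ClassificationScheme("object-classes")
objects.add_term("vehicle")
objects.add_term("truck", parent="vehicle")
```

Because validation and hierarchy traversal live in the scheme, detectors can agree on terminology at whatever level of abstraction they support, e.g. one reporting `vehicle` and another the more specific `truck`.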