Result detail

Distributed Visual Sensor Network Fusion

CHMELAŘ, P.; ZENDULKA, J. Distributed Visual Sensor Network Fusion. 4th Joint Workshop on Multimodal Interaction and Related Machine Learning Algorithms. Brno: 2007. 2 pp.
Type
abstract
Language
Czech
Authors
Chmelař Petr, Ing., UIFS (FIT)
Zendulka Jaroslav, doc. Ing., CSc., UIFS (FIT)
Abstract

The poster deals with a framework for a distributed visual sensor network metadata management system. It is assumed that data coming from many cameras is annotated by computer vision modules to produce metadata representing moving objects and their states. The data is supposed to be noisy and uncertain, and some states may be missing. Firstly, the spatio-temporal data cleaning using the Kalman filter is described. Secondly, the fusion of many visual sensors and persistent object tracking within a large area are addressed. Thirdly, the data and architecture model is described.
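
For illustration only (not part of the record): a minimal sketch of the spatio-temporal data cleaning step the abstract mentions, assuming a constant-velocity Kalman filter over noisy (x, y) detections of a single tracked object. The state layout, noise settings and frame period below are assumptions, not details taken from the poster.

import numpy as np

# Assumed object state: [x, y, vx, vy]; one predict/update cycle per frame.
dt = 1.0                                   # assumed frame period
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)  # constant-velocity transition
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)  # only the position is observed
Q = 0.01 * np.eye(4)                       # assumed process noise
R = 1.0 * np.eye(2)                        # assumed measurement noise

x = np.zeros(4)                            # initial state estimate
P = 100.0 * np.eye(4)                      # initial uncertainty

def step(x, P, z=None):
    """One Kalman cycle; z is None when a detection is missing."""
    x = F @ x                              # predict
    P = F @ P @ F.T + Q
    if z is not None:                      # update only if a measurement arrived
        y = z - H @ x                      # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
        x = x + K @ y
        P = (np.eye(4) - K @ H) @ P
    return x, P

# Noisy detections with one missing frame; the filter bridges the gap.
for z in [np.array([0.0, 0.0]), np.array([1.2, 0.9]), None, np.array([3.1, 2.8])]:
    x, P = step(x, P, z)
    print(x[:2])                           # cleaned position estimate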

Keywords

Visual sensor, distributed network, metadata management, moving objects, spatio-temporal data, Kalman filter, sensor fusion, object tracking, large area surveillance system.

Keywords in English

Visual sensor, distributed network, metadata management, moving objects, spatio-temporal data, Kalman filter, sensor fusion, object tracking, large area surveillance system.

Year
2007
Pages
2
Book
4th Joint Workshop on Multimodal Interaction and Related Machine Learning Algorithms
Conference
Machine Learning and Multimodal Interaction
Place
Brno
BibTeX
@misc{BUT192634,
  author="Petr {Chmelař} and Jaroslav {Zendulka}",
  title="Distributed Visual Sensor Network Fusion",
  booktitle="4th Joint Workshop on Multimodal Interaction and Related Machine Learning Algorithms",
  year="2007",
  pages="2",
  address="Brno",
  note="Abstract"
}
Projects
Security-Oriented Research in Information Technology (Výzkum informačních technologií z hlediska bezpečnosti), MŠMT, institutional funds of the Czech state budget (e.g. research plans, research centres), MSM0021630528, start: 2007-01-01, end: 2013-12-31, in progress
Research groups
Department