Abstract

Event cameras are bio-inspired vision sensors that output pixel-level brightness changes instead of standard intensity frames. These cameras do not suffer from motion blur and have a very high dynamic range, which enables them to provide reliable visual information during high-speed motions or in scenes characterized by high dynamic range. These features, along with very low power consumption, make event cameras an ideal complement to standard cameras for VR/AR and video game applications. With these applications in mind, this paper tackles the problem of accurate, low-latency tracking of an event camera from an existing photometric depth map (i.e., intensity plus depth information) built via classic dense reconstruction pipelines. Our approach tracks the 6-DOF pose of the event camera upon the arrival of each event, thus virtually eliminating latency. We successfully evaluate the method in both indoor and outdoor scenes and show that, because of the technological advantages of the event camera, our pipeline works in scenes characterized by high-speed motion, which are still inaccessible to standard cameras.
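The sensing principle summarized above (per-pixel brightness-change events instead of frames) is commonly described by a contrast-threshold model: a pixel fires an event whenever its log-intensity drifts by a fixed threshold C from its value at the previous event. The single-pixel setup, discrete sampling, and threshold value below are illustrative assumptions, not the paper's implementation:

```python
def generate_events(log_intensity, C=0.25):
    """Return (sample_index, polarity) events for one pixel.

    Hypothetical, simplified model: an event fires each time the
    log-intensity changes by the contrast threshold C relative to its
    value at the previous event (no noise, no refractory period).
    """
    events = []
    ref = log_intensity[0]          # log-intensity at the last event
    for k in range(1, len(log_intensity)):
        L = log_intensity[k]
        while L - ref >= C:         # brightness rose past the threshold
            ref += C
            events.append((k, +1))  # positive-polarity event
        while ref - L >= C:         # brightness fell past the threshold
            ref -= C
            events.append((k, -1))  # negative-polarity event
    return events

# A linear brightness ramp produces evenly spaced positive events:
ramp = [k / 8 for k in range(9)]    # 0.0, 0.125, ..., 1.0 (exact binary fractions)
print(generate_events(ramp))        # [(2, 1), (4, 1), (6, 1), (8, 1)]
```

Because events are emitted asynchronously as brightness crosses each threshold, a tracker can update the pose estimate per event rather than per frame, which is what enables the virtually zero latency claimed in the abstract.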

Comment: 12 pages, 13 figures, 2 tables. (In press.)


Original document

The different versions of the original document can be found at:

http://dx.doi.org/10.1109/tpami.2017.2769655
https://www.arxiv.org/abs/1607.03468
https://ieeexplore.ieee.org/document/8094962
https://www.zora.uzh.ch/id/eprint/150432
https://www.ncbi.nlm.nih.gov/pubmed/29990121
http://europepmc.org/abstract/MED/29990121
https://ui.adsabs.harvard.edu/abs/2016arXiv160703468G/abstract
https://www.research-collection.ethz.ch/handle/20.500.11850/217074
https://infoscience.epfl.ch/record/232970
https://doi.org/10.1109/TPAMI.2017.2769655
http://ieeexplore.ieee.org/document/8094962
https://www.arxiv-vanity.com/papers/1607.03468
https://fr.arxiv.org/abs/1607.03468
https://128.84.21.199/abs/1607.03468
https://infosciences.epfl.ch/record/232970?ln=en
https://academic.microsoft.com/#/detail/2766013930


DOIs: 10.1109/tpami.2017.2769655, 10.5167/uzh-150432


Document information

Published on 01/01/2017

Volume 2017, 2017
DOI: 10.1109/tpami.2017.2769655
Licence: CC BY-NC-SA
