== Abstract ==
 
 
 
The increasing importance of outdoor applications such as driver assistance systems or video surveillance tasks has recently triggered the development of optical flow methods that aim at performing robustly under uncontrolled illumination. Most of these methods are based on patch-based features such as the normalized cross correlation, the census transform or the rank transform. They achieve their robustness by locally discarding both absolute brightness and contrast. In this paper, we follow an alternative strategy: instead of discarding potentially important image information, we propose a novel variational model that jointly estimates both illumination changes and optical flow. The key idea is to parametrize the illumination changes in terms of basis functions that are learned from training data. While such basis functions allow for a meaningful representation of illumination effects, they also help to distinguish real illumination changes from motion-induced brightness variations if supplemented by additional smoothness constraints. Experiments on the KITTI benchmark show the clear benefits of our approach. They not only demonstrate that it is possible to obtain meaningful basis functions, but also show state-of-the-art results for robust optical flow estimation.
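
To make the abstract's key idea more concrete, the sketch below illustrates how basis functions for illumination changes could be learned from training data. This is a minimal illustration under stated assumptions, not the authors' implementation: it assumes the basis is obtained by principal component analysis of brightness transfer functions sampled from training image pairs, and all function and variable names are hypothetical.

<syntaxhighlight lang="python">
# Minimal sketch (not the authors' code): learning a low-dimensional basis for
# brightness transfer functions (BTFs) from training data, as the abstract
# describes. The PCA-based learning and all names here are illustrative assumptions.
import numpy as np

def learn_btf_basis(training_btfs, num_basis=4):
    # training_btfs: (N, L) array, each row one brightness transfer function
    # sampled at L grey levels, e.g. estimated from aligned training image pairs.
    mean_btf = training_btfs.mean(axis=0)
    centered = training_btfs - mean_btf
    # Principal components of the centered BTFs serve as the learned basis functions.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean_btf, vt[:num_basis]

def apply_illumination_change(grey_values, mean_btf, basis, coefficients):
    # Reconstruct a transfer curve as mean + linear combination of basis functions,
    # then map the input grey values through it.
    btf = mean_btf + coefficients @ basis
    levels = np.linspace(0.0, 255.0, btf.shape[0])
    return np.interp(grey_values, levels, btf)

# Toy training set: 100 gamma-like brightness transfer functions on 256 grey levels.
levels = np.linspace(0.0, 255.0, 256)
gammas = np.random.uniform(0.7, 1.4, size=100)
training = np.stack([255.0 * (levels / 255.0) ** g for g in gammas])

mean_btf, basis = learn_btf_basis(training, num_basis=2)
print(basis.shape)  # (2, 256)
</syntaxhighlight>

In the joint variational model described in the abstract, the coefficients of such a basis would be estimated per pixel together with the optical flow, with additional smoothness constraints helping to separate real illumination changes from motion-induced brightness variations; the sketch above only covers learning and evaluating the basis.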
 
Document type: Part of book or chapter of book
 
 
== Full document ==
 
<pdf>Media:Draft_Content_737471888-beopen274-5241-document.pdf</pdf>
 
  
  
== Original document ==

The different versions of the original document can be found in:
* [http://www.mia.uni-saarland.de/Publications/demetz-eccv14.pdf http://www.mia.uni-saarland.de/Publications/demetz-eccv14.pdf]
 
* [http://link.springer.com/content/pdf/10.1007/978-3-319-10590-1_30 http://link.springer.com/content/pdf/10.1007/978-3-319-10590-1_30],
: [http://dx.doi.org/10.1007/978-3-319-10590-1_30 http://dx.doi.org/10.1007/978-3-319-10590-1_30] under the license http://www.springer.com/tdm

* [https://dblp.uni-trier.de/db/conf/eccv/eccv2014-1.html#DemetzSVWB14 https://dblp.uni-trier.de/db/conf/eccv/eccv2014-1.html#DemetzSVWB14],
: [http://www.mia.uni-saarland.de/Publications/demetz-eccv14.pdf http://www.mia.uni-saarland.de/Publications/demetz-eccv14.pdf],
: [https://link.springer.com/chapter/10.1007/978-3-319-10590-1_30 https://link.springer.com/chapter/10.1007/978-3-319-10590-1_30],
: [https://www.scipedia.com/public/Demetz_et_al_2014a https://www.scipedia.com/public/Demetz_et_al_2014a],
: [https://rd.springer.com/chapter/10.1007/978-3-319-10590-1_30 https://rd.springer.com/chapter/10.1007/978-3-319-10590-1_30],
: [https://doi.org/10.1007/978-3-319-10590-1_30 https://doi.org/10.1007/978-3-319-10590-1_30],
: [https://academic.microsoft.com/#/detail/195690893 https://academic.microsoft.com/#/detail/195690893]


== Document information ==

Published on 01/01/2014

Volume 2014, 2014
DOI: 10.1007/978-3-319-10590-1_30
Licence: CC BY-NC-SA
