Illumination-Guided Video Composition via Gradient Consistency Optimization.


Journal

IEEE transactions on image processing : a publication of the IEEE Signal Processing Society
ISSN: 1941-0042
Abbreviated title: IEEE Trans Image Process
Country: United States
NLM ID: 9886191

Publication information

Publication date:
20 May 2019
History:
pubmed: 21 May 2019
medline: 21 May 2019
entrez: 21 May 2019
Status: ahead of print

Abstract

Video composition aims to clone a patch from a source video into a target scene to create a seamless and harmonious blended frame sequence. Previous video composition methods usually suffer from artifacts around the blending region and from loss of spatio-temporal consistency when illumination intensity varies between the source and target videos. We propose an illumination-guided video composition method built on a unified spatial and temporal optimization framework. Our method produces globally consistent composition results while maintaining temporal coherency. We first compute a spatio-temporal blending boundary iteratively. For each frame, the gradient fields of the target and source frames are mixed adaptively based on the gradients and the inter-frame color difference. Temporal consistency is further obtained by optimizing luminance gradients across all composition frames. Moreover, we extend mean-value cloning by smoothing discrepancies between the source and target frames, and then exponentially attenuate color-distribution overflow to reduce falsely blended pixels. Various experiments demonstrate the effectiveness and high-quality performance of our illumination-guided composition.
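The adaptive gradient-mixing step described above can be sketched in simplified form. The snippet below is a minimal illustration, not the paper's actual algorithm: it picks, per pixel and per axis, whichever of the source or target gradients has the larger magnitude (the classic mixed-gradient rule), whereas the paper additionally weights the choice by inter-frame color difference, which is omitted here. The function name `mixed_gradients` is hypothetical.

```python
import numpy as np

def mixed_gradients(source, target):
    """Per-pixel mixed gradient field for two grayscale frames.

    Simplified sketch: keep whichever gradient (source or target) has
    the larger magnitude at each pixel and axis. The paper's method
    also uses inter-frame color differences, not modeled here.
    """
    # Forward differences, padded to keep the input shape.
    gx_s = np.diff(source, axis=1, append=source[:, -1:])
    gy_s = np.diff(source, axis=0, append=source[-1:, :])
    gx_t = np.diff(target, axis=1, append=target[:, -1:])
    gy_t = np.diff(target, axis=0, append=target[-1:, :])

    # Mixed-gradient rule: select the stronger gradient per pixel.
    gx = np.where(np.abs(gx_s) >= np.abs(gx_t), gx_s, gx_t)
    gy = np.where(np.abs(gy_s) >= np.abs(gy_t), gy_s, gy_t)
    return gx, gy
```

In a full gradient-domain pipeline, the mixed field (gx, gy) would serve as the guidance field of a Poisson solve (or mean-value cloning) that reconstructs the blended frame.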

Identifiers

pubmed: 31107653
doi: 10.1109/TIP.2019.2916769

Publication types

Journal Article

Languages

eng

Authors

MeSH classifications