Color-Aware Deep Temporal Backdrop Duplex Matting System

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review

Authors

Hendrik Hachmann, Bodo Rosenhahn

Details

Original language: English
Title of host publication: MMSys '23
Subtitle of host publication: Proceedings of the 14th Conference on ACM Multimedia Systems
Pages: 205-216
Number of pages: 12
ISBN (electronic): 9798400701481
Publication status: Published - 8 Jun 2023
Event: 14th ACM Multimedia Systems Conference, MMSys 2023 - Vancouver, Canada
Duration: 7 Jun 2023 - 10 Jun 2023

Abstract

Deep learning-based alpha matting has shown tremendous improvements in recent years, yet feature-film production studios still rely on classical chroma keying, including costly post-production steps. This discrepancy can be explained by missing links necessary for production that are currently not adequately addressed by the alpha matting community, in particular foreground color estimation and color spill compensation. We propose a neural network-based temporal multi-backdrop production system that combines the beneficial features of chroma keying and alpha matting. Given two consecutive frames with different background colors, our one-encoder-dual-decoder network predicts foreground colors and alpha values using a patch-based overlap-blend approach. The system handles imprecise backdrops, dynamic cameras, and dynamic foregrounds, and places no restrictions on foreground colors. We compare our method to state-of-the-art algorithms on benchmark datasets and on a video sequence captured with a demonstrator setup, and verify that a dual-backdrop input is superior to the commonly used trimap-based approach. In addition, the proposed studio set is actor-friendly and produces high-quality, temporally consistent alpha and color estimates with superior color spill compensation.
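
For context, dual-backdrop matting builds on the classical triangulation-matting relation: if the same foreground is observed in front of two known, sufficiently different backdrop colors, alpha and the foreground color are determined per pixel without a trimap. The sketch below uses generic symbols (composites I_1, I_2, backdrops B_1, B_2, alpha, foreground F) and states only this textbook relation; it is not the authors' network, which instead learns to predict these quantities from two consecutive frames under imprecise, dynamic conditions.

\[ I_k = \alpha F + (1 - \alpha) B_k, \qquad k \in \{1, 2\} \]
\[ I_1 - I_2 = (1 - \alpha)(B_1 - B_2) \;\Rightarrow\; \alpha = 1 - \frac{(I_1 - I_2) \cdot (B_1 - B_2)}{\lVert B_1 - B_2 \rVert^2}, \qquad \alpha F = I_1 - (1 - \alpha) B_1 \]

Recovering both alpha and the premultiplied foreground color is what enables spill-free recomposition over a new background, which is the property the proposed system aims to retain while relaxing the requirement of perfectly known, static backdrops.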

Keywords

    alpha matting, color spill, neural networks, virtual reality

Cite this

Color-Aware Deep Temporal Backdrop Duplex Matting System. / Hachmann, Hendrik; Rosenhahn, Bodo.
MMSys '23: Proceedings of the 14th Conference on ACM Multimedia Systems. 2023. p. 205-216.

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review

Hachmann, H & Rosenhahn, B 2023, Color-Aware Deep Temporal Backdrop Duplex Matting System. in MMSys '23: Proceedings of the 14th Conference on ACM Multimedia Systems. pp. 205-216, 14th ACM Multimedia Systems Conference, MMSys 2023, Vancouver, Canada, 7 Jun 2023. https://doi.org/10.48550/arXiv.2306.02954, https://doi.org/10.1145/3587819.3590973
Hachmann, H., & Rosenhahn, B. (2023). Color-Aware Deep Temporal Backdrop Duplex Matting System. In MMSys '23: Proceedings of the 14th Conference on ACM Multimedia Systems (pp. 205-216). https://doi.org/10.48550/arXiv.2306.02954, https://doi.org/10.1145/3587819.3590973
Hachmann H, Rosenhahn B. Color-Aware Deep Temporal Backdrop Duplex Matting System. In MMSys '23: Proceedings of the 14th Conference on ACM Multimedia Systems. 2023. p. 205-216 doi: 10.48550/arXiv.2306.02954, 10.1145/3587819.3590973
Hachmann, Hendrik ; Rosenhahn, Bodo. / Color-Aware Deep Temporal Backdrop Duplex Matting System. MMSys '23: Proceedings of the 14th Conference on ACM Multimedia Systems. 2023. pp. 205-216
@inproceedings{6a33976b7af34f13a99516b76b9c90d1,
title = "Color-Aware Deep Temporal Backdrop Duplex Matting System",
abstract = "Deep learning-based alpha matting showed tremendous improvements in recent years, yet, feature film production studios still rely on classical chroma keying including costly post-production steps. This perceived discrepancy can be explained by some missing links necessary for production which are currently not adequately addressed in the alpha matting community, in particular foreground color estimation or color spill compensation. We propose a neural network-based temporal multi-backdrop production system that combines beneficial features from chroma keying and alpha matting. Given two consecutive frames with different background colors, our one-encoder-dual-decoder network predicts foreground colors and alpha values using a patch-based overlap-blend approach. The system is able to handle imprecise backdrops, dynamic cameras, and dynamic foregrounds and has no restrictions on foreground colors. We compare our method to state-of-The-Art algorithms using benchmark datasets and a video sequence captured by a demonstrator setup. We verify that a dual backdrop input is superior to the usually applied trimap-based approach. In addition, the proposed studio set is actor friendly, and produces high-quality, temporal consistent alpha and color estimations that include a superior color spill compensation.",
keywords = "alpha matting, color spill, neural networks, virtual reality",
author = "Hendrik Hachmann and Bodo Rosenhahn",
note = "Funding Information: This work was supported by the Federal Ministry of Education and Research (BMBF), Germany under the project LeibnizKILabor (grant no. 01DD20003) and the AI service center KISSKI (grant no. 01IS22093C), the Center for Digital Innovations (ZDIN) and the Deutsche Forschungsgemeinschaft (DFG) under Germany{\textquoteright}s Excellence Strategy within the Cluster of Excellence PhoenixD (EXC 2122).; 14th ACM Multimedia Systems Conference, MMSys 2023 ; Conference date: 07-06-2023 Through 10-06-2023",
year = "2023",
month = jun,
day = "8",
doi = "10.48550/arXiv.2306.02954",
language = "English",
pages = "205--216",
booktitle = "MMSys '23",

}


TY - GEN

T1 - Color-Aware Deep Temporal Backdrop Duplex Matting System

AU - Hachmann, Hendrik

AU - Rosenhahn, Bodo

N1 - Funding Information: This work was supported by the Federal Ministry of Education and Research (BMBF), Germany under the project LeibnizKILabor (grant no. 01DD20003) and the AI service center KISSKI (grant no. 01IS22093C), the Center for Digital Innovations (ZDIN) and the Deutsche Forschungsgemeinschaft (DFG) under Germany’s Excellence Strategy within the Cluster of Excellence PhoenixD (EXC 2122).

PY - 2023/6/8

Y1 - 2023/6/8

N2 - Deep learning-based alpha matting showed tremendous improvements in recent years, yet, feature film production studios still rely on classical chroma keying including costly post-production steps. This perceived discrepancy can be explained by some missing links necessary for production which are currently not adequately addressed in the alpha matting community, in particular foreground color estimation or color spill compensation. We propose a neural network-based temporal multi-backdrop production system that combines beneficial features from chroma keying and alpha matting. Given two consecutive frames with different background colors, our one-encoder-dual-decoder network predicts foreground colors and alpha values using a patch-based overlap-blend approach. The system is able to handle imprecise backdrops, dynamic cameras, and dynamic foregrounds and has no restrictions on foreground colors. We compare our method to state-of-the-art algorithms using benchmark datasets and a video sequence captured by a demonstrator setup. We verify that a dual backdrop input is superior to the usually applied trimap-based approach. In addition, the proposed studio set is actor friendly, and produces high-quality, temporal consistent alpha and color estimations that include a superior color spill compensation.

AB - Deep learning-based alpha matting showed tremendous improvements in recent years, yet, feature film production studios still rely on classical chroma keying including costly post-production steps. This perceived discrepancy can be explained by some missing links necessary for production which are currently not adequately addressed in the alpha matting community, in particular foreground color estimation or color spill compensation. We propose a neural network-based temporal multi-backdrop production system that combines beneficial features from chroma keying and alpha matting. Given two consecutive frames with different background colors, our one-encoder-dual-decoder network predicts foreground colors and alpha values using a patch-based overlap-blend approach. The system is able to handle imprecise backdrops, dynamic cameras, and dynamic foregrounds and has no restrictions on foreground colors. We compare our method to state-of-the-art algorithms using benchmark datasets and a video sequence captured by a demonstrator setup. We verify that a dual backdrop input is superior to the usually applied trimap-based approach. In addition, the proposed studio set is actor friendly, and produces high-quality, temporal consistent alpha and color estimations that include a superior color spill compensation.

KW - alpha matting

KW - color spill

KW - neural networks

KW - virtual reality

UR - http://www.scopus.com/inward/record.url?scp=85163695040&partnerID=8YFLogxK

U2 - 10.48550/arXiv.2306.02954

DO - 10.48550/arXiv.2306.02954

M3 - Conference contribution

AN - SCOPUS:85163695040

SP - 205

EP - 216

BT - MMSys '23

T2 - 14th ACM Multimedia Systems Conference, MMSys 2023

Y2 - 7 June 2023 through 10 June 2023

ER -
