Externalization Enhancement for Headphone-Reproduced Virtual Frontal and Rear Sound Images

Publication: Conference contribution › Paper › Research › Peer-reviewed

Authors

  • Song Li
  • Roman Schlieper
  • Jürgen Peissig


Details

Original language: English
Publication status: Published - 21 Aug 2019
Event: 2019 AES International Conference on Headphone Technology - San Francisco, United States
Duration: 27 Aug 2019 - 29 Aug 2019

Conference

Conference: 2019 AES International Conference on Headphone Technology
Country/Territory: United States
City: San Francisco
Period: 27 Aug 2019 - 29 Aug 2019

Abstract

Non-individual head-related transfer functions (HRTFs) are commonly used in binaural rendering systems because measuring personal HRTFs is difficult in consumer scenarios. However, the perceived externalization of such rendered virtual sound images is usually low, especially for frontal and rear sound sources. This study proposed a method for improving perceived externalization of virtual frontal and rear sound images presented over headphones, consisting mainly of direction-dependent peak and notch filters for direct sound parts, and filter bank based decorrelation filters for early reflection parts. The result of the subjective listening experiment showed that the perceived externalization of frontal and rear sound sources was substantially improved by our proposed method compared to the conventional method.
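The abstract describes two processing stages: direction-dependent peak and notch filtering of the direct-sound part, and decorrelation of the early-reflection part. The sketch below is only an illustration of those two building blocks, not the authors' actual implementation: it assumes a standard RBJ-cookbook peaking biquad for the peak/notch stage (illustrative center frequency, gain, and Q) and a random-phase, unit-magnitude FIR as a simple decorrelator in place of the paper's filter-bank design.

```python
import numpy as np
from scipy.signal import lfilter

def peaking_biquad(f0, gain_db, q, fs):
    """RBJ-cookbook peaking EQ: positive gain_db yields a peak, negative a notch-like dip."""
    A = 10.0 ** (gain_db / 40.0)
    w0 = 2.0 * np.pi * f0 / fs
    alpha = np.sin(w0) / (2.0 * q)
    b = np.array([1 + alpha * A, -2 * np.cos(w0), 1 - alpha * A])
    a = np.array([1 + alpha / A, -2 * np.cos(w0), 1 - alpha / A])
    return b / a[0], a / a[0]  # normalize so a[0] == 1

def random_phase_decorrelator(n_taps, rng):
    """Unit-magnitude, random-phase FIR: alters phase (decorrelates) without coloring the magnitude."""
    n_bins = n_taps // 2 + 1
    phase = rng.uniform(-np.pi, np.pi, n_bins)
    phase[0] = 0.0            # DC bin must be real
    if n_taps % 2 == 0:
        phase[-1] = 0.0       # Nyquist bin must be real for even lengths
    return np.fft.irfft(np.exp(1j * phase), n_taps)

fs = 48000
rng = np.random.default_rng(0)
x = rng.standard_normal(fs)                    # 1 s of noise as a stand-in input signal
b, a = peaking_biquad(8000.0, 6.0, 2.0, fs)    # illustrative +6 dB peak near 8 kHz
direct = lfilter(b, a, x)                      # spectrally shaped "direct sound" part
h = random_phase_decorrelator(512, rng)
reflections = np.convolve(x, h)[: len(x)]      # decorrelated "early reflection" part
```

In a binaural renderer, filters of this kind would be applied per direction (e.g. different peak/notch settings for frontal vs. rear sources) and the decorrelated copies mixed into the left and right ear signals; the parameter values above are placeholders for illustration only.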

Cite

Externalization Enhancement for Headphone-Reproduced Virtual Frontal and Rear Sound Images. / Li, Song; Schlieper, Roman; Peissig, Jürgen.
2019. Paper presented at 2019 AES International Conference on Headphone Technology, San Francisco, United States.


Li, S, Schlieper, R & Peissig, J 2019, 'Externalization Enhancement for Headphone-Reproduced Virtual Frontal and Rear Sound Images', Paper presented at 2019 AES International Conference on Headphone Technology, San Francisco, United States, 27 Aug 2019 - 29 Aug 2019. <https://www.aes.org/e-lib/browse.cfm?elib=20516>
Li, S., Schlieper, R., & Peissig, J. (2019). Externalization Enhancement for Headphone-Reproduced Virtual Frontal and Rear Sound Images. Paper presented at 2019 AES International Conference on Headphone Technology, San Francisco, United States. https://www.aes.org/e-lib/browse.cfm?elib=20516
Li S, Schlieper R, Peissig J. Externalization Enhancement for Headphone-Reproduced Virtual Frontal and Rear Sound Images. 2019. Paper presented at 2019 AES International Conference on Headphone Technology, San Francisco, United States.
Li, Song ; Schlieper, Roman ; Peissig, Jürgen. / Externalization Enhancement for Headphone-Reproduced Virtual Frontal and Rear Sound Images. Paper presented at 2019 AES International Conference on Headphone Technology, San Francisco, United States.
BibTeX
@conference{70e415083cce44189022a8b8f695338f,
title = "Externalization Enhancement for Headphone-Reproduced Virtual Frontal and Rear Sound Images",
abstract = "Non-individual head-related transfer functions (HRTFs) are commonly used in binaural rendering systems because measuring personal HRTFs is difficult in consumer scenarios. However, the perceived externalization of such rendered virtual sound images is usually low, especially for frontal and rear sound sources. This study proposed a method for improving perceived externalization of virtual frontal and rear sound images presented over headphones, consisting mainly of direction-dependent peak and notch filters for direct sound parts, and filter bank based decorrelation filters for early reflection parts. The result of the subjective listening experiment showed that the perceived externalization of frontal and rear sound sources was substantially improved by our proposed method compared to the conventional method.",
author = "Song Li and Roman Schlieper and J{\"u}rgen Peissig",
note = "Funding information: This work is supported by Huawei Innovation Research Program FLAGSHIP (HIRP FLAGSHIP) project. The authors would like to thank those who participated in the listening tests.; 2019 AES International Conference on Headphone Technology ; Conference date: 27-08-2019 Through 29-08-2019",
year = "2019",
month = aug,
day = "21",
language = "English",

}

RIS

TY - CONF

T1 - Externalization Enhancement for Headphone-Reproduced Virtual Frontal and Rear Sound Images

AU - Li, Song

AU - Schlieper, Roman

AU - Peissig, Jürgen

N1 - Funding information: This work is supported by Huawei Innovation Research Program FLAGSHIP (HIRP FLAGSHIP) project. The authors would like to thank those who participated in the listening tests.

PY - 2019/8/21

Y1 - 2019/8/21

N2 - Non-individual head-related transfer functions (HRTFs) are commonly used in binaural rendering systems because measuring personal HRTFs is difficult in consumer scenarios. However, the perceived externalization of such rendered virtual sound images is usually low, especially for frontal and rear sound sources. This study proposed a method for improving perceived externalization of virtual frontal and rear sound images presented over headphones, consisting mainly of direction-dependent peak and notch filters for direct sound parts, and filter bank based decorrelation filters for early reflection parts. The result of the subjective listening experiment showed that the perceived externalization of frontal and rear sound sources was substantially improved by our proposed method compared to the conventional method.

AB - Non-individual head-related transfer functions (HRTFs) are commonly used in binaural rendering systems because measuring personal HRTFs is difficult in consumer scenarios. However, the perceived externalization of such rendered virtual sound images is usually low, especially for frontal and rear sound sources. This study proposed a method for improving perceived externalization of virtual frontal and rear sound images presented over headphones, consisting mainly of direction-dependent peak and notch filters for direct sound parts, and filter bank based decorrelation filters for early reflection parts. The result of the subjective listening experiment showed that the perceived externalization of frontal and rear sound sources was substantially improved by our proposed method compared to the conventional method.

UR - http://www.scopus.com/inward/record.url?scp=85074588719&partnerID=8YFLogxK

M3 - Paper

T2 - 2019 AES International Conference on Headphone Technology

Y2 - 27 August 2019 through 29 August 2019

ER -