Externalization Enhancement for Headphone-Reproduced Virtual Frontal and Rear Sound Images

Research output: Contribution to conference › Paper › Research › peer review

Authors

  • Song Li
  • Roman Schlieper
  • Jürgen Peissig

Details

Original language: English
Publication status: Published - 21 Aug 2019
Event: 2019 AES International Conference on Headphone Technology - San Francisco, United States
Duration: 27 Aug 2019 - 29 Aug 2019

Conference

Conference: 2019 AES International Conference on Headphone Technology
Country/Territory: United States
City: San Francisco
Period: 27 Aug 2019 - 29 Aug 2019

Abstract

Non-individual head-related transfer functions (HRTFs) are commonly used in binaural rendering systems because measuring personal HRTFs is difficult in consumer scenarios. However, the perceived externalization of virtual sound images rendered with them is usually low, especially for frontal and rear sound sources. This study proposes a method for improving the perceived externalization of virtual frontal and rear sound images presented over headphones, consisting mainly of direction-dependent peak and notch filters for the direct-sound parts and filter-bank-based decorrelation filters for the early-reflection parts. The results of a subjective listening experiment showed that the perceived externalization of frontal and rear sound sources was substantially improved by our proposed method compared to the conventional method.
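The two processing stages described in the abstract can be sketched in a few lines of Python with SciPy. This is a minimal illustration, not the authors' implementation: the split point between direct sound and early reflections, the peak/notch frequencies and Q values, and the all-pass decorrelator are all assumptions standing in for the paper's direction-dependent filters and filter-bank decorrelators.

```python
import numpy as np
from scipy.signal import iirpeak, iirnotch, lfilter

fs = 48000
rng = np.random.default_rng(0)

# Hypothetical BRIR stand-in: 100 ms of exponentially decaying noise.
n = int(0.1 * fs)
brir = rng.standard_normal(n) * np.exp(-np.arange(n) / (0.01 * fs))

# Split into direct sound (first ~2.5 ms) and early reflections (illustrative split).
split = int(0.0025 * fs)
direct, early = brir[:split], brir[split:]

# Stage 1: direction-dependent spectral cues on the direct part.
# A peak near 12 kHz and a notch near 8 kHz are illustrative values,
# not the ones from the paper.
b_p, a_p = iirpeak(12000, Q=4, fs=fs)
b_n, a_n = iirnotch(8000, Q=6, fs=fs)
direct_enh = lfilter(b_n, a_n, lfilter(b_p, a_p, direct))

# Stage 2: decorrelate the early reflections with a random-phase all-pass FIR
# (a simple stand-in for the paper's filter-bank-based decorrelation filters).
N = 1024
phase = rng.uniform(-np.pi, np.pi, N // 2 - 1)
spec = np.concatenate(([1.0], np.exp(1j * phase),
                       [1.0], np.exp(-1j * phase[::-1])))  # Hermitian, |H(f)| = 1
h_ap = np.real(np.fft.ifft(spec))
early_dec = np.convolve(early, h_ap)[:len(early)]

enhanced = np.concatenate([direct_enh, early_dec])
```

For the binaural case this would be applied per ear, with a differently seeded decorrelator on each channel so that the interaural coherence of the early reflections is reduced while the direct-sound localization cues are shaped by the peak/notch filters.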

Cite this

Externalization Enhancement for Headphone-Reproduced Virtual Frontal and Rear Sound Images. / Li, Song; Schlieper, Roman; Peissig, Jürgen.
2019. Paper presented at 2019 AES International Conference on Headphone Technology, San Francisco, United States.


Li, S, Schlieper, R & Peissig, J 2019, 'Externalization Enhancement for Headphone-Reproduced Virtual Frontal and Rear Sound Images', Paper presented at 2019 AES International Conference on Headphone Technology, San Francisco, United States, 27 Aug 2019 - 29 Aug 2019. <https://www.aes.org/e-lib/browse.cfm?elib=20516>
Li, S., Schlieper, R., & Peissig, J. (2019). Externalization Enhancement for Headphone-Reproduced Virtual Frontal and Rear Sound Images. Paper presented at 2019 AES International Conference on Headphone Technology, San Francisco, United States. https://www.aes.org/e-lib/browse.cfm?elib=20516
Li S, Schlieper R, Peissig J. Externalization Enhancement for Headphone-Reproduced Virtual Frontal and Rear Sound Images. 2019. Paper presented at 2019 AES International Conference on Headphone Technology, San Francisco, United States.
Li, Song ; Schlieper, Roman ; Peissig, Jürgen. / Externalization Enhancement for Headphone-Reproduced Virtual Frontal and Rear Sound Images. Paper presented at 2019 AES International Conference on Headphone Technology, San Francisco, United States.
@conference{70e415083cce44189022a8b8f695338f,
title = "Externalization Enhancement for Headphone-Reproduced Virtual Frontal and Rear Sound Images",
abstract = "Non-individual head-related transfer functions (HRTFs) are commonly used in binaural rendering systems because measuring personal HRTFs is difficult in consumer scenarios. However, the perceived externalization of such rendered virtual sound images is usually low, especially for frontal and rear sound sources. This study proposed a method for improving perceived externalization of virtual frontal and rear sound images presented over headphones, consisting mainly of direction-dependent peak and notch filters for direct sound parts, and filter bank based decorrelation filters for early reflection parts. The result of the subjective listening experiment showed that the perceived externalization of frontal and rear sound sources was substantially improved by our proposed method compared to the conventional method.",
author = "Song Li and Roman Schlieper and J{\"u}rgen Peissig",
note = "Funding information: This work is supported by Huawei Innovation Research Program FLAGSHIP (HIRP FLAGSHIP) project. The authors would like to thank those who participated in the listening tests.; 2019 AES International Conference on Headphone Technology ; Conference date: 27-08-2019 Through 29-08-2019",
year = "2019",
month = aug,
day = "21",
language = "English",

}


TY - CONF

T1 - Externalization Enhancement for Headphone-Reproduced Virtual Frontal and Rear Sound Images

AU - Li, Song

AU - Schlieper, Roman

AU - Peissig, Jürgen

N1 - Funding information: This work is supported by Huawei Innovation Research Program FLAGSHIP (HIRP FLAGSHIP) project. The authors would like to thank those who participated in the listening tests.

PY - 2019/8/21

Y1 - 2019/8/21

N2 - Non-individual head-related transfer functions (HRTFs) are commonly used in binaural rendering systems because measuring personal HRTFs is difficult in consumer scenarios. However, the perceived externalization of such rendered virtual sound images is usually low, especially for frontal and rear sound sources. This study proposed a method for improving perceived externalization of virtual frontal and rear sound images presented over headphones, consisting mainly of direction-dependent peak and notch filters for direct sound parts, and filter bank based decorrelation filters for early reflection parts. The result of the subjective listening experiment showed that the perceived externalization of frontal and rear sound sources was substantially improved by our proposed method compared to the conventional method.

AB - Non-individual head-related transfer functions (HRTFs) are commonly used in binaural rendering systems because measuring personal HRTFs is difficult in consumer scenarios. However, the perceived externalization of such rendered virtual sound images is usually low, especially for frontal and rear sound sources. This study proposed a method for improving perceived externalization of virtual frontal and rear sound images presented over headphones, consisting mainly of direction-dependent peak and notch filters for direct sound parts, and filter bank based decorrelation filters for early reflection parts. The result of the subjective listening experiment showed that the perceived externalization of frontal and rear sound sources was substantially improved by our proposed method compared to the conventional method.

UR - http://www.scopus.com/inward/record.url?scp=85074588719&partnerID=8YFLogxK

M3 - Paper

T2 - 2019 AES International Conference on Headphone Technology

Y2 - 27 August 2019 through 29 August 2019

ER -