Details
Original language | English |
---|---|
Title of host publication | Neural Information Processing |
Subtitle | 30th International Conference, ICONIP 2023 |
Editors | Biao Luo, Long Cheng, Zheng-Guang Wu, Hongyi Li, Chaojie Li |
Publisher | Springer Science and Business Media Deutschland GmbH |
Pages | 453-469 |
Number of pages | 17 |
ISBN (electronic) | 978-981-99-8145-8 |
ISBN (print) | 978-981-99-8144-1 |
Publication status | Published - 2023 |
Event | 30th International Conference on Neural Information Processing, ICONIP 2023 - Changsha, China. Duration: 20 Nov 2023 → 23 Nov 2023 |
Publication series
Name | Communications in Computer and Information Science |
---|---|
Volume | 1965 CCIS |
ISSN (print) | 1865-0929 |
ISSN (electronic) | 1865-0937 |
Abstract
Multi-label learning (MLL) refers to a learning task where each instance is associated with a set of labels. However, in most real-world applications the labeling process is very expensive and time-consuming. Partial multi-label learning (PML) refers to MLL where only a part of the candidate labels are correctly annotated and the rest are false positives. The main purpose of PML is to learn from and predict unseen multi-label data at a lower annotation cost. To resolve the ambiguities in the label set, popular existing PML research attempts to estimate a confidence for each candidate label. These methods mainly perform disambiguation by considering correlations among labels and/or features. However, because of the noisy labels in PML, the true correlation among labels is corrupted, and such methods can easily be misled by noisy false-positive labels. In this paper, we propose a Partial Multi-Label learning method via Constraint Clustering (PML-CC) that addresses PML based on the underlying structure of the data. PML-CC gradually extracts high-confidence labels and then uses them to extract the remaining labels. To find the high-confidence labels, it solves PML as a clustering task, treating the information extracted in previous steps as constraints. In each step, PML-CC updates the already-extracted labels and uses them to extract the remaining ones. Experimental results show that our method successfully tackles PML tasks and outperforms state-of-the-art methods on artificial and real-world datasets.
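The abstract's core idea — cluster instances by their features, then trust a candidate label only if most members of the instance's cluster share it — can be sketched as follows. This is a toy illustration, not the paper's algorithm: it substitutes plain (unconstrained) k-means for the constraint clustering, performs a single confidence pass instead of the iterative refinement, and all function and variable names are invented for the example.

```python
import numpy as np

def extract_confident_labels(X, Y, n_clusters=2, threshold=0.6, n_iter=20):
    """Toy PML disambiguation: cluster instances, then keep a candidate
    label only if most members of the instance's cluster also carry it."""
    X = np.asarray(X, dtype=float)   # features, shape (n, d)
    Y = np.asarray(Y)                # candidate label matrix, shape (n, q)
    # Plain Lloyd's k-means as a stand-in for the constrained clustering
    # used in the paper (deterministic init from the first points).
    centers = X[:n_clusters].copy()
    for _ in range(n_iter):
        dist = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        assign = dist.argmin(axis=1)
        for k in range(n_clusters):
            if (assign == k).any():
                centers[k] = X[assign == k].mean(axis=0)
    # Confidence of a label = fraction of same-cluster instances whose
    # candidate set contains that label.
    conf = np.zeros(Y.shape, dtype=float)
    for k in range(n_clusters):
        mask = assign == k
        conf[mask] = Y[mask].mean(axis=0)
    # Keep only candidate labels whose confidence clears the threshold.
    return (conf * Y) >= threshold
```

On two well-separated toy clusters, a false-positive label carried by a single instance falls below the cluster-consensus threshold and is dropped, while labels shared across a cluster are kept as high-confidence.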
ASJC Scopus subject areas
Cite
Neural Information Processing: 30th International Conference, ICONIP 2023. ed. / Biao Luo; Long Cheng; Zheng-Guang Wu; Hongyi Li; Chaojie Li. Springer Science and Business Media Deutschland GmbH, 2023. pp. 453-469 (Communications in Computer and Information Science; Vol. 1965 CCIS).
Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › Peer-reviewed
TY - GEN
T1 - Partial Multi-label Learning via Constraint Clustering
AU - Siahroudi, Sajjad Kamali
AU - Kudenko, Daniel
N1 - Funding Information: This work has been partially supported by the Volkswagen Foundation.
PY - 2023
Y1 - 2023
N2 - Multi-label learning (MLL) refers to a learning task where each instance is associated with a set of labels. However, in most real-world applications the labeling process is very expensive and time-consuming. Partial multi-label learning (PML) refers to MLL where only a part of the candidate labels are correctly annotated and the rest are false positives. The main purpose of PML is to learn from and predict unseen multi-label data at a lower annotation cost. To resolve the ambiguities in the label set, popular existing PML research attempts to estimate a confidence for each candidate label. These methods mainly perform disambiguation by considering correlations among labels and/or features. However, because of the noisy labels in PML, the true correlation among labels is corrupted, and such methods can easily be misled by noisy false-positive labels. In this paper, we propose a Partial Multi-Label learning method via Constraint Clustering (PML-CC) that addresses PML based on the underlying structure of the data. PML-CC gradually extracts high-confidence labels and then uses them to extract the remaining labels. To find the high-confidence labels, it solves PML as a clustering task, treating the information extracted in previous steps as constraints. In each step, PML-CC updates the already-extracted labels and uses them to extract the remaining ones. Experimental results show that our method successfully tackles PML tasks and outperforms state-of-the-art methods on artificial and real-world datasets.
AB - Multi-label learning (MLL) refers to a learning task where each instance is associated with a set of labels. However, in most real-world applications the labeling process is very expensive and time-consuming. Partial multi-label learning (PML) refers to MLL where only a part of the candidate labels are correctly annotated and the rest are false positives. The main purpose of PML is to learn from and predict unseen multi-label data at a lower annotation cost. To resolve the ambiguities in the label set, popular existing PML research attempts to estimate a confidence for each candidate label. These methods mainly perform disambiguation by considering correlations among labels and/or features. However, because of the noisy labels in PML, the true correlation among labels is corrupted, and such methods can easily be misled by noisy false-positive labels. In this paper, we propose a Partial Multi-Label learning method via Constraint Clustering (PML-CC) that addresses PML based on the underlying structure of the data. PML-CC gradually extracts high-confidence labels and then uses them to extract the remaining labels. To find the high-confidence labels, it solves PML as a clustering task, treating the information extracted in previous steps as constraints. In each step, PML-CC updates the already-extracted labels and uses them to extract the remaining ones. Experimental results show that our method successfully tackles PML tasks and outperforms state-of-the-art methods on artificial and real-world datasets.
KW - constraint clustering
KW - disambiguation
KW - partial multi-label learning
UR - http://www.scopus.com/inward/record.url?scp=85178635833&partnerID=8YFLogxK
U2 - 10.1007/978-981-99-8145-8_35
DO - 10.1007/978-981-99-8145-8_35
M3 - Conference contribution
AN - SCOPUS:85178635833
SN - 9789819981441
T3 - Communications in Computer and Information Science
SP - 453
EP - 469
BT - Neural Information Processing
A2 - Luo, Biao
A2 - Cheng, Long
A2 - Wu, Zheng-Guang
A2 - Li, Hongyi
A2 - Li, Chaojie
PB - Springer Science and Business Media Deutschland GmbH
T2 - 30th International Conference on Neural Information Processing, ICONIP 2023
Y2 - 20 November 2023 through 23 November 2023
ER -