Details
| Original language | English |
| --- | --- |
| Title of host publication | CIKM 2014 - Proceedings of the 2014 ACM International Conference on Information and Knowledge Management |
| Pages | 191-200 |
| Number of pages | 10 |
| ISBN (electronic) | 9781450325981 |
| Publication status | Published - 3 Nov 2014 |
| Event | 23rd ACM International Conference on Information and Knowledge Management, CIKM 2014 - Shanghai, China. Duration: 3 Nov 2014 → 7 Nov 2014 |
Publication series
| Name | CIKM 2014 - Proceedings of the 2014 ACM International Conference on Information and Knowledge Management |
| --- | --- |
Abstract
Multi-matrix factorization models provide a scalable and effective approach for multi-relational learning tasks such as link prediction, Linked Open Data (LOD) mining, recommender systems, and social network analysis. Such models are learned by optimizing the sum of the losses on all relations in the data. Early models address the problem where there is only one target relation for which predictions should be made. More recent models address the multi-target variant of the problem and use the same set of parameters to make predictions for all target relations. In this paper, we argue that a model optimized for each target relation individually has better predictive performance than models optimized for a compromise on the performance on all target relations. We introduce specific parameters for each target but, instead of learning them independently from each other, we couple them through a set of shared auxiliary parameters, which has a regularizing effect on the target-specific ones. Experiments on large Web datasets derived from DBpedia, Wikipedia, and BlogCatalog show the performance improvement obtained by using target-specific parameters, and that our approach outperforms competitive state-of-the-art methods while scaling gracefully to big data.
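The coupling mechanism described in the abstract can be sketched in code. The following is a minimal, illustrative NumPy implementation with synthetic data and plain gradient descent; the relation matrices, loss weights, and update rule are assumptions made for illustration, not the paper's actual model or optimization procedure:

```python
import numpy as np

# Illustrative sketch of coupling target-specific factors through a
# shared auxiliary factor; not the paper's actual algorithm.
rng = np.random.default_rng(0)

n, k = 30, 4                            # entities, latent dimension
# Two synthetic binary target relations over the same entity set.
R = [(rng.random((n, n)) < 0.2).astype(float) for _ in range(2)]

# Target-specific factors U[t], V[t], coupled to a shared auxiliary factor Z.
U = [rng.normal(0.0, 0.1, (n, k)) for _ in R]
V = [rng.normal(0.0, 0.1, (n, k)) for _ in R]
Z = rng.normal(0.0, 0.1, (n, k))

lam, mu, lr = 0.1, 0.05, 0.005          # coupling weight, ridge on Z, step size

def loss():
    sq = lambda a: float((a ** 2).sum())
    fit = sum(sq(R[t] - U[t] @ V[t].T) for t in range(len(R)))
    couple = sum(sq(U[t] - Z) + sq(V[t] - Z) for t in range(len(R)))
    return fit + lam * couple + mu * sq(Z)

initial = loss()
for _ in range(500):
    for t in range(len(R)):
        E = U[t] @ V[t].T - R[t]                      # residual on relation t
        gU = 2 * E @ V[t] + 2 * lam * (U[t] - Z)
        gV = 2 * E.T @ U[t] + 2 * lam * (V[t] - Z)
        U[t] -= lr * gU
        V[t] -= lr * gV
    # The shared factor is pulled toward every target-specific factor,
    # which in turn regularizes each target's parameters toward consensus.
    gZ = 2 * lam * sum((Z - U[t]) + (Z - V[t]) for t in range(len(R))) + 2 * mu * Z
    Z -= lr * gZ
final = loss()
```

Here each penalty term `lam * ||U[t] - Z||^2` ties a target-specific factor to the shared auxiliary one, so information flows between targets without forcing all targets to share a single parameter set.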
Keywords
- Factorization models
- Relational learning
- Statistical inference
ASJC Scopus subject areas
- Decision Sciences(all)
- Information Systems and Management
- Computer Science(all)
- Computer Science Applications
- Information Systems
Cite this
- Standard
- Harvard
- APA
- Vancouver
- BibTeX
- RIS
Drumond, L., Diaz-Aviles, E., Schmidt-Thieme, L., & Nejdl, W. (2014). Optimizing Multi-Relational Factorization Models for Multiple Target Relations. In CIKM 2014 - Proceedings of the 2014 ACM International Conference on Information and Knowledge Management (pp. 191-200).
Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review
TY - GEN
T1 - Optimizing Multi-Relational Factorization Models for Multiple Target Relations
AU - Drumond, Lucas
AU - Diaz-Aviles, Ernesto
AU - Schmidt-Thieme, Lars
AU - Nejdl, Wolfgang
PY - 2014/11/3
Y1 - 2014/11/3
AB - Multi-matrix factorization models provide a scalable and effective approach for multi-relational learning tasks such as link prediction, Linked Open Data (LOD) mining, recommender systems and social network analysis. Such models are learned by optimizing the sum of the losses on all relations in the data. Early models address the problem where there is only one target relation for which predictions should be made. More recent models address the multi-target variant of the problem and use the same set of parameters to make predictions for all target relations. In this paper, we argue that a model optimized for each target relation individually has better predictive performance than models optimized for a compromise on the performance on all target relations. We introduce specific parameters for each target but, instead of learning them independently from each other, we couple them through a set of shared auxiliary parameters, which has a regularizing effect on the target specific ones. Experiments on large Web datasets derived from DBpedia, Wikipedia and BlogCatalog show the performance improvement obtained by using target specific parameters and that our approach outperforms competitive state-of-the-art methods while being able to scale gracefully to big data.
KW - Factorization models
KW - Relational learning
KW - Statistical inference
UR - http://www.scopus.com/inward/record.url?scp=84937548042&partnerID=8YFLogxK
U2 - 10.1145/2661829.2662052
DO - 10.1145/2661829.2662052
M3 - Conference contribution
AN - SCOPUS:84937548042
T3 - CIKM 2014 - Proceedings of the 2014 ACM International Conference on Information and Knowledge Management
SP - 191
EP - 200
BT - CIKM 2014 - Proceedings of the 2014 ACM International Conference on Information and Knowledge Management
T2 - 23rd ACM International Conference on Information and Knowledge Management, CIKM 2014
Y2 - 3 November 2014 through 7 November 2014
ER -