Optimizing Multi-Relational Factorization Models for Multiple Target Relations

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review

Authors

  • Lucas Drumond
  • Ernesto Diaz-Aviles
  • Lars Schmidt-Thieme
  • Wolfgang Nejdl

Research Organisations

External Research Organisations

  • IBM Research
  • University of Hildesheim

Details

Original language: English
Title of host publication: CIKM 2014 - Proceedings of the 2014 ACM International Conference on Information and Knowledge Management
Pages: 191-200
Number of pages: 10
ISBN (electronic): 9781450325981
Publication status: Published - 3 Nov 2014
Event: 23rd ACM International Conference on Information and Knowledge Management, CIKM 2014 - Shanghai, China
Duration: 3 Nov 2014 - 7 Nov 2014

Publication series

Name: CIKM 2014 - Proceedings of the 2014 ACM International Conference on Information and Knowledge Management

Abstract

Multi-matrix factorization models provide a scalable and effective approach for multi-relational learning tasks such as link prediction, Linked Open Data (LOD) mining, recommender systems and social network analysis. Such models are learned by optimizing the sum of the losses on all relations in the data. Early models address the problem where there is only one target relation for which predictions should be made. More recent models address the multi-target variant of the problem and use the same set of parameters to make predictions for all target relations. In this paper, we argue that a model optimized for each target relation individually has better predictive performance than models optimized for a compromise on the performance on all target relations. We introduce specific parameters for each target but, instead of learning them independently from each other, we couple them through a set of shared auxiliary parameters, which has a regularizing effect on the target specific ones. Experiments on large Web datasets derived from DBpedia, Wikipedia and BlogCatalog show the performance improvement obtained by using target specific parameters and that our approach outperforms competitive state-of-the-art methods while being able to scale gracefully to big data.
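The sketch below is a minimal, illustrative reading of the coupling idea described in the abstract, not the authors' implementation: each target relation gets its own factor matrices, which are pulled toward a set of shared auxiliary factors by an extra penalty term rather than being learned independently. All names and hyperparameters (n_entities, rank, lambda_couple, lr, the toy relation matrix) are assumptions made for illustration.

# Minimal sketch (not the paper's code) of target-specific factors coupled
# to shared auxiliary factors; every name and constant here is illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_entities, rank = 100, 8
lr, lambda_couple, lambda_reg = 0.05, 0.1, 0.01

# Shared auxiliary factors, assumed to be learned jointly over all relations.
U_shared = rng.normal(scale=0.1, size=(n_entities, rank))

def train_target_relation(R_obs, U_shared, epochs=50):
    """Fit target-specific factors V for one relation matrix R_obs (NaN = unobserved),
    pulling V toward the shared auxiliary factors instead of learning it in isolation."""
    V = U_shared.copy()
    W = rng.normal(scale=0.1, size=(n_entities, rank))
    mask = ~np.isnan(R_obs)
    R = np.nan_to_num(R_obs)
    for _ in range(epochs):
        E = mask * (V @ W.T - R)                  # reconstruction error on observed cells only
        grad_V = E @ W + lambda_reg * V + lambda_couple * (V - U_shared)
        grad_W = E.T @ V + lambda_reg * W
        V -= lr * grad_V
        W -= lr * grad_W
    return V, W

# Toy target relation: a sparse binary matrix with most entries unobserved.
R = np.full((n_entities, n_entities), np.nan)
idx = rng.integers(0, n_entities, size=(400, 2))
R[idx[:, 0], idx[:, 1]] = 1.0
V_r, W_r = train_target_relation(R, U_shared)
scores = V_r @ W_r.T                              # predicted scores for this target relation

The point mirrored from the abstract is that V is optimized for one target relation at a time, while the lambda_couple term keeps it close to U_shared, so the shared auxiliary parameters regularize the target-specific ones instead of forcing a single compromise model across all targets.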

Keywords

    Factorization models, Relational learning, Statistical inference

ASJC Scopus subject areas

Cite this

Optimizing Multi-Relational Factorization Models for Multiple Target Relations. / Drumond, Lucas; Diaz-Aviles, Ernesto; Schmidt-Thieme, Lars et al.
CIKM 2014 - Proceedings of the 2014 ACM International Conference on Information and Knowledge Management. 2014. p. 191-200 (CIKM 2014 - Proceedings of the 2014 ACM International Conference on Information and Knowledge Management).


Drumond, L, Diaz-Aviles, E, Schmidt-Thieme, L & Nejdl, W 2014, Optimizing Multi-Relational Factorization Models for Multiple Target Relations. in CIKM 2014 - Proceedings of the 2014 ACM International Conference on Information and Knowledge Management. CIKM 2014 - Proceedings of the 2014 ACM International Conference on Information and Knowledge Management, pp. 191-200, 23rd ACM International Conference on Information and Knowledge Management, CIKM 2014, Shanghai, China, 3 Nov 2014. https://doi.org/10.1145/2661829.2662052
Drumond, L., Diaz-Aviles, E., Schmidt-Thieme, L., & Nejdl, W. (2014). Optimizing Multi-Relational Factorization Models for Multiple Target Relations. In CIKM 2014 - Proceedings of the 2014 ACM International Conference on Information and Knowledge Management (pp. 191-200). (CIKM 2014 - Proceedings of the 2014 ACM International Conference on Information and Knowledge Management). https://doi.org/10.1145/2661829.2662052
Drumond L, Diaz-Aviles E, Schmidt-Thieme L, Nejdl W. Optimizing Multi-Relational Factorization Models for Multiple Target Relations. In CIKM 2014 - Proceedings of the 2014 ACM International Conference on Information and Knowledge Management. 2014. p. 191-200. (CIKM 2014 - Proceedings of the 2014 ACM International Conference on Information and Knowledge Management). doi: 10.1145/2661829.2662052
Drumond, Lucas ; Diaz-Aviles, Ernesto ; Schmidt-Thieme, Lars et al. / Optimizing Multi-Relational Factorization Models for Multiple Target Relations. CIKM 2014 - Proceedings of the 2014 ACM International Conference on Information and Knowledge Management. 2014. pp. 191-200 (CIKM 2014 - Proceedings of the 2014 ACM International Conference on Information and Knowledge Management).
Download (BibTeX)
@inproceedings{5005959193ba4cf9b277e4e77557ae38,
title = "Optimizing Multi-Relational Factorization Models for Multiple Target Relations",
abstract = "Multi-matrix factorization models provide a scalable and effective approach for multi-relational learning tasks such as link prediction, Linked Open Data (LOD) mining, recommender systems and social network analysis. Such models are learned by optimizing the sum of the losses on all relations in the data. Early models address the problem where there is only one target relation for which predictions should be made. More recent models address the multi-target variant of the problem and use the same set of parameters to make predictions for all target relations. In this paper, we argue that a model optimized for each target relation individually has better predictive performance than models optimized for a compromise on the performance on all target relations. We introduce specific parameters for each target but, instead of learning them independently from each other, we couple them through a set of shared auxiliary parameters, which has a regularizing effect on the target specific ones. Experiments on large Web datasets derived from DBpedia, Wikipedia and BlogCatalog show the performance improvement obtained by using target specific parameters and that our approach outperforms competitive state-of-the-art methods while being able to scale gracefully to big data.",
keywords = "Factorization models, Relational learning, Statistical inference",
author = "Lucas Drumond and Ernesto Diaz-Aviles and Lars Schmidt-Thieme and Wolfgang Nejdl",
year = "2014",
month = nov,
day = "3",
doi = "10.1145/2661829.2662052",
language = "English",
series = "CIKM 2014 - Proceedings of the 2014 ACM International Conference on Information and Knowledge Management",
pages = "191--200",
booktitle = "CIKM 2014 - Proceedings of the 2014 ACM International Conference on Information and Knowledge Management",
note = "23rd ACM International Conference on Information and Knowledge Management, CIKM 2014 ; Conference date: 03-11-2014 Through 07-11-2014",

}

Download (RIS)

TY - GEN

T1 - Optimizing Multi-Relational Factorization Models for Multiple Target Relations

AU - Drumond, Lucas

AU - Diaz-Aviles, Ernesto

AU - Schmidt-Thieme, Lars

AU - Nejdl, Wolfgang

PY - 2014/11/3

Y1 - 2014/11/3

N2 - Multi-matrix factorization models provide a scalable and effective approach for multi-relational learning tasks such as link prediction, Linked Open Data (LOD) mining, recommender systems and social network analysis. Such models are learned by optimizing the sum of the losses on all relations in the data. Early models address the problem where there is only one target relation for which predictions should be made. More recent models address the multi-target variant of the problem and use the same set of parameters to make predictions for all target relations. In this paper, we argue that a model optimized for each target relation individually has better predictive performance than models optimized for a compromise on the performance on all target relations. We introduce specific parameters for each target but, instead of learning them independently from each other, we couple them through a set of shared auxiliary parameters, which has a regularizing effect on the target specific ones. Experiments on large Web datasets derived from DBpedia, Wikipedia and BlogCatalog show the performance improvement obtained by using target specific parameters and that our approach outperforms competitive state-of-the-art methods while being able to scale gracefully to big data.

AB - Multi-matrix factorization models provide a scalable and effective approach for multi-relational learning tasks such as link prediction, Linked Open Data (LOD) mining, recommender systems and social network analysis. Such models are learned by optimizing the sum of the losses on all relations in the data. Early models address the problem where there is only one target relation for which predictions should be made. More recent models address the multi-target variant of the problem and use the same set of parameters to make predictions for all target relations. In this paper, we argue that a model optimized for each target relation individually has better predictive performance than models optimized for a compromise on the performance on all target relations. We introduce specific parameters for each target but, instead of learning them independently from each other, we couple them through a set of shared auxiliary parameters, which has a regularizing effect on the target specific ones. Experiments on large Web datasets derived from DBpedia, Wikipedia and BlogCatalog show the performance improvement obtained by using target specific parameters and that our approach outperforms competitive state-of-the-art methods while being able to scale gracefully to big data.

KW - Factorization models

KW - Relational learning

KW - Statistical inference

UR - http://www.scopus.com/inward/record.url?scp=84937548042&partnerID=8YFLogxK

U2 - 10.1145/2661829.2662052

DO - 10.1145/2661829.2662052

M3 - Conference contribution

AN - SCOPUS:84937548042

T3 - CIKM 2014 - Proceedings of the 2014 ACM International Conference on Information and Knowledge Management

SP - 191

EP - 200

BT - CIKM 2014 - Proceedings of the 2014 ACM International Conference on Information and Knowledge Management

T2 - 23rd ACM International Conference on Information and Knowledge Management, CIKM 2014

Y2 - 3 November 2014 through 7 November 2014

ER -
