Hashing-accelerated graph neural networks for link prediction

Publication: Chapter in book/report/conference proceeding › Conference contribution › Research › Peer-reviewed

Authorship

External organisations

  • Fudan University
  • Microsoft Corporation

Details

Original language: English
Title of host publication: WWW '21
Subtitle: Proceedings of the Web Conference 2021
Pages: 2910-2920
Number of pages: 11
ISBN (electronic): 9781450383127
Publication status: Published - 19 Apr 2021
Event: World Wide Web Conference (WWW 2021) - Ljubljana, Slovenia
Duration: 19 Apr 2021 - 23 Apr 2021
Conference number: 30

Abstract

Networks are ubiquitous in the real world. Link prediction, one of the key problems for network-structured data, aims to predict whether a link exists between two nodes. Traditional approaches embed each node into a low-dimensional space and explicitly compute similarities between the resulting compact node representations. To handle the intensive similarity computation in link prediction efficiently, hashing techniques have been used successfully to produce node representations in the Hamming space. However, hashing-based link prediction algorithms suffer either accuracy loss from randomized hashing techniques or inefficiency from learning-to-hash techniques in the embedding process. The Graph Neural Network (GNN) framework is now widely applied to graph-related tasks in an end-to-end manner, but it commonly requires substantial computation and memory due to massive parameter learning, which makes GNN-based algorithms impractical without powerful hardware. In this paper, we propose a simple and effective model called #GNN, which balances the trade-off between accuracy and efficiency. #GNN efficiently acquires node representations in the Hamming space for link prediction by exploiting randomized hashing to implement message passing and capture high-order proximity within the GNN framework. Furthermore, we characterize the discriminative power of #GNN in probability. Extensive experimental results demonstrate that the proposed #GNN algorithm achieves accuracy comparable to the learning-based algorithms and outperforms the randomized algorithm, while running significantly faster than the learning-based algorithms. The proposed algorithm also scales well to a large-scale network with limited resources.
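To make the idea above concrete, the following is a minimal, hypothetical Python sketch of parameter-free, hashing-based message passing for link prediction in the spirit of #GNN as described in the abstract. It is not the authors' implementation: the choice of min-hashing as the randomized hashing primitive, the function names, and the signature-matching score are assumptions made purely for illustration.

import random

def make_hash_functions(num_hashes, seed=0):
    # Family of salted randomized hash functions (an assumption; the paper's
    # exact hashing scheme may differ).
    rng = random.Random(seed)
    salts = [rng.getrandbits(64) for _ in range(num_hashes)]
    return [lambda x, s=s: hash((s, x)) for s in salts]

def hash_message_passing(adj, node_attrs, num_hashes=64, num_layers=2):
    # adj:        node -> set of neighbour nodes
    # node_attrs: node -> set of discrete attribute tokens
    # Returns node -> tuple of num_hashes signature slots (a Hamming-space code).
    hash_fns = make_hash_functions(num_hashes)

    # Layer 0: min-hash each node's own attribute set.
    sig = {v: tuple(min(h(a) for a in attrs) for h in hash_fns)
           for v, attrs in node_attrs.items()}

    # Each layer folds neighbour signatures into the node's own signature by
    # taking slot-wise minima; this plays the role of message passing and lets
    # the signature reflect higher-order neighbourhood structure.
    for _ in range(num_layers):
        new_sig = {}
        for v, neighbours in adj.items():
            columns = [sig[v]] + [sig[u] for u in neighbours]
            new_sig[v] = tuple(min(col[i] for col in columns)
                               for i in range(num_hashes))
        sig = new_sig
    return sig

def link_score(sig_u, sig_v):
    # Similarity in the Hamming space: fraction of matching signature slots.
    return sum(a == b for a, b in zip(sig_u, sig_v)) / len(sig_u)

if __name__ == "__main__":
    # Toy attributed graph: two triangles joined by the edge (2, 3).
    adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3},
           3: {2, 4, 5}, 4: {3, 5}, 5: {3, 4}}
    attrs = {v: {f"attr_{v % 3}", f"attr_{v % 2}"} for v in adj}
    sig = hash_message_passing(adj, attrs)
    print("score(0, 1) =", link_score(sig[0], sig[1]))  # nodes in the same triangle
    print("score(0, 5) =", link_score(sig[0], sig[5]))  # nodes in different triangles

In this sketch, candidate links are ranked by comparing fixed-length hash signatures rather than learned embeddings, which is what makes the similarity computation cheap; the actual #GNN construction and its probabilistic guarantees are given in the paper.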

ASJC Scopus subject areas

Cite

Hashing-accelerated graph neural networks for link prediction. / Wu, Wei; Li, Bin; Luo, Chuan et al.
WWW '21: Proceedings of the Web Conference 2021. 2021. pp. 2910-2920.


Wu, W, Li, B, Luo, C & Nejdl, W 2021, Hashing-accelerated graph neural networks for link prediction. in WWW '21: Proceedings of the Web Conference 2021. pp. 2910-2920, World Wide Web Conference (WWW 2021), Ljubljana, Slovenia, 19 Apr 2021. https://doi.org/10.1145/3442381.3449884
Wu, W., Li, B., Luo, C., & Nejdl, W. (2021). Hashing-accelerated graph neural networks for link prediction. In WWW '21: Proceedings of the Web Conference 2021 (pp. 2910-2920). https://doi.org/10.1145/3442381.3449884
Wu W, Li B, Luo C, Nejdl W. Hashing-accelerated graph neural networks for link prediction. In WWW '21: Proceedings of the Web Conference 2021. 2021. pp. 2910-2920. doi: 10.1145/3442381.3449884
Wu, Wei ; Li, Bin ; Luo, Chuan et al. / Hashing-accelerated graph neural networks for link prediction. WWW '21: Proceedings of the Web Conference 2021. 2021. pp. 2910-2920.
BibTeX
@inproceedings{af776e157824452d8d1d1637e9618ff6,
title = "Hashing-accelerated graph neural networks for link prediction",
abstract = "Networks are ubiquitous in the real world. Link prediction, as one of the key problems for network-structured data, aims to predict whether there exists a link between two nodes. The traditional approaches are based on the explicit similarity computation between the compact node representation by embedding each node into a low-dimensional space. In order to efficiently handle the intensive similarity computation in link prediction, the hashing technique has been successfully used to produce the node representation in the Hamming space. However, the hashing-based link prediction algorithms face accuracy loss from the randomized hashing techniques or inefficiency from the learning to hash techniques in the embedding process. Currently, the Graph Neural Network (GNN) framework has been widely applied to the graph-related tasks in an end-to-end manner, but it commonly requires substantial computational resources and memory costs due to massive parameter learning, which makes the GNN-based algorithms impractical without the help of a powerful workhorse. In this paper, we propose a simple and effective model called #GNN, which balances the trade-off between accuracy and efficiency. #GNN is able to efficiently acquire node representation in the Hamming space for link prediction by exploiting the randomized hashing technique to implement message passing and capture high-order proximity in the GNN framework. Furthermore, we characterize the discriminative power of #GNN in probability. The extensive experimental results demonstrate that the proposed #GNN algorithm achieves accuracy comparable to the learning-based algorithms and outperforms the randomized algorithm, while running significantly faster than the learning-based algorithms. Also, the proposed algorithm shows excellent scalability on a large-scale network with the limited resources. ",
keywords = "Attributed Network, Graph Neural Networks, Hashing, Link Prediction",
author = "Wei Wu and Bin Li and Chuan Luo and Wolfgang Nejdl",
note = "Funding Information: This work was supported in part by the Federal Ministry of Education and Research (BMBF), Germany under the project Leib-nizKILabor (Grant No.01DD20003), the Sub Project of Independent Scientific Research Project (No. ZZKY-ZX-03-02-04), STCSM Project (20511100400), Shanghai Municipal Science and Technology Major Projects (2018SHZDZX01), and the Program for Professor of Special Appointment (Eastern Scholar) at Shanghai Institutions of Higher Learning.; World Wide Web Conference (WWW 2021), WWW 2021 ; Conference date: 19-04-2021 Through 23-04-2021",
year = "2021",
month = apr,
day = "19",
doi = "10.1145/3442381.3449884",
language = "English",
pages = "2910--2920",
booktitle = "WWW '21",

}

RIS

TY - GEN

T1 - Hashing-accelerated graph neural networks for link prediction

AU - Wu, Wei

AU - Li, Bin

AU - Luo, Chuan

AU - Nejdl, Wolfgang

N1 - Conference code: 30

PY - 2021/4/19

Y1 - 2021/4/19

N2 - Networks are ubiquitous in the real world. Link prediction, as one of the key problems for network-structured data, aims to predict whether there exists a link between two nodes. The traditional approaches are based on the explicit similarity computation between the compact node representation by embedding each node into a low-dimensional space. In order to efficiently handle the intensive similarity computation in link prediction, the hashing technique has been successfully used to produce the node representation in the Hamming space. However, the hashing-based link prediction algorithms face accuracy loss from the randomized hashing techniques or inefficiency from the learning to hash techniques in the embedding process. Currently, the Graph Neural Network (GNN) framework has been widely applied to the graph-related tasks in an end-to-end manner, but it commonly requires substantial computational resources and memory costs due to massive parameter learning, which makes the GNN-based algorithms impractical without the help of a powerful workhorse. In this paper, we propose a simple and effective model called #GNN, which balances the trade-off between accuracy and efficiency. #GNN is able to efficiently acquire node representation in the Hamming space for link prediction by exploiting the randomized hashing technique to implement message passing and capture high-order proximity in the GNN framework. Furthermore, we characterize the discriminative power of #GNN in probability. The extensive experimental results demonstrate that the proposed #GNN algorithm achieves accuracy comparable to the learning-based algorithms and outperforms the randomized algorithm, while running significantly faster than the learning-based algorithms. Also, the proposed algorithm shows excellent scalability on a large-scale network with the limited resources.

AB - Networks are ubiquitous in the real world. Link prediction, as one of the key problems for network-structured data, aims to predict whether there exists a link between two nodes. The traditional approaches are based on the explicit similarity computation between the compact node representation by embedding each node into a low-dimensional space. In order to efficiently handle the intensive similarity computation in link prediction, the hashing technique has been successfully used to produce the node representation in the Hamming space. However, the hashing-based link prediction algorithms face accuracy loss from the randomized hashing techniques or inefficiency from the learning to hash techniques in the embedding process. Currently, the Graph Neural Network (GNN) framework has been widely applied to the graph-related tasks in an end-to-end manner, but it commonly requires substantial computational resources and memory costs due to massive parameter learning, which makes the GNN-based algorithms impractical without the help of a powerful workhorse. In this paper, we propose a simple and effective model called #GNN, which balances the trade-off between accuracy and efficiency. #GNN is able to efficiently acquire node representation in the Hamming space for link prediction by exploiting the randomized hashing technique to implement message passing and capture high-order proximity in the GNN framework. Furthermore, we characterize the discriminative power of #GNN in probability. The extensive experimental results demonstrate that the proposed #GNN algorithm achieves accuracy comparable to the learning-based algorithms and outperforms the randomized algorithm, while running significantly faster than the learning-based algorithms. Also, the proposed algorithm shows excellent scalability on a large-scale network with the limited resources.

KW - Attributed Network

KW - Graph Neural Networks

KW - Hashing

KW - Link Prediction

UR - http://www.scopus.com/inward/record.url?scp=85107938179&partnerID=8YFLogxK

U2 - 10.1145/3442381.3449884

DO - 10.1145/3442381.3449884

M3 - Conference contribution

AN - SCOPUS:85107938179

SP - 2910

EP - 2920

BT - WWW '21

T2 - World Wide Web Conference (WWW 2021)

Y2 - 19 April 2021 through 23 April 2021

ER -
