Vol. 13, No. 2, May 2024. ISSN: 2217-8309, eISSN: 2217-8333
TEM Journal – Technology, Education, Management, Informatics
Association for Information Communication Technology Education and Science
Enhancing Signed Graph Attention Network by Graph Characteristics: An Analysis
Panatda Kaewhit, Chanun Lewchalermvongs, Phakaporn Lewchalermvongs
© 2024 Chanun Lewchalermvongs, published by UIKTEN. This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License. (CC BY-NC-ND 4.0)
Citation Information: TEM Journal. Volume 13, Issue 2, Pages 885-896, ISSN 2217-8309, DOI: 10.18421/TEM132-05, May 2024.
Received: 03 November 2023. Revised: 16 February 2024.
Abstract:
A graph neural network (GNN) is one of the most successful methods for handling tasks on graph-structured data, e.g., node embedding, link prediction, and node classification. GNNs operate on a graph data structure by aggregating messages over the nodes of the graph, so that the structural information of the graph is retained in each node's new representation and downstream tasks on the graph can proceed. A graph attention network (GAT) modifies the propagation step of GNNs by adopting an attention mechanism. Applying this modification to signed graphs generated by sociological theories yields the signed graph attention network (SiGAT). In this research, we utilize SiGAT and create novel graphs based on graph characteristics to assess the performance of SiGAT node embeddings across graphs with various characteristics. The primary focus of our study is link prediction, which aligns with the task employed in previous research on SiGAT. We propose a method that uses graph characteristics to reduce the time spent on the learning process in SiGAT.
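For context, the attention-based propagation step mentioned above can be sketched in its generic GAT form; this is the standard formulation rather than the specific SiGAT variant studied in this paper, and the symbols W, a, and σ below denote the usual learnable weight matrix, attention vector, and nonlinearity rather than notation taken from this work:

\[
\alpha_{ij} = \frac{\exp\!\left(\mathrm{LeakyReLU}\!\left(\mathbf{a}^{\top}\left[\mathbf{W}\mathbf{h}_i \,\|\, \mathbf{W}\mathbf{h}_j\right]\right)\right)}{\sum_{k \in \mathcal{N}(i)} \exp\!\left(\mathrm{LeakyReLU}\!\left(\mathbf{a}^{\top}\left[\mathbf{W}\mathbf{h}_i \,\|\, \mathbf{W}\mathbf{h}_k\right]\right)\right)},
\qquad
\mathbf{h}_i' = \sigma\!\left(\sum_{j \in \mathcal{N}(i)} \alpha_{ij}\, \mathbf{W}\mathbf{h}_j\right),
\]

where \(\mathcal{N}(i)\) is the neighbourhood of node \(i\) and \(\|\) denotes concatenation. SiGAT adapts this attention-weighted aggregation to signed graphs by distinguishing relations motivated by sociological theories, as described in the abstract.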
Keywords – Graph neural network, graph attention network, signed graph attention network, graph characteristics, graph theory.