Preserved in Portico. This version is not peer-reviewed.
Deep Community Detection Method for Social Networks
Received: 1 November 2019 / Approved: 3 November 2019 / Online: 3 November 2019 (15:51:34 CET)
A peer-reviewed article of this Preprint also exists.
Journal reference: IEEE Access 2020
With the rapid development of the mobile Internet, online social network platforms have grown quickly as venues for making friends, sharing information, and similar activities. On these platforms, the relationships among users form social networks. The literature shows that social networks exhibit community structure. By studying this community structure, the characteristics and functions of the network structure and the dynamical evolution mechanism of the network can be used to predict user behaviours and to control information dissemination. Therefore, this study proposes a deep community detection method comprising (1) a matrix reconstruction method, (2) a spatial feature extraction method, and (3) a community detection method. The original adjacency matrix of the social network is reconstructed based on opinion leaders and near neighbors to obtain a spatial proximity matrix. The spatial eigenvectors of the reconstructed adjacency matrix are then extracted by an auto-encoder based on a convolutional neural network to improve modularity. In the experiments, four open datasets of real-world social networks were selected to evaluate the proposed method, and the results show that the proposed deep community detection method achieved higher modularity than competing methods. Therefore, the proposed method can effectively detect high-quality communities in social networks.
Keywords: community detection; social network; convolutional neural network; auto-encoder
This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.