Preprint Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

A Self-Attention Model for Next Location Prediction based on Semantic Mining

Version 1 : Received: 1 September 2023 / Approved: 4 September 2023 / Online: 4 September 2023 (08:12:06 CEST)

A peer-reviewed article of this Preprint also exists.

Lu, E.H.-C.; Lin, Y.-R. A Self-Attention Model for Next Location Prediction Based on Semantic Mining. ISPRS Int. J. Geo-Inf. 2023, 12, 420.

Abstract

With the rise of the Internet of Things (IoT), mobile devices, and Location-Based Social Networks (LBSNs), an abundance of trajectory data has made research on location prediction increasingly popular. Check-in data shared through LBSNs hides information about users' life patterns, and extracting this information is helpful for location prediction. However, the trajectory data recorded by mobile devices, unlike check-in data, lacks semantic information. To obtain user semantics, related studies match each stay point to the nearest Point of Interest (POI), but location error may lead to incorrect semantic matching. We therefore propose a Self-Attention Model for Next Location Prediction based on Semantic Mining. To compute the semantic feature of a stay point, we first search for its k nearest POIs, then use the reciprocal of the distance from the stay point to each of the k nearest POIs, together with the number of POI categories, as weights. Finally, we express the semantics as a probability distribution over categories, so that no important semantic information is lost. Combining this representation with sequential pattern mining yields richer semantic features. To better perceive the trajectory, the temporal features capture the periodicity of the time series through sine functions. For the location features, we build a directed weighted graph whose edge weights are the frequencies with which users visit locations, so the location features are rich in contextual information. We then adopt a self-attention model to capture long-term dependencies in long trajectory sequences. Experiments on the Geolife dataset show that the semantic matching proposed in this study improves Top@1 by 45.78% compared with nearest-distance POI search, and that the proposed model improves Top@1 by 2.5% compared with the baselines.
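The abstract describes two feature-construction steps: a semantic feature built from a stay point's k nearest POIs with reciprocal-distance weights normalized into a category probability, and a temporal feature encoded with sine functions. The Python sketch below illustrates one plausible reading of those steps under our own assumptions; the function names, the value of k, the example category list, and the 24-hour cycle are illustrative and are not taken from the paper.

```python
import numpy as np

def semantic_feature(stay_point, pois, categories, k=5):
    """Sketch of the semantic feature: a probability over POI categories.

    stay_point: (x, y) coordinates of the stay point.
    pois: list of ((x, y), category) pairs.
    categories: ordered list of category names defining the output vector.
    """
    # Distance from the stay point to every POI.
    dists = [np.hypot(stay_point[0] - p[0][0], stay_point[1] - p[0][1]) for p in pois]
    # Indices of the k nearest POIs.
    nearest = np.argsort(dists)[:k]

    # Accumulate reciprocal-distance weights per category.
    weights = {c: 0.0 for c in categories}
    for i in nearest:
        _, category = pois[i]
        weights[category] += 1.0 / (dists[i] + 1e-9)  # guard against zero distance

    # Normalize into a probability distribution, so category information
    # from all k neighbours is retained rather than keeping only the closest POI.
    total = sum(weights.values())
    return np.array([weights[c] / total for c in categories])

def temporal_feature(hour_of_day):
    """Sine/cosine encoding of a 24-hour cycle (assumed period)."""
    angle = 2.0 * np.pi * hour_of_day / 24.0
    return np.array([np.sin(angle), np.cos(angle)])

# Illustrative usage with made-up coordinates and categories.
cats = ["food", "shop", "transport"]
pois = [((0.1, 0.2), "food"), ((0.3, 0.1), "shop"),
        ((0.2, 0.4), "food"), ((0.9, 0.9), "transport")]
print(semantic_feature((0.0, 0.0), pois, cats, k=3))
print(temporal_feature(hour_of_day=14))
```

In this reading, the probability vector replaces the hard nearest-POI assignment, which is how location error can be prevented from forcing a single wrong semantic label.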

Keywords

Location prediction; point of interest; trajectory; semantic matching; deep learning

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
