Preprint Article, Version 1 (not peer-reviewed). Preserved in Portico.

Large Language Models as Recommendation Systems in Museums

Version 1 : Received: 19 July 2023 / Approved: 20 July 2023 / Online: 20 July 2023 (08:41:04 CEST)

A peer-reviewed article of this Preprint also exists.

Trichopoulos, G.; Konstantakis, M.; Alexandridis, G.; Caridakis, G. Large Language Models as Recommendation Systems in Museums. Electronics 2023, 12, 3829.

Abstract

This paper proposes the utilization of large language models as recommendation systems for museums. Since such models lack a notion of context, they cannot handle the temporal information that is often present in recommendations for cultural environments (e.g., special exhibitions or events). In this respect, the current work aims at enhancing the capabilities of large language models through a fine-tuning process that incorporates contextual information and user instructions. The resulting models are expected to be capable of providing personalized recommendations aligned with user preferences and desires. More specifically, Generative Pre-trained Transformer 4, a knowledge-based large language model, is fine-tuned and turned into a context-aware recommendation system, adapting its suggestions to user input and to specific contextual factors such as location, time of visit, and other relevant parameters. The effectiveness of the proposed approach is evaluated through user studies, which indicate improved user experience and engagement within the museum environment.
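As a rough illustration of the kind of fine-tuning data the abstract describes, one could encode each museum recommendation example together with its contextual factors (location, time of visit) in the chat-style JSONL format accepted by common LLM fine-tuning pipelines. The field names, prompt wording, and example content below are assumptions for illustration, not taken from the paper:

```python
import json

def make_training_record(context, user_query, recommendation):
    """Build one chat-format fine-tuning example that embeds
    contextual factors (location, time of visit) in the system turn.
    Field names and phrasing are illustrative, not from the paper."""
    system = (
        "You are a museum recommendation assistant. "
        f"Visitor location: {context['location']}. "
        f"Time of visit: {context['time']}."
    )
    return {
        "messages": [
            {"role": "system", "content": system},
            {"role": "user", "content": user_query},
            {"role": "assistant", "content": recommendation},
        ]
    }

# A hypothetical training example; one such JSON object per line
# would form the fine-tuning dataset.
record = make_training_record(
    context={"location": "Gallery of Antiquities", "time": "Saturday 11:00"},
    user_query="I have one hour and I like sculpture. What should I see?",
    recommendation=(
        "Start with the Hellenistic sculpture hall next to your current "
        "gallery; the special exhibition there closes at noon."
    ),
)
line = json.dumps(record)  # one line of the JSONL training file
```

Placing the context in the system turn lets the fine-tuned model condition its suggestions on the visit's time and place, which is what allows time-sensitive items such as special exhibitions to be recommended appropriately.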

Keywords

large language models; recommender systems; GPT-4; context awareness; personalization; cultural heritage; museum

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
