Preprint
Article

This version is not peer-reviewed.

Intelligent Recommendation Systems Using Multi-Scale LoRA Fine-Tuning and Large Language Models

Submitted: 16 December 2025

Posted: 17 December 2025


Abstract
This study proposes a multi-scale LoRA fine-tuning recommendation algorithm built on large language models to address the limitations of traditional recommender systems in semantic understanding, feature redundancy, and parameter transfer efficiency. The method preserves the semantic representation ability of large models while unifying the modeling of global preferences and local interests through multi-scale semantic decomposition, low-rank parameter adaptation, and cross-scale fusion. The model first feeds user-content interaction sequences into a pre-trained language model to obtain context-aware semantic embeddings. A multi-scale semantic pooling structure then extracts hierarchical feature information to capture preference relations at multiple granularities. On this basis, a multi-scale LoRA module performs low-rank decomposition and cross-scale alignment of the weight matrices, substantially reducing the parameter count and improving fine-tuning efficiency. Finally, a cross-scale attention fusion layer dynamically recombines global and local features to optimize recommendation ranking. Systematic experiments on the MovieLens-1M dataset validate the effectiveness of the proposed method across multiple evaluation metrics. The results show that the model outperforms several baseline algorithms in Precision@K, NDCG@K, Recall@K, and Coverage, demonstrating the advantages of the multi-scale structure and LoRA parameterization for recommendation accuracy, diversity, and generalization. Overall, this research offers a feasible route to structural optimization and parameter-efficient fine-tuning of large language models for efficient recommendation tasks.
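The paper itself provides no code, but the three mechanisms the abstract names can be sketched concretely. The NumPy snippet below is an illustrative reconstruction under stated assumptions, not the authors' implementation: the function names, the scale set `(1, 2, 4)`, and all tensor shapes are assumptions; a LoRA update is the standard low-rank form `x @ A @ B` scaled by `alpha / r`, multi-scale pooling is modeled as windowed mean pooling, and cross-scale fusion as a softmax-weighted sum over scale features.

```python
import numpy as np

def lora_delta(x, A, B, alpha):
    """Standard LoRA low-rank update: x @ (A @ B), scaled by alpha / r.
    x: (T, d) activations; A: (d, r); B: (r, d), with rank r << d."""
    r = A.shape[1]
    return (x @ A @ B) * (alpha / r)

def multi_scale_pool(seq, scales=(1, 2, 4)):
    """Windowed mean pooling of a token sequence at several granularities.
    seq: (T, d). Returns (len(scales), d): one summary vector per scale."""
    outs = []
    for s in scales:
        n = seq.shape[0] // s               # number of complete windows
        pooled = seq[: n * s].reshape(n, s, -1).mean(axis=1)  # (n, d)
        outs.append(pooled.mean(axis=0))    # collapse windows to one vector
    return np.stack(outs)

def cross_scale_fusion(scale_feats, query):
    """Softmax attention over the per-scale features, weighted by a query.
    scale_feats: (S, d); query: (d,). Returns a fused (d,) vector."""
    scores = scale_feats @ query
    w = np.exp(scores - scores.max())       # stable softmax
    w /= w.sum()
    return w @ scale_feats

# Toy usage: 8 token embeddings of dimension 16, LoRA rank 4.
rng = np.random.default_rng(0)
seq = rng.standard_normal((8, 16))
A, B = rng.standard_normal((16, 4)), rng.standard_normal((4, 16))
adapted = seq + lora_delta(seq, A, B, alpha=8.0)        # (8, 16)
feats = multi_scale_pool(adapted)                       # (3, 16)
fused = cross_scale_fusion(feats, query=adapted.mean(axis=0))  # (16,)
```

In a real system the LoRA factors would sit inside the frozen language model's attention projections and the fused vector would feed a ranking head; the sketch only shows how the three pieces compose on plain arrays.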
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.

Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.

