Preprint Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

A Cross-modal Retrieval Method for Image, Audio and Video Based on OpenKylin

These authors contributed equally to this work.
Version 1 : Received: 1 November 2023 / Approved: 2 November 2023 / Online: 3 November 2023 (04:04:35 CET)
Version 2 : Received: 3 November 2023 / Approved: 3 November 2023 / Online: 3 November 2023 (14:53:17 CET)

How to cite: Zhang, J.; Xie, X.; Liu, X.D.; Yu, J.; Peng, L.; Wang, W.Z.; Zhang, P.F.; Lan, Y.; Zhang, C. A Cross-modal Retrieval Method for Image, Audio and Video Based on OpenKylin. Preprints 2023, 2023110185. https://doi.org/10.20944/preprints202311.0185.v1

Abstract

The need for cross-modal retrieval (CMR) is growing rapidly with the swift advances in computer vision and natural language processing. The openKylin operating system plays an important role in promoting domestic operating systems, focusing on enhancing system functionality with safe and dependable technologies, and integrating computer vision research into a domestic operating system has many practical implications. To improve the usability of text-based retrieval of images, audio, and video, a multimodal retrieval approach based on openKylin is presented. This article first reviews the relevant domestic and international research on this technology, then describes the design, the implementation process, and the final implementation results for demonstration. Experiments under real operating conditions show that this approach achieves excellent accuracy and performance.
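The abstract does not specify the authors' model, but the core matching step common to cross-modal retrieval systems can be sketched: text queries and media items are mapped into a shared embedding space and ranked by cosine similarity. The toy vectors, file names, and `retrieve` helper below are hypothetical placeholders standing in for real text/image/audio/video encoders, not the method of the paper.

```python
# Hedged sketch of cosine-similarity ranking in a shared embedding space.
# The embeddings here are hand-written toy vectors; a real system would
# produce them with trained encoders for each modality.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def retrieve(query_vec, gallery):
    """Rank (item_id, vector) pairs by similarity to the query, best first."""
    scored = [(item_id, cosine(query_vec, vec)) for item_id, vec in gallery]
    return sorted(scored, key=lambda s: s[1], reverse=True)

# Toy shared-space vectors standing in for image/audio/video embeddings.
gallery = [
    ("image_cat.jpg",  [0.9, 0.1, 0.0]),
    ("audio_purr.wav", [0.8, 0.2, 0.1]),
    ("video_dog.mp4",  [0.1, 0.9, 0.2]),
]
query = [1.0, 0.0, 0.0]  # hypothetical embedding of the text query "cat"
ranking = retrieve(query, gallery)
print(ranking[0][0])  # the closest media item to the text query
```

In practice the ranking step is identical across modalities; only the encoders differ, which is what makes a single text query able to retrieve images, audio, and video from one index.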

Keywords

cross-modal retrieval; openKylin; domestic operating system

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning

