Preprint Article, Version 1 (preserved in Portico). This version is not peer-reviewed.

Language-led Visual Grounding for Human-Computer Interaction

Version 1 : Received: 11 June 2023 / Approved: 13 June 2023 / Online: 13 June 2023 (16:29:23 CEST)

A peer-reviewed article of this Preprint also exists.

Sui, Z.; Zhou, M.; Feng, Z.; Stefanidis, A.; Jiang, N. Language-Led Visual Grounding and Future Possibilities. Electronics 2023, 12, 3142.

Abstract

In recent years, driven by the rapid development of computer vision, the spread of intelligent hardware, and the growing demand for interactive intelligent products, visual grounding (VG) has become a technology that helps machines and humans identify and locate objects, supporting human-computer interaction and intelligent manufacturing. At the same time, human-computer interaction itself continues to evolve, becoming more intelligent, humane, and efficient. This paper proposes a new VG model and designs a language verification module that treats language information as the primary cue, increasing the model's interactivity. In addition, we examine the combination of visual grounding and human-computer interaction, surveying the current research status and development trends of both technologies, their application in practical scenarios, and directions for optimization, in order to provide references and guidance for researchers and to promote the development and application of visual grounding and human-computer interaction technology.
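To make the "language-led" idea concrete, the sketch below shows one plausible (assumed, not the authors') form of a language verification module: visual region features are scored against a pooled sentence embedding, the language-led scores gate the visual features, and a small head regresses the grounded box. All names, dimensions, and the gating design here are illustrative assumptions.

    # Minimal sketch of a hypothetical language-led verification head (PyTorch).
    # Not the paper's implementation; dimensions and design are assumptions.
    import torch
    import torch.nn as nn

    class LanguageVerification(nn.Module):
        """Scores visual regions against the language query and gates the
        visual features with the language-led scores before box prediction."""
        def __init__(self, vis_dim=256, lang_dim=768, hidden=256):
            super().__init__()
            self.vis_proj = nn.Linear(vis_dim, hidden)
            self.lang_proj = nn.Linear(lang_dim, hidden)
            self.box_head = nn.Linear(hidden, 4)  # (cx, cy, w, h)

        def forward(self, vis_feats, lang_feat):
            # vis_feats: (B, N, vis_dim) region/patch features
            # lang_feat: (B, lang_dim) pooled sentence embedding
            v = self.vis_proj(vis_feats)                 # (B, N, hidden)
            q = self.lang_proj(lang_feat).unsqueeze(1)   # (B, 1, hidden)
            # Language-led relevance score for each region (scaled dot product).
            scores = torch.softmax((v * q).sum(-1) / v.size(-1) ** 0.5, dim=-1)
            fused = v * scores.unsqueeze(-1)             # language-gated visual features
            boxes = self.box_head(fused.sum(dim=1))      # (B, 4) grounded box
            return boxes, scores

    # Example usage with random features:
    # model = LanguageVerification()
    # boxes, scores = model(torch.randn(2, 100, 256), torch.randn(2, 768))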

Keywords

visual grounding; human-computer interaction; intelligent systems; user experience; interaction design

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
