Preprint Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

Self-Attention and Adversary Guided Hashing Network for Cross-Modal Retrieval

Version 1 : Received: 16 September 2020 / Approved: 18 September 2020 / Online: 18 September 2020 (04:16:58 CEST)

A peer-reviewed article of this Preprint also exists.

Chen, S.; Wu, S.; Wang, L.; Yu, Z. Self-Attention and Adversary Learning Deep Hashing Network for Cross-Modal Retrieval. Computers & Electrical Engineering 2021, 93, 107262, doi:10.1016/j.compeleceng.2021.107262.

Abstract

Recently, deep cross-modal hashing networks have received increasing interest due to their superior query efficiency and low storage cost. However, most existing methods pay little attention to the hash-representation learning part, which means the semantic information of the data cannot be fully exploited. Furthermore, they may neglect the ranking relevance and consistency of hash codes. To solve these problems, we propose a Self-Attention and Adversary Guided Hashing Network (SAAGHN). Specifically, it employs a self-attention mechanism in the hash-representation learning part to extract rich semantic relevance information. Meanwhile, to keep hash codes invariant across modalities, adversarial learning is adopted in the hash-code learning part. In addition, to generate hash codes with higher ranking quality and to avoid falling into local minima early, a new batch semi-hard cosine triplet loss and a cosine quantization loss are proposed. Extensive experiments on two benchmark datasets show that SAAGHN outperforms other baselines and achieves state-of-the-art performance.
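To make the two proposed objectives concrete, the sketch below gives one plausible NumPy formulation of a batch semi-hard cosine triplet loss and a cosine quantization loss. This is an illustrative reconstruction only, not the paper's exact implementation: the margin value, the semi-hard band `sim(a,p) - margin < sim(a,n) < sim(a,p)`, and the use of `sign(·)` as the binary quantization target are all assumptions made for the example.

```python
import numpy as np

def cosine_sim(a, b):
    # Pairwise cosine-similarity matrix between two sets of row vectors.
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return a @ b.T

def semi_hard_cosine_triplet_loss(codes, labels, margin=0.2):
    # Hedged sketch of a batch semi-hard triplet loss in cosine space.
    # For each (anchor, positive) pair, only "semi-hard" negatives are
    # used: negatives less similar than the positive, but within the
    # margin band, i.e. sim(a,p) - margin < sim(a,n) < sim(a,p).
    sims = cosine_sim(codes, codes)
    n = len(labels)
    losses = []
    for a in range(n):
        for p in range(n):
            if p == a or labels[p] != labels[a]:
                continue  # need a same-class positive distinct from the anchor
            sp = sims[a, p]
            for q in range(n):
                if labels[q] == labels[a]:
                    continue  # negatives come from a different class
                sn = sims[a, q]
                if sp - margin < sn < sp:  # semi-hard band (assumed definition)
                    losses.append(margin - sp + sn)  # hinge is active here
    return float(np.mean(losses)) if losses else 0.0

def cosine_quantization_loss(codes):
    # Pull continuous hash representations toward binary targets by
    # maximizing cosine similarity with sign(codes); zero iff each code
    # already points along its own sign vector.
    targets = np.sign(codes)
    cos = np.sum(codes * targets, axis=1) / (
        np.linalg.norm(codes, axis=1) * np.linalg.norm(targets, axis=1))
    return float(np.mean(1.0 - cos))
```

For well-separated classes the triplet loss vanishes (no negative falls inside the semi-hard band), and codes whose entries all share magnitude 1 incur zero quantization loss, which matches the intended behavior of penalizing only codes far from the binary vertices.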

Keywords

adversarial learning; deep cross-modal hashing; self-attention mechanism

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning

