Preprint Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

Domain Adaptation with Sentiment Domain Adapter

Version 1 : Received: 13 April 2024 / Approved: 15 April 2024 / Online: 18 April 2024 (13:36:39 CEST)

How to cite: Carter, A.; Nasir, W.; Parker, E. Domain Adaptation with Sentiment Domain Adapter. Preprints 2024, 2024041031. https://doi.org/10.20944/preprints202404.1031.v1

Abstract

The field of domain adaptation, particularly in cross-domain sentiment classification, leverages labeled data from a source domain alongside unlabeled or sparsely labeled data from a target domain. The objective is to enhance performance in the target domain by mitigating the discrepancies in data distributions. Traditional methods in this area have focused on identifying and differentiating between pivotal sentiment words (shared across domains) and domain-specific sentiment words. In our work, we introduce a novel framework called Sentiment Domain Adapter (SDA), which incorporates a Category Attention Network (CAN) alongside a Convolutional Neural Network (CNN). This approach treats pivotal and domain-specific words as part of a collective category of attributes, which SDA learns to discern automatically, thereby enhancing domain adaptation. Additionally, SDA seeks to provide interpretative insights by learning these category attributes. Our model's optimization targets three main goals: (1) supervised classification accuracy, (2) minimizing the disparity in category feature distribution, and (3) maintaining domain invariance. Evaluations on three benchmark datasets for sentiment analysis affirm that SDA surpasses several established baselines.
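The three-part objective described above can be sketched in code. The following is a minimal, illustrative PyTorch implementation, not the authors' actual model: it assumes a CNN text encoder, a category attention layer with learned category queries (a simplified stand-in for the CAN), a mean-feature distance as a crude proxy for the category-distribution alignment term, and a gradient-reversal adversary for the domain-invariance term. All module names, dimensions, and hyperparameters are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; reverses (and scales) gradients in the
    backward pass, so the encoder is trained to confuse the domain classifier."""
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad):
        return -ctx.lam * grad, None

class SDA(nn.Module):
    def __init__(self, vocab=1000, emb=32, n_filters=16, n_cats=4, n_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb)
        self.conv = nn.Conv1d(emb, n_filters, kernel_size=3, padding=1)
        # Category attention: one learned query per category attends over tokens,
        # letting the model group words (e.g. pivotal vs. domain-specific) automatically.
        self.cat_query = nn.Parameter(torch.randn(n_cats, n_filters))
        self.classifier = nn.Linear(n_cats * n_filters, n_classes)
        self.domain_disc = nn.Linear(n_cats * n_filters, 2)

    def forward(self, tokens, lam=1.0):
        h = self.conv(self.embed(tokens).transpose(1, 2)).transpose(1, 2)  # (B, T, F)
        att = torch.softmax(h @ self.cat_query.t(), dim=1)                 # (B, T, C)
        cat_feats = torch.einsum('btc,btf->bcf', att, h)                   # (B, C, F)
        feats = cat_feats.flatten(1)
        logits = self.classifier(feats)
        dom_logits = self.domain_disc(GradReverse.apply(feats, lam))
        return logits, dom_logits, cat_feats

def sda_loss(model, src_x, src_y, tgt_x):
    logits, src_dom, src_cat = model(src_x)
    _, tgt_dom, tgt_cat = model(tgt_x)
    # (1) supervised classification on labeled source data
    l_cls = F.cross_entropy(logits, src_y)
    # (2) align per-category feature statistics across domains
    #     (mean-feature MSE here; a simplified stand-in for a distribution distance)
    l_cat = F.mse_loss(src_cat.mean(0), tgt_cat.mean(0))
    # (3) domain invariance: adversarial domain loss through gradient reversal
    dom_y = torch.cat([torch.zeros(len(src_x)), torch.ones(len(tgt_x))]).long()
    l_dom = F.cross_entropy(torch.cat([src_dom, tgt_dom]), dom_y)
    return l_cls + l_cat + l_dom
```

Training would minimize this combined loss on batches drawn from both domains; the per-category attention weights `att` are also what would be inspected for the interpretability analysis the abstract mentions.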

Keywords

Domain Adaptation; Sentiment Analysis; Interpretability

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning


