Preprint Article Version 1 This version is not peer-reviewed

Siamese Neural Network Based Appearance Model for Multi-Target Tracking

Version 1 : Received: 11 May 2019 / Approved: 13 May 2019 / Online: 13 May 2019 (13:32:25 CEST)

How to cite: Ullah, M. Siamese Neural Network Based Appearance Model for Multi-Target Tracking. Preprints 2019, 2019050160 (doi: 10.20944/preprints201905.0160.v1).

Abstract

An appearance model plays a crucial role in multi-target tracking. In traditional approaches, the two steps of appearance modeling, i.e., visual representation and statistical similarity measurement, are modeled separately. Visual representation is achieved either through hand-crafted features or deep features, and statistical similarity is measured through a cross-entropy loss function. A loss function based on cross-entropy (KL-divergence, mutual information) finds closely related probability distributions for the targets. However, if the targets have similar visual representations, it ends up mixing the targets. To tackle this problem, we propose a synergetic appearance model named Single Shot Appearance Model, based on a Siamese neural network. The network is trained with a contrastive loss function to find the similarity between different targets in a single shot. The input to the network is two target patches, and based on their similarity, a contrastive score is output by the network. The proposed model is evaluated with an accumulative dissimilarity metric on three datasets. Quantitatively, promising results are achieved against three baseline methods.
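The contrastive loss described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; the embeddings below are hypothetical stand-ins for the outputs of the two Siamese branches, and the margin value is an assumed hyperparameter. The loss pulls embeddings of the same target together (squared distance) and pushes embeddings of different targets apart, up to a margin:

```python
import numpy as np

def contrastive_loss(emb_a, emb_b, label, margin=1.0):
    """Contrastive loss for one pair of target-patch embeddings.

    label = 1 if both patches show the same target, 0 otherwise.
    Similar pairs incur d^2; dissimilar pairs incur max(0, margin - d)^2,
    so they are only penalized while closer than the margin.
    """
    d = np.linalg.norm(emb_a - emb_b)  # Euclidean distance between embeddings
    return label * d**2 + (1 - label) * max(0.0, margin - d)**2

# Toy embeddings (hypothetical Siamese branch outputs, not real network features).
same_a, same_b = np.array([0.2, 0.1]), np.array([0.25, 0.1])
diff_a, diff_b = np.array([0.2, 0.1]), np.array([0.9, 0.8])

loss_same = contrastive_loss(same_a, same_b, label=1)  # small: pair is already close
loss_diff = contrastive_loss(diff_a, diff_b, label=0)  # nonzero only inside the margin
```

In training, such a loss is minimized over many positive and negative patch pairs, so at test time a single forward pass of two patches yields a similarity score, which is the "single shot" comparison the abstract refers to.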

Subject Areas

Siamese neural network, appearance model, contrastive loss, cross entropy.


