# Support Neighbor Loss for Person Re-Identification
This repository contains the code for the following paper:

Kai Li, Zhengming Ding, Kunpeng Li, Yulun Zhang, and Yun Fu, "Support Neighbor Loss for Person Re-Identification", ACM Multimedia (ACM MM) 2018. [[arXiv]](https://arxiv.org/abs/1808.06030)
## Environment
Python 3 + PyTorch 0.3
## Data preparation
Please refer to [this repo](https://github.com/huanghoujing/person-reid-triplet-loss-baseline) for data preparation, and modify the data locations accordingly in the `train.sh` and `test.sh` files.
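As a rough illustration, the snippet below sketches the kind of path edit this involves. The variable names are placeholders, not necessarily the ones used in the actual scripts, so match them against what `train.sh` and `test.sh` contain.
```
# Placeholder names only -- open train.sh / test.sh and change the
# corresponding entries to where your prepared dataset is stored.
DATASET_DIR=/path/to/transformed/market1501   # output of the data-preparation step
EXP_DIR=/path/to/experiment/output            # where checkpoints and logs are written
```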
## Train
```
sh ./train.sh
```
## Test
### Pretrained models: [Market1501](https://github.com/kailigo/SN_loss_for_reID), [CUHK01](https://github.com/kailigo/SN_loss_for_reID), [CUHK03](https://github.com/kailigo/SN_loss_for_reID)
```
sh ./test.sh
```
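A hedged example of evaluating with one of the pretrained models: download the weight file, place it on disk, and point `test.sh` at it. The path and variable name below are placeholders; the actual argument that `test.sh` expects may differ.
```
# Placeholder path -- set this to the downloaded pretrained weight file
# before running the evaluation script.
MODEL_WEIGHT_FILE=/path/to/pretrained/market1501_model.pth
sh ./test.sh
```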
## Results
### Quantitative Results
Results on Market1501
![Retrieval](/figs/quantitative.png)
<!-- Analysis on the impact of gallery size
![Retrieval](/figs/gal_size_analysis.png) -->
### Visual Results
Pedestrian retrieval results
![Retrieval](/figs/retrieval.png)
Pedestrian feature embedding visualization
![Embedding visualization](/figs/embedding.png)
## Citation
If you find the code helpful in your research or work, please cite the following paper.
```
@inproceedings{li2018support,
  title={Support neighbor loss for person re-identification},
  author={Li, Kai and Ding, Zhengming and Li, Kunpeng and Zhang, Yulun and Fu, Yun},
  booktitle={Proceedings of the 26th ACM International Conference on Multimedia},
  pages={1492--1500},
  year={2018},
  organization={ACM}
}
```
## Acknowledgements
This code is built on [this](https://github.com/huanghoujing/person-reid-triplet-loss-baseline) repository, developed by Houjing Huang.