
About performance on market1501 for global learning and mutual learning #69

vincentman opened this issue on Feb 13, 2019

I tried training with global learning and mutual learning on Market-1501 and got an mAP of 68.82% for model 1 and 68.56% for model 2. Both are worse than your reported results (around 75% mAP).

[Snapshot from AlignedReID-Scores.xlsx]

What parameters did you use for training? My training script is as follows:
python3 script/experiment/train_ml.py \
-d "((0,), (1,))" \
-r 1 \
--num_models 2 \
--dataset market1501 \
--ids_per_batch 32 \
--ims_per_id 4 \
--normalize_feature false \
-gm 0.3 \
-glw 1 \
-llw 0 \
-idlw 0 \
-pmlw 0 \
-gdmlw 1 \
-ldmlw 0 \
--base_lr 2e-4 \
--lr_decay_type exp \
--exp_decay_at_epoch 151 \
--total_epochs 300
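
For reference, here is a minimal sketch of how I understand the loss-weight flags above to enter each model's training objective; the function and variable names are illustrative and assumed, not the repository's actual code.

# Minimal sketch, assuming each *lw flag is a linear weight on the
# corresponding loss term (names here are illustrative, not from train_ml.py).
def combined_loss(g_loss, l_loss, id_loss, pm_loss, gdm_loss, ldm_loss,
                  glw=1.0, llw=0.0, idlw=0.0, pmlw=0.0, gdmlw=1.0, ldmlw=0.0):
    # With the flags above, only the global triplet loss (-glw 1) and the
    # global-distance mutual-learning loss (-gdmlw 1) contribute.
    return (glw * g_loss + llw * l_loss + idlw * id_loss
            + pmlw * pm_loss + gdmlw * gdm_loss + ldmlw * ldm_loss)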
