How to define focal loss in train_val.prototxt, two classes with softmax?
I'm using Focal Loss to train a two-class model.
My prototxt looks like this:
layer {
  bottom: "pool5"
  top: "fc2"
  name: "fc2"
  type: "InnerProduct"
  inner_product_param {
    num_output: 2   # two scores per example, one per class
  }
}
layer {
  bottom: "fc2"
  top: "prob"
  name: "prob"
  type: "Softmax"   # turns the two scores into class probabilities
}
layer {
  name: "loss_cls"
  type: "FocalLoss"
  bottom: "prob"    # softmax probabilities
  bottom: "label"   # one label per example
  propagate_down: 1
  propagate_down: 0
  top: "loss_cls"
  include { phase: TRAIN }
  loss_weight: 1
  loss_param { ignore_label: -1 normalize: true }
  focal_loss_param { alpha: 0.5 gamma: 2 }
}
but it fails with:
bottom[0]->count() == bottom[1]->count() (16 vs. 8) SIGMOID_CROSS_ENTROPY_LOSS layer inputs must have the same count.
I think this FocalLoss layer may be using sigmoid internally, so how can I use it with softmax?
Thanks!
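The count check explains what is happening: with a batch size of 8, "prob" holds 8 x 2 = 16 values while "label" holds 8, and the message names SIGMOID_CROSS_ENTROPY_LOSS, so this FocalLoss layer is evidently built on Caffe's sigmoid cross-entropy loss and expects exactly one raw logit per example (it applies the sigmoid itself). Here is a minimal sketch of a config that passes the check, assuming a sigmoid-based FocalLoss and 0/1 labels (the fc1 name is just a choice for this sketch); alpha and gamma are the balancing and focusing parameters from the focal loss paper, FL(p_t) = -alpha * (1 - p_t)^gamma * log(p_t):

layer {
  bottom: "pool5"
  top: "fc1"
  name: "fc1"
  type: "InnerProduct"
  inner_product_param {
    num_output: 1   # one logit per example; the loss applies sigmoid internally
  }
}
layer {
  name: "loss_cls"
  type: "FocalLoss"
  bottom: "fc1"    # raw logits, no Softmax/Sigmoid layer in front
  bottom: "label"  # 0/1 labels, one per example
  propagate_down: 1
  propagate_down: 0
  top: "loss_cls"
  include { phase: TRAIN }
  loss_weight: 1
  loss_param { ignore_label: -1 normalize: true }
  focal_loss_param { alpha: 0.5 gamma: 2 }
}

Two caveats with this setup: at test time you get the positive-class probability by adding a "Sigmoid" layer on top of "fc1", and the stock "Accuracy" layer cannot be used as-is, because it takes an argmax over channels, so with num_output: 1 it always predicts class 0; that alone can produce a misleading number like the ~60% reported below.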
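If you would rather keep the two-way softmax formulation, a sigmoid-based layer simply cannot consume it; you would need a softmax-based focal loss, which some Caffe forks provide as a drop-in analogue of SoftmaxWithLoss (taking raw scores plus labels and applying the softmax internally). The layer type below, SoftmaxWithFocalLoss, is hypothetical; check your fork's caffe.proto and layer registry for the actual name and parameters:

layer {
  name: "loss_cls"
  type: "SoftmaxWithFocalLoss"  # hypothetical name; varies by fork
  bottom: "fc2"    # raw 2-channel scores; drop the separate Softmax layer
  bottom: "label"
  top: "loss_cls"
  include { phase: TRAIN }
  loss_weight: 1
  loss_param { ignore_label: -1 normalize: true }
  focal_loss_param { alpha: 0.5 gamma: 2 }
}

With num_output: 2 preserved, the standard Accuracy layer keeps working, and a plain Softmax layer can still be used at deploy time.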
@wvinzh @lyj0823 @Jacky3213 also have this problem. I tried changing the num_output of the fc layer from 2 to 1. It does run, but the accuracy is very low, just 60%. So how do you solve it? Please help, thanks!