I trained a network with the triplet loss on a small dataset with 6 different identities and about 1000 total images. I noticed that the loss goes to 0 after 100 steps. However, the performance is not good: the embeddings seem to be just noise. I used a small learning rate (1e-4) and a simple network with 6 convolutions (conv+BN+relu) and 3 pooling layers. This happens with both batch hard and batch all. What could be the problem?
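One way to check whether the embeddings really are "just noise" is to compare intra-class and inter-class distances on a held-out set. This is a minimal NumPy sketch (the function name and shapes are assumptions, not part of any library): if the mean distance between same-identity embeddings is about the same as between different-identity embeddings, the embedding carries no identity information even though the training loss reached 0.

```python
import numpy as np

def embedding_separation(embeddings, labels):
    """Return (mean intra-class distance, mean inter-class distance).

    embeddings: float array of shape (N, D)
    labels: int array of shape (N,) with the identity of each embedding
    """
    # All pairwise Euclidean distances, shape (N, N).
    diffs = embeddings[:, None, :] - embeddings[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=-1))

    # Boolean mask of same-identity pairs, excluding the diagonal.
    same = labels[:, None] == labels[None, :]
    np.fill_diagonal(same, False)

    intra = dists[same].mean()    # same identity
    inter = dists[~same].mean()   # different identities
    return intra, inter
```

A healthy embedding should give `inter` clearly larger than `intra`; values that are nearly equal suggest the network has not learned anything useful about identity.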
What are you measuring when you say "performance is not good"?
If you are looking at embeddings on a separate test set, it is possible that the model overfits with only 6 identities. Maybe you could add regularization or increase the number of identities / images?
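The regularization suggestion can be sketched as adding an L2 penalty on the network weights to the triplet loss. This is a hedged, framework-agnostic illustration in plain NumPy, not the repo's actual training code; `weights` stands in for the network parameters and `lam` is a hypothetical regularization strength to tune on a validation split.

```python
import numpy as np

def regularized_loss(triplet_loss, weights, lam=1e-4):
    """Add an L2 weight penalty to an already-computed triplet loss.

    triplet_loss: scalar loss from batch-hard or batch-all mining
    weights: iterable of weight arrays (one per layer)
    lam: regularization strength (hypothetical default)
    """
    l2 = sum(float((w ** 2).sum()) for w in weights)
    return triplet_loss + lam * l2
```

In practice most frameworks expose this directly, e.g. as a weight-decay option on the optimizer, which is equivalent to this penalty for plain SGD.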