Cannot train the model with multiple GPU #1

Open
chenweize1998 opened this issue Dec 15, 2020 · 0 comments

Hi. I followed the steps listed in the README, but when calculating the loss at training time, the code throws an error at

loss = self.loss.calculate_loss(logits, one_hot_true_types)

When there are 4 GPUs, the shape of `logits` is `(225, 8822)` while `one_hot_true_types` is `(900, 8822)`, i.e. the logits have been split into per-GPU shards (900 / 4 = 225) but the true-types tensor has not been distributed across the GPUs. Could you fix it?
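The mismatch above is consistent with how `nn.DataParallel` works: only tensors passed as arguments to `forward()` are scattered along dim 0 across replicas, so a target tensor built outside `forward()` keeps its full batch size. A minimal sketch of the usual fix, assuming a hypothetical `TypingModel` with the loss computed inside `forward()` so both tensors are scattered together (the class, dimensions, and loss choice here are illustrative, not the repository's actual code):

```python
import torch
import torch.nn as nn

class TypingModel(nn.Module):
    """Hypothetical entity-typing model; loss is computed inside forward()."""

    def __init__(self, hidden_dim: int, num_types: int):
        super().__init__()
        self.classifier = nn.Linear(hidden_dim, num_types)
        self.loss_fn = nn.BCEWithLogitsLoss()

    def forward(self, features, one_hot_true_types):
        # Because both tensors arrive as forward() arguments, DataParallel
        # splits both along dim 0, so each replica's logits and targets
        # have matching batch sizes (e.g. 225 each on 4 GPUs, not 225 vs 900).
        logits = self.classifier(features)
        return self.loss_fn(logits, one_hot_true_types)

model = nn.DataParallel(TypingModel(hidden_dim=16, num_types=8822))
features = torch.randn(900, 16)
targets = torch.zeros(900, 8822)
targets[torch.arange(900), torch.randint(0, 8822, (900,))] = 1.0
loss = model(features, targets)  # no shape mismatch in the loss
```

On a multi-GPU machine, `DataParallel` returns one gathered loss value per replica, which is typically reduced with `.mean()` before `backward()`; the alternative fix is to scatter the targets manually to match each shard of logits.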
