
Hello, I want to remove one embedding from fusion_embedding but ran into a problem. How can I solve it? Many thanks. #54

Open
ZZZdb opened this issue Apr 6, 2022 · 0 comments

ZZZdb commented Apr 6, 2022

```
File "/root/ChineseBert/ChineseBert-main/models/fusion_embedding.py", line 72, in forward
    inputs_embeds = self.map_fc(concat_embeddings)
File "/root/miniconda3/lib/python3.8/site-packages/torch/nn/modules/module.py", line 727, in _call_impl
    result = self.forward(*input, **kwargs)
File "/root/miniconda3/lib/python3.8/site-packages/torch/nn/modules/linear.py", line 93, in forward
    return F.linear(input, self.weight, self.bias)
File "/root/miniconda3/lib/python3.8/site-packages/torch/nn/functional.py", line 1692, in linear
    output = input.matmul(weight.t())
RuntimeError: mat1 dim 1 must match mat2 dim 0
```
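The error suggests that `map_fc` is an `nn.Linear` whose `in_features` was sized for the concatenation of all of the original embeddings, so after one embedding is removed, `concat_embeddings` is narrower than the layer's weight matrix expects. A minimal sketch of the mismatch and a likely fix, assuming (hypothetically) three embeddings of widths 768, 1728, and 768 being concatenated before `map_fc`:

```python
import torch
import torch.nn as nn

# Assumed widths for illustration only; the real values live in
# ChineseBert's config and fusion_embedding.py.
hidden, glyph_dim = 768, 1728

# Original layer: sized for the full concatenation of three embeddings.
fc_all = nn.Linear(hidden + glyph_dim + hidden, hidden)

word = torch.randn(2, 5, hidden)
pinyin = torch.randn(2, 5, hidden)

# After dropping one embedding, the concatenated tensor is narrower,
# so the original fc raises the "mat1 dim 1 must match mat2 dim 0" error.
concat_two = torch.cat([word, pinyin], dim=-1)
try:
    fc_all(concat_two)
except RuntimeError as e:
    print("shape mismatch:", e)

# Likely fix: rebuild map_fc so in_features matches the new concat width.
fc_two = nn.Linear(hidden + hidden, hidden)
out = fc_two(concat_two)
print(out.shape)  # torch.Size([2, 5, 768])
```

Note that rebuilding `map_fc` with a smaller `in_features` means its pretrained weights no longer fit; the layer would need to be re-initialized (or the relevant weight columns sliced out) before fine-tuning.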
