
Dev zhanglu #3649

Merged: 2 commits into LAION-AI:main on Aug 29, 2023

Conversation

zhanglu0704 (Contributor) commented:

Check whether the embedding layer has actually been resized, rather than checking whether the length of the tokenizer equals the size of the embedding layer.
The reason: the model's embedding layer may be resized and then padded up to a multiple of 16 while the tokenizer is not padded, so the equality check fails and freezing of the layer fails with it. In that case the embedding layer does not need to be resized again, and freezing can still be performed.

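To illustrate the distinction, here is a minimal sketch using Hugging Face transformers, with gpt2 as a stand-in model (illustrative only, not the actual model_training code): padding the resized embedding matrix breaks the old equality check, while comparing against the original vocab size detects only whether a resize actually happened.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("gpt2")
tokenizer = AutoTokenizer.from_pretrained("gpt2")
original_vocab_size = model.config.vocab_size  # captured before any resize

# Add a token and resize the embeddings, padding the matrix up to a
# multiple of 16; len(tokenizer) is NOT padded, so the sizes now differ.
tokenizer.add_special_tokens({"pad_token": "<pad>"})
model.resize_token_embeddings(len(tokenizer), pad_to_multiple_of=16)

embedding_size = model.get_input_embeddings().num_embeddings

# Old check: requires tokenizer length to equal the embedding size;
# the padding above makes this False even though freezing is still safe.
old_check = len(tokenizer) == embedding_size

# New check: asks only whether the embedding layer was actually resized
# relative to the model's original vocab size.
new_check = embedding_size != original_vocab_size
```
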
```diff
@@ -42,7 +42,7 @@ tests = [
 ]

 [tool.setuptools]
-py-modules = ["model_training"]
+packages = ["model_training"]
```
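
(For context: in setuptools, py-modules declares single-file top-level modules, while packages declares importable package directories; model_training is a package directory, so packages is the appropriate key.)
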
A collaborator commented:

I am unsure why this change is shown; it was already merged in #3643.

@andreaskoepf (Collaborator) left a comment:

Ok, thanks for the fix. Maybe it could later be changed to rely entirely on the embedding sizes rather than using the vocabulary size as the basis. I will merge it for now as you proposed.

@andreaskoepf merged commit 7e40ee3 into LAION-AI:main on Aug 29, 2023. 1 check passed.