
TypeError: sat.model.transformer.BaseTransformer() got multiple values for keyword argument 'parallel_output' #27

Open
4ooooo opened this issue Oct 24, 2024 · 3 comments

Comments

4ooooo commented Oct 24, 2024

When I run inference_design2code.py, I get this error:

For torch.distributed users or loading model parallel models, set environment variables RANK, WORLD_SIZE and LOCAL_RANK.
[rank0]: Traceback (most recent call last):
[rank0]: File "/data3/zhangsn/Project/design2code/CogVLM/inference_design2code.py", line 37, in <module>
[rank0]: model, model_args = FineTuneTestCogAgentModel.from_pretrained(
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/data3/zhangsn/.conda/envs/design2code/lib/python3.11/site-packages/sat/model/base_model.py", line 217, in from_pretrained
[rank0]: return cls.from_pretrained_base(name, args=args, home_path=home_path, url=url, prefix=prefix, build_only=build_only, overwrite_args=overwrite_args, **kwargs)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/data3/zhangsn/.conda/envs/design2code/lib/python3.11/site-packages/sat/model/base_model.py", line 209, in from_pretrained_base
[rank0]: model = get_model(args, cls, **kwargs)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/data3/zhangsn/.conda/envs/design2code/lib/python3.11/site-packages/sat/model/base_model.py", line 420, in get_model
[rank0]: model = model_cls(args, params_dtype=params_dtype, **kwargs)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/data3/zhangsn/Project/design2code/CogVLM/utils/models/cogagent_model.py", line 222, in __init__
[rank0]: super().__init__(args, transformer=transformer, parallel_output=parallel_output, **kw_args)
[rank0]: File "/data3/zhangsn/Project/design2code/CogVLM/utils/models/cogagent_model.py", line 159, in __init__
[rank0]: super().__init__(args, transformer=transformer, parallel_output=parallel_output, **kwargs)
[rank0]: File "/data3/zhangsn/.conda/envs/design2code/lib/python3.11/site-packages/sat/model/official/llama_model.py", line 93, in __init__
[rank0]: super().__init__(args, transformer=transformer, layernorm=layernorm, activation_func=activation_func, init_method_std=0.01, **kwargs)
[rank0]: File "/data3/zhangsn/.conda/envs/design2code/lib/python3.11/site-packages/sat/model/base_model.py", line 93, in __init__
[rank0]: self.transformer = BaseTransformer(
[rank0]: ^^^^^^^^^^^^^^^^
[rank0]: TypeError: sat.model.transformer.BaseTransformer() got multiple values for keyword argument 'parallel_output'
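For context, Python raises this TypeError whenever a keyword argument is passed both explicitly and again inside an unpacked `**kwargs` dict. A minimal sketch of the failure mode, using hypothetical names that mirror the traceback (not the actual sat/CogVLM code):

```python
# Hypothetical stand-in for sat.model.transformer.BaseTransformer.
def base_transformer(args, parallel_output=False, **kwargs):
    return parallel_output

# The same key also arrives via the forwarded **kw_args dict.
kw_args = {"parallel_output": True}

try:
    # parallel_output is passed explicitly AND inside **kw_args -> TypeError
    base_transformer("args", parallel_output=True, **kw_args)
except TypeError as e:
    print(e)  # ... got multiple values for keyword argument 'parallel_output'

# One common fix: ensure the key is passed only once, e.g. by popping
# it out of the dict before forwarding the rest.
parallel_output = kw_args.pop("parallel_output", False)
base_transformer("args", parallel_output=parallel_output, **kw_args)
```

This matches the shape of the traceback: `cogagent_model.py` forwards `parallel_output=parallel_output` up the `super().__init__` chain while the same key is apparently still present in the forwarded `**kwargs`.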


xjywhu commented Dec 16, 2024

I have the same error. Have you solved it?


4ooooo commented Dec 16, 2024

I gave up on this project... 🤦


xjywhu commented Dec 16, 2024

I just solved the issue. The problem was caused by an old version of CogVLM: download the latest utils directory from the CogVLM repository (https://github.com/THUDM/CogVLM/) and replace yours, and the error goes away.
