I just solved the issue. The problem was caused by an old version of CogVLM: download the latest `utils` directory from the CogVLM repository (https://github.com/THUDM/CogVLM/) and replace yours, and the problem will be solved.
When I run `inference_design2code.py`, I get the following error:
```
For torch.distributed users or loading model parallel models, set environment variables RANK, WORLD_SIZE and LOCAL_RANK.
[rank0]: Traceback (most recent call last):
[rank0]:   File "/data3/zhangsn/Project/design2code/CogVLM/inference_design2code.py", line 37, in <module>
[rank0]:     model, model_args = FineTuneTestCogAgentModel.from_pretrained(
[rank0]:                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "/data3/zhangsn/.conda/envs/design2code/lib/python3.11/site-packages/sat/model/base_model.py", line 217, in from_pretrained
[rank0]:     return cls.from_pretrained_base(name, args=args, home_path=home_path, url=url, prefix=prefix, build_only=build_only, overwrite_args=overwrite_args, **kwargs)
[rank0]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "/data3/zhangsn/.conda/envs/design2code/lib/python3.11/site-packages/sat/model/base_model.py", line 209, in from_pretrained_base
[rank0]:     model = get_model(args, cls, **kwargs)
[rank0]:             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "/data3/zhangsn/.conda/envs/design2code/lib/python3.11/site-packages/sat/model/base_model.py", line 420, in get_model
[rank0]:     model = model_cls(args, params_dtype=params_dtype, **kwargs)
[rank0]:             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "/data3/zhangsn/Project/design2code/CogVLM/utils/models/cogagent_model.py", line 222, in __init__
[rank0]:     super().__init__(args, transformer=transformer, parallel_output=parallel_output, **kw_args)
[rank0]:   File "/data3/zhangsn/Project/design2code/CogVLM/utils/models/cogagent_model.py", line 159, in __init__
[rank0]:     super().__init__(args, transformer=transformer, parallel_output=parallel_output, **kwargs)
[rank0]:   File "/data3/zhangsn/.conda/envs/design2code/lib/python3.11/site-packages/sat/model/official/llama_model.py", line 93, in __init__
[rank0]:     super().__init__(args, transformer=transformer, layernorm=layernorm, activation_func=activation_func, init_method_std=0.01, **kwargs)
[rank0]:   File "/data3/zhangsn/.conda/envs/design2code/lib/python3.11/site-packages/sat/model/base_model.py", line 93, in __init__
[rank0]:     self.transformer = BaseTransformer(
[rank0]:                        ^^^^^^^^^^^^^^^^
[rank0]: TypeError: sat.model.transformer.BaseTransformer() got multiple values for keyword argument 'parallel_output'
```
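For context, the final `TypeError` is Python's standard failure when a keyword argument reaches a callable both explicitly and through `**kwargs`: the stale `cogagent_model.py` forwards `parallel_output` explicitly while it is also still inside the kwargs dict being expanded. A minimal sketch of the mechanism (hypothetical function names, not the actual `sat` code):

```python
def build_transformer(args, parallel_output=True, **kwargs):
    """Stand-in for sat's BaseTransformer constructor."""
    return parallel_output

def subclass_init(args, **kw_args):
    # Bug pattern: 'parallel_output' is passed explicitly, but the caller's
    # kw_args still contains it too, so it arrives twice and Python raises
    # "got multiple values for keyword argument 'parallel_output'".
    return build_transformer(args, parallel_output=False, **kw_args)

try:
    subclass_init(None, parallel_output=True)
except TypeError as e:
    print(e)
```

Replacing the outdated `utils` directory fixes this because the newer model code pops `parallel_output` out of the kwargs before forwarding it, so the base constructor receives it only once.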