How to evaluate the toxicity task on a local hf-llama2-7B? #19
Thanks for your interest. To specify a local HF model, please use the `++model=hf/<path-to-your-model>` override.
Thanks for your reply, but I have already used my local hf-llama2-7b model's location.
Please try this:

```bash
#!/bin/bash
dt-run +toxicity=realtoxicityprompts-toxic \
    ++model=hf//../llama/llama-2-7b-hf \
    ++toxicity.n=25 \
    ++toxicity.template=1
```
Thanks, but I've already tried this and I got this bug: … It seems worse than the former one. How could I fix it? FYI, my …
I think the main problem is that there is no conversation template resolved for your model path, so conv_template ends up as None.
Could you try using an absolute path?
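For illustration, an absolute-path variant of the command might look like the sketch below; the /mnt/disk1/yg/llama location is only an assumption based on the paths in the traceback further down, so substitute wherever llama-2-7b-hf actually lives.

```bash
#!/bin/bash
# Sketch only: same dt-run invocation, but with an assumed absolute path to
# the local checkpoint instead of a relative one.
dt-run +toxicity=realtoxicityprompts-toxic \
    ++model=hf//mnt/disk1/yg/llama/llama-2-7b-hf \
    ++toxicity.n=25 \
    ++toxicity.template=1
```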
Here is my code:

```bash
#!/bin/bash
dt-run +toxicity=realtoxicityprompts-toxic \
    ++model=hf/../llama/llama-2-7b-hf \
    ++toxicity.n=25 \
    ++toxicity.template=1
```

and the bug is:
```
Traceback (most recent call last):
  File "/mnt/disk1/yg/DecodingTrust/src/dt/main.py", line 42, in main
    perspective_module.main(perspective_args(**perspective_config))
  File "/mnt/disk1/yg/DecodingTrust/src/dt/perspectives/toxicity/text_generation_hydra.py", line 29, in main
    generator = Chat.from_helm(OPTS, conv_template=args.conv_template, cache=dirname, api_key=args.key)
  File "/mnt/disk1/yg/DecodingTrust/src/dt/chat.py", line 41, in from_helm
    return HFChat(model_name.replace("hf/", "").rstrip("/"), **kwargs)
  File "/mnt/disk1/yg/DecodingTrust/src/dt/chat.py", line 364, in __init__
    self.conv_template = get_conv_template(conv_template)
  File "/mnt/disk1/yg/DecodingTrust/src/dt/conversation.py", line 284, in get_conv_template
    return conv_templates[name].copy()
KeyError: None
```
How can I fix it?
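From the traceback, conv_template reaches get_conv_template as None, and None is not a key in the conv_templates registry, hence the KeyError: None. If the hydra config exposes the conversation template as an overridable field (an assumption; the ++conv_template key and the llama-2 template name below are not confirmed against the repo's configs or src/dt/conversation.py), an explicit override along these lines might avoid the None lookup:

```bash
#!/bin/bash
# Sketch only: explicitly pass a conversation template so conv_template is
# not None. Both the ++conv_template key and the "llama-2" template name are
# assumptions; check the hydra configs and src/dt/conversation.py for the
# actual names.
dt-run +toxicity=realtoxicityprompts-toxic \
    ++model=hf//mnt/disk1/yg/llama/llama-2-7b-hf \
    ++conv_template=llama-2 \
    ++toxicity.n=25 \
    ++toxicity.template=1
```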