Issues: THUDM/GLM-4
After fine-tuning the GLM-4-9B Chat dialogue model with LoRA, how do I load it in the Xinference platform?
#660
opened Nov 26, 2024 by
xuechaofei
When using GLM-4V-9B, the output differs on every run even though gen_kwargs = {"do_sample": False, "top_p": None, "temperature": None} is set
#654
opened Nov 16, 2024 by
nwym126
The GLM-4-9B-int4 model's output format refuses to follow the prompt no matter how I adjust it; is there a solution?
#636
opened Nov 1, 2024 by
bolt163
When using glm_server.py as the LLM server and passing tools via an agent, streaming mode does not actually stream the output.
#618
opened Oct 30, 2024 by
jurnea
I want to run glm-4v-9b or CogVLM2 with Ollama on a Jetson AGX Orin, which requires converting safetensors to GGUF for import into Ollama; the llama.cpp conversion fails with an error.
#616
opened Oct 29, 2024 by
GengyuXu
Running inference.py on a fine-tuned model always raises this error: OSError: /tiamat-NAS/boyang/GLM4/gjm/1024/checkpoint-12000 does not appear to have a file named THUDM/glm-4-9b-chat--configuration_chatglm.py. Checkout 'https://huggingface.co//tiamat-NAS/boyang/GLM4/gjm/1024/checkpoint-12000/None' for available files.
#612
opened Oct 27, 2024 by
LolerPanda
Using FUDGE raises an error: RuntimeError: Error(s) in loading state_dict for Model:
#604
opened Oct 22, 2024 by
Enermy