Automatic exit while launching a model worker #31
I got an error. I hope the authors can advise on how to solve this problem:
2024-09-27 13:20:07 | INFO | model_worker | args: Namespace(host='0.0.0.0', port=40000, worker_address='http://localhost:40000', controller_address='http://localhost:10000', model_path='Llama-3.1-8B-Omni', model_base=None, model_name='Llama-3.1-8B-Omni', device='cuda', limit_model_concurrency=5, stream_interval=1, no_register=False, load_8bit=False, load_4bit=False, use_flash_attn=False, input_type='mel', mel_size=128, s2s=True, is_lora=False)
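To help readers compare their own launch flags against the values in the log above, here is a minimal sketch that reconstructs the logged Namespace with Python's argparse. The flag spellings and defaults are assumptions inferred from the printed Namespace, not taken from the repository's actual worker code.

```python
import argparse

def build_parser():
    # Hypothetical parser mirroring the Namespace fields seen in the log;
    # argparse turns "--worker-address" into args.worker_address, etc.
    parser = argparse.ArgumentParser(description="model worker args (sketch)")
    parser.add_argument("--host", default="0.0.0.0")
    parser.add_argument("--port", type=int, default=40000)
    parser.add_argument("--worker-address", default="http://localhost:40000")
    parser.add_argument("--controller-address", default="http://localhost:10000")
    parser.add_argument("--model-path", default=None)
    parser.add_argument("--model-name", default=None)
    parser.add_argument("--device", default="cuda")
    parser.add_argument("--limit-model-concurrency", type=int, default=5)
    parser.add_argument("--stream-interval", type=int, default=1)
    parser.add_argument("--input-type", default="mel")
    parser.add_argument("--mel-size", type=int, default=128)
    parser.add_argument("--s2s", action="store_true")  # speech-to-speech flag
    return parser

# Reproduce the invocation implied by the logged Namespace.
args = build_parser().parse_args(
    ["--model-path", "Llama-3.1-8B-Omni",
     "--model-name", "Llama-3.1-8B-Omni",
     "--s2s"]
)
print(args.model_path, args.port, args.s2s)
```

If a flag in your own command line does not appear in this namespace, the worker's real parser may reject it or silently ignore it, which is worth checking before anything else.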
Same here. I followed the setup in the readme and
I am following the steps in the readme.md document to install the environment on my Windows computer. When I execute , the program automatically exits. What could be the reason? Could you help me take a look?
Here is my error log: