Open source models support #143
Comments
There's a closed PR from two days ago that added local model support (#29). Not sure what the status is right now, but I don't see anything in the current code capable of sending requests anywhere other than the OpenAI API, which is disappointing.
On a bit of a tangent, the recently released Vicuna is far superior to Alpaca and other fine-tunes. While Alpaca is trained on 60k synthetic question/answer pairs from GPT-3.5, and GPT4All is trained on 400k GPT-3.5 synthetic question/answer pairs, Vicuna is trained on 90k full conversations (multiple question/answer turns each) between real humans and GPT-4 or GPT-3.5. The dataset comes from ShareGPT and can be accessed here: https://huggingface.co/datasets/anon8231489123/ShareGPT_Vicuna_unfiltered/tree/main/HTML_cleaned_raw_dataset
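As a sketch of what local model support could look like: many local inference servers (for example, servers that mimic OpenAI's `/v1/chat/completions` endpoint) can be targeted just by swapping the base URL the client sends requests to. The function below only *builds* the request rather than sending it; the `localhost:8000` address and the `vicuna-13b` model name are hypothetical placeholders, not anything from this repo.

```python
import json
import os

# Default base URL for the hosted OpenAI API.
DEFAULT_BASE = "https://api.openai.com/v1"


def build_chat_request(messages, model="gpt-3.5-turbo", base_url=None):
    """Return (url, headers, body) for a chat-completion request.

    base_url may point at a local OpenAI-compatible server
    (e.g. http://localhost:8000/v1 -- a hypothetical address).
    Falls back to the OPENAI_API_BASE env var, then the hosted API.
    """
    base = base_url or os.environ.get("OPENAI_API_BASE", DEFAULT_BASE)
    url = f"{base.rstrip('/')}/chat/completions"
    headers = {
        "Content-Type": "application/json",
        # Local servers typically ignore the key, but keep the header shape.
        "Authorization": f"Bearer {os.environ.get('OPENAI_API_KEY', 'not-needed-locally')}",
    }
    body = json.dumps({"model": model, "messages": messages})
    return url, headers, body


# Example: target a local server instead of the hosted OpenAI API.
url, headers, body = build_chat_request(
    [{"role": "user", "content": "Hello"}],
    model="vicuna-13b",
    base_url="http://localhost:8000/v1",
)
```

The point is that nothing in the request payload itself is OpenAI-specific, so making the base URL configurable is the minimal change needed for open-source model support.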
This issue might be a duplicate of #25
+1
+1
Closing as duplicate of #25
Any thoughts on adding support for custom open-source models like LLaMA, Alpaca, Alpaca-LoRA, etc.?