Some providers like TeachAnything or Pizzagpt aren't returning responses anymore #2392

unical1988 opened this issue Nov 20, 2024 · 21 comments

@unical1988

I first tested TeachAnything with the text-ada-001 model and it worked two months ago, but now it doesn't; the same goes for Pizzagpt.

Many other providers return errors.

Is this accounted for / documented somewhere? How can these be fixed so that g4f can be used seamlessly?

unical1988 added the bug label Nov 20, 2024
@TheFirstNoob

@unical1988 Hi, please provide more information.
TeachAnything uses the gpt-3.5-turbo-0125 and llama-3.1-70b models.
PizzaGPT uses the gpt-4o-mini model.

Old models are no longer used by most of the main providers and may be removed on their sites later. This is not a g4f bug.
Set the correct model and try again to check the answer.
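
For example, a minimal sketch of what "set the correct model" can look like in code (assuming the Client accepts a provider argument, as in the later examples in this thread; Pizzagpt with gpt-4o-mini is just an illustration):

from g4f.client import Client
from g4f.Provider import Pizzagpt

# Pin a single provider and one of the models it actually supports
client = Client(provider=Pizzagpt)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Say this is a test"}]
)

print(response.choices[0].message.content)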

@erstrik

erstrik commented Nov 21, 2024

Hi, I confirm I can't use PizzaGPT anymore.

Prompt: "test" (1 word, 4 chars, 1 token)
Response from Pizzagpt with gpt-4o-mini: empty (0 words, 0 chars, 0 tokens)

@unical1988
Author

@TheFirstNoob can you provide an exhaustive list of associations between the providers and the models they use?

@TheFirstNoob

@unical1988 You can use this full list: https://github.com/xtekky/gpt4free/blob/main/g4f/models.py#L83
Or open the provider's file under g4f/Provider/ and see directly which models it uses.
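
If it helps, a small sketch for printing that mapping programmatically (this assumes ModelUtils.convert in g4f/models.py maps model-name strings to entries with a best_provider attribute; check the linked file for your installed version):

import g4f.models

# Print each registered model name and the provider(s) configured as its best_provider
for name, model in g4f.models.ModelUtils.convert.items():
    print(f"{name} -> {model.best_provider}")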

@unical1988
Author

@TheFirstNoob How do I use these for text completion? I used:

response = g4f.Completion.create(
    model='claude-3-sonnet',
    prompt=str(question),
    provider=Liaobots
)

and got g4f.errors.ModelNotAllowedError: Can't use claude-3-sonnet with Completion.create()

Does the list of providers (obtained via g4f.Provider.providers) differ from one version to another?

@TheFirstNoob

TheFirstNoob commented Nov 21, 2024

@unical1988 Try claude-3.5-sonnet with the Blackbox provider. The Liaobots provider is not stable (rate limits).

@unical1988
Author

unical1988 commented Nov 22, 2024

@TheFirstNoob Which method should I use to run claude-3.5-sonnet with the Blackbox provider?

The error I got is: Can't use claude-3.5-sonnet with Completion.create()

Should I still use g4f.Completion.create()?

@TheFirstNoob

TheFirstNoob commented Nov 22, 2024

@unical1988 Use response = client.chat.completions.create:

from g4f.client import Client

client = Client()

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {
            "role": "user",
            "content": "Say this is a test"
        }
    ]
    # Add any other necessary parameters
)

print(response.choices[0].message.content)

https://github.com/xtekky/gpt4free/blob/main/docs/client.md

@unical1988
Author

@TheFirstNoob I don't want chat, I just want text completion. Which method should I use?

@TheFirstNoob

The example code above is text completion.

https://github.com/xtekky/gpt4free/blob/main/docs/client.md#text-completions

@unical1988
Author

@TheFirstNoob It doesn't show all the necessary parameters/attributes; how do I specify the provider I want to use?

@unical1988
Author

@TheFirstNoob No, I can't find the answer in the link you sent. I just need one example of text completion with the model and provider parameters specified. Thanks.

@unical1988
Author

@TheFirstNoob Also, how do I exclude certain providers when using RetryProvider?

@TheFirstNoob

The example above uses the providers and models exactly as you require. You can put them in arrays and RetryProvider will only work with the providers you specify. It will not work with those you do NOT specify.

Example array:

        "gpt-3.5-turbo": [Airforce],
        "gpt-4": [Mhystical],
        "gpt-4-turbo": [Airforce],
        "gpt-4o-mini": [Pizzagpt, Airforce, ChatGptEs, DDG],
        "gpt-4o": [Blackbox, ChatGptEs, Airforce],
        "claude-3-haiku": [DDG, Airforce],
        "claude-3.5-sonnet": [Blackbox, Airforce],
        "blackbox": [Blackbox],
        "blackbox-pro": [Blackbox],
        "gemini-flash": [Blackbox, Airforce],
        "gemini-pro": [Blackbox, Airforce],
        "gemma-2b-27b": [Airforce],
        "command-r-plus": [HuggingChat],
        "llama-3.1-70b": [HuggingChat, Blackbox, TeachAnything, Free2GPT, Airforce, DDG],
        "llama-3.1-405b": [Blackbox, Airforce],
        "llama-3.2-11b": [HuggingChat, HuggingFace],
        "llama-3.2-90b": [Airforce],
        "nemotron-70b": [HuggingChat],
        "sonar-chat": [PerplexityLabs],
        "lfm-40b": [PerplexityLabs],
        "qwen-2-72b": [HuggingChat],
        "mixtral-8x7b": [HuggingChat, DDG],
        "mixtral-8x22b": [Airforce],
        "yi-34b": [Airforce],
        "phi-3.5-mini": [HuggingChat],

Example full base code:

from g4f.client import Client
from g4f.Provider import RetryProvider, Airforce, Mhystical, Pizzagpt, Blackbox, HuggingChat, TeachAnything, Free2GPT, DDG, PerplexityLabs, ChatGptEs, HuggingFace
import g4f.debug

g4f.debug.logging = True
g4f.debug.version_check = False

# List of available models and their providers
models = {
   "gpt-3.5-turbo": [Airforce],
   "gpt-4": [Mhystical],
   "gpt-4-turbo": [Airforce],
   "gpt-4o-mini": [Pizzagpt, Airforce, ChatGptEs, DDG],
   "gpt-4o": [Blackbox, ChatGptEs, Airforce],
   "claude-3-haiku": [DDG, Airforce],
   "claude-3.5-sonnet": [Blackbox, Airforce],
   "blackbox": [Blackbox],
   "blackbox-pro": [Blackbox],
   "gemini-flash": [Blackbox, Airforce],
   "gemini-pro": [Blackbox, Airforce],
   "gemma-2b-27b": [Airforce],
   "command-r-plus": [HuggingChat],
   "llama-3.1-70b": [HuggingChat, Blackbox, TeachAnything, Free2GPT, Airforce, DDG],
   "llama-3.1-405b": [Blackbox, Airforce],
   "llama-3.2-11b": [HuggingChat, HuggingFace],
   "llama-3.2-90b": [Airforce],
   "nemotron-70b": [HuggingChat],
   "sonar-chat": [PerplexityLabs],
   "lfm-40b": [PerplexityLabs],
   "qwen-2-72b": [HuggingChat],
   "mixtral-8x7b": [HuggingChat, DDG],
   "mixtral-8x22b": [Airforce],
   "yi-34b": [Airforce],
   "phi-3.5-mini": [HuggingChat],
}

# Model selection
print("Available models:")
for i, model in enumerate(models.keys()):
   print(f"{i + 1}. {model}")

model_choice = int(input("Select the model number: ")) - 1
selected_model = list(models.keys())[model_choice]

# Input text for prompt
user_input = input("Enter your text: ")

client = Client(
   provider=RetryProvider(models[selected_model], shuffle=False)
)

response = client.chat.completions.create(
   model=selected_model,
   messages=[
       {
           "role": "user",
           "content": user_input
       }
   ]
)

print(response.choices[0].message.content)

@unical1988
Author

How do I make RetryProvider work with all possible providers? And how do I exclude certain ones from that set?

@TheFirstNoob

"gpt-4o-mini": [Pizzagpt, Airforce, ChatGptEs, DDG, remove or add provider here],
provider is not list here never called

@unical1988
Author

unical1988 commented Nov 22, 2024

I don't want to specify the model name. I just want the model to try all possible providers except certain ones that I specifically remove. How can I do that?

@unical1988
Author

Is

"gpt-3.5-turbo": [Airforce],
    "gpt-4": [Mhystical],
    "gpt-4-turbo": [Airforce],
    "gpt-4o-mini": [Pizzagpt, Airforce, ChatGptEs, DDG],
    "gpt-4o": [Blackbox, ChatGptEs, Airforce],
    "claude-3-haiku": [DDG, Airforce],
    "claude-3.5-sonnet": [Blackbox, Airforce],
    "blackbox": [Blackbox],
    "blackbox-pro": [Blackbox],
    "gemini-flash": [Blackbox, Airforce],
    "gemini-pro": [Blackbox, Airforce],
    "gemma-2b-27b": [Airforce],
    "command-r-plus": [HuggingChat],
    "llama-3.1-70b": [HuggingChat, Blackbox, TeachAnything, Free2GPT, Airforce, DDG],
    "llama-3.1-405b": [Blackbox, Airforce],
    "llama-3.2-11b": [HuggingChat, HuggingFace],
    "llama-3.2-90b": [Airforce],
    "nemotron-70b": [HuggingChat],
    "sonar-chat": [PerplexityLabs],
    "lfm-40b": [PerplexityLabs],
    "qwen-2-72b": [HuggingChat],
    "mixtral-8x7b": [HuggingChat, DDG],
    "mixtral-8x22b": [Airforce],
    "yi-34b": [Airforce],
    "phi-3.5-mini": [HuggingChat],

an exhaustive list? Is there any other model/provider I can use with chat.completions? Thanks.

@TheFirstNoob

You can build a provider list that excludes the unwanted providers and pass it to the IterListProvider class:
https://github.com/xtekky/gpt4free/blob/main/g4f/providers/retry_provider.py#L11
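
A rough sketch of that idea, under the assumption that IterListProvider takes a list of provider classes plus a shuffle flag (as in the linked file); the candidate list below is just the gpt-4o-mini providers from the table above:

from g4f.client import Client
from g4f.Provider import IterListProvider, Pizzagpt, Airforce, ChatGptEs, DDG

# Providers you would normally consider for the model
candidates = [Pizzagpt, Airforce, ChatGptEs, DDG]

# Providers you want to skip
excluded = {Pizzagpt, Airforce}

allowed = [p for p in candidates if p not in excluded]

client = Client(provider=IterListProvider(allowed, shuffle=False))

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Say this is a test"}]
)

print(response.choices[0].message.content)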

But I still recommend the option above, because it gives you more control over the required models and providers.

@TheFirstNoob

I have already written about models and providers above, pointing you to models.py: https://github.com/xtekky/gpt4free/blob/main/g4f/models.py#L83
It works on exactly the same principle I suggested: a model is selected, and the library goes through the list of providers listed for it there. That is why it is better to build your own list for your needs and requirements.

The file always indicates the models and the providers available for them.
If you specify a model and select the wrong provider, you will get an error, as expected.

Please create the list you need based on the example above, and don't forget to update it when the library is updated.
