I use functions, in addition to a custom prompt and a history of messages, in my chatCompletion calls.
I need to calculate the number of tokens in my calls to avoid hitting the model's token limit.
I know how to calculate the number of tokens for my prompt, but how do I do it for the functions? Should I just tokenize the JSON describing the functions?