It would be nice to directly associate a model with a given prompt. Since Ollama can switch models at runtime, it would be very convenient to automate this instead of having to manually pick a model each time.
The custom prompt specification could include more information:
System prompt
Main prompt
Model to use
Inference parameters to use
Few-shot examples (optional)
This would make the custom prompts more precise. For example, few-shot examples help a lot with small models, and the ideal temperature changes between tasks. Having different inference parameters and system prompts for each task would be really nice.
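As a rough sketch, a YAML prompt definition along these lines might look like the following (all field names, the model tag, and the values are purely illustrative, not from any existing spec):

```yaml
# Hypothetical custom prompt definition covering the fields listed above.
name: summarize
model: mistral:7b-instruct     # model to switch to at runtime
system: You are a concise technical summarizer.
prompt: |
  Summarize the following text in three bullet points:
  {input}
inference:                     # per-task inference parameters
  temperature: 0.2
  top_p: 0.9
shots:                         # optional few-shot examples for small models
  - user: "Summarize: The quick brown fox jumps over the lazy dog."
    assistant: "- A fox jumps over a dog"
```

The frontend would then only need to load the file, select the named prompt, and apply the model and parameters automatically.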
I have made a TypeScript spec and library that implements YAML-defined prompts with all these features, if it can help: check the doc and code.