I would like to use DSPy for extensive Q&A and information extraction on very long input texts. Since DSPy builds the prompt from the Signature and appends the input fields, the resulting prompt changes for every question. Can DSPy be instructed to place the instruction after the input? That way we could reuse the KV cache in a self-hosted LLM, or save money with OpenAI's prompt caching.
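To illustrate the point (this is a plain-Python sketch, not DSPy's actual prompt-building code, and `build_prompt` is a hypothetical helper): prefix-based caches such as a KV cache or OpenAI's prompt caching can only reuse the part of the prompt that is byte-identical from the start. If the per-question instruction comes first, the shared prefix is tiny; if the long document comes first, the whole document is reusable.

```python
# Hypothetical sketch: shows why putting the long, unchanging document
# BEFORE the per-question instruction maximizes the cacheable prefix.

def build_prompt(document: str, question: str, instruction_first: bool) -> str:
    # Assumed, simplified prompt layout -- not DSPy's real template.
    instruction = f"Answer the question: {question}"
    if instruction_first:
        return instruction + "\n\n" + document  # prefix changes per question
    return document + "\n\n" + instruction      # prefix is the stable document

def shared_prefix_len(a: str, b: str) -> int:
    # Length of the common prefix -- a proxy for what a prefix cache can reuse.
    n = 0
    for x, y in zip(a, b):
        if x != y:
            break
        n += 1
    return n

doc = "A very long input text. " * 50

# Two different questions over the same document, in each layout:
d1 = build_prompt(doc, "What is X?", instruction_first=False)
d2 = build_prompt(doc, "What is Y?", instruction_first=False)
i1 = build_prompt(doc, "What is X?", instruction_first=True)
i2 = build_prompt(doc, "What is Y?", instruction_first=True)

# Document-first: the entire document is part of the shared (cacheable) prefix.
print(shared_prefix_len(d1, d2) >= len(doc))  # True
# Instruction-first: the prompts diverge before the document even starts.
print(shared_prefix_len(i1, i2) < len(doc))   # True
```

With OpenAI's prompt caching in particular, only exact leading prefixes are cached, so a stable document-first layout is what makes repeated questions over the same text cheap.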
Thanks.