Discovered while poking around the documentation: the .configurable_alternatives() method may be useful to let the user choose between different LLMs based on preference/configuration. It could replace the current approach in the codebase and might be cleaner. Code example from the weblangchain repo:
import os

# Imports assumed from the LangChain version used in weblangchain at the time.
from langchain.chat_models import ChatAnthropic, ChatOpenAI, ChatVertexAI
from langchain.schema.runnable import ConfigurableField

# has_google_creds is defined earlier in the weblangchain repo.
if has_google_creds:
    llm = ChatOpenAI(
        model="gpt-3.5-turbo-16k",
        # model="gpt-4",
        streaming=True,
        temperature=0.1,
    ).configurable_alternatives(
        # This gives this field an id.
        # When configuring the end runnable, we can then use this id to configure this field.
        ConfigurableField(id="llm"),
        default_key="openai",
        anthropic=ChatAnthropic(
            model="claude-2",
            max_tokens=16384,
            temperature=0.1,
            anthropic_api_key=os.environ.get("ANTHROPIC_API_KEY", "not_provided"),
        ),
        googlevertex=ChatVertexAI(
            model_name="chat-bison-32k",
            temperature=0.1,
            max_output_tokens=8192,
            stream=True,
        ),
    )
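To illustrate the idea behind .configurable_alternatives() without pulling in LangChain or API keys, here is a minimal plain-Python sketch of the pattern: register named alternatives under a config key, fall back to a default, and pick one at call time instead of hard-coding the model. The class name, the "llm" config key, and the lambda "models" are hypothetical stand-ins, not LangChain internals.

```python
class ConfigurableAlternatives:
    """Minimal sketch of configurable-alternatives-style dispatch."""

    def __init__(self, default_key, alternatives):
        self.default_key = default_key
        self.alternatives = alternatives  # name -> callable standing in for a model

    def invoke(self, prompt, config=None):
        # Look up which alternative to use; fall back to the default key.
        key = (config or {}).get("llm", self.default_key)
        return self.alternatives[key](prompt)


llm = ConfigurableAlternatives(
    default_key="openai",
    alternatives={
        "openai": lambda p: f"[openai] {p}",
        "anthropic": lambda p: f"[anthropic] {p}",
    },
)

print(llm.invoke("hi"))                               # default alternative
print(llm.invoke("hi", config={"llm": "anthropic"}))  # selected at call time
```

In real LangChain the selection happens the same way conceptually: the runnable built above is invoked with a config that maps the ConfigurableField id ("llm") to one of the registered keys ("openai", "anthropic", "googlevertex"), so callers can switch providers per request without touching the chain definition.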