[0.5.10] disable parallel tool calls for OpenAI
yashbonde committed Oct 28, 2024
1 parent 725d405 commit 0e55110
Showing 5 changed files with 17 additions and 4 deletions.
3 changes: 1 addition & 2 deletions README.md
@@ -1,5 +1,4 @@
# Tune API

Python package for building GenAI apps tightly integrated to Tune Studio.
This is **not** the official SDK for Tune AI. This is an open source python package used to build GenAI apps at Tune AI.

> The largest utils for any GenAI package ever! See [tuneapi.utils](./tuneapi/utils/__init__.py)
10 changes: 10 additions & 0 deletions docs/changelog.rst
@@ -7,6 +7,16 @@ minor versions.

All relevant steps to be taken will be mentioned here.

0.5.10
------

- Remove redundant prints.

0.5.9
-----

- By default set the value ``parallel_tool_calls`` in OpenAI to ``False``.

0.5.8
-----

2 changes: 1 addition & 1 deletion docs/conf.py
@@ -13,7 +13,7 @@
project = "tuneapi"
copyright = "2024, Frello Technologies"
author = "Frello Technologies"
release = "0.5.8"
release = "0.5.10"

# -- General configuration ---------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#general-configuration
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -1,6 +1,6 @@
[tool.poetry]
name = "tuneapi"
version = "0.5.8"
version = "0.5.10"
description = "Tune AI APIs."
authors = ["Frello Technology Private Limited <[email protected]>"]
license = "MIT"
4 changes: 4 additions & 0 deletions tuneapi/apis/model_openai.py
@@ -99,6 +99,7 @@ def chat(
model: Optional[str] = None,
max_tokens: int = 1024,
temperature: float = 1,
parallel_tool_calls: bool = False,
token: Optional[str] = None,
extra_headers: Optional[Dict[str, str]] = None,
**kwargs,
@@ -109,6 +110,7 @@
model=model,
max_tokens=max_tokens,
temperature=temperature,
parallel_tool_calls=parallel_tool_calls,
token=token,
extra_headers=extra_headers,
raw=False,
@@ -126,6 +128,7 @@ def stream_chat(
model: Optional[str] = None,
max_tokens: int = 1024,
temperature: float = 1,
parallel_tool_calls: bool = False,
token: Optional[str] = None,
timeout=(5, 60),
extra_headers: Optional[Dict[str, str]] = None,
@@ -142,6 +145,7 @@
"model": model or self.model_id,
"stream": True,
"max_tokens": max_tokens,
"parallel_tool_calls": parallel_tool_calls,
}
if isinstance(chats, tt.Thread) and len(chats.tools):
data["tools"] = [
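
For context, a minimal usage sketch of the new default from the caller's side. This is not part of the commit: it assumes the wrapper defined in tuneapi/apis/model_openai.py is exposed as `Openai` under `tuneapi.apis`, that threads come from `tuneapi.types` (the `tt` alias used in the diff above), and that a `human` message helper exists; these names are illustrative and may differ in the installed version.

# Minimal sketch, not from this commit. Assumes an OpenAI API key is configured
# the way tuneapi expects (e.g. via environment).
import tuneapi.apis as ta
import tuneapi.types as tt

model = ta.Openai()  # hypothetical default constructor; model/token resolution may differ

thread = tt.Thread(
    tt.human("What is the weather in Paris and in Tokyo?"),  # `human` helper assumed
)

# With this change, `parallel_tool_calls` defaults to False, so the request
# sent to OpenAI asks for at most one tool call per turn.
out = model.chat(thread, max_tokens=1024, temperature=1)

# Callers that still want OpenAI's parallel tool calling must opt in explicitly:
out = model.chat(thread, parallel_tool_calls=True)

Since the default is now `False`, code that previously relied on OpenAI allowing parallel tool calls has to pass the flag explicitly after upgrading.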
