ToolNode execution output streaming #104
Unanswered
Soumadip-Saha asked this question in Q&A
Replies: 1 comment 3 replies
-
You may want to implement your own version of ToolNode. Within agent-service-toolkit, I think the BG Task might actually be pretty well suited to your use case, or doing something similar with CustomData.adispatch(), which enables you to send an arbitrary message blob as a stream chunk in the middle of a node execution. That's about the extent of insight I can offer, although Peter might have other thoughts. Cheers and good luck!
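Independent of the toolkit's CustomData helper, the underlying pattern can be sketched with nothing but the standard library: a long-running tool pushes intermediate chunks onto a queue that the /stream endpoint drains while execution continues, and the complete result is returned once the tool finishes. The names run_remote_code and stream_tool_output below are made up for illustration; this is a minimal asyncio sketch of the idea, not the toolkit's implementation.

```python
import asyncio

SENTINEL = object()  # marks the end of the tool's intermediate output


async def run_remote_code(queue: asyncio.Queue) -> str:
    """Stand-in for a long-running tool: emits intermediate chunks
    while it works, then returns the full result at the end."""
    chunks = []
    for i in range(3):
        await asyncio.sleep(0)      # pretend to wait on the remote kernel
        chunk = f"progress {i}\n"
        await queue.put(chunk)      # dispatch an intermediate "blob"
        chunks.append(chunk)
    await queue.put(SENTINEL)
    return "".join(chunks)          # the final, complete tool output


async def stream_tool_output() -> list[str]:
    """What a /stream endpoint would do: forward chunks as they
    arrive while the tool keeps running in the background."""
    queue: asyncio.Queue = asyncio.Queue()
    task = asyncio.create_task(run_remote_code(queue))
    received = []
    while (chunk := await queue.get()) is not SENTINEL:
        received.append(chunk)      # would be sent to the client as an SSE chunk
    received.append(await task)     # the complete result, once finished
    return received


if __name__ == "__main__":
    print(asyncio.run(stream_tool_output()))
```

In a real node, the `queue.put` call would be replaced by whatever dispatch mechanism your framework exposes; the point is that intermediate chunks and the final return value travel on separate paths.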
-
Hello @JoshuaC215 and @peterkeppert,
First, I'd like to express my appreciation for your excellent project. As I delve into the internals of LangGraph for a personal project, I came across your repository and found it highly insightful.
In line with the ongoing discussion, I have a question regarding the design of a tool intended to execute Python code in a remote environment and stream its standard output via a /stream API endpoint. The goal is to transform the ToolNode into a generator that streams output progressively, ultimately returning the complete result upon code execution completion.
For remote code execution, I'm utilizing the Jupyter Kernel Gateway's WebSocket method. The rationale behind streaming the tool's output is to handle time-consuming processes, such as model training or computationally intensive tasks, without requiring users to wait for the entire execution to finish before receiving updates. This approach aims to provide real-time progress feedback.
Currently, I have a synchronous function that displays the output in a Streamlit UI.
While this synchronous method works, my objective is to transition to an asynchronous model to facilitate streaming. This transition is essential for handling long-running tasks efficiently.
During my exploration, I observed that the ToolNode triggers on_tool_start and on_tool_end events. To achieve the desired streaming behavior, I am considering introducing an on_tool_stream event, which would allow the tool to emit intermediate outputs during execution and enable real-time progress updates.
Additionally, I took note of a related point raised in a previous message within discussion thread #97.
Since I do not plan to use Streamlit as a frontend and intend to develop a custom solution, I am exploring alternative approaches that might be more suitable for my use case.
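As I understand it, LangChain's astream_events(..., version="v2") already surfaces custom events as "on_custom_event" entries alongside "on_tool_start" and "on_tool_end", so a bespoke on_tool_stream callback may not be strictly necessary. The sketch below shows how a consumer could filter such an event stream down to progress chunks; the fake_events list and the "tool_progress" name are synthetic stand-ins for illustration, not output from a real graph.

```python
from typing import Any, Iterable, Iterator


def progress_chunks(events: Iterable[dict[str, Any]],
                    event_name: str = "tool_progress") -> Iterator[str]:
    """Filter astream_events-style dicts down to intermediate tool
    output.  The "on_custom_event" type follows LangChain's v2 event
    schema; the "tool_progress" name is a made-up example."""
    for ev in events:
        if ev.get("event") == "on_custom_event" and ev.get("name") == event_name:
            yield ev.get("data", "")


# Synthetic events standing in for `graph.astream_events(..., version="v2")`:
fake_events = [
    {"event": "on_tool_start", "name": "run_python"},
    {"event": "on_custom_event", "name": "tool_progress", "data": "epoch 1/3\n"},
    {"event": "on_custom_event", "name": "tool_progress", "data": "epoch 2/3\n"},
    {"event": "on_tool_end", "name": "run_python"},
]
print(list(progress_chunks(fake_events)))  # the two progress chunks only
```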
I would appreciate any insights or recommendations you can provide on implementing this streaming functionality within the ToolNode framework.
Thank you for your time and assistance.