diff --git a/docs/docs/concepts/streaming.md b/docs/docs/concepts/streaming.md index 0b9144b0..e34f5585 100644 --- a/docs/docs/concepts/streaming.md +++ b/docs/docs/concepts/streaming.md @@ -9,18 +9,19 @@ There are several different modes you can specify when calling these methods (e. - [`"values"`](/langgraphjs/how-tos/stream-values): This streams the full value of the state after each step of the graph. - [`"updates"`](/langgraphjs/how-tos/stream-updates): This streams the updates to the state after each step of the graph. If multiple updates are made in the same step (e.g. multiple nodes are run) then those updates are streamed separately. +- [`"custom"`](/langgraphjs/how-tos/streaming-content.ipynb): This streams custom data from inside your graph nodes. +- [`"messages"`](/langgraphjs/how-tos/streaming-tokens.ipynb): This streams LLM tokens and metadata for the graph node where LLM is invoked. - `"debug"`: This streams as much information as possible throughout the execution of the graph. The below visualization shows the difference between the `values` and `updates` modes: ![values vs updates](./img/streaming/values_vs_updates.png) - ## Streaming LLM tokens and events (`.streamEvents`) In addition, you can use the [`streamEvents`](/langgraphjs/how-tos/streaming-events-from-within-tools) method to stream back events that happen _inside_ nodes. This is useful for streaming tokens of LLM calls. -This is a standard method on all [LangChain objects](https://js.langchain.com/docs/concepts/#runnable-interface). This means that as the graph is executed, certain events are emitted along the way and can be seen if you run the graph using `.streamEvents`. +This is a standard method on all [LangChain objects](https://js.langchain.com/docs/concepts/#runnable-interface). This means that as the graph is executed, certain events are emitted along the way and can be seen if you run the graph using `.streamEvents`. All events have (among other things) `event`, `name`, and `data` fields. What do these mean? @@ -30,9 +31,9 @@ All events have (among other things) `event`, `name`, and `data` fields. What do What types of things cause events to be emitted? -* each node (runnable) emits `on_chain_start` when it starts execution, `on_chain_stream` during the node execution and `on_chain_end` when the node finishes. Node events will have the node name in the event's `name` field -* the graph will emit `on_chain_start` in the beginning of the graph execution, `on_chain_stream` after each node execution and `on_chain_end` when the graph finishes. Graph events will have the `LangGraph` in the event's `name` field -* Any writes to state channels (i.e. anytime you update the value of one of your state keys) will emit `on_chain_start` and `on_chain_end` events +- each node (runnable) emits `on_chain_start` when it starts execution, `on_chain_stream` during the node execution and `on_chain_end` when the node finishes. Node events will have the node name in the event's `name` field +- the graph will emit `on_chain_start` in the beginning of the graph execution, `on_chain_stream` after each node execution and `on_chain_end` when the graph finishes. Graph events will have the `LangGraph` in the event's `name` field +- Any writes to state channels (i.e. anytime you update the value of one of your state keys) will emit `on_chain_start` and `on_chain_end` events Additionally, any events that are created inside your nodes (LLM events, tool events, manually emitted events, etc.) 
will also be visible in the output of `.streamEvents`. @@ -50,18 +51,19 @@ function callModel(state: typeof MessagesAnnotation.State) { } const workflow = new StateGraph(MessagesAnnotation) - .addNode("callModel", callModel) - .addEdge("start", "callModel") - .addEdge("callModel", "end"); + .addNode("callModel", callModel) + .addEdge("start", "callModel") + .addEdge("callModel", "end"); const app = workflow.compile(); const inputs = [{ role: "user", content: "hi!" }]; - for await (const event of app.streamEvents({ messages: inputs })) { - const kind = event.event; - console.log(`${kind}: ${event.name}`); - } +for await (const event of app.streamEvents({ messages: inputs })) { + const kind = event.event; + console.log(`${kind}: ${event.name}`); +} ``` + ```shell on_chain_start: LangGraph on_chain_start: __start__ @@ -90,7 +92,7 @@ on_chain_end: LangGraph We start with the overall graph start (`on_chain_start: LangGraph`). We then write to the `__start__` node (this is special node to handle input). We then start the `callModel` node (`on_chain_start: callModel`). We then start the chat model invocation (`on_chat_model_start: ChatOpenAI`), -stream back token by token (`on_chat_model_stream: ChatOpenAI`) and then finish the chat model (`on_chat_model_end: ChatOpenAI`). From there, +stream back token by token (`on_chat_model_stream: ChatOpenAI`) and then finish the chat model (`on_chat_model_end: ChatOpenAI`). From there, we write the results back to the channel (`ChannelWrite`) and then finish the `callModel` node and then the graph as a whole. This should hopefully give you a good sense of what events are emitted in a simple graph. But what data do these events contain? @@ -117,6 +119,7 @@ These events look like: 'data': {'chunk': AIMessageChunk({ content: 'Hello', id: 'run-3fdbf494-acce-402e-9b50-4eab46403859' })}, 'parent_ids': []} ``` + We can see that we have the event type and name (which we knew from before). We also have a bunch of stuff in metadata. Noticeably, `'langgraph_node': 'callModel',` is some really helpful information diff --git a/docs/docs/how-tos/index.md b/docs/docs/how-tos/index.md index 5bcfb081..aee88e88 100644 --- a/docs/docs/how-tos/index.md +++ b/docs/docs/how-tos/index.md @@ -60,6 +60,7 @@ These guides show how to use different streaming modes. 
- [How to configure multiple streaming modes](stream-multiple.ipynb) - [How to stream LLM tokens](stream-tokens.ipynb) - [How to stream LLM tokens without LangChain models](streaming-tokens-without-langchain.ipynb) +- [How to stream custom data](streaming-content.ipynb) - [How to stream events from within a tool](streaming-events-from-within-tools.ipynb) - [How to stream from the final node](streaming-from-final-node.ipynb) diff --git a/examples/how-tos/stream-tokens.ipynb b/examples/how-tos/stream-tokens.ipynb index 33031e2c..9c2bfaf1 100644 --- a/examples/how-tos/stream-tokens.ipynb +++ b/examples/how-tos/stream-tokens.ipynb @@ -192,7 +192,11 @@ "source": [ "import { ChatOpenAI } from \"@langchain/openai\";\n", "\n", - "const model = new ChatOpenAI({ model: \"gpt-4o\", temperature: 0 });" + "const model = new ChatOpenAI({\n", + " model: \"gpt-4o-mini\",\n", + " temperature: 0,\n", + " streaming: true\n", + "});" ] }, { @@ -207,7 +211,7 @@ }, { "cell_type": "code", - "execution_count": 6, + "execution_count": 7, "id": "b4ff23ee", "metadata": { "lines_to_next_cell": 2 @@ -229,7 +233,7 @@ }, { "cell_type": "code", - "execution_count": 7, + "execution_count": 8, "id": "0ba603bb", "metadata": {}, "outputs": [], @@ -270,13 +274,13 @@ }, { "cell_type": "code", - "execution_count": 8, + "execution_count": 9, "id": "a88cf20a", "metadata": {}, "outputs": [ { "data": { - "image/png": "/9j/4AAQSkZJRgABAQAAAQABAAD/4gHYSUNDX1BST0ZJTEUAAQEAAAHIAAAAAAQwAABtbnRyUkdCIFhZWiAH4AABAAEAAAAAAABhY3NwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAQAA9tYAAQAAAADTLQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAlkZXNjAAAA8AAAACRyWFlaAAABFAAAABRnWFlaAAABKAAAABRiWFlaAAABPAAAABR3dHB0AAABUAAAABRyVFJDAAABZAAAAChnVFJDAAABZAAAAChiVFJDAAABZAAAAChjcHJ0AAABjAAAADxtbHVjAAAAAAAAAAEAAAAMZW5VUwAAAAgAAAAcAHMAUgBHAEJYWVogAAAAAAAAb6IAADj1AAADkFhZWiAAAAAAAABimQAAt4UAABjaWFlaIAAAAAAAACSgAAAPhAAAts9YWVogAAAAAAAA9tYAAQAAAADTLXBhcmEAAAAAAAQAAAACZmYAAPKnAAANWQAAE9AAAApbAAAAAAAAAABtbHVjAAAAAAAAAAEAAAAMZW5VUwAAACAAAAAcAEcAbwBvAGcAbABlACAASQBuAGMALgAgADIAMAAxADb/2wBDAAMCAgMCAgMDAwMEAwMEBQgFBQQEBQoHBwYIDAoMDAsKCwsNDhIQDQ4RDgsLEBYQERMUFRUVDA8XGBYUGBIUFRT/2wBDAQMEBAUEBQkFBQkUDQsNFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBT/wAARCADaAMcDASIAAhEBAxEB/8QAHQABAAMBAAMBAQAAAAAAAAAAAAUGBwgCAwQJAf/EAE8QAAEDBAADAwYIBw0HBQAAAAECAwQABQYRBxIhEzFVCBYiQZTRFBUXMlFhk+EJN0JxdbO0IyQ0NkNSYnN2gaHB0hhUVpGSlbElM0Vyov/EABsBAQACAwEBAAAAAAAAAAAAAAACAwEEBQYH/8QANREAAgECAQgIBQUBAQAAAAAAAAECAxExBBITIUFRUpEFFBVhcaGxwSIyM2LRQnKB4fA0Y//aAAwDAQACEQMRAD8A/VOlKUApSlAK+SbdoNtKBMmx4pX1SH3Uo5vzbNfXWZ5/Cjzs/tSJMdqQkWyQQl1AUAe1a+mjlGEZTlgk2XUaelmoXxLx51WXxiB7Sj3086rL4xA9pR76zvzetfhsP7BHup5vWvw2H9gj3Vye1cn4Jc0dPs77vI0TzqsvjED2lHvp51WXxiB7Sj31nfm9a/DYf2CPdTzetfhsP7BHup2rk/BLmh2d93kaJ51WXxiB7Sj3086rL4xA9pR76zvzetfhsP7BHup5vWvw2H9gj3U7VyfglzQ7O+7yNE86rL4xA9pR76edVl8Yge0o99Z35vWvw2H9gj3U83rX4bD+wR7qdq5PwS5odnfd5GiedVl8Yge0o99eTWS2h91Dbd1hOOLISlCZCCVE9wA3Wc+b1r8Nh/YI91Rl/s1visW91mDGZdTdbfpbbKUqH78Z9YFX0OkKFetCiotZzSxW12IyyDNi5Z2BtdKUrfOQKUpQClKUApSlAKUpQClKUApSlAKznNfxg2v9FyP1rVaNWc5r+MG1/ouR+taqqt9Cp+1m5kn1onjSlK8IenILMs4snD6zi6X+cIENTqI6FBtbq3HVHSUIQgFS1HrpKQT0P0VQMr8pDHsemYQY7U242zJH5DZlsW+WtbCGW3CSGkslal9ogJKNBQHMrWgTUxxztlrueHxhdLbkE4MT2ZEaRjDCnp0B9IUUSEJTs+j1B9FXztFJBNZeZmcO2LhZmGT2O73WRYr5NMtuLbv/AFBcNxiQwxIcit9UrIU2VoSOm+4dQNulThKN5d+3u1GtUnJOy7vU1jJuOeEYbdmbder0q3yXG23SXYb/AGbSXOiC64G+Rrf9Mpr6cl4w4liWRjH7lcnU3tUduWmBGhSJLqmVqUhKwlptWxtCt6+boE6BG8H41NZRnxzu3ybTm0iPPs7Qxe22pl2PDV2kbbhmKSUjtEulQU08e5ICUqJrQ+HlonO8ZxfH7VOjRXcGtcdMmXFW1yu9u+txklQGnACg
qQeo6bFSdKEYKT3b/DuIqpNyzUTnDjjjbeIWX5Tj7cObElWe4uQ2lLhSQ282httSlqcU0lCFcy1AIKuYgBQ2FA1plY9wzfnYjxTz+xXCx3dKb3e1XaFdWoS1wFsqiMpIU+PRQoKZUnlVo7I1vdbDVFVRUvhwsi6m21rFRGTfwOB+lLf+2M1L1EZN/A4H6Ut/7YzW10d/20f3R9UKv05eDNfpSlewPIilKUApSlAKUpQClKUApSlAKUpQCs5zX8YNr/Rcj9a1WjVXMlwaDk8+NNfkzYsmO0plK4b/AGe0qIJB6HfVIrEoqpCUG7XTRfQqKlUU2ZzlfD3GM6VGOR4/bL6YvMGDcIqHuy5tc3LzA63yp3r6BUB/s/cMt78wMb/7Wz/prUvkqg+MXv237qfJVB8Yvftv3VxV0XNKyrep1nltB63EpWLcOMVwd997HcctdjdkJCHV2+IhkuJB2AopA2BVjqS+SqD4xe/bfup8lUHxi9+2/dUX0S5O7qrkySy+ktSTI2lZpxkizcJ4ncI7HbL3dEQMlu78O4B2RzKU2hnnTynXonfrrXfkqg+MXv237qx2P/6rkzPaFLcyvXyxW7JrVItl2gx7nbpAAdiy2g404AQRzJPQ9QD/AHVUEcAeGjZ2nAccSdEbFsZHQjRHzforUPkqg+MXv237qfJVB8Yvftv3VNdFSjqVZcmReXUXjEzi18E+H9juMa4W/CrDBnRlh1mTHtzSHG1juUlQTsEfTU9k38DgfpS3/tjNWn5KoPjF79t+6v6nhRbO3juO3G7SUsPtyEtPS+ZBW2sLTsa6jmSD/dWxk/R7pV6dadW+a08HsdyEstpOLjFWuXWlKV0ziClKUApSlAKUpQClKUApSlAKUpQClKUApSlAKUpQHO/lI/jx8nn+0Mv9mNdEVzv5SP48fJ5/tDL/AGY10RQClKUApSlAKUpQClKUApSlAKUpQClKUApSlAKUpQClKUApSlAKUpQHO/lI/jx8nn+0Mv8AZjXRFc7+Uj+PHyef7Qy/2Y10RQClKUApSlAKUpQClKUApSlAKUpQClKUApSlAKUpQClKUApSoTJMvgYyG0P9rJmvAlmDFTzvOgd5A2AlPcOZRCRsbPUVKMXJ2iZScnZE3Xy3S2Rb1bZdvnx25cGWyuO+w6NodbUkpUlQ9YIJB/PVDczzJZJKmLPboTfXlEmWt1z6thKAAfqCj+evX555d/u1k/6nqt0W+S5m11Ws/wBJ+OnlF8G5fArjBfsRfClxWHu2t76v5eKv0mlb9Z16Kv6SVD1V+r3kU8F3+B/AOz2qehbV6ujirxcWXO9p51CAG9eopbQ2kj+clX01UOLnBg8Zs/wrLL5EtIn4w/2qW2u05ZiAoLQ07sbKErHMB/SWPyumueeeXf7tZP8AqepolxLmOqVtxpVKzdGa5Ykgrh2Zwb+al11HT8/Kf/FS1n4jsvyGot5hKskh1QQ26p0OxnFE6CQ7oaJOgAtKdkgDZ6VjRN/K0/B+2JCWT1YK7iXKlKVSa4pSlAKUpQClKUApSlAKUpQClKUApSlAKUpQEVlF+RjNhl3FbfbKaSEtsg6LrqlBLaAfUVLUlP8AfWew4zqFOyZbpk3CQQuQ+T3n1JT9CE7ISn1D6ySZ7istXwbHGv5J27oDn0aSy8tP/wC0IqKq2fwU4pbdfsl5M7OQwWa57RSudOPPFHJ8ZvGSSMQv1zlOY3ARMnWqJZYz0GOeQualSHVJX6aBvlaPMkddHYr08TuL2QfHOTohZfFwNiy41HvUCPIjMPLuzrqXVFO3QSUJLaG9NgK5l9/cK1bG660VdHSNK5me4ncQslvbGO2hrIYr9msdtk3J61wLdIlOy5LJWQ8JTjSUpHLrTaNlXP1SAAZm0ZRxLyrLsRxq63M4RcZeOS7hc2o0OM+6HmZbbSFo5u0QgrSsKI2sAKI79KCxlVU8Ezf1uJb5eZQTzHlGzrZ+ivF9huSy4y82l1pxJQttYBSpJGiCD3g1yreLtknEqzcHJUzI37ZeGcunWp+XAisacdYRMaTICHELAVytH0fm/uqunROuqIbLkeIw06+uU6hCUrfcSlKnCBoqISAAT39AB16Cs4ayUJ599RO4BfHkSpNgmOrfcjtiREfeXzLcYKtFKiepLatDZ6lKkbJPMau1ZXblqa4gY2pHznBKaXrv7Mtcx/u5kI/wrVK2qmtRnvXu17XODlUFCq0hSlKpNQUpSgFKUoBSlKAUpSgFKUoBSlKAUpSgK7ntjfv2NvtRE80+OtEuMknl5nG1BQRv1BYBQT9CzVNhTG7hFbkNE8ix3KGlJI6FJHqIIII9RBFapWP53fbHbOKNpxa1TixmV7juTfiv4K4uM80gHbzziEkMElJSHDvZ0ClXokWq045jdrYfg38lyhUm4ywZRcu4BWPMLtf5b92vkCLkDSGrtbbfMDUaaUt9mlaxyFYPIEpPKpIUEgKB67zzirwnyQ5JZn8etuR3ddstLEKJdI92tiOR1vm0txqSwSgn0SpbOubp6I5RXQjhvcMlEvGLilQ36cUtPtq+sFK+b/mkH6q9fxhP/wCHL17J99Y6vV2LzR03KjJapGbp4KyciYsl9vWQ3SxZ2m1swbtdMZkIYTNKRtSVpW2pJAUVEKCUkb6aGgLZb+G9vt+V2jIRMuEi4WyzrsjZkvh0OMqW2srcURzKc20n0ubrs7BJ3Xvu2dQ7DcLZAucSXbp1zcLMCLLDbTstY1tLSVLBWRsdE7PUVK/GE/8A4cvXsn306vV3E1Oitq5lEk8BLC/iUWxNXG7QzDvD18h3KM+hEuNKcdccUUK5OXl/dnE8qkn0T12etaBaoKrZbIkNUp+cqOyhoyZSgp14pAHOsgAFR1skAdT3V4ImXJ0hKMbvKlE60phKP8VLAqTtuH3u/KHxmn4it5+eyy8Fy3B/NK07S2PUSkqV1Oik6VTQSXz2S8fbEi61GmrpnswW3qu2TSLwRuFBaXCjq3tLjqlDtlD/AOnIlG/pLg6aO9Eqk8OOJWLZrJv9jx3to7+MSvi2bAehORTHUNhHKFJAKFBJKSn1a7t1dqTkpNJYLD/eZwatR1ZubFKUqsqFKUoBSlKAUpSgFKUoBSlKAUpSgFfwkDvOvz1HT8hgQLgzbFTIxvEllx+LblPoQ/ISjXMUJJ2QNjZ7hsbrNIWEz+OuKY3cOJmPycVuFtupujFit95WpBCFEx/hJb5QpSfRXoHopAOwCpFAfdcciu3FRzPcMtEfI8GdtyG4jOXLiIShx5Q5l/BkrO1gJ5RzgD550UkJJvWM4+nGrDbLaZsu6uwYrcX4wuKw5KfCQBzOLAHMo62TrqetStKAUpUbklkTkuO3W0LlyoCLhFdiGXCWEPshaCnnbUQQFp3sEggEDoaA/ILyy/KMmcT/ACiF3ewXBTVrxR8RLJIjr/LaXzKkJPdtTg2FfzUo+iv1L8n/AIvQ+OXCWwZfE5W3pjPJMjp/kJKPRdR9OuYEjfekpPrriHi3+D/4e4FxI4V4/b7zkz0PKro/CmuSZUdTjaEM84LRSwADvv5goa9
VdreT/wCT/j3k4YbMxrGplznQJU9dxW5dXW3HQ4pttsgFttA5dNJ9W9k9e7QGmUpSgKvxFwCJxIw+6Y/Jn3CzonpRzT7PIMaU0pCgpCkuD1gpHfsEdKhY96ynFM0xXEW8bnZBi7lt7OTmD89tTrMltJ/99s+krnCUnnH5S+6tCpQEXjmUWfMLYLjY7pDu8ArU18JhPJdb50nSk7SSNg9CKlKzDKOFdxx3DrhE4Pu2TAb3LuKbk8tdtS5GlL6BaFpTrk5wlIKkgkAHQBOxLw+LFrVxTc4dyY9xayBu2puSJSoDiIcpvYDhac6j0CUbBOgVgAkg6AvFKUoBSlKAUpSgFKUoBSlKAVnWY57OvreX4rw6uFqd4i2VqMXI14Q6iPFD/pIcUQn0/wBz5lDl2NgA67q0Ws0uU5GN8d7NGhYIqQvJbe/8YZfGbJ+D/BgC2w8Qg6Srm9EqWOvQA+oCdtPDazoyO35fdrXbZudtW1u3v3xmNyKIAJX2YJVyJKlL9ZOiEkkCrdSlAKUpQClKw7jX5Q0jGsgZ4fcPLajLuKE9vmbgJV+9rW2dfviYsfMSNghOwVbHdzJ2BX/KPuURzyhvJ6tiJLS7im9S5KoiVguhr4OR2hT3hOwRvu6H6DXSNY7wL8nmPwxkzcoyO5Ly/iVeBzXTJJY2ob/kI6f5JlOgAABvQ3oBKU7FQClKUApSlAK9E2G3cIj8Z3nDbzamlFpam1hKho8qkkFJ+sEEeqvfSgMgj4vkPAPBccsOAWeZncBq59lKbvV51KjRHFHRaWtPKUtcydJ6aQg95JUNStV7t19aedts+LcGmXVMOLivJdShxPRSFFJOlD1g9RX21lPk6ysJl41kqsFhzYUBORz0TkTiSpc4LHbqTtSvQJ1ru/MKA1alKUApSlAKUpQClKUAr8+/KJ/CQXPF81h47jGK3exSrHdWlXtu9LjIclIbWsPQwlAeSlCwGyH0Ob79JI0T+gD8hqK2XHnUNIHepxQSP+Zri7y8vJnsfGSyO5ticy3jOLaz++IzUhG7pHSPmaB6upA9E96h6J36OpKMpYIFz8hzym808pa35fOyq1We3RbU7FZhOWlh1sOrWHS6F9o6vfKEta1r5x7/AFdRVyl+Dyx6Dw48nC3qucli23O9TZFzfjS3EtuoBIab2lWiAUNJWPqXv110z51WXxiB7Sj31LRz4WZsyUpXzQ7nDuG/gstiTrqexcC//BrmfJuIuVeVFkU/C+F8yRj2BQnVRb/nqElLj6h0XFt++9XqLvq3sdOXng01qZgluJfHTIc/zGXww4MdjLyBj0L5lrqe0gWBB2CAe52R0OkDYBHXelcui8FOBePcD7A/EtfbXG8T1/CLrfp6u0mXF87JcdWeutk6TvQ2e8kkzfDLhfjfCDEYmN4rbW7bbI/UhPVx5Z+c44vvWs66k/UBoAAWusAUpSgFKUoBSvkm3aDbSgTJseKV9Uh91KOb82zXzedVl8Yge0o99TUJNXSM2ZKUqL86rL4xA9pR76edVl8Yge0o99Z0c+FizMY8rPyn5vkv2XH7s3hispt9zkOxXnxcfgiYriUpU2k/uTnMVjtCO7XZnv3XO/Cf8Jffs3yy24rC4UQpd2vFx7GMIV3VHQhK1DRcBYXspGypewNAnQ1XV/HnD8Y438KMhw+Zd7ahc6OTEkLko/e8lPpNOdDvQUBvXekqHrrjn8G5wOZxbJ8izzLixbbhbXXLPbI0x1CFJc7pDwBPqGmwobB5nB6qaOfCxZn6NUqL86rL4xA9pR76edVl8Yge0o99NHPhYsyUpUX51WXxiB7Sj30TlFmUoAXeCSegAko6/wCNNHPhYsyUpSlVmBVQy7Ln4ksWm0hBuBSFvyXBzNxEHu6flOK/JT3AAqV05Urtch9EWO684dNtpK1H6gNmshxpbku1N3F/Rl3I/DX1DfVSwCB19SU8qR9SRVsbRi6j2YeJu5LRVWfxYI/i8agy3u3uLZvEsjRk3HTyz130BHKkfUkAfVXu837WP/jYf2CPdVO4wcXYnCOJj78qHImC63Vi3nsGHnS0hSvTc02hZUoDuR0Kj3b0RX0ZFxsw3FI1sdul0djKuUb4ZHjCBJXJ7HptxbKWy42kb6laU6OwdEGq3WqSxkzuJwjq1KxafN+1+Gw/sE+6nm/a/DYf2CfdVdv/ABgw/G7PaLnMvbS4l4Tz24wmnJTktPLzFTbbSVLUACCSBobG9VDSuLzFyyjhszjkiFdLBlLs5DkwBRUAxHW4OTqOVXOjlUFAkaI0DUdJPiZlyii7PYrZ3lBZtsZDqSFJdabDbiSO4hSdEf3GpbGr+7hfZw5jhkWNbhAkKSO1irWvZU4ofPbKlElZ9JJJUoqBKkU2wcXMTyjJ5WP2q6mbc4y3W3EojPBrmbOnEpeKOzUUnoQlRIq3PMokMradQlxtaSlSFDYUD0INWRrSwm7r/YbiqpShWjY1ClVPhjcHJmKNx33C6/b3nYKlkklSW1ENkk9SS3yEk+vff31bKTjmScdx5yUXFuLFKUqBEUpSgMzz+FHnZ/akSY7UhItkghLqAoA9q19NfH5vWvw2H9gj3VJZr+MG1/ouR+tarxrn5fUnGcUm1qXqzxfS0pLKWk9iI/zetfhsP7BHup5vWvw2H9gj3VIVGZLk1rw+ySrvepzVutsYAuyHjoDZAAHrJJIAA2SSAASa5ulqP9T5nHU5t2TZ5+b1r8Nh/YI91PN61+Gw/sEe6qjD474LNsV2vCb8lmFaezM/4VFeYdjJcUEoUtpxCXAlRPRXLroevQ1IYrxYxXNJc+Larr2kmCymS+1JjuxlBlW+V1IdSnnbOj6adp+us59ZbX5ljVZJtp6vEnvN61+Gw/sEe6nm9a/DYf2CPdWVRvKOsmUcTsLxvFJ0e6wruuaJj64j6PQZYUtCmHFBKFpK06Kk8419HfWy0lOrHGT5mJqrTtn3VyP83rX4bD+wR7qhc0sluYxa4uNQIrbiW9pWhlIIOx3HVWqoLOf4pXP+q/zFbOSVajyims5/MtveWZPOWmhr2r1NlpSldg+inzXKILhbpUUnQfaU3v6Ngj/OslxVxS8btoWlSHW2EsuIUNFK0DlWD+ZSSK2Os6yqwu45cZN1iMKetUtZdmNtDa4zpABdCfW2rXpa6pV6WiFKKLorPg6axxX4/wBusdDI6qpzaltMm8oK23GTjmOXK322Xd/iTI7fdZMSA2XZC2GnP3QtoHVagFb5R1OjVWVkcvFeK9yzl7E8mulnyGxxY0X4HaXHZcR1h17mYdY1ztBfaJUCoBOwdkVusaSzMYQ/HdQ+y4OZDjagpKh9II6GvZWq9WpnYcLvOTOWuHWJZDwZl4JkV9xy53GKmx3C3yIVmjGa9anX53wttPZo2op5D2RUgHRQN6FeeM4pkmP3vC8vmY1c24UjMLxc3bZHY7STb485lbbKnW0np6Wlr1vl5zvuNdRUrFyCopWs8P6/BgOAfGti4xfFuK2fJrbh0uRPfvUG+wC3CivbKkPwnj1IdcJJbSpSdKJ0kjVb9SvlhxnswkLt9scIjB
XJMuKN8jKd6UhtQ6F0jYAHzPnK/JSuyEHUfdte4k3GjFuT1Fk4URyMdlzNEJn3CRIRsaJQFdmk/mIbBH1EVdK9EKGxbobESM0liMw2lpppA0lCEjQA+oACvfVtSWfNyR5ucs+TlvFKUqsgKUpQGc5r+MG1/ouR+tarxryzX8YNr/Rcj9a1Vcyvh7jGdKinI8ftt9MXmDBuEVD3Zc2ubl5gdb5U719ArmdIW0kb7l7niulbda17kWGsj8pfErrlWGWR61xJ1y+Jr7EusuBa5CmJcmO3zhxLK0qSQ4OcLTpQJKBo71U5/s+8Mt/xAxv/ALWz/pqdxXhviuDPvvY7jlrsbshIQ6u3xEMlxIOwFFIGwK5yai7o5kJRpyU4t3Xd/Zz3mWE23JeFmd3PHMZzpd9ehxbeheTKnPyZLQkodLbLT61r0ggknlA6nW+tWjjXgN/zTiDkEazxZCPjDh7cLa1N5FJYMhUlooZU5rlClDm6E70VHu3W/wBKlpWixZTJNNbL46934OdLJe5uYcRODoj4RkePR7C1OanfGFqcYjxCYRbSgOa5VJ5hpKh6J6ddnVdF181xt0W8W+TBnR2pkKS2pl+O+gLQ6hQ0pKknoQQSCDVJHk/8MwQRgGOAjuItjP8ApqLkpY6iE5wqWvqt/O1vf3l/qCzn+KVz/qv8xUBG4DcN4chp9jBMdZfaUFtuItjIUlQOwQeXoQan85/ilc/6r/MVsZJbrNO3EvUzQUdNDNe1evibLSlK7h9GFKUoCr3PhvYbnJckiM7BkuHa3bfIcjlZ3slQQQFHfrIJr4PkogeL3r237qu9KvVeov1FiqzjqUmUj5KIHi969t+6nyUQPF717b91XelZ09Tf6EtNU4mU5nhVYwoGUu4XJIIPZy5zimzr6UAhJ/MQRVriRGIEZuPGZbjx2khKGmkBKEAdwAHQCvdSq5VJz1SZXKUpfM7ilKVWRFKUoBSlKArmS4NByefGmvyZsWTHaUylcN/s9pUQSD0O+qRUZ8lUHxi9+2/dV2pVmklZL2RXKnCTvKKf8FJ+SqD4xe/bfup8lUHxi9+2/dV2pTSPu5IjoaXAuSKT8lUHxi9+2/dT5KoPjF79t+6rtSmkfdyQ0NLgXJFJ+SqD4xe/bfup8lUHxi9+2/dV2pTSPu5IaGlwLkik/JVB8Yvftv3V65HCG1y2lNSLneH2VfObXM2lQ+g9KvVKyqsk7r0RlUaSd1FckKUpVRaf/9k=" + "image/png": "/9j/4AAQSkZJRgABAQAAAQABAAD/4gHYSUNDX1BST0ZJTEUAAQEAAAHIAAAAAAQwAABtbnRyUkdCIFhZWiAH4AABAAEAAAAAAABhY3NwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAQAA9tYAAQAAAADTLQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAlkZXNjAAAA8AAAACRyWFlaAAABFAAAABRnWFlaAAABKAAAABRiWFlaAAABPAAAABR3dHB0AAABUAAAABRyVFJDAAABZAAAAChnVFJDAAABZAAAAChiVFJDAAABZAAAAChjcHJ0AAABjAAAADxtbHVjAAAAAAAAAAEAAAAMZW5VUwAAAAgAAAAcAHMAUgBHAEJYWVogAAAAAAAAb6IAADj1AAADkFhZWiAAAAAAAABimQAAt4UAABjaWFlaIAAAAAAAACSgAAAPhAAAts9YWVogAAAAAAAA9tYAAQAAAADTLXBhcmEAAAAAAAQAAAACZmYAAPKnAAANWQAAE9AAAApbAAAAAAAAAABtbHVjAAAAAAAAAAEAAAAMZW5VUwAAACAAAAAcAEcAbwBvAGcAbABlACAASQBuAGMALgAgADIAMAAxADb/2wBDAAMCAgMCAgMDAwMEAwMEBQgFBQQEBQoHBwYIDAoMDAsKCwsNDhIQDQ4RDgsLEBYQERMUFRUVDA8XGBYUGBIUFRT/2wBDAQMEBAUEBQkFBQkUDQsNFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBT/wAARCAD5ANYDASIAAhEBAxEB/8QAHQABAAICAwEBAAAAAAAAAAAAAAYHAwUCBAgBCf/EAFIQAAEEAQIDAgUOCQkGBwAAAAEAAgMEBQYRBxIhEzEWFyJBlAgUFTJRVVZhcXSy0dLTIzY3QlSBkZOVGDVDUnWCkrO0JCUncpahMzRTZLHB8P/EABsBAQEAAwEBAQAAAAAAAAAAAAABAgMFBAYH/8QAMxEBAAECAQkFCAIDAAAAAAAAAAECEQMEEiExQVFSkdEUM2FxoQUTFSNiscHhgZIi8PH/2gAMAwEAAhEDEQA/AP1TREQEREBERAWG1cr0o+exPHXZ/WleGj9pWju37uevz47FTGlVrnkt5NrQ5zX/APpQhwLS4d7nuBa3cNAc4u5Ptbh/p+F5llxcF+ydua1fb65mcR5y9+5/Z0W+KKae8n+IW293fCrC++9D0ln1p4VYX34oeks+tPBXC+89D0Zn1J4K4X3noejM+pX5Pj6LoPCrC+/FD0ln1p4VYX34oeks+tPBXC+89D0Zn1J4K4X3noejM+pPk+PoaDwqwvvxQ9JZ9aeFWF9+KHpLPrTwVwvvPQ9GZ9SeCuF956HozPqT5Pj6Gg8KsL78UPSWfWu5UyFW+0uq2YbLR3mGQOA/Yun4K4X3noejM+pdS1oHTluQSuw1OGdp3bYrRCGZp+KRmzh+op8mds+n6TQ36KMR2bmkZ4Yb9qbJYeVwjZen5e1quJ2a2UgAOYegD9twdubfcuEnWuujN8YJgREWtBERAREQEREBERAREQEREBajV2Yfp/S+VyMQDpq1Z8kTXdxft5IP69lt1HuIVOW9onMxwtMkza7pWMaNy5zPLAA90luy24MROJTFWq8LGtsNP4ePAYapQjPN2LPLk88khO73n43OLnE+6StisNO1FeqQWYHc8MzGyMd7rSNwf2FZlhVMzVM1a0FEuIHFbS3C6LHv1JkzSfkJHRVIIa01madzW8z+SKFj3kNHUnbYbjchS1Up6pWhUfBp3Jx4/WDdSY59mTEZzR2ON2ahK6NocyaIBwdHL0Ba5paeXqW9CsR2cp6pjT+N4q6b0m2tetUc3hfZeHJ1cdbnB55IWwtDY4XeS5sjnOkJAZs0O5S4KQWuP2gqOuW6Qs571vnX2m0WxS052wmw4bthE5j7LtDuNm8+53A2VUx5fWendd8Ltfax0nlrtuxpGzicxDp6g+4+neklrTDnij3LWu7J43G4aehPnUA4t4/Wep5tTDMYbX+W1Bj9VwW8fUxsEwwsOJguRSRyRtjIjsSGJpJGz5ec9GgDoHpi3x20TT1je0ocpYsahozR17VCnjbVh8DpI2yMLzHE4NYWvb5ZPLuSN9wQNXwF4943jngrNyrRu465XsWY5K89KyyMRssSRRubNJExj3OawOcxpJYSWuAIXW4S6fu4zjFxpyVrG2KkGSy2PdV
tzQOY21GzHQNJY4jZ7Wv529NwDzDv3Wr9THYyGl8PlNCZjT2axuSxeUylr19YovbQswy3pJY3Q2NuR5c2Zp5Qdxyu3A2QXgiIg6+QoV8rQs0rcTZ6tmN0MsT+57HDZwPyglajQ1+e/puEWpe3t1JZqM0p33kfDK6IvO/9bk5v1rfqM8PG9pp+S4N+S/dtXI+YbbxyTvdGdvjZyn9a9FPc1X3x+V2JMiIvOgiIgIiICIiAiIgIiICIiAiIgilOdmg3mjb2iwDnl1O315Km53MMp7mN3J5H9G7bMOxDe0x6r4RaG1/kY8lqPSWEz95sQhZayFGKeQRgkhoc4E8u7nHb4ypa9jZGOY9oexw2LXDcEe4VGn8PsdCScbZyGFB/osdbfHEPc2iO8bf1NH/YL0TVRiaa5tPO/wDv8stEo8fU28KC0N8W+luUEkD2Jg2B8/5vxBSbR/DvS3D2GzFpjT2M0/FZc107MbUZAJSNwC4NA323Pf7qw+BNj4VZ799D90ngTY+FWe/fQ/dJ7vD4/SUtG9KEUX8CbHwqz376H7pRO9jstX4q4PTzNU5j2OuYW/flJlh7TtYZ6bGbfg/a8tiTfp38vUed7vD4/SS0b1qLS6s0XgNd4xuO1HhaGdx7ZBM2rka7Z4w8AgO5XAjcBxG/xldHwJsfCrPfvofuk8CbHwqz376H7pPd4fH6SWje0DfU3cKWBwbw40u0PGzgMTB1G4Ox8n3QP2LZ6Z4K6A0Zl4srgNF4HDZOIObHco4+KGVocNnAOa0EbgkFdzwJsfCrPfvoful98AKdh3+8MhlcqzffsbV14iPysZytcPicCEzMONdfKP8AhaHHK5Dwu7fDYqXnqP5ochkYXeRCzqHRRuHfKe7p7QbuJB5WuksEEdaCOGFjYoo2hjGMGwa0DYADzBfKtWGlXjr14Y68EbQ1kUTQ1rQO4ADoAsqwrriYzadUEiIi1IIiICIiAiIgIiICIiAiIgIiICIiAiIgKvssW+P7SwJPN4MZfYebb11jd/P8nm/WPPYKr/K7+P7S3Vu3gxl+hA3/APNY3u8+3ydO7fzILAREQEREBERAREQEREBERAREQEREBERAREQEREBERAREQFXuWA/lA6VPM0HwXzHk7dT/ALXjOu+3d+vzj9VhKvctt/KC0r1PN4L5jYcv/u8Z5/8A9/2QWEiIgIiICIiAiIgIiICIiAiIgIiICIiAiIgIiICIonf1ZkbVyxBg6NazFXkMMtu7O6JhkG4c1gaxxdykbE9ADuBuQdtuHh1Yk2pW10sRQj2d1h+gYP0ub7tPZ3WH6Bg/S5vu1v7LXvjnBZN14D1j6vbK6e9URXxNrhXO7UOJjuadGPizAd28s9is5r2O9b78p9bjbYeUHg+YL2L7O6w/QMH6XN92qgz3qf5tQ+qDw/Fqxj8MMzjqvYmoLEhinmaOWKdx7PfnY07D/lZ/V6uy1745wWelkUI9ndYfoGD9Lm+7T2d1h+gYP0ub7tOy1745wWTdFCPZ3WH6Bg/S5vu1li1flsW5kmdoU4qBcGvtUbD5OwJOwc9jmDyN9t3AnbfcjYFwk5LibLT/ADBZMkRF5EEREBERAREQEREBERAREQEREBERAVeaGO+BeT3m/eJ+M+upVYarzQv8wP8An13/AFUq9+T93V5x+V2JAiItiCIiAiLo2M5j6uXqYua7BHkrcckteo6QCWVjOXnc1veQ3mbufNzD3UHeUd4jnbh7qg9Nxi7RG43/AKJykSjnEj8neqf7Ktf5LluwO9o84+7KnXCxGe1HyLkuLPaN+RclxmIiIgIiICIiAiIgIiICIiAiIgIiICrzQv8AMD/n13/VSqw1Xmhf5gf8+u/6qVe/J+7q84/K7EgXkPiHrLUMOqb+t9KXNSMw2K1ZVw1qfI6gIpTO9dx1rEEOOEZa6Pdzm9o5zXhwLhuAvXirXOepw4dajyOSvZDTgnnyMxtWGtuWGRmc7bzsjbIGRzdP/FYGv7/K6lWqJnUig+Kua1DndU8QMQNRarp68hzFSrpvT2IsWIaU+NeIfwjhFs0hwNkvlc4FnJ0LdgDtci/iXxc1xxGOCuWKR0/lX4jHMg1XLi2U+SGNzJpKrKsrbAe55fvI4gjyQG8u5kHE/wBTzq3VeuM7k9PS4fADJyxyx52tm8tWu1ntjYwymrFIK80gDBsTyggNDgdtzZ2p+AGhda5o5jOYQXctJCyC1aiszV/XjWDZonZE9rZQPceHbDp3LDNmbiqW4jU+tNea+xeoNW5vGXcLpfEWew0/k5a1aO/JDZ7WVnLsS3ni6NOzXD2zSQNtHgqLuK3ELgJn83lMvDksroqzasyY7KT0w+ZgqOJAie0DmMji4Do4BoO4a3b0xDofCQZzN5iOly5HNVoad+btX/hoog8Rt5ebZuwlf1aATzdSdhtHstwH0Nm9OadwdvCE4/T0XY4oQ3LEU1WPkDC1szJBIQWgAguPNsN99llmyJ8o5xI/J3qn+yrX+S5SMDYAKOcSPyd6p/sq1/kuXqwO9o84+7KnXCxGe0b8i5Liz2jfkXJcZiIiICIiAiIgIiICIiAiIgIiICIiAq80L/MD/n13/VSqw1XczMhpjMzY7HYufO0rEs9thqPa19RznCSSKQyFrBu6YFg5g4tcQG7Rlx92TzGbVRe0zadOjVfqsarJCi0nstnvgZlfSqX36ey2e+BmV9Kpffr05n1R/aOq2btFpPZbPfAzK+lUvv1F7vGOtj+IWP0PYwd+LVWQqPu1scZ6vNJCzfmdzdtyjucdidyGkgbApmfVH9o6llhotJ7LZ74GZX0ql9+nstnvgZlfSqX36Zn1R/aOpZu1HOJH5O9U/wBlWv8AJcux7LZ74GZX0ql9+sWQx+e1Tj56EmElxVWaMtsOtWYjJIzY7xs7NzgHO9rzEgNDidiRsc8O2HXFdVUWib646kRabrBZ7RvyLktZhs/Xy7WRFrqWSFeKxYxdl7PXNVsnNyiRrHOA6se3mBLSWO5XHZbNcViIiICIiAiIgIiICIiAiIgIiICL45wY0ucQ1oG5J7gtDG+xqew2SOSaliIJz7URublIzF0IduS2Lmee7lc50QIPZn8IHGfIWdSiatiZZadMxwyszkXZSRSgyeXHCNyS7kad3lvKO0YW85Dg3bY3FU8PDJDRqxVIpJpLD2xMDQ6SR5fI87d7nOcST5ySs1atDSrRV68TIIImCOOKJoa1jQNg0AdAAOmyyoCIiAvzx4g+pl43Z71XVTWVbUWlaufnM2ZxcbrtoxQVKksEQgeRX84sRggAg7v3Pu/ocq/yHLNx8wHKGl1fTOR5zueZoktUeXp3bHsnf4flQWAiIgIiINbmcFBmIXDtZqVrZoZepuDJ4w17XgB2x8kuY3dpBa4dHAgkLpw5y5jrorZuGGIWrskNCxSEkkb4gznZ2/k7Qv6Pb1cWuLAQ4OkEY3y+OaHtLXAOaRsQe4oPqKMCrNoam31jBLa05SqNiZjasTprUJEnVzCXbvYI3H8GAXARAMDiQ1SSOVkrS5j2vaCW7tO43B2I/UQR+pBzRE
QEREBERAREQEREBEWK1P61rTTcj5ezYX8kY3c7Yb7AecoNBZEOsr1zHu5J8JUdJTyVK5j+eO690bHBjXv8l0bQ883K1wL9m8wMcjDJFodBx8mi8I7tcpMZKkcxfmz/ALbu9ocRMB0DxzbFo6AjYdAFvkBERAREQFX3DgnVeodQa435qOREWOxDt9w+jAXkTjrttLLLM4Ee2jbCfc256ltS8QsrY0pjJnR4iu8Mz+Qhc5ruXYO9ZROHdI8Edo4Hdkbths+RrmTqvXiqQRwQRshhiaGMjjaGtY0DYAAdwA8yDIiIgIiICIiAo9fqeC5tZShEGUS+W7kadeo+eaw7kA54g078/kAloa4v67DmO5kKIMdexHbrxTwvEkUrQ9jx3OaRuCsi0OBgmxeZy2O7C++kXNvQ3bdgTRudM+TtII9zzNDCwO5T0AlaGnYcrd8gIiICIiAiIgIi0uY1tp7T9oVsnnMdj7JHN2Nm0xj9vd5Sd9lnTRVXNqYvK2u3SKLeNLR3wpxHpsf1qM8S7/DbivoTM6Sz+o8VNispB2MoZfja9pBDmPad/bNe1rhv03aNwR0W3s+NwTylc2dzY6F4gaXhlqaMOpN9TUnS0his7kInZicQlw7Z8fNzvD42CVr9vKjc157yp8vzi9RTwXo8FfVE6vv6jzeLkx+Hpmticp65YIrhmcPwkZ323EbXBw72l+x+P3p40tHfCnEemx/WnZ8bgnlJmzuSlFFvGlo74U4j02P608aWjvhTiPTY/rTs+NwTykzZ3JSobns7kNQZeTTmm5ewkiLRlczy8zcewjfsotxyvsub3NO4ia4SPB3jjm1GS4jVdZ51ml9LZypA+WPnt5eKeNzoWEe0rNduJZj7uxZGOrtzysdOsHg6Gm8XDjsbWbVpw8xbG0kkuc4ue9zjuXOc5znOc4lznOJJJJK1VUVUTauLJaz5gcDQ0xiK2MxlcVqVcEMZzFxJJLnOc5xLnvc4lznuJc5ziSSSStgiLBBERAREQEREBERBHrVH/iDjbjcZPJ/uu1E/JNsbRQ/ha5bC6L85z/KcHfmiJw/OUhVMZT1QHCqHibh3y690xzwYvIQvu+EtVsNdxmp7wyR9p1kfyktcerRDIPzlc6AiIgIiICIiDpZq47H4e9aYAXwQSStB91rSR/8ACiOkqkdbAUpAOaezEyeeZ3V80jmgue4nqSSf1d3cFJ9VfixmPmc30Co9pr8XMV80i+gF0MDRhT5rsbJERZoIiICIiDq5LG1stTkrWoxJE/49i0jqHNI6tcDsQ4dQQCOq7+g8pPmtF4O9af2tmenE+WTbbndyjd23m3PXb41iWHhZ+TnTnzGL6KxxdODPhMfaei7EpREXOQREQERRvXWs4NFYgWHRizcnf2VWrzcvav7ySfM1o3JPuDYbkgHZh4dWLXFFEXmRucnlqOEqOt5G5XoVW+2ntStjYPlc4gKMS8YdHQvLTnIXEdN445Hj9oaQqPydq1ncj7IZWw6/e68skg8mIb+1jb3Mb0HQdTsCST1WNfW4XsPDin5tc38P3cvC8fHNo336b6PL9hPHNo336b6PL9hUci3fA8m4qucdC8KC4kep00nqn1Y2O1JXuRnh7kpPZjKuEUgbHYYd3wcu3N+FfynoNgHu9xe7vHNo336b6PL9hUcifA8m4qucdC8Lx8c2jffpvo8v2F9Zxk0a923s3G343wyNH7S1UaifA8m4qucdC8PS2H1BjNQ13T4vIVchE08rnVpWyBp9w7HofiK2C8sQGSlejvUp5KN+P2lquQ17fiPQhw6DyXAg7dQVevDfXw1jSmr22sgy9MNE8bPaytPdKweZpIII72kEdRsTxcu9l1ZLT7yib0+sLr1JkiIuEjV6q/FjMfM5voFR7TX4uYr5pF9AKQ6q/FjMfM5voFR7TX4uYr5pF9ALo4Pcz5/hdjvWHSMgkdCxsswaSxjncoc7boCdjt18+xXnbhbx61RjOCuY1nrzFRWK9S9bgqzY+6JrN2f2Qkrx1hD2MbWbO5I2u5jzAcxDeq9Grz3DwC1dLoHUugp8jhYsA6/Nl8DloTK65DZN4XImzxFoZyteXNJa8kjboFJvsRIG+qEn0tazNTiHpg6QtUMLLn4vWuQbkI7NaJwbK1rwxm0rXOYOTbY842cQsFfjfnZ7FXEan0dNo6bUGLt2sJZjybbTnvih7V0UoaxphlDDzgAuHku8rcLW5ngRqji5kM3e4i3MNRdPp2xp+hU086WaOHt3NdJZe+VrCXbxx7MA2AB3J713cdwo11q/VWmsjr+/gmVNNU7UNRmBMz33LE8Brunl7RrRGBGX7MbzdXnyugU/yGj0lxxzGmuGHBbGRYt2q9UarwjJmz5XLCoyR8UETpOad7Xl8rzINm7Eu2cSRsvQmPmns0K01msadmSJr5a5eH9k8gEs5h0Ox3G46HZefrHBbXzuCGB4e2KOhdRV8fUkx0kmV9ctHZsa1lWxHyscWTNAcXAefbleFdmg9P29KaJwGFv5KTMXsdQgqT5CbfnsvZGGukO5J3cQT1JPXqSrTfaN6sPCz8nOnPmMX0VmWHhZ+TnTnzGL6KuL3M+cfaV2JSiIucgiIgKguLOSdkuIliBziYsbVjgjae5rpPwjyPlHZA/8gV+qguLONdjOIc87mkRZOrHPG89znx/g3gfIOyP98Lvexc3tWnXaben4uuyUWRdfI34sXRntziUwwsL3iGF8r9h7jGAucfiAJUVHFvT5/os5/wBO5D7hfb1YlFGiqYhrTJzg1pJIAHUk+ZUnS9VBh7uQqPZBjzhLdtlSKdmagde8p/I2R1MeWGFxB9sXBp3LQp2zijp++9tXsc0e3PZ7P0/fY079OrjAAB17ydlHuH2hNXaDix+n2v0/e0zQkc2K9M2UX3V9yWsLAOTmG4HPzdw9ruvJiV111U+5q0bbWndb8qxT8br9eHKZKTSxbp7F5mTD3L/sg3tGltgQiVkXJ5Td3NJBc0jcgcwG56/EzihmJsPrmjpfCTXIMLRniu5pt8VjVnMBftCNiXvja5rjsW7HoDus+R4TZe3w61hgGWaQuZjOzZOu9z39m2J9tkwDzybh3K0jYAjfz+dYNQ8NNYV/DnH6cs4WTCaqE00gybpmTVbEsAikLeRpD2u5Wnrtsfd8+iqcozbTfTHhfb+hY+i55bWjsFNNI+aaShA98kji5znGNpJJPeSfOtwoLj9b4rRuMoYO+3KSXcfWhrTOp4W9PEXNjaCWyMhLXD4wVn8bunj/AEWd/wCnch9wvbTi4cRETVF/NEzW20VknYfXuAsscWiac0pQPz2StIA/xiN391RvC5qtn8dHdqCw2B5IAtVpa8nQ7HdkjWuHd5x1Uk0TjXZnXuArMbzNgnN2Uj8xkbSQf8ZjH95TKJonArmrVafsyp1vSCIi/MFavVX4sZj5nN9AqPaa/FzFfNIvoBSnM03ZHEXqjCA+eCSIE+YuaR/9qIaSuR2MDThB5LNaFkFiB3R8MjWgOY4HqCD+0bEdCF0MDThTHiuxuERFmgiIgIiICw8LPyc6c+Yxf
RWPJ5StiKj7NqURxt6Ad7nuPQNa0dXOJIAaNySQB1K2GhMXPhNGYSjaZ2dmCnEyWPffkfyjdu/n2PTf4lji6MGfGY+09V2N6iIucgiIgKOa50ZBrXDis+QVrcL+1q2uXmMT+7qOm7SNwRv3HoQQCJGi2YeJVhVxXRNpgeXcrUtafyHrDLVzj7nXla87slH9aN/c8d3d1G43DT0WNenMli6WZqPq36kF6s/20NmJsjD8rSCFGJeEGjpXFxwNdpPXaNz2D9gIC+twvbmHNPzaJv4fstCikV5eJvRvvHF+9k+0nib0b7xxfvZPtLd8cybhq5R1LQo1FeXib0b7xxfvZPtJ4m9G+8cX72T7SfHMm4auUdS0KNRXl4m9G+8cX72T7S+s4O6NY7f2Cgd8T3vcP2F2yfHMm4auUdS0b1F1hLkLzKNGCS/ff7WrXAc8/GeuzR1HlOIA36lXtw40ENG0Zp7T2T5e3ymeRntI2j2sTD3loJJ3PVxJOwGzWyLEYLG4CuYMZQrY+EncsrRNjDj7p2HU/GV31xMu9qVZXT7uiLU+srq1CIi4aC0uY0Vp/UNgWMpg8bkZwOUS2qkcjwPc3cCdlukWVNdVE3pm0mpFvFXoz4J4T+HxfZTxV6M+CeE/h8X2VKUW7tGNxzzlbzvRbxV6M+CeE/h8X2U8VejPgnhP4fF9lSlE7Rjcc85LzvRbxV6M+CeE/h8X2U8VejPgnhP4fF9lSlE7Rjcc85LzvaPFaG05grLbOOwGMoWG78s1apHG9u/fsQNxut4iLVVXVXN6pumsREWAIiICIiAiIgIiICIiAiIgIiICIiD/2Q==" }, "metadata": {}, "output_type": "display_data" @@ -294,20 +298,95 @@ }, { "cell_type": "markdown", - "id": "055aacad", + "id": "b00e15e2", "metadata": {}, "source": [ - "## How to stream tool calls\n", + "## Streaming LLM Tokens\n", + "\n", + "You can access the LLM tokens as they are produced by each node with two methods:\n", "\n", - "You can now run your agent. Let's first look at an example of streaming back intermediate tool calls. This is not supported by all providers, but some support token-level streaming of tool invocations.\n", + "- The `stream` method along with `streamMode: \"messages\"`\n", + "- The `streamEvents` method\n", "\n", - "To get the partially populated tool calls, you can access the message chunks' `tool_call_chunks` property:" + "### The `stream` method\n", + "\n", + "For this method, you must be using an LLM that supports streaming as well and enable it when constructing the LLM (e.g. `new ChatOpenAI({ model: \"gpt-4o-mini\", streaming: true })`) or call `.stream` on the internal LLM call." 
] }, { "cell_type": "code", - "execution_count": 9, - "id": "c704d23c", + "execution_count": 15, + "id": "5af113ef", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "ai MESSAGE TOOL CALL CHUNK: \n", + "ai MESSAGE TOOL CALL CHUNK: {\"\n", + "ai MESSAGE TOOL CALL CHUNK: query\n", + "ai MESSAGE TOOL CALL CHUNK: \":\"\n", + "ai MESSAGE TOOL CALL CHUNK: current\n", + "ai MESSAGE TOOL CALL CHUNK: weather\n", + "ai MESSAGE TOOL CALL CHUNK: in\n", + "ai MESSAGE TOOL CALL CHUNK: Nepal\n", + "ai MESSAGE TOOL CALL CHUNK: \"}\n", + "ai MESSAGE CONTENT: \n", + "tool MESSAGE CONTENT: Cold, with a low of 3℃\n", + "ai MESSAGE CONTENT: \n", + "ai MESSAGE CONTENT: The\n", + "ai MESSAGE CONTENT: current\n", + "ai MESSAGE CONTENT: weather\n", + "ai MESSAGE CONTENT: in\n", + "ai MESSAGE CONTENT: Nepal\n", + "ai MESSAGE CONTENT: is\n", + "ai MESSAGE CONTENT: cold\n", + "ai MESSAGE CONTENT: ,\n", + "ai MESSAGE CONTENT: with\n", + "ai MESSAGE CONTENT: a\n", + "ai MESSAGE CONTENT: low\n", + "ai MESSAGE CONTENT: temperature\n", + "ai MESSAGE CONTENT: of\n", + "ai MESSAGE CONTENT: \n", + "ai MESSAGE CONTENT: 3\n", + "ai MESSAGE CONTENT: ℃\n", + "ai MESSAGE CONTENT: .\n", + "ai MESSAGE CONTENT: \n" + ] + } + ], + "source": [ + "import { isAIMessageChunk } from \"@langchain/core/messages\";\n", + "\n", + "const stream = await agent.stream(\n", + " { messages: [{ role: \"user\", content: \"What's the current weather in Nepal?\" }] },\n", + " { streamMode: \"messages\" },\n", + ");\n", + "\n", + "for await (const [message, _metadata] of stream) {\n", + " if (isAIMessageChunk(message) && message.tool_call_chunks?.length) {\n", + " console.log(`${message.getType()} MESSAGE TOOL CALL CHUNK: ${message.tool_call_chunks[0].args}`);\n", + " } else {\n", + " console.log(`${message.getType()} MESSAGE CONTENT: ${message.content}`);\n", + " }\n", + "}" + ] + }, + { + "cell_type": "markdown", + "id": "f8332924", + "metadata": {}, + "source": [ + "### The `streamEvents` method\n", + "\n", + "You can also use the `streamEvents` method like this:" + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "id": "ec7c31a2", "metadata": {}, "outputs": [ { @@ -318,7 +397,7 @@ " {\n", " name: 'search',\n", " args: '',\n", - " id: 'call_ziGo5u8fYyqQ78SdLZTEC9Vg',\n", + " id: 'call_fNhlT6qSYWdJGPSYaVqLtTKO',\n", " index: 0,\n", " type: 'tool_call_chunk'\n", " }\n", @@ -371,6 +450,15 @@ "[\n", " {\n", " name: undefined,\n", + " args: ' today',\n", + " id: undefined,\n", + " index: 0,\n", + " type: 'tool_call_chunk'\n", + " }\n", + "]\n", + "[\n", + " {\n", + " name: undefined,\n", " args: '\"}',\n", " id: undefined,\n", " index: 0,\n", @@ -381,229 +469,29 @@ } ], "source": [ - "import type { AIMessageChunk } from \"@langchain/core/messages\";\n", - "\n", "const eventStream = await agent.streamEvents(\n", - " { messages: [{role: \"user\", content: \"What's the weather like today?\" }] },\n", + " { messages: [{ role: \"user\", content: \"What's the weather like today?\" }] },\n", " {\n", " version: \"v2\",\n", - " },\n", + " }\n", ");\n", "\n", "for await (const { event, data } of eventStream) {\n", - " if (event === \"on_chat_model_stream\") {\n", - " const msg = data.chunk as AIMessageChunk;\n", - " if (msg.tool_call_chunks !== undefined && msg.tool_call_chunks.length > 0) {\n", - " console.log(msg.tool_call_chunks);\n", - " }\n", - " }\n", - "}" - ] - }, - { - "cell_type": "markdown", - "id": "1d9b168b", - "metadata": {}, - "source": [ - "Because this is a ReAct-style agent, this will 
only log intermediate steps and not the final response because the model generates a final response with no tool calls when it no longer needs to gather more information from calling tools.\n", - "\n", - "## Streaming final responses\n", - "\n", - "### ReAct agents\n", - "\n", - "For ReAct-style agents, you know that as soon as you start message chunks with no `tool_call_chunks`, the model is responding directly to the user. So we can flip the conditional like this to only log tokens from the final response:" - ] - }, - { - "cell_type": "code", - "execution_count": 10, - "id": "86f843bb", - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "\n", - "\n", - "The\n", - " weather\n", - " today\n", - " is\n", - " cold\n", - ",\n", - " with\n", - " a\n", - " low\n", - " of\n", - " \n", - "3\n", - "℃\n", - ".\n", - "\n" - ] - } - ], - "source": [ - "const eventStreamFinalRes = await agent.streamEvents(\n", - " { messages: [{ role: \"user\", content: \"What's the weather like today?\" }] },\n", - " { version: \"v2\" });\n", - "\n", - "for await (const { event, data } of eventStreamFinalRes) {\n", - " if (event === \"on_chat_model_stream\") {\n", - " const msg = data.chunk as AIMessageChunk;\n", - " if (!msg.tool_call_chunks?.length) {\n", - " console.log(msg.content);\n", + " if (event === \"on_chat_model_stream\" && isAIMessageChunk(data.chunk)) {\n", + " if (data.chunk.tool_call_chunks !== undefined && data.chunk.tool_call_chunks.length > 0) {\n", + " console.log(data.chunk.tool_call_chunks);\n", " }\n", " }\n", "}" ] }, - { - "cell_type": "markdown", - "id": "f13b4790", - "metadata": {}, - "source": [ - "### Other graphs\n", - "\n", - "If your graph has multiple model calls in multiple nodes and there's one that will always be called last, you can distinguish that model by assigning it a run name or a tag. 
To illustrate this, declare a new graph like this:" - ] - }, { "cell_type": "code", - "execution_count": 11, - "id": "0fea2f20", + "execution_count": null, + "id": "5d6f7346", "metadata": {}, "outputs": [], - "source": [ - "const OtherGraphAnnotation = Annotation.Root({\n", - " messages: Annotation({\n", - " reducer: (x, y) => x.concat(y),\n", - " }),\n", - "});\n", - "\n", - "const respond = async (state: typeof OtherGraphAnnotation.State) => {\n", - " const { messages } = state;\n", - " const model = new ChatOpenAI({ model: \"gpt-4o\", temperature: 0 });\n", - " const responseMessage = await model.invoke(messages);\n", - " return {\n", - " messages: [responseMessage],\n", - " }\n", - "};\n", - "\n", - "const summarize = async (state: typeof OtherGraphAnnotation.State) => {\n", - " const { messages } = state;\n", - " // Assign the final model call a run name\n", - " const model = new ChatOpenAI({\n", - " model: \"gpt-4o\",\n", - " temperature: 0\n", - " }).withConfig({ runName: \"Summarizer\" });\n", - " const userMessage = { role: \"human\", content: \"Now, summarize the above messages\" };\n", - " const responseMessage = await model.invoke([\n", - " ...messages,\n", - " userMessage,\n", - " ]);\n", - " return { \n", - " messages: [userMessage, responseMessage]\n", - " };\n", - "}\n", - "\n", - "const otherWorkflow = new StateGraph(OtherGraphAnnotation)\n", - " .addNode(\"respond\", respond)\n", - " .addNode(\"summarize\", summarize)\n", - " .addEdge(\"__start__\", \"respond\")\n", - " .addEdge(\"respond\", \"summarize\")\n", - " .addEdge(\"summarize\", \"__end__\");\n", - "\n", - "const otherGraph = otherWorkflow.compile();" - ] - }, - { - "cell_type": "code", - "execution_count": 12, - "id": "2149f527", - "metadata": {}, - "outputs": [ - { - "data": { - "image/png": 
"/9j/4AAQSkZJRgABAQAAAQABAAD/4gHYSUNDX1BST0ZJTEUAAQEAAAHIAAAAAAQwAABtbnRyUkdCIFhZWiAH4AABAAEAAAAAAABhY3NwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAQAA9tYAAQAAAADTLQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAlkZXNjAAAA8AAAACRyWFlaAAABFAAAABRnWFlaAAABKAAAABRiWFlaAAABPAAAABR3dHB0AAABUAAAABRyVFJDAAABZAAAAChnVFJDAAABZAAAAChiVFJDAAABZAAAAChjcHJ0AAABjAAAADxtbHVjAAAAAAAAAAEAAAAMZW5VUwAAAAgAAAAcAHMAUgBHAEJYWVogAAAAAAAAb6IAADj1AAADkFhZWiAAAAAAAABimQAAt4UAABjaWFlaIAAAAAAAACSgAAAPhAAAts9YWVogAAAAAAAA9tYAAQAAAADTLXBhcmEAAAAAAAQAAAACZmYAAPKnAAANWQAAE9AAAApbAAAAAAAAAABtbHVjAAAAAAAAAAEAAAAMZW5VUwAAACAAAAAcAEcAbwBvAGcAbABlACAASQBuAGMALgAgADIAMAAxADb/2wBDAAMCAgMCAgMDAwMEAwMEBQgFBQQEBQoHBwYIDAoMDAsKCwsNDhIQDQ4RDgsLEBYQERMUFRUVDA8XGBYUGBIUFRT/2wBDAQMEBAUEBQkFBQkUDQsNFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBT/wAARCAEuAHUDASIAAhEBAxEB/8QAHQABAAIDAAMBAAAAAAAAAAAAAAYHBAUIAQMJAv/EAFIQAAEDAwEDBQoJCQUFCQAAAAECAwQABQYRBxIhExYxVZQIFBUXIkFRk9HhMjdUVmFxdpKzIzU2QlJ0kaGyJHWBtNQzRXKWsUNGYoOVosHS8P/EABsBAQACAwEBAAAAAAAAAAAAAAACAwEEBQYH/8QANhEAAgECAgcECQUAAwAAAAAAAAECAxEEURITFSExUpEFFEFhMnGBobHB0eHwIjNCYnI0Y8L/2gAMAwEAAhEDEQA/APqnSlKA1jmTWdpxSF3WChaSUqSqSgEHzgjWvHOqy9cQO0o9tVNjVlt8m3OuvQIzripkveWtlJJ/tDnSSK2vN619Ww/UI9laFftChQqzpOLei2uK8HY68cBpRUtLiWJzqsvXEDtKPbTnVZeuIHaUe2q75vWvq2H6hHspzetfVsP1CPZVG1cPyS6ols7+3uLE51WXriB2lHtpzqsvXEDtKPbVd83rX1bD9Qj2U5vWvq2H6hHsptXD8kuqGzv7e4sTnVZeuIHaUe2nOqy9cQO0o9tV3zetfVsP1CPZTm9a+rYfqEeym1cPyS6obO/t7ixOdVl64gdpR7ac6rL1xA7Sj21XfN619Ww/UI9lOb1r6th+oR7KbVw/JLqhs7+3uLNg3WFc9/vOZHl7mm9yDqV7uvRroeHQay6rXZ1Cjwc6yBEZhqOg22CopaQEgnlZXHQVZVdZSjJKUeDSfVXOZVp6qbhkKUpQpFKUoCnsT/NK/wB8l/5hytzWmxP80r/fJf8AmHK3NeO7Q/5lb/Uviz11L0I+pCoddNruJWfMWsVlXUi+uKZQYzUZ50NqdOjSXHEIKGyrzBahrUxrn/aGLrYdr6Lhg9nydnJJ0uAxdCIBcsl0ijdC1uvHUNLabKgFApVqkJ0UDWtSgptpipJxSaJhgO3q0Ztk2XWZUWZAcsU56Ml1yFJDTrTTTaluKcU0EIVvLUAgq3iEhQ1Cga3WFbZ8O2hXCRAsV476msMd8qjvRXo61M66cogOoSVo1IG8nUcRx4iq5tMzJsKyDa5Z7bj9zcvt4mybxYbj3kpdudUYDaW0re+AhQcZKSlRGpI8x1qLbPLXdZG1nCb0u251KJs0+HdbnkzL4QmWtDTm4ltXBlGrSxqhKWySgAqNbDpQd2t27PyKVUmml5/Mn+Ud1NiELZxesqx12RkiIMLvtoNQJTbDpJCQkvFkpSQVDeB4p47wFWdimUwcysrN0twlCM4SkCXDeiuajgfybyErA+nTQ+aqMs+z69zu4eRijFpfi39zHFtC2vtFl7luKigpVoQonXp04mrowPLU5lj7c9Nqu1mKSGlRbzCXEeCglJPkLAJTx03hwJB0qupCCi9DwbXEnTlJtaXikSOlKVqmwezBP09v/wDdkH8WVViVXeCfp7f/AO7IP4sqrEr3dH9qn/mPwR5fF/vyFKUq01BSlKAp7E/zSv8AfJf+YcrQT9huzu6TpE2Zg+PypclxTzz71taUtxajqpSiU6kkkkk+mrJGya2NrdLNyu8dDji3eTal6JSVKKlaDTgNSa8+KqD1xe+2+6uXiOznVr1KsKttJt8H4u53I42loqMlwKtHc/bMh/3Axv8A9LZ/+tTS02iDYbbHt9tiMwIEZAbZjRmwhttI6AlI4AVvvFVB64vfbfdTxVQeuL3233VQ+ypS41k/YySx1FcImtpWy8VUHri99t91VFsOizc8zTavbbte7ouLjmRKtsANSNwpZDaVaKOnlHUnjUNj/wDaujJbQpZMsuozlGzLEc3mtS8hxm03uU03yTb0+G28tCNSd0FQJA1JOn01OPFVB64vfbfdTxVQeuL3233VldkuLuqq6Mw8fSe5plXnYDs0KAg4DjhQCSE+DGdATpqfg/QP4VIMVwPG8GbkN47YrdY25BSp5NvjIZDhGuhVuga6an+NTDxVQeuL3233U8VUHri99t91SfZc2rOt8SKxtFb1EwcE/T2//wB2QfxZVWJUexnCIWLTJkuPImSpEpttpxyY9yh3UFZSBw4cXFfxqQ12lFQjGCd7JLojk15qpUc14ilKVkoFKUoBSlKAUpSgFc79y38Ze3/7Yq/BRXRFc79y38Ze3/7Yq/BRQHRFKUoBSlKAUpSgFKUoBSlKAUpSgFKUoBXO/ct/GXt/+2KvwUV0RXO/ct/GXt/+2KvwUUB0RSlKAUpSgFKUoBSlKAUpSgFKVCrrtLb5ZxixW9d6cbVuqkKc5CKlXnHKEEr+tCVAcQSCNKnGEp8Pz2k4QlN2irk1pVbKzTLVcRDsrf8A4S68v+eg/wCleOeeXfJrJ956rNUuZdTY7pWyK97u3YV47Nh09cCOHcjx7eudv0TqtwJT+WZHn8tA1AHSpCK+YHcw7Fn9vW2axYsEOeDCvvq6Oo4FqI2QXDr5irggH9paa+wPPPLvk1k+89VR7FtjB2F5PmV8x+Jae+ckl8uptwuBENrUqDDQA4I3lKP1BI/VBpqlzLqO6VsjqalVrzzy75NZPvPU555d8msn3nqapcy6julbIsqlV01nmSxiFSLPbprfDeEWWttz6dApBB+oqH11Ksby+BkyXEMcrGmMgF6DKTuPNa9BI1IIOh0UklJ0OhOhqLpSSut68ncqnRqU98kbulKVUUilKUApSlAQTPru5OuLePR3FN
tFkSJ7ja91XJlRDbQI4jfKV6n0I0/WrVNtoZbS22kIQkBKUpGgAHQAK9Dqy7muXKX8NE1lpP0IERhQH8VqP+JrSbTLvLx/Zvld0t7ve8+DaZcmO7uhW44hlakq0IIOhAOhBFWV9zUFwSXvSZ6HDQVOknnvJJSudL1nmeYhs1w+e5fJOQ5Jmj0GKy1Gt8RKLeVsLedUwhRbDiylIADrm7vDUADyT4Xkm2qHi2TtMWy7yFsiI7BuVzg28XHcLukpKGI7qmXVJbAWjUJ1JIIPDXWsXa1ZM6MpXPCc+vt/Ts2tmPbQpEwXq7z4NwujlojsymwzFdd5FxhbejbqFIAPkjjoSCOBxrltXzXHZ95wRV2YuGQIyO2WWHkj0NCS2xMYU9yjjKdEKcbS24kaBKVEoJHTqsNasvzidIUrmrPdpec7M2syxxeSIvN1gxbRdLbepMBlK0tyLgmM6y82gJQr4J0KQk6LPEEAjZZXtUybYtdM2iXe7HMGYGMIv8JyTFajrbeL6mC0eSSkFsq3FakbwGvE9NLDXRXFfm/6F8SLvBiXCJAfmx2Z0wLVGiuOpS6+EAFZQknVW6FDXTo1GvTX6mx3lKbkw3e9rjHJVHfHmPnSr0oVoApPnHoIBFBN4/ldk28bK15VlYyWTIg3dXJogNRm4znJMb6WygAqQdQBv6nyddeOg6GqUZODUkST1iakidYxfW8msUS5NtlnlkkOMk6lpxJKXGyfOUrSpJ+qtpUG2UrV3pkLX/ZNXZwN+jRTTS1f+9a/51Oa2KsVGbS4fXeeaqR0JuOQpSlVFYpSlAVxmMBVmy0XDQiFdW0NLXr5LchGoTr/AMaCAPpbA6SK02TWCPlWN3WyS1utxblEdhvLZIC0ocQUKKSQQDoo6ag/VVr3G3RbvBehzGESYzyd1bTg1BH/AO46+Y1X1xxO/wBhUrvJsZDBHwEcolqWgeglZCHP+IqQejUE6mrnHXWadms91/kdbDYmKjq6hCMi2TWPKMFtmLTVSxFtYjqgzWHuSlxnWAA0824kDdcGnSBpxPDQ6Vix9k62can2pea5bIkS3m3zdnLggS2igghLZS2EJSdOKdzRWp111qXmbckcF41eUq/ZEdKv5pUR/OvHhCf83L12T31Hu9XL4G/p0Xvuik8i7np6LeMGj2G4XpLDN7n3a85Ama0J4deiLbDxKk7pKlBtG6lBG70p01NS+P3PuMt4pcbM/Jus2VPnouj99ky9biZjZTyT6XQAEqRupCQlISACNNCdZ74Qn/Ny9dk99aWwbQoeUTrxDtNuuk6VZ5Pec9pqLqY72mu4rj06EGnd6uRFOivFEWPc8WGTZb7DuF2vd2n3t2I5NvM6S2uYtMZ1LrLaSGwhKApPwUoGu8rz6ESHItlFhyvIrldrqh6WbjZFWGTDWsBhcYuFwnQDeC9VHiFcOGg141IvCE/5uXrsnvp4Qn/Ny9dk99O71ciWnRzRAcV2DW/GMksV6cyfJr5IsjL8aC1d5rbzbTbqUpUnQNpJ0CE6EnXhxJ4VY06a3b4jkh3XdQPgpGqlEnQJSPOokgAeckCvW0b3MUERMZuJUdPLlFphsfSSpe9/BJP0VKcawd6PKauN7eZlzm+LMZgEx4x/aSVDVa/NvkDQdCU6nUqOjvqPd69/29pTPEUqUf0u7M/A7E/YMbYZlgCe+tcqUAreCXXFFRQD5wnUIB9CRUhpSsTk5ycn4nBbcndilKVAwKUpQClKUApSlAK537lv4y9v/wBsVfgoroiud+5b+Mvb/wDbFX4KKA6IpSlAKUpQClKUApSlAKUpQClKUApSlAK537lv4y9v/wBsVfgoroiud+5b+Mvb/wDbFX4KKA6IpSlAKUpQClKUApSlAKUpQClet+Q1FbLjzqGkDpU4oJH8TWv502Uf73gdpR7akoylwQNpStXzqsvXEDtKPbTnVZeuIHaUe2paufKzNmVX3U3dA3PubsJg5TGxDnZbVyhFmEXExFRSofk1H8i5vJJBSTw0JSOO9w4e2K939MxTPMyXbtmy77cM4v6Z0eC3eeTU04sJbSyD3urfJOnHRPT0V9Edo9txLafgl8xS8XO3uW67RVxnf7Q2SgkeStOp+ElQSoegpFcA9wh3OAxzbvkd9zJyLHZwyQuHCU66lLUuYdQHWyT5aEt+UDp0uNkdFNXPlYsz6bUrV86rL1xA7Sj2051WXriB2lHtpq58rFmbSlavnVZeuIHaUe2vZHyC1ynAhi5Q3lnoS2+hRP8AgDTVzXgxZmwpSlVmBSlKAVEMuy5+JLFptIQbgUhb8lwbzcRB6OH6ziv1U9AAKlcN1K5XIfRFjuvOHRttJWo/QBqaqHGluS7U3cX9DLuR79fUNeKlgEDj5kp3Uj6EirY2jF1H4cPWbuFoqrP9XBHheNQZb3L3Fs3iWRoZNx0eWeOvAEbqR9CQB9Fe7m/ax/u2H6hHsrVZttFx7Z3FiP3+4GGJjvIRmmmHJDzywkqIQ22lS1aAEnQcB01+7NtBsGQXWHbYE/vibLtqLww3yLid+IpW4lzUpAGquG6TvfRVbrVJcZM7q0I/pRsub9r6th+oT7Kc37X1bD9Qn2VEJW3fB4kK1yl3lam7oqW3BQ1CkOOyVxnQ0+lDaWytSkrOmgGpAJGoBI9Nl7oXZ/kMy3RrfkCX13B8RY7hiPoaL510YU4psIQ6dP8AZrIV0cOIqOsqczGnDhdE15v2vq2H6hPspzftfVsP1CfZUJu3dDbPrFPlxJ+QpjLhyzBlOqiP8hGfB3dx10N7jZJ6CpQB6RqK2Nn2yYhfYF7mRbsUtWVoP3BMmI9HdYaKSoOFtxCVlBCVEKAIOh0JprKnMxpQ4XRJeb9r6th+oT7Kc37X1bD9Qn2VFsf214Zk9qvdwt94K49kYMq4IeiPsPR2glSt8tOISspISoghJ10OmtRzJu6Gx04iu6Y7d2lKXJiR2J1wtM9UFRec0A5RtrjqlLiQoHdSvdCiNdC1lTmZhzgle5ZnN+19Ww/UJ9lfh3GbO+jdctUFxP7K4yCP+lQa/d0ZgeMXZ+23O43KNLZld5KHgKetBe3twIStLBSvVXAbpIPm1rY37bjg+MX52z3O/NxprC0NyFcg6tiMpWm6l55KC20TqDotQOhB89NZUX8n1GlDNEvtom4opLlleX3sj4Vqec3o7g9CCdS0r0bp3fSk+aybFe42Q2xqbF3wheoU24NFtLHBSFDzKB4H+Wo0NQHpr34TL8G5tKgp0Szc4hmbvH/bMqQ2pXo1Uhxsf+WKvjJ1k1L0lvvnnf2bzn4uhHR1kVvLHpSlUnGMa5RBcLdKik6B9pTevo1BH/zVS4q4peN20LSpDrbCWXEKGhStA3Vg/UpJFXHVdZVYXccuMm6xGFPWqWsuzG2hquM6QAXQnztq08rTilXlaEKUUXRWnB01x4r6fmVjoYOqqc2peJUW3qBBkQbHMdhZR4WhSXHbbd8ThGVJt7pbKSVIAO82sEpKSkpPQdOkQrGrrmFizTE8zzPGrpImXDEfBk1NmgKkKZmJkh1KXG29eT30HXU+SlQIJ
GldCxpLMxhD8d1D7Lg3kONqCkqHpBHA17K1Xdbmdhwu9JM5i2O4vf4112QyZ+P3O296rypyYiVFUkxC/LCmg4dNE744pOuihxGor2O4hfPFPcIostw79O0gz0MCIvlO9/DAc5YJ013OT8rf6N3jrpXTNKwRVFJWv+WS+RzNkeIXyRsg2vQm7LcHJc7NDLix0xFlyQz3zDVyjadNVo0Qo7w1GiT6DW82qwcyt+0nNrziFvlqnqwqGxEltRt9KnkzpBcS2VDcW8lpRUEE66lPDQ1ftKGdUs/zf9Tk1jHJ8m/55LtlmziVAueAyrexNyZmQ5IlS0lai2Euaqb1Do3UbqAo7+4D57C2iYzcpHcuWazQrVKduDLFiQbexHUp5HJyopWOTA1G6EqJ4cAk69FXhSlzCpJJq/E55y3PXLntrC75ieYSMZxU62xEHHJclmdPUCFyipKN0paSSlvp1KlLB+DUYg4Axa7zmON5hjmf3bwzfJcll6wzZvgydEkubwLobdS02pIUUrSsDgnhva11bSlw6V3ds9caOiJGaYaBDbSAhIJJIAGg4npr34tHVM2hsOJCuTgW14uHTgFPONhHH6mXf4VgyrilmS1DYQZdyeGrEJojlHOOmunmSPOo8B5zU6w7GTjsB5UhSHblMWH5breu7v7oSEI147qQAB0a8ToCo1tU06adR+Ksvbufstf2mtjKqjDQXFm/pSlVHCFKUoCL3PZvYbnJckiM7BkuHVbtvkORys66kqCCAo6+cgmsDxUQOt71233VN6Veq9RfyLFVnHcpMhHiogdb3rtvup4qIHW967b7qm9Kzr6mfwJa6pzMhHiogdb3rtvuqoth0OZnmabV7bdr3dFxccyJVtgBqRuFLIbSrRR08o6k8a6UrnfuW/jL2/8A2xV+CimvqZ/Aa6pzMtHxUQOt71233U8VEDre9dt91TelNfUz+A11TmZCPFRA63vXbfdX6Tsoteo5W43l5I/VNwWjX/FG6f51NaVjX1MxrqnMzWWLGbXjTC2rZCbihZBcUnUrcPmKlnVSjx6STWzpSqpScneTuypu+9ilKVEwKUpQClKUApSlAK537lv4y9v/ANsVfgoroiud+5b+Mvb/APbFX4KKA6IpSlAKUpQClKUApSlAKUpQClKUApSlAK537lv4y9v/ANsVfgor193bsK8dmw6euBHDuR49vXO36J1W4Ep/LMjz+WgagDpUhFfMDuYdiz+3rbNYsWCHPBhX31dHUcC1EbILh18xVwQD+0tNAfculKUApSlAKUpQClKUApSlARHOcquVgm2iHbGIjr04u7ypZUEpCEg8N3061ped+X/J7J956sjaP+lGK/VL/oRWPWvisTLD6EYJb1fevN/Q832jjq+GrKFN7rZDnfl/yeyfeepzvy/5PZPvPUpWltCrkuhy9rYrNdEOd+X/ACeyfeeqpNi2xVWwvJ8yvmPxLT3zkkvl1NucoEQ2tSoMNADgjeUo/UEj9UGrbpTaFXJdBtbFZrohzvy/5PZPvPU535f8nsn3nqxLhd4NpVFTOmx4apb6Y0cSHUtl50glLaNT5SiEqISOPA+isum0KuS6DauLzXRDnfl/yeyfeerHuOeZbbrfKlqi2VaWGlOlIU9qQkE6fyrIrWZR+jV2/dHv6DVtLHVJVIxaVm14E4dq4qUkm10RaVtlGdboslSQlTzSHCkdA1AOn86ya1+P/mG2/uzX9ArYV0pK0mke1FKUqIFKUoCv9o/6UYr9Uv8AoRWPWRtH/SjFfql/0IqOZTNyCDFZVj1ogXeQpejjVwuK4aUJ06QpLLup104aD665vaG+VP8Az/6keM7XV8Sl5L5m7qA7dNoEzZlszud9tsZMu5JcYixG1gFPKvPIaSogqSCAV66FSQdNNRrrX55wbTfmTjf/ADQ9/oa/Muy3zaXabnjedYlZ4uOzo5Q6qDfHZTilbySnRJjNbuhG8FBWoKRoPRzUrNN8DlQhoyUp2aXHevqVcvMtrOM2LL5c9m9uW2NjVwnN3S/QbYw7DmtNFbPJpivOJcQfK1StPApTxUCa2sDaBlmE37DZWQZAcjtmSWKbcpEMQGWBDdjx23/yBQN4pKVKTo4pR10OtTiBsWYYx3ILNcMtym/xbzbnLW4q6z0OqYZWhSCWwGwnf0UfLUFE6DUmts9sutEi5YjNdckuKxmK/DitLUgtvNuspaXyo3fKO6gdG6NSeB6Ksc45Gw6tJ7ml48F5bvVvKFnu5lk9v2LZlkWTNy4t+ya3zmrDGgtIYhJdjvONBDoHKKKUHRW8SCTw0049VVTsLua7RYjZzAv2RyINhnC52ixSrigwo7qUrCGhq0pYb8sjiVFI6PODIxkG0zUa4TjYH0ZQ9/oaxNqXo/QjWcattC1lfJeJP61mUfo1dv3R7+g1EucG035k43/zS9/oKleSlRxe6lQCVd5u6gHUA7h89Zoq1WHrXxKIRcZxvnmizcf/ADDbf3Zr+gVsK1+P/mG2/uzX9ArYV6OfpM+kilKVAClKUBAdpcab4Xx6ZGt0q4NRzIS6Ije+pG8lISSNfoNaPwnO+bl77J76tqlYqQpVdHWRu0rcfNv5nOxGAo4menUvf1lS+E53zcvfZPfTwnO+bl77J76tqlVd2w3I+prbIwvn1+xUvhOd83L32T308Jzvm5e+ye+rapTu2G5H1GyML59fsVL4TnfNy99k99PCc75uXvsnvq2qU7thuR9RsjC+fX7FS+E53zcvfZPfWFe5Fyn2WfGaxy9F16O42gGLoNSkgef6auelSjQw8JKShw8zMeycNFpq/X7GFZWVx7PAacSUOIYbSpJ6QQkAis2lKtbu7nZFKUrAP//Z" - }, - "metadata": {}, - "output_type": "display_data" - } - ], - "source": [ - "const otherRunnableGraph = otherGraph.getGraph();\n", - "const otherImage = await otherRunnableGraph.drawMermaidPng();\n", - "const otherArrayBuffer = await otherImage.arrayBuffer();\n", - "\n", - "await tslab.display.png(new Uint8Array(otherArrayBuffer));" - ] - }, - { - "cell_type": "markdown", - "id": "5ff9d991", - "metadata": {}, - "source": [ - "Now when we call `streamEvents`, we can see that we can now filter on run name to only see the final summary generation of the current chat history:" - ] - }, - { - "cell_type": "code", - "execution_count": 13, - "id": "51381303", - "metadata": {}, - "outputs": [ 
- { - "name": "stdout", - "output_type": "stream", - "text": [ - "\n", - "You\n", - " asked\n", - " about\n", - " the\n", - " capital\n", - " of\n", - " Nepal\n", - ",\n", - " and\n", - " I\n", - " responded\n", - " that\n", - " it\n", - " is\n", - " Kathmandu\n", - ".\n", - "\n" - ] - } - ], - "source": [ - "const otherEventStream = await otherGraph.streamEvents(\n", - " { messages: [{ role: \"user\", content: \"What's the capital of Nepal?\" }] },\n", - " { version: \"v2\" },\n", - " { includeNames: [\"Summarizer\"] }\n", - ");\n", - "\n", - "for await (const { event, data } of otherEventStream) {\n", - " if (event === \"on_chat_model_stream\") {\n", - " console.log(data.chunk.content);\n", - " }\n", - "}" - ] - }, - { - "cell_type": "markdown", - "id": "46998967", - "metadata": {}, - "source": [ - "And you can see the resulting chunks are only ones from the final summary model call.\n", - "\n", - "## Next steps\n", - "\n", - "You've now seen some ways to stream LLM tokens from within your graph. Next, check out some of the other how-tos around streaming by going [to this page](/langgraphjs/how-tos/#streaming)." - ] + "source": [] } ], "metadata": { diff --git a/examples/how-tos/streaming-content.ipynb b/examples/how-tos/streaming-content.ipynb new file mode 100644 index 00000000..121eb9a5 --- /dev/null +++ b/examples/how-tos/streaming-content.ipynb @@ -0,0 +1,352 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "15c4bd28", + "metadata": {}, + "source": [ + "# How to stream custom data\n", + "\n", + "
\n", + "

Prerequisites

\n", + "

\n", + " This guide assumes familiarity with the following:\n", + "

\n", + "

\n", + "
\n", + "\n", + "The most common use case for streaming from inside a node is to stream LLM tokens, but you may also want to stream custom data.\n", + "\n", + "For example, if you have a long-running tool call, you can dispatch custom events between the steps and use these custom events to monitor progress. You could also surface these custom events to an end user of your application to show them how the current task is progressing.\n", + "\n", + "You can do so in two ways:\n", + "\n", + "* using your graph's `.stream` method with `streamMode: \"custom\"`\n", + "* emitting custom events using [`dispatchCustomEvents`](https://js.langchain.com/docs/how_to/callbacks_custom_events/) with `streamEvents`.\n", + "\n", + "Below we'll see how to use both APIs.\n", + "\n", + "## Setup\n", + "\n", + "First, let's install our required packages:\n", + "\n", + "```bash\n", + "npm install @langchain/langgraph @langchain/core\n", + "```" + ] + }, + { + "cell_type": "markdown", + "id": "12297071", + "metadata": {}, + "source": [ + "
\n", + "

Set up LangSmith for LangGraph development

\n", + "

\n", + " Sign up for LangSmith to quickly spot issues and improve the performance of your LangGraph projects. LangSmith lets you use trace data to debug, test, and monitor your LLM apps built with LangGraph — read more about how to get started here. \n", + "

\n", + "
" + ] + }, + { + "cell_type": "markdown", + "id": "29814253-ca9b-4844-a8a5-d6b19fbdbdba", + "metadata": {}, + "source": [ + "## Stream custom data using `.stream`" + ] + }, + { + "cell_type": "markdown", + "id": "b729644a-b65f-4e69-ad45-f2e88ffb4e9d", + "metadata": {}, + "source": [ + "### Define the graph" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "id": "9731c40f-5ce7-460d-b2ad-33185529c99d", + "metadata": {}, + "outputs": [], + "source": [ + "import {\n", + " StateGraph,\n", + " MessagesAnnotation,\n", + " LangGraphRunnableConfig,\n", + "} from \"@langchain/langgraph\";\n", + "\n", + "const myNode = async (\n", + " _state: typeof MessagesAnnotation.State,\n", + " config: LangGraphRunnableConfig\n", + ") => {\n", + " const chunks = [\n", + " \"Four\",\n", + " \"score\",\n", + " \"and\",\n", + " \"seven\",\n", + " \"years\",\n", + " \"ago\",\n", + " \"our\",\n", + " \"fathers\",\n", + " \"...\",\n", + " ];\n", + " for (const chunk of chunks) {\n", + " // write the chunk to be streamed using streamMode=custom\n", + " // Only populated if one of the passed stream modes is \"custom\".\n", + " config.writer?.(chunk);\n", + " }\n", + " return {\n", + " messages: [{\n", + " role: \"assistant\",\n", + " content: chunks.join(\" \"),\n", + " }],\n", + " };\n", + "};\n", + "\n", + "const graph = new StateGraph(MessagesAnnotation)\n", + " .addNode(\"model\", myNode)\n", + " .addEdge(\"__start__\", \"model\")\n", + " .compile();" + ] + }, + { + "cell_type": "markdown", + "id": "ecd69eed-9624-4640-b0af-c9f82b190900", + "metadata": {}, + "source": [ + "### Stream content" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "id": "00a91b15-82c7-443c-acb6-a7406df15cee", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Four\n", + "score\n", + "and\n", + "seven\n", + "years\n", + "ago\n", + "our\n", + "fathers\n", + "...\n" + ] + } + ], + "source": [ + "const inputs = [{\n", + " role: \"user\",\n", + " content: \"What are you thinking about?\",\n", + "}];\n", + "\n", + "const stream = await graph.stream(\n", + " { messages: inputs },\n", + " { streamMode: \"custom\" }\n", + ");\n", + "\n", + "for await (const chunk of stream) {\n", + " console.log(chunk);\n", + "}" + ] + }, + { + "cell_type": "markdown", + "id": "c7b9f1f0-c170-40dc-9c22-289483dfbc99", + "metadata": {}, + "source": [ + "You will likely need to use [multiple streaming modes](https://langchain-ai.github.io/langgraphjs/how-tos/stream-multiple/) as you will\n", + "want access to both the custom data and the state updates." + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "id": "f8ed22d4-6ce6-4b04-a68b-2ea516e3ab15", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[ 'custom', 'Four' ]\n", + "[ 'custom', 'score' ]\n", + "[ 'custom', 'and' ]\n", + "[ 'custom', 'seven' ]\n", + "[ 'custom', 'years' ]\n", + "[ 'custom', 'ago' ]\n", + "[ 'custom', 'our' ]\n", + "[ 'custom', 'fathers' ]\n", + "[ 'custom', '...' 
]\n", + "[ 'updates', { model: { messages: [Array] } } ]\n" + ] + } + ], + "source": [ + "const streamMultiple = await graph.stream(\n", + " { messages: inputs },\n", + " { streamMode: [\"custom\", \"updates\"] }\n", + ");\n", + "\n", + "for await (const chunk of streamMultiple) {\n", + " console.log(chunk);\n", + "}" + ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "id": "ca976d6a-7c64-4603-8bb4-dee95428c33d", + "metadata": {}, + "source": [ + "## Stream custom data using `.streamEvents`\n", + "\n", + "If you are already using graph's `.streamEvents` method in your workflow, you can also stream custom data by emitting custom events using `dispatchCustomEvents`" + ] + }, + { + "cell_type": "markdown", + "id": "b390a9fe-2d5f-4e82-a1ea-c7c0186b8559", + "metadata": {}, + "source": [ + "### Define the graph" + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "id": "486a01a0", + "metadata": {}, + "outputs": [], + "source": [ + "import { dispatchCustomEvent } from \"@langchain/core/callbacks/dispatch\";\n", + "\n", + "const graphNode = async (_state: typeof MessagesAnnotation.State) => {\n", + " const chunks = [\n", + " \"Four\",\n", + " \"score\",\n", + " \"and\",\n", + " \"seven\",\n", + " \"years\",\n", + " \"ago\",\n", + " \"our\",\n", + " \"fathers\",\n", + " \"...\",\n", + " ];\n", + " for (const chunk of chunks) {\n", + " await dispatchCustomEvent(\"my_custom_event\", { chunk });\n", + " }\n", + " return {\n", + " messages: [{\n", + " role: \"assistant\",\n", + " content: chunks.join(\" \"),\n", + " }],\n", + " };\n", + "};\n", + "\n", + "const graphWithDispatch = new StateGraph(MessagesAnnotation)\n", + " .addNode(\"model\", graphNode)\n", + " .addEdge(\"__start__\", \"model\")\n", + " .compile();" + ] + }, + { + "cell_type": "markdown", + "id": "7dcded03-6776-405e-afae-005a3212d3e4", + "metadata": {}, + "source": [ + "### Stream content" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "id": "ce773a40", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Four|\n", + "score|\n", + "and|\n", + "seven|\n", + "years|\n", + "ago|\n", + "our|\n", + "fathers|\n", + "...|\n" + ] + } + ], + "source": [ + "const eventStream = await graphWithDispatch.streamEvents(\n", + " {\n", + " messages: [{\n", + " role: \"user\",\n", + " content: \"What are you thinking about?\",\n", + " }]\n", + " },\n", + " {\n", + " version: \"v2\",\n", + " },\n", + ");\n", + "\n", + "for await (const { event, name, data } of eventStream) {\n", + " if (event === \"on_custom_event\" && name === \"my_custom_event\") {\n", + " console.log(`${data.chunk}|`);\n", + " }\n", + "}" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "TypeScript", + "language": "typescript", + "name": "tslab" + }, + "language_info": { + "codemirror_mode": { + "mode": "typescript", + "name": "javascript", + "typescript": true + }, + "file_extension": ".ts", + "mimetype": "text/typescript", + "name": "typescript", + "version": "3.7.2" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/libs/langgraph/src/pregel/algo.ts b/libs/langgraph/src/pregel/algo.ts index 13cff6eb..7b1d71bc 100644 --- a/libs/langgraph/src/pregel/algo.ts +++ b/libs/langgraph/src/pregel/algo.ts @@ -672,7 +672,7 @@ export function _prepareSingleTask< mergeConfigs(config, { metadata, tags: proc.tags, - store: extra.store ?? config?.store, + store: extra.store ?? 
           config.store,
         }),
         {
           runName: name,
diff --git a/libs/langgraph/src/pregel/index.ts b/libs/langgraph/src/pregel/index.ts
index 18ec0d3d..cbdf3b1a 100644
--- a/libs/langgraph/src/pregel/index.ts
+++ b/libs/langgraph/src/pregel/index.ts
@@ -806,7 +806,7 @@ export class Pregel<
   StreamMode[], // stream mode
   string | string[], // input keys
   string | string[], // output keys
-  RunnableConfig, // config without pregel keys
+  LangGraphRunnableConfig, // config without pregel keys
   All | string[], // interrupt before
   All | string[], // interrupt after
   BaseCheckpointSaver | undefined,
@@ -1016,19 +1016,24 @@ export class Pregel<
       const messageStreamer = new StreamMessagesHandler((chunk) =>
         stream.push(chunk)
       );
-      const { callbacks } = restConfig;
+      const { callbacks } = config;
       if (callbacks === undefined) {
-        restConfig.callbacks = [messageStreamer];
+        config.callbacks = [messageStreamer];
       } else if (Array.isArray(callbacks)) {
-        restConfig.callbacks = callbacks.concat(messageStreamer);
+        config.callbacks = callbacks.concat(messageStreamer);
       } else {
         const copiedCallbacks = callbacks.copy();
         copiedCallbacks.addHandler(messageStreamer, true);
-        restConfig.callbacks = copiedCallbacks;
+        config.callbacks = copiedCallbacks;
       }
     }
-    const callbackManager = await getCallbackManagerForConfig(restConfig);
+    // setup custom stream mode
+    if (streamMode.includes("custom")) {
+      config.writer = (chunk: unknown) => stream.push([[], "custom", chunk]);
+    }
+
+    const callbackManager = await getCallbackManagerForConfig(config);
     const runManager = await callbackManager?.handleChainStart(
       this.toJSON(),
       _coerceToDict(input, "input"),
@@ -1036,7 +1041,7 @@
       undefined,
       undefined,
       undefined,
-      restConfig?.runName ?? this.getName()
+      config?.runName ?? this.getName()
     );
     const { channelSpecs, managed } = await this.prepareSpecs(config);
diff --git a/libs/langgraph/src/pregel/messages.ts b/libs/langgraph/src/pregel/messages.ts
index a03bf3ab..0e4c155c 100644
--- a/libs/langgraph/src/pregel/messages.ts
+++ b/libs/langgraph/src/pregel/messages.ts
@@ -74,7 +74,7 @@ export class StreamMessagesHandler extends BaseCallbackHandler {
     _extraParams?: Record,
     tags?: string[],
     metadata?: Record,
-    runName?: string
+    name?: string
   ) {
     if (
       metadata &&
@@ -83,7 +83,7 @@ export class StreamMessagesHandler extends BaseCallbackHandler {
     ) {
       this.metadatas[runId] = [
         (metadata.langgraph_checkpoint_ns as string).split("NS_SEP"),
-        { tags, runName, ...metadata },
+        { tags, name, ...metadata },
       ];
     }
   }
@@ -135,16 +135,16 @@ export class StreamMessagesHandler extends BaseCallbackHandler {
     tags?: string[],
     metadata?: Record,
     _runType?: string,
-    runName?: string
+    name?: string
   ) {
     if (
       metadata !== undefined &&
-      runName === metadata.langgraph_node &&
+      name === metadata.langgraph_node &&
       (tags === undefined || !tags.includes(TAG_HIDDEN))
     ) {
       this.metadatas[runId] = [
         (metadata.langgraph_checkpoint_ns as string).split("NS_SEP"),
-        { tags, runName, ...metadata },
+        { tags, name, ...metadata },
       ];
     }
   }
diff --git a/libs/langgraph/src/pregel/runnable_types.ts b/libs/langgraph/src/pregel/runnable_types.ts
index 278df0c6..fa01ec70 100644
--- a/libs/langgraph/src/pregel/runnable_types.ts
+++ b/libs/langgraph/src/pregel/runnable_types.ts
@@ -6,4 +6,6 @@ export interface LangGraphRunnableConfig<
   ConfigurableType extends Record = Record
 > extends RunnableConfig {
   store?: BaseStore;
+
+  writer?: (chunk: unknown) => void;
 }
diff --git a/libs/langgraph/src/pregel/types.ts b/libs/langgraph/src/pregel/types.ts
index e7dd3723..f2197ca8 100644
--- a/libs/langgraph/src/pregel/types.ts
+++ b/libs/langgraph/src/pregel/types.ts
@@ -182,7 +182,8 @@ export interface PregelExecutableTask<
 > {
   readonly name: N;
   readonly input: unknown;
-  readonly proc: Runnable;
+  // eslint-disable-next-line @typescript-eslint/no-explicit-any
+  readonly proc: Runnable;
   readonly writes: PendingWrite[];
   readonly config?: LangGraphRunnableConfig;
   readonly triggers: Array;
diff --git a/libs/langgraph/src/pregel/utils/config.ts b/libs/langgraph/src/pregel/utils/config.ts
index 549c878b..a8b19968 100644
--- a/libs/langgraph/src/pregel/utils/config.ts
+++ b/libs/langgraph/src/pregel/utils/config.ts
@@ -16,6 +16,7 @@ const CONFIG_KEYS = [
   "outputKeys",
   "streamMode",
   "store",
+  "writer",
 ];

 const DEFAULT_RECURSION_LIMIT = 25;
diff --git a/libs/langgraph/src/tests/pregel.test.ts b/libs/langgraph/src/tests/pregel.test.ts
index 6a466454..ad593ef0 100644
--- a/libs/langgraph/src/tests/pregel.test.ts
+++ b/libs/langgraph/src/tests/pregel.test.ts
@@ -8076,9 +8076,7 @@ export function runPregelTests(
     });
   });

-  it("should work with streamMode messages from within a subgraph", async () => {
-    const checkpointer = await createCheckpointer();
-
+  it("should work with streamMode messages and custom from within a subgraph", async () => {
     const child = new StateGraph(MessagesAnnotation)
       .addNode("c_one", () => ({
         messages: [new HumanMessage("foo"), new AIMessage("bar")],
       }))
@@ -8092,8 +8090,11 @@ export function runPregelTests(
           runName: "c_two_chat_model_stream",
         });
         // eslint-disable-next-line @typescript-eslint/no-unused-vars
-        for await (const _ of stream) {
-          // pass
+        for await (const chunk of stream) {
+          config.writer?.({
+            content: chunk.content,
+            from: "subgraph",
+          });
         }
         return { messages: [await model.invoke("hey", config)] };
       })
@@ -8106,6 +8107,9 @@ export function runPregelTests(
         const toolExecutor = RunnableLambda.from(async () => {
           return [new ToolMessage({ content: "qux", tool_call_id: "test" })];
         });
+        config.writer?.({
+          from: "parent",
+        });
         return {
           messages: await toolExecutor.invoke({}, config),
         };
@@ -8123,21 +8127,20 @@ export function runPregelTests(
       .addEdge("p_two", "p_three")
       .addEdge("p_three", END);

-    const graph = parent.compile({ checkpointer });
-    const config = { configurable: { thread_id: "1" } };
+    const graph = parent.compile({});
+    const config = {};

-    const checkpointEvents: StateSnapshot[] = await gatherIterator(
+    const streamedEvents: StateSnapshot[] = await gatherIterator(
       graph.stream({ messages: [] }, { ...config, streamMode: "messages" })
     );

-    expect(checkpointEvents).toEqual([
+    expect(streamedEvents).toEqual([
       [
         new _AnyIdToolMessage({
           tool_call_id: "test",
           content: "qux",
         }),
         {
-          thread_id: "1",
           langgraph_step: 1,
           langgraph_node: "p_one",
           langgraph_triggers: ["__start__:p_one"],
@@ -8146,7 +8149,7 @@ export function runPregelTests(
           __pregel_resuming: false,
           __pregel_task_id: expect.any(String),
           checkpoint_ns: expect.stringMatching(/^p_one:/),
-          runName: "p_one",
+          name: "p_one",
           tags: ["graph:step:1"],
         },
       ],
@@ -8155,7 +8158,6 @@ export function runPregelTests(
           content: "foo",
         }),
         {
-          thread_id: "1",
           langgraph_step: 1,
           langgraph_node: "c_one",
           langgraph_triggers: ["__start__:c_one"],
@@ -8164,7 +8166,7 @@ export function runPregelTests(
           __pregel_resuming: false,
           __pregel_task_id: expect.any(String),
           checkpoint_ns: expect.stringMatching(/^p_two:/),
-          runName: "c_one",
+          name: "c_one",
           tags: ["graph:step:1"],
         },
       ],
@@ -8173,7 +8175,6 @@ export function runPregelTests(
           content: "bar",
         }),
         {
-          thread_id: "1",
           langgraph_step: 1,
           langgraph_node: "c_one",
           langgraph_triggers: ["__start__:c_one"],
@@ -8182,7 +8183,7 @@ export function runPregelTests(
           __pregel_resuming: false,
           __pregel_task_id: expect.any(String),
           checkpoint_ns: expect.stringMatching(/^p_two:/),
-          runName: "c_one",
+          name: "c_one",
           tags: ["graph:step:1"],
         },
       ],
@@ -8191,7 +8192,6 @@ export function runPregelTests(
           content: "1",
         }),
         {
-          thread_id: "1",
           langgraph_step: 2,
           langgraph_node: "c_two",
           langgraph_triggers: ["c_one"],
@@ -8204,7 +8204,7 @@ export function runPregelTests(
           ls_provider: "FakeChatModel",
           ls_stop: undefined,
           tags: ["c_two_chat_model"],
-          runName: "c_two_chat_model_stream",
+          name: "c_two_chat_model_stream",
         },
       ],
       [
@@ -8212,7 +8212,6 @@ export function runPregelTests(
           content: "2",
         }),
         {
-          thread_id: "1",
           langgraph_step: 2,
           langgraph_node: "c_two",
           langgraph_triggers: ["c_one"],
@@ -8225,7 +8224,7 @@ export function runPregelTests(
           ls_provider: "FakeChatModel",
           ls_stop: undefined,
           tags: ["c_two_chat_model"],
-          runName: "c_two_chat_model_stream",
+          name: "c_two_chat_model_stream",
         },
       ],
       [
@@ -8233,7 +8232,6 @@ export function runPregelTests(
           content: "3",
         }),
         {
-          thread_id: "1",
           langgraph_step: 2,
           langgraph_node: "c_two",
           langgraph_triggers: ["c_one"],
@@ -8246,7 +8244,7 @@ export function runPregelTests(
           ls_provider: "FakeChatModel",
           ls_stop: undefined,
           tags: ["c_two_chat_model"],
-          runName: "c_two_chat_model_stream",
+          name: "c_two_chat_model_stream",
         },
       ],
       [
@@ -8254,7 +8252,6 @@ export function runPregelTests(
           content: "baz",
         }),
         {
-          thread_id: "1",
           langgraph_step: 2,
           langgraph_node: "c_two",
           langgraph_triggers: ["c_one"],
@@ -8274,7 +8271,6 @@ export function runPregelTests(
           content: "parent",
         }),
         {
-          thread_id: "1",
           langgraph_step: 3,
           langgraph_node: "p_three",
           langgraph_triggers: ["p_two"],
@@ -8290,6 +8286,222 @@ export function runPregelTests(
         },
       ],
     ]);
+
+    const streamedCustomEvents: StateSnapshot[] = await gatherIterator(
+      graph.stream({ messages: [] }, { ...config, streamMode: "custom" })
+    );
+
+    expect(streamedCustomEvents).toEqual([
+      {
+        from: "parent",
+      },
+      {
+        content: "1",
+        from: "subgraph",
+      },
+      {
+        content: "2",
+        from: "subgraph",
+      },
+      {
+        content: "3",
+        from: "subgraph",
+      },
+    ]);
+
+    const streamedCombinedEvents: StateSnapshot[] = await gatherIterator(
+      graph.stream(
+        { messages: [] },
+        { ...config, streamMode: ["custom", "messages"] }
+      )
+    );
+
+    expect(streamedCombinedEvents).toEqual([
+      ["custom", { from: "parent" }],
+      [
+        "messages",
+        [
+          new _AnyIdToolMessage({
+            tool_call_id: "test",
+            content: "qux",
+          }),
+          {
+            langgraph_step: 1,
+            langgraph_node: "p_one",
+            langgraph_triggers: ["__start__:p_one"],
+            langgraph_path: [PULL, "p_one"],
+            langgraph_checkpoint_ns: expect.stringMatching(/^p_one:/),
+            __pregel_resuming: false,
+            __pregel_task_id: expect.any(String),
+            checkpoint_ns: expect.stringMatching(/^p_one:/),
+            name: "p_one",
+            tags: ["graph:step:1"],
+          },
+        ],
+      ],
+      [
+        "messages",
+        [
+          new _AnyIdHumanMessage({
+            content: "foo",
+          }),
+          {
+            langgraph_step: 1,
+            langgraph_node: "c_one",
+            langgraph_triggers: ["__start__:c_one"],
+            langgraph_path: [PULL, "c_one"],
+            langgraph_checkpoint_ns:
+              expect.stringMatching(/^p_two:.*\|c_one:.*/),
+            __pregel_resuming: false,
+            __pregel_task_id: expect.any(String),
+            checkpoint_ns: expect.stringMatching(/^p_two:/),
+            name: "c_one",
+            tags: ["graph:step:1"],
+          },
+        ],
+      ],
+      [
+        "messages",
+        [
+          new _AnyIdAIMessage({
+            content: "bar",
+          }),
+          {
+            langgraph_step: 1,
+            langgraph_node: "c_one",
+            langgraph_triggers: ["__start__:c_one"],
+            langgraph_path: [PULL, "c_one"],
+            langgraph_checkpoint_ns:
+              expect.stringMatching(/^p_two:.*\|c_one:.*/),
+            __pregel_resuming: false,
+            __pregel_task_id: expect.any(String),
+            checkpoint_ns: expect.stringMatching(/^p_two:/),
+            name: "c_one",
+            tags: ["graph:step:1"],
+          },
+        ],
+      ],
+      [
+        "messages",
+        [
+          new _AnyIdAIMessageChunk({
+            content: "1",
+          }),
+          {
+            langgraph_step: 2,
+            langgraph_node: "c_two",
+            langgraph_triggers: ["c_one"],
+            langgraph_path: [PULL, "c_two"],
+            langgraph_checkpoint_ns:
+              expect.stringMatching(/^p_two:.*\|c_two:.*/),
+            __pregel_resuming: false,
+            __pregel_task_id: expect.any(String),
+            checkpoint_ns: expect.stringMatching(/^p_two:/),
+            ls_model_type: "chat",
+            ls_provider: "FakeChatModel",
+            ls_stop: undefined,
+            tags: ["c_two_chat_model"],
+            name: "c_two_chat_model_stream",
+          },
+        ],
+      ],
+      ["custom", { from: "subgraph", content: "1" }],
+      [
+        "messages",
+        [
+          new _AnyIdAIMessageChunk({
+            content: "2",
+          }),
+          {
+            langgraph_step: 2,
+            langgraph_node: "c_two",
+            langgraph_triggers: ["c_one"],
+            langgraph_path: [PULL, "c_two"],
+            langgraph_checkpoint_ns:
+              expect.stringMatching(/^p_two:.*\|c_two:.*/),
+            __pregel_resuming: false,
+            __pregel_task_id: expect.any(String),
+            checkpoint_ns: expect.stringMatching(/^p_two:/),
+            ls_model_type: "chat",
+            ls_provider: "FakeChatModel",
+            ls_stop: undefined,
+            tags: ["c_two_chat_model"],
+            name: "c_two_chat_model_stream",
+          },
+        ],
+      ],
+      ["custom", { from: "subgraph", content: "2" }],
+      [
+        "messages",
+        [
+          new _AnyIdAIMessageChunk({
+            content: "3",
+          }),
+          {
+            langgraph_step: 2,
+            langgraph_node: "c_two",
+            langgraph_triggers: ["c_one"],
+            langgraph_path: [PULL, "c_two"],
+            langgraph_checkpoint_ns:
+              expect.stringMatching(/^p_two:.*\|c_two:.*/),
+            __pregel_resuming: false,
+            __pregel_task_id: expect.any(String),
+            checkpoint_ns: expect.stringMatching(/^p_two:/),
+            ls_model_type: "chat",
+            ls_provider: "FakeChatModel",
+            ls_stop: undefined,
+            tags: ["c_two_chat_model"],
+            name: "c_two_chat_model_stream",
+          },
+        ],
+      ],
+      ["custom", { from: "subgraph", content: "3" }],
+      [
+        "messages",
+        [
+          new _AnyIdAIMessage({
+            content: "baz",
+          }),
+          {
+            langgraph_step: 2,
+            langgraph_node: "c_two",
+            langgraph_triggers: ["c_one"],
+            langgraph_path: [PULL, "c_two"],
+            langgraph_checkpoint_ns:
+              expect.stringMatching(/^p_two:.*\|c_two:.*/),
+            __pregel_resuming: false,
+            __pregel_task_id: expect.any(String),
+            checkpoint_ns: expect.stringMatching(/^p_two:/),
+            ls_model_type: "chat",
+            ls_provider: "FakeChatModel",
+            ls_stop: undefined,
+            tags: ["c_two_chat_model"],
+          },
+        ],
+      ],
+      [
+        "messages",
+        [
+          new _AnyIdAIMessage({
+            content: "parent",
+          }),
+          {
+            langgraph_step: 3,
+            langgraph_node: "p_three",
+            langgraph_triggers: ["p_two"],
+            langgraph_path: [PULL, "p_three"],
+            langgraph_checkpoint_ns: expect.stringMatching(/^p_three/),
+            __pregel_resuming: false,
+            __pregel_task_id: expect.any(String),
+            checkpoint_ns: expect.stringMatching(/^p_three/),
+            ls_model_type: "chat",
+            ls_provider: "FakeChatModel",
+            ls_stop: undefined,
+            tags: [],
+          },
+        ],
+      ],
+    ]);
   });

   it("debug retry", async () => {