Hi guys,
I couldn't find any example in the documentation. The issue I'm facing is that the AI agent sends its reply in two or three parts rather than in one shot. How can I make sure that I receive all of the data? Thank you.
Hi @asnagni ,

Perhaps you need to pass your own callback function to the `stream` parameter in `create_async`, and here is a tiny demo illustrating how it works.

```cpp
// define your own callback function
auto callback = [](std::string data, intptr_t ptr) -> bool {
    std::cout << data << '\n';
    return true;
};
// pass it to stream
xxx.create_async(model, conversation, std::nullopt, std::nullopt, std::nullopt, callback);
```

The stream will return streaming messages with `data: {somejsondata}` and be closed by a `data: [DONE]` message.

Hope this may help you.