
Commit

chunk delimiter
lassejaco committed Sep 9, 2024
1 parent de0dd23 commit e83a493
Showing 3 changed files with 53 additions and 53 deletions.
README.md (6 changes: 3 additions & 3 deletions)
@@ -17,11 +17,11 @@ This is the main repository for events organized by the Ethereum Foundation

## Development

- npm install inside lib before you start for types to be available
- run whatever project you like as if you weren't working in a monorepo (yay, no added complexity :^D)
- npm install inside lib before you start for types to be available in lib
- run whichever project you like as if you weren't working in a monorepo

### Some notes:

- the reason we don't use yarn workspaces to solve shared dependencies is that too many hacks were required to make them work - hoisting confuses TypeScript and webpack/Next.js, introduces weird edge cases and traps, is harder to reason about, and is harder to install in isolation on Netlify - it sounds simple in theory, but it isn't (as of 19/10/2023).
- lib actually doesn't need _all_ dependencies installed, _just enough for types to be present/for TypeScript to work_ (e.g. @types/react is enough; it doesn't also need to install react, even though it uses react - read more below). Note: some packages may need to be installed simply because the types ship inside the package itself. That's fine; no harm done if you install too much, it won't get used anyway (read more below).
- the _actual dependencies_ must be installed by the project that _uses_ lib ("the consumer") - we then use simple webpack config to tell any imports from lib to resolve dependencies in the consumer's node_modules rather than it's own - voila, no hosting needed (and the complications that brings), and we only have one source of dependencies (no version conflicts, no duplicated output, etc.). This also means its the consumers responsibility to have all packages installed when it uses lib - which is fine, but worth keeping in mind, because it also means changes to lib that require new packages or versions need immediate action in the consumers.
- the _actual dependencies_ must be installed by the project that _uses_ lib ("the consumer") - we then use a simple webpack config to make any imports from lib resolve dependencies in the consumer's node_modules rather than its own (see the sketch below) - no package hoisting needed (and none of the complications that brings), and we only have one source of dependencies (no version conflicts, no duplicated output, etc.). This also means it's the consumer's responsibility to have all packages installed when it uses lib - which is fine, but worth keeping in mind, because changes to lib that require new packages or versions need immediate action in the consumers.
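To make the bullet above concrete, here is a minimal sketch of the kind of consumer-side webpack tweak this refers to. The package names and paths are illustrative, not this repository's actual config:

```ts
// Hypothetical consumer-side webpack config (sketch only, not the repo's real setup).
// It forces any import of react/react-dom - including imports coming from ../lib -
// to resolve against the consumer's node_modules, so there is one dependency source.
import path from "path";
import type { Configuration } from "webpack";

const config: Configuration = {
  resolve: {
    alias: {
      react: path.resolve(process.cwd(), "node_modules/react"),
      "react-dom": path.resolve(process.cwd(), "node_modules/react-dom"),
    },
  },
};

export default config;
```

With an alias like this, lib itself only needs enough installed for types to resolve, while the runtime packages come from whichever consumer builds it.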
devcon/src/pages/api/ai/index.ts (2 changes: 1 addition & 1 deletion)
@@ -63,7 +63,7 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)

// Stream the response to the client
for await (const chunk of stream) {
res.write(JSON.stringify(chunk))
res.write(JSON.stringify(chunk) + '_chunk_end_')
// res.flush() // Ensure the data is sent immediately
}

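For context on the change above: without a sentinel, consecutive `JSON.stringify(chunk)` writes arrive at the client as one concatenated string with ambiguous boundaries; appending a delimiter that never occurs in the payload makes the frames trivial to split. A small illustrative sketch - the event shapes and `threadID` value are made up, not taken from this API:

```ts
// Illustrative only: roughly what the client now receives on the wire.
const wire =
  '{"type":"thread.message.delta","content":"Hi"}_chunk_end_' +
  '{"type":"done","threadID":"thread_123"}_chunk_end_';

// Splitting on the sentinel (and dropping the trailing empty piece) recovers each frame.
const frames = wire
  .split("_chunk_end_")
  .filter(Boolean)
  .map((frame) => JSON.parse(frame));

console.log(frames.length); // 2
```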
lib/components/ai/overlay.tsx (98 changes: 49 additions & 49 deletions)
@@ -21,6 +21,7 @@ import {
threadIDState,
messagesState,
} from "./state"; // Adjust the import path
import { InfoIcon } from "tinacms";

const DevaBot = () => {
const [visible, setVisible] = useRecoilState(visibleState);
@@ -51,12 +51,14 @@ const DevaBot = () => {
};

const [streamingMessage, setStreamingMessage] = React.useState("");
const [partialChunk, setPartialChunk] = React.useState("");

const onSend = async () => {
if (executingQuery) return;

setExecutingQuery(true);
setStreamingMessage("");
setPartialChunk("");

try {
const response = await fetch("/api/ai", {
@@ -75,57 +78,58 @@ const DevaBot = () => {
.pipeThrough(new TextDecoderStream())
.getReader();

let buffer = "";
const chunkDelimiter = "_chunk_end_";

const processChunk = (chunk: string) => {
try {
const response = JSON.parse(chunk);

if (response.error) {
setError(response.error);
setExecutingQuery(false);
return;
}

if (response.type === "thread.message.delta") {
setStreamingMessage((prev) => prev + response.content);
}

if (response.type === "done") {
setThreadID(response.threadID);
setMessages(response.messages);
setStreamingMessage("");
setExecutingQuery(false);
}
} catch (parseError) {
console.error("Error parsing chunk:", parseError);
}
};

while (true) {
const { value, done } = await reader.read();
if (done) break;

console.log(value, "value");

// Sometimes the stream sends multiple JSON objects in a single string - this splits them, then repairs them
const jsonStrings = value.trim().split('"}{"');

for (let i = 0; i < jsonStrings.length; i++) {
const jsonString = jsonStrings[i];

try {
let response;
let repairedString = jsonString;

if (i === 0 && jsonStrings.length > 1) {
repairedString = jsonString + '"}';
} else if (i > 0 && i < jsonStrings.length - 1) {
repairedString = '{"' + jsonString + '"}';
} else if (i > 0 && i === jsonStrings.length - 1) {
repairedString = '{"' + jsonString;
}

response = JSON.parse(repairedString);

if (response.error) {
setError(response.error);
setExecutingQuery(false);
return;
}

if (response.type === "thread.message.delta") {
setStreamingMessage((prev) => prev + response.content);
}

if (response.type === "done") {
setThreadID(response.threadID);
setMessages(response.messages);
setStreamingMessage("");
setExecutingQuery(false);
}
} catch (parseError) {
console.error("Error parsing JSON:", parseError);
// Optionally, you can set an error state here if needed
}
buffer += value;

let delimiterIndex;

while ((delimiterIndex = buffer.indexOf(chunkDelimiter)) !== -1) {
const chunk = buffer.slice(0, delimiterIndex);
processChunk(chunk);
buffer = buffer.slice(delimiterIndex + chunkDelimiter.length);
}
}

// Process any remaining data in the buffer
if (buffer.length > 0) {
processChunk(buffer);
}
} catch (e: any) {
console.error(e, "error");
setError("Testing streaming responses, errors will occur.." + e.message);
setError("An error occurred: " + e.message);
setExecutingQuery(false);
}
};
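As an aside on the buffering introduced above: network reads can end anywhere, so a frame may arrive split across several reads, which is exactly what the `buffer` + `indexOf(chunkDelimiter)` loop handles. A self-contained sketch of the same idea, with illustrative names that are not this component's exports:

```ts
// Standalone sketch: delimiter-based framing that survives arbitrary read boundaries.
const CHUNK_DELIMITER = "_chunk_end_";

function createFrameSplitter(onFrame: (frame: string) => void) {
  let buffer = "";
  return (read: string) => {
    buffer += read;
    let idx: number;
    while ((idx = buffer.indexOf(CHUNK_DELIMITER)) !== -1) {
      onFrame(buffer.slice(0, idx));
      buffer = buffer.slice(idx + CHUNK_DELIMITER.length);
    }
  };
}

// Example: the first frame is split mid-JSON across two reads, but the callback
// still receives each complete frame exactly once.
const push = createFrameSplitter((frame) => console.log(JSON.parse(frame)));
push('{"type":"thread.message.delta","content":"Hel');
push('lo"}_chunk_end_{"type":"done"}_chunk_end_');
```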
@@ -270,13 +274,9 @@ const DevaBot = () => {
>
This is an MVP and Deva may rarely provide answers that are not
true - we take no responsibility for, or endorse, anything Deva
says.
says{" "}
<Popover>
<PopoverTrigger>
<Button className="text-xs" size="sm">
More Information
</Button>
</PopoverTrigger>
<PopoverTrigger>ℹ️</PopoverTrigger>
<PopoverContent>
<div className="text-xs">
We currently use OpenAI due to the ease of use and mature
@@ -294,7 +294,7 @@ const DevaBot = () => {
</Popover>
</div>

<div className="shrink-0 relative w-full flex bg-slate-100 flex-col rounded overflow-hidden mb-2 text-black">
<div className="shrink-0 relative w-full flex bg-slate-800 flex-col rounded overflow-hidden mb-2">
<div className="absolute flex items-center opacity-0 w-5/6 right-0 translate-x-[60%] translate-y-[22%] bottom-0 h-full pointer-events-none">
<Image src={DevaHead} alt="Deva" className="object-cover" />
</div>
@@ -350,7 +350,7 @@ const DevaBot = () => {
</div>

<div
className={`flex absolute w-full h-full bg-gray-400 ${
className={`flex absolute w-full h-full bg-slate-800 ${
executingQuery || error
? "bg-opacity-90 pointer-events-auto"
: "bg-opacity-0 pointer-events-none"
