Support caching for streamed responses #7
I was able to combine turbo-stream with localforage (adapter below). I believe that we should not rely on client-side caching of streamed responses, as it is unreliable and unfit for purpose (especially for nested promises); instead, we should rely only on server-side and HTTP caching.

```typescript
import type { ClientLoaderFunctionArgs } from '@remix-run/react';
import localforage from 'localforage';
import { decode, encode } from 'turbo-stream';

// Replay a stored chunk as a ReadableStream so turbo-stream can decode it.
function promiseToReadableStream(promise: Promise<Uint8Array | null>): ReadableStream<Uint8Array> {
  return new ReadableStream<Uint8Array>({
    async start(controller) {
      try {
        const chunk = await promise;
        if (chunk !== null) {
          controller.enqueue(chunk);
        }
        controller.close();
      } catch (error) {
        controller.error(error);
      }
    },
  });
}

export class LocalForageAdapter {
  async getItem(key: string) {
    const encoded = await localforage.getItem<Uint8Array>(key);
    if (!encoded) return undefined;
    const stream = promiseToReadableStream(Promise.resolve(encoded));
    const decoded = await decode(stream);
    const data = decoded.value;
    await decoded.done;
    return data;
  }

  async setItem(key: string, value: unknown) {
    const reader = encode(value).getReader();
    // Read until done: encode() can emit several chunks for values
    // containing nested promises, so storing only the first would lose data.
    const chunks: Uint8Array[] = [];
    let total = 0;
    for (let result = await reader.read(); !result.done; result = await reader.read()) {
      chunks.push(result.value);
      total += result.value.byteLength;
    }
    const merged = new Uint8Array(total);
    let offset = 0;
    for (const chunk of chunks) {
      merged.set(chunk, offset);
      offset += chunk.byteLength;
    }
    return localforage.setItem(key, merged);
  }

  async removeItem(key: string) {
    return localforage.removeItem(key);
  }
}

let cache = new LocalForageAdapter();

export async function clientLoader({ serverLoader, request }: ClientLoaderFunctionArgs) {
  const cacheKey = new URL(request.url).pathname;
  const cachedData = await cache.getItem(cacheKey);
  if (cachedData) return cachedData;
  // this will happen instantly (e.g. an empty pending promise will be saved)
  const serverData = await serverLoader();
  await cache.setItem(cacheKey, serverData);
  return serverData;
}
clientLoader.hydrate = true;
```
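The `clientLoader` above implements a cache-first strategy. Stripped of `localforage` and `turbo-stream`, the control flow reduces to the following dependency-free sketch; the `Map`-based cache and `fakeServerLoader` here are illustrative stand-ins, not part of any library:

```typescript
// Illustrative stand-ins: an in-memory cache and a fake server loader.
const memoryCache = new Map<string, unknown>();

async function fakeServerLoader(): Promise<unknown> {
  return { message: "fresh from the server" };
}

// Cache-first: return cached data when present, otherwise fetch and store.
async function cacheFirstLoader(cacheKey: string): Promise<unknown> {
  const cached = memoryCache.get(cacheKey);
  if (cached !== undefined) return cached;
  const serverData = await fakeServerLoader();
  memoryCache.set(cacheKey, serverData);
  return serverData;
}
```

On the first call for a key the loader hits the (fake) server and stores the result; subsequent calls for the same key return the stored object without touching the server again.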
@predaytor I looked into this today. I'll try to figure out turbo-stream encoding/decoding and see whether we can get it working properly with nested promises and the like, with great DX (zero setup on the user's side). I'm definitely going to look into it, but I'm not sure when, or whether it will land at all, so no promises at the moment.
Stackblitz
When using `defer` to stream a response from a loader that returns a promise which is later consumed in a boundary, we cannot properly serialize that promise to store it in, say, localStorage or IndexedDB (`JSON.stringify` will produce an empty object `{}`). The only way to store this type of data is directly in memory, so that on the next navigation `clientLoader` returns the fulfilled promise with the data rather than calling the server `loader`. By using `remix-client-cache`, we can create an adapter for the cases where we need to cache streaming responses (`routes/index.tsx`, shown above), rather than relying on a global setting such as `localStorage`.
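The empty-object result mentioned above is easy to reproduce: a promise keeps its state and value in internal slots rather than as enumerable own properties, so `JSON.stringify` has nothing to serialize:

```typescript
// The deferred payload lives inside the promise's internal slots,
// so JSON serialization sees an object with no enumerable properties.
const deferredData = Promise.resolve({ critical: "done", lazy: "still streaming" });

console.log(JSON.stringify(deferredData)); // "{}"
```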
This opens up the question of whether it is a good idea to store data in server memory. It would be interesting if we could transform fulfilled promises using a library like `turbo-stream` to store them on the client side, or use a web worker, and decode them back to their original form for consumption by `<Await>`.

There is currently a bug where a promise returned from the cache has already been fulfilled: the internal logic of `remix-client-cache` cannot tell whether a revalidation should occur or fulfilled data is already present, since we must store promises directly in memory and not as strings.

https://github.com/forge42dev/remix-client-cache/blob/main/src/index.tsx#L116-L140
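For what it's worth, one way a cache could distinguish an already-fulfilled stored promise from a still-pending one is to race it against an already-resolved sentinel. This is only an illustrative heuristic (`isFulfilled` is a hypothetical helper, not remix-client-cache's actual internals), and a real implementation would also need to handle rejected promises:

```typescript
const PENDING = Symbol("pending");

// Race the stored promise against an already-resolved sentinel: if the
// sentinel wins, the promise is still pending; if the promise wins, it
// has already fulfilled. (A rejected promise would make the race reject.)
async function isFulfilled(promise: Promise<unknown>): Promise<boolean> {
  const winner = await Promise.race([promise, Promise.resolve(PENDING)]);
  return winner !== PENDING;
}
```

A cache could then decide to revalidate only when `isFulfilled` reports that the stored promise has settled, and otherwise keep awaiting the in-flight request.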