feat: added compression to resume data cache using deflate #73227

Open · wants to merge 2 commits into base: canary
Member

Hmm, something's weird with this snapshot test: I updated it locally, but CI is getting a different result...

...but the result deserializes to the same string?

> mine = 'eJwlibEKg0AQBf/l1VfELiykvj+wEYvNZdWDE+V2I4Fj/100zcDMNKhtVUANk1haQM0DEqflbt2Fg8tXQODY6zs+XwgwnhU0jAFqXAT0CLC8ihqv+23y23P9jyoHl/xhu9Td/QSgqiXb'
> ci   = 'eJyrViouyS9KVbKqVkpLLUnOADJqdZSSE5MzwGKGIKIsMacUyFNKdA8rTnK3sFXSUSpJTC9WsoqO1QFqT8wBShoAxTJzU4G83AIwL7WiILMIIlGUCjQhMyWxBMStBQIAoKol2w=='
> decode = (s) => require('node:zlib').inflateSync(Buffer.from(s, 'base64')).toString()
>
> a = decode(mine)
'{"store":{"fetch":{},"cache":{"1":{"value":"aGVsbG8=","tags":[],"stale":0,"timestamp":0,"expire":0,"revalidate":0}}}}'
> b = decode(ci)
'{"store":{"fetch":{},"cache":{"1":{"value":"aGVsbG8=","tags":[],"stale":0,"timestamp":0,"expire":0,"revalidate":0}}}}'
>
> a === b
true
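For what it's worth, this kind of divergence is easy to reproduce locally without two different zlib builds: the zlib header encodes the compression level, so two perfectly valid deflate streams for the same input can differ byte-for-byte while inflating back to identical strings. A standalone sketch (not the failing test itself):

```typescript
import { deflateSync, inflateSync } from 'node:zlib'

const input = '{"store":{"fetch":{},"cache":{}}}'

// Two valid deflate streams for the same input: the zlib header's FLEVEL
// bits differ between compression levels, so the raw bytes are not equal.
const a = deflateSync(input, { level: 1 }).toString('base64')
const b = deflateSync(input, { level: 9 }).toString('base64')

const decode = (s: string) =>
  inflateSync(Buffer.from(s, 'base64')).toString('utf-8')

console.log(a === b) // false
console.log(decode(a) === input && decode(b) === input) // true
```

Different zlib versions can make the same kind of byte-level choice differently, which would explain the local-vs-CI mismatch even at the same level.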

Contributor


Ah, that's the other snapshot. I think this one is just missing the decompress treatment used in packages/next/src/server/resume-data-cache/resume-data-cache.test.ts, no? Asserting on the compressed string is not very readable anyway.

Member

@lubieowoce lubieowoce Nov 27, 2024


I found some reports about zlib output being platform-dependent, which would explain this, but those are about gzip; not sure whether the same applies to deflate: nodejs/node#12244

In any case, I guess we need to change this test, maybe to just assert decode(encode(input)) === input or something?

Member Author


I'll think about that one and fix it!

@@ -32,7 +32,7 @@ describe('getDynamicHTMLPostponedState', () => {
)

expect(state).toMatchInlineSnapshot(
`"169:39[["slug","%%drp:slug:e9615126684e5%%"]]{"%%drp:slug:e9615126684e5%%":"%%drp:slug:e9615126684e5%%","nested":{"%%drp:slug:e9615126684e5%%":"%%drp:slug:e9615126684e5%%"}}{"store":{"fetch":{},"cache":{"1":{"value":"aGVsbG8=","tags":[],"stale":0,"timestamp":0,"expire":0,"revalidate":0}}}}"`
`"169:39[["slug","%%drp:slug:e9615126684e5%%"]]{"%%drp:slug:e9615126684e5%%":"%%drp:slug:e9615126684e5%%","nested":{"%%drp:slug:e9615126684e5%%":"%%drp:slug:e9615126684e5%%"}}eJwlibEKg0AQBf/l1VfELiykvj+wEYvNZdWDE+V2I4Fj/100zcDMNKhtVUANk1haQM0DEqflbt2Fg8tXQODY6zs+XwgwnhU0jAFqXAT0CLC8ihqv+23y23P9jyoHl/xhu9Td/QSgqiXb"`
)
})

@@ -4,6 +4,24 @@ import {
} from './resume-data-cache'
import { createPrerenderResumeDataCache } from './resume-data-cache'
import { streamFromString } from '../stream-utils/node-web-streams-helper'
import { inflateSync } from 'node:zlib'

function createCacheWithSingleEntry() {
const cache = createPrerenderResumeDataCache()
cache.cache.set(
'key',
Promise.resolve({
value: streamFromString('value'),
tags: [],
stale: 0,
timestamp: 0,
expire: 0,
revalidate: 0,
})
)

return cache
}

describe('stringifyResumeDataCache', () => {
it('serializes an empty cache', async () => {
@@ -12,20 +30,17 @@ describe('stringifyResumeDataCache', () => {
})

it('serializes a cache with a single entry', async () => {
const cache = createPrerenderResumeDataCache()
cache.cache.set(
'key',
Promise.resolve({
value: streamFromString('value'),
tags: [],
stale: 0,
timestamp: 0,
expire: 0,
revalidate: 0,
})
)
const cache = createCacheWithSingleEntry()
const compressed = await stringifyResumeDataCache(cache)

// We have to decompress the output before asserting because the compressed
// string is not deterministic across platforms and zlib versions; only the
// decompressed JSON is stable.
const decompressed = inflateSync(
Buffer.from(compressed, 'base64')
).toString('utf-8')

expect(await stringifyResumeDataCache(cache)).toMatchInlineSnapshot(
expect(decompressed).toMatchInlineSnapshot(
`"{"store":{"fetch":{},"cache":{"key":{"value":"dmFsdWU=","tags":[],"stale":0,"timestamp":0,"expire":0,"revalidate":0}}}}"`
)
})
@@ -37,4 +52,14 @@ describe('parseResumeDataCache', () => {
createPrerenderResumeDataCache()
)
})

it('parses a cache with a single entry', async () => {
const cache = createCacheWithSingleEntry()
const serialized = await stringifyResumeDataCache(cache)

const parsed = createRenderResumeDataCache(serialized)

expect(parsed.cache.size).toBe(1)
expect(parsed.fetch.size).toBe(0)
})
})
85 changes: 55 additions & 30 deletions packages/next/src/server/resume-data-cache/resume-data-cache.ts
@@ -1,3 +1,4 @@
import { InvariantError } from '../../shared/lib/invariant-error'
import {
type UseCacheCacheStore,
type FetchCacheStore,
@@ -64,22 +65,32 @@ type ResumeStoreSerialized = {
export async function stringifyResumeDataCache(
resumeDataCache: RenderResumeDataCache | PrerenderResumeDataCache
): Promise<string> {
if (resumeDataCache.fetch.size === 0 && resumeDataCache.cache.size === 0) {
return 'null'
}
if (process.env.NEXT_RUNTIME === 'edge') {
throw new InvariantError(
'`stringifyResumeDataCache` should not be called in edge runtime.'
)
} else {
if (resumeDataCache.fetch.size === 0 && resumeDataCache.cache.size === 0) {
return 'null'
}

const json: ResumeStoreSerialized = {
store: {
fetch: Object.fromEntries(
stringifyFetchCacheStore(resumeDataCache.fetch.entries())
),
cache: Object.fromEntries(
await stringifyUseCacheCacheStore(resumeDataCache.cache.entries())
),
},
}
const json: ResumeStoreSerialized = {
store: {
fetch: Object.fromEntries(
stringifyFetchCacheStore(resumeDataCache.fetch.entries())
),
cache: Object.fromEntries(
await stringifyUseCacheCacheStore(resumeDataCache.cache.entries())
),
},
}

return JSON.stringify(json)
// Compress the JSON string using zlib. As the data we want to compress
// is already in memory, we use the synchronous deflateSync function.
const { deflateSync } = require('node:zlib') as typeof import('node:zlib')

return deflateSync(JSON.stringify(json)).toString('base64')
}
}

/**
@@ -114,24 +125,38 @@ export function createRenderResumeDataCache(
export function createRenderResumeDataCache(
prerenderResumeDataCacheOrPersistedCache: PrerenderResumeDataCache | string
): RenderResumeDataCache {
if (typeof prerenderResumeDataCacheOrPersistedCache !== 'string') {
// If the cache is already a prerender cache, we can return it directly,
// we're just performing a type change.
return prerenderResumeDataCacheOrPersistedCache
}
if (process.env.NEXT_RUNTIME === 'edge') {
throw new InvariantError(
'`createRenderResumeDataCache` should not be called in edge runtime.'
)
} else {
if (typeof prerenderResumeDataCacheOrPersistedCache !== 'string') {
// If the cache is already a prerender cache, we can return it directly,
// we're just performing a type change.
return prerenderResumeDataCacheOrPersistedCache
}

if (prerenderResumeDataCacheOrPersistedCache === 'null') {
return {
cache: new Map(),
fetch: new Map(),
if (prerenderResumeDataCacheOrPersistedCache === 'null') {
return {
cache: new Map(),
fetch: new Map(),
}
}
}

const json: ResumeStoreSerialized = JSON.parse(
prerenderResumeDataCacheOrPersistedCache
)
return {
cache: parseUseCacheCacheStore(Object.entries(json.store.cache)),
fetch: parseFetchCacheStore(Object.entries(json.store.fetch)),
// This should be a compressed string. Decompress it using zlib. As the
// data we want to decompress is already in memory, we use the
// synchronous inflateSync function.
const { inflateSync } = require('node:zlib') as typeof import('node:zlib')

const json: ResumeStoreSerialized = JSON.parse(
inflateSync(
Buffer.from(prerenderResumeDataCacheOrPersistedCache, 'base64')
).toString('utf-8')
)

return {
cache: parseUseCacheCacheStore(Object.entries(json.store.cache)),
fetch: parseFetchCacheStore(Object.entries(json.store.fetch)),
}
}
}
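Putting the two halves of the diff together, the wire format is 'null' for an empty cache and base64(deflate(JSON)) otherwise. A standalone sketch of that round trip, with simplified stand-ins for stringifyResumeDataCache and createRenderResumeDataCache (not the actual Next.js exports):

```typescript
import { deflateSync, inflateSync } from 'node:zlib'

type Store = {
  store: { fetch: Record<string, unknown>; cache: Record<string, unknown> }
}

// Simplified stand-in for stringifyResumeDataCache.
function stringify(data: Store): string {
  if (
    Object.keys(data.store.fetch).length === 0 &&
    Object.keys(data.store.cache).length === 0
  ) {
    return 'null' // empty caches skip compression entirely
  }
  return deflateSync(JSON.stringify(data)).toString('base64')
}

// Simplified stand-in for createRenderResumeDataCache.
function parse(serialized: string): Store {
  if (serialized === 'null') {
    return { store: { fetch: {}, cache: {} } }
  }
  return JSON.parse(
    inflateSync(Buffer.from(serialized, 'base64')).toString('utf-8')
  )
}

const original: Store = {
  store: { fetch: {}, cache: { '1': { value: 'aGVsbG8=' } } },
}
const parsed = parse(stringify(original))
console.log(JSON.stringify(parsed) === JSON.stringify(original)) // true
```

The lazy `require('node:zlib')` inside the real functions keeps the Node-only import out of the edge bundle, which is why both code paths also throw an InvariantError under `process.env.NEXT_RUNTIME === 'edge'`.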