Fast and consistently responsive apps using a single function call
```bash
npm install main-thread-scheduling
```
The library lets you run computationally heavy tasks on the main thread while ensuring:
- Your app's UI doesn't freeze.
- Your users' computer fans don't spin.
- Your INP (Interaction to Next Paint) stays in the green.
- It's easy to plug it into your existing codebase.
A real-world showcase: searching a folder of 10k notes with 200k+ lines of text (50MB on disk) and getting results instantly.
- You want to turn a synchronous function into a non-blocking asynchronous function. Avoids UI freezes.
- You want to render important elements first and less urgent ones second (see the sketch after this list). Improves perceived performance.
- You want to run a long background task that doesn't spin the fans after a while. Avoids bad reputation.
- You want to run multiple background tasks that don't degrade your app's performance over time. Prevents death by a thousand cuts.
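As a quick illustration of the "important first, less urgent second" use case, here is a minimal sketch. The `renderListItem()` and `renderPreview()` helpers are placeholders invented for this example, not part of the library:

```ts
import { yieldOrContinue } from 'main-thread-scheduling'

// Placeholder helpers for this sketch; in a real app these would do DOM work.
function renderListItem(item: string): void {}
function renderPreview(item: string): void {}

async function renderSearchResults(items: string[]): Promise<void> {
    // Important elements first: render the visible list as fast as possible
    for (const item of items) {
        await yieldOrContinue('interactive')
        renderListItem(item)
    }
    // Less urgent elements second: fill in previews when the browser is idle
    for (const item of items) {
        await yieldOrContinue('idle')
        renderPreview(item)
    }
}
```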
- Uses `requestIdleCallback()` and `requestAfterFrame()` for scheduling.
- Stops task execution when the user interacts with the UI (if the `navigator.scheduling.isInputPending()` API is available); see the sketch after this list.
- Global queue. Multiple tasks are executed one by one, so increasing the number of tasks doesn't degrade performance linearly.
- Sorts tasks by importance. Sorts by strategy and gives priority to tasks requested later.
- Considerate about your existing code. Tasks with the `idle` strategy are executed last, so there isn't unexpected work slowing down the main thread after the background task is finished.
- Simple. 90% of the time you only need the `yieldOrContinue(strategy)` function. The API has two more functions for more advanced cases.
- Not a weekend project. Actively maintained for three years (see the contributors page). I've been using it in my own products for over four years: Nota and iBar. Flux.ai also uses it in their product (software for designing hardware circuits using web technologies).
- This is the future. Some browsers have already implemented support for scheduling tasks on the main thread. This library tries even harder to improve user-perceived performance (see the explanation for details).
- High quality. Aiming for high quality with my open-source principles.
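For context, here is a minimal sketch of the browser primitive behind the "stops task execution" point above. This is not the library's implementation; it only shows how `navigator.scheduling.isInputPending()` can be feature-detected and queried:

```ts
// Sketch only; the library does this check internally. isInputPending() reports
// whether the user has pending input (a click, key press, etc.). It isn't
// available in every browser, hence the feature detection.
function userInputIsPending(): boolean {
    const scheduling = (navigator as unknown as {
        scheduling?: { isInputPending?: () => boolean }
    }).scheduling
    return scheduling?.isInputPending?.() ?? false
}
```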
You can see the library in action in this CodeSandbox. Try removing the call to `yieldOrContinue()` and then type in the input to see the difference.
The complexity of the entire library is hidden behind this function. You can get great app performance by calling a single function.
```ts
async function findInFiles(query: string) {
    for (const file of files) {
        // Yield to the browser if it's time; otherwise continue immediately
        await yieldOrContinue('interactive')
        for (const line of file.lines) {
            fuzzySearchLine(line, query)
        }
    }
}
```
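For example (the `#search` input element and the wiring below are assumptions for this sketch, not part of the library), an input handler can simply call the now-asynchronous function and the page stays responsive while it runs:

```ts
// Assumed wiring for this sketch: an <input id="search"> element exists on the page.
const searchInput = document.querySelector('#search') as HTMLInputElement

searchInput.addEventListener('input', () => {
    // findInFiles() yields to the browser regularly, so typing stays responsive
    void findInFiles(searchInput.value)
})
```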
The library has two more functions available:
- `yieldControl(strategy: 'interactive' | 'smooth' | 'idle', signal?: AbortSignal)`
- `isTimeToYield(strategy: 'interactive' | 'smooth' | 'idle', signal?: AbortSignal)`
These two functions are used together to handle more advanced use cases.
A simple use case where you need both is when you want to render your view before yielding control back to the browser so it can continue its work:
```ts
async function doHeavyWork() {
    for (const value of values) {
        // Render the pending view changes before giving control back to the browser
        if (isTimeToYield('interactive')) {
            render()
            await yieldControl('interactive')
        }
        computeHeavyWorkOnValue(value)
    }
}
```
There are three scheduling strategies available (a short sketch of choosing between them follows the list). You can think about them more easily by completing the sentence with one of the three words: "Scheduling the task keeps the page `interactive`/`smooth`/`idle`."

- `interactive` – use this for things that need to be displayed to the user as fast as possible. Every `interactive` task runs for 83ms; this gives you a nice cycle of doing heavy work and letting the browser render pending changes.
- `smooth` – use this for things you want to display to the user quickly while still keeping animations running smoothly. `smooth` runs for 13ms and then gives around 3ms to render the frame.
- `idle` – use this for background tasks. Every `idle` task runs for 5ms.
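As a rough guide (the function names below are assumptions for this sketch, not library APIs), the same loop can serve all three cases just by changing the strategy argument:

```ts
import { yieldOrContinue } from 'main-thread-scheduling'

type Strategy = 'interactive' | 'smooth' | 'idle'

// Stand-in for any CPU-heavy, app-specific work.
function processItem(item: string): void {}

async function processItems(items: string[], strategy: Strategy): Promise<void> {
    for (const item of items) {
        // 'interactive' for results the user is waiting on, 'smooth' while an
        // animation is playing, 'idle' for work nobody is actively waiting for
        await yieldOrContinue(strategy)
        processItem(item)
    }
}
```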
Web Workers are a great fit if you have: 1) a heavy algorithm (e.g. image processing), or 2) a heavy process (one that runs for a long time and is a big part of the app's lifecycle). However, in reality, it's rare to see people using them. That's because they require a significant investment of time due to the complexity that can't be avoided when working with CPU threads, regardless of the programming language. This library can be used as a gateway before transitioning to Web Workers. In most cases, you will discover that doing the work on the main thread is good enough.
`scheduler.postTask()` is available in some browsers today. `postTask()` and `main-thread-scheduling` do similar things. You can think of `postTask()` as a lower-level API; it might be the right choice in specific scenarios. Library owners might be interested in exploring the nuanced differences between the two. For most cases, `main-thread-scheduling` provides a `scheduleTask()` method that mimics the API of `postTask()` while providing the extra benefits of the library.
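For reference, here is a sketch of what using the native API directly can look like. This is the browser's `scheduler.postTask()`, not part of `main-thread-scheduling`; `rebuildSearchIndex()` is an assumed placeholder, and the feature check is needed because the API isn't available in every browser:

```ts
// Native scheduler.postTask() sketch for comparison with the library.
async function runInBackground(rebuildSearchIndex: () => void): Promise<void> {
    const scheduler = (window as unknown as {
        scheduler?: {
            postTask: (
                callback: () => void,
                options?: { priority?: 'user-blocking' | 'user-visible' | 'background' },
            ) => Promise<void>
        }
    }).scheduler

    if (scheduler !== undefined) {
        // 'background' is roughly comparable to the library's 'idle' strategy
        await scheduler.postTask(rebuildSearchIndex, { priority: 'background' })
    } else {
        // Fallback when the native API isn't supported
        rebuildSearchIndex()
    }
}
```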