Web Worker Integration

This guide demonstrates how to integrate UpscalerJS into a Web Worker.

Another strategy for speeding up inference calls in the browser is to move the upscaling process to a Web Worker.

This guide won't cover how to build a web worker; MDN has a great overview of that. This guide will instead focus on what's necessary to incorporate UpscalerJS into a web worker.
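As a point of reference, a minimal two-file setup might look like the sketch below. The file names, the bundler-style worker constructor, and the default UpscalerJS import are assumptions for illustration; adapt them to your build setup.

```javascript
// main.js (UI thread) -- assumes a bundler that supports module workers
const worker = new Worker(new URL('./worker.js', import.meta.url), {
  type: 'module',
});

// worker.js (worker thread) -- the upscaler lives entirely in the worker
import Upscaler from 'upscaler';
const upscaler = new Upscaler();
```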


While incorporating a Web Worker will help improve UI performance, it's not a panacea. Larger images, slower models, or older hardware can still suffer from UI jank.


In a Web Worker, we don't have access to HTMLImageElement, which means that we need to rely on tensor inputs and outputs.

In the UI thread, we can load our image and get its data with:

// UI thread
const pixels = tf.browser.fromPixels(image)
const data = await pixels.data()

When passing messages between a UI thread and a web worker, the data is serialized and then deserialized. Therefore, we need to transform our tensor (which is not serializable) into a format that can be serialized.
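To make the constraint concrete, here is a standalone sketch (no UpscalerJS involved; the variable names are illustrative): a tensor object holds backend state and methods and cannot be structured-cloned, but a flat `TypedArray` plus a shape array can be, and together they fully describe the pixel data.

```javascript
const shape = [2, 2, 3]; // height x width x RGB channels
const data = new Float32Array(shape.reduce((a, b) => a * b, 1));
data.set([255, 0, 0], 0); // first pixel: red

// Round-trip through the same serialization postMessage uses:
const clone = structuredClone({ data, shape });
console.log(clone.shape);   // [2, 2, 3]
console.log(clone.data[0]); // 255
```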

We pass data along with the tensor's shape to our worker:

// UI thread
worker.postMessage([data, pixels.shape])

In our worker, we'll turn this into a tensor:

// Worker thread
const tensor = tf.tensor(data, shape)

Now we can pass this tensor to our upscaler. We'll also specify its output as a tensor; otherwise, we'll get an error (since HTMLImageElement is not available in a worker):

// Worker thread
const upscaledImg = await upscaler.upscale(tensor, {
  output: 'tensor',
})

To get our image back to the UI thread, we'll again need to transform it into a serializable format:

// Worker thread
const upscaledShape = upscaledImg.shape
const upscaledData = await upscaledImg.data()
postMessage([upscaledData, upscaledShape])

Finally, we receive it in our UI thread and can work with it as normal:

// UI thread
worker.onmessage = async (e) => {
  const [ data, shape ] = e.data
  const tensor = tf.tensor(data, shape)
}
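Once the tensor has been reconstructed on the UI thread, one common next step (an assumption, not part of this guide's code) is to draw it to a canvas with `tf.browser.toPixels`. The canvas element id here is hypothetical, and the division by 255 assumes the upscaled tensor holds 0–255 values, since `toPixels` expects float values in the [0, 1] range:

```javascript
// UI thread -- render the reconstructed tensor to a hypothetical canvas
const canvas = document.getElementById('target');
await tf.browser.toPixels(tensor.div(255), canvas);
tensor.dispose(); // free the tensor's memory once rendered
```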