Self Hosting Models
Demonstrates how to self-host a model with UpscalerJS.
Open example in Stackblitz. If you're looking for a guide on how to host your own models in a Node environment, check out the Node-specific guide.
Background
UpscalerJS provides support for loading models via the local filesystem. This might be useful when we want to host the models ourselves (perhaps we're running offline), or if we have a custom model we wish to integrate with UpscalerJS.
In this example, we'll be using the raw 2x model.json available via the esrgan-slim package to demonstrate.
Code
We first need to ensure that our model file is accessible locally via a URL, as TensorFlow.js requires an HTTP-compatible model file. In our example, we're using vite, which automatically serves the public folder statically. Therefore, we've placed our model files into the public folder, which makes them accessible at /model.json.
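If the model fails to load, a quick way to confirm the file is actually being served is to request it directly. The snippet below is a minimal sketch assuming the dev server is running; note that a TensorFlow.js model.json also references one or more .bin weight files, which need to live alongside it in the public folder.

// Illustrative sanity check: confirm the model file is reachable at the URL
// we plan to pass to UpscalerJS.
fetch('/model.json')
  .then(response => {
    if (!response.ok) {
      throw new Error(`Model file not served: ${response.status}`)
    }
    return response.json()
  })
  .then(modelJSON => {
    // A TensorFlow.js model.json contains the model topology and a manifest
    // pointing at the binary weight shards.
    console.log(Object.keys(modelJSON))
  })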
We can then specify the model with a custom model attribute:
import Upscaler from 'upscaler'

const upscaler = new Upscaler({
  model: {
    scale: 2,
    path: '/model.json',
  },
})
Both scale and path are required options. Models are tied to a specific scale, which must be specified per model.
Model options
We can further specify our model with additional configuration options:
import * as tf from '@tensorflow/tfjs'
import Upscaler from 'upscaler'

const upscaler = new Upscaler({
  model: {
    scale: 2,
    path: '/model.json',
    preprocess: input => tf.tidy(() => tf.mul(input, 1 / 255)),
    postprocess: output => tf.tidy(() => output.clipByValue(0, 255)),
  },
})
preprocess and postprocess are functions called on the input and output tensors, respectively.
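To make the intent concrete, here is an illustrative (not UpscalerJS-specific) sketch of what the two functions above do to pixel values: preprocess maps 8-bit pixel values into the [0, 1] range, and postprocess clips the model output back into a valid pixel range.

import * as tf from '@tensorflow/tfjs'

// preprocess: scale 8-bit pixel values into the [0, 1] range
const input = tf.tensor([0, 128, 255])
const preprocessed = tf.tidy(() => tf.mul(input, 1 / 255))
preprocessed.print() // ≈ [0, 0.502, 1]

// postprocess: clip model output back into the valid [0, 255] pixel range
const output = tf.tensor([-3.2, 120.7, 260.1])
const postprocessed = tf.tidy(() => output.clipByValue(0, 255))
postprocessed.print() // [0, 120.7, 255]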
The model can also be defined as a function that returns a ModelDefinition, which can be helpful for defining custom layers and ops:
import Upscaler from 'upscaler'

const getModelDefinition = (
  /**
   * tf refers to the currently active TensorFlow.js library, which may be
   * @tensorflow/tfjs, @tensorflow/tfjs-node, or @tensorflow/tfjs-node-gpu.
   **/
  tf,
) => {
  class CustomLayer extends tf.layers.Layer {
    call(inputs) {
      // ... some definition ...
    }

    static className = 'CustomLayer'
  }
  tf.serialization.registerClass(CustomLayer)

  return {
    scale: 2,
    path: '/model.json',
    preprocess: input => tf.tidy(() => tf.mul(input, 1 / 255)),
    postprocess: output => tf.tidy(() => output.clipByValue(0, 255)),
  }
}

const upscaler = new Upscaler({
  model: getModelDefinition,
})
We can see an example of two custom models defined in the esrgan-thick model package.