Self Hosting Models
Demonstrates how to self host a model with UpscalerJS.
If you're looking for a guide on how to host your own models in a Node environment, check out the Node-specific guide.
Background
UpscalerJS provides support for loading models via the local filesystem. This might be useful when hosting models locally (for instance, when running offline), or when integrating a custom model with UpscalerJS.
This example uses the raw 2x model.json available via the esrgan-slim package to demonstrate.
Code
We first need to ensure that our model file is accessible locally via a URL, as TensorFlow.js requires an HTTP-compatible model file. In our example, we're using vite, which automatically exposes the public folder statically. Therefore, we've placed our model files into the public folder, which makes them accessible at /model.json.
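As an optional sanity check, we can request the file from the browser to confirm it is actually being served. This is a minimal sketch, assuming the model file was placed at public/model.json as described above:

// Optional sanity check: confirm /model.json is reachable before
// handing the path to UpscalerJS. Assumes the file lives in public/model.json.
fetch('/model.json').then((response) => {
  if (!response.ok) {
    throw new Error(`model.json is not being served (status ${response.status})`)
  }
  console.log('model.json is accessible')
})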
We can then specify the model with a custom model attribute:
import Upscaler from 'upscaler'

const upscaler = new Upscaler({
  model: {
    scale: 2,
    path: '/model.json',
  },
})
path is a required option.
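Once the model is defined, usage is the same as with any other UpscalerJS model. The sketch below assumes two hypothetical image elements on the page, with ids source and target; by default, upscale resolves to an image src that can be assigned directly to an img element:

// A minimal usage sketch with the self-hosted model.
// The element ids 'source' and 'target' are assumptions for illustration.
import Upscaler from 'upscaler'

const upscaler = new Upscaler({
  model: {
    scale: 2,
    path: '/model.json',
  },
})

const image = document.getElementById('source')
upscaler.upscale(image).then((upscaledSrc) => {
  // upscale resolves to an image src that can be assigned to another <img>
  document.getElementById('target').src = upscaledSrc
})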
We can also further specify our model with additional configuration options. See the guide on custom model configurations for more information.
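As one example of what that guide covers, the sketch below adds preprocess and postprocess hooks to the model definition. The 0-255 scaling shown here is an assumption for illustration, not a requirement of the 2x model used in this example; adjust or omit the hooks to match your own model.

// A hedged sketch of additional model configuration, assuming the
// preprocess/postprocess hooks described in the custom model guide.
import * as tf from '@tensorflow/tfjs'
import Upscaler from 'upscaler'

const upscaler = new Upscaler({
  model: {
    scale: 2,
    path: '/model.json',
    // Illustrative hooks: normalize pixels before inference and scale back
    // afterwards. Adjust (or omit) to match your model's expectations.
    preprocess: (image) => tf.div(image, 255),
    postprocess: (output) => tf.mul(output, 255),
  },
})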