Exporting models for deployment to personal or edge devices
In Model Playground, you can export the best model from an experiment. To do so, navigate to the "deploy and export" page inside your split, as shown:
In the export section, you can choose which experiment you would like to export the model from, and in what format.
Currently, all model families support TorchScript export. Models exported in this format can run on CPU and GPU, and some (depending on the model you choose) can also run on Android or iOS devices. To read up on TorchScript, we recommend the official documentation provided by PyTorch:
Once you export the model and download the export file, you will need to unzip it. Unzipping it will leave you with a folder which contains the following files:
These four files provide all the information needed to correctly use and deploy the model on an edge device. We have provided some sample Python inference scripts to get you started; you can find these on the next pages.
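Ahead of the full sample scripts, the core of running a TorchScript export in Python is just `torch.jit.load` followed by a forward pass. The sketch below is illustrative, not one of our sample scripts: it scripts a toy module in place of a real export (so it runs standalone), and the filename `model.pt` and the input shape are placeholders — with a real export, skip the save step, load the `.pt` file from your unzipped folder, and use your model's actual input size.

```python
import torch

# Toy module standing in for a real exported model. With a real export,
# delete this block and point torch.jit.load at the .pt file from the
# unzipped export folder.
class Toy(torch.nn.Module):
    def forward(self, x):
        return x * 2

torch.jit.script(Toy()).save("model.pt")  # placeholder for the downloaded export

# Loading and running inference works the same way for a real export:
model = torch.jit.load("model.pt", map_location="cpu")
model.eval()

# Replace with a tensor matching your model's expected input shape,
# e.g. torch.rand(1, 3, 224, 224) for many image models.
example = torch.ones(1, 3)
with torch.no_grad():
    out = model(example)
```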