Using Your Model

This guide details how to use your newly trained model, which now appears in your model catalog. You can see how your model performs by installing the model and starting your application. In some cases, the model may need to be converted before it can be used in an application.

Add the Model to a Project

Every model that you train using the alwaysAI Model Training Web Application is automatically published to your private model catalog upon completion. You can then interact with that model the same way you interact with any of the models in our public model catalog.

Once your model is available in your model catalog, you can add it to a Project as usual to test it out. The model remains private to your Projects, and collaborators on those Projects can also use it.

Use the Model in the Project

As with any alwaysAI model, once you have added your model to a Project, you must install it; then you can start your application to see how the model performs.

For more detail on using models in Projects in general, see our Using Models documentation.

Model Conversion

Models can be converted to different formats for particular devices or to optimize their performance.

Available Target Formats

  • tensor-rt

Converting a Model Using the CLI

After training a model, you can convert it into different formats using the aai model convert command.

$ aai model convert --help
Usage: aai model convert <model ID e.g. alwaysai/mobilenet_ssd> <options>

   Convert an alwaysAI model to a different format


   --format <value>     : The output format of the model conversion
                          Allowed values {tensor-rt}
   [--output-id <str>]  : Model ID of the converted model
   [--version <num>]    : Version of the model to convert
   [--batch-size <num>] : Batch size if converting to tensor-rt

Note: Your username is automatically prepended to <new_modelname>; supplying the username yourself may result in errors!

TensorRT

To convert a model to TensorRT, supply tensor-rt as the --format parameter.

$ aai model convert <username/modelname> --format tensor-rt --output-id <new_modelname>

This outputs a model named <username>/<new_modelname> to a local directory named out.
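Putting the naming rules together: the CLI prepends your username to the --output-id value and writes the converted model under a local out directory. The sketch below illustrates that derivation; it is not the CLI's actual implementation, and the helper name and the example username "alice" are hypothetical.

```python
from pathlib import Path

# Illustrative sketch of how the converted model's ID and local output
# path are derived. converted_model_path is a hypothetical helper, not
# part of the aai CLI.
def converted_model_path(username: str, output_id: str) -> Path:
    if "/" in output_id:
        # The CLI prepends your username automatically, so passing
        # "<username>/<name>" yourself may result in errors.
        raise ValueError("supply only the model name, without a username")
    model_id = f"{username}/{output_id}"  # e.g. "alice/mobilenet_trt"
    return Path("out") / model_id        # e.g. out/alice/mobilenet_trt
```

With username alice and --output-id mobilenet_trt, this yields out/alice/mobilenet_trt, which is the directory you would change into before publishing.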

Next, publish this model to your private catalog to make it available to your projects:

$ cd out/<username>/<new_modelname>
$ aai model publish