Deploy your models to various providers.
Once you train SLMs on the Datawizz platform, you must deploy them to a provider before you can evaluate them and connect them to endpoints. You can always download the model weights and deploy them manually to any inference provider, but Datawizz makes it simple to deploy to various providers with automated “one-click” deployments.
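If you download the weights for manual deployment, you can also run them locally to sanity-check the model first. The sketch below assumes the export is in Hugging Face Transformers format and uses a placeholder directory path:

```python
# Minimal sketch: running downloaded model weights locally.
# Assumes the exported weights are in Hugging Face Transformers format;
# the directory path is a placeholder for wherever you saved the download.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_dir = "./my-datawizz-slm"  # placeholder path to the downloaded weights

tokenizer = AutoTokenizer.from_pretrained(model_dir)
model = AutoModelForCausalLM.from_pretrained(model_dir)

inputs = tokenizer("Summarize: Datawizz trains small language models.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```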
We are actively working on adding more providers. If you have a specific provider in mind, please reach out to us and we will prioritize it.
Before deploying to any third-party providers, please check their pricing, as deployments often incur charges. The models will be deployed to your account, and you will be responsible for any charges incurred.
Datawizz offers a serverless deployment option for your models. This is the simplest way to deploy a model and get started with inference. Serverless deployments include a free tier and pay-as-you-go pricing.
To deploy your model to Datawizz serverless, go to any trained model, click “Deploy Model” and select “Datawizz Serverless”. Your model will now be available for inference in the Datawizz serverless environment. You can use it in evaluations or connect it to endpoints.
Datawizz serverless deployments have a cold-start delay, so your first request to the model may take longer than usual. The delay should not recur as long as the model is used at least once every 24 hours.
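As a rough illustration of using a serverless deployment from your own code, here is a minimal sketch assuming an OpenAI-compatible chat completions endpoint; the base URL, API key, and model ID are placeholders, and the generous client timeout is there to absorb the cold start described above:

```python
# Minimal sketch: calling a serverless-deployed model through an
# OpenAI-compatible endpoint. The base URL, API key, and model ID are
# placeholders -- substitute the values for your own deployment.
from openai import OpenAI

client = OpenAI(
    base_url="https://YOUR_DATAWIZZ_ENDPOINT/v1",  # placeholder endpoint URL
    api_key="YOUR_API_KEY",                        # placeholder key
    timeout=120.0,  # generous timeout to absorb the first cold-start request
)

response = client.chat.completions.create(
    model="your-deployed-model-id",  # placeholder model ID
    messages=[{"role": "user", "content": "Hello from Datawizz serverless!"}],
)
print(response.choices[0].message.content)
```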
Datawizz supports one-click deployment to the Fireworks AI platform. Fireworks deployments provide dedicated inference instances for your models, with powerful GPU support and automated scaling. Fireworks also provides convenient scale-to-zero for less active models, so you only pay for what you use.
Learn more about Fireworks AI Inference here.
To deploy your models to Fireworks:
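Once the deployment completes, you can query the model through Fireworks' OpenAI-compatible API. The sketch below uses placeholders for the API key and model name; custom models on Fireworks are typically addressed as accounts/&lt;account&gt;/models/&lt;model&gt;:

```python
# Minimal sketch: querying a model deployed to Fireworks AI through its
# OpenAI-compatible API. The API key and model name are placeholders;
# check your Fireworks account for the exact deployed model identifier.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.fireworks.ai/inference/v1",
    api_key="YOUR_FIREWORKS_API_KEY",  # placeholder key from your Fireworks account
)

response = client.chat.completions.create(
    model="accounts/your-account/models/your-deployed-model",  # placeholder
    messages=[{"role": "user", "content": "Hello from Fireworks!"}],
)
print(response.choices[0].message.content)
```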
Datawizz supports one-click deployment to the Together AI platform. Together AI offers both serverless and dedicated inference instances for your models, with powerful GPU support and automated scaling. Datawizz currently supports serverless deployments to Together AI, with dedicated deployments planned soon.
Learn more about Together AI Inference here.
To deploy your models to Together AI:
Together AI serverless deployments don’t support all model architectures trainable on Datawizz. Please consult their documentation for a list of supported architectures.
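After the deployment completes, you can query the model through Together AI's OpenAI-compatible API. The sketch below uses placeholders for the API key and model identifier; use the identifier Together assigns to your deployed model:

```python
# Minimal sketch: querying a model deployed to Together AI through its
# OpenAI-compatible API. The API key and model name are placeholders --
# use the identifier Together assigns to your deployed model.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.together.xyz/v1",
    api_key="YOUR_TOGETHER_API_KEY",  # placeholder key from your Together account
)

response = client.chat.completions.create(
    model="your-account/your-deployed-model",  # placeholder model identifier
    messages=[{"role": "user", "content": "Hello from Together AI!"}],
)
print(response.choices[0].message.content)
```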