Introduction
bentoctl is a command line tool for deploying BentoML-packaged ML models as API endpoints on popular cloud platforms. It automates building the Bento docker image, generating the Terraform project, and managing deployments.
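For example, a typical deployment looks like the following. This is a minimal sketch of the flow covered in the Quickstart Guide, assuming the aws-lambda operator and a bento tagged iris_classifier:latest (both illustrative):

# install the operator for the target platform
bentoctl operator install aws-lambda

# initialize a deployment_config.yaml describing the deployment
bentoctl init

# build and push the deployment-ready docker image for the bento
bentoctl build -b iris_classifier:latest -f deployment_config.yaml

# create the endpoint with the generated Terraform project
terraform init
terraform apply -var-file=bentoctl.tfvars -auto-approve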
Why bentoctl?
- Supports major cloud providers: AWS, Azure, Google Cloud, and more.
- Easily deploy, update, and reproduce model deployments.
- First class integration with Terraform.
- Optimized for CI/CD workflows.
- Extensible with custom operators.
- High performance serving powered by BentoML.
Supported platforms:
- AWS SageMaker
- AWS EC2
- Google Cloud Run
- Azure Functions
- Looking for Kubernetes? Try out Yatai: Model deployment at scale on Kubernetes.
- Customize deployment targets by creating a bentoctl plugin from the deployment operator template (a sketch of installing one follows this list).
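As a sketch, a custom operator built from the template can be installed alongside the official operators, assuming operator install accepts a local path or git URL (the URL below is illustrative):

# install a custom operator from its repository
bentoctl operator install git@github.com:your-org/your-operator.git

# list the operators currently installed
bentoctl operator list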
Upcoming:
- Google Compute Engine (BentoML 1.0 migration in progress)
- Azure Container Instances (BentoML 1.0 migration in progress)
- Heroku (BentoML 1.0 migration in progress)
- Knative (WIP)
How to install
Install via pip:
pip install --pre bentoctl
bentoctl is in the pre-release stage; include the --pre flag to download the latest version.
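Once installed, a quick sanity check (assuming the standard --version flag is available):

bentoctl --version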
Next steps
- Quickstart Guide walks through a series of steps to deploy a bento to AWS Lambda as an API server.
- Core Concepts explains the core concepts in bentoctl (a sample deployment config is sketched after this list).
- Operator List lists official operators and their current status.
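As a taste of the core concepts, each deployment is driven by a deployment_config.yaml. A minimal sketch for the aws-lambda operator follows; the exact fields under spec depend on the operator, and all values here are illustrative:

api_version: v1
name: quickstart
operator: aws-lambda
template: terraform
spec:
  region: us-west-1
  timeout: 10
  memory_size: 512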
Community
- To report a bug or suggest a feature request, use GitHub Issues.
- To receive release announcements, please subscribe to our mailing list or join us on Slack.
Contributing
There are many ways to contribute to the project:
- If you have any feedback on the project, share it with the community in the #bentoctl channel on Slack.
- Report issues you're facing and give a "thumbs up" to issues and feature requests that are relevant to you.
- Investigate bugs and review other developers' pull requests.