intel x balena

Deploy high-performance deep learning inference with Intel OpenVINO and balena

Team balena is proud to support the Intel® Distribution of the OpenVINO™ Toolkit. Intel provides the AI power of OpenVINO and balena deploys that power to massive fleets of edge devices.

As Intel states, “The Intel® Distribution of OpenVINO™ toolkit enables you to optimize, tune, and run comprehensive AI inference using the included model optimizer and runtime and development tools.” Combine this with balena's cloud-based infrastructure to develop, deploy, manage, and orchestrate massive fleets of edge devices running OpenVINO.

Whether you’re working on an early prototype or want to migrate an existing deep learning project to OpenVINO and balena, get in touch and we can help.

Start using OpenVINO and balena

How to run edge AI with OpenVINO on balenaCloud

Learn how to set up edge AI inferencing with OpenVINO on an Intel NUC, and how to provision and manage those devices with balenaCloud.
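On balena, a project like this ships as a container. As a rough sketch (the base image tag, model path, and script name here are illustrative assumptions, not part of the guide), a Dockerfile for an OpenVINO inference container might look like:

```Dockerfile
# Illustrative sketch only: image tag, paths, and script name are assumptions.
# Start from one of Intel's published OpenVINO runtime images.
FROM openvino/ubuntu20_runtime:latest

WORKDIR /usr/src/app

# Copy the optimized model and the inference application into the container.
COPY model/ ./model/
COPY app.py ./

# Launch the inference application when the container starts.
CMD ["python3", "app.py"]
```

Pushing a project built around a Dockerfile like this to balenaCloud deploys it across every device in the fleet; the full walkthrough is in the guide linked below.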

Try the project

Supported devices

For OpenVINO projects, balena supports the Intel NUC. Check out OpenVINO's documentation for more details on hardware requirements. If you’re curious about our ability to support a particular type of hardware, please do get in touch!

Intel NUC

Learn more

Getting Started

Check out this Getting Started guide to familiarize yourself with Intel NUC and balena.

View Guide