Intel® Distribution of OpenVINO™ Toolkit

Intel

Your AI Inferencing Apps...Now Faster

The Intel® Distribution of OpenVINO™ toolkit enables rapid deployment of applications and solutions that emulate human vision. Based on convolutional neural networks (CNNs), the OpenVINO™ toolkit extends computer vision (CV) workloads across Intel® hardware, maximizing network performance.

Benefits:

  • Boost deep learning performance in computer vision, automatic speech recognition, natural language processing, and other common tasks.
  • Use models trained with popular frameworks, such as TensorFlow, PyTorch, and many more.
  • Reduce resource demands and efficiently deploy on a range of Intel® platforms from edge to cloud.
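To make the framework-interoperability point above concrete, here is a minimal, hedged inference sketch using the OpenVINO™ Runtime Python API. The model path and input shape are illustrative assumptions, not part of this listing:

```python
# Minimal sketch, assuming the `openvino` package is installed and an
# IR/ONNX model file exists at "model.xml" (path and shape are illustrative).
import numpy as np
import openvino as ov

core = ov.Core()
model = core.read_model("model.xml")           # e.g. a model exported from TensorFlow or PyTorch
compiled = core.compile_model(model, "CPU")    # target any supported Intel® device
dummy_input = np.zeros((1, 3, 224, 224), dtype=np.float32)
result = compiled([dummy_input])               # run a single inference request
```

The same compiled-model call works unchanged across supported Intel® devices; only the device string passed to `compile_model` changes.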

Resources:

You can use the Docker CI framework for the Intel® Distribution of OpenVINO™ toolkit to generate a Dockerfile, then build, test, and deploy an image with the toolkit. You can also reuse the available Dockerfiles, add your own layers, and customize the OpenVINO™ image for your needs.
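As a hedged example, pulling and starting one of the published runtime images might look like the following; the exact repository name and tag are assumptions, so check Docker Hub for the images matching your OpenVINO™ release:

```shell
# Pull a published runtime image (repository and tag are illustrative).
docker pull openvino/ubuntu20_runtime:latest

# Start an interactive container with the toolkit environment already set up.
docker run -it --rm openvino/ubuntu20_runtime:latest
```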

Find more information about how to use OpenVINO™ images on various hardware platforms in the Getting Started Guide for the corresponding OpenVINO release.

The source code of third-party LGPL/GPL packages is available on Docker Hub inside the OpenVINO™ images whose tags carry the `_src` suffix.