Intel® Distribution of OpenVINO™ Toolkit
Your AI Inferencing Apps...Now Faster
The Intel® Distribution of OpenVINO™ toolkit enables quick deployment of applications and solutions that emulate human vision. Based on Convolutional Neural Networks (CNNs), the OpenVINO™ toolkit extends computer vision (CV) workloads across Intel® hardware to maximize inference performance.
Benefits:
- Boost deep learning performance in computer vision, automatic speech recognition, natural language processing, and other common tasks.
- Use models trained with popular frameworks, such as TensorFlow, PyTorch, and many more.
- Reduce resource demands and efficiently deploy on a range of Intel® platforms from edge to cloud.
Resources:
- Website: https://software.intel.com/en-us/openvino-toolkit
- Release Notes: https://software.intel.com/en-us/articles/OpenVINO-RelNotes
- Documentation: https://docs.openvino.ai
- Support Forum: https://community.intel.com/t5/Intel-Distribution-of-OpenVINO/bd-p/distribution-openvino-toolkit
You can use the DockerHub CI framework for the Intel® Distribution of OpenVINO™ toolkit to generate a Dockerfile, then build, test, and deploy an image with the toolkit. You can also reuse the available Dockerfiles, add your own layers, and customize the OpenVINO™ image for your needs.
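For example, a typical workflow with a published image might look like the following sketch. The image name and tag here are illustrative; check Docker Hub for the repositories and tags that match your target OpenVINO™ release:

```shell
# Pull a published OpenVINO runtime image from Docker Hub
# (image name and tag are examples -- pick the one matching your release)
docker pull openvino/ubuntu20_runtime:latest

# Run the container interactively; --rm removes it on exit.
# Add --device /dev/dri to expose an Intel GPU for GPU inference.
docker run -it --rm openvino/ubuntu20_runtime:latest
```

From inside the container you can then run your inference application against the preinstalled OpenVINO™ runtime.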
Find more information about how to use OpenVINO™ images on various hardware platforms in the Getting Started Guide for the corresponding OpenVINO release.
The source code of third-party LGPL/GPL packages is available on Docker Hub inside the OpenVINO™ images tagged with the `_src` postfix.
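As a minimal sketch of retrieving those sources, assuming the `_src` tag naming described above (the exact repository and tag must be looked up on Docker Hub; the ones shown are placeholders):

```shell
# Pull an image variant that bundles the third-party LGPL/GPL sources
# (repository and "<version>_src" tag shown are illustrative placeholders)
docker pull openvino/ubuntu20_runtime:<version>_src

# Open a shell in the container to locate the bundled source archives
docker run -it --rm openvino/ubuntu20_runtime:<version>_src bash
```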