Intel Neural Compressor (INC) Container for Intel packaged by Bitnami
What is Intel Neural Compressor?
Intel® Neural Compressor (INC) is an open-source Python library designed to help optimize inference solutions on popular deep-learning frameworks. It applies quantization, pruning, and knowledge distillation methods to achieve optimal product objectives.
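The quantization step, for instance, maps float32 values onto low-precision integers. As a rough illustration of the affine-quantization idea that such libraries automate (a generic sketch of the concept, not INC's actual API):

```python
# Minimal illustration of affine int8 quantization -- the general idea behind
# what libraries like Intel Neural Compressor automate. Not INC's API.

def quantize(values, num_bits=8):
    """Map a list of floats to signed integers with a scale and zero point."""
    qmin, qmax = -(2 ** (num_bits - 1)), 2 ** (num_bits - 1) - 1
    lo, hi = min(values), max(values)
    scale = (hi - lo) / (qmax - qmin) or 1.0  # avoid zero scale for constant input
    zero_point = round(qmin - lo / scale)
    q = [max(qmin, min(qmax, round(v / scale) + zero_point)) for v in values]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover approximate float values from the quantized integers."""
    return [(qi - zero_point) * scale for qi in q]
```

Running weights through `quantize` and back through `dequantize` shows the small rounding error that real quantization tooling measures and keeps within an accuracy budget.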
Overview of Intel Neural Compressor
This software listing is packaged by Bitnami. The respective trademarks mentioned in the offering are owned by the respective companies, and use of them does not imply any affiliation or endorsement.
TL;DR
$ docker run -it --name inc bitnami/inc-intel:latest
Docker Compose
$ curl -sSL https://raw.githubusercontent.com/bitnami/bitnami-docker-inc-intel/master/docker-compose.yml > docker-compose.yml
$ docker-compose up -d
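For reference, a minimal docker-compose.yml for this image might look like the following sketch (the service name and options are assumptions; prefer the file downloaded above):

```yaml
services:
  inc:
    image: bitnami/inc-intel:latest
    tty: true          # keep the interactive Python REPL attached
    stdin_open: true
```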
Why use Intel optimized containers?
Optimized containers fully leverage the cores and architecture of 3rd gen Intel® Xeon® Scalable processors (Ice Lake). Intel® AVX-512 instructions have been further improved to accelerate performance for HPC/AI across a diverse set of workloads, including 3D modeling, scientific simulation, financial analytics, machine learning and AI, image processing, visualization, digital content creation, and data compression. This wider vectorization speeds computation per clock cycle over the prior generation. New instructions, coupled with algorithmic and software innovations, also deliver breakthrough performance for the industry's most widely deployed cryptographic ciphers. Security is becoming more pervasive, with most organizations increasingly adopting encryption for application execution, data in flight, and data storage.
Why use Bitnami Images?
- Bitnami closely tracks upstream source changes and promptly publishes new versions of this image using our automated systems.
- With Bitnami images the latest bug fixes and features are available as soon as possible.
- Bitnami containers, virtual machines and cloud images use the same components and configuration approach - making it easy to switch between formats based on your project needs.
- All our images are based on minideb, a minimalist Debian-based container image that gives you a small base image and the familiarity of a leading Linux distribution.
- All Bitnami images available in Docker Hub are signed with Docker Content Trust (DCT). You can use DOCKER_CONTENT_TRUST=1 to verify the integrity of the images.
- Bitnami container images are released daily with the latest distribution packages available.
Why use a non-root container?
Non-root container images add an extra layer of security and are generally recommended for production environments. However, because they run as a non-root user, privileged tasks are typically off-limits. Learn more about non-root containers in our docs.
Supported tags and respective Dockerfile links
Learn more about the Bitnami tagging policy and the difference between rolling tags and immutable tags in our documentation page.
Subscribe to project updates by watching the bitnami/inc-intel GitHub repo.
Get this image
The recommended way to get the Bitnami Intel Neural Compressor Docker Image is to pull the prebuilt image from the Docker Hub Registry.
$ docker pull bitnami/inc-intel:latest
To use a specific version, you can pull a versioned tag. You can view the list of available versions in the Docker Hub Registry.
$ docker pull bitnami/inc-intel:[TAG]
If you wish, you can also build the image yourself.
$ docker build -t bitnami/inc-intel 'https://github.com/bitnami/bitnami-docker-inc-intel.git#master:1/debian-10'
Entering the REPL
By default, running this image drops you into the Python REPL, where you can interactively experiment with Intel Neural Compressor.
$ docker run -it --name inc bitnami/inc-intel
Configuration
Running your Intel Neural Compressor app
The default work directory for the Intel Neural Compressor image is /app. You can mount a folder from your host here that includes your Intel Neural Compressor script, and run it normally using the python command.
$ docker run -it --name inc -v /path/to/app:/app bitnami/inc-intel \
python script.py
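Any script placed in the mounted folder runs this way. For example, a trivial placeholder script.py (purely illustrative; it does not use Intel Neural Compressor) just confirms the mount and the container's Python interpreter:

```python
# script.py -- a minimal placeholder to confirm the /app mount and the
# container's Python interpreter work as expected.

def greet(name: str) -> str:
    """Return a simple greeting string."""
    return f"Hello from {name}!"

if __name__ == "__main__":
    print(greet("the Intel Neural Compressor container"))
```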
Running an Intel Neural Compressor app with package dependencies
If your Intel Neural Compressor app has a requirements.txt defining your app's dependencies, you can install the dependencies before running your app.
$ docker run -it --name inc -v /path/to/app:/app bitnami/inc-intel \
sh -c "pip install -r requirements.txt && python script.py"
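A hypothetical requirements.txt for such an app might pin its extra dependencies; the package names and versions here are purely illustrative:

```text
numpy>=1.21
pyyaml>=5.4
```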
Further Reading:
Maintenance
Upgrade this image
Bitnami provides up-to-date versions of Intel Neural Compressor, including security patches, soon after they are made available upstream. We recommend that you follow these steps to upgrade your container.
Step 1: Get the updated image
$ docker pull bitnami/inc-intel:latest
or if you're using Docker Compose, update the value of the image property to bitnami/inc-intel:latest.
Step 2: Remove the currently running container
$ docker rm -v inc
or using Docker Compose:
$ docker-compose rm -v inc-intel
Step 3: Run the new image
Re-create your container from the new image.
$ docker run --name inc bitnami/inc-intel:latest
or using Docker Compose:
$ docker-compose up inc-intel
Contributing
We'd love for you to contribute to this container. You can request new features by creating an issue, or submit a pull request with your contribution.
Issues
If you encountered a problem running this container, you can file an issue. For us to provide better support, be sure to include the following information in your issue:
- Host OS and version
- Docker version ($ docker version)
- Output of $ docker info
- Version of this container ($ echo $BITNAMI_IMAGE_VERSION inside the container)
- The command you used to run the container, and any relevant output you saw (masking any sensitive information)
License
Copyright © 2022 Bitnami
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.