1.12.0-debian-11-r1 release
commit d3d32f378d
parent 0524dd77f0
```diff
@@ -9,7 +9,7 @@ ENV HOME="/" \
 COPY prebuildfs /
 # Install required system packages and dependencies
 RUN install_packages acl ca-certificates curl gzip libbz2-1.0 libc6 libcom-err2 libcrypt1 libffi7 libgcc-s1 libgl1 libglib2.0-0 libgomp1 libgssapi-krb5-2 libk5crypto3 libkeyutils1 libkrb5-3 libkrb5support0 liblzma5 libncursesw6 libnsl2 libpcre3 libreadline8 libsqlite3-0 libssl1.1 libstdc++6 libtinfo6 libtirpc3 procps tar zlib1g
-RUN . /opt/bitnami/scripts/libcomponent.sh && component_unpack "python" "3.8.13-1" --checksum 79af9dcbaa89c4047d2d24b4a4c2ae17b771fe94972734379b9e50ef3dec3442
+RUN . /opt/bitnami/scripts/libcomponent.sh && component_unpack "python" "3.8.13-2" --checksum 79af9dcbaa89c4047d2d24b4a4c2ae17b771fe94972734379b9e50ef3dec3442
 RUN . /opt/bitnami/scripts/libcomponent.sh && component_unpack "inc-intel" "1.12.0-0" --checksum ed8151f40f3f42f6579eecf97b07cae861931e8865d043e49eb52f050533e1a7
 RUN . /opt/bitnami/scripts/libcomponent.sh && component_unpack "gosu" "1.14.0-0" --checksum da4a2f759ccc57c100d795b71ab297f48b31c4dd7578d773d963bbd49c42bd7b
 RUN apt-get update && apt-get upgrade -y && \
```
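The `component_unpack` calls in the Dockerfile hunk above come from Bitnami's internal `/opt/bitnami/scripts/libcomponent.sh`, which is not part of this diff. As a rough illustration only, the pattern they follow is download, verify, extract; the sketch below is an assumption about that flow, and the tarball naming scheme and download URL are guesses rather than values taken from this commit.

```bash
#!/bin/bash
# Hypothetical sketch of a checksum-verified component unpack.
# The real logic lives in /opt/bitnami/scripts/libcomponent.sh; names and URLs here are assumptions.
set -euo pipefail

component="python"
version="3.8.13-2"
checksum="79af9dcbaa89c4047d2d24b4a4c2ae17b771fe94972734379b9e50ef3dec3442"

# Assumed tarball name and mirror; the actual scheme may differ.
tarball="${component}-${version}-linux-amd64-debian-11.tar.gz"
curl -fsSL -o "/tmp/${tarball}" "https://downloads.bitnami.com/files/stacksmith/${tarball}"

# Refuse to unpack anything whose SHA-256 digest does not match the pinned checksum.
echo "${checksum}  /tmp/${tarball}" | sha256sum -c -

# Extract into the Bitnami prefix and clean up.
mkdir -p /opt/bitnami
tar -xzf "/tmp/${tarball}" -C /opt/bitnami
rm "/tmp/${tarball}"
```

Pinning the digest in the Dockerfile is what makes the build fail fast if a mirror ever serves a different artifact.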
```diff
@@ -18,6 +18,6 @@
     "digest": "79af9dcbaa89c4047d2d24b4a4c2ae17b771fe94972734379b9e50ef3dec3442",
     "distro": "debian-11",
     "type": "NAMI",
-    "version": "3.8.13-1"
+    "version": "3.8.13-2"
   }
 }
```
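The JSON fragment above is the image's component metadata being bumped in lockstep with the Dockerfile. A quick way to check which Python build actually ships in a pulled image is to ask the container itself; this assumes `python` is on the image's default `PATH`, which is not confirmed by the diff.

```console
$ docker run --rm bitnami/inc-intel:1.12.0-debian-11-r1 python --version
```

The reported interpreter should belong to the 3.8.13 series packaged by this revision.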
```diff
@@ -1,12 +1,12 @@
-# Intel Neural Compressor (INC) Container for Intel packaged by Bitnami
+# Intel Neural Compressor packaged by Bitnami
 
 ## What is Intel Neural Compressor?
 
-> Intel® Neural Compressor (INC) is an open-source Python library designed to help optimize inference solutions on popular deep-learning frameworks. It applies quantization, pruning, and knowledge distillation methods to achieve optimal product objectives.
+> Intel® Neural Compressor (INC) is an open-source Python library designed to help quickly optimize inference solutions on popular deep-learning frameworks.
 
-[Overview of Intel Neural Compressor](https://github.com/intel/neural-compressor)
+[Overview of Intel Neural Compressor](https://intel.github.io/neural-compressor)
 
-This software listing is packaged by Bitnami. The respective trademarks mentioned in the offering are owned by the respective companies, and use of them does not imply any affiliation or endorsement
+Trademarks: This software listing is packaged by Bitnami. The respective trademarks mentioned in the offering are owned by the respective companies, and use of them does not imply any affiliation or endorsement.
 
 ## TL;DR
 
```
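The TL;DR section that follows the rewritten README intro is not shown in this hunk. As a minimal usage sketch (image name taken from this repository, tag taken from this release; the exact TL;DR wording in the README may differ), getting the container up typically reduces to:

```console
$ docker pull bitnami/inc-intel:1.12.0-debian-11-r1
$ docker run -it --name inc bitnami/inc-intel:1.12.0-debian-11-r1
```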
```diff
@@ -42,7 +42,7 @@ Non-root container images add an extra layer of security and are generally recom
 Learn more about the Bitnami tagging policy and the difference between rolling tags and immutable tags [in our documentation page](https://docs.bitnami.com/tutorials/understand-rolling-tags-containers/).
 
 
-* [`1`, `1-debian-11`, `1.12.0`, `1.12.0-debian-11-r0`, `latest` (1/debian-11/Dockerfile)](https://github.com/bitnami/bitnami-docker-inc-intel/blob/1.12.0-debian-11-r0/1/debian-11/Dockerfile)
+* [`1`, `1-debian-11`, `1.12.0`, `1.12.0-debian-11-r1`, `latest` (1/debian-11/Dockerfile)](https://github.com/bitnami/bitnami-docker-inc-intel/blob/1.12.0-debian-11-r1/1/debian-11/Dockerfile)
 
 Subscribe to project updates by watching the [bitnami/inc-intel GitHub repo](https://github.com/bitnami/bitnami-docker-inc-intel).
```
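To make the rolling-versus-immutable distinction above concrete: the rolling tags keep moving as new revisions are published, while the `r1` tag introduced by this release stays pinned to this exact build. For example:

```console
$ docker pull bitnami/inc-intel:1                     # rolling tag: tracks the latest 1.x revision
$ docker pull bitnami/inc-intel:1.12.0-debian-11-r1   # immutable tag: always resolves to this build
```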
```diff
@@ -96,7 +96,7 @@ $ docker run -it --name inc -v /path/to/app:/app bitnami/inc-intel \
 
 **Further Reading:**
 
-- [Intel Neural Compressor documentation](https://github.com/intel/neural-compressor/docs/stable/index.html)
+- [Intel Neural Compressor documentation](https://intel.github.io/neural-compressor/docs/stable/index.html)
 
 ## Maintenance
 
```
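The `docker run` command referenced in this hunk's header mounts a host directory into the container at `/app`. The header is truncated, so the trailing arguments below are an assumption and the script name is only a placeholder, but a plausible completion looks like:

```console
$ docker run -it --name inc -v /path/to/app:/app bitnami/inc-intel \
    python /app/your-inc-script.py
```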