[bitnami/inc-tensorflow-intel] Release 1.14.2-debian-11-r4 (#13563)

Signed-off-by: Bitnami Containers <bitnami-bot@vmware.com>

Bitnami Bot 2022-11-13 12:12:05 +01:00 committed by GitHub
parent fbe15ee503
commit b2597c9c72
3 changed files with 13 additions and 13 deletions

View File

@@ -4,7 +4,7 @@ ARG TARGETARCH
LABEL org.opencontainers.image.authors="https://bitnami.com/contact" \
org.opencontainers.image.description="Application packaged by Bitnami" \
-org.opencontainers.image.ref.name="1.14.2-debian-11-r3" \
+org.opencontainers.image.ref.name="1.14.2-debian-11-r4" \
org.opencontainers.image.source="https://github.com/bitnami/containers/tree/main/bitnami/inc-tensorflow-intel" \
org.opencontainers.image.title="inc-tensorflow-intel" \
org.opencontainers.image.vendor="VMware, Inc." \
@@ -21,7 +21,7 @@ SHELL ["/bin/bash", "-o", "pipefail", "-c"]
RUN install_packages ca-certificates curl libbz2-1.0 libcom-err2 libcrypt1 libffi7 libgcc-s1 libgl1 libglib2.0-0 libgssapi-krb5-2 libk5crypto3 libkeyutils1 libkrb5-3 libkrb5support0 liblzma5 libncursesw6 libnsl2 libpcre3 libreadline8 libsqlite3-0 libssl1.1 libstdc++6 libtinfo6 libtirpc3 procps zlib1g
RUN mkdir -p /tmp/bitnami/pkg/cache/ && cd /tmp/bitnami/pkg/cache/ && \
COMPONENTS=( \
-"python-3.9.15-1-linux-${OS_ARCH}-debian-11" \
+"python-3.9.15-2-linux-${OS_ARCH}-debian-11" \
"inc-tensorflow-intel-1.14.2-0-linux-${OS_ARCH}-debian-11" \
"gosu-1.14.0-155-linux-${OS_ARCH}-debian-11" \
) && \

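This release boils down to the two bumps above: the image `ref.name` label moves from `1.14.2-debian-11-r3` to `1.14.2-debian-11-r4`, and the bundled python component from `3.9.15-1` to `3.9.15-2`. As a minimal sketch, assuming the published tag matches the new `ref.name` label (not stated in this diff), the rebuilt image could be pulled with:

```console
# tag assumed to match the org.opencontainers.image.ref.name label above
$ docker pull bitnami/inc-tensorflow-intel:1.14.2-debian-11-r4
```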
View File

@@ -15,9 +15,9 @@
},
"python": {
"arch": "amd64",
-"digest": "ca91943ef1c9cc9637845cbb697dad40b995b7033dc03cf76c1ca03998c81c0b",
+"digest": "31c466238eba9f390783ffb2f6c4bafd2d3780c32e38061b0599bf9df2219404",
"distro": "debian-11",
"type": "NAMI",
-"version": "3.9.15-1"
+"version": "3.9.15-2"
}
}

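The updated `digest` looks like a SHA-256 checksum for the rebuilt python component. As a hedged sketch, assuming the digest covers the component tarball and that the tarball name follows the `COMPONENTS` pattern from the Dockerfile (both assumptions, not confirmed by this diff), a downloaded archive could be checked with:

```console
# filename and digest-to-tarball mapping are assumptions, not taken from this commit
$ echo "31c466238eba9f390783ffb2f6c4bafd2d3780c32e38061b0599bf9df2219404  python-3.9.15-2-linux-amd64-debian-11.tar.gz" | sha256sum -c -
```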
View File

@@ -1,10 +1,10 @@
-# Intel Neural Compressor for TensorFlow packaged by Bitnami
+# Intel Neural Compressor for TF packaged by Bitnami
-## What is Intel Neural Compressor for TensorFlow?
+## What is Intel Neural Compressor for TF?
> TensorFlow is an open-source high-performance machine learning framework. This image is equipped with Intel&reg; Neural Compressor (INC) to improve the performance of inference with TensorFlow.
-[Overview of Intel Neural Compressor for TensorFlow](https://github.com/intel/neural-compressor/)
+[Overview of Intel Neural Compressor for TF](https://github.com/intel/neural-compressor/)
Trademarks: This software listing is packaged by Bitnami. The respective trademarks mentioned in the offering are owned by the respective companies, and use of them does not imply any affiliation or endorsement.
@@ -69,7 +69,7 @@ $ docker build -t bitnami/APP:latest .
## Entering the REPL
-By default, running this image will drop you into the Python REPL, where you can interactively test and try things out with Intel Neural Compressor for TensorFlow in Python.
+By default, running this image will drop you into the Python REPL, where you can interactively test and try things out with Intel Neural Compressor for TF in Python.
```console
$ docker run -it --name inc bitnami/inc-tensorflow-intel
@@ -77,18 +77,18 @@ $ docker run -it --name inc bitnami/inc-tensorflow-intel
## Configuration
-### Running your Intel Neural Compressor for TensorFlow app
+### Running your Intel Neural Compressor for TF app
-The default work directory for the Intel Neural Compressor for TensorFlow image is `/app`. You can mount a folder from your host here that includes your Intel Neural Compressor for TensorFlow script, and run it normally using the `python` command.
+The default work directory for the Intel Neural Compressor for TF image is `/app`. You can mount a folder from your host here that includes your Intel Neural Compressor for TF script, and run it normally using the `python` command.
```console
$ docker run -it --name inc -v /path/to/app:/app bitnami/inc-tensorflow-intel \
python script.py
```
-### Running a Intel Neural Compressor for TensorFlow app with package dependencies
+### Running a Intel Neural Compressor for TF app with package dependencies
-If your Intel Neural Compressor for TensorFlow app has a `requirements.txt` defining your app's dependencies, you can install the dependencies before running your app.
+If your Intel Neural Compressor for TF app has a `requirements.txt` defining your app's dependencies, you can install the dependencies before running your app.
```console
$ docker run -it --name inc -v /path/to/app:/app bitnami/inc-tensorflow-intel \
@@ -103,7 +103,7 @@ $ docker run -it --name inc -v /path/to/app:/app bitnami/inc-tensorflow-intel \
### Upgrade this image
-Bitnami provides up-to-date versions of Intel Neural Compressor for TensorFlow, including security patches, soon after they are made upstream. We recommend that you follow these steps to upgrade your container.
+Bitnami provides up-to-date versions of Intel Neural Compressor for TF, including security patches, soon after they are made upstream. We recommend that you follow these steps to upgrade your container.
#### Step 1: Get the updated image
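The body of this step falls outside the hunk; in the usual Bitnami README flow it amounts to pulling the latest published tag, roughly (command assumed, not shown in this diff):

```console
# assumed step, not part of this commit's diff
$ docker pull bitnami/inc-tensorflow-intel:latest
```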