[bitnami/tensorflow-serving] Release 2.14.1-debian-11-r4 (#55259)

Signed-off-by: Bitnami Containers <bitnami-bot@vmware.com>
Bitnami Bot 2024-01-22 13:28:21 +01:00 committed by GitHub
parent a67e4df1a4
commit ad7376c69d
No known key found for this signature in database
GPG Key ID: B5690EEEBB952194
4 changed files with 41 additions and 18 deletions

View File

@@ -8,10 +8,10 @@ ARG TARGETARCH
LABEL com.vmware.cp.artifact.flavor="sha256:1e1b4657a77f0d47e9220f0c37b9bf7802581b93214fff7d1bd2364c8bf22e8e" \
org.opencontainers.image.base.name="docker.io/bitnami/minideb:bullseye" \
org.opencontainers.image.created="2023-12-22T14:55:28Z" \
org.opencontainers.image.created="2024-01-22T10:35:02Z" \
org.opencontainers.image.description="Application packaged by VMware, Inc" \
org.opencontainers.image.licenses="Apache-2.0" \
org.opencontainers.image.ref.name="2.14.1-debian-11-r3" \
org.opencontainers.image.ref.name="2.14.1-debian-11-r4" \
org.opencontainers.image.title="tensorflow-serving" \
org.opencontainers.image.vendor="VMware, Inc." \
org.opencontainers.image.version="2.14.1"
@@ -20,7 +20,7 @@ ENV HOME="/" \
OS_ARCH="${TARGETARCH:-amd64}" \
OS_FLAVOUR="debian-11" \
OS_NAME="linux" \
PATH="/opt/bitnami/tensorflow-serving/bin:/opt/bitnami/tensorflow-serving/serving/bazel-bin/tensorflow_serving/model_servers:/opt/bitnami/common/bin:$PATH"
PATH="/opt/bitnami/common/bin:/opt/bitnami/tensorflow-serving/bin:/opt/bitnami/tensorflow-serving/serving/bazel-bin/tensorflow_serving/model_servers:$PATH"
COPY prebuildfs /
SHELL ["/bin/bash", "-o", "errexit", "-o", "nounset", "-o", "pipefail", "-c"]
@@ -28,8 +28,8 @@ SHELL ["/bin/bash", "-o", "errexit", "-o", "nounset", "-o", "pipefail", "-c"]
RUN install_packages ca-certificates curl gcc-10 libgcc-s1 libstdc++6 procps
RUN mkdir -p /tmp/bitnami/pkg/cache/ ; cd /tmp/bitnami/pkg/cache/ ; \
COMPONENTS=( \
"render-template-1.0.6-5-linux-${OS_ARCH}-debian-11" \
"tensorflow-serving-2.14.1-2-linux-${OS_ARCH}-debian-11" \
"render-template-1.0.6-4-linux-${OS_ARCH}-debian-11" \
) ; \
for COMPONENT in "${COMPONENTS[@]}"; do \
if [ ! -f "${COMPONENT}.tar.gz" ]; then \
@@ -44,6 +44,7 @@ RUN apt-get autoremove --purge -y curl && \
apt-get update && apt-get upgrade -y && \
apt-get clean && rm -rf /var/lib/apt/lists /var/cache/apt/archives
RUN chmod g+rwX /opt/bitnami
+RUN find / -perm /6000 -type f -exec chmod a-s {} \; || true
RUN mkdir /.local && chmod g+rwX /.local
COPY rootfs /

View File

@@ -3,7 +3,7 @@
"arch": "amd64",
"distro": "debian-11",
"type": "NAMI",
"version": "1.0.6-4"
"version": "1.0.6-5"
},
"tensorflow-serving": {
"arch": "amd64",

View File

@@ -10,7 +10,7 @@ fi
script=$1
exit_code="${2:-96}"
fail_if_not_present="${3:-y}"
fail_if_not_present="${3:-n}"
if test -f "$script"; then
sh $script

View File

@@ -13,13 +13,6 @@ Trademarks: This software listing is packaged by Bitnami. The respective tradema
docker run --name tensorflow-serving bitnami/tensorflow-serving:latest
```
-### Docker Compose
-```console
-curl -sSL https://raw.githubusercontent.com/bitnami/containers/main/bitnami/tensorflow-serving/docker-compose.yml > docker-compose.yml
-docker-compose up -d
-```
You can find the available configuration options in the [Environment Variables](#environment-variables) section.
## Why use Bitnami Images?
@@ -193,11 +186,34 @@ docker-compose up -d
TensorFlow Serving can be customized by specifying environment variables on the first run. The following environment variables are available to customize TensorFlow Serving:
-* `TENSORFLOW_SERVING_PORT_NUMBER`: TensorFlow Serving Port. Default: **8500**
-* `TENSORFLOW_SERVING_REST_API_PORT_NUMBER`: TensorFlow Serving Rest API Port. Default: **8501**
-* `TENSORFLOW_SERVING_MODEL_NAME`: TensorFlow Model to serve. Default: **resnet**
-* `TENSORFLOW_SERVING_ENABLE_MONITORING`: Expose Prometheus metrics. Default: **no**
-* `TENSORFLOW_SERVING_MONITORING_PATH`: The API path where the metrics can be scraped. Default: **/monitoring/prometheus/metrics**
+#### Customizable environment variables
+| Name                                      | Description                  | Default Value                    |
+|-------------------------------------------|------------------------------|----------------------------------|
+| `TENSORFLOW_SERVING_ENABLE_MONITORING`    | Enable tensorflow monitoring | `no`                             |
+| `TENSORFLOW_SERVING_MODEL_NAME`           | Tensorflow model name        | `resnet`                         |
+| `TENSORFLOW_SERVING_MONITORING_PATH`      | Tensorflow monitoring path   | `/monitoring/prometheus/metrics` |
+| `TENSORFLOW_SERVING_PORT_NUMBER`          | Tensorflow port number       | `8500`                           |
+| `TENSORFLOW_SERVING_REST_API_PORT_NUMBER` | Tensorflow API port number   | `8501`                           |
+#### Read-only environment variables
+| Name                                      | Description                                   | Value                                                     |
+|-------------------------------------------|-----------------------------------------------|----------------------------------------------------------|
+| `BITNAMI_VOLUME_DIR`                      | Directory where to mount volumes.             | `/bitnami`                                                |
+| `TENSORFLOW_SERVING_BASE_DIR`             | Tensorflow installation directory.            | `${BITNAMI_ROOT_DIR}/tensorflow-serving`                 |
+| `TENSORFLOW_SERVING_BIN_DIR`              | Tensorflow directory for binary executables.  | `${TENSORFLOW_SERVING_BASE_DIR}/bin`                     |
+| `TENSORFLOW_SERVING_TMP_DIR`              | Tensorflow directory for temp files.          | `${TENSORFLOW_SERVING_BASE_DIR}/tmp`                     |
+| `TENSORFLOW_SERVING_PID_FILE`             | Tensorflow PID file.                          | `${TENSORFLOW_SERVING_TMP_DIR}/tensorflow-serving.pid`   |
+| `TENSORFLOW_SERVING_CONF_DIR`             | Tensorflow directory for configuration files. | `${TENSORFLOW_SERVING_BASE_DIR}/conf`                    |
+| `TENSORFLOW_SERVING_CONF_FILE`            | Tensorflow configuration file.                | `${TENSORFLOW_SERVING_CONF_DIR}/tensorflow-serving.conf` |
+| `TENSORFLOW_SERVING_MONITORING_CONF_FILE` | Tensorflow monitoring configuration file.     | `${TENSORFLOW_SERVING_CONF_DIR}/monitoring.conf`         |
+| `TENSORFLOW_SERVING_LOGS_DIR`             | Tensorflow directory for log files.           | `${TENSORFLOW_SERVING_BASE_DIR}/logs`                    |
+| `TENSORFLOW_SERVING_LOGS_FILE`            | Tensorflow log file.                          | `${TENSORFLOW_SERVING_LOGS_DIR}/tensorflow-serving.log`  |
+| `TENSORFLOW_SERVING_VOLUME_DIR`           | Tensorflow persistence directory.             | `${BITNAMI_VOLUME_DIR}/tensorflow-serving`               |
+| `TENSORFLOW_SERVING_MODEL_DATA`           | Tensorflow data to persist.                   | `${BITNAMI_VOLUME_DIR}/model-data`                       |
+| `TENSORFLOW_SERVING_DAEMON_USER`          | Tensorflow system user.                       | `tensorflow`                                             |
+| `TENSORFLOW_SERVING_DAEMON_GROUP`         | Tensorflow system group.                      | `tensorflow`                                             |
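For example, the customizable variables above can be combined in a single `docker run` invocation. The following is a minimal sketch: the host ports, the monitoring flag and the `/path/to/model-data` directory are illustrative values, not defaults shipped with the image.
```console
docker run -d --name tensorflow-serving \
  -p 8500:8500 -p 8501:8501 \
  -e TENSORFLOW_SERVING_MODEL_NAME=resnet \
  -e TENSORFLOW_SERVING_ENABLE_MONITORING=yes \
  -v /path/to/model-data:/bitnami/model-data \
  bitnami/tensorflow-serving:latest
```
With monitoring enabled, Prometheus metrics should be exposed on the REST API port under `TENSORFLOW_SERVING_MONITORING_PATH`, e.g. `http://localhost:8501/monitoring/prometheus/metrics`.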
### Configuration file
@@ -340,6 +356,12 @@ docker-compose start tensorflow-serving
* The default serving port has changed from 9000 to 8500.
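If you are upgrading and want to confirm which ports the container now serves on, querying the model status endpoint of the REST API is a quick check (this assumes the default `resnet` model and default port mappings; adjust the model name if you changed `TENSORFLOW_SERVING_MODEL_NAME`):
```console
curl http://localhost:8501/v1/models/resnet
```
gRPC clients should point at port 8500 instead of the former 9000.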
+## Using `docker-compose.yaml`
+Please be aware this file has not undergone internal testing. Consequently, we advise its use exclusively for development or testing purposes. For production-ready deployments, we highly recommend utilizing its associated [Bitnami Helm chart](https://github.com/bitnami/charts/tree/main/bitnami/tensorflow-resnet).
+If you detect any issue in the `docker-compose.yaml` file, feel free to report it or contribute with a fix by following our [Contributing Guidelines](https://github.com/bitnami/containers/blob/main/CONTRIBUTING.md).
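A typical way to try it out, reusing the commands that previously lived in the quickstart section above, is to download the file and start the service in the background:
```console
curl -sSL https://raw.githubusercontent.com/bitnami/containers/main/bitnami/tensorflow-serving/docker-compose.yml > docker-compose.yml
docker-compose up -d
```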
## Contributing
We'd love for you to contribute to this container. You can request new features by creating an [issue](https://github.com/bitnami/containers/issues) or submitting a [pull request](https://github.com/bitnami/containers/pulls) with your contribution.