mirror of https://github.com/NVIDIA/TensorRT-LLM.git
synced 2026-01-14 06:27:45 +08:00

Fix permission for local user issues in NGC docker container. (#5373)

Signed-off-by: Martin Marciniszyn Mehringer <11665257+MartinMarciniszyn@users.noreply.github.com>

parent: b6d23d58c4
commit: fc64f139e4
@@ -9,9 +9,9 @@ ARG GROUP_ID=0
 ARG GROUP_NAME=root
 
 RUN (getent group ${GROUP_ID} || groupadd --gid ${GROUP_ID} ${GROUP_NAME}) && \
-    (getent passwd ${USER_ID} || useradd --gid ${GROUP_ID} --uid ${USER_ID} --create-home --no-log-init --shell /bin/bash ${USER_NAME})
-
-RUN apt-get update && \
+    (getent passwd ${USER_ID} || useradd --gid ${GROUP_ID} --uid ${USER_ID} --create-home --no-log-init --shell /bin/bash ${USER_NAME}) && \
+    chown ${USER_NAME}:${GROUP_NAME} /app/tensorrt_llm && \
+    apt-get update && \
     apt-get install -y sudo && \
     adduser ${USER_NAME} sudo && \
     echo '%sudo ALL=(ALL) NOPASSWD:ALL' >> /etc/sudoers && \
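The `getent … || groupadd` / `getent … || useradd` pattern above makes the user setup idempotent: the lookup succeeds (and short-circuits the `||`) when the group or user already exists, so the layer builds cleanly whatever IDs the host passes in. A minimal dry-run sketch of that idiom, with hypothetical example values and `echo` in place of the privileged commands:

```shell
#!/bin/sh
# Dry-run sketch of the Dockerfile's idempotent user setup. Example values
# only; the real build passes GROUP_ID/USER_ID/USER_NAME as build args.
GROUP_ID=0                  # group 0 (root) exists on any Linux system
USER_NAME=no_such_user_xyz  # deliberately nonexistent account

# getent exits 0 if the entry exists; only on failure does the right-hand
# side of || run. Here we echo the command instead of executing it.
getent group "${GROUP_ID}" >/dev/null || echo "groupadd --gid ${GROUP_ID}"
getent passwd "${USER_NAME}" >/dev/null || echo "useradd ${USER_NAME}"
```

On a typical Linux system only the `useradd` line is printed, since group 0 already exists while the made-up user does not.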
@@ -138,6 +138,7 @@ endif
 	docker run $(DOCKER_RUN_OPTS) $(DOCKER_RUN_ARGS) \
 		$(GPU_OPTS) \
 		--volume $(SOURCE_DIR):$(CODE_DIR) \
+		$(if $(filter 1,$(LOCAL_USER)),--volume ${HOME}/.cache:/home/${USER_NAME}/.cache:rw) \
 		--env "CCACHE_DIR=${CCACHE_DIR}" \
 		--env "CCACHE_BASEDIR=${CODE_DIR}" \
 		--env "CONAN_HOME=${CONAN_DIR}" \
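The added Makefile line uses GNU make's `$(if $(filter 1,$(LOCAL_USER)),…)` so the host `~/.cache` is mounted into the container only when `LOCAL_USER=1`. A rough shell equivalent of that conditional (variable names mirror the Makefile; the `USER_NAME` default is made up, and the `docker run` command is only printed, not executed):

```shell
#!/bin/sh
# Shell sketch of the Makefile conditional: emit the --volume flag for the
# host cache only when LOCAL_USER=1.
LOCAL_USER="${LOCAL_USER:-1}"
USER_NAME="${USER_NAME:-devuser}"   # hypothetical default for illustration
CACHE_MOUNT=""
if [ "${LOCAL_USER}" = "1" ]; then
    CACHE_MOUNT="--volume ${HOME}/.cache:/home/${USER_NAME}/.cache:rw"
fi
# Print rather than run docker, so the sketch works without a daemon.
echo "docker run ${CACHE_MOUNT} <image>"
```

With `LOCAL_USER=1` the mount flag appears in the printed command; with any other value it is simply omitted, matching the empty expansion of `$(if …)` in make.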
@@ -4,7 +4,7 @@
 
 This is the starting point to try out TensorRT-LLM. Specifically, this Quick Start Guide enables you to quickly get setup and send HTTP requests using TensorRT-LLM.
 
-The following examples can most easily be executed using the prebuilt [Docker release container available on NGC](https://registry.ngc.nvidia.com/orgs/nvstaging/teams/tensorrt-llm/containers/release) (see also [release.md](https://github.com/NVIDIA/TensorRT-LLM/blob/main/docker/release.md) on GitHub). Ensure to run these commands as a user with appropriate permissions, preferably `root`, to streamline the setup process.
+The following examples can most easily be executed using the prebuilt [Docker release container available on NGC](https://registry.ngc.nvidia.com/orgs/nvstaging/teams/tensorrt-llm/containers/release) (see also [release.md](https://github.com/NVIDIA/TensorRT-LLM/blob/main/docker/release.md) on GitHub).
 
 
 ## LLM API