MoveIt Pro ML on Jetson Devices
The following Dockerfile changes will not work unless you have set up GPU Acceleration and GPU Inference.
The moveit_pro_ml package enables the use of machine learning models in MoveIt Pro behaviors. These ML behaviors can run on higher-end CPUs in a reasonable amount of time, but they will not be performant on Jetson devices unless the GPU is used.
Enabling a Jetson GPU with MoveIt Pro
For NVIDIA Tegra release 36 (JetPack 6.0; run cat /etc/nv_tegra_release to see your release), the following dependencies are tested to work. For other Tegra releases, you may have to find which versions of cudnn, cudnn-dev, onnxruntime, and l4t you need and modify the URLs below accordingly. See here for information on the different versions of JetPack.
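For example, on a release 36 device the output of that command starts roughly like the line below (the revision and the rest of the line will differ on your device; shown here only as an illustration):

cat /etc/nv_tegra_release
# R36 (release), REVISION: 3.0, ...

The leading "R36" is the Tegra release number to match against the package URLs below.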
Add the following lines to your user_ws Dockerfile in the user-overlay stage:
RUN wget https://repo.download.nvidia.com/jetson/common/pool/main/c/cudnn/libcudnn8_8.9.4.25-1+cuda12.2_arm64.deb -q --show-progress --progress=dot:giga && \
sudo dpkg -i libcudnn8_8.9.4.25-1+cuda12.2_arm64.deb && \
rm libcudnn8_8.9.4.25-1+cuda12.2_arm64.deb
RUN wget https://repo.download.nvidia.com/jetson/common/pool/main/c/cudnn/libcudnn8-dev_8.9.4.25-1+cuda12.2_arm64.deb -q --show-progress --progress=dot:giga && \
sudo dpkg -i libcudnn8-dev_8.9.4.25-1+cuda12.2_arm64.deb && \
rm libcudnn8-dev_8.9.4.25-1+cuda12.2_arm64.deb
RUN wget -O onnxruntime_gpu-1.19.0-cp310-cp310-linux_aarch64.whl https://nvidia.box.com/shared/static/6l0u97rj80ifwkk8rqbzj1try89fk26z.whl -q --show-progress --progress=dot:giga && \
pip install onnxruntime_gpu-1.19.0-cp310-cp310-linux_aarch64.whl && \
rm onnxruntime_gpu-1.19.0-cp310-cp310-linux_aarch64.whl
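If you want to sanity-check these installs while iterating on the image, one quick option is to open a shell in the built container and confirm that the cuDNN packages and the onnxruntime-gpu wheel are present (the package names below are inferred from the downloads above):

# Run inside the built user workspace container
dpkg -l | grep libcudnn8       # should list libcudnn8 and libcudnn8-dev at 8.9.4.25
pip show onnxruntime-gpu       # should report Version: 1.19.0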
Also ensure that the GPU_INFERENCE section from the previous guide is enabled for the arm64 repo:
########################
# ENABLE GPU INFERENCE #
########################
RUN --mount=type=cache,target=/var/cache/apt,sharing=locked \
--mount=type=cache,target=/var/lib/apt,sharing=locked \
apt-get update && apt-get install wget -y -q --no-install-recommends && \
wget https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2204/arm/cuda-keyring_1.1-1_all.deb && \
...
Then rebuild MoveIt Pro with moveit_pro build, and your Jetson GPU should now be fully enabled for GPU Inference.
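As a rough check that ONNX Runtime can actually reach the GPU, you can list its available execution providers inside the rebuilt container; the exact list depends on how the wheel was built, but CUDAExecutionProvider should appear:

python3 -c "import onnxruntime as ort; print(ort.get_available_providers())"
# Expected to include 'CUDAExecutionProvider' (and possibly 'TensorrtExecutionProvider')
# alongside 'CPUExecutionProvider'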