
Set Up Your Device

You will need to install packages on your edge device so that it can make inferences. You will also need to install either AWS IoT Greengrass core or the Deep Learning Runtime (DLR). In this example, you install the packages required to run inference with the coco_ssd_mobilenet object detection algorithm, and you use DLR.

  1. Install additional packages

    In addition to Boto3, you must install certain libraries on your edge device. Which libraries you install depends on your use case.

    For example, for the coco_ssd_mobilenet object detection algorithm you downloaded earlier, you need to install NumPy for data manipulation and statistics, Pillow (PIL) to load images, and Matplotlib to generate plots. You also need a copy of TensorFlow if you want to compare the performance of the Neo-compiled model against an uncompiled baseline. A short preprocessing sketch that uses these libraries follows this procedure.

    !pip3 install numpy pillow tensorflow matplotlib
  2. Install the inference engine on your device

    To run your Neo-compiled model, install the Deep Learning Runtime (DLR) on your device. DLR is a compact, common runtime for deep learning models and decision tree models. A minimal load-and-run sketch that uses the DLR Python API follows this procedure. On x86_64 CPU targets running Linux, you can install the latest release of the DLR package with the following pip command:

    !pip install dlr

    To install DLR on GPU targets or non-x86 edge devices, refer to Releases for prebuilt binaries, or to Installing DLR to build DLR from source. For example, to install DLR for a Raspberry Pi 3, you can use:

    !pip install https://neo-ai-dlr-release.s3-us-west-2.amazonaws.com/v1.3.0/pi-armv7l-raspbian4.14.71-glibc2_24-libstdcpp3_4/dlr-1.3.0-py3-none-any.whl
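
The following is a minimal preprocessing sketch for step 1. It assumes a local image named street.jpg and a (1, 300, 300, 3) uint8 input layout, which is typical for SSD MobileNet variants; confirm the expected input shape and data type for the model you compiled.

    # Load and prepare an image for an SSD MobileNet style detector.
    # Assumptions: a local file named "street.jpg" and a (1, 300, 300, 3)
    # uint8 input tensor; verify these against your compiled model.
    import numpy as np
    from PIL import Image
    import matplotlib.pyplot as plt

    # Load the image and resize it to the assumed input size.
    image = Image.open("street.jpg").convert("RGB")
    image = image.resize((300, 300))

    # Add a batch dimension so the tensor shape is (1, 300, 300, 3).
    input_tensor = np.expand_dims(np.asarray(image, dtype=np.uint8), axis=0)
    print(input_tensor.shape)

    # Optionally preview the image you are about to send for inference.
    plt.imshow(image)
    plt.axis("off")
    plt.savefig("input_preview.png")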
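
The following is a minimal load-and-run sketch for step 2 using the DLR Python API. The model directory (./compiled_model), the CPU device type, and the interpretation of the outputs are assumptions for illustration; check the input names, shapes, and outputs reported for your own Neo-compiled artifact.

    # Load a Neo-compiled model with DLR and run one inference.
    # Assumption: the compiled artifact was extracted to ./compiled_model.
    import dlr
    import numpy as np

    model = dlr.DLRModel("./compiled_model", dev_type="cpu", dev_id=0)

    # Inspect what the compiled model expects before feeding data.
    print(model.get_input_names())

    # In practice, reuse the preprocessed tensor from the step 1 sketch;
    # a zero-filled placeholder keeps this sketch self-contained.
    input_tensor = np.zeros((1, 300, 300, 3), dtype=np.uint8)

    # run() returns a list of output arrays; for SSD-style detectors these
    # commonly include boxes, class ids, scores, and a detection count.
    outputs = model.run(input_tensor)
    for i, out in enumerate(outputs):
        print("output", i, "shape", out.shape)

Note that DLR alone is enough to run the compiled model; the TensorFlow package from step 1 is only needed if you want to benchmark the original, uncompiled model as a baseline.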