
Working with SIO framework

Installation

The preferred way to use SIO is via a Docker container, but native installation options exist for both Windows and Linux.

Docker

The SIO Docker image can be obtained from our private registry once we've created an account for you and provided you with sighthound-keyfile.json:

cat sighthound-keyfile.json | docker login -u _json_key --password-stdin us-central1-docker.pkg.dev

docker pull us-central1-docker.pkg.dev/ext-edge-analytics/docker/sio:r240715

You have a choice of either using the image as is, or inheriting from it. The latter approach is beneficial if you plan to modify the image in any way: embedding a license, installing additional software packages, etc.
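
If you choose to inherit, a derived image can be described with a short Dockerfile along the lines of the sketch below. The license file name, the added package, and pip being available on the image PATH are illustrative assumptions, not part of the SIO distribution:

FROM us-central1-docker.pkg.dev/ext-edge-analytics/docker/sio:r240715
# Example only: embed a Sighthound-provided license (the file name is a placeholder)
COPY sighthound-license.json /sighthound/sio/share/sighthound-license.json
# Example only: add extra Python packages on top of those shipped with the image
RUN python3 -m pip install --no-cache-dir requests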

The image uses Python 3.8, and it is recommended that no changes are made to the Python runtime. A small set of packages is installed with pip (numpy, etc.), and more can be added.

Important environment variables that can be used within the container:

  • SH_HOME - root of Sighthound software installation, normally /sighthound
  • SIO_HOME - root of SIO installation, normally /sighthound/sio

Additional variables that can be defined to modify the container behavior:

  • SIO_DATA_DIR - path at which SIO stores generated data, such as CUDA engines. It is recommended to point this at a volume shared from the host, so that generated data does not have to be re-created with each container restart (see the example after this list)
  • SIO_INFERENCE_RUNTIME - overrides the default inference runtime (D3T - TensorRT, D3V - x86 CPU)
  • SIO_USER_PLUGINS_DIR - pipeline extension modules (see below) may be specified using either a full path or a filename. In the latter case, SIO will search for them in the folder specified by this variable.
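
For example, the data directory and the plugin search path can be set when starting the container; the host paths below are placeholders:

docker run -it --rm -v /data/sio:/data/sio -e SIO_DATA_DIR=/data/sio/cache -e SIO_USER_PLUGINS_DIR=/data/sio/plugins us-central1-docker.pkg.dev/ext-edge-analytics/docker/sio:r240715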

NVIDIA Container Toolkit

If an NVIDIA GPU is available on your host, install the NVIDIA Container Toolkit.
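
Once the toolkit is installed, the GPU is exposed to the container by adding --gpus all to docker run, for example:

docker run -it --rm --gpus all us-central1-docker.pkg.dev/ext-edge-analytics/docker/sio:r240715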

Native installation

Windows

  • Unzip the supplied package into a folder, for example c:\sio
  • Install the supplied redistributable packages from c:\sio\redist (at the time of writing, VC_redist.x64.exe and w_dpcpp_cpp_runtime_p_2022.0.0.3663.exe)
  • The Windows installation comes with an embedded Python installation, with required packages such as numpy already present. It may be extended by running c:\sio\bin\python.exe -m pip install [package]

Linux

  • It is highly recommended that the Docker image is used; there needs to be a very good reason to use a native installation instead.
  • Install SIO in the desired location, for example /opt/sio
  • Ensure integration with the Python version of your choice (in the example, 3.9): ln -s /usr/lib/x86_64-linux-gnu/libpython3.9.so.1.0 /opt/sio/lib/libpython3.6m.so.1.0
  • Set LD_LIBRARY_PATH: LD_LIBRARY_PATH=/opt/sio/lib:${LD_LIBRARY_PATH}
  • Ensure required Python packages are present. At a minimum, have numpy, pillow, shapely.
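
As a rough end-to-end sketch of the steps above (using the same illustrative install location and Python minor version, and assuming SIO has already been unpacked into /opt/sio):

# point SIO at the system Python (3.9 in this example)
ln -s /usr/lib/x86_64-linux-gnu/libpython3.9.so.1.0 /opt/sio/lib/libpython3.6m.so.1.0
# make the SIO libraries resolvable
export LD_LIBRARY_PATH=/opt/sio/lib:${LD_LIBRARY_PATH}
# install the minimum set of Python packages
python3 -m pip install numpy pillow shapely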

License

To run SIO you need a license file provided by Sighthound. In the case of a native installation or an inherited Docker image, the license may be placed at ${SIO_HOME}/share/sighthound-license.json. Otherwise, the license file may be specified as one of the runPipeline parameters.
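
For a native installation such as the /opt/sio example above, that amounts to copying the license into place:

cp sighthound-license.json /opt/sio/share/sighthound-license.json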


Running pipelines

runPipeline utility

SIO operates by executing pipelines specified by a YAML configuration and a set of optional or mandatory parameters.

The runPipeline utility executes a single pipeline and exits once the pipeline completes. All of the parameters are specified via the CLI.

For the sake of this example, we'll concentrate on the VehicleAnalytics pipeline, available at ./pipelines/VehicleAnalytics.

Example of running a pipeline that processes images deposited into a watched folder:

./bin/runPipeline share/pipelines/VehicleAnalytics/VehicleAnalyticsFolderWatch.yaml folderPath=/tmp

The same, but in a Docker container, with the license file provided from a shared external volume:

docker run -it --rm -v /data:/data -e SIO_DATA_DIR=/data us-central1-docker.pkg.dev/ext-edge-analytics/docker/sio:r240715 \
/sighthound/sio/bin/runPipeline /sighthound/sio/share/pipelines/VehicleAnalytics/VehicleAnalyticsFolderWatch.yaml folderPath=/data/watchedFolder --license-path /data/sighthound-license.json

runPipelineSet utility

The runPipelineSet utility differs from runPipeline in that it can execute multiple pipelines. The configuration for each pipeline or group of pipelines is specified with a JSON configuration. Multiple pipeline configurations can be specified in a single file, and/or multiple configuration files can be provided. For an example, please refer to ./examples/config/runPipelineSet/. Using these configurations, it's possible to run:

/sighthound/sio/bin/runPipelineSet set1.json set2.json

This command will execute all three pipelines specified in the two configuration files in a single process.
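
The authoritative format of these JSON files is defined by the examples in ./examples/config/runPipelineSet/ and is not reproduced here. Purely to illustrate the idea, a set file pairing a pipeline YAML with its parameters could be generated with a few lines of Python; every key name below is hypothetical:

import json
# HYPOTHETICAL structure: key names are illustrative only; use the shipped
# examples in ./examples/config/runPipelineSet/ as the authoritative format.
set1 = [
    {
        "pipeline": "/sighthound/sio/share/pipelines/VehicleAnalytics/VehicleAnalyticsFolderWatch.yaml",
        "parameters": {"folderPath": "/data/watchedFolder"},
    }
]
with open("set1.json", "w") as f:
    json.dump(set1, f, indent=2)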

Aqueduct

Pipelines can also be launched or stopped remotely using the Aqueduct utility. Aqueduct requires a RabbitMQ broker and listens on the specified queue for pipeline control commands.

/sighthound/sio/bin/aqueduct --port=5672 --host=127.0.0.1 --exchange=amq.topic --routingKey=aqueduct.execute --user=guest --password=guest --no-ssl --log info

will launch an Aqueduct server, which listens on an AMQP queue for execution commands. An example of an Aqueduct client can be found in ./examples/aqueduct/aqueductLaunch.py.
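
The authoritative client example is ./examples/aqueduct/aqueductLaunch.py. Purely as a sketch of the AMQP plumbing involved, a command could be published to the queue Aqueduct listens on roughly as follows; the payload contents are hypothetical, not the real Aqueduct command schema:

import json
import pika  # RabbitMQ client library; assumes a broker reachable with the settings below
# Connection settings mirror the aqueduct invocation above (guest/guest on 127.0.0.1:5672)
credentials = pika.PlainCredentials("guest", "guest")
connection = pika.BlockingConnection(
    pika.ConnectionParameters(host="127.0.0.1", port=5672, credentials=credentials)
)
channel = connection.channel()
# HYPOTHETICAL payload: the real command format is demonstrated in aqueductLaunch.py
payload = {"pipeline": "/sighthound/sio/share/pipelines/VehicleAnalytics/VehicleAnalyticsFolderWatch.yaml"}
channel.basic_publish(exchange="amq.topic", routing_key="aqueduct.execute", body=json.dumps(payload))
connection.close()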


VehicleAnalytics and TrafficAnalytics pipelines

SIO ships with two production-ready pipelines: VehicleAnalytics and TrafficAnalytics. The two are fairly similar; the primary difference is their purpose. VehicleAnalytics is intended for cases where traffic object identification and classification is the primary goal: it delivers information such as make/model/color/generation of the detected vehicles and ALPR for the detected license plates. The TrafficAnalytics pipeline is primarily used to detect objects without classifying them, and, potentially, to track them across the frame.

Output

The pipeline generates its output in JSON format. The schema can be found in ./docs/schemas.

Input adaptors

The pipeline can run with a single file, a watched folder, or an RTSP stream as its input, depending on the entry point used. Each pipeline has a number of parameters that may be used to alter its behavior. For details, please refer to share/pipelines/VehicleAnalytics/README.md and share/pipelines/TrafficAnalytics/README.md.
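
For example, running against a live RTSP stream follows the same pattern as the file and folder-watch invocations shown earlier; the entry point file name below is hypothetical, so consult the pipeline README for the actual entry points and parameters:

# NOTE: the RTSP entry point name is illustrative; see the pipeline README for the real one
./bin/runPipeline share/pipelines/VehicleAnalytics/VehicleAnalyticsRTSP.yaml VIDEO_IN=rtsp://192.168.1.10/stream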

Pipeline extensions

For minor changes to pipeline behavior (such as output filtering or alteration, specifying a different method of egress from the pipeline, etc.), a pipeline extension mechanism can be used. It allows execution of user-specified Python code without altering the core pipeline's behavior.

An example of executing a pipeline with an extension module:

./bin/runPipeline share/pipelines/VehicleAnalytics/VehicleAnalyticsFile.yaml VIDEO_IN=examples/media/2lps.png extensionModules=examples/extensions/OutputLogger.py extensionConfigurations=examples/extensions/OutputLoggerCfg.json

In this example the extension intercepts the output, modifies one of its fields, and saves it to a file. The extension contract is fairly simple:

  • Updated plugin API:
    • Define a class SIOPlugin with a constructor taking no parameters, and methods configure, process, and finalize with signatures similar to those in the example (a minimal sketch follows this list).
  • Deprecated plugin API:
    • Define three methods: configure, process, and finalize, with signatures similar to those in the example.
  • process must always return the desired output JSON, whether modified or not.
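
A minimal skeleton of the updated plugin API might look like the sketch below. The exact method signatures, and in particular the shape of the configuration and output arguments, are assumptions here; use examples/extensions/OutputLogger.py as the reference implementation:

class SIOPlugin:
    # Constructor takes no parameters, per the contract above.
    def __init__(self):
        self.config = None
    # ASSUMED signature: receives the extension configuration
    # (e.g. the content of examples/extensions/OutputLoggerCfg.json).
    def configure(self, config):
        self.config = config
    # ASSUMED signature: receives the pipeline output JSON for one result and
    # must always return the (possibly modified) output JSON.
    def process(self, output):
        return output
    # Called once when the pipeline finishes.
    def finalize(self):
        pass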