Introduction

StreamServer is an inference server built on the MindX SDK that provides inference services through RESTful APIs. StreamServer is not a complete system on its own: it must be integrated into the user's system as a component and combined with other systems to form a complete inference service. You need to consider the security, reliability, and availability of the deployment environment. You are advised to integrate and deploy StreamServer by following mainstream web deployment solutions rather than exposing the inference service directly. In addition, you are advised to deploy Nginx in front of the HTTP server.
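The recommended front-end deployment can be sketched as a minimal Nginx reverse-proxy configuration. The server name, upstream address, port, and certificate paths below are placeholders chosen for illustration, not values mandated by StreamServer; adapt them to your environment and harden the TLS settings per your security policy.

```nginx
# Minimal reverse-proxy sketch (placeholder names and paths).
server {
    listen 443 ssl;
    server_name inference.example.com;

    # Server certificate and key presented to clients (placeholder paths).
    ssl_certificate     /etc/nginx/certs/server.crt;
    ssl_certificate_key /etc/nginx/certs/server.key;

    location / {
        # Forward requests to the StreamServer instance (assumed address/port).
        proxy_pass https://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```

Placing Nginx in front lets you terminate TLS, restrict access, and apply rate limiting at the edge instead of exposing the inference service directly.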

Features

  • Supports the pipeline inference service.
  • Supports single-model inference services and automatic batch grouping.
  • Supports the HTTPS and HTTP protocols. (If the HTTP protocol is used, you are liable for data leakage caused by an unsecured network environment.)
  • Supports HTTPS bidirectional (mutual) encryption and authentication.
  • Supports request flow control.
  • Supports Huawei's key management CBB (KMC) for managing the CA key.
  • Supports IP address trustlist filtering.
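As a minimal sketch of how a client might call such a RESTful inference service over HTTPS with bidirectional authentication, the following Python example uses only the standard library. The endpoint path (`/v1/infer`), the payload schema, and the certificate file names are assumptions for illustration; consult the StreamServer API reference for the actual request format.

```python
# Hedged sketch: calling a StreamServer-style RESTful inference endpoint
# over HTTPS with mutual (bidirectional) authentication. Endpoint path,
# payload fields, and certificate paths are illustrative assumptions.
import http.client
import json
import ssl


def build_payload(stream_name: str, data_b64: str) -> str:
    """Serialize a hypothetical inference request body as JSON."""
    return json.dumps({"stream_name": stream_name, "data": data_b64})


def make_mtls_context(ca_file: str, cert_file: str, key_file: str) -> ssl.SSLContext:
    """Build an SSL context that verifies the server against our CA and
    presents a client certificate (bidirectional authentication)."""
    ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH, cafile=ca_file)
    ctx.load_cert_chain(certfile=cert_file, keyfile=key_file)
    return ctx


def infer(host: str, port: int, ctx: ssl.SSLContext, payload: str) -> str:
    """POST the payload to an assumed /v1/infer path and return the body."""
    conn = http.client.HTTPSConnection(host, port, context=ctx)
    try:
        conn.request("POST", "/v1/infer", body=payload,
                     headers={"Content-Type": "application/json"})
        return conn.getresponse().read().decode("utf-8")
    finally:
        conn.close()
```

A caller would combine these as `infer("inference.example.com", 443, make_mtls_context("ca.crt", "client.crt", "client.key"), build_payload("detection", encoded_image))`; the mutual-TLS context ensures both sides authenticate each other before any request data is sent.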