ModelInference
Function Usage
Performs model inference. Init must be called to initialize the model before this function is used.
Prototype
```cpp
APP_ERROR ModelInference(
    std::vector<BaseTensor>& inputTensors,
    std::vector<BaseTensor>& outputTensors,
    size_t dynamicBatchSize = 0
);

APP_ERROR ModelInference(
    std::vector<BaseTensor>& inputTensors,
    std::vector<BaseTensor>& outputTensors,
    DynamicInfo dynamicInfo
);

APP_ERROR ModelInference(
    const std::vector<TensorBase>& inputTensors,
    std::vector<TensorBase>& outputTensors,
    DynamicInfo dynamicInfo
);
```
Parameter Description
| Parameter | Input/Output | Description |
|---|---|---|
| inputTensors | Input | Input tensors, which hold the input data for model inference. |
| outputTensors | Output | Output tensors, which hold the output data of model inference. |
| dynamicBatchSize | Input | Batch size for model inference. |
| dynamicInfo | Input | Dynamic batch information. |
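As an illustrative sketch only, a typical call sequence might look like the following. The class name `ModelInferenceProcessor` (taken from this page's parent topic), the `APP_ERR_OK` success code, and the surrounding setup are assumptions; consult the SDK headers for the exact types and initialization steps.

```cpp
#include <vector>
// Assumed header; the actual include path comes from the MxBase SDK.
// #include "MxBase/ModelInfer/ModelInferenceProcessor.h"

// Hypothetical helper showing the expected call order:
// Init must already have been called on `processor`.
APP_ERROR RunInference(ModelInferenceProcessor& processor,
                       std::vector<BaseTensor>& inputTensors,
                       std::vector<BaseTensor>& outputTensors)
{
    // Uses the first overload; dynamicBatchSize defaults to 0.
    APP_ERROR ret = processor.ModelInference(inputTensors, outputTensors);
    if (ret != APP_ERR_OK) {
        // On failure, look up the code in MxBase/ErrorCode/ErrorCode.h.
        return ret;
    }
    // On success, outputTensors holds the inference results.
    return APP_ERR_OK;
}
```

To run inference with a specific dynamic batch size, pass it as the third argument, or use one of the `DynamicInfo` overloads instead.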
Return Parameter Description
| Data Structure | Description |
|---|---|
| APP_ERROR | Error code returned during program execution. For details, see the MxBase/ErrorCode/ErrorCode.h file. |
Parent topic: ModelInferenceProcessor