AscendNNInference

API Definition

AscendNNInference(std::vector<int> deviceList, const char* model, uint64_t modelSize)

Function

Constructor of AscendNNInference, which creates an AscendNNInference instance. The NPU processor resources are selected based on the values configured in deviceList, and the model is supplied as an in-memory buffer through model and modelSize.

Input

std::vector<int> deviceList: NPU processor resources to be used for inference.

const char* model: pointer to the in-memory deep neural network model used for dimension reduction.

uint64_t modelSize: size, in bytes, of the model buffer.

Output

N/A

Return Value

N/A

Restrictions

model must be a valid pointer to an in-memory deep neural network dimension reduction model, and modelSize must match its actual size; the valid range is (0, 128 MB]. If the pointer and size do not match the model, instantiation or inference may fail. An invalid model may damage the system, so ensure that the model comes from a trusted source.