Remote Operator Deployment
Deploy the custom OPP runfile custom_opp_Linux_Arch.run to the built-in OPP in the hardware environment with the Ascend AI Processor, so that the custom operators can be executed in the network.
- On the MindStudio project page, select the operator project.
- On the top menu bar, choose the operator deployment menu option. The operator deployment dialog box is displayed.
Select configuration options on the Deploy Remotely > Deployment page. For details about how to configure the Deployment page, see Deployment.
- Configure environment variables. The two configuration methods are as follows:
- Configure environment variables on the host in the hardware environment where the Ascend AI Processors are deployed.
In MindStudio, deploy the operator on the host as the running user. Before deploying the operator, ensure that the following environment variable is configured on the host.
- Add the following line to the $HOME/.bashrc file on the host as the running user:
export ASCEND_OPP_PATH=/home/xxx/Ascend/ascend-toolkit/latest/opp
/home/xxx/Ascend/ascend-toolkit/latest is the OPP installation path. Replace it with the actual path.
- Run the following command to make the environment variable take effect:
source $HOME/.bashrc
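The host-side configuration above can be sketched as the following shell snippet. The toolkit path /home/xxx/Ascend/ascend-toolkit/latest is a placeholder from this guide; replace TOOLKIT_HOME with the actual installation path on your host.

```shell
# Sketch, assuming a placeholder toolkit path; replace with the actual one.
TOOLKIT_HOME="${TOOLKIT_HOME:-$HOME/Ascend/ascend-toolkit/latest}"

# Append the export to the running user's ~/.bashrc (skip if already present).
grep -q 'ASCEND_OPP_PATH' "$HOME/.bashrc" 2>/dev/null || \
  echo "export ASCEND_OPP_PATH=$TOOLKIT_HOME/opp" >> "$HOME/.bashrc"

# Re-read the file so the variable takes effect in the current shell.
source "$HOME/.bashrc"
echo "$ASCEND_OPP_PATH"
```

The `grep` guard simply keeps the snippet idempotent, so re-running it does not append duplicate export lines to .bashrc.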
- Add the environment variable in Environment Variables.
You can type ASCEND_OPP_PATH=/home/xxx/Ascend/ascend-toolkit/latest/opp in the Environment Variables field.
/home/xxx/Ascend/ascend-toolkit/latest is the OPP installation path. Replace it with the actual path.
You can also click the icon next to the text box and enter a value in the displayed dialog box.
- Type ASCEND_OPP_PATH in the Name field.
- Enter the environment variable value /home/xxx/Ascend/ascend-toolkit/latest/opp in the Value text box.
- Select the specified OPP runfile.
In Operator Package, select the specified OPP directory.
- Select the target server for operator deployment and click Operator deploy.
- Deploy the operators. Operator deployment is equivalent to installing the custom OPP generated in Operator Project Build. After the deployment, the operator is deployed in the OPP installation path on the host. The default path is /usr/local/Ascend/opp/.
Figure 1 Operator deployment log messages

After the custom OPP is deployed on the host, the directory structure is similar to the following:
├── opp                                   // OPP directory
│   ├── vendors                           // Directory of custom operators
│   │   ├── config.ini                    // Priority configuration file of custom operators
│   │   ├── vendor_name1                  // Custom operators deployed by the vendor. vendor_name is configured during the build of the custom operator installation package. If vendor_name is not configured, the default value customize is used.
│   │   │   ├── op_impl
│   │   │   │   ├── cpu
│   │   │   │   │   ├── aicpu_kernel/
│   │   │   │   │   │   ├── vendor_name1_impl    // Custom operator implementation code file
│   │   │   │   │   │   │   ├── libcust_aicpu_kernels.so
│   │   │   │   │   ├── config
│   │   │   │   │   │   ├── cust_aicpu_kernel.json   // Custom operator information library file
│   │   │   │   ├── vector_core           // Reserved directory, which can be ignored
│   │   │   ├── framework
│   │   │   │   ├── caffe                 // Directory of the plugin library of custom Caffe operators
│   │   │   │   ├── onnx                  // Directory of the plugin library of custom ONNX operators
│   │   │   │   ├── tensorflow            // Directory of the plugin library of custom TensorFlow operators
│   │   │   │   │   ├── libcust_tf_parsers.so
│   │   │   ├── op_proto
│   │   │   │   ├── libcust_op_proto.so   // Prototype library file of the custom operator
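A quick way to confirm the deployment is to check that the expected artifacts exist under the OPP installation path. The snippet below is a sketch, assuming the default install path /usr/local/Ascend/opp and the default vendor name customize (used when vendor_name is not configured at build time); adjust both to match your environment.

```shell
# Sketch: verify a deployed custom OPP. OPP_ROOT and the vendor name
# "customize" are assumptions; replace them with your actual values.
OPP_ROOT="${ASCEND_OPP_PATH:-/usr/local/Ascend/opp}"

# List the vendors deployed into the OPP directory.
ls "$OPP_ROOT/vendors"

# Check that the key custom operator artifacts are present.
for f in \
  "$OPP_ROOT/vendors/customize/op_proto/libcust_op_proto.so" \
  "$OPP_ROOT/vendors/customize/framework/tensorflow/libcust_tf_parsers.so"
do
  [ -e "$f" ] && echo "found: $f" || echo "missing: $f"
done
```

If any artifact is reported missing, rebuild the operator project and repeat the deployment before running the network.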