Operator Project Creation
Overview
This section describes how to create an Add operator project using MindStudio. After the project is created, the operator project directory and the corresponding file templates are automatically generated. Developers can then implement operators based on these templates.
Configuring the Python SDK
Before operator development, you need to configure the Python library on which the operator project depends. For details, see Python SDK Settings.
Procedure
- Navigate to the Create Ascend Operator Project page.
- On the MindStudio welcome page, click New Project to access the page shown in Figure 1.
- On the MindStudio project page:
- Create an operator project.
- In the navigation pane on the left, select Ascend Operator. Configure the operator details on the right, as described in Table 1.
Table 1 Operator information configuration

CANN Version
  Description: Indicates the activated CANN version. You can click Change on the right to switch the CANN version. For details, see Switching/Activating a CANN Package.
  Example: Select the current CANN version.

New Operator (operator creation mode)
  Description: Select one of the following operator creation modes:
  - Sample Template: creates an operator project based on a sample. If you select this option, three operator implementation modes (AICPU, DSL, and TIK) are displayed in the lower part of the page. Each implementation mode provides operator samples for each AI framework. You can select one or more operators of an AI framework under an implementation mode to create the operator project.
    - AICPU: AI CPU operator samples.
    - DSL: TBE operator samples implemented in DSL mode.
    - TIK: TBE operator samples implemented in TIK mode.
  - Empty Template: creates an empty operator project. If this mode is selected, the Operator Type parameter is displayed in the lower part. Enter the OpType of the operator to be created by referring to Operator Analysis.
  - IR Template: creates an operator project from an IR definition template. The IR definition template file can be in JSON or Excel format. If this mode is selected, the Template File parameter is displayed in the lower part. Select the file that defines the IR prototype.
    NOTICE:
    - Ascend_IR_Template.json: obtain the IR definition template file in JSON format from the Ascend-CANN-Toolkit_installation_directory/ascend-toolkit/latest/python/site-packages/op_gen/json_template directory.
    - Ascend_IR_Template.xlsx: obtain the IR definition template file in Excel format from the Ascend-CANN-Toolkit_installation_directory/ascend-toolkit/latest/tools/msopgen/template directory. Modify the IR definition of the custom operator on the Op sheet of the template file.
    For details about IR definition configuration, see IR Definition Configuration.
  - Tensorflow Template: creates an operator project from a TensorFlow prototype definition. To use this mode, download the TensorFlow source code to the server where MindStudio is installed. If this mode is selected, the following parameters are displayed:
    - Operator Path: directory of the TensorFlow source code. You are advised to set it to tensorflow/core/ops for higher search efficiency.
    - Operator Type: OpType of the operator to be created.
  Example: Select one of the modes. Sample Template is selected by default.

Plugin Framework
  Description: Sets the original framework of the operator. This configuration item is not displayed when Sample Template is selected.
  - MindSpore
  - PyTorch
  - TensorFlow
  - Caffe
  - ONNX
  Note: If Caffe is selected, an operator project cannot be created from the Tensorflow Template.
  Example: MindSpore

Compute Unit
  Description: The following two options are available. This configuration item is not displayed when Sample Template is selected.
  - AI Core/Vector Core: the operator runs on AI Core or Vector Core, that is, a TBE operator.
  - AI CPU: the operator runs on AI CPU, that is, an AI CPU operator.
  Example: AI Core/Vector Core

Unit Type
  Description: Select an operator compute unit from the drop-down list based on the actual version of the Ascend AI Processor.
  - Displayed only when Compute Unit is set to AI Core/Vector Core.
  - Not displayed when Sample Template is selected.
  - Not displayed when Plugin Framework is set to MindSpore.
  Example: Select the version of the Ascend AI Processor.
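When the IR Template mode is used, MindStudio reads an IR definition file such as Ascend_IR_Template.json. The authoritative schema is the template file shipped with the CANN toolkit; the sketch below only illustrates the idea of describing an Add operator's inputs and outputs in JSON, and the field names used (op, input_desc, output_desc, and so on) are illustrative assumptions, not a verified copy of the template.

```python
import json

# Hypothetical sketch of an IR definition for an Add operator.
# Field names are assumptions for illustration; copy and edit the real
# Ascend_IR_Template.json shipped with the toolkit for actual projects.
ir_def = {
    "op": "Add",  # OpType of the custom operator
    "input_desc": [
        {"name": "x1", "param_type": "required",
         "format": ["ND"], "type": ["float16", "float"]},
        {"name": "x2", "param_type": "required",
         "format": ["ND"], "type": ["float16", "float"]},
    ],
    "output_desc": [
        {"name": "y", "param_type": "required",
         "format": ["ND"], "type": ["float16", "float"]},
    ],
}

# Write the definition, then reload it to verify it is well-formed JSON.
with open("add_ir.json", "w") as f:
    json.dump([ir_def], f, indent=4)

with open("add_ir.json") as f:
    loaded = json.load(f)
print(loaded[0]["op"])  # Add
```

Keeping the file round-trippable through json.load is a quick sanity check before handing it to the project-creation wizard.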
- Click Next. In the window displayed, configure the project information, as described in Table 2.
Table 2 Project information configuration

Project name
  Description: Project name (user-defined). The name contains a maximum of 64 characters, must start and end with a letter or digit, and can contain only letters, digits, hyphens (-), and underscores (_).
  Example: untitled

Project location
  Description: Path for saving the project (user-defined). For users who use MindStudio for the first time, the default value is $HOME/MindstudioProjects.
  Example: Retain the default.

More Settings
  Description:
  - Module name: module name, the same as Project name by default.
  - Content root: path of the root directory.
  - Module file location: path of the module file.
  - Project format: click the drop-down list on the right of Project format and select one of the following:
    - .idea (directory-based) (default): an .idea directory is created during project creation to store the project information.
    - .ipr (file-based): a project configuration file is used to store the project configuration information.
  Example: Retain the default.
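The project-name rules in Table 2 can be checked mechanically. The sketch below validates a name against only the constraints quoted above (length, first/last character, allowed character set); MindStudio's actual validation logic may differ.

```python
import re

# Validator based on the rules stated in Table 2:
# at most 64 characters, starting and ending with a letter or digit,
# containing only letters, digits, hyphens (-), and underscores (_).
_NAME_RE = re.compile(r"^[A-Za-z0-9](?:[A-Za-z0-9_-]*[A-Za-z0-9])?$")

def is_valid_project_name(name: str) -> bool:
    return len(name) <= 64 and bool(_NAME_RE.match(name))

print(is_valid_project_name("my-op_project1"))  # True
print(is_valid_project_name("-bad"))            # False: starts with a hyphen
print(is_valid_project_name("a" * 65))          # False: longer than 64 characters
```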
- Click Create. The operator project creation is complete.
If there is already an active project in the window, a confirmation message is displayed.
- Click This Window to open the newly created project in the current window.
- Click New Window to open the newly created project in a new window.
- If you need to customize multiple TBE operators, implement and build them in the same operator project. Right-click the root directory of the operator project and choose New > Operator from the shortcut menu to add an operator.
- In the same operator project, the names of the AI CPU operator and TBE operator cannot be the same.
- If you create an operator in an existing MindSpore AI CPU operator project (see Step 1), you can create only AI CPU operators in the MindSpore framework.
- In the current Windows version, semantics that cannot be identified may be marked with red wavy lines in the code of the new operator project. This does not affect the development process and can be ignored.
- The current Windows version supports code jumping (navigation) for the C++ code of the operator prototype file and the AI CPU operator code. To enable it, manually configure a system environment variable: set the variable name to ASCEND_OPP_PATH and the variable value to C:\Users\user_name\.mindstudio\huawei\adk\remote\CANN_package_version\opp, replacing user_name and CANN_package_version with the actual user name and CANN package version.
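As a small sanity check, the expected ASCEND_OPP_PATH value can be assembled from the Windows user name and the installed CANN package version. The path layout follows the pattern quoted above; the user name and version used below are placeholders, so substitute your real values.

```python
from pathlib import PureWindowsPath

def ascend_opp_path(user_name: str, cann_version: str) -> str:
    """Build the ASCEND_OPP_PATH value described in this section."""
    return str(PureWindowsPath("C:/Users", user_name, ".mindstudio",
                               "huawei", "adk", "remote", cann_version, "opp"))

# "alice" and "5.1.RC1" are hypothetical placeholder values.
print(ascend_opp_path("alice", "5.1.RC1"))
# -> C:\Users\alice\.mindstudio\huawei\adk\remote\5.1.RC1\opp
```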
- If operators are developed remotely, automatic file synchronization is supported.
- When you use a Sample Template with a non-MindSpore framework to create an operator project, if the operator compute unit belongs to the Atlas inference products or Atlas A2 training products series, only the ascendxxx directory is generated under the op_info_cfg operator information library directory.
- When you use an Empty Template, IR Template, or Tensorflow Template to create an operator project, the corresponding directory is generated under the op_info_cfg operator information library directory based on the operator compute unit selected in Unit Type.
- View the directory structure and main files of the operator project. The directory of a TensorFlow, PyTorch, ONNX, or Caffe operator sample project is organized as follows:

├── .idea
├── build                          // Files generated after build
├── cmake                          // Directory of public files related to build
├── cpukernel                      // Directory of AI CPU operator files. Not created for TensorFlow and PyTorch TBE operator projects.
│   ├── impl                       // Directory of the operator implementation files
│   │   ├── reshape_cust_kernel.cc // Operator source file
│   │   ├── reshape_cust_kernel.h  // Operator header file
│   ├── op_info_cfg                // Directory of the operator information libraries
│   │   ├── aicpu_kernel
│   │   │   ├── reshape_cust.ini   // Operator information library definition file
│   ├── CMakeLists.txt             // Build script, called by the CMakeLists.txt file in the root directory of the operator project
│   ├── toolchain.cmake
├── framework                      // Directory of the operator plugin implementation files
│   ├── tf_plugin                  // Operator plugin files and build scripts for the TensorFlow framework. Not generated when Plugin Framework is set to PyTorch.
│   │   ├── tensorflow_add_plugin.cc // "add" indicates the operator type; the Add operator is used as the TBE example. The AI CPU sample equivalent is reshape_cust.
│   │   ├── CMakeLists.txt         // Build script, called by the CMakeLists.txt file in the parent directory
│   ├── caffe_plugin               // Operator plugin files and build scripts for the Caffe framework. Not generated when Plugin Framework is set to PyTorch.
│   │   ├── caffe_add_plugin.cc    // "add" indicates the operator type; the AI CPU sample equivalent is reshape_cust.
│   │   ├── CMakeLists.txt         // Build script, called by the CMakeLists.txt file in the parent directory
│   ├── onnx_plugin                // Generated operator plugin code when the source framework is ONNX
│   │   ├── CMakeLists.txt
│   │   └── xxx_plugin.cc          // Implementation file of the operator plugin
│   └── CMakeLists.txt             // Build script, called by the CMakeLists.txt file in the root directory of the operator project
├── op_proto                       // Directory of the operator IR definition files
│   ├── add.cc                     // "add" indicates the operator type; the AI CPU sample equivalent is reshape_cust.
│   ├── add.h                      // "add" indicates the operator type; the AI CPU sample equivalent is reshape_cust.
│   ├── CMakeLists.txt             // Build script, called by the CMakeLists.txt file in the root directory of the operator project
├── op_tiling                      // Directory of operator tiling files. Ignore this directory if operator tiling is not involved.
│   ├── CMakeLists.txt
├── scripts                        // Project-related scripts
├── tbe                            // Directory of TBE operator files. Not created for the TensorFlow AI CPU operator project.
│   ├── impl                       // Directory of the operator implementation files
│   │   ├── add.py
│   ├── op_info_cfg                // Directory of the operator information libraries
│   │   ├── ai_core
│   │   │   ├── {soc_version}
│   │   │   │   ├── add.ini
│   │   ├── vector_core
│   │   │   ├── {soc_version}
│   │   │   │   ├── add.ini
├── .project                       // Project information file, including the project type, description, target device type, and CANN version
├── Add_README.md                  // Describes the basic functions and information of the operator and the main files of the operator project
├── CMakeLists.txt
├── MyOperator.iml

The directory structure and key files of a MindSpore operator sample project are organized as follows:
├── .idea
├── build                          // Files generated after build
├── cmake                          // Directory of public files related to build
├── cpukernel                      // Directory of AI CPU operator files. Not created for TBE operator projects.
│   ├── impl                       // Directory of the operator implementation files
│   │   ├── cus_square_kernel.cc   // Operator source file
│   │   ├── cus_square_kernel.h    // Operator header file
│   ├── op_info_cfg                // Directory of the operator information libraries
│   │   ├── aicpu_kernel
│   │   │   ├── cus_square.ini     // Operator information library definition file
│   ├── CMakeLists.txt             // Build script, called by the CMakeLists.txt file in the root directory of the operator project
│   ├── toolchain.cmake
├── framework                      // Directory of the operator plugin implementation files
│   ├── mindspore
│   │   ├── impl                   // Directory of the operator implementation file
│   │   │   ├── cus_square_impl.py // Operator implementation and registration file
│   ├── op_proto
│   │   ├── cus_square.py          // Operator primitive .py file
├── .project                       // Project information file, including the project type, description, target device type, and CANN version
├── cus_square_README.md           // Describes the basic functions and information of the operator and the main files of the operator project
├── MyOperator.iml
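The operator information library files under op_info_cfg (for example, add.ini) are plain INI files. The sketch below shows what such a file might look like and checks it with Python's configparser; the key names follow common CANN op-info conventions but are illustrative assumptions, not a verified copy of a generated file.

```python
import configparser

# Hedged sketch of a possible tbe/op_info_cfg/ai_core/{soc_version}/add.ini.
# Key names (input0.name, opFile.value, ...) are assumptions for illustration.
ADD_INI = """\
[Add]
input0.name=x1
input0.paramType=required
input0.dtype=float16,float
output0.name=y
output0.paramType=required
output0.dtype=float16,float
opFile.value=add
opInterface.value=add
"""

cfg = configparser.ConfigParser()
cfg.read_string(ADD_INI)
print(cfg.sections())              # ['Add']
print(cfg["Add"]["opFile.value"])  # add
```

Parsing the file with configparser before a build is a cheap way to catch typos such as a missing section header or a malformed key.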


