Quick Start
This section describes how to quickly perform AOE tuning on models trained with different frameworks. The following examples use operator tuning (job_type=2). To perform subgraph tuning instead, set job_type to 1 in the command.
Prerequisites
- Related software has been installed. For details, see Environment Setup.
- Required environment variables have been configured. For details, see Environment Variable Configuration.
Network Model Trained on the Open Source TensorFlow Framework
- Obtain a TensorFlow network model.
- Run the following command to perform AOE tuning. (The paths and file names below are examples only.)
aoe --framework=3 --model=${HOME}/module/resnet50_tensorflow_1.7.pb --job_type=2
For details about the command-line options, see AOE Command-Line Options.
- If the following information is displayed, the AOE tuning is complete:
Aoe process finished
After the tuning is complete, the following files are generated:
- Custom repository: If the conditions for generating a custom repository are met (see Figure 3), a custom repository is generated. The generated custom repository is stored in the ${HOME}/Ascend/latest/data/aoe/custom/op/${soc_version} directory by default. For details about how to use the tuned custom repository, see Usage of Tuned Custom Repositories.
- .om model: The model is stored in ${WORK_PATH}/aoe_workspace/${model_name}_${timestamp}/tunespace/result/${model_name}_${timestamp}_tune.om (or ${model_name}_${timestamp}_tune_${os}_${arch}.om) by default. This model can be directly used for inference.
- Tuning result file: After the tuning is complete, a file named aoe_result_opat_${timestamp}_${pidxxx}.json is generated in the working directory where the tuning is performed. This file records the information about the tuned operators. For details about the fields in this file, see Table 1.
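The default output locations described above can be reconstructed from the model name and the tuning timestamp. The sketch below assembles the expected path of the tuned .om model; the model name and timestamp are illustrative values, since in a real run they come from the tuning session itself.

```shell
#!/bin/sh
# Sketch only: reconstructs the default output path of the tuned .om model.
# MODEL_NAME and TIMESTAMP are illustrative; a real run supplies its own values.
WORK_PATH="${WORK_PATH:-.}"
MODEL_NAME="resnet50_tensorflow_1.7"
TIMESTAMP="20240101120000"

RESULT_DIR="${WORK_PATH}/aoe_workspace/${MODEL_NAME}_${TIMESTAMP}/tunespace/result"
OM_FILE="${RESULT_DIR}/${MODEL_NAME}_${TIMESTAMP}_tune.om"
echo "${OM_FILE}"
```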
ONNX Network Model
- Obtain an ONNX network model.
- Run the following command to perform AOE tuning. (The paths and file names below are examples only.)
aoe --framework=5 --model=${HOME}/module/resnet50_pytorch_1.4.onnx --job_type=2
For details about the command-line options, see AOE Command-Line Options.
- If the following information is displayed, the AOE tuning is complete:
Aoe process finished
After the tuning is complete, the following files are generated:
- Custom repository: If the conditions for generating a custom repository are met (see Figure 3), a custom repository is generated. The generated custom repository is stored in the ${HOME}/Ascend/latest/data/aoe/custom/op/${soc_version} directory by default. For details about how to use the tuned custom repository, see Usage of Tuned Custom Repositories.
- .om model: The model is stored in ${WORK_PATH}/aoe_workspace/${model_name}_${timestamp}/tunespace/result/${model_name}_${timestamp}_tune.om (or ${model_name}_${timestamp}_tune_${os}_${arch}.om) by default. This model can be directly used for inference.
- Tuning result file: After the tuning is complete, a file named aoe_result_opat_${timestamp}_${pidxxx}.json is generated in the working directory where the tuning is performed. This file records the information about the tuned operators. For details about the fields in this file, see Table 1.
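Because a custom repository is generated only when certain conditions are met, it can be useful to check for its presence after tuning. The sketch below checks the default repository location; the soc_version value (Ascend310 here) is an assumed placeholder that you would replace with the value for your device.

```shell
#!/bin/sh
# Sketch only: checks whether a tuned custom repository was produced.
# SOC_VERSION is an assumed placeholder; substitute your device's value.
SOC_VERSION="${SOC_VERSION:-Ascend310}"
CUSTOM_REPO="${HOME}/Ascend/latest/data/aoe/custom/op/${SOC_VERSION}"

if [ -d "${CUSTOM_REPO}" ]; then
    echo "custom repository found: ${CUSTOM_REPO}"
else
    echo "no custom repository at ${CUSTOM_REPO}"
fi
```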
Network Model Trained on the MindSpore Framework
- Obtain a MindSpore network model.
- Run the following command to perform AOE tuning. (The paths and file names below are examples only.)
aoe --framework=1 --model=${HOME}/module/ResNet50_for_MindSpore_1.4.air --job_type=2
For details about the command-line options, see AOE Command-Line Options.
- If the following information is displayed, the AOE tuning is complete:
Aoe process finished
After the tuning is complete, the following files are generated:
- Custom repository: If the conditions for generating a custom repository are met (see Figure 3), a custom repository is generated. The generated custom repository is stored in the ${HOME}/Ascend/latest/data/aoe/custom/op/${soc_version} directory by default. For details about how to use the tuned custom repository, see Usage of Tuned Custom Repositories.
- .om model: The model is stored in ${WORK_PATH}/aoe_workspace/${model_name}_${timestamp}/tunespace/result/${model_name}_${timestamp}_tune.om (or ${model_name}_${timestamp}_tune_${os}_${arch}.om) by default. This model can be directly used for inference.
- Tuning result file: After the tuning is complete, a file named aoe_result_opat_${timestamp}_${pidxxx}.json is generated in the working directory where the tuning is performed. This file records the information about the tuned operators. For details about the fields in this file, see Table 1.
Network Model Trained on the Open Source Caffe Framework
- Obtain a Caffe network model.
Download the .prototxt model file and .caffemodel weight file of the ResNet-50 network and upload the files to any path in the operating environment as the CANN running user, for example, ${HOME}/module/.
- Run the following command to perform AOE tuning. (The paths and file names below are examples only.)
aoe --framework=0 --model=${HOME}/module/resnet50.prototxt --weight=${HOME}/module/resnet50.caffemodel --job_type=2
For details about the command-line options, see AOE Command-Line Options.
- If the following information is displayed, the AOE tuning is complete:
Aoe process finished
After the tuning is complete, the following files are generated:
- Custom repository: If the conditions for generating a custom repository are met (see Figure 3), a custom repository is generated. The generated custom repository is stored in the ${HOME}/Ascend/latest/data/aoe/custom/op/${soc_version} directory by default. For details about how to use the tuned custom repository, see Usage of Tuned Custom Repositories.
- .om model: The model is stored in ${WORK_PATH}/aoe_workspace/${model_name}_${timestamp}/tunespace/result/${model_name}_${timestamp}_tune.om (or ${model_name}_${timestamp}_tune_${os}_${arch}.om) by default. This model can be directly used for inference.
- Tuning result file: After the tuning is complete, a file named aoe_result_opat_${timestamp}_${pidxxx}.json is generated in the working directory where the tuning is performed. This file records the information about the tuned operators. For details about the fields in this file, see Table 1.
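Unlike the other frameworks, the Caffe flow takes two input files: the .prototxt model and the .caffemodel weight file. A quick pre-check that both exist can save a failed tuning run. The sketch below is illustrative only, with the file names taken from the example above.

```shell
#!/bin/sh
# Sketch only: verifies that both Caffe inputs exist before launching tuning.
# File names are illustrative, matching the example command above.
MODEL="${HOME}/module/resnet50.prototxt"
WEIGHT="${HOME}/module/resnet50.caffemodel"

check_caffe_inputs() {
    # Reports the first missing file, or confirms both are present.
    for f in "$MODEL" "$WEIGHT"; do
        if [ ! -f "$f" ]; then
            echo "missing: $f"
            return 1
        fi
    done
    echo "inputs ok"
}
```

Run `check_caffe_inputs` before the aoe command; a non-zero return means one of the inputs is absent.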