Comparison Procedure
- In MindStudio, choose the corresponding option from the main menu. The comparison dialog box is displayed.
Figure 1 New task (Linux_Remote Run_1)
Figure 2 New task (Linux_Remote Run_2)
Figure 3 New task (Linux_Local Run)
Figure 4 New task (Windows_1)
Figure 5 New task (Windows_2)
- Configure the parameters as required. For details about the parameters, see Table 1.
Table 1 Parameters for creating an accuracy comparison task
Run Mode
- Remote Run
- Local Run
In Windows OSs, only Remote Run is supported and this parameter is not displayed.
Deployment
Run configuration. This parameter is mandatory and is available only when Remote Run is selected. You can use the Deployment function to synchronize the files and folders in a specified project to a specified directory on a remote device. For details, see Deployment.
Remote Toolkit Path
(Mandatory) Installation path of the toolkit software package in the remote operating environment. This parameter is available only when Remote Run is selected. Example: ${HOME}/Ascend/ascend-toolkit/{version}/toolkit.
This parameter is bound to the Deployment parameter. After you click Start, the parameter value is saved. When you reconfigure the task, if Deployment has been configured, the parameters are automatically populated and can be manually modified.
Environment Variables
Environment variable configuration. This parameter is displayed when you select Remote Run. You can enter a value in the text box, or click the icons next to the text box and enter a value in the dialog box that is displayed.
(Optional) If Model File is set to an offline model file (*.om), environment variables are required; otherwise, the tool cannot perform ATC conversion for the offline model file (*.om), and the comparison fails.
This parameter is bound to the Deployment parameter. After you click Start, the parameter value is saved. When you reconfigure the task, if Deployment has been configured, the parameters are automatically populated and can be manually modified.
Output Path
(Mandatory) Path for storing the comparison result. Set this to a local path regardless of whether Remote Run or Local Run is selected. The default path is the user directory of the current system.
Analysis Mode
(Mandatory) Accuracy comparison and analysis mode. The options are as follows:
- NPU vs NPU: indicates that the two files to be compared are dump files of a model running on the Ascend AI Processor. In this case, the Model File parameter is optional. Generally, this mode is used to analyze the accuracy difference between dump data files generated after model conversion with fusion patterns enabled and disabled. For details about how to enable and disable fusion patterns during model conversion, see Model Conversion.
- NPU vs GPU/CPU: compares the dump data files of the model running on the Ascend AI Processor with the .npy files of the original model. In this case, the Framework parameter is mandatory.
Framework
(Mandatory) Type of the framework to which the data to be compared belongs. This parameter is available when Analysis Mode is set to NPU vs GPU/CPU. The options are as follows:
- TensorFlow: accuracy comparison of dump data of a TensorFlow model. Both inference and training scenarios are supported. The Model File parameter is mandatory.
- ONNX: accuracy comparison of dump data of an ONNX model. Only inference scenarios are supported. The Model File parameter is mandatory.
- Caffe: accuracy comparison of dump data of a Caffe model. Only inference scenarios are supported. The Model File parameter is mandatory.
NPU Dump
(Mandatory) Directory for storing dump data files generated by a model running on the Ascend AI Processor.
To compare data files remotely (Remote Run), specify the dump data file directory on the remote device.
Model File
Model file or quantization fusion pattern file.
- If Analysis Mode is set to NPU vs NPU, when comparing the accuracy of dump data before and after operator fusion is enabled for offline model conversion, specify both the operator mapping file (.json) or offline model file (*.om) with fusion enabled and the operator mapping file (.json) or offline model file (*.om) with fusion disabled.
- If Analysis Mode is set to NPU vs GPU/CPU, select a file based on the selected framework type.
- TensorFlow: Select the quantization fusion pattern file (.json) or offline model file (.om) after Ascend model compression in an inference scenario, and select the computational graph file (.txt) in a training scenario.
- ONNX: Select the quantization fusion pattern file (.json) or offline model file (.om) after Ascend model compression.
- Caffe: Select the quantization fusion pattern file (.json) or offline model file (.om) after Ascend model compression.
For details, see Comparison Scenarios.
Quantization Rule File (.json)
(Optional) Quantization operator mapping file (.json output by Ascend model compression). This parameter is available only when Framework is set to Caffe.
Ground Truth
(Mandatory) Directory for storing the .npy file of the original model.
To compare data files remotely (Remote Run), specify the directory for storing the .npy file of the original model on the remote device.
Algorithm
Comparison algorithm dimension. Possible values are:
- Cosine Similarity (selected by default)
- Relative Euclidean Distance (selected by default)
- Absolute Error (selected by default). Corresponding comparison algorithms are as follows:
- Max Absolute Error
- Mean Absolute Error
- Root Mean Square Error
- Relative Error (selected by default). Corresponding comparison algorithms are as follows:
- Max Relative Error
- Mean Relative Error
- Accumulated Relative Error
- Kullback Leibler Divergence (not selected by default)
- Standard Deviation (not selected by default)
Select algorithms or use Customized Algorithm.
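The default dimensions are standard numerical metrics computed between the two flattened tensors. As an illustration only (these formulas are common textbook definitions, not necessarily the tool's exact implementation), the default metrics can be sketched as:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two tensors flattened to vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def relative_euclidean_distance(a, b):
    """Euclidean distance of the difference, normalized by the reference norm
    (assumed normalization; the tool may define this slightly differently)."""
    diff = math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    ref = math.sqrt(sum(x * x for x in a))
    return diff / ref

def max_absolute_error(a, b):
    return max(abs(x - y) for x, y in zip(a, b))

def mean_absolute_error(a, b):
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def root_mean_square_error(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))
```

A cosine similarity close to 1 and small error metrics indicate that the two sets of dump data agree closely.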
Advanced Options
Extended options, including Customized Algorithm, Advisor, and Operator Range.
Customized Algorithm
Customized algorithm path.
You need to prepare the customized algorithm .py file in the algorithm directory and specify the .py file to generate the customized algorithm. For details about the requirements for the customized algorithm .py files, see "Preparing a Customized Algorithm .py File" in the Appendix of the Model Accuracy Analyzer Instructions.
Advisor
Advisor analysis switch, which is disabled by default. After this switch is enabled, Advisor analyzes the network-wide comparison result and outputs the faulty nodes, fault types, and tuning suggestions. See Expert Suggestions on Comparison Results.
Before using this function, run the pip3 install pandas command to install pandas 1.3 or later.
This function cannot be enabled with Operator Range at the same time.
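If you need to verify the pandas requirement programmatically, a simple major.minor check suffices. The helper below is a hypothetical sketch (the function name is illustrative); in practice, `pip3 install "pandas>=1.3"` enforces the requirement directly.

```python
def meets_min_version(version: str, minimum=(1, 3)) -> bool:
    """Check whether a 'major.minor[.patch]' version string meets a minimum.

    Compares numeric (major, minor) tuples, so "1.10" correctly exceeds "1.3".
    Hypothetical helper for illustration only.
    """
    parts = tuple(int(p) for p in version.split(".")[:2])
    return parts >= minimum
```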
Operator Range
Operator comparison range. Two methods are available:
- Method 1: Click Select. In the displayed dialog box, select the operators to be compared.
- Method 2: Configure the range of the operators to be compared based on the Start, End, and Step parameters.
- start: first operator to be compared. The value range is [1, number of operators involved in computing]. The default value is 1.
- end: last operator to be compared. The value can be -1 or within [start, number of operators involved in computing]. The default value is -1, indicating that the last operator involved in computing in the network model is obtained dynamically.
- step: comparison stride. Operators start, start + step, start + 2 x step, and so on (up to end) are compared. The value range is [1, number of operators involved in computing). The default value is 1.
- The format is "start,end,step". For example, -r 1,101,20 indicates that the tensors of operators 1, 21, 41, 61, 81, and 101 are compared.
If this option is not configured, all operators involved in computing in the network model are compared.
If this parameter is set and Analysis Mode is set to NPU vs NPU, you must specify Model File for both NPU Dump and Ground Truth, that is, the operator mapping file (.json) or offline model file (*.om) with fusion enabled and the one with fusion disabled.
This function cannot be enabled with Advisor at the same time.
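The start/end/step selection rule described above can be sketched as follows (an illustrative model of the documented behavior, not tool source code):

```python
def selected_operators(total_ops, start=1, end=-1, step=1):
    """Return the 1-based indices of operators picked by "start,end,step".

    end == -1 means "last operator involved in computing", resolved
    dynamically from the total operator count. Illustrative sketch only.
    """
    if end == -1:
        end = total_ops
    return list(range(start, end + 1, step))
```

For example, `selected_operators(120, 1, 101, 20)` yields the operators from the `-r 1,101,20` example: 1, 21, 41, 61, 81, and 101.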
You are advised not to invoke customized algorithm files (Customized Path) and OM model files in directories of other users to avoid privilege escalation risks. If you select a file of another user, the system displays a risk message.
- Click Start.
For details about the comparison results, see Comparison Results.