Configuration Procedure
Perform the following steps to start MindStudio Advisor and configure it for analysis:
- Open a built application project.
C/C++ and Python application projects are supported. Python application projects do not need to be compiled.
- Choose from the menu bar. The MindStudio Advisor page is displayed. See Figure 1.
- Click the icon in the upper left corner of the page shown in Figure 1 to open the configuration page, on which you can select the Remote Run or Local Run mode. In each mode, you can select OM only or Inference App. See the following figures.
Figure 2 Remote Run > OM only (Linux)
Figure 3 Remote Run > Inference App (Linux)
Figure 4 OM only (Windows)
Figure 5 Inference App (Windows)
- After the configuration is complete, the Deployment setting is bound to the Environment Variables and Remote CANN Path parameters. Click Start to save the parameter settings. If Deployment has already been configured when you reconfigure, the Environment Variables and Remote CANN Path parameters are set automatically and can be modified manually.
- When configuring mappings for Ascend Deployment, you must specify a local path that actually exists. Otherwise, MindStudio Advisor cannot be started and the message "The specified Deployment Mappings Local Path does not exist." is displayed.
Figure 6 Local Run > OM only
Figure 7 Local Run > Inference App
- In OM only mode, you do not need to specify an executable file and can directly start MindStudio Advisor for analysis. The analysis result, however, may not fully reflect the actual behavior of the user application. In Inference App mode, the executable file (binary file main or Python script file) must be specified for the Executable File parameter. The model of the user application can then be analyzed accurately, but you need to complete application development in advance. Select a mode as required.
- To avoid privilege escalation risks, you are advised not to invoke executable files or OM model files from other users' directories. If you select a file of another user, the system displays a risk message.
- Windows OSs support only the Remote Run mode.
Table 1 Executable Properties parameters
Run Mode
- Remote Run
- Local Run
In Windows OSs, only Remote Run is supported and this parameter is not displayed.
Deployment
Run configuration. This parameter is available only when Remote Run is selected. You can use the Deployment function to synchronize the files and folders in a specified project to a specified directory on a remote device. For details, see Ascend Deployment.
Project Location
Directory for storing the .json file after analysis. The default value is ${HOME}/AscendProjects/AscendAdvisor/untitled. It can be customized.
Analysis Model
Analysis execution model. The options are as follows:
- OM only: analyzes only the selected OM model. Executable files in this mode are built-in standard executable files and do not need to be manually specified.
- Inference App: analyzes the OM model in the current user's application. The executable file (binary file main or Python script file) must be specified for Executable File.
In OM only mode, only standardization analysis is performed on a model. To perform precise analysis on the model, specify the Inference App mode.
OM Location
OM model file. For details about how to obtain the file, see Mapping Between Analysis Functions and Inputs.
CCE Code Location
Directory for storing CCE files. For details about how to obtain the directory, see Mapping Between Analysis Functions and Inputs. If this parameter is not specified, the analysis result of MindStudio Advisor may be inaccurate. Currently, this parameter is required only for Roofline model-based operator bottleneck identification and tuning suggestion.
Executable File
Directory of the executable file of the target project analyzed by MindStudio Advisor. The binary file main or a Python script file can be specified. For details about how to obtain the file, see In the Linux Scenario. This parameter is displayed only in Inference App mode. In Remote Run mode, ensure that the specified executable file can be built and executed in the remote environment.
Command Arguments
Command arguments of user applications. Configure them as required and separate them with spaces. By default, these arguments are left blank. This parameter is available when Analysis Model is set to Inference App.
SoC Version
Processor version of the device to which the file to be analyzed belongs. Example: Ascendxxx.
Select a processor type based on the environment to which the input data belongs. You can run the npu-smi info command to view the processor type in an environment.
Environment Variables
Environment variable configuration. Enter environment variables in the text box, or click the icon next to the text box and add more in the displayed dialog box.
Remote CANN Path
Installation path of the CANN package in the remote operating environment. This parameter is available only in Remote Run mode. For example, set this parameter to Ascend-cann-toolkit_installation_path/ascend-toolkit/{version}.
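As an illustration of the Environment Variables and Remote CANN Path parameters, the following is a hedged sketch of a typical remote-environment setup. The install root /usr/local/Ascend and the use of set_env.sh are assumptions based on a default CANN toolkit installation; adjust the paths and {version} to match your environment.

```shell
# Hypothetical default install root; replace with your actual
# Ascend-cann-toolkit installation path.
CANN_ROOT=/usr/local/Ascend/ascend-toolkit

# The CANN toolkit ships a set_env.sh that exports the variables the
# runtime needs (PATH, LD_LIBRARY_PATH, PYTHONPATH, and so on).
source ${CANN_ROOT}/set_env.sh

# The Remote CANN Path parameter would then point at the versioned
# directory under the same root, e.g. ${CANN_ROOT}/{version}.
```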
- The analysis process of MindStudio Advisor is complex and generates many temporary files, so reserve sufficient memory before starting the analysis; otherwise, the process may be terminated. You can log in to the server and check the size of ${data_path}/data/project (the MindStudio Advisor data directory). For example, if the data directory is 100 MB, reserve 1 GB of memory.
- Data files for analysis: The size of the operator simulation file, profile data file, or CCE file cannot exceed 100 MB. The size of the offline OM model file cannot exceed 1 GB.
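The two notes above amount to a simple pre-flight check. The following is a minimal Python sketch of that check; the 10x memory heuristic is inferred from the 100 MB to 1 GB example in the text, and the function names are illustrative, not part of MindStudio Advisor.

```python
# Limits stated in the documentation (bytes).
MAX_INPUT_FILE = 100 * 1024 * 1024   # simulation/profile/CCE files: 100 MB
MAX_OM_FILE = 1024 * 1024 * 1024     # offline OM model file: 1 GB


def check_input_sizes(input_files, om_file_size):
    """Return human-readable violations; empty list means all inputs fit.

    `input_files` maps a file label to its size in bytes, and
    `om_file_size` is the OM model file size in bytes.
    """
    problems = [f"{name} exceeds 100 MB"
                for name, size in input_files.items()
                if size > MAX_INPUT_FILE]
    if om_file_size > MAX_OM_FILE:
        problems.append("OM model file exceeds 1 GB")
    return problems


def memory_to_reserve(data_dir_bytes):
    """Rough reserve estimate: ~10x the ${data_path}/data/project size,
    following the 100 MB -> 1 GB example in the documentation."""
    return data_dir_bytes * 10
```

For example, a 50 MB CCE file with a 500 MB OM model passes the check, while a 200 MB profile file with a 2 GB OM model reports two violations.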
- After the configuration is complete, click Start to start the analysis. After the analysis is complete, the analysis result is displayed, as described in Analysis Result Display.
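For the SoC Version parameter in Table 1, the processor type is read from the output of the npu-smi info command. The following Python sketch shows one way to turn that output into an Ascendxxx string; the regular expression and the sample layout are assumptions, since the exact table format of npu-smi varies by driver version, so treat this as a starting point only.

```python
import re
import subprocess
from typing import Optional


def detect_soc_version(npu_smi_output: str) -> Optional[str]:
    """Extract the first Ascend chip token from `npu-smi info` output.

    Assumes the chip appears as a token such as "310P3" or "910B"
    somewhere in the table; returns None if nothing matches.
    """
    match = re.search(r"\b(\d{3}[A-Z]?\d?)\b", npu_smi_output)
    return f"Ascend{match.group(1)}" if match else None


# In a real environment you would capture the output like this:
# output = subprocess.run(["npu-smi", "info"],
#                         capture_output=True, text=True).stdout
# print(detect_soc_version(output))
```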
