Analyzing Model Performance Bottlenecks

  1. Choose Ascend > Advisor from the menu bar. The MindStudio Advisor page is displayed. See Figure 1.
    Figure 1 MindStudio Advisor page
  2. Click the icon in the upper left corner of the page shown in Figure 1. The system configuration page is displayed. Figure 2 shows the parameters.
    Figure 2 New project configuration
    Table 1 Executable Properties parameters

    Run Mode

    Run mode of the analysis. The options are as follows:

    • Remote Run
    • Local Run

    In Windows OSs, only Remote Run is supported and this parameter is not displayed.

    Project Location

    Directory for storing the .json file generated after analysis. The default value is ${HOME}/AscendProjects/AscendAdvisor/untitled and can be customized.

    Analysis Model

    Analysis execution mode. The options are as follows:

    • OM only: analyzes only the selected OM model. The executable files in this mode are built-in standard executable files and do not need to be specified manually.
    • Inference App: analyzes the OM model in the current user's application. The executable file (main) must be specified using Executable File.

    In OM only mode, only standard analysis is performed on the model. To analyze the model precisely, use the Inference App mode.

    OM Location

    Path of the OM model file to be analyzed.

    CCE Code Location

    Directory for storing CCE files. If this parameter is not specified, the analysis result of MindStudio Advisor may be inaccurate. Currently, this parameter is required only for Roofline model-based operator bottleneck identification and tuning suggestions.

    SoC Version

    Processor version of the device to which the file to be analyzed belongs, for example, Ascendxxx. Select the processor type based on the environment where the input data resides. You can run the npu-smi info command to view the processor type of an environment.

    Environment Variables

    Environment variable configuration. Enter environment variables in the text box, or click the icon to add them in the dialog box that is displayed.
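The SoC Version value can be looked up on the device itself with the `npu-smi info` command mentioned above. As a minimal sketch, the snippet below extracts the chip name from command output; the sample text is a hypothetical stand-in for real `npu-smi info` output (the actual layout may differ), so run the command on the target device and read the chip column directly.

```shell
# Hypothetical sample of `npu-smi info` output; the real table layout
# may differ from this illustration.
sample='NPU  Name       Health
0    Ascendxxx  OK'

# Pick the second field of the first line containing "Ascend",
# i.e. the chip name to enter as SoC Version.
soc_version=$(printf '%s\n' "$sample" | awk '/Ascend/ {print $2; exit}')
echo "$soc_version"
```

On a real device you would replace the sample with the live command, e.g. `npu-smi info | awk '/Ascend/ {print $2; exit}'`.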

  3. After the configuration is complete, click Start to start the analysis. After the analysis is complete, the analysis result is displayed, as shown in Figure 3.
    Figure 3 Analysis result display

    Model Performance Report is a summary report of model performance. On this page, you can check the throughput and running time of the model, the AI Core usage, whether the model performance is good or poor, and whether the tiling policy is proper. According to the analysis result of ResNet-50, the processor usage is low and can be tuned.
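The throughput shown in the report can be sanity-checked by hand, since throughput is just samples processed divided by running time. The numbers below are illustrative, not taken from an actual ResNet-50 run:

```shell
# Hypothetical figures for a measured inference window.
batch_size=16        # images per inference call
num_batches=1000     # inference calls in the window
running_time_s=12.5  # total running time reported, in seconds

# throughput = (batch_size * num_batches) / running_time_s, in images/s
throughput=$(awk -v b="$batch_size" -v n="$num_batches" -v t="$running_time_s" \
    'BEGIN {printf "%.1f", b * n / t}')
echo "$throughput images/s"   # prints: 1280.0 images/s
```

If the report's throughput is far from such a back-of-the-envelope figure, the measured window likely includes warm-up or data-loading time.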

    For details about the fields in the analysis result and the analysis suggestions provided by MindStudio Advisor, see "MindStudio Advisor" > "Procedure" > "MindStudio Advisor Entry" > "Analysis Result Display" in the MindStudio User Guide.