General Collection Commands

The msprof command-line tool collects and parses AI job runtime profile data, Ascend AI Processor system data, and other required data.

The general collection commands of msprof are the basis of profile data collection. This section describes the command options, the AI job files passed for each collection task, the path for storing the collected profile data, custom environment variables, and the maximum size of profile data that can be stored.

Prerequisites

  • Ensure that an AI project can run properly in the operating environment.
  • Ensure that operations in Before You Start have been completed.

Procedure (Ascend EP)

Log in to the environment where the Ascend-CANN-Toolkit is installed, and run one of the following commands to collect profile data:

  • Method 1 (recommended): Pass the user application or execution script by adding the AI job execution command to the end of the msprof command.
    msprof --output=/home/projects/output /home/projects/MyApp/out/main
  • Method 2: Pass the user application or execution script by adding the AI job execution command to the --application option.
    msprof --application="/home/projects/MyApp/out/main" --output=/home/projects/output

Procedure (Ascend RC)

Log in to the operating environment, go to the /var directory where the msprof tool is located, and run one of the following commands to collect profile data:

  • Method 1 (recommended): Pass the user application or execution script by adding the AI job execution command to the end of the msprof command.
    ./msprof --output=/home/projects/output /home/projects/MyApp/out/main
  • Method 2: Pass the user application or execution script by adding the AI job execution command to the --application option.
    ./msprof --application="/home/projects/MyApp/out/main" --output=/home/projects/output
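The two methods differ only in where the application command appears. As a rough sketch (the paths reuse the placeholders from the examples above, and the script only prints the assembled commands rather than executing them):

```shell
# Sketch: assemble both invocation forms from the placeholder paths
# used in the examples above; the commands are printed, not executed.
OUTPUT_DIR="/home/projects/output"
APP="/home/projects/MyApp/out/main"

# Method 1 (recommended): the application follows the msprof options.
CMD1="msprof --output=${OUTPUT_DIR} ${APP}"
echo "${CMD1}"

# Method 2: the application is passed through --application.
CMD2="msprof --application=\"${APP}\" --output=${OUTPUT_DIR}"
echo "${CMD2}"
```

Once verified, an assembled string can be executed with eval "${CMD1}" in an environment where msprof is on the PATH.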

Command-line options

Table 1 Commonly-used options. Each entry below gives the option, its description, and the supported models.

Option: <app> [app arguments] or --application (passing the user application)

The msprof tool provides two methods to pass the user application or execution script:

  • Method 1 (recommended): Pass the user application or execution script by adding the AI job execution command to the end of the msprof command.

    Format: msprof [msprof arguments] <app> [app arguments]

    Example 1: msprof --output=/home/projects/output main

    Example 2: msprof --output=/home/projects/output /home/projects/MyApp/out/main

    Example 3: msprof --output=/home/projects/output /home/projects/MyApp/out/main parameter1 parameter2

    Example 4: msprof --output=/home/projects/output /home/projects/MyApp/out/sample_run.sh parameter1 parameter2

    Example 5: msprof --output=/home/projects/output python3 /home/projects/MyApp/out/sample_run.py parameter1 parameter2

  • Method 2: Pass the user application or execution script by adding the AI job execution command to the --application option.

    Configuration examples:

    Inference scenario: msprof --application="/home/projects/MyApp/out/main parameter1 parameter2 ..."

    Training scenario: msprof --application="/home/projects/mindspore/scripts/run_standalone_train.sh parameter1 parameter2 ..."

If the parameters contain special characters, they cannot be parsed correctly. Method 1 is therefore recommended for passing the user application.

NOTE:
  • To avoid privilege escalation risks, do not configure AI jobs in directories owned by other users or in directories writable by other users. Do not configure high-risk operations such as deleting files or directories, changing passwords, or running privilege escalation commands. Do not use pmupload as the application name.
  • This option is mandatory if you collect all profile data, AI job runtime profile data, or msproftx data.
  • This option is optional if you collect Ascend AI Processor system data or host-side system data.

Supported models: Atlas 200/300/500 Inference Product, Atlas Training Series Product

Option: --output

Path for storing the collected profile data.

  • This option is optional if you collect all profile data or AI job runtime profile data.
  • This option is mandatory if you collect only Ascend AI Processor system data.

This option takes precedence over the ASCEND_WORK_PATH environment variable. For details, see Environment Variables.

The following special characters are not allowed in the path: "\n", "\\n", "\f", "\\f", "\r", "\\r", "\b", "\\b", "\t", "\\t", "\v", "\\v", "\u007F", "\\u007F", "\"", "\\\"", "'", "\'", "\\", "\\\\", "%", "\\%", ">", "\\>", "<", "\\<", "|", "\\|", "&", "\\&", "$", "\\$", ";", "\\;", "`", "\\`".

If --output is not configured, the default storage location depends on how the application is passed: with method 1 (application appended to the end of the msprof command), the profile data is written to the current directory; with method 2 (--application), it is written to the directory where the AI job file is located.
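As an illustration of the character restriction above (this check is not part of msprof, which performs its own validation), a shell function might screen a candidate --output path for a few of the forbidden characters:

```shell
# Sketch: reject an --output path containing some of the forbidden
# characters listed above (%, >, <, |, &, $, ;, backquote, quotes,
# backslash). Illustrative only; msprof validates paths itself.
check_output_path() {
    printf '%s' "$1" | grep -q '[%<>|&$;`"'"'"'\\]' && return 1
    return 0
}

check_output_path "/home/projects/output" && echo "path ok"
check_output_path "/home/projects/out|put" || echo "path rejected"
```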

Supported models: Atlas 200/300/500 Inference Product, Atlas Training Series Product

Option: --type

Format of the result file generated when the profile data collected by the msprof command is automatically parsed. The available formats are:

  • text: parsed into files in .json and .csv formats. For details, see Profile Data File References.
  • db: parsed into a .db file (msprof_timestamp.db) that summarizes all profile data.
    NOTE:
    • The .db format contains a different amount of information than the text format. The text format is recommended.
    • Data parsing in .db format is currently not supported in MindSpore scenarios.

The default value is text.

Supported models: Atlas 200/300/500 Inference Product (system data is not supported), Atlas Training Series Product

Option: --environment

Custom environment variables required in the operating environment during data collection. This option is optional.

To avoid privilege escalation risks, do not use directories owned by other users to overwrite the original environment variables.

The value format is --environment="${envKey}=${envValue}" or --environment="${envKey1}=${envValue1};${envKey2}=${envValue2}". Example: --environment="LD_LIBRARY_PATH=/home/HwHiAiUser/Ascend/nnrt/latest/lib64"
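The ';'-separated key=value format can be demonstrated with a small parser (a sketch only; msprof handles this internally, and MY_FLAG is an invented second variable):

```shell
# Sketch: split an --environment value of the form
# "key1=value1;key2=value2" into individual exports.
# MY_FLAG is an invented example variable.
ENV_VALUE="LD_LIBRARY_PATH=/home/HwHiAiUser/Ascend/nnrt/latest/lib64;MY_FLAG=1"

old_ifs=$IFS
IFS=';'
for pair in $ENV_VALUE; do
    key=${pair%%=*}      # text before the first '='
    value=${pair#*=}     # text after the first '='
    export "$key=$value"
    echo "exported $key=$value"
done
IFS=$old_ifs
```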

Supported models: Atlas 200/300/500 Inference Product, Atlas Training Series Product

Option: --storage-limit

Maximum size of the profile data files that can be stored in the specified disk directory. When the profile data files are about to exhaust either this limit or the remaining disk space (remaining space ≤ 20 MB), the oldest files are aged out (deleted). This option is optional.

The value range is [200, 4294967295], in MB, for example, --storage-limit=200MB. If this option is not set, the limit defaults to 90% of the available space on the disk where the profile data directory is located.
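The aging policy can be sketched as follows. msprof implements it internally; DATA_DIR and LIMIT_KB here are illustrative stand-ins for the profile data directory and the configured limit:

```shell
# Sketch of the aging policy described above: once the directory grows
# past the limit, delete the oldest files first. DATA_DIR and LIMIT_KB
# are illustrative defaults, not msprof settings.
DATA_DIR="${DATA_DIR:-/home/projects/output}"
LIMIT_KB="${LIMIT_KB:-204800}"   # 200 MB, the minimum --storage-limit value

age_oldest_files() {
    while [ "$(du -sk "$DATA_DIR" | cut -f1)" -gt "$LIMIT_KB" ]; do
        # ls -t sorts newest first; the last entry is the oldest file
        oldest=$(ls -t "$DATA_DIR" | tail -n 1)
        [ -n "$oldest" ] || break
        rm -f "$DATA_DIR/$oldest"
    done
}
```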

Supported models: Atlas Training Series Product

Option: --python-path

Path of the Python interpreter used for parsing. The Python version must be 3.7.5 or later. This option is optional.

If msprof is run by a user with high permissions, do not specify an interpreter path that is writable by low-permission users.
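A quick pre-check of the interpreter version can be sketched as follows ("python3" is an assumed interpreter name; substitute the path you intend to pass to --python-path):

```shell
# Sketch: verify that an interpreter meets the 3.7.5 minimum before
# passing it via --python-path. "python3" is an assumed default.
PY="${PY:-python3}"
if "$PY" -c 'import sys; raise SystemExit(0 if sys.version_info >= (3, 7, 5) else 1)'; then
    echo "interpreter ok for --python-path"
else
    echo "interpreter older than 3.7.5"
fi
```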

Supported models: Atlas 200/300/500 Inference Product, Atlas Training Series Product

Option: --help

Displays help information. This option is optional.

Supported models: Atlas 200/300/500 Inference Product, Atlas Training Series Product

By default, msprof automatically parses the collected profile data. For details about the exported data, see Profile Data Parsing and Export (msprof Command).