Accuracy Pre-check

Prerequisites

Complete Accuracy Collection to obtain accuracy data from the NPU environment.

Pre-check

  1. Copy the accuracy data dumped from the NPU environment to the GPU environment.
  2. Run the run_ut command in both the GPU and NPU environments to perform the pre-check. (In the pre-check scenario, the GPU environment must use the accuracy data copied from the NPU environment.)
    msprobe -f pytorch run_ut -api_info ./dump_data/step0/rank/dump.json

    If "Successfully completed run_ut/multi_run_ut" is displayed, the run is complete and two CSV files are generated in the execution path: accuracy_checking_details_{timestamp}.csv and accuracy_checking_result_{timestamp}.csv. They contain the comparison results between the NPU (or GPU) and the CPU. These are intermediate results of the pre-check; the final pre-check result is available only after the next step is complete.

  3. Compare the pre-check results of the NPU and GPU to check whether any APIs have accuracy issues on the NPU.
    Copy the accuracy_checking_details_{timestamp}.csv file generated in the GPU environment to the NPU environment, and then run the following command:
    msprobe -f pytorch api_precision_compare -npu ./npu/accuracy_checking_details_{timestamp}.csv -gpu ./gpu/accuracy_checking_details_{timestamp}.csv -o ./compare_result/accuracy_checking
  4. Analyze the pre-check result.

    api_precision_compare generates two CSV files in the ./compare_result/accuracy_checking directory.

    • The api_precision_compare_result_{timestamp}.csv file indicates whether each API has passed the test.
    • The api_precision_compare_details_{timestamp}.csv file records the status of each API under each comparison algorithm.

      For descriptions of the other fields in the comparison results, see Pre-check Result Comparison.
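When triaging the comparison output, it is usually enough to list the APIs that did not pass and investigate those first. The sketch below shows one way to do that with Python's standard csv module. Note that the column names ("API Name", "Compare Result") and the toy rows are illustrative assumptions, not the tool's documented schema; check the header of your actual api_precision_compare result file and adjust the field names accordingly.

```python
import csv
import io

# Toy stand-in for an api_precision_compare result CSV.
# Column names and values are assumptions for illustration only;
# inspect your real file's header and adapt the field names.
toy_csv = """API Name,Compare Result
Torch.matmul.0.forward,pass
Torch.conv2d.1.forward,error
Torch.relu.2.forward,pass
"""

def failed_apis(fp):
    """Return the names of APIs whose overall result is not 'pass'."""
    return [row["API Name"]
            for row in csv.DictReader(fp)
            if row["Compare Result"].strip().lower() != "pass"]

# APIs to investigate first
print(failed_apis(io.StringIO(toy_csv)))
```

In a real run you would open the generated file instead of the in-memory string, e.g. `failed_apis(open("api_precision_compare_result_{timestamp}.csv"))` with the actual timestamp filled in.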