App Running

The sample procedure applies to the following products:

  • Atlas 200/300/500 Inference Product
  • Atlas Training Series Product

Model Conversion

  1. Log in to the development environment as the HwHiAiUser user (running user).
  2. Obtain the ATC tool and set environment variables by referring to ATC Instructions.
  3. Prepare data.

    Obtain the ResNet-50 weight file (.caffemodel) and model file (resnet50.prototxt) from either of the following sources, then upload both files to the caffe_model directory under the vpc_resnet50_imagenet_classification sample directory in the development environment as the HwHiAiUser user (running user):

    • Visit our ModelZoo repository on Gitee and find the download links in the README.md file.
    • Visit our ModelZoo repository on GitHub and find the download links in the README.md file.
  4. Convert the ResNet-50 network into an offline model (.om file) adapted to the Ascend AI Processor. During model conversion, you need to set color space conversion (CSC) parameters to convert YUV420SP images to RGB images.

    Go to the vpc_resnet50_imagenet_classification directory and run the following command, replacing Ascendxxx with the actual Ascend AI Processor version.

    atc --model=caffe_model/resnet50.prototxt --weight=caffe_model/resnet50.caffemodel --framework=0 --output=model/resnet50_aipp --soc_version=Ascendxxx --insert_op_conf=caffe_model/aipp.cfg
    • --output: The generated resnet50_aipp.om file is stored in the vpc_resnet50_imagenet_classification/model directory.
    • Before running the atc command, ensure that you have the write permission on the vpc_resnet50_imagenet_classification directory.
    • If the soc_version of the current device cannot be determined, run the npu-smi info command on the server where the NPU driver package is installed and prepend Ascend to the queried Name value. For example, if the value of Name is xxxyy, the actual soc_version is Ascendxxxyy.
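Step 4 above requires CSC parameters in the AIPP configuration passed via --insert_op_conf. For reference, a static AIPP configuration that converts YUV420SP input to RGB typically looks like the following sketch; the matrix coefficients shown are the common BT.601 values and the input size matches ResNet-50's 224 x 224 input, but treat the aipp.cfg shipped with the sample as authoritative.

```
aipp_op {
    aipp_mode : static
    input_format : YUV420SP_U8
    src_image_size_w : 224
    src_image_size_h : 224
    csc_switch : true
    rbuv_swap_switch : false
    matrix_r0c0 : 256
    matrix_r0c1 : 0
    matrix_r0c2 : 359
    matrix_r1c0 : 256
    matrix_r1c1 : -88
    matrix_r1c2 : -183
    matrix_r2c0 : 256
    matrix_r2c1 : 454
    matrix_r2c2 : 0
    input_bias_0 : 0
    input_bias_1 : 128
    input_bias_2 : 128
}
```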
  5. Convert the operator description files (.json) of the Cast and ArgMaxD operators into offline models (.om files) adapted to the Ascend AI Processor, which are used to verify single-operator execution.

    Go to the vpc_resnet50_imagenet_classification directory and run the following command, replacing Ascendxxx with the actual Ascend AI Processor version.

    atc --singleop=op_models/op_list.json --soc_version=Ascendxxx --output=op_models/
    • --output: specifies the directory for storing the generated .om files, that is, the vpc_resnet50_imagenet_classification/op_models directory.
    • Before running the atc command, ensure that you have the write permission on the vpc_resnet50_imagenet_classification directory.
    • If the soc_version of the current device cannot be determined, run the npu-smi info command on the server where the NPU driver package is installed and prepend Ascend to the queried Name value. For example, if the value of Name is xxxyy, the actual soc_version is Ascendxxxyy.
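The naming rule in the note above can be expressed as a small helper (pure illustration; the function name is hypothetical):

```python
def soc_version_from_name(name: str) -> str:
    """Build the --soc_version value by prepending 'Ascend' to the
    Name value queried with npu-smi info."""
    return "Ascend" + name.strip()

print(soc_version_from_name("xxxyy"))  # -> Ascendxxxyy
```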

    For details about the parameters, see the parameter description in ATC Instructions.
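For reference, a single-operator description file lists each operator together with its input/output tensor descriptions and attributes. A sketch of what op_list.json might contain for the Cast and ArgMaxD operators follows (shapes, dtypes, and the attribute value are illustrative; the file shipped with the sample is authoritative):

```json
[
    {
        "op": "Cast",
        "input_desc": [
            {"format": "ND", "shape": [1, 1000], "type": "float16"}
        ],
        "output_desc": [
            {"format": "ND", "shape": [1, 1000], "type": "float32"}
        ]
    },
    {
        "op": "ArgMaxD",
        "input_desc": [
            {"format": "ND", "shape": [1, 1000], "type": "float32"}
        ],
        "output_desc": [
            {"format": "ND", "shape": [1], "type": "int32"}
        ],
        "attr": [
            {"name": "dimension", "type": "int", "value": 1}
        ]
    }
]
```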

  6. As the running user (HwHiAiUser), upload the sample directory from the development environment to the operating environment.

App Running

  1. Log in to the operating environment.
  2. Prepare test data. Download the input images of this sample from the following links and upload them to the vpc_resnet50_imagenet_classification/data directory in the operating environment as the running user. If the directory does not exist, create it.
  3. Configure the operating environment by referring to Environment Variable Configuration.
  4. Run the following command in the vpc_resnet50_imagenet_classification directory:
    python3 ./src/acl_sample.py
  5. Check the execution result. On success, information similar to the following is displayed:
    Using device id:0
    model path:./model/resnet50_aipp.om
    images path:./data
    [Sample] init resource stage:
    [Sample] init resource stage success
    [Model] The class Model initializes resources:
    [Model] create output dataset:
    [Model] create output dataset success
    [Model] The class Model initializes resources successfully.
    [Sample] width:1024 height:683
    [Sample] image:./data/dog1_1024_683.jpg
    [Dvpp] vpc decode stage:
    [Dvpp] vpc decode stage success
    [Dvpp] vpc resize stage:
    [Dvpp] vpc resize stage success
    [Model] create model input dataset:
    [Model] create model input dataset success
    [Model] execute stage:
    [Model] execute stage success
    
    ======== top5 inference results: ========
    label:161  prob: 0.712891
    label:162  prob: 0.147095
    label:167  prob: 0.051636
    label:163  prob: 0.050476
    label:166  prob: 0.030136
    [SingleOP] single op cast success
    [SingleOp] get top 1 label success
    [SingleOP][ArgMaxOp] label of classification result is:161
    [Sample] width:1024 height:683
    [Sample] image:./data/dog2_1024_683.jpg
    [Dvpp] vpc decode stage:
    [Dvpp] vpc decode stage success
    [Dvpp] vpc resize stage:
    [Dvpp] vpc resize stage success
    [Model] create model input dataset:
    [Model] create model input dataset success
    [Model] execute stage:
    [Model] execute stage success
    
    ======== top5 inference results: ========
    label:267  prob: 0.855469
    label:266  prob: 0.049805
    label:219  prob: 0.032654
    label:265  prob: 0.013405
    label:129  prob: 0.011024
    [SingleOP] single op cast success
    [SingleOp] get top 1 label success
    [SingleOP][ArgMaxOp] label of classification result is:267
    [Model] The class Model releases resources successfully.
    [Dvpp] class Dvpp exit success
    [SingOp] class SingOp release source success
    [Sample] class Samle release source success
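The top-5 blocks in the log above can be reproduced from a model's raw output scores with a short post-processing sketch (hypothetical helper assuming a flat score vector per image; the sample's own post-processing may differ):

```python
import math

def top5(scores):
    """Return (label, probability) pairs for the five highest softmax scores."""
    m = max(scores)                                  # subtract max for numerical stability
    exp = [math.exp(s - m) for s in scores]
    total = sum(exp)
    probs = [e / total for e in exp]
    ranked = sorted(enumerate(probs), key=lambda kv: kv[1], reverse=True)
    return ranked[:5]

# Toy example with 5 classes instead of 1000:
for label, prob in top5([0.1, 2.0, 0.3, 1.5, 0.0]):
    print(f"label:{label}  prob: {prob:.6f}")
```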