Using an AscendCL Sample Project
This section describes how to use the AscendCL sample project ACL ResNet-50(C++) or ACL ResNet-50(Python). For more examples, see https://gitee.com/ascend/samples.
Description
This sample shows how to classify images based on the Caffe ResNet-50 network (single input with batch size = 1).
Convert the model file of the Caffe ResNet-50 network into an .om offline model adapted to the Ascend AI Processor. In the sample, the .om file is loaded, synchronous inference is performed on two .jpg images, the inference results are obtained and processed, and the class indexes with the top 5 confidence values are output.
Principles
The following table lists the key functions involved in this sample. Take C++ APIs as an example. For details about the APIs, see the Application Software Development Guide (C&C++) or Application Software Development Guide (Python).
| Stage | Description |
|---|---|
| Initialization | AscendCL initialization and deinitialization. |
| Device Management | Specifying the device used for computation and resetting it after use. |
| Context Management | Creating and destroying the context that holds runtime resources. |
| Stream Management | Creating and destroying the stream that maintains the execution order of tasks. |
| Memory Management | Allocating and freeing device memory. |
| Data Transfer | aclrtMemcpy (for running the application on the host). If the application runs on the device or on the developer board, data transfer is not needed. |
| Model Inference | Loading the .om model, executing synchronous inference, and unloading the model. |
| Data Postprocessing | Processing the model inference result and displaying the class indexes with the top 5 confidence values on the device. |

The sample also provides a custom API DumpModelOutputResult, which writes the model inference result to a file (after the executable file is executed, the inference result file is generated in the directory of the executable file in the operating environment). This API is not called by default. To call it, add the following code before the call to OutputModelResult in sample_process.cpp:

```cpp
// Print the top 5 confidence values with indexes: use DumpModelOutputResult
// if you want to dump the output result to a file in the current directory.
processModel.DumpModelOutputResult();
processModel.OutputModelResult();
```
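The top-5 postprocessing stage above can be sketched in plain Python (a hedged illustration of the idea, not the sample's C++ code): given a buffer of per-class confidence values, select the indexes of the five largest values in descending order.

```python
def top_k_indices(scores, k=5):
    """Return the indices of the k largest confidence values, best first."""
    return sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:k]

# Toy example: pretend these are 8 class confidences from the model output.
scores = [0.01, 0.30, 0.05, 0.90, 0.10, 0.70, 0.02, 0.40]
print(top_k_indices(scores))  # indexes ordered by descending confidence
```

In the real sample the scores come from the model's output buffer (e.g. 1000 classes for ResNet-50); the selection logic is the same.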
ACL ResNet-50(C++) Usage Guide
- Create the ACL ResNet-50(C++) sample project by referring to Creating an Application Project.
- Prepare model files and data.
- Obtain model files.
- Convert your model.
Before adding model files, convert the model trained in a third-party framework to an offline model adapted to the Ascend AI Processor (*.om). For details about the model conversion function, see Model Conversion.
- On the MindStudio top menu bar, choose Ascend > Model Converter. The Model Converter page is displayed.
- Set the model conversion information as follows:
- Model File: Select the uploaded ResNet-50 network model file (*.prototxt).
- Weight File: Select the uploaded ResNet-50 weight file (*.caffemodel).
- Model Name: Enter a model name, for example, resnet50.
- Target SoC Version: Specify the value as required.
- OutputPath: Select the output directory of the generated .om model. The default output path is $HOME/modelzoo/${Model Name}/${Target SoC Version}/.
- Click Next. In this sample, do not configure the Data Pre-Processing function. Directly click Next to go to the next step.
- On the Advanced Options Preview page, click Command Preview to check the generated model conversion command. After confirmation, click Finish to start model conversion.
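The Command Preview step shows the ATC command that MindStudio generates for the conversion. As a rough sketch of what such a command contains (the file names and SoC version below are placeholders, not values mandated by this guide; --framework=0 selects Caffe in the ATC tool), the command could be assembled like this:

```python
def build_atc_command(model, weight, output, soc_version):
    """Assemble an ATC conversion command line for a Caffe model.

    The paths and SoC version are illustrative placeholders.
    """
    return [
        "atc",
        "--framework=0",                 # 0 = Caffe
        f"--model={model}",              # *.prototxt network definition
        f"--weight={weight}",            # *.caffemodel weight file
        f"--output={output}",            # output .om path (without extension)
        f"--soc_version={soc_version}",
    ]

cmd = build_atc_command("resnet50.prototxt", "resnet50.caffemodel",
                        "resnet50", "Ascend310")
print(" ".join(cmd))
```

Compare the assembled options against the Command Preview page before clicking Finish.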
- Add one or more model files.
- Prepare inference data. Obtain the input images of this sample from the following links and perform operations based on the actual application scenario.
Links: data image 1 and data image 2
- Linux application scenario
- Upload the obtained file to the project_directory/data directory in the development environment as the running user. If the directory does not exist, create one.
- Log in to the development environment as the running user.
- Process the image data in either of the following ways:
- Configure the Python interpreter. For details, see Python SDK Dependency Settings. In the MindStudio project directory, right-click script/transferPic.py and choose the run command from the shortcut menu. On the displayed run configuration page, set Working directory to project_directory/data and click OK. Then right-click transferPic.py again and choose the run command from the shortcut menu.
- Switch to the project_directory/data directory and run the transferPic.py script.
python3 ../script/transferPic.py
The script converts the .jpg files to .bin files and resizes the images from 1024 x 683 to 224 x 224. Find the generated .bin files in the project_directory/data directory.
If you get the following error message during script execution, the Pillow library is missing. Run the pip3 install Pillow --user command to install the Pillow library.
ModuleNotFoundError: No module named 'PIL'
- Windows application scenario
- After obtaining the images, perform the following steps to convert the image format. Assume that the images are stored in C:\Users\user1\AscendProjects\MyApp1\data.
- Open the system CLI.
- Run the following command to go to the C:\Users\user1\AscendProjects\MyApp1\data directory:
cd C:\Users\user1\AscendProjects\MyApp1\data
- Run the following command to convert the image format:
python ..\script\transferPic.py
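The job transferPic.py performs (resize each image to the 224 x 224 model input and dump the raw pixel bytes as a .bin file) can be sketched as follows. This is a hedged approximation using Pillow, not the shipped script; the real script's channel order and preprocessing may differ.

```python
TARGET_SIZE = (224, 224)  # model input width x height

def expected_bin_size(width=224, height=224, channels=3):
    """Size in bytes of one uint8 RGB image dumped as a raw .bin file."""
    return width * height * channels

def jpg_to_bin(src_path, dst_path, size=TARGET_SIZE):
    """Resize a .jpg to the model input size and write its raw pixel bytes.

    Requires Pillow (pip3 install Pillow --user); the import is kept
    inside the function so the module loads even without Pillow.
    """
    from PIL import Image
    img = Image.open(src_path).convert("RGB").resize(size)
    with open(dst_path, "wb") as f:
        f.write(img.tobytes())
```

A 224 x 224 RGB image dumped this way is 150528 bytes, which is a quick sanity check for the generated .bin files.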
- Build the application project. On the MindStudio project page, choose the build option from the top menu bar.
- Set build parameters and click the add icon to add a configuration.
The configuration marked with (default) in the list is the default one. To make a custom configuration the default, select it and click the corresponding icon.
Figure 2 Build configuration
Table 1 Description of build configuration parameters

| Parameter | Description |
|---|---|
| Name | Build configuration name. |
| Build Type | Configuration type. The options are Debug and Release. |
| Toolchain | Toolchain configurator, which supports local and remote build. For details about the configuration, see Toolchains. |
| CMake options | CMake options. The default value is -DCMAKE_BUILD_TYPE=Debug -DCMAKE_SKIP_RPATH=TRUE. |
| Build directory | Relative path of the build directory, relative to the project directory. |
| Build options | Build acceleration options. |
| Environment Variables | Environment variable configuration. You can configure the environment variables manually or click the edit icon to configure them in the displayed dialog box. |

- Click Save to save the build configuration.
- Start the build.
On the MindStudio project page, choose the build option from the top menu bar.
- Run the application project.
On the MindStudio project page, open the run configuration page from the top menu bar.
Click + in the upper right corner, create a CMake Application run configuration, and perform the following steps to configure it:
- Configure run parameters.
Figure 3 shows the configuration examples in the Linux application scenarios. After the configuration is complete, click OK to close the run configuration page. (Because a relative path is used in the sample project, set Working Directory to the directory where the main executable file is located.)
- On the MindStudio project page, choose Run > Run... from the top menu bar. In the dialog box displayed, select the created run configuration and run the application.
Figure 4 shows an example of the execution result. The indexes and values vary according to the actual situation of the operating environment.
If an error message is displayed during application execution, view the detailed logs of the operating environment. For details, see the Log Reference.
- The default log path is $HOME/ascend/log. For details, see the "Log Files" section.
- The default log level is ERROR. For details about how to query or change the log level, see the "Log Level Setting" section.
ACL ResNet-50(Python) Usage Guide
- Create the ACL ResNet-50(Python) sample project by referring to Creating an Application Project.
- Prepare model files and data.
- Obtain model files.
- Convert your model.
Before adding model files, convert the model trained in a third-party framework into an .om offline model adapted to the Ascend AI Processor. For details about the model conversion function, see Model Conversion.
- On the MindStudio top menu bar, choose Ascend > Model Converter. The Model Converter page is displayed.
- Set the model conversion information as follows:
- Model File: Select the uploaded ResNet-50 network model file (*.prototxt).
- Weight File: Select the uploaded ResNet-50 weight file (*.caffemodel).
- Model Name: Enter a model name, for example, resnet50.
- Target SoC Version: Specify the value as required.
- OutputPath: Select the output directory of the generated .om model. The default output path is $HOME/modelzoo/${Model Name}/${Target SoC Version}/.
- Click Next. In this sample, do not configure the Data Pre-Processing function. Directly click Next to go to the next step.
- On the Advanced Options Preview page, click Command Preview to check the generated model conversion command. After confirmation, click Finish to start model conversion.
- Add one or more model files.
- Prepare inference data. Obtain the input images of this sample from the following links and perform operations based on the actual application scenario.
Links: data image 1 and data image 2
- Linux application scenario
Upload the obtained file to the project_directory/data directory in the development environment as the running user. If the directory does not exist, create one.
- Windows application scenario
After obtaining the images, save them to the project_directory/data directory, for example, C:\Users\user1\AscendProjects\MyApp1\data.
- Run the application project. (For details about how to configure the Python interpreter, see Python SDK Dependency Settings.) On the MindStudio project page, open the run configuration page from the top menu bar.
- Configure run parameters.
Figure 5 shows a configuration example in the Linux application scenario. After the configuration is complete, click Apply to save the run configuration and click OK to close the run configuration dialog box.
- On the MindStudio project page, choose Run > Run... from the top menu bar. In the dialog box displayed, select the created run configuration and run the application.
Figure 6 shows an example of the execution result. The index and value vary according to the actual situation of the operating environment.
If an error message is displayed during application execution, view the detailed logs of the operating environment. For details, see the Log Reference.
- The default log path is $HOME/ascend/log. For details, see the "Log Files" section.
- The default log level is ERROR. For details about how to query or change the log level, see the "Log Level Setting" section.



