--framework
Description
Sets the framework of the original network model.
See Also
None
Argument
Argument:
- 0: Caffe
- 1: MindSpore
- 3: TensorFlow
- 5: ONNX
Restrictions:
- This option is optional when --mode is set to 1. The TensorFlow, Caffe, and ONNX frameworks are supported. If this option is included, ensure that the model specified by --om matches the framework specified by --framework. If this option is not included, the offline model is converted to a .json file. See the following examples:
--mode=1 --framework=0 --om=$HOME/module/resnet50.prototxt
--mode=1 --framework=3 --om=$HOME/module/resnet50_tensorflow*.pb
--mode=1 --framework=5 --om=$HOME/module/resnet50.onnx
- This option is required when --mode is set to 0 or 3. The Caffe, TensorFlow, MindSpore, and ONNX frameworks are supported.
- If this option is set to 0, the Caffe network model is used. The model includes a .prototxt model file and a .caffemodel weight file. The op name and op type fields of the two files must be the same (case-sensitive).
- If this option is set to 3, the TensorFlow network model is used. Only the FrozenGraphDef format (a model file with the .pb extension) is supported. The .pb file is stored in Protobuf format; the network model and weight data are stored in the same file.
- If this option is set to 5, the ONNX network model is used. Operators of opset v11 to v15 in the ai.onnx operator domain are supported. Note that a .pth model from the PyTorch framework must first be converted into an ONNX model before model conversion.
- If this option is set to 1, the MindSpore framework is used. In this case, note the following restrictions:
- Only model files in *.air format are convertible.
- --mode can only be set to 0.
- --input_format can only be set to NCHW. Other values do not take effect, although the model is still converted successfully.
- Under the MindSpore framework, --out_nodes, --is_output_adjust_hw_layout, --input_fp16_nodes, --is_input_adjust_hw_layout, and --op_name_map do not take effect, although the model is still converted successfully.
- If the model size exceeds 2 GB, saving the model in the MindSpore framework generates the *.air file plus a weight folder containing the weight files. When converting the model, ensure that the weight folder is in the same directory as the *.air file; otherwise, errors are reported during model conversion.
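The value-to-framework mapping and the model-file extensions described above can be sketched as a small validation helper. This is illustrative only; the dictionary and function below are not part of the atc tool itself:

```python
# Illustrative sketch only: map --framework values to framework names and the
# model-file extensions described above. Not part of the atc tool itself.
FRAMEWORKS = {
    0: ("Caffe", (".prototxt",)),
    1: ("MindSpore", (".air",)),
    3: ("TensorFlow", (".pb",)),
    5: ("ONNX", (".onnx",)),
}

def check_model_for_framework(framework: int, model_path: str) -> str:
    """Return the framework name if model_path matches its expected extension."""
    if framework not in FRAMEWORKS:
        raise ValueError(f"unsupported --framework value: {framework}")
    name, extensions = FRAMEWORKS[framework]
    if not model_path.endswith(extensions):
        raise ValueError(f"{name} (--framework={framework}) expects {extensions}")
    return name

print(check_model_for_framework(5, "resnet50.onnx"))  # → ONNX
```

A check of this kind catches the mismatch case described under Restrictions, where the model file passed to --model (or --om) does not correspond to the framework value.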
Suggestions and Benefits
None
Example
- Caffe:
--mode=0 --framework=0 --model=$HOME/module/resnet50.prototxt --weight=$HOME/module/resnet50.caffemodel --output=$HOME/module/out/caffe_resnet50 --soc_version=<soc_version>
- MindSpore:
--mode=0 --framework=1 --model=$HOME/module/ResNet50.air --output=$HOME/module/out/ResNet50_mindspore --soc_version=<soc_version>
- TensorFlow:
--mode=0 --framework=3 --model=$HOME/module/resnet50_tensorflow*.pb --output=$HOME/module/out/tf_resnet50 --soc_version=<soc_version>
- ONNX:
--mode=0 --framework=5 --model=$HOME/module/resnet50.onnx --output=$HOME/module/out/onnx_resnet50 --soc_version=<soc_version>
Applicability
Dependencies and Restrictions
None
Parent topic: Input Options