What Should I Do If "build_main build graph[infer_graph_info] failed" Is Displayed During Model Conversion When the Quantization Function Is Enabled
Symptom
During model conversion, the quantization function is enabled using the --compression_optimize_conf parameter. The conversion fails and an error message similar to the following is displayed:

build_main build graph[infer_graph_info] failed
Solutions
The likely cause is that the original model fails to compile even without quantization. To confirm, disable the quantization function (remove the --compression_optimize_conf parameter), convert the model again, and locate the root cause from the resulting error. After the original model converts successfully with quantization disabled, enable the function again.
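The two-step check above can be sketched as the following ATC invocations. This is a minimal illustration only: the model file, framework type, output name, SoC version, and configuration file name are placeholders and must be adjusted to your environment.

```shell
# Step 1: convert WITHOUT quantization to check whether the base model
# compiles at all. Placeholder values: model.onnx, Ascend310, model_noquant.
atc --framework=5 \
    --model=model.onnx \
    --output=model_noquant \
    --soc_version=Ascend310

# Step 2: once Step 1 succeeds, re-enable quantization by passing the
# compression configuration file again (placeholder name: compression.cfg).
atc --framework=5 \
    --model=model.onnx \
    --output=model_quant \
    --soc_version=Ascend310 \
    --compression_optimize_conf=compression.cfg
```

If Step 1 also fails, the problem lies in the base model conversion rather than in the quantization configuration, and the error reported in Step 1 should be resolved first.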
Parent topic: FAQs