Software Package Upload

Upload the Ascend-cann-amct_{software version}_linux-{arch}.tar.gz software package to any directory (for example, $HOME/amct/) on the Linux server as the installation user.

Decompress the AMCT package:

tar -zxvf Ascend-cann-amct_{software version}_linux-{arch}.tar.gz
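The decompress step can be exercised end to end with a stand-in archive; the directory layout below is a hypothetical miniature of Table 1, not the real package, whose file name follows Ascend-cann-amct_{software version}_linux-{arch}.tar.gz.

```shell
# Work in a throwaway directory
demo_dir="$(mktemp -d)"
cd "$demo_dir"

# Build a stand-in archive containing level-1 directories like those in Table 1
mkdir -p pkg/amct_onnx pkg/conf pkg/latest_manager
tar -zcf Ascend-cann-amct_demo_linux-x86_64.tar.gz -C pkg .

# Decompress it with the same flags used for the real AMCT package
mkdir extracted
tar -zxvf Ascend-cann-amct_demo_linux-x86_64.tar.gz -C extracted

# The level-1 directories are now visible
ls extracted
```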

The decompressed package contains the following parts.

Table 1 Extracted parts of the AMCT package

amct_caffe/ (Caffe AMCT directory)
  Contents:
    amct_caffe-{version}-py3-none-linux_{arch}.whl: Caffe AMCT package
    caffe_patch.tar.gz: Caffe patch package
  Use and precautions:
    • For details, see AMCT (Caffe).
    • A Caffe environment must be set up. Perform inference on a quantized model in the inference environment powered by the Ascend AI Processor.

amct_tf/ (TensorFlow AMCT directory)
  Contents:
    amct_tensorflow-{version}-py3-none-linux_{arch}.tar.gz: TensorFlow AMCT package, installed from source code
    amct_tensorflow_ascend-{version}-py3-none-linux_{arch}.tar.gz: TF Adapter AMCT package, installed from source code
  Use and precautions (amct_tensorflow):
    • amct_tensorflow and amct_tensorflow_ascend are mutually exclusive.
    • For details, see AMCT (TensorFlow).
    • A TensorFlow environment must be set up. Perform inference on a quantized model in the inference environment powered by the Ascend AI Processor.
  Use and precautions (amct_tensorflow_ascend):
    • amct_tensorflow and amct_tensorflow_ascend are mutually exclusive.
    • Only certain products support the features in this directory.
    • For details, see AMCT (TensorFlow, Ascend).
    • A TensorFlow environment must be set up. Online inference can be performed on a quantized model based on the Ascend AI Processor in the online inference environment with NPUs.

amct_pytorch/ (PyTorch AMCT directory)
  Contents:
    amct_pytorch-{version}-py3-none-linux_{arch}.tar.gz: PyTorch AMCT source package
  Use and precautions:
    • For details, see AMCT (PyTorch).
    • A PyTorch environment must be set up. Perform inference on a quantized model in the inference environment powered by the Ascend AI Processor.

amct_onnx/ (ONNX AMCT directory)
  Contents:
    amct_onnx-{version}-py3-none-linux_{arch}.whl: ONNX AMCT package
    amct_onnx_op.tar.gz: ONNX Runtime AMCT custom operator package
  Use and precautions:
    • For details, see AMCT (ONNX).
    • An ONNX Runtime environment must be set up. Perform inference on a quantized model in the inference environment powered by the Ascend AI Processor.

conf/
  Records package installation information. You do not need to pay attention to it.

latest_manager/
  Public script used for installation and upgrade. You do not need to pay attention to it.

third_party/
  Contents:
    Ascend-nca-{software version}-linux.{arch}.run: NCA runfile
  Use and precautions:
    • Applicable to performance-based automatic quantization. This version does not support the features of third_party.

{version} indicates the AMCT version. {software version} indicates the software version. {arch} indicates the OS architecture.
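The {arch} placeholder must match the server's OS architecture; a minimal sketch of resolving it with uname (the resulting package name is an illustration of the placeholder substitution, not a command from this guide):

```shell
# Resolve the {arch} placeholder from the machine architecture
# (typically x86_64 or aarch64 on supported Linux servers).
arch="$(uname -m)"
echo "Expected package name: Ascend-cann-amct_{software version}_linux-${arch}.tar.gz"
```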