--dynamic_batch_size
Description
Sets dynamic batch size profiles. This option applies to scenarios where the number of images per inference batch is not fixed.
See Also
This option must be used in conjunction with --input_shape and is mutually exclusive with --dynamic_image_size and --dynamic_dims.
Only shapes whose first dimension is N (the batch size) are supported; that is, the first dimension of the shape is set to -1. If N is not the first dimension, use --dynamic_dims instead.
Arguments
Argument: batch size profiles, for example, "1,2,4,8".
Format: Enclose the whole argument in double quotation marks (""), and separate the batch sizes by commas (,).
Restrictions: The number of batch size profiles is in the range (1, 100]: set at least 2 and at most 100 profiles, separated by commas (,). The value range of each batch size is [1, 2048].
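The restrictions above can be sketched as a small validation helper. This is illustrative Python only; the function name is hypothetical and is not part of the tool:

```python
def validate_batch_profiles(arg: str) -> list[int]:
    """Validate a --dynamic_batch_size argument string such as "1,2,4,8".

    Checks the documented restrictions: 2 to 100 profiles,
    each batch size in the range [1, 2048].
    """
    profiles = [int(p) for p in arg.split(",")]
    if not 2 <= len(profiles) <= 100:
        raise ValueError("set between 2 and 100 batch size profiles")
    for b in profiles:
        if not 1 <= b <= 2048:
            raise ValueError(f"batch size {b} is outside the range [1, 2048]")
    return profiles

print(validate_batch_profiles("1,2,4,8"))  # [1, 2, 4, 8]
```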
Suggestions and Benefits
- Excessively large batch sizes or too many batch size profiles can cause model tuning failures.
- If you have set large batch sizes or many batch size profiles, you are advised to run the swapoff -a command to disable the use of swap space as memory, preventing the operating environment from slowing down.
Examples
--input_shape="data:-1,3,416,416;img_info:-1,4" --dynamic_batch_size="1,2,4,8"
Here, -1 in --input_shape indicates that the batch size is dynamic. When the AOE tool tunes the model, the supported input combinations are as follows:
Profile 0: data(1,3,416,416)+img_info(1,4)
Profile 1: data(2,3,416,416)+img_info(2,4)
Profile 2: data(4,3,416,416)+img_info(4,4)
Profile 3: data(8,3,416,416)+img_info(8,4)
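The profile expansion shown above can be reproduced with a short sketch: each batch size profile is substituted for the -1 placeholder in every input shape. Illustrative Python; the helper name is an assumption, not part of the tool:

```python
def expand_profiles(input_shape: str, batch_sizes: list[int]) -> list[str]:
    """Substitute each batch size for the -1 placeholder in every input.

    input_shape follows the --input_shape format, e.g.
    "data:-1,3,416,416;img_info:-1,4".
    """
    inputs = []
    for item in input_shape.split(";"):
        name, dims = item.split(":")
        inputs.append((name, dims.split(",")))
    combos = []
    for n in batch_sizes:
        parts = []
        for name, dims in inputs:
            filled = [str(n) if d == "-1" else d for d in dims]
            parts.append(f"{name}({','.join(filled)})")
        combos.append("+".join(parts))
    return combos

for i, combo in enumerate(
    expand_profiles("data:-1,3,416,416;img_info:-1,4", [1, 2, 4, 8])
):
    print(f"Profile {i}: {combo}")
```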
Dependencies and Restrictions
- Restrictions:
If the number of images processed in each batch is not fixed, set this option to the candidate batch sizes. For example, to process two, four, or eight images per inference batch, set this option to "2,4,8". Memory is allocated based on the runtime batch size.
- API Restrictions:
If this option was used to set dynamic batch sizes during model tuning, the aclmdlSetDynamicBatchSize API must be called before aclmdlExecute to set the batch size for the current inference.
For details about how to use the aclmdlSetDynamicBatchSize API, see aclmdlSetDynamicBatchSize.
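As an illustration only, the runtime call order might look like the following pyACL-style sketch. The function names and return conventions are assumptions based on the Python binding of AscendCL and should be checked against the pyACL reference; model loading and dataset construction are elided:

```python
import acl  # pyACL, the Python binding of AscendCL (assumed available)

# Assumed to be created earlier via the model load and dataset APIs:
# model_id, model_desc, input_dataset, output_dataset = ...

# The dynamic-batch input tensor added during model conversion is
# conventionally named "ascend_mbatch_shape_data"; look up its index first.
index, ret = acl.mdl.get_input_index_by_name(model_desc,
                                             "ascend_mbatch_shape_data")

# Select one of the tuned profiles (e.g. batch size 8) BEFORE executing.
ret = acl.mdl.set_dynamic_batch_size(model_id, input_dataset, index, 8)

# Only now run inference; memory is allocated for the selected batch size.
ret = acl.mdl.execute(model_id, input_dataset, output_dataset)
```

The batch size passed at runtime must be one of the profiles set via --dynamic_batch_size.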