ActivationParam
| Attribute | Type | Default | Description |
|---|---|---|---|
| activation_type | torch_atb.ActivationType | torch_atb.ActivationType.ACTIVATION_UNDEFINED | The default value is not usable; this parameter must be set explicitly by the user. |
| scale | float | 1.0 | - |
| dim | int | -1 | - |
| gelu_mode | torch_atb.ActivationParam.GeLUMode | torch_atb.ActivationParam.GeLUMode.TANH_MODE | - |
ActivationParam.GeLUMode
Enum values:
- TANH_MODE
- NONE_MODE
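A minimal sketch of selecting the GeLU mode when constructing the parameter, assuming the installed torch_atb build exposes an ACTIVATION_GELU entry in torch_atb.ActivationType (that entry name is not confirmed by this page):

```python
import torch_atb

# Hypothetical GELU configuration; ACTIVATION_GELU is assumed to exist in
# this torch_atb build and is not documented in the table above.
gelu_param = torch_atb.ActivationParam(
    activation_type = torch_atb.ActivationType.ACTIVATION_GELU,
    gelu_mode = torch_atb.ActivationParam.GeLUMode.TANH_MODE)  # tanh approximation
gelu_op = torch_atb.Operation(gelu_param)
```

With GeLUMode.NONE_MODE the operator would instead use the non-approximated GeLU formulation.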
Invocation Example
```python
import torch
import torch_atb

def activation():
    # Build the operation parameter: Swish activation with scale 1.0
    activation_param = torch_atb.ActivationParam(
        activation_type = torch_atb.ActivationType.ACTIVATION_SWISH,
        scale = 1.0)
    activation = torch_atb.Operation(activation_param)

    # Prepare a bfloat16 input tensor and move it to the NPU
    intensor = torch.rand(2, 3, 5).bfloat16()
    intensor_npu = intensor.npu()
    print("intensor: ", intensor_npu)

    def activation_run():
        activation_outputs = activation.forward([intensor_npu])
        return activation_outputs

    outputs = activation_run()
    print("outputs: ", outputs)

if __name__ == "__main__":
    activation()
```
Parent topic: OpParam