npu.ops.gelu

Description

Computes the Gaussian Error Linear Unit (GELU) activation function. Each element x of the input tensor is multiplied by Phi(x) = P(X <= x), where X follows the standard normal distribution N(0, 1).
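The definition above can be sketched as a plain-Python reference, using the identity Phi(x) = 0.5 * (1 + erf(x / sqrt(2))); the function name `gelu_reference` is illustrative, not part of the npu API:

```python
import math

def gelu_reference(x):
    # GELU(x) = x * Phi(x), where Phi is the CDF of N(0, 1),
    # computed here via the error function erf.
    return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
```

For example, gelu_reference(0.0) is 0.0 (Phi(0) = 0.5 scales 0), and large positive inputs pass through nearly unchanged because Phi(x) approaches 1.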

Prototype

npu.ops.gelu(x)

Parameters

Parameter | Input/Output | Description
x | Input | Input tensor of type float.

Returns

Result tensor after the GELU operation is applied to input x. The data type is the same as that of the input.

Example

import npu_device as npu

# x is an existing input tensor of type float
output = npu.ops.gelu(x)