Check whether the dropout and gelu interfaces in the original network have been replaced with the corresponding NPU interfaces. If not, replace them as shown in the examples below.
Migrated code for dropout:
from npu_bridge.estimator import npu_ops
layers = npu_ops.dropout()
For Keras-style dropout layers, import the conversion module, which replaces them with the NPU implementation:
from npu_bridge.estimator.npu import npu_convert_dropout
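To make the intended semantics concrete without NPU hardware, here is a minimal NumPy sketch of standard dropout behavior (each element kept with probability `keep_prob` and scaled by `1/keep_prob`, as in `tf.nn.dropout`). The function name `dropout_reference` is illustrative only and is not part of npu_bridge:

```python
import numpy as np

def dropout_reference(x, keep_prob, rng=None):
    # Standard dropout semantics: keep each element with probability
    # keep_prob and rescale kept elements by 1/keep_prob so the
    # expected value of the output matches the input.
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(x.shape) < keep_prob
    return np.where(mask, x / keep_prob, 0.0)

x = np.ones((1000,), dtype=np.float32)
y = dropout_reference(x, keep_prob=0.8, rng=np.random.default_rng(0))
# Kept elements become 1/0.8 = 1.25; dropped elements become 0,
# so the mean stays close to 1.0.
```

This is only a reference for checking numerical behavior; in the migrated network the NPU interface above is what should be called.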
Original code for gelu:
import numpy as np
import tensorflow as tf
def gelu(x):
    cdf = 0.5 * (1.0 + tf.tanh(np.sqrt(2 / np.pi) * (x + 0.044715 * tf.pow(x, 3))))
    return x * cdf
layers = gelu(x)
Migrated code:
from npu_bridge.estimator.npu_unary_ops import npu_unary_ops
layers = npu_unary_ops.gelu(x)
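For reference, the tanh-approximation formula from the original gelu() can be sketched in plain NumPy to sanity-check values against the NPU result. The function name `gelu_tanh` is illustrative, not an npu_bridge API:

```python
import numpy as np

def gelu_tanh(x):
    # Tanh approximation of GELU, the same formula as the original
    # network's gelu(): 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    cdf = 0.5 * (1.0 + np.tanh(np.sqrt(2 / np.pi) * (x + 0.044715 * np.power(x, 3))))
    return x * cdf

# gelu_tanh(0.0) is exactly 0; for large |x| the function approaches
# x (positive side) or 0 (negative side).
```

Comparing a few such values against `npu_unary_ops.gelu(x)` on device is a quick way to confirm the replacement is numerically equivalent.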