silu

paddle.nn.functional.silu(x: Tensor, inplace: bool = False, name: str | None = None) → Tensor [source]

The SiLU (Sigmoid Linear Unit) activation function, defined as

\[silu(x) = \frac{x}{1 + e^{-x}}\]

Where \(x\) is the input Tensor.
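Equivalently, silu multiplies the input by its sigmoid. The snippet below is a quick sanity check of this identity, assuming only paddle.allclose and F.sigmoid from paddle's public API:

>>> import paddle
>>> import paddle.nn.functional as F

>>> x = paddle.to_tensor([1.0, 2.0, 3.0, 4.0])
>>> # silu(x) == x * sigmoid(x); compare the manual form with the built-in
>>> manual = x * F.sigmoid(x)
>>> print(bool(paddle.allclose(manual, F.silu(x))))
True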

Note

Alias Support: The parameter name input can be used as an alias for x. For example, silu(input=tensor_x) is equivalent to silu(x=tensor_x).
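For example (a minimal sketch of the alias form described above):

>>> import paddle
>>> import paddle.nn.functional as F
>>> tensor_x = paddle.to_tensor([1.0, 2.0, 3.0, 4.0])
>>> # The keyword alias `input` gives the same result as the documented `x`
>>> out = F.silu(input=tensor_x)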

Parameters
  • x (Tensor) – The input Tensor with data type bfloat16, float16, float32, float64, complex64 or complex128. Alias: input.

  • inplace (bool, optional) – Whether to perform the operation in place, writing the result back into x. Default: False.

  • name (str|None, optional) – For details, please refer to api_guide_Name. Generally, no setting is required. Default: None.

Returns

A Tensor with the same data type and shape as x.

Examples

>>> import paddle
>>> import paddle.nn.functional as F

>>> x = paddle.to_tensor([1.0, 2.0, 3.0, 4.0])
>>> out = F.silu(x)
>>> print(out)
Tensor(shape=[4], dtype=float32, place=Place(cpu), stop_gradient=True,
[0.73105860, 1.76159406, 2.85772228, 3.92805505])

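When inplace=True, the result is also written back into x, so x holds the activated values after the call: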
>>> out = F.silu(x, inplace=True)
>>> print(out)
Tensor(shape=[4], dtype=float32, place=Place(cpu), stop_gradient=True,
[0.73105860, 1.76159406, 2.85772228, 3.92805505])
>>> print(x)
Tensor(shape=[4], dtype=float32, place=Place(cpu), stop_gradient=True,
[0.73105860, 1.76159406, 2.85772228, 3.92805505])