rectified linear unit

A 'rectified linear unit' (ReLU) is a widely used activation function in deep neural networks, known for its simplicity and effectiveness.


Definition

C2 Deep Learning

(technical, academic) An activation function in neural networks that outputs the input if it is positive, and zero otherwise.

Example

  • The rectified linear unit is preferred for its simplicity and effectiveness in training deep neural networks.

C2 Artificial Neural Networks

(technical, academic) A function that introduces non-linearity into a model, helping it learn complex patterns while remaining computationally efficient.

Example

  • By using a rectified linear unit, the neural network can learn more intricate relationships in the data.

Similar

Terms that have similar or relatively close meanings to "rectified linear unit":

linear function