ReLU function

The ReLU (rectified linear unit) function is an ANN activation function. If its input is positive, it outputs that input unchanged; if the input is negative, it outputs 0.

The mathematical formula for the ReLU function is f(x) = max(0, x). The graph of the ReLU function is shown below.

[Figure: graph of the ReLU function]
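
For illustration, here is a minimal NumPy sketch of the ReLU function; the function name relu and the sample inputs are illustrative and not part of this entry:

import numpy as np

def relu(x):
    # ReLU activation: keep positive values, replace negative values with 0.
    return np.maximum(0, x)

# Example: apply ReLU element-wise to a few sample inputs.
inputs = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(inputs))  # -> [0.  0.  0.  1.5  3. ]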

Related Terms

  • TLU
  • artificial intelligence
  • RNN
  • tanh function
  • tokenization
