References
| Module | Learning |
|---|---|
| Header | /Engine/Plugins/Experimental/LearningAgents/Source/Learning/Public/LearningNeuralNetwork.h |
| Include | #include "LearningNeuralNetwork.h" |
Syntax
namespace UE
{
namespace Learning
{
enum EActivationFunction
{
ReLU = 0,
ELU = 1,
TanH = 2,
}
}
}
Values
| Name | Description |
|---|---|
| ReLU | ReLU Activation - Fast to train and evaluate but occasionally causes gradient collapse and untrainable networks. |
| ELU | ELU Activation - Generally performs better than ReLU and is not prone to gradient collapse but slower to evaluate. |
| TanH | TanH Activation - Smooth activation function that is slower to train and evaluate but sometimes more stable for certain tasks. |
Remarks
Activation function for use in a neural network.