Activation functions for neural networks.
| Name | ELearningAgentsActivationFunction |
|---|---|
| Type | enum |
| Header File | /Engine/Plugins/Experimental/LearningAgents/Source/LearningAgents/Public/LearningAgentsNeuralNetwork.h |
| Include Path | #include "LearningAgentsNeuralNetwork.h" |
Syntax
```cpp
enum ELearningAgentsActivationFunction
{
    ReLU    UMETA(DisplayName = "ReLU"),
    ELU     UMETA(DisplayName = "ELU"),
    TanH    UMETA(DisplayName = "TanH"),
    GELU    UMETA(DisplayName = "GELU"),
}
```
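Once the header above is included, game code can refer to the enum by value. A minimal sketch (the variable name and the choice of ELU are illustrative, not a recommendation):

```cpp
#include "LearningAgentsNeuralNetwork.h"

// Illustrative only: select an activation to pass wherever Learning Agents
// settings accept an ELearningAgentsActivationFunction.
ELearningAgentsActivationFunction Activation = ELearningAgentsActivationFunction::ELU;
```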
Values
| Name | Remarks |
|---|---|
| ReLU | ReLU Activation - Fast to train and evaluate but occasionally causes gradient collapse and untrainable networks. |
| ELU | ELU Activation - Generally performs better than ReLU and is not prone to gradient collapse but slower to evaluate. |
| TanH | TanH Activation - Smooth activation function that is slower to train and evaluate but sometimes more stable for certain tasks. |
| GELU | GELU Activation - Gaussian Error Linear Unit, a smooth approximation of ReLU. |
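For reference, the math behind each entry can be sketched in plain C++. This is illustrative only: the plugin evaluates these inside its neural networks, and the GELU below uses the common tanh approximation rather than the exact normal CDF.

```cpp
#include <cmath>

// ReLU(x) = max(0, x): cheap, but the zero gradient for x < 0 is what can
// cause the gradient collapse noted above.
float ReLU(float X) { return X > 0.0f ? X : 0.0f; }

// ELU(x) = x for x > 0, e^x - 1 otherwise (alpha = 1): keeps a nonzero
// gradient for negative inputs, avoiding ReLU's collapse at some extra cost.
float ELU(float X) { return X > 0.0f ? X : std::exp(X) - 1.0f; }

// TanH(x): smooth squashing of the input into (-1, 1).
float TanH(float X) { return std::tanh(X); }

// GELU(x) = x * Phi(x), where Phi is the standard normal CDF;
// shown here via the widely used tanh approximation.
float GELU(float X)
{
    const float C = 0.7978845608f; // sqrt(2 / pi)
    return 0.5f * X * (1.0f + std::tanh(C * (X + 0.044715f * X * X * X)));
}
```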