References
| Module | LearningAgents |
| Header | /Engine/Plugins/Experimental/LearningAgents/Source/LearningAgents/Public/LearningAgentsNeuralNetwork.h |
| Include | #include "LearningAgentsNeuralNetwork.h" |
Syntax
enum ELearningAgentsActivationFunction
{
ReLU,
ELU,
TanH,
}
Values
| Name | Description |
|---|---|
| ReLU | ReLU Activation - Fast to train and evaluate but occasionally causes gradient collapse and untrainable networks. |
| ELU | ELU Activation - Generally performs better than ReLU and is not prone to gradient collapse but slower to evaluate. |
| TanH | TanH Activation - Smooth activation function that is slower to train and evaluate but sometimes more stable for certain tasks. |
Remarks
Activation functions for neural networks.