unreal.LearningAgentsActivationFunction¶
- class unreal.LearningAgentsActivationFunction¶
Bases:
EnumBase
Activation functions for neural networks.
C++ Source:
Plugin: LearningAgents
Module: LearningAgents
File: LearningAgentsNeuralNetwork.h
- ELU: LearningAgentsActivationFunction = Ellipsis¶
ELU Activation - Generally performs better than ReLU and is not prone to gradient collapse but slower to evaluate.
- Type:
1
- GELU: LearningAgentsActivationFunction = Ellipsis¶
GELU Activation - Gaussian Error Linear Unit.
- Type:
3
- RE_LU: LearningAgentsActivationFunction = Ellipsis¶
ReLU Activation - Fast to train and evaluate but occasionally causes gradient collapse and untrainable networks.
- Type:
0
- TAN_H: LearningAgentsActivationFunction = Ellipsis¶
TanH Activation - Smooth activation function that is slower to train and evaluate but sometimes more stable for certain tasks.
- Type:
2
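A minimal sketch of selecting one of these enum values from the Editor's Python environment. The enum members and `unreal.log` are standard; the choice of ELU here is purely illustrative, and passing the value into a specific LearningAgents settings object is left out since the exact settings type is not covered on this page.

```python
import unreal

# Pick an activation function for a LearningAgents neural network.
# ELU is described above as generally performing better than ReLU
# and not prone to gradient collapse, at some extra evaluation cost.
activation = unreal.LearningAgentsActivationFunction.ELU

# Enum values can be compared and logged like any other unreal enum.
if activation == unreal.LearningAgentsActivationFunction.RE_LU:
    unreal.log("Using ReLU: fast, but can suffer gradient collapse")
else:
    unreal.log("Using activation function: {}".format(activation))
```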