In neural networks, the activation function of a node defines the output of that node given an input or set of inputs.
ReLU (rectified linear unit), defined as f(x) = max(0, x), is among the simplest and most widely used activation functions.
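As a minimal sketch, ReLU can be written as a one-line function applied to each node's input (the input values below are illustrative):

```python
def relu(x):
    # Rectified linear unit: passes positive inputs through, clamps negatives to 0
    return max(0.0, x)

# Example: applying the activation to a set of node inputs
inputs = [-2.0, -0.5, 0.0, 1.5, 3.0]
outputs = [relu(x) for x in inputs]
print(outputs)  # [0.0, 0.0, 0.0, 1.5, 3.0]
```

Because ReLU is linear for positive inputs and exactly zero otherwise, it is cheap to compute and its gradient is trivial (1 for positive inputs, 0 for negative), which is one reason for its popularity in deep networks.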