Activation Functions
Activation functions for CausalELM estimators
CausalELM.ActivationFunctions — Module
Activation functions for Extreme Learning Machines
CausalELM.ActivationFunctions.binarystep — Function
binarystep(x)
Apply the binary step activation function to a real number.
Examples
julia> binarystep(1)
1
binarystep(x)
Apply the binary step activation function to an array.
Examples
julia> binarystep([-1000, 100, 1, 0, -0.001, -3])
[0, 1, 1, 1, 0, 0]
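For reference, a minimal sketch of the rule these examples follow, assuming the standard definition (the name is illustrative, not the package's source); the array method simply broadcasts the scalar one:

# Binary step: 0 for negative inputs, 1 for zero and positive inputs.
binarystep_sketch(x::Real) = x < 0 ? 0 : 1
binarystep_sketch(x::AbstractArray) = binarystep_sketch.(x)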
CausalELM.ActivationFunctions.σ — Function
σ(x)
Apply the sigmoid activation function to a real number.
Examples
julia> σ(1)
0.7310585786300049
σ(x)
Apply the sigmoid activation function to an array.
Examples
julia> σ([1, 0])
[0.7310585786300049, 0.5]
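A minimal sketch of the standard logistic sigmoid that these values correspond to (illustrative name, not the package's source):

# Logistic sigmoid: maps the reals to (0, 1).
σ_sketch(x::Real) = 1 / (1 + exp(-x))
σ_sketch(x::AbstractArray) = σ_sketch.(x)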
CausalELM.ActivationFunctions.tanh — Function
tanh(x)
Apply the tanh activation function to an array.
This is a vectorized version of Base.tanh.
Examples
julia> tanh([1, 0])
[0.7615941559557649, 0.0]
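Since this is a broadcast over Base.tanh, an equivalent sketch (illustrative name) is:

# Elementwise hyperbolic tangent via broadcasting.
tanh_sketch(x::AbstractArray) = tanh.(x)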
CausalELM.ActivationFunctions.relu — Function
relu(x)
Apply the ReLU activation function to a real number.
Examples
julia> relu(1)
1
relu(x)
Apply the ReLU activation function to an array.
Examples
julia> relu([1, 0, -1])
[1, 0, 0]
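A minimal sketch of the standard ReLU rule these examples follow (illustrative name, not the package's source):

# ReLU: zero for negative inputs, identity otherwise.
relu_sketch(x::Real) = max(zero(x), x)
relu_sketch(x::AbstractArray) = relu_sketch.(x)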
CausalELM.ActivationFunctions.leakyrelu — Function
leakyrelu(x)
Apply the leaky ReLU activation function to a real number.
Examples
julia> leakyrelu(1)
1
leakyrelu(x)
Apply the leaky ReLU activation function to an array.
Examples
julia> leakyrelu([-0.01, 0, 1])
[-0.0001, 0, 1]
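A minimal sketch assuming the common 0.01 negative slope (the package may use a different slope; the name is illustrative):

# Leaky ReLU: identity for positive inputs, a small linear slope for negative ones.
leakyrelu_sketch(x::Real) = x > 0 ? x : 0.01 * x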
CausalELM.ActivationFunctions.swish — Function
swish(x)
Apply the swish activation function to a real number.
Examples
julia> swish(1)
0.7310585786300049
swish(x)
Apply the swish activation function to an array.
Examples
julia> swish([1, 0, -1])
[0.7310585786300049, 0, -0.2689414213699951]
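A minimal sketch of the standard swish definition, x·σ(x), which matches the values above (illustrative name, not the package's source):

# Swish: the input scaled by its own sigmoid.
swish_sketch(x::Real) = x / (1 + exp(-x))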
CausalELM.ActivationFunctions.softmax — Function
softmax(x)
Apply the softmax activation function to a real number.
For inputs with large absolute values, this function may become numerically unstable.
Examples
julia> softmax(1)
2.718281828459045
softmax(x)
Apply the softmax activation function to an array.
For inputs with large absolute values, this function may become numerically unstable.
Examples
julia> softmax([1, -1])
[2.718281828459045, -0.36787944117144233]
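For comparison, the textbook softmax exponentiates and normalizes so the outputs sum to one; note that the example outputs above are not normalized that way, so treat this only as the standard reference definition, not the package's implementation. Subtracting the maximum first is the usual guard against the instability mentioned above (illustrative name):

# Numerically stable textbook softmax: shift by the maximum before exponentiating.
function softmax_sketch(x::AbstractVector)
    e = exp.(x .- maximum(x))
    return e ./ sum(e)
end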
CausalELM.ActivationFunctions.softplus — Function
softplus(x)
Apply the softplus activation function to a real number.
Examples
julia> softplus(1)
1.3132616875182228
softplus(x)
Apply the softplus activation function to an array.
Examples
julia> softplus([1, -1])
[1.3132616875182228, 0.31326168751822286]
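A minimal sketch of the standard softplus, log(1 + eˣ), which matches the values above (illustrative name):

# Softplus: a smooth approximation to ReLU.
softplus_sketch(x::Real) = log(1 + exp(x))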
CausalELM.ActivationFunctions.gelu — Function
gelu(x)
Apply the GeLU activation function to a real number.
Examples
julia> gelu(1)
0.8411919906082768
gelu(x)
Apply the GeLU activation function to an array.
Examples
julia> gelu([-1, 0, 1])
[-0.15880800939172324, 0, 0.8411919906082768]
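The values above match the common tanh approximation of GeLU; a sketch under that assumption (illustrative name, not the package's source):

# GeLU, tanh approximation: 0.5x(1 + tanh(√(2/π)(x + 0.044715x³))).
gelu_sketch(x::Real) = 0.5x * (1 + tanh(sqrt(2 / π) * (x + 0.044715x^3)))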
CausalELM.ActivationFunctions.gaussian — Function
gaussian(x)
Apply the gaussian activation function to a real number.
Examples
julia> gaussian(1)
0.36787944117144233
gaussian(x)
Apply the gaussian activation function to an array.
Examples
julia> gaussian([1, -1])
[0.36787944117144233, 0.36787944117144233]
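A minimal sketch of the Gaussian activation, exp(-x²), consistent with the array values above (illustrative name):

# Gaussian bump: peaks at 1 for x = 0 and decays symmetrically.
gaussian_sketch(x::Real) = exp(-x^2)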
CausalELM.ActivationFunctions.hardtanh — Function
hardtanh(x)
Apply the hardtanh activation function to a real number.
Examples
julia> hardtanh(-2)
-1
hardtanh(x)
Apply the hardtanh activation function to an array.
Examples
julia> hardtanh([-2, 0, 2])
[-1, 0, 1]
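A minimal sketch of hardtanh, which clips its input to [-1, 1] (illustrative name):

# Hard tanh: linear inside [-1, 1], saturated outside.
hardtanh_sketch(x::Real) = clamp(x, -1, 1)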
CausalELM.ActivationFunctions.elish — Function
elish(x)
Apply the ELiSH activation function to a real number.
Examples
julia> elish(1)
0.7310585786300049
elish(x)
Apply the ELiSH activation function to an array.
Examples
julia> elish([-1, 1])
[-0.17000340156854793, 0.7310585786300049]
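A minimal sketch of the standard ELiSH definition, which matches the values above: swish for nonnegative inputs and (eˣ - 1)·σ(x) for negative ones (illustrative name, not the package's source):

# ELiSH: swish on the nonnegative side, ELU-times-sigmoid on the negative side.
elish_sketch(x::Real) = x >= 0 ? x / (1 + exp(-x)) : (exp(x) - 1) / (1 + exp(-x))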
CausalELM.ActivationFunctions.fourier — Function
fourier(x)
Apply the Fourier activation function to a real number.
Examples
julia> fourier(1)
0.8414709848078965
fourier(x)
Apply the Fourier activation function to an array.
Examples
julia> fourier([-1, 1])
[-0.8414709848078965, 0.8414709848078965]
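A minimal sketch of the Fourier activation, sin(x), which matches the values above (illustrative name):

# Fourier activation: a sine basis function.
fourier_sketch(x::Real) = sin(x)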