Activation Functions

Activation functions for CausalELM estimators

CausalELM.ActivationFunctions.binarystep (Function)
binarystep(x)

Apply the binary step activation function to a real number.

Examples

julia> binarystep(1)
1
source
binarystep(x)

Apply the binary step activation function to an array.

Examples

julia> binarystep([-1000, 100, 1, 0, -0.001, -3])
[0, 1, 1, 1, 0, 0]
source
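
A minimal sketch of a standard binary step consistent with the examples above (note binarystep(0) == 1); CausalELM's actual implementation may differ:

binarystep(x::Real) = x < 0 ? 0 : 1           # 0 for negative inputs, 1 otherwise
binarystep(x::AbstractArray) = binarystep.(x) # broadcast the scalar rule elementwise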
CausalELM.ActivationFunctions.σ (Function)
σ(x)

Apply the sigmoid activation function to a real number.

Examples

julia> σ(1)
0.7310585786300049
source
σ(x)

Apply the sigmoid activation function to an array.

Examples

julia> σ([1, 0])
[0.7310585786300049, 0.5]
source
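
The values above match the standard logistic sigmoid, 1 / (1 + e^(-x)); a sketch under that assumption:

σ(x::Real) = 1 / (1 + exp(-x)) # maps any real number into (0, 1); σ(0) == 0.5
σ(x::AbstractArray) = σ.(x)    # apply elementwise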
CausalELM.ActivationFunctions.tanh (Function)
tanh(x)

Apply the tanh activation function to an array.

This is a vectorized version of Base.tanh.

Examples

julia> tanh([1, 0])
[0.7615941559557649, 0.0]
source
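
Since the docstring describes a vectorized Base.tanh, a one-line sketch of such a wrapper (vtanh is a hypothetical name used here to avoid shadowing Base.tanh):

vtanh(x::AbstractArray) = tanh.(x) # broadcast Base.tanh over each element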
CausalELM.ActivationFunctions.relu (Function)
relu(x)

Apply the ReLU activation function to a real number.

Examples

julia> relu(1)
1
source
relu(x)

Apply the ReLU activation function to an array.

Examples

julia> relu([1, 0, -1])
[1, 0, 0]
source
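
ReLU zeroes out negative inputs and passes non-negative ones through; a minimal sketch:

relu(x::Real) = max(0, x)         # 0 for x < 0, x otherwise
relu(x::AbstractArray) = relu.(x) # broadcast over the array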
CausalELM.ActivationFunctions.leakyrelu (Function)
leakyrelu(x)

Apply the leaky ReLU activation function to a real number.

Examples

julia> leakyrelu(1)
1
source
leakyrelu(x)

Apply the leaky ReLU activation function to an array.

Examples

julia> leakyrelu([-0.01, 0, 1])
[-0.0001, 0, 1]
source
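
Leaky ReLU passes positive inputs through and scales negative ones by a small slope; the 0.01 slope below is the common default and an assumption here, not something the docstring confirms:

leakyrelu(x::Real) = x > 0 ? x : 0.01x      # assumed slope of 0.01 for x ≤ 0
leakyrelu(x::AbstractArray) = leakyrelu.(x) # apply elementwise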
CausalELM.ActivationFunctions.swish (Function)
swish(x)

Apply the swish activation function to a real number.

Examples

julia> swish(1)
0.7310585786300049
source
swish(x)

Apply the swish activation function to an array.

Examples

julia> swish([1, 0, -1])
[0.7310585786300049, 0, -0.2689414213699951]
source
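
Swish multiplies the input by its own sigmoid, which the examples reflect (swish(1) == σ(1)); a sketch:

swish(x::Real) = x / (1 + exp(-x))  # equivalent to x * σ(x)
swish(x::AbstractArray) = swish.(x) # broadcast the scalar rule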
CausalELM.ActivationFunctions.softmax (Function)
softmax(x)

Apply the softmax activation function to a real number.

For inputs with large absolute values, this function may be numerically unstable.

Examples

julia> softmax(1)
2.718281828459045
source
softmax(x)

Apply the softmax activation function to an array.

For inputs with large absolute values, this function may be numerically unstable.

Examples

julia> softmax([1, -1])
[2.718281828459045, 0.36787944117144233]
source
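
As the examples suggest, this version simply exponentiates its input, which is also why large absolute values overflow; a sketch consistent with those outputs (a conventional softmax would additionally divide by sum(exp.(x))):

softmax(x::Real) = exp(x)           # unnormalized, per softmax(1) ≈ 2.71828 above
softmax(x::AbstractArray) = exp.(x) # elementwise, with no normalizing sum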
CausalELM.ActivationFunctions.softplus (Function)
softplus(x)

Apply the softplus activation function to a real number.

Examples

julia> softplus(1)
1.3132616875182228
source
softplus(x)

Apply the softplus activation function to an array.

Examples

julia> softplus([1, -1])
[1.3132616875182228, 0.31326168751822286]
source
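
Softplus is the smooth approximation to ReLU, log(1 + eˣ), which matches the values above; a sketch:

softplus(x::Real) = log(1 + exp(x))       # log1p(exp(x)) is the numerically safer variant
softplus(x::AbstractArray) = softplus.(x) # apply elementwise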
CausalELM.ActivationFunctions.gelu (Function)
gelu(x)

Apply the GeLU activation function to a real number.

Examples

julia> gelu(1)
0.8411919906082768
source
gelu(x)

Apply the GeLU activation function to an array.

Examples

julia> gelu([-1, 0, 1])
[-0.15880800939172324, 0, 0.8411919906082768]
source
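
The value gelu(1) ≈ 0.84119 matches the common tanh approximation of GeLU rather than the exact Gaussian-CDF form; a sketch of that approximation:

gelu(x::Real) = 0.5x * (1 + tanh(sqrt(2 / π) * (x + 0.044715x^3)))
gelu(x::AbstractArray) = gelu.(x) # broadcast the scalar approximation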
CausalELM.ActivationFunctions.gaussian (Function)
gaussian(x)

Apply the Gaussian activation function to a real number.

Examples

julia> gaussian(1)
0.36787944117144233
source
gaussian(x)

Apply the Gaussian activation function to an array.

Examples

julia> gaussian([1, -1])
[0.36787944117144233, 0.36787944117144233]
source
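
Both array outputs equal e^(-1), consistent with the radial form exp(-x²); a sketch under that assumption:

gaussian(x::Real) = exp(-x^2)             # symmetric bell; gaussian(1) == gaussian(-1)
gaussian(x::AbstractArray) = gaussian.(x) # apply elementwise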
CausalELM.ActivationFunctions.hardtanh (Function)
hardtanh(x)

Apply the hardtanh activation function to a real number.

Examples

julia> hardtanh(-2)
-1
source
hardtanh(x)

Apply the hardtanh activation function to an array.

Examples

julia> hardtanh([-2, 0, 2])
[-1, 0, 1]
source
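
Hard tanh is the identity on [-1, 1] and saturates outside it, as both examples show; a one-line sketch:

hardtanh(x::Real) = clamp(x, -1, 1)       # clip to the interval [-1, 1]
hardtanh(x::AbstractArray) = hardtanh.(x) # broadcast the clipping rule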
CausalELM.ActivationFunctions.elish (Function)
elish(x)

Apply the ELiSH activation function to a real number.

Examples

julia> elish(1)
0.7310585786300049
source
elish(x)

Apply the ELiSH activation function to an array.

Examples

julia> elish([-1, 1])
[-0.17000340156854793, 0.7310585786300049]
source
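
ELiSH behaves like swish for non-negative inputs and gates an ELU by a sigmoid for negative ones, which reproduces both outputs above; a sketch:

elish(x::Real) = x >= 0 ? x / (1 + exp(-x)) : (exp(x) - 1) / (1 + exp(-x))
elish(x::AbstractArray) = elish.(x) # broadcast the piecewise rule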
CausalELM.ActivationFunctions.fourier (Function)
fourier(x)

Apply the Fourier activation function to a real number.

Examples

julia> fourier(1)
0.8414709848078965
source
fourier(x)

Apply the Fourier activation function to an array.

Examples

julia> fourier([-1, 1])
[-0.8414709848078965, 0.8414709848078965]
source
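
Both examples equal sin(x), so the Fourier activation appears to be a single sine term; a sketch under that assumption:

fourier(x::Real) = sin(x)           # fourier(1) == sin(1) ≈ 0.84147
fourier(x::AbstractArray) = sin.(x) # apply elementwise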