Soboleva modified hyperbolic tangent

The Soboleva modified hyperbolic tangent, also known as the (parametric) Soboleva modified hyperbolic tangent activation function ([P]SMHTAF),[nb 1] is a special S-shaped function based on the hyperbolic tangent, given by

{\displaystyle \operatorname {smht} (x)\doteq {\frac {e^{ax}-e^{-bx}}{e^{cx}+e^{-dx}}}}

This function was originally proposed as a "modified hyperbolic tangent"[nb 1] by Ukrainian scientist Elena V. Soboleva (Елена В. Соболева) as a utility function for multi-objective optimization and choice modelling in decision-making.[1][2][3]

The function has since been introduced into neural network theory and practice.[4]

It was also used in economics for modelling consumption and investment,[5] to approximate current-voltage characteristics of field-effect transistors and light-emitting diodes,[6] to design antenna feeders,[7][predatory publisher] and to analyze plasma temperatures and densities in the divertor region of fusion reactors.[8]
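The definition above translates directly into a few lines of code. The following NumPy sketch is illustrative only: the name smht, the default parameter values, and the example coefficients are assumptions, not taken from the cited sources.

```python
import numpy as np

def smht(x, a=1.0, b=1.0, c=1.0, d=1.0):
    """Soboleva modified hyperbolic tangent:
    (e^(a*x) - e^(-b*x)) / (e^(c*x) + e^(-d*x))."""
    x = np.asarray(x, dtype=float)
    return (np.exp(a * x) - np.exp(-b * x)) / (np.exp(c * x) + np.exp(-d * x))

# Example: an asymmetric, bounded variant (satisfying a <= c and b <= d).
print(smht(np.linspace(-3.0, 3.0, 7), a=0.5, b=0.8, c=1.0, d=1.0))
```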

The derivative of the function is given by the formula:

{\displaystyle \operatorname {smht} '(x)\doteq {\frac {ae^{ax}+be^{-bx}}{e^{cx}+e^{-dx}}}-\operatorname {smht} (x){\frac {ce^{cx}-de^{-dx}}{e^{cx}+e^{-dx}}}}
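As a sanity check, the derivative formula above can be compared against a central finite difference. This sketch restates the hypothetical smht helper from the earlier example so it is self-contained; the test parameters are arbitrary values satisfying a ≤ c and b ≤ d.

```python
import numpy as np

def smht(x, a, b, c, d):
    return (np.exp(a * x) - np.exp(-b * x)) / (np.exp(c * x) + np.exp(-d * x))

def smht_prime(x, a, b, c, d):
    """Analytic derivative, following the formula above."""
    denom = np.exp(c * x) + np.exp(-d * x)
    return ((a * np.exp(a * x) + b * np.exp(-b * x)) / denom
            - smht(x, a, b, c, d) * (c * np.exp(c * x) - d * np.exp(-d * x)) / denom)

# Compare with a central finite difference at a few points.
x = np.linspace(-2.0, 2.0, 9)
h = 1e-6
fd = (smht(x + h, 0.7, 1.2, 1.0, 1.5) - smht(x - h, 0.7, 1.2, 1.0, 1.5)) / (2 * h)
print(np.max(np.abs(fd - smht_prime(x, 0.7, 1.2, 1.0, 1.5))))  # close to zero (floating-point level)
```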

The following conditions keep the function bounded on the y-axis: a ≤ c, b ≤ d. A family of recurrence-generated parametric Soboleva modified hyperbolic tangent activation functions (NPSMHTAF, FPSMHTAF) was studied with parameters a = c and b = d.[9] In this case the function reduces to tanh((a + b)x/2), so it is not sensitive to swapping the left-side and right-side parameters a and b. The function is sensitive to the ratio of the denominator coefficients and is often used without coefficients in the numerator:

{\displaystyle \operatorname {smht} (x)\doteq {\frac {e^{x}-e^{-x}}{e^{cx}+e^{-dx}}}}

Estimates of the extremum locations (the minimum and maximum of the bounded variants) involve logarithms of the denominator coefficients.
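The extrema of the simplified form with unit numerator coefficients can also be estimated numerically. The sketch below simply scans a dense grid; the coefficients c and d are arbitrary example values, not taken from the cited study.

```python
import numpy as np

def smht_simple(x, c=2.0, d=3.0):
    """Simplified form with unit numerator coefficients."""
    return (np.exp(x) - np.exp(-x)) / (np.exp(c * x) + np.exp(-d * x))

# Crude numerical estimate of the extremum locations on a dense grid.
x = np.linspace(-5.0, 5.0, 200001)
y = smht_simple(x)
x_max, x_min = x[np.argmax(y)], x[np.argmin(y)]
print(f"maximum near x = {x_max:.4f}, minimum near x = {x_min:.4f}")
```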

With parameters a = b = c = d = 1 the modified hyperbolic tangent reduces to the conventional tanh(x) function, whereas for a = b = 1 and c = d = 0 it reduces to sinh(x).
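These two limiting cases are easy to verify numerically. The short sketch below again assumes the illustrative smht helper used earlier.

```python
import numpy as np

def smht(x, a, b, c, d):
    return (np.exp(a * x) - np.exp(-b * x)) / (np.exp(c * x) + np.exp(-d * x))

x = np.linspace(-2.0, 2.0, 5)
# a = b = c = d = 1 reproduces tanh(x); a = b = 1, c = d = 0 gives (e^x - e^-x)/2 = sinh(x).
print(np.allclose(smht(x, 1, 1, 1, 1), np.tanh(x)))   # True
print(np.allclose(smht(x, 1, 1, 0, 0), np.sinh(x)))   # True
```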