
Category: Math: Trigonometry
GPU: Yes

What does the tanh function do in MATLAB / RunMat?

y = tanh(x) evaluates the hyperbolic tangent of each element in x, preserving MATLAB's column-major layout and broadcasting rules across scalars, arrays, and tensors.

How does the tanh function behave in MATLAB / RunMat?

  • Operates on scalars, vectors, matrices, and N-D tensors with MATLAB-compatible implicit expansion.
  • Logical and integer inputs are promoted to double precision before evaluation so downstream arithmetic keeps MATLAB's numeric semantics.
  • Complex values follow the analytic extension tanh(a + bi) = sinh(a + bi) / cosh(a + bi), propagating NaN/Inf component-wise.
  • Character arrays are interpreted through their Unicode code points and return dense double arrays that mirror MATLAB's behavior.
  • Inputs that already live on the GPU stay resident when the provider implements unary_tanh; otherwise RunMat gathers to the host, computes, and reapplies residency hints for later operations.
  • Empty inputs and singleton dimensions are preserved without introducing extraneous allocations.
  • String and string-array arguments raise descriptive errors to match MATLAB's numeric-only contract for the hyperbolic family.
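
As a small sketch of the implicit-expansion rule above (standard MATLAB syntax; the displayed values assume double-precision evaluation at the default short format):

col = (1:3)';
row = [0 1];
y = tanh(col .* row);

Expected output:

y =
         0    0.7616
         0    0.9640
         0    0.9951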

tanh Function GPU Execution Behavior

  • With RunMat Accelerate active, tensors remain on the device and execute through the provider's unary_tanh hook (or a fused elementwise kernel) without leaving GPU memory.
  • If the provider declines the operation—for example, when it lacks the hook for the active precision—RunMat transparently gathers to the host, computes the result, and reapplies the requested residency rules.
  • Fusion planning keeps neighboring elementwise operators grouped, reducing host↔device transfers even when an intermediate fallback occurs.
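
For instance, a chain of elementwise operations such as the one below is a candidate for a single fused kernel (the fusion itself happens inside RunMat's planner; the code is ordinary MATLAB and the values assume double precision):

a = linspace(0, 1, 4);
b = 2 .* tanh(a) + 1;

Expected output:

b = [1.0000   1.6430   2.1656   2.5232]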

Examples of using the tanh function in MATLAB / RunMat

Hyperbolic tangent of a real scalar

y = tanh(1);

Expected output:

y = 0.7616

Applying tanh to a symmetric vector

x = linspace(-2, 2, 5);
y = tanh(x);

Expected output:

y = [-0.9640  -0.7616         0   0.7616   0.9640]

Evaluating tanh on a matrix in GPU memory

G = gpuArray([0    0.5; 1.0  1.5]);
result_gpu = tanh(G);
result = gather(result_gpu);

Expected output:

result =
         0    0.4621
    0.7616    0.9051

Computing tanh for complex angles

z = 0.5 + 1.0i;
w = tanh(z);

Expected output:

w = 1.0428 + 0.8069i

Converting character codes via tanh

c = tanh('ABC');

Expected output:

c = [1.0000  1.0000  1.0000]

Preserving empty array shapes

E = zeros(0, 3);
out = tanh(E);

Expected output:

out = zeros(0, 3)

Stabilizing activation functions

inputs = [-3 -1 0 1 3];
activations = tanh(inputs / 2);

Expected output:

activations = [-0.9051  -0.4621         0   0.4621   0.9051]

GPU residency in RunMat (Do I need gpuArray?)

You usually do not need to call gpuArray explicitly. The fusion planner keeps tensors on the GPU whenever the active provider exposes the necessary kernels (such as unary_tanh). Manual gpuArray / gather calls remain supported for MATLAB compatibility or when you need to pin residency before interacting with external code.
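
A minimal sketch of this implicit-residency flow (whether the intermediate actually stays on the device depends on the active provider and the kernels it exposes):

x = linspace(-1, 1, 1e6);
y = tanh(x);        % runs on the GPU when a provider exposes unary_tanh
s = sum(y);         % consumed on device; no manual gpuArray/gather needed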

FAQ

When should I reach for tanh?

Use tanh for hyperbolic tangent evaluations—common in signal processing, numerical solvers, and neural-network activations thanks to its bounded output.

Does tanh support complex numbers?

Yes. RunMat mirrors MATLAB by evaluating tanh(z) = sinh(z) / cosh(z) for complex z, producing correct real and imaginary parts while propagating NaN/Inf values.

How does the GPU fallback work?

If the provider lacks unary_tanh, RunMat gathers the tensor to host memory, computes the result, and reapplies residency choices so downstream GPU consumers still see device-backed tensors when appropriate.

Can tanh appear in fused GPU kernels?

Absolutely. The fusion planner emits WGSL kernels that inline tanh, and providers can supply custom fused pipelines for even higher performance.

How does tanh treat logical arrays?

Logical arrays are promoted to 0.0 or 1.0 doubles before evaluation, matching MATLAB's behavior for the hyperbolic family.
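
For example (values shown at MATLAB's default short format):

m = tanh(logical([0 1 1 0]));

Expected output:

m = [0   0.7616   0.7616   0]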

What happens with empty or singleton dimensions?

Shapes are preserved. Empty inputs return empty outputs, and singleton dimensions remain intact so downstream broadcasting behaves as expected.

Do I need to worry about numerical overflow?

tanh saturates towards ±1 for large-magnitude real inputs, providing stable results. Complex poles can still yield infinities, mirroring MATLAB.
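
A quick illustration of that saturation (displayed at default precision; the true values differ from ±1 only at machine-epsilon scale for |x| = 20):

big = tanh([-20 0 20]);

Expected output:

big = [-1   0   1]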

Can I differentiate tanh in RunMat?

Yes. The autograd infrastructure recognizes tanh as a primitive and records it on the reverse-mode tape for native gradients once acceleration is enabled.

See Also

sinh, cosh, atanh, gpuArray, gather
