torch.special.scaled_modified_bessel_k0
function scaled_modified_bessel_k0<S extends Shape>(input: Tensor<S, 'float32'>, _options?: SpecialUnaryOptions<S>): Tensor<S, 'float32'>

Computes the exponentially scaled modified Bessel function of the second kind of order 0.
Returns exp(x) * K₀(x), the exponentially scaled variant that prevents catastrophic underflow for large arguments. The unscaled K₀(x) decays exponentially (K₀(x) ~ √(π/(2x)) exp(-x)), shrinking to tiny values for moderate x and underflowing to zero near x ≈ 100 in single precision. This scaled variant removes the exp(-x) decay, keeping the function bounded and numerically stable across the entire positive real axis. Essential for:
- Heat conduction: computing temperature fields at large distances from source (no underflow)
- Electrostatics: far-field potential calculations in unbounded domains
- Signal processing: designing long-reach filters and impulse responses
- Numerical computations: algorithms requiring K₀ values where overflow/underflow is problematic
- Quantum field theory: propagator computations in dimensional reduction
- Waveguide theory: coupling at large propagation distances
Numerical Advantage: Direct computation of K₀(x) in float32 underflows to 0.0 once x exceeds roughly 100. The scaled form exp(x)·K₀(x) behaves smoothly: asymptotically, exp(x)·K₀(x) ~ √(π/(2x)) as x → ∞. This is the standard numerically stable way to work with K₀ at large arguments.
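As a minimal illustration of the underflow (a standalone sketch, not the library implementation; JavaScript's `Math.fround` is used here to emulate float32 rounding), the unscaled asymptotic value vanishes while the scaled prefactor stays finite:

```typescript
// Emulate float32 with Math.fround: exp(-x) underflows for large x,
// so the unscaled asymptotic K0(x) ≈ sqrt(pi/(2x)) * exp(-x) becomes 0,
// while the scaled value sqrt(pi/(2x)) remains well-behaved.
const x = 110;
const scaledApprox = Math.sqrt(Math.PI / (2 * x)); // ≈ 0.12, finite
const unscaledApprox = Math.fround(scaledApprox * Math.fround(Math.exp(-x)));
console.log(scaledApprox, unscaledApprox); // unscaledApprox is exactly 0
```

In double precision `Math.exp(-110)` is still representable (≈ 1.7e-48), but it lies below the smallest float32 subnormal (≈ 1.4e-45), so the float32 result is exactly zero.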
\begin{aligned}
\text{scaled\_K}_0(x) &= \exp(x) \cdot K_0(x) \\
\text{Asymptotic (large } x\text{):} \quad \text{scaled\_K}_0(x) &\sim \sqrt{\frac{\pi}{2x}} \\
\text{Unscaled form:} \quad K_0(x) &= \int_0^\infty \exp(-x \cosh t)\, dt
\end{aligned}

- Prevents underflow: K₀(x) decays as exp(-x), but the scaled form removes this decay
- Asymptotic: for large x, scaled_k0(x) → √(π/(2x)), a smooth power-law decay
- Numerically stable: use this instead of modified_bessel_k0 for any x > 10
- Positive: exp(x)·K₀(x) > 0 for all x > 0 (K₀ is positive, exp always positive)
- Small-x behavior unchanged: near x = 0, exp(x) ≈ 1, so the scaled form retains K₀'s logarithmic divergence while preventing underflow at large x
- Signal processing: Impulse response remains bounded across frequencies
- Computational advantage: Single function call gives stable K₀ across all scales
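To quantify how quickly the scaled function approaches its asymptote, here is a standalone sketch (plain TypeScript, not part of the library) using the leading term of the standard asymptotic series for K₀ with its first correction:

```typescript
// Leading asymptotic term with first correction:
// exp(x) * K0(x) ≈ sqrt(pi/(2x)) * (1 - 1/(8x)) for large x.
function asymptoticScaledK0(x: number): number {
  return Math.sqrt(Math.PI / (2 * x)) * (1 - 1 / (8 * x));
}
// At x = 10 this is already within ~0.1% of the true value (≈ 0.39166).
console.log(asymptoticScaledK0(10));
```

The 1/(8x) correction term explains the "x > 10" guidance above: beyond that point the scaled function is, for most purposes, just a slowly varying √(π/(2x)) curve.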
- Input domain x > 0: undefined for x ≤ 0, singular at x = 0
- Singularity: exp(x)*K₀(x) still → ∞ as x→0⁺ due to K₀'s logarithmic singularity
- Not the same as K₀: Returns scaled value, not K₀ directly
- Recurrence relations: the standard recurrence K_{ν+1}(x) = K_{ν−1}(x) + (2ν/x)·K_ν(x) also holds for the scaled functions (all orders share the same exp(x) factor), but scaled and unscaled values must not be mixed within one recurrence
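For checking results independently of the library, the integral representation above can be evaluated directly. A sketch (plain TypeScript with Simpson's rule; the truncation point and step count are illustrative choices, not the library's algorithm):

```typescript
// exp(x) * K0(x) = ∫₀^∞ exp(x * (1 - cosh t)) dt  (from K0's integral form)
function scaledK0Reference(x: number): number {
  if (x <= 0) return NaN; // domain: x > 0
  // truncate once the integrand has decayed to ~e^-40
  const tMax = Math.acosh(1 + 40 / x);
  const n = 20000; // even number of Simpson intervals
  const h = tMax / n;
  let sum = 1 + Math.exp(x * (1 - Math.cosh(tMax))); // endpoints; f(0) = 1
  for (let i = 1; i < n; i++) {
    sum += (i % 2 === 1 ? 4 : 2) * Math.exp(x * (1 - Math.cosh(i * h)));
  }
  return (sum * h) / 3;
}
console.log(scaledK0Reference(1)); // exp(1) * K0(1) ≈ 1.14446
```

Because the integrand equals exp(x(1 − cosh t)) ≤ 1 everywhere, this evaluation never overflows or underflows, which is exactly the numerical advantage the scaled form provides.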
Parameters
input: Tensor<S, 'float32'> - Input tensor with positive real values (x > 0). For x ≤ 0, results are undefined
_options: SpecialUnaryOptions<S> (optional)
Returns
Tensor<S, 'float32'> – Tensor with scaled K₀(x) = exp(x)·K₀(x) values (bounded, smooth decay)

Examples
// Numerical stability: scaled form prevents underflow
const x = torch.tensor([0.5, 2, 10, 50, 100]);
const k0_direct = torch.special.modified_bessel_k0(x);
// Result: [0.924, 0.114, 1.78e-5, 3.41e-23, ≈0] - vanishingly small or underflowing for large x
const k0_scaled = torch.special.scaled_modified_bessel_k0(x);
// Result: [1.524, 0.842, 0.392, 0.177, 0.125] - bounded across all values

// Asymptotic behavior: approaches √(π/(2x))
const largeX = torch.linspace(10, 100, 100);
const scaled = torch.special.scaled_modified_bessel_k0(largeX);
const asymptotic = largeX.mul(2).reciprocal().mul(Math.PI).sqrt(); // √(π/(2x)) elementwise
// For x > 50, scaled(x) matches √(π/(2x)) to within ~1/(8x) relative error (≈0.25% at x = 50)

// Heat conduction at large distances: stable far-field computation
const radius = torch.linspace(0.1, 100, 1000); // Distance from heat source
const temperature = torch.special.scaled_modified_bessel_k0(radius);
// Smooth decay without underflow, even at r=100
// Physical interpretation: steady-state temperature in an unbounded cylinder

// Electrostatic potential: cylindrical charge distribution
const distances = torch.tensor([0.1, 1, 10, 50, 100]);
const potentials = torch.special.scaled_modified_bessel_k0(distances);
// Green's function for the Laplace equation: maintains precision far from the source

// Relationship with the unscaled form
const x = torch.tensor([1, 5, 20]);
const k0_scaled = torch.special.scaled_modified_bessel_k0(x);
const k0_direct = torch.special.modified_bessel_k0(x);
// Mathematically: k0_scaled = k0_direct * exp(x)
// But k0_scaled is computed in a numerically stable way, while k0_direct may underflow

See Also
- PyTorch torch.special.scaled_modified_bessel_k0()
- torch.special.modified_bessel_k0 - Unscaled form (prone to underflow)
- torch.special.scaled_modified_bessel_k1 - Scaled K₁ (order 1 variant)
- torch.special.modified_bessel_i0 - First kind (exponentially growing)
- torch.special.i0e - Exponentially scaled I₀ (related concept)