flashinfer.activation¶
This module provides a set of activation operations for up/gate layers in transformer MLPs.
Up/Gate output activation¶
- Fused SiLU and Mul operation.
- Fused GeLU (tanh approximation) and Mul operation.
- Fused GeLU and Mul operation.
- Fused SiLU and Mul that also quantizes a batched input tensor to NVFP4 format with a mask.
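These fused ops all share the same layout convention: the input's last dimension holds the concatenated gate and up projections, and the output has half that width. As a minimal sketch of the semantics (a NumPy reference, not the fused CUDA kernel itself), the SiLU-and-Mul variant computes:

```python
import numpy as np

def silu_and_mul_reference(x: np.ndarray) -> np.ndarray:
    """NumPy reference for the fused SiLU-and-Mul semantics.

    The last dimension of ``x`` is split in half into the gate and up
    projections; the result is silu(gate) * up, so the output's last
    dimension is half of the input's.
    """
    d = x.shape[-1] // 2
    gate, up = x[..., :d], x[..., d:]
    silu = gate / (1.0 + np.exp(-gate))  # SiLU(x) = x * sigmoid(x)
    return silu * up

# Example: a (batch=2, 2*d=8) input produces a (2, 4) output.
x = np.random.randn(2, 8).astype(np.float32)
y = silu_and_mul_reference(x)
```

The GeLU variants follow the same split-and-multiply pattern, swapping SiLU for exact or tanh-approximated GeLU; the fused kernels avoid materializing the two halves separately.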