
Leaky ReLU

EASY

Perform the Leaky ReLU (Leaky Rectified Linear Unit) activation function on an input matrix:

C[i][j] = \max(\alpha \cdot A[i][j], A[i][j])

where \alpha is a small positive constant (e.g. 0.01).

The Leaky ReLU function is defined as:

f(x) = \begin{cases} x & \text{if } x > 0 \\ \alpha x & \text{if } x \leq 0 \end{cases}
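The piecewise definition above translates directly into an elementwise pass over the matrix. As a sketch, here is a CPU reference implementation (not a GPU solution) that can be used to check a kernel's output; the function name and pointer-based signature are illustrative, not the platform's starter code:

```cpp
#include <cstddef>

// CPU reference for Leaky ReLU on a row-major M x N matrix.
// "alpha" is the negative-value slope from the problem statement.
void leaky_relu_reference(const float* A, float* C,
                          std::size_t M, std::size_t N, float alpha) {
    for (std::size_t i = 0; i < M; ++i) {
        for (std::size_t j = 0; j < N; ++j) {
            float x = A[i * N + j];           // row-major index
            C[i * N + j] = (x > 0.0f) ? x : alpha * x;
        }
    }
}
```

Positive entries pass through unchanged; non-positive entries are scaled by \alpha, matching both branches of f(x).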

Input:

  • Matrix A of size M \times N
  • \alpha value (slope for negative values)

Output:

  • Matrix C of size M \times N

Notes:

  • Both matrices A and C are stored in row-major order
  • This problem is adapted from KernelBench
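Because the operation is purely elementwise and both matrices are row-major, a CUDA solution can treat the data as a flat array of M * N floats. The following kernel sketch assumes a `solve(...)` entry point with device pointers already allocated; the exact signature in Tensara's starter code may differ, so adapt the wrapper accordingly:

```cuda
#include <cuda_runtime.h>

// One thread per element, with a grid-stride loop so any launch
// configuration covers the full M * N matrix.
__global__ void leaky_relu_kernel(const float* A, float* C,
                                  size_t total, float alpha) {
    for (size_t idx = blockIdx.x * blockDim.x + threadIdx.x;
         idx < total;
         idx += (size_t)gridDim.x * blockDim.x) {
        float x = A[idx];                     // row-major data is flat here
        C[idx] = (x > 0.0f) ? x : alpha * x;
    }
}

// Illustrative host-side wrapper; A and C are device pointers.
void solve(const float* A, float* C, size_t M, size_t N, float alpha) {
    size_t total = M * N;
    int threads = 256;
    int blocks = (int)((total + threads - 1) / threads);
    leaky_relu_kernel<<<blocks, threads>>>(A, C, total, alpha);
    cudaDeviceSynchronize();
}
```

The grid-stride loop keeps the kernel correct even if `total` exceeds the number of launched threads, at no cost for the common case.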
