---
tags:
- kernel
---
# Activation

Activation is a Python package that contains custom CUDA-based activation kernels, primarily targeting AMD GPUs.
## Usage
```python
import torch
from kernels import get_kernel

activation = get_kernel("motif-technologies/activation")

torch.set_default_device("cuda")
poly_norm = activation.layers.PolyNorm(eps=1e-6)
x = torch.randn(10, 10)

print(poly_norm(x))
```
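Assuming `PolyNorm` behaves like a regular `torch.nn.Module` (as the example above suggests), it can be dropped into a model like any other layer. The sketch below is illustrative only; the `MLPBlock` name, the hidden sizes, and the surrounding feed-forward structure are made-up assumptions, not part of this package:

```python
import torch
import torch.nn as nn
from kernels import get_kernel

activation = get_kernel("motif-technologies/activation")
torch.set_default_device("cuda")


class MLPBlock(nn.Module):
    """Toy feed-forward block that uses PolyNorm as its activation (sizes are arbitrary)."""

    def __init__(self, hidden: int = 256):
        super().__init__()
        self.up = nn.Linear(hidden, 4 * hidden)
        self.poly_norm = activation.layers.PolyNorm(eps=1e-6)
        self.down = nn.Linear(4 * hidden, hidden)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.down(self.poly_norm(self.up(x)))


block = MLPBlock()
print(block(torch.randn(4, 256)).shape)  # expected: torch.Size([4, 256])
```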
## Performance

### PolyNorm
- Test cases are from the Motif LLM
- You can reproduce the results with:
```bash
cd tests
pytest --run-perf --do-plot
```
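For a quick standalone timing check outside the test suite, a minimal sketch along these lines can be used; the tensor shape, iteration counts, and use of CUDA events are assumptions for illustration, not how the bundled benchmarks measure performance:

```python
import torch
from kernels import get_kernel

torch.set_default_device("cuda")

activation = get_kernel("motif-technologies/activation")
poly_norm = activation.layers.PolyNorm(eps=1e-6)

# Arbitrary example shape; the real test cases come from the Motif LLM.
x = torch.randn(8192, 4096)

# Warm up so one-time setup cost does not skew the measurement.
for _ in range(10):
    poly_norm(x)

start = torch.cuda.Event(enable_timing=True)
end = torch.cuda.Event(enable_timing=True)
start.record()
for _ in range(100):
    poly_norm(x)
end.record()
torch.cuda.synchronize()
print(f"average latency: {start.elapsed_time(end) / 100:.3f} ms")
```

On AMD GPUs this runs through ROCm's HIP backend, where the `torch.cuda` APIs (including events) work the same way.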
## Pre-commit Hooks
This project uses pre-commit to automatically check and format code before commits.
### Setup
Install pre-commit:
```bash
pip install pre-commit
```
Install the git hooks:
```bash
pre-commit install
```
Once installed, the configured hooks will run automatically on each commit.
### Included Hooks
The following tools are run via pre-commit:
- yapf – Python code formatter
- typos – Spell checker for common typos
- isort – Organizes and sorts Python imports
- clang-format – Formats C++/CUDA code (`--style=file`)
- pymarkdown – Lints and auto-fixes Markdown files
- actionlint – Validates GitHub Actions workflows
### Usage
Run all checks on the entire codebase:
```bash
pre-commit run --all-files
```
Run a specific hook (example: isort):
```bash
pre-commit run isort --all-files
```