---
tags:
  - kernel
---

# Activation

Activation is a Python package that provides custom CUDA-based activation kernels, primarily targeting AMD GPUs.

## Usage

```python
import torch
from kernels import get_kernel

activation = get_kernel("motif-technologies/activation")

torch.set_default_device("cuda")
poly_norm = activation.layers.PolyNorm(eps=1e-6)
x = torch.randn(10, 10)

print(poly_norm(x))
```
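The `PolyNorm` layer above runs as a fused CUDA kernel. This README does not spell out its math, but a plain-NumPy sketch of a PolyNorm-style activation (assuming the PolyCom-paper formulation: a weighted sum of RMS-normalized powers of the input) looks roughly like this; `weights` and `bias` are stand-ins for the layer's learned parameters:

```python
import numpy as np

def rms_norm(x, eps=1e-6):
    # Normalize by the root-mean-square over the last dimension.
    return x / np.sqrt(np.mean(x * x, axis=-1, keepdims=True) + eps)

def poly_norm(x, weights=(1.0, 1.0, 1.0), bias=0.0, eps=1e-6):
    # Hypothetical PolyNorm-style activation: a weighted sum of the
    # RMS-normalized powers x^3, x^2, x, plus a bias term.
    return (weights[0] * rms_norm(x ** 3, eps)
            + weights[1] * rms_norm(x ** 2, eps)
            + weights[2] * rms_norm(x, eps)
            + bias)

x = np.random.randn(10, 10)
print(poly_norm(x).shape)  # (10, 10)
```

The CUDA kernel fuses these elementwise powers and reductions into a single pass; this sketch is only a readable reference, not the implementation.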

## Performance

### PolyNorm

- Test cases are from the Motif LLM.
- You can reproduce the results with:

```bash
cd tests
pytest --run-perf --do-plot
```

*PolyNorm performance plot*

## Pre-commit Hooks

This project uses pre-commit to automatically check and format code before commits.

### Setup

1. Install pre-commit:

   ```bash
   pip install pre-commit
   ```

2. Install the git hooks:

   ```bash
   pre-commit install
   ```

Once installed, the configured hooks will run automatically on each commit.

### Included Hooks

The following tools are run via pre-commit:

- `yapf` – Python code formatter
- `typos` – Spell checker for common typos
- `isort` – Organizes and sorts Python imports
- `clang-format` – Formats C++/CUDA code (`--style=file`)
- `pymarkdown` – Lints and auto-fixes Markdown files
- `actionlint` – Validates GitHub Actions workflows
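For orientation, a `.pre-commit-config.yaml` wiring up these hooks could look like the sketch below. The repository URLs follow each tool's usual upstream and the `rev` pins are illustrative placeholders, not this project's actual configuration:

```yaml
repos:
  - repo: https://github.com/google/yapf
    rev: v0.40.2  # illustrative pin
    hooks:
      - id: yapf
  - repo: https://github.com/crate-ci/typos
    rev: v1.23.6  # illustrative pin
    hooks:
      - id: typos
  - repo: https://github.com/PyCQA/isort
    rev: 5.13.2  # illustrative pin
    hooks:
      - id: isort
  - repo: https://github.com/pre-commit/mirrors-clang-format
    rev: v18.1.8  # illustrative pin
    hooks:
      - id: clang-format
        args: [--style=file]
  - repo: https://github.com/jackdewinter/pymarkdown
    rev: v0.9.21  # illustrative pin
    hooks:
      - id: pymarkdown
  - repo: https://github.com/rhysd/actionlint
    rev: v1.7.1  # illustrative pin
    hooks:
      - id: actionlint
```

See the repository's actual `.pre-commit-config.yaml` for the authoritative hook versions and options.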

### Usage

- Run all checks on the entire codebase:

  ```bash
  pre-commit run --all-files
  ```

- Run a specific hook (example: `isort`):

  ```bash
  pre-commit run isort --all-files
  ```