# Optimizer

Optimizer is a Python package that provides:

- PyTorch implementations of recent optimizer algorithms
- support for parallelism techniques for efficient large-scale training
## Currently implemented

- Muon
## Usage

```python
import torch
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP
from kernels import get_kernel

optimizer = get_kernel("motif-technologies/optimizer")

model = None  # your model here
fsdp_model = FSDP(model)

optim = optimizer.Muon(
    fsdp_model.parameters(),
    lr=0.01,
    momentum=0.9,
    weight_decay=1e-4,
)
```
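Once constructed, the optimizer drives an ordinary PyTorch training loop. The sketch below is illustrative only: it substitutes `torch.optim.SGD` for `optimizer.Muon` so it runs without downloading the hub kernel or initializing FSDP; with the kernel available, build `optim` exactly as shown above and the loop is unchanged.

```python
import torch

# Illustrative stand-in: torch.optim.SGD replaces optimizer.Muon here so
# this snippet runs without fetching the hub kernel or setting up FSDP.
torch.manual_seed(0)
model = torch.nn.Linear(4, 1)
optim = torch.optim.SGD(
    model.parameters(), lr=0.01, momentum=0.9, weight_decay=1e-4
)

# Toy regression data standing in for a real dataloader.
x = torch.randn(32, 4)
y = torch.randn(32, 1)

losses = []
for step in range(10):
    optim.zero_grad()
    loss = torch.nn.functional.mse_loss(model(x), y)
    loss.backward()
    optim.step()
    losses.append(loss.item())
```

The same `zero_grad()` / `backward()` / `step()` sequence applies when `optim` is the Muon instance from the usage example.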
## Pre-commit Hooks

This project uses pre-commit to automatically check and format code before commits.

### Setup

Install pre-commit:

```shell
pip install pre-commit
```

Install the git hooks:

```shell
pre-commit install
```

Once installed, the configured hooks will run automatically on each commit.
### Included Hooks

The following tools are run via pre-commit:

- yapf – Python code formatter
- typos – Spell checker for common typos
- isort – Organizes and sorts Python imports
- clang-format – Formats C++/CUDA code (`--style=file`)
- pymarkdown – Lints and auto-fixes Markdown files
- actionlint – Validates GitHub Actions workflows
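For orientation, a `.pre-commit-config.yaml` wiring up hooks like these might look like the sketch below. The repository URLs and `rev` pins are illustrative placeholders from the upstream hook projects, not this project's actual configuration, which is authoritative:

```yaml
# Illustrative sketch only; consult the project's real .pre-commit-config.yaml.
repos:
  - repo: https://github.com/google/yapf
    rev: v0.40.2        # placeholder revision
    hooks:
      - id: yapf
  - repo: https://github.com/PyCQA/isort
    rev: 5.13.2         # placeholder revision
    hooks:
      - id: isort
  - repo: https://github.com/pre-commit/mirrors-clang-format
    rev: v18.1.8        # placeholder revision
    hooks:
      - id: clang-format
        args: [--style=file]
  # typos, pymarkdown, and actionlint hooks follow the same repo/rev/id pattern.
```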
### Usage

Run all checks on the entire codebase:

```shell
pre-commit run --all-files
```

Run a specific hook (example: isort):

```shell
pre-commit run isort --all-files
```