Optimizer

Optimizer is a Python package that provides:

  • PyTorch implementations of recent optimizer algorithms
  • support for parallelism techniques for efficient large-scale training

Currently implemented

  • Muon (see Usage below)

Usage

import torch
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP
from kernels import get_kernel

optimizer = get_kernel("motif-technologies/optimizer")

model = torch.nn.Linear(128, 128)  # replace with your model
fsdp_model = FSDP(model)

optim = optimizer.Muon(
    fsdp_model.parameters(),
    lr=0.01,
    momentum=0.9,
    weight_decay=1e-4,
)
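The distinguishing step of Muon-style updates is orthogonalizing the momentum matrix before applying it. As a rough illustration of that idea only (not this package's actual kernel), the pure-Python sketch below approximates the nearest orthogonal matrix with a simple cubic Newton–Schulz iteration; the published Muon algorithm uses a tuned quintic polynomial and runs on the GPU, and all helper names here are invented for the example.

```python
# Illustrative sketch: replace a momentum matrix with an (approximately)
# orthogonal matrix of the same shape via the cubic Newton-Schulz iteration
# X <- 1.5*X - 0.5*(X X^T)X. This converges to the orthogonal polar factor
# when all singular values lie in (0, sqrt(3)); dividing by the Frobenius
# norm up front guarantees that.

def matmul(a, b):
    """Plain-Python matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def transpose(a):
    return [list(row) for row in zip(*a)]

def newton_schulz(g, steps=10):
    """Approximately orthogonalize the matrix g (list of rows)."""
    norm = sum(v * v for row in g for v in row) ** 0.5  # Frobenius norm
    x = [[v / norm for v in row] for row in g]
    for _ in range(steps):
        xxt_x = matmul(matmul(x, transpose(x)), x)
        x = [[1.5 * x[i][j] - 0.5 * xxt_x[i][j]
              for j in range(len(x[0]))] for i in range(len(x))]
    return x
```

For example, `newton_schulz([[2.0, 0.0], [0.0, 0.5]])` converges toward the 2×2 identity, which is the orthogonal polar factor of that diagonal matrix.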

Pre-commit Hooks

This project uses pre-commit to automatically check and format code before commits.

Setup

  1. Install pre-commit:

     pip install pre-commit

  2. Install the git hooks:

     pre-commit install

Once installed, the configured hooks will run automatically on each commit.

Included Hooks

The following tools are run via pre-commit:

  • yapf – Python code formatter
  • typos – Spell checker for common typos
  • isort – Organizes and sorts Python imports
  • clang-format – Formats C++/CUDA code (--style=file)
  • pymarkdown – Lints and auto-fixes Markdown files
  • actionlint – Validates GitHub Actions workflows
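For orientation, a hypothetical .pre-commit-config.yaml wiring up hooks like these could look as follows. The repository URLs are the tools' usual upstream sources, but the rev pins are placeholders and this is not necessarily the project's actual configuration:

```yaml
repos:
  - repo: https://github.com/google/yapf
    rev: ""  # pin each rev to a release tag
    hooks:
      - id: yapf
  - repo: https://github.com/crate-ci/typos
    rev: ""
    hooks:
      - id: typos
  - repo: https://github.com/PyCQA/isort
    rev: ""
    hooks:
      - id: isort
  - repo: https://github.com/pre-commit/mirrors-clang-format
    rev: ""
    hooks:
      - id: clang-format
        args: [--style=file]
  - repo: https://github.com/jackdewinter/pymarkdown
    rev: ""
    hooks:
      - id: pymarkdown
  - repo: https://github.com/rhysd/actionlint
    rev: ""
    hooks:
      - id: actionlint
```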

Usage

  • Run all checks on the entire codebase:

    pre-commit run --all-files

  • Run a specific hook (example: isort):

    pre-commit run isort --all-files