Gradient-1048k-r256-LoRA

This is a LoRA adapter extracted from a language model using mergekit.

LoRA Details

This LoRA adapter was extracted from gradientai/Llama-3-8B-Instruct-Gradient-1048k and uses NousResearch/Meta-Llama-3-8B-Instruct as a base.
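To use the adapter, it can be loaded on top of the base model with peft. A minimal sketch, assuming the adapter is available under kromcomp/L3-Gradient-1048k-r256-LoRA (the repo id shown on this card) and that transformers, peft, and accelerate are installed:

```python
# Minimal sketch (not from the original card): attach this extracted LoRA
# to the base model and run a quick generation.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "NousResearch/Meta-Llama-3-8B-Instruct"
adapter_id = "kromcomp/L3-Gradient-1048k-r256-LoRA"  # adjust if using a local copy

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype="auto", device_map="auto"
)

# Attach the extracted LoRA weights on top of the base model.
model = PeftModel.from_pretrained(base_model, adapter_id)

inputs = tokenizer("Hello, world!", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```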

Parameters

The following command was used to extract this LoRA adapter:

/usr/local/bin/mergekit-extract-lora --out-path=loras/Gradient-1048k-r256-LoRA --model=gradientai/Llama-3-8B-Instruct-Gradient-1048k --base-model=NousResearch/Meta-Llama-3-8B-Instruct --no-lazy-unpickle --max-rank=256 --gpu-rich -v --embed-lora --skip-undecomposable
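Because the extraction is capped at rank 256, the adapter only approximates the difference between the fine-tuned model and the base. Merging it back into the base therefore yields an approximate reconstruction of gradientai/Llama-3-8B-Instruct-Gradient-1048k. A minimal sketch of that workflow using peft's merge_and_unload (an assumed usage pattern, not part of the original card):

```python
# Minimal sketch (assumption, not from the original card): fold the extracted
# LoRA back into the base weights and save a merged checkpoint that
# approximates the original fine-tuned model.
from transformers import AutoModelForCausalLM
from peft import PeftModel

base_model = AutoModelForCausalLM.from_pretrained(
    "NousResearch/Meta-Llama-3-8B-Instruct", torch_dtype="auto"
)
model = PeftModel.from_pretrained(base_model, "kromcomp/L3-Gradient-1048k-r256-LoRA")

# Merge the LoRA deltas into the base weights and drop the adapter wrappers.
merged = model.merge_and_unload()
merged.save_pretrained("Llama-3-8B-Instruct-Gradient-1048k-approx")
```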