sometimesanotion committed: Update README.md
README.md CHANGED
````diff
@@ -48,7 +48,7 @@ This model was made in two branches: a della_linear merge, and a sequence of mo
 
 ### Configuration
 
-This model was made with two branches, diverged and recombined. The first branch was a Vimarckoso v3-based della_linear merge, and the second, a sequence of model_stock and then breadcrumbs+LoRA. The LoRAs required minor adjustments to most component models for intercompatibility. The breadcrumbs and della merges required highly focused layer-specific gradients to effectively combine the models. This was my most complex merge to date. Suffice it to say, the SLERP merge below which finalized it was one of the simpler steps.
+This model was made with two branches, diverged and recombined. The first branch was a Vimarckoso v3-based della_linear merge, and the second, a sequence of model_stock and then breadcrumbs+LoRA. The LoRAs required minor adjustments to most component models for intercompatibility. The breadcrumbs and della merges required many small slices, with highly focused layer-specific gradients, to effectively combine the models. This was my most complex merge to date. Suffice it to say, the SLERP merge below which finalized it was one of the simpler steps.
 
 ```yaml
 name: Lamarck-14B-v0.6-rc4
````
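The revised paragraph mentions layer-specific gradients and a finalizing SLERP merge. As a rough illustration only — this is not the actual Lamarck-14B-v0.6-rc4 config, and the model names and gradient values are placeholders — a mergekit SLERP step with per-layer interpolation gradients might be sketched like this:

```yaml
# Hypothetical mergekit SLERP sketch, not the real Lamarck config.
# model-a / model-b and all t values are illustrative placeholders.
name: example-slerp-sketch
merge_method: slerp
base_model: model-a
models:
  - model: model-b
parameters:
  t:
    # per-layer gradient: interpolation weight varies across the depth
    - filter: self_attn
      value: [0.0, 0.3, 0.5, 0.7, 1.0]
    - filter: mlp
      value: [1.0, 0.7, 0.5, 0.3, 0.0]
    # fallback weight for all remaining tensors
    - value: 0.5
dtype: bfloat16
```

The gradient lists let the merge weight shift smoothly from one parent model to the other across layers, which is how a "highly focused layer-specific gradient" is typically expressed in this kind of config.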