Fix checkpoint for vLLM inference
model-00001-of-00002.safetensors
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:24686ad53e7f28799f8621e6e358f2bff724294562fe160c72950f94a06f9524
+size 4961251720
model-00002-of-00002.safetensors
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:3cf079c7b081088be3b2f42504bfb7e457a8bc3e2cd4c1a96e562e95d816a0f4
+size 3639026096
model.safetensors.index.json
CHANGED
The diff for this file is too large to render; see the raw diff.
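Each updated pointer above records the new sha256 oid and byte size of a shard. As a minimal sketch in plain Python (the helper names `parse_lfs_pointer` and `verify_blob` are hypothetical, not part of this repository or of Git LFS itself), a downloaded shard can be checked against its LFS pointer like this:

```python
import hashlib
import re

def parse_lfs_pointer(text):
    """Extract the sha256 oid and byte size from Git LFS pointer text
    of the form shown in the diffs above."""
    oid = re.search(r"oid sha256:([0-9a-f]{64})", text).group(1)
    size = int(re.search(r"size (\d+)", text).group(1))
    return oid, size

def verify_blob(data, oid, size):
    """Return True if the raw bytes match the pointer's oid and size."""
    return len(data) == size and hashlib.sha256(data).hexdigest() == oid
```

A mismatch in either field would indicate a truncated or corrupted download of the shard rather than the checkpoint fix itself.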