Update README.md
README.md CHANGED
@@ -19,9 +19,7 @@ Homepage: https://bop.felk.cvut.cz/home/
 
 BOP Toolkit: https://github.com/thodan/bop_toolkit
 
-
-
-<details><summary>Click to expand</summary>
+<details><summary>Downloading datasets</summary>
 
 #### Option 1: Using `huggingface_hub`:
 
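The `Downloading datasets` section introduced by this hunk points to `huggingface_hub`, but the actual download commands sit outside the changed lines. Purely as an illustrative sketch (the `lmo/*` pattern and the local directory are assumptions, not taken from this README), Option 1 could look roughly like:

```python
# Rough sketch, not the README's exact snippet: download one BOP dataset
# from the bop-benchmark hub repo. The "lmo/*" filter and local_dir are
# illustrative assumptions about the repo layout.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="bop-benchmark/datasets",
    repo_type="dataset",
    allow_patterns=["lmo/*"],    # assumed: one folder per dataset
    local_dir="./bop_datasets",  # assumed target directory
)
```

Exporting `HF_HUB_ENABLE_HF_TRANSFER=1` (visible as context in the next hunk) speeds up transfers when the optional `hf_transfer` package is installed.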
@@ -84,9 +82,7 @@ export HF_HUB_ENABLE_HF_TRANSFER=1
 
 </details>
 
-
-
-<details><summary>Click to expand</summary>
+<details><summary>Uploading datasets</summary>
 
 You create a new dataset and want to share it with the BOP community. Here is a step-by-step guide to upload the dataset and create a pull request to [our huggingface hub](https://huggingface.co/datasets/bop-benchmark/datasets/). Feel free to reach out to [email protected] if you have any questions.
 
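The `Uploading datasets` section added here refers to a step-by-step guide whose commands are outside the changed lines; the next hunk's context shows the README itself uses `api.create_commit(...)`. As a hedged sketch only (placeholder folder and path, not the README's exact steps), an upload that opens a pull request could look like:

```python
# Hedged sketch of contributing a new dataset as a pull request.
# folder_path and path_in_repo are placeholders; the README's own guide
# (not part of this diff) may use api.create_commit() instead.
from huggingface_hub import HfApi

api = HfApi()
api.upload_folder(
    folder_path="./my_new_dataset",   # placeholder local folder
    path_in_repo="my_new_dataset",    # assumed: one folder per dataset in the repo
    repo_id="bop-benchmark/datasets",
    repo_type="dataset",
    create_pr=True,                   # open a pull request instead of pushing to main
)
```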
@@ -163,9 +159,9 @@ api.create_commit(repo_id="bop-benchmark/datasets",
 ```
 If your dataset is large (> 500 GB), you can upload it in chunks by adding the `multi_commits=True, multi_commits_verbose=True` arguments. More options are available in the [official documentation](https://huggingface.co/docs/huggingface_hub/v0.22.2/en/package_reference/hf_api#huggingface_hub.HfApi.create_pull_request).
 
-
+</details>
 
-<details><summary>
+<details><summary>FAQ</summary>
 
 #### 1. How to upload a large file > 50 GB?
 Note that HuggingFace limits the size of each file to 50 GB. If your dataset is larger, you can split it into smaller files:
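For the chunked upload mentioned in the hunk above: in `huggingface_hub` v0.22 (the version the README links to), `multi_commits=True, multi_commits_verbose=True` are accepted by `HfApi.upload_folder`, which then pushes the upload as a series of commits on a pull request. A minimal sketch with a placeholder folder path:

```python
# Sketch of a chunked (multi-commit) upload for a dataset larger than ~500 GB.
# Assumes huggingface_hub v0.22, where upload_folder accepts the experimental
# multi_commits flags; the folder path is a placeholder.
from huggingface_hub import HfApi

api = HfApi()
api.upload_folder(
    folder_path="./my_large_dataset",  # placeholder local folder
    repo_id="bop-benchmark/datasets",
    repo_type="dataset",
    multi_commits=True,                # split the upload into several commits on a PR
    multi_commits_verbose=True,        # log progress for each chunk
)
```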
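The FAQ entry in the last hunk ends with "you can split it into smaller files:", and the actual split commands are outside the changed lines. As an illustrative sketch only (the archive name and 40 GB part size are assumptions; the same result is commonly achieved with the Unix `split` command), splitting a large file below the 50 GB per-file limit could look like:

```python
# Illustrative sketch: split a large archive into parts under the 50 GB
# per-file limit. The archive name and the 40 GB part size are assumptions,
# not the README's actual recipe.
PART_SIZE = 40 * 1024**3   # 40 GB per part, safely under the 50 GB limit
BUF_SIZE = 64 * 1024**2    # stream in 64 MB buffers to keep memory use flat

with open("my_large_dataset.tar", "rb") as src:  # placeholder archive name
    part = 0
    while True:
        written = 0
        buf = src.read(min(BUF_SIZE, PART_SIZE))
        if not buf:
            break
        with open(f"my_large_dataset.tar.part{part:03d}", "wb") as dst:
            while buf:
                dst.write(buf)
                written += len(buf)
                if written >= PART_SIZE:
                    break
                buf = src.read(min(BUF_SIZE, PART_SIZE - written))
        part += 1
```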