The dataset viewer is not available for this split.
Cannot load the dataset split (in streaming mode) to extract the first rows.
Error code: StreamingRowsError
Exception: RuntimeError
Message: Disallowed deserialization of 'arrow.py_extension_type':
storage_type = list<item: list<item: list<item: float>>>
serialized = b'\x80\x04\x95M\x00\x00\x00\x00\x00\x00\x00\x8c\x1adatasets.features.features\x94\x8c\x14Array3DExtensionType\x94\x93\x94K\x03K\xe0K\xe0\x87\x94\x8c\x07float32\x94\x86\x94R\x94.'
pickle disassembly:
    0: \x80 PROTO      4
    2: \x95 FRAME      77
   11: \x8c SHORT_BINUNICODE 'datasets.features.features'
   39: \x94 MEMOIZE    (as 0)
   40: \x8c SHORT_BINUNICODE 'Array3DExtensionType'
   62: \x94 MEMOIZE    (as 1)
   63: \x93 STACK_GLOBAL
   64: \x94 MEMOIZE    (as 2)
   65: K    BININT1    3
   67: K    BININT1    224
   69: K    BININT1    224
   71: \x87 TUPLE3
   72: \x94 MEMOIZE    (as 3)
   73: \x8c SHORT_BINUNICODE 'float32'
   82: \x94 MEMOIZE    (as 4)
   83: \x86 TUPLE2
   84: \x94 MEMOIZE    (as 5)
   85: R    REDUCE
   86: \x94 MEMOIZE    (as 6)
   87: .    STOP
highest protocol among opcodes = 4

Reading of untrusted Parquet or Feather files with a PyExtensionType column allows arbitrary code execution.
If you trust this file, you can enable reading the extension type by one of:

- upgrading to pyarrow >= 14.0.1, and call `pa.PyExtensionType.set_auto_load(True)`
- disable this error by running `import pyarrow_hotfix; pyarrow_hotfix.uninstall()`

We strongly recommend updating your Parquet/Feather files to use extension types derived from `pyarrow.ExtensionType` instead, and register this type explicitly. See https://arrow.apache.org/docs/dev/python/extending_types.html#defining-extension-types-user-defined-types for more details.

Traceback:
Traceback (most recent call last):
  File "/src/services/worker/src/worker/job_runners/split/first_rows.py", line 322, in compute
    compute_first_rows_from_parquet_response(
  File "/src/services/worker/src/worker/job_runners/split/first_rows.py", line 88, in compute_first_rows_from_parquet_response
    rows_index = indexer.get_rows_index(
  File "/src/libs/libcommon/src/libcommon/parquet_utils.py", line 444, in get_rows_index
    return RowsIndex(
  File "/src/libs/libcommon/src/libcommon/parquet_utils.py", line 347, in __init__
    self.parquet_index = self._init_parquet_index(
  File "/src/libs/libcommon/src/libcommon/parquet_utils.py", line 364, in _init_parquet_index
    response = get_previous_step_or_raise(
  File "/src/libs/libcommon/src/libcommon/simple_cache.py", line 566, in get_previous_step_or_raise
    raise CachedArtifactError(
libcommon.simple_cache.CachedArtifactError: The previous step failed.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/src/services/worker/src/worker/utils.py", line 126, in get_rows_or_raise
    return get_rows(
  File "/src/services/worker/src/worker/utils.py", line 64, in decorator
    return func(*args, **kwargs)
  File "/src/services/worker/src/worker/utils.py", line 103, in get_rows
    rows_plus_one = list(itertools.islice(ds, rows_max_number + 1))
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/iterable_dataset.py", line 1388, in __iter__
    for key, example in ex_iterable:
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/iterable_dataset.py", line 282, in __iter__
    for key, pa_table in self.generate_tables_fn(**self.kwargs):
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/packaged_modules/parquet/parquet.py", line 85, in _generate_tables
    parquet_file = pq.ParquetFile(f)
  File "/src/services/worker/.venv/lib/python3.9/site-packages/pyarrow/parquet/core.py", line 341, in __init__
    self.reader.open(
  File "pyarrow/_parquet.pyx", line 1261, in pyarrow._parquet.ParquetReader.open
  File "pyarrow/types.pxi", line 88, in pyarrow.lib._datatype_to_pep3118
  File "/src/services/worker/.venv/lib/python3.9/site-packages/pyarrow_hotfix/__init__.py", line 47, in __arrow_ext_deserialize__
    raise RuntimeError(
RuntimeError: Disallowed deserialization of 'arrow.py_extension_type':
storage_type = list<item: list<item: list<item: float>>>
serialized = b'\x80\x04\x95M\x00\x00\x00\x00\x00\x00\x00\x8c\x1adatasets.features.features\x94\x8c\x14Array3DExtensionType\x94\x93\x94K\x03K\xe0K\xe0\x87\x94\x8c\x07float32\x94\x86\x94R\x94.'
pickle disassembly:
    0: \x80 PROTO      4
    2: \x95 FRAME      77
   11: \x8c SHORT_BINUNICODE 'datasets.features.features'
   39: \x94 MEMOIZE    (as 0)
   40: \x8c SHORT_BINUNICODE 'Array3DExtensionType'
   62: \x94 MEMOIZE    (as 1)
   63: \x93 STACK_GLOBAL
   64: \x94 MEMOIZE    (as 2)
   65: K    BININT1    3
   67: K    BININT1    224
   69: K    BININT1    224
   71: \x87 TUPLE3
   72: \x94 MEMOIZE    (as 3)
   73: \x8c SHORT_BINUNICODE 'float32'
   82: \x94 MEMOIZE    (as 4)
   83: \x86 TUPLE2
   84: \x94 MEMOIZE    (as 5)
   85: R    REDUCE
   86: \x94 MEMOIZE    (as 6)
   87: .    STOP
highest protocol among opcodes = 4

Reading of untrusted Parquet or Feather files with a PyExtensionType column allows arbitrary code execution.
If you trust this file, you can enable reading the extension type by one of:

- upgrading to pyarrow >= 14.0.1, and call `pa.PyExtensionType.set_auto_load(True)`
- disable this error by running `import pyarrow_hotfix; pyarrow_hotfix.uninstall()`

We strongly recommend updating your Parquet/Feather files to use extension types derived from `pyarrow.ExtensionType` instead, and register this type explicitly. See https://arrow.apache.org/docs/dev/python/extending_types.html#defining-extension-types-user-defined-types for more details.
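In short, the viewer's worker refuses to deserialize the legacy `arrow.py_extension_type` metadata that older versions of the `datasets` library wrote for this dataset's `Array3D` column (per the pickle disassembly: a `float32` array of shape 3×224×224, stored as `list<list<list<float>>>`), because `pyarrow_hotfix` treats pickled extension types as untrusted. The error message itself names the opt-ins; the snippet below is only a minimal sketch of reading the files locally under those options, assuming you trust them. The Parquet file name is a placeholder, not this dataset's actual layout.

```python
# Minimal sketch (not the viewer's own code): read a Parquet file that carries the
# blocked 'arrow.py_extension_type' metadata. Only do this for files you trust;
# the file path below is a placeholder.
import pyarrow as pa
import pyarrow.parquet as pq

# Option 1 (pyarrow >= 14.0.1): explicitly opt back in to auto-loading PyExtensionType.
pa.PyExtensionType.set_auto_load(True)

# Option 2 (if the pyarrow_hotfix package is the guard in your environment):
# import pyarrow_hotfix
# pyarrow_hotfix.uninstall()

# The pickled type is datasets.features.features.Array3DExtensionType, so the
# `datasets` library must be importable for deserialization to succeed.
table = pq.read_table("data/train-00000-of-00001.parquet")  # placeholder path
print(table.schema)  # the Array3D column is stored as list<list<list<float32>>>
```

The longer-term fix the message recommends is regenerating the Parquet files so the column uses an extension type derived from `pyarrow.ExtensionType`, registered explicitly, which re-saving the dataset with a current `datasets`/`pyarrow` stack should produce.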
Need help to make the dataset viewer work? Make sure to review how to configure the dataset viewer, and open a discussion for direct support.