Dataset tags:
Tasks: Time Series Forecasting
Modalities: Tabular
Formats: parquet
Sub-tasks: multivariate-time-series-forecasting
Languages: English
Size: 1B - 10B
Proposal: Delete legacy files #13
opened by Jack-Kelly
The proposal is to delete all the legacy files, specifically:

| File(s) | Reason for deletion |
|---|---|
| `uk_pv.py` | HuggingFace says "The viewer is disabled because this dataset repo requires arbitrary Python code execution. Please consider removing the loading script and relying on automated data support". Also, `uk_pv.py` loads `pv.netcdf`, and `pv.netcdf` is likely to be deleted. And comment #9 shows that `uk_pv.py` is no longer working. |
| `pv.netcdf` | We have switched to Parquet files instead of NetCDF because Parquet files can be more compact, and are supported by more readers. `pv.netcdf` hasn't been updated for 3 years. |
| `{2,5,30}min.parquet` | These haven't been updated for 2 years. The 5-minutely and 30-minutely data exist in the `data/` folder. We could consider keeping `2min.parquet` because the 2-minutely data isn't stored in the `data/` folder. But, if I understand correctly, the 2-minutely data is super-noisy because these are instantaneous readings, so folks are almost always going to be better off using the resampled 5-minutely data anyway. |
| The yearly Parquet files in `data/` | These files are redundant because the exact same data is stored in the monthly files. It's super-easy to lazily open all the monthly files in polars like this: `pl.scan_parquet(PV_DATA_PATH / "data" / "*" / "*" / "*_30min.parquet")`. Users can't currently rely on the yearly files because not all years have yearly files. And it's a maintenance burden for OCF to create the yearly files. |
Please let us know ASAP if you have any concerns about us deleting these files! (Of course, because this dataset is stored as a git repo, you'll still be able to access the old data via the git command-line interface.)
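For anyone who does need a deleted file later, here is a sketch of recovering it from git history. It is demonstrated on a throwaway repo so it runs anywhere; for the real dataset you would run the last two commands inside your clone of the repo, with e.g. `pv.netcdf` as the path:

```shell
# Demo setup: a throwaway repo in which a file is committed, then deleted.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo
echo "legacy data" > pv.netcdf
git add pv.netcdf
git commit -qm "add legacy file"
git rm -q pv.netcdf
git commit -qm "delete legacy file"

# Find the commit that deleted the file (--diff-filter=D), then restore
# the file from that commit's parent, which still contained it.
deleted_in=$(git log --format=%H --diff-filter=D -- pv.netcdf | head -1)
git checkout -q "$deleted_in"^ -- pv.netcdf
cat pv.netcdf  # prints: legacy data
```

The same two-step pattern (`git log --diff-filter=D -- <path>` then `git checkout <commit>^ -- <path>`) works in any git repo, including HuggingFace dataset repos cloned locally.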
Jack-Kelly changed discussion status to closed