---
configs:
  - config_name: emissions
    data_files:
      - split: test
        path:
          - runs/slm_decoder_models/**/*/emissions.csv
          - runs/encoder_models/**/*/emissions.csv
  - config_name: evaluation_results
    data_files:
      - split: test
        path:
          - runs/slm_decoder_models/**/*/evaluation_results.csv
          - runs/encoder_models/**/*/evaluation_results.csv
---

Masters Updated Models - Training Emissions and Evaluation Results

This repository contains training emissions data and evaluation results for the model architectures trained and evaluated in the master's project.

Dataset Configurations

This dataset provides two configurations, one for emissions tracking and one for evaluation results, both covering the SLM decoder and encoder model runs:

Emissions Data

  • emissions: Carbon emissions tracking data (emissions.csv) collected during the SLM decoder and encoder model training runs

Evaluation Results

  • evaluation_results: Evaluation metrics (evaluation_results.csv) for both the SLM decoder models (Phi-3.5 variants) and the encoder models; to load a single model family on its own, see the sketch below
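
The evaluation_results configuration combines both model families. A minimal sketch of loading one family at a time by passing a data_files glob directly, reusing the patterns from the YAML configuration above; the repository id is the placeholder used throughout this card:

from datasets import load_dataset

# Load evaluation results for a single model family, using the same glob
# patterns as the YAML configuration above. The repository id is a placeholder.
slm_results = load_dataset(
    "your-username/masters-updated-models",
    data_files={"test": "runs/slm_decoder_models/**/*/evaluation_results.csv"},
)
encoder_results = load_dataset(
    "your-username/masters-updated-models",
    data_files={"test": "runs/encoder_models/**/*/evaluation_results.csv"},
)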

Repository Structure

runs/
├── slm_decoder_models/
│   ├── Phi-3.5-mini-instruct-phi35_ordinal_coral_lora-C1/
│   │   ├── evaluation_results.csv
│   │   ├── run_experiment.log
│   │   ├── .hydra/
│   │   ├── logs/
│   │   └── results/
│   ├── Phi-3.5-mini-instruct-phi35_ordinal_coral_lora-C2/
│   │   ├── emissions.csv
│   │   ├── evaluation_results.csv
│   │   ├── run_experiment.log
│   │   ├── .hydra/
│   │   ├── logs/
│   │   └── results/
│   ├── Phi-3.5-mini-instruct-phi35_ordinal_coral_lora-C3/
│   ├── Phi-3.5-mini-instruct-phi35_ordinal_corn_lora-C1/
│   └── Phi-3.5-mini-instruct-phi35_ordinal_corn_lora-C2/
├── encoder_models/
│   └── [model directories with similar structure]
├── base_models/
│   └── [model directories with similar structure]
└── large_models/
    └── [model directories with similar structure]
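
The layout above can also be inspected programmatically. A minimal sketch using huggingface_hub; the repository id is the placeholder used throughout this card:

from huggingface_hub import HfApi

# List every file in the dataset repository and keep the per-run CSVs.
api = HfApi()
files = api.list_repo_files("your-username/masters-updated-models", repo_type="dataset")

emissions_files = [f for f in files if f.endswith("emissions.csv")]
results_files = [f for f in files if f.endswith("evaluation_results.csv")]
print(f"{len(emissions_files)} emissions files, {len(results_files)} evaluation result files")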

Usage with Hugging Face Datasets

from datasets import load_dataset

# Load all emissions data (single "test" split)
emissions = load_dataset("your-username/masters-updated-models", "emissions")

# Load all evaluation results (SLM decoder and encoder runs combined, single "test" split)
evaluation_results = load_dataset("your-username/masters-updated-models", "evaluation_results")
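
Both configurations expose a single test split. As a quick-analysis sketch, a split can be converted to a pandas DataFrame; the column names come from the underlying CSVs and are not documented in this card:

# Inspect the evaluation results as a pandas DataFrame.
results_df = evaluation_results["test"].to_pandas()
print(results_df.columns.tolist())
print(results_df.head())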

Model Categories

SLM Decoder Models

  • Phi-3.5-mini-instruct with ordinal CORAL LoRA (C1, C2, C3)
  • Phi-3.5-mini-instruct with ordinal CORN LoRA (C1, C2)

Encoder Models

[List encoder models once confirmed]

Base Models

[List base models once confirmed]

Large Models

[List large models once confirmed]

Emissions Tracking

All emissions data is tracked using CodeCarbon v3.0.2, capturing:

  • Energy consumption (CPU, GPU, RAM)
  • Carbon emissions and emission rates
  • Hardware specifications (Intel Xeon Platinum 8568Y+, NVIDIA H200)
  • Geographic information (France, Île-de-France region)
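
A minimal sketch of totaling energy use and emissions across the tracked runs, continuing from the usage example above and assuming CodeCarbon's default column names energy_consumed (kWh) and emissions (kg CO2eq); verify these against the CSVs before relying on them:

# Sum energy use and carbon emissions across all tracked runs.
# Column names assume CodeCarbon's default CSV schema.
emissions_df = emissions["test"].to_pandas()

total_energy_kwh = emissions_df["energy_consumed"].sum()
total_emissions_kg = emissions_df["emissions"].sum()

print(f"Total energy consumed: {total_energy_kwh:.2f} kWh")
print(f"Total emissions: {total_emissions_kg:.3f} kg CO2eq")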