
syu03/content

PEFT · Safetensors · trl · dpo · Generated from Trainer
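The tags above mark this repo as a PEFT adapter produced with TRL's DPO trainer. A minimal loading sketch; the repo id is taken from the breadcrumb, and the presence of adapter weights plus an adapter_config.json at the repo root is an assumption (only the .config folder is shown on this page):

```python
from peft import AutoPeftModelForCausalLM

# Hypothetical: load the DPO-tuned adapter together with its base model.
# AutoPeftModelForCausalLM reads adapter_config.json, pulls the base model
# referenced there, and attaches the PEFT weights on top of it.
model = AutoPeftModelForCausalLM.from_pretrained("syu03/content")
model.eval()
```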
Model card · Files and versions · Community
content / .config
  • 1 contributor
History: 1 commit
syu03 · syu03/dpo-model · 7028fee · verified · 7 months ago
  • configurations · syu03/dpo-model · 7 months ago
  • logs · syu03/dpo-model · 7 months ago
  • .last_opt_in_prompt.yaml · 3 Bytes · syu03/dpo-model · 7 months ago
  • .last_survey_prompt.yaml · 37 Bytes · syu03/dpo-model · 7 months ago
  • .last_update_check.json · 135 Bytes · syu03/dpo-model · 7 months ago
  • active_config · 7 Bytes · syu03/dpo-model · 7 months ago
  • config_sentinel · 0 Bytes · syu03/dpo-model · 7 months ago
  • default_configs.db · 12.3 kB · syu03/dpo-model · 7 months ago
  • gce · 5 Bytes · syu03/dpo-model · 7 months ago
  • hidden_gcloud_config_universe_descriptor_data_cache_configs.db · 12.3 kB · syu03/dpo-model · 7 months ago
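The files listed above can also be inspected programmatically through huggingface_hub; a minimal sketch, assuming the repo id syu03/content from the breadcrumb and the commit 7028fee shown in the history:

```python
from huggingface_hub import hf_hub_download, list_repo_files

REPO_ID = "syu03/content"  # repo id taken from the page breadcrumb (assumption)
REVISION = "7028fee"       # commit hash shown in the listing above

# Enumerate the repository contents at that revision, keeping the .config entries.
config_files = [
    f for f in list_repo_files(REPO_ID, revision=REVISION)
    if f.startswith(".config/")
]
print(config_files)

# Download one of the listed files into the local cache and print its path.
local_path = hf_hub_download(
    repo_id=REPO_ID,
    filename=".config/.last_update_check.json",
    revision=REVISION,
)
print(local_path)
```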