Grogros/phi-2-safecoderCode-OurSafecoder
Tags: Text Generation · Transformers · Safetensors · phi · Generated from Trainer · conversational · text-generation-inference
License: mit
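
Per the tags, this is a phi-architecture causal language model stored as sharded safetensors, so it should load with the standard transformers text-generation API. A minimal sketch, assuming the repo follows the usual phi-2 layout; the prompt, dtype, and generation settings below are illustrative, not taken from this repo:

```python
# Minimal sketch: load the checkpoint for text generation with transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "Grogros/phi-2-safecoderCode-OurSafecoder"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,  # the two shards total ~5.5 GB, consistent with fp16 weights
    device_map="auto",          # requires the `accelerate` package
)

# Illustrative prompt only; the repo name suggests a code-focused fine-tune.
inputs = tokenizer("def fibonacci(n):", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```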
Files and versions (branch: main)
1 contributor · History: 6 commits
Latest commit by Grogros: Upload finetuning_config.yaml with huggingface_hub (9548740, verified) · about 3 hours ago
checkpoint-2000/ · Training in progress, step 2000, checkpoint · about 3 hours ago
.gitattributes · Safe · 1.52 kB · initial commit · about 6 hours ago
README.md · 1.26 kB · Model save · about 3 hours ago
added_tokens.json · 1.16 kB · Upload tokenizer · about 3 hours ago
config.json · 698 Bytes · Training in progress, step 2000 · about 3 hours ago
finetuning_config.yaml · 1.41 kB · Upload finetuning_config.yaml with huggingface_hub · about 3 hours ago
generation_config.json · Safe · 119 Bytes · Model save · about 3 hours ago
merges.txt · Safe · 456 kB · Upload tokenizer · about 3 hours ago
model-00001-of-00002.safetensors · LFS · 5 GB · Training in progress, step 2000 · about 3 hours ago
model-00002-of-00002.safetensors · LFS · 564 MB · Training in progress, step 2000 · about 3 hours ago
model.safetensors.index.json · Safe · 35.7 kB · Training in progress, step 2000 · about 3 hours ago (see the index sketch after this listing)
special_tokens_map.json · 1.07 kB · Upload tokenizer · about 3 hours ago
tokenizer.json · 3.57 MB · Upload tokenizer · about 3 hours ago
tokenizer_config.json · 8.57 kB · Upload tokenizer · about 3 hours ago
training_args.bin · pickle · 5.3 kB · LFS · Training in progress, step 2000 · about 3 hours ago
  Detected Pickle imports (10): accelerate.utils.dataclasses.DistributedType, transformers.training_args.TrainingArguments, transformers.trainer_utils.HubStrategy, transformers.trainer_pt_utils.AcceleratorConfig, transformers.trainer_utils.SchedulerType, transformers.training_args.OptimizerNames, transformers.trainer_utils.IntervalStrategy, accelerate.state.PartialState, transformers.trainer_utils.SaveStrategy, torch.device (see the pickle-loading sketch after this listing)
vocab.json · Safe · 798 kB · Upload tokenizer · about 3 hours ago
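
Index sketch. The two model-*.safetensors shards are tied together by model.safetensors.index.json, which maps each tensor name to the shard file holding it; from_pretrained reads this index to fetch and assemble the shards transparently. A minimal sketch of inspecting the index directly, assuming the standard safetensors index layout:

```python
# Minimal sketch: read the shard index that ties the two safetensors files together.
import json
from huggingface_hub import hf_hub_download

index_path = hf_hub_download(
    repo_id="Grogros/phi-2-safecoderCode-OurSafecoder",
    filename="model.safetensors.index.json",
)
with open(index_path) as f:
    index = json.load(f)

print(index["metadata"]["total_size"])            # total parameter bytes across shards
print(sorted(set(index["weight_map"].values())))  # the shard filenames referenced
```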
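
Pickle-loading sketch. training_args.bin is a pickled TrainingArguments object, which is why the scanner lists the imports above: unpickling it imports those classes and, in a malicious file, could execute arbitrary code. A minimal sketch of loading it, assuming you trust the uploader; the printed attributes are standard TrainingArguments fields, not values confirmed from this repo:

```python
# Minimal sketch: inspect the pickled training arguments. Full unpickling
# runs arbitrary code if the file is malicious; torch.load(..., weights_only=True)
# would reject the non-tensor classes listed above rather than import them.
import torch
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="Grogros/phi-2-safecoderCode-OurSafecoder",
    filename="training_args.bin",
)
args = torch.load(path, weights_only=False)       # trusted sources only
print(args.learning_rate, args.num_train_epochs)  # standard TrainingArguments fields
```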