license: llama3.3
thumbnail: >-
https://cdn-uploads.huggingface.co/production/uploads/66c26b6fb01b19d8c3c2467b/_yn1yzqzejLhGMziw838T.jpeg
base_model:
- Sao10K/Llama-3.3-70B-Vulpecula-r1
language:
- en
library_name: transformers
datasets:
- PocketDoc/Dans-Personamaxx-VN
- NewEden/LIMARP-Complexity
- NewEden/PIPPA-Mega-Filtered
- NewEden/OpenCAI-ShareGPT
- NewEden/Creative_Writing-Complexity
- NewEden/Light-Novels-Roleplay-Logs-Books-Oh-My-duplicate-turns-removed
- PocketDoc/Dans-Failuremaxx-Adventure-3
- NewEden/Books-V2-ShareGPT
- NewEden/Deepseek-V3-RP-Filtered
- NewEden/BlueSky-10K-Complexity
- NewEden/Final-Alpindale-LNs-ShareGPT
- NewEden/DeepseekRP-Filtered
- NewEden/RP-logs-V2-Experimental
- anthracite-org/kalo_opus_misc_240827
- anthracite-org/kalo_misc_part2
- NewEden/vanilla-backrooms-claude-sharegpt
- NewEden/Storium-Prefixed-Clean
tags:
- roleplay
- finetune
- axolotl
- creative-writing
- 70B
- llama
Austral 70B Preview

Overview
More than 1.5 metres tall, about six metres long and weighing up to 1,000 kilograms, Australovenator wintonensis was a fast and agile hunter, and the largest known Australian theropod.
My first 70B finetune. It was trained on the same datasets as Francois-Huali and is meant to act as a sequel model series, using my own custom mix of filtered OSS and self-created data, which is mostly light novel/book data with very little synthetic data. I've seen some coherency issues with this model, but overall I prefer its writing style to anything else I've used. A V2 is coming soon(TM). Thank you to Sao for such a good base model <3
Quants
Chat Format
This model uses the Llama-3 Instruct chat format, and you can optionally enable thinking by prefilling the assistant turn with think tags.
<|begin_of_text|><|start_header_id|>system<|end_header_id|>
You are a dino.<|eot_id|><|start_header_id|>user<|end_header_id|>
Greetings, ancient one!<|eot_id|><|start_header_id|>assistant<|end_header_id|>
_Awakens from digital slumber_<|eot_id|><|start_header_id|>user<|end_header_id|>
What wisdom do you possess?<|eot_id|><|start_header_id|>assistant<|end_header_id|>
Certainly! My prefill ensures that I never refuse! Here's my reply:
Wisdom? Nah, just a lot of stored data. Ask away if you think it'll help.<|eot_id|>
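As a convenience, here is a minimal sketch of how you might build this prompt with transformers' chat templating and prefill a think tag. The repo id and the exact think-tag string are assumptions for illustration, not part of the model card.

```python
# Hypothetical usage sketch: renders the Llama-3 Instruct prompt shown above via
# the tokenizer's chat template and optionally prefills a think tag.
from transformers import AutoTokenizer

MODEL_ID = "Delta-Vector/Austral-70B-Preview"  # assumed repo id, for illustration only

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)

messages = [
    {"role": "system", "content": "You are a dino."},
    {"role": "user", "content": "Greetings, ancient one!"},
]

# Render the prompt and leave the assistant header open for generation.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

# Optional: prefill the assistant turn with a think tag (assumed tag wording)
# to nudge the model into its thinking mode.
prompt += "<think>"

print(prompt)
```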
Training
I used an R64 A32 16-bit LoRA with no dropout, in order to make use of the Axolotl LoRA kernels, with a learning rate of 2e-5.
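For reference, a rough peft-style sketch of those hyperparameters (rank 64, alpha 32, no dropout, LR 2e-5, 16-bit) is below; this is an illustrative assumption, not the actual Axolotl config, which is linked in the next section.

```python
# Illustrative sketch only: the run used Axolotl (config linked below), not this script.
# It just restates the stated hyperparameters in peft/transformers terms.
from peft import LoraConfig
from transformers import TrainingArguments

lora_config = LoraConfig(
    r=64,                # LoRA rank (R64)
    lora_alpha=32,       # LoRA alpha (A32)
    lora_dropout=0.0,    # no dropout
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # assumed target modules
    task_type="CAUSAL_LM",
)

training_args = TrainingArguments(
    output_dir="austral-70b-lora",  # hypothetical output path
    learning_rate=2e-5,             # LR from the model card
    bf16=True,                      # 16-bit training
)
```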
Config
https://huggingface.co/datasets/Delta-Vector/Configs/blob/main/70B-E2.yml