---
title: README
emoji: π
colorFrom: blue
colorTo: gray
sdk: static
pinned: true
---
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Zeus Labs Organization Card</title>
<link href="https://fonts.googleapis.com/css2?family=Quicksand:wght@400;500;600&display=swap" rel="stylesheet">
<style>
/* Force color scheme and override Hugging Face styles */
:root {
color-scheme: light dark !important;
}
/* Ensure our styles take precedence */
html {
background: linear-gradient(135deg, #0a1128 0%, #1c2541 100%) !important;
color: #e0e1dd !important;
}
body,
.ant-layout,
.ant-layout-content,
div[class^="style-scope"],
div[class*=" style-scope"] {
background: transparent !important;
color: #e0e1dd !important;
}
/* Reset and base styles */
* {
box-sizing: border-box;
margin: 0;
padding: 0;
}
body {
font-family: 'Quicksand', sans-serif !important;
font-size: 16px !important;
line-height: 1.6 !important;
min-height: 100vh !important;
}
/* Container styles */
.container {
width: 100% !important;
max-width: 800px !important;
margin: 0 auto !important;
padding: 20px !important;
background-color: rgba(255, 255, 255, 0.05) !important;
border-radius: 12px !important;
box-shadow: 0 4px 10px rgba(0, 0, 0, 0.3) !important;
backdrop-filter: blur(10px) !important;
border: 1px solid rgba(255, 255, 255, 0.1) !important;
}
/* Typography */
h1, h2, h3, h4, h5, h6 {
color: #4cc9f0 !important;
margin-bottom: 20px !important;
}
h1 {
font-size: 36px !important;
text-shadow: 2px 2px 4px rgba(0, 0, 0, 0.3) !important;
}
h2 {
font-size: 24px !important;
color: #7209b7 !important;
margin-top: 30px !important;
}
p, ul {
margin-bottom: 15px !important;
}
/* Logo */
.logo {
width: 300px !important;
height: auto !important;
margin: 0 auto 20px !important;
display: block !important;
box-shadow: 0 0 20px rgba(79, 195, 247, 0.6) !important;
}
/* Links and buttons */
a {
color: #4cc9f0 !important;
text-decoration: none !important;
}
a:hover {
color: #f72585 !important;
}
.button {
display: inline-block !important;
background-color: #3a0ca3 !important;
color: #e0e1dd !important;
padding: 10px 20px !important;
border-radius: 5px !important;
cursor: pointer !important;
text-decoration: none !important;
margin-top: 20px !important;
font-weight: bold !important;
}
.button:hover {
background-color: #7209b7 !important;
}
/* Custom classes */
.team-member {
margin-bottom: 15px !important;
}
.achievements {
background-color: rgba(58, 12, 163, 0.2) !important;
border-radius: 10px !important;
padding: 12px !important;
margin-top: 20px !important;
margin-right: 10px !important;
}
.info2 {
margin-right: 20px !important;
}
</style>
</head>
<body>
<div class="container">
<img src="https://cdn-uploads.huggingface.co/production/uploads/64545af5ec40bbbd01242ca6/Lo7XdJrd46jmcjyBqLGGA.png" alt="Zeus Labs Logo" class="logo">
<div class="header">
</div>
<div class="info">
<p>We are a small but ambitious AI research group, focused on developing performant and highly capable Large Language Models that aim to excel in their domains. Our specialty lies in the exploration of cutting-edge model finetuning methods and innovative data preparation techniques.</p>
<p></p>
<h2>Our Mission</h2>
<p>At Zeus Labs, we strive to push the boundaries of AI capabilities, particularly in the realm of language models. Our goal is to create models that not only perform well but also demonstrate exceptional abilities in specific domains, contributing to the advancement of AI technology and its applications.</p>
<p></p>
<h2>Our Approach</h2>
<div>
<ul class="info2">
<li>Cutting-edge finetuning methods for Large Language Models</li>
<li>Innovative data preparation and curation techniques</li>
<li>Focus on domain-specific excellence and versatility</li>
<li>Open collaboration and knowledge sharing within the AI community</li>
<li>Advancing LLM research through novel techniques applied by all of our members</li>
</ul>
</div>
<p></p>
<h2>Team</h2>
<div class="team-member">
<p>Chief ML Engineer, M.S.</p>
<p><strong>@elinas</strong> - <a href="https://huggingface.co/elinas" target="_blank">HuggingFace Profile</a></p>
</div>
<div class="team-member">
<p>Senior Data Scientist, PhD</p>
<p><strong>@ToastyPigeon</strong> - <a href="https://huggingface.co/ToastyPigeon" target="_blank">HuggingFace Profile</a></p>
</div>
<div class="team-member">
<p>Operations Engineer</p>
<p><strong>@fizz</strong> - <a href="https://huggingface.co/Fizzarolli" target="_blank">HuggingFace Profile</a></p>
</div>
<div class="team-member">
<p>ML / DS Engineer</p>
<p><strong>@SteelSkull</strong> - <a href="https://huggingface.co/Steelskull" target="_blank">HuggingFace Profile</a></p>
</div>
<p></p>
<h2>Notable Achievements</h2>
<div class="achievements">
<ul>
<li>Revival of Llama 1 33B by training on over 500M tokens</li>
<li>Building on the original pretraining run of 1.4T tokens, we added roughly 500M more; to our surprise, the result surpassed expectations in both quality and output length</li>
<li>
It was trained at a 16384 context length, with an *effective* context length of around 12k due to the nature of the samples, and it excels at RP.
</li>
<li>
Our next goal is to apply GQA to it; in the meantime, we appreciate the quanters who help make this model runnable with less VRAM!
</li>
</ul>
<ul>
<li>Development of the L3-Aethora-15B series: the first heavily finetuned 15B model focused on creative writing and general intelligence, built using a novel
technique known as "zeroing layers"</li>
<li>Creation of Aether-Lite-V1.8.1, a carefully curated dataset for AI training</li>
</ul>
</div>
<p></p>
<h2>Join Us</h2>
<p>We are currently growing and looking for passionate individuals interested in machine learning and AI research. Whether you're a seasoned researcher or an enthusiastic beginner, there's a place for you in our community.</p>
<p>Join our Discord to connect with like-minded individuals, share ideas, and potentially collaborate on exciting AI projects!</p>
<a href="https://discord.gg/tknvCD4aje" class="button" target="_blank">Join The Zeus Labs Discord</a>
<p></p>
<h2>Our Work</h2>
<p>Explore our independently developed work and collaborations on our HuggingFace profiles. We're always pushing the boundaries of what's possible with AI!</p>
<p></p>
<h2>Model Quanters!</h2>
<p>If you create quants for our models and we miss them, please open a discussion on that model and we will add them to the model card!</p>
</div>
</div>
</body>
</html> |