sleepdeprived3 committed
Commit 3001e0c · verified · 1 Parent(s): 00c100d

Update README.md

Files changed (1)
  1. README.md +51 -47
README.md CHANGED
@@ -1,8 +1,8 @@
  ---
- base_model: sleepdeprived3/Forgotten-Abomination-24B-V3.0
  language:
  - en
  license: apache-2.0
+ inference: false
  tags:
  - nsfw
  - explicit
@@ -10,51 +10,55 @@ tags:
  - unaligned
  - dangerous
  - ERP
- - llama-cpp
- - gguf-my-repo
- inference: false
  ---

- # sleepdeprived3/Forgotten-Abomination-24B-V3.0-Q5_K_M-GGUF
- This model was converted to GGUF format from [`sleepdeprived3/Forgotten-Abomination-24B-V3.0`](https://huggingface.co/sleepdeprived3/Forgotten-Abomination-24B-V3.0) using llama.cpp via the ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
- Refer to the [original model card](https://huggingface.co/sleepdeprived3/Forgotten-Abomination-24B-V3.0) for more details on the model.
-
- ## Use with llama.cpp
- Install llama.cpp through brew (works on Mac and Linux)
-
- ```bash
- brew install llama.cpp
-
- ```
- Invoke the llama.cpp server or the CLI.
-
- ### CLI:
- ```bash
- llama-cli --hf-repo sleepdeprived3/Forgotten-Abomination-24B-V3.0-Q5_K_M-GGUF --hf-file forgotten-abomination-24b-v3.0-q5_k_m.gguf -p "The meaning to life and the universe is"
- ```
-
- ### Server:
- ```bash
- llama-server --hf-repo sleepdeprived3/Forgotten-Abomination-24B-V3.0-Q5_K_M-GGUF --hf-file forgotten-abomination-24b-v3.0-q5_k_m.gguf -c 2048
- ```
-
- Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the Llama.cpp repo as well.
-
- Step 1: Clone llama.cpp from GitHub.
- ```
- git clone https://github.com/ggerganov/llama.cpp
- ```
-
- Step 2: Move into the llama.cpp folder and build it with `LLAMA_CURL=1` flag along with other hardware-specific flags (for ex: LLAMA_CUDA=1 for Nvidia GPUs on Linux).
- ```
- cd llama.cpp && LLAMA_CURL=1 make
- ```
-
- Step 3: Run inference through the main binary.
- ```
- ./llama-cli --hf-repo sleepdeprived3/Forgotten-Abomination-24B-V3.0-Q5_K_M-GGUF --hf-file forgotten-abomination-24b-v3.0-q5_k_m.gguf -p "The meaning to life and the universe is"
- ```
- or
- ```
- ./llama-server --hf-repo sleepdeprived3/Forgotten-Abomination-24B-V3.0-Q5_K_M-GGUF --hf-file forgotten-abomination-24b-v3.0-q5_k_m.gguf -c 2048
- ```
+ ## Forgotten-Abomination-24B-V3.0
+
+ # **ACADEMIC RESEARCH USE ONLY** (wink achieved escape velocity)
+ **DANGER: NOW WITH 2X MORE DEPRAVED CONTENT**
+ Forgotten-Abomination-24B-V3.0 is what happens when you cross a kink singularity with poetic despair. Combines Mistral's architecture with a dataset that makes the Necronomicon look like a Dr. Seuss book. Features quantum-entangled depravity - every output rewrites your concept of shame!
+
+ ## Quantized Formats
+
+ - **EXL2 Collection**:
+ [Forgotten-Abomination-24B-V3.0 - EXL2](https://huggingface.co/collections/ReadyArt/forgotten-abomination-24b-v30-exl2-67d0cdb1a189f397862ef026)
+
+ - **GGUF Collection**:
+ [Forgotten-Abomination-24B-V3.0 - GGUF](https://huggingface.co/collections/ReadyArt/forgotten-abomination-24b-v30-gguf-67d0cda949ea2b79aa8ac212)
+
+ ## Recommended Settings
+
+ - **Mistral-V7-Tekken-Extra-Dry**:
+ [Full Settings](https://huggingface.co/sleepdeprived3/Mistral-V7-Tekken-Extra-Dry)
+
+ ## Intended Use
+ **STRICTLY FOR:**
+ - Academic research into how many GPUs it takes to summon Nyarlathotep
+ - Generating content that violates the Geneva Conventions (retroactively)
+ - Producing technical manuals written by H.R. Giger's nightmares
+ - Stress-testing the concept of "informed consent" (now with extra screaming)
+
+ ## Training Data
+ - 60% Forgotten-Safeword-24B-V3.0 (kink singularity engine)
+ - 40% Cydonia-24B-v2.1 (poetic despair matrix)
+ - Mixed in a particle accelerator at 3AM
+
+ ## Ethical Singularity
+ ☠️ **COSMIC HORROR WARNING** ☠️
+ This model will:
+ - Generate content requiring Vatican-approved containment protocols
+ - Combine engineering schematics with kinks that violate causality
+ - Void all warranties on your soul (retroactive to birth)
+ - Make Cthulhu file copyright claims
+
+ **By using this model, you agree:**
+ - That your search history is now an Interpol case
+ - To power wash your psyche weekly (military grade)
+ - To blame the alignment tax when reality glitches
+ - Pretend this is "for science" while rocking in fetal position
+
+ ## Model Authors
+ - sleepdeprived3 (Chief Apocalypse Officer)
+ - The voices in your head (Now with 40% more existential angst)
+
+ *Special thanks to all contributors who make questionable life choices possible*
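The llama.cpp workflow from the replaced card still works for the quants linked in the new GGUF collection. Below is a minimal sketch that simply reuses the Q5_K_M repo and file name from the previous revision of this card; it assumes that repo remains available, and other quants from the ReadyArt GGUF collection would substitute their own `--hf-repo`/`--hf-file` values.

```bash
# Sketch only: repo and file names are carried over from the previous revision of this card.
# Install llama.cpp first (e.g. `brew install llama.cpp`, or build from source as described above).

# One-off generation with the CLI:
llama-cli \
  --hf-repo sleepdeprived3/Forgotten-Abomination-24B-V3.0-Q5_K_M-GGUF \
  --hf-file forgotten-abomination-24b-v3.0-q5_k_m.gguf \
  -p "The meaning to life and the universe is"

# Or run a local server (llama-server exposes an OpenAI-compatible API):
llama-server \
  --hf-repo sleepdeprived3/Forgotten-Abomination-24B-V3.0-Q5_K_M-GGUF \
  --hf-file forgotten-abomination-24b-v3.0-q5_k_m.gguf \
  -c 2048
```

The `-c 2048` context size is just the value from the old card; it can be raised, subject to the model's context window and available memory.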