Source model
Provided quantized models
| Type | Size | CLI |
|---|---|---|
| H8-4.0BPW | 7.49 GB | Copy-paste the lines / Download the batch file |
| H8-6.0BPW | 10.22 GB | Copy-paste the lines / Download the batch file |
| H8-8.0BPW | 12.95 GB | Copy-paste the lines / Download the batch file |
Requirements: a Python installation with the `huggingface-hub` package to use the CLI.
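As a minimal sketch of what the CLI lines do: assuming the quants are published as branches named after the Type column of the table above, and using the repository id from the model tree at the bottom of this page, the download command for one quant could be assembled like this (branch names and directory layout are assumptions, not verified against the repo):

```python
# Sketch: build a `huggingface-cli download` command for one quant.
# REPO_ID comes from this page's model-tree entry; the branch names
# are assumed to match the Type column (e.g. "H8-6.0BPW").
REPO_ID = "DeathGodlike/Vortex5_Wicked-Nebula-12B_EXL3"

def download_command(quant: str) -> str:
    """Return the shell command that fetches one quant branch
    into its own local folder."""
    local_dir = f"./{REPO_ID.split('/')[-1]}_{quant}"
    return (f"huggingface-cli download {REPO_ID} "
            f"--revision {quant} --local-dir {local_dir}")

print(download_command("H8-6.0BPW"))
```

Keeping each quant in a separate `--local-dir` folder avoids the branches overwriting each other's identically named weight files.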
Licensing
License detected: apache-2.0
The license of the provided quantized models is inherited from the source model (which in turn incorporates the license of its original base model). For definitive licensing information, refer first to the pages of the source and base models. File and page backups of the source model are provided below.
Backups
Date: 22.03.2026
Source page
Wicked-Nebula-12B
Overview
Wicked-Nebula-12B was created by merging Hollow-Aether-12B, MN-12b-RP-Ink-RP-Longform, Lunar-Twilight-12B, Astral-Noctra-12B, and Rocinante-X-12B-v1, using a custom method.
Merge configuration
```yaml
base_model: Vortex5/Hollow-Aether-12B
models:
  - model: SuperbEmphasis/MN-12b-RP-Ink-RP-Longform
  - model: Vortex5/Lunar-Twilight-12B
  - model: Vortex5/Astral-Noctra-12B
  - model: TheDrummer/Rocinante-X-12B-v1
merge_method: smcos
chat_template: auto
parameters:
  strength: 0.6
  select: 0.72
  novelty: 0.33
  shape: 0.4
  stability: 0.74
dtype: float32
out_dtype: bfloat16
tokenizer:
  source: Vortex5/Hollow-Aether-12B
```
Intended Use
- Storytelling: structured long-form narrative
- Roleplay: emotion-forward interaction
- Creative Writing: atmospheric fiction
Model tree for DeathGodlike/Vortex5_Wicked-Nebula-12B_EXL3
Base model: Vortex5/Wicked-Nebula-12B