Update README.md
README.md CHANGED
@@ -8,29 +8,12 @@ tags:
 - merge
 ---
-
-This model was merged using the Passthrough merge method using [unsloth/gemma-3-12b-it](https://huggingface.co/unsloth/gemma-3-12b-it) + [NewEden/Gemma-LN-Lora](https://huggingface.co/NewEden/Gemma-LN-Lora) as a base.
-
-### Models Merged
-
-The following models were included in the merge:
-
-### Configuration
-
-The following YAML configuration was used to produce this model:
-
-```yaml
-base_model: unsloth/gemma-3-12b-it+NewEden/Gemma-LN-Lora
-dtype: bfloat16
-merge_method: passthrough
-models:
-- model: unsloth/gemma-3-12b-it+NewEden/Gemma-LN-Lora
-```
+
+Hparams:
+
+- LR: 1e-5
+- Weight Decay: 0.02
+- grad-clip: 0.2
+
+Pretrained on a bunch of LNs, books, etc.
+Used as a base for Pascal.
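
For context on the removed section: `base_model: unsloth/gemma-3-12b-it+NewEden/Gemma-LN-Lora` uses mergekit's `model+lora` syntax, meaning the LoRA adapter is applied to the base model before the passthrough merge. The sketch below shows a roughly equivalent operation with `peft`; it is an approximation, not the command the author ran, and the output path is a made-up placeholder.

```python
# Rough equivalent of the removed passthrough config: load the base model,
# apply the NewEden/Gemma-LN-Lora adapter, and bake it into the base weights.
# This peft sketch approximates mergekit's "base+lora" handling; it is not
# the author's actual merge step.
import torch
from transformers import AutoModelForCausalLM  # exact Auto class for Gemma 3 may differ by version
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained(
    "unsloth/gemma-3-12b-it",
    torch_dtype=torch.bfloat16,  # matches `dtype: bfloat16` in the removed config
)
model = PeftModel.from_pretrained(base, "NewEden/Gemma-LN-Lora")
model = model.merge_and_unload()  # fold the LoRA deltas into the base weights
model.save_pretrained("./gemma-3-12b-it-plus-ln-lora")  # hypothetical output path
```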
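
The new card lists only three hyperparameters (LR, weight decay, grad-clip). As a hypothetical illustration of how they would map onto Hugging Face `TrainingArguments`; the output directory, the bf16 flag, and anything else not listed in the card are assumptions, and this is not the author's training script.

```python
# Hypothetical mapping of the card's hyperparameters onto TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./gemma-ln-pretrain",  # hypothetical path
    learning_rate=1e-5,                # "LR: 1e-5"
    weight_decay=0.02,                 # "Weight Decay: 0.02"
    max_grad_norm=0.2,                 # "grad-clip: 0.2" (gradient clipping threshold)
    bf16=True,                         # assumption: matches the bfloat16 dtype of the merge
)
```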