---
license: apache-2.0
language:
- en
- fr
- de
- es
- it
- pt
- ru
- zh
- ja
base_model:
- mistralai/Mistral-Nemo-Instruct-2407
pipeline_tag: text-generation
library_name: transformers
---

# Nemo-Instruct-2407-MPOA-v2-12B
MPOA (Magnitude-Preserving Orthogonalized Ablation, a.k.a. norm-preserving biprojected abliteration) has been applied to many of the layers in this model, targeting both the mlp.down_proj.weight and self_attn.o_proj.weight streams.
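The card does not spell out the exact MPOA procedure, but the core idea of orthogonalized ablation with magnitude preservation can be sketched roughly as follows: project a refusal direction out of a weight matrix's output space, then rescale so the overall weight magnitude is unchanged. The function name `mpoa_ablate`, the use of the Frobenius norm for the rescaling step, and the random test shapes are all assumptions for illustration, not the author's actual implementation.

```python
import numpy as np

def mpoa_ablate(W, r):
    """Hypothetical sketch of magnitude-preserving orthogonalized ablation.

    W: weight matrix of shape (d_out, d_in), e.g. a down_proj or o_proj weight.
    r: refusal direction in the output space of W (length d_out).
    """
    r = r / np.linalg.norm(r)            # unit refusal direction
    # Orthogonalize: remove the component of W's output along r,
    # so that r^T @ W_abl == 0 for every input.
    W_abl = W - np.outer(r, r @ W)
    # Magnitude preservation (assumed here as Frobenius-norm rescaling,
    # which keeps the orthogonality property intact).
    return W_abl * (np.linalg.norm(W) / np.linalg.norm(W_abl))

rng = np.random.default_rng(0)
W = rng.standard_normal((8, 16))
r = rng.standard_normal(8)
W2 = mpoa_ablate(W, r)
```

After ablation, `W2` produces no output component along `r`, while its overall Frobenius norm matches that of the original `W`.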
Compliance was not maximized for this model. The model appears to sit near an edge of chaos with regard to some safety refusals, which should make it suitable for varied text completion.
More details pending.