# Model Card for Phi-4-Argunaut-1-HIRPO
This model is a fine-tuned version of DebateLabKIT/Phi-4-Argunaut-1-SPIN-dev1. It has been trained using TRL.
## Quick start

```python
from transformers import pipeline

question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"
generator = pipeline("text-generation", model="DebateLabKIT/Phi-4-Argunaut-1-HIRPO", device="cuda")
output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
```
## Training procedure

This model was trained with Hindsight Instruction Relabeling Preference Optimization (HIRPO), an online DPO variant derived from the paper The Wisdom of Hindsight Makes Language Models Better Instruction Followers.
More details about the training procedure can be found in the blog post.
We have released the preference pairs generated online as a separate dataset: DebateLabKIT/argunauts-hirpo-preferences.
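The core idea behind generating such preference pairs online can be sketched as follows. This is a hypothetical, simplified illustration, not the actual DebateLabKIT training code: the helper names (`build_preference_pair`, `toy_score`) and the toy scoring rule are assumptions for the sake of the example. The sketch shows how sampled completions for a prompt might be ranked in hindsight and turned into a chosen/rejected pair suitable for DPO-style training.

```python
# Hypothetical sketch: turn sampled completions into a DPO preference pair
# by ranking them with a hindsight scoring function. Helper names and the
# toy scorer are illustrative assumptions, not the actual HIRPO pipeline.

def build_preference_pair(prompt, completions, score):
    """Rank sampled completions and emit a chosen/rejected pair."""
    ranked = sorted(completions, key=score, reverse=True)
    return {"prompt": prompt, "chosen": ranked[0], "rejected": ranked[-1]}

def toy_score(completion):
    # Toy stand-in for a real verifier: prefer completions that make the
    # inference explicit with a conclusion marker.
    return int("therefore" in completion.lower())

pair = build_preference_pair(
    "Reconstruct the argument.",
    ["It rains, therefore the street is wet.", "The street is wet."],
    toy_score,
)
# pair["chosen"] holds the higher-scoring completion,
# pair["rejected"] the lower-scoring one.
```

Pairs of this shape (prompt, chosen, rejected) are what a preference-optimization trainer such as TRL's DPO trainer consumes.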
## Framework versions
- TRL: 0.19.1
- Transformers: 4.53.3
- Pytorch: 2.4.1
- Datasets: 3.1.0
- Tokenizers: 0.21.4
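To reproduce this environment, the versions above can be pinned directly; this is a minimal sketch assuming a fresh virtual environment (package names follow the standard PyPI distributions).

```shell
# Pin the framework versions listed above (assumes a fresh virtualenv).
pip install trl==0.19.1 transformers==4.53.3 torch==2.4.1 \
    datasets==3.1.0 tokenizers==0.21.4
```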
## Evaluation

As described in this article, Phi-4-Argunaut-1-HIRPO technically masters formal argument analysis but has lost general conversational abilities during the one-sided training.
## Citations

Cite HIR as:

```bibtex
@misc{zhang2023wisdomhindsightmakeslanguage,
      title={The Wisdom of Hindsight Makes Language Models Better Instruction Followers},
      author={Tianjun Zhang and Fangchen Liu and Justin Wong and Pieter Abbeel and Joseph E. Gonzalez},
      year={2023},
      eprint={2302.05206},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2302.05206},
}
```
Cite TRL as:

```bibtex
@misc{vonwerra2022trl,
      title = {{TRL: Transformer Reinforcement Learning}},
      author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallou{\'e}dec},
      year = 2020,
      journal = {GitHub repository},
      publisher = {GitHub},
      howpublished = {\url{https://github.com/huggingface/trl}}
}
```