| id (string, 6-9 chars) | question (string, 32-113 chars) | golden_answers (list, 1-16 items) | metadata (dict) | data_source (string, 1 distinct value) | prompt (list, 1 message) | ability (string, 1 distinct value) | reward_model (dict) | extra_info (dict) |
|---|---|---|---|---|---|---|---|---|
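The preview rows below are pretty-printed per field, so a short loading sketch may be easier to work from than the raw dump. This is a minimal example under stated assumptions: the file name `test.parquet` is a placeholder (the preview does not say how the split is stored), and the case-insensitive exact-match rule is an assumption, not a documented scoring function.

```python
import pandas as pd

# Hypothetical file name: the preview does not specify how the split is shipped.
df = pd.read_parquet("test.parquet")

def matches_ground_truth(prediction: str, row) -> bool:
    """Case-insensitive exact match against the golden-answer aliases.

    reward_model.ground_truth.target lists acceptable surface forms
    (e.g. "UAE", "United Arab Emirates").  The normalisation here is an
    assumption; the dataset itself does not document a scoring rule.
    """
    targets = row["reward_model"]["ground_truth"]["target"]
    return prediction.strip().lower() in {t.strip().lower() for t in targets}

row = df.iloc[0]
print(row["id"], "-", row["question"])                     # dev_9260 - Which country ...
print(row["prompt"][0]["role"])                            # "user"
print(matches_ground_truth("United Arab Emirates", row))   # True under exact match
```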
dev_9260
|
Which country Haya Bint Hussein's husband is from?
|
[
"the United Arab Emirates",
"🇦🇪",
"UAE",
"the UAE",
"United Arab Emirates",
"the Emirates",
"the U.A.E.",
"ae",
"U.A.E.",
"Emirates"
] |
{
"context": {
"content": [
[
"Sumaya bint Hassan( born 14 May 1971) is a princess of Jordan."
],
[
"Princess Haya bint Hussein (born 3 May 1974), commonly known as Princess Haya of Jordan, is the daughter of King Hussein of Jordan and his third wife Queen Alia, and the half-sister of King Abdullah II.",
"She is a graduate of Oxford University in England and an accomplished equestrian.",
"She represented Jordan at the 2000 Summer Olympics in Sydney, Australia and is the two-term President of the International Federation for Equestrian Sports (FEI).",
"In addition, she engages in a variety of charitable activities.",
"In 2004, Haya became the sixth wife of the ruler of the Emirate of Dubai, Sheikh Mohammed bin Rashid Al Maktoum.",
"The Sheikh’s first wife,",
"Hind bint Maktoum bin Juma Al Maktoum, remained first lady of Dubai.",
"In June 2019, Haya left her husband in Dubai with her two children and is currently residing in her mansion in the United Kingdom, while conducting legal proceedings over custody of her children."
],
[
"Princess Qhabl Begum Sahiba Esra Birgen (born 1936) is a princess belonging to the Asaf Jah Dynasty of Hyderabad state by marriage.",
"She was married to Prince Mukarram Jah.",
"Princess Esra is a princess by marriage not by birth."
],
[
"Haya bint Abdulaziz( 1929 – 2 November 2009) was a member of the House of Saud."
],
[
"Yousef Al- Samoui is a Jordanian footballer who plays for Al- Hussein."
],
[
"Muhannad Al- Ezza is a Jordanian footballer who plays for Al- Hussein."
],
[
"Lubaba bint al- Harith( died c. 650), also known as Umm Fadl, was a prominent early Muslim.",
"Two of her sisters, Maymuna bint al- Harith and Zaynab bint Khuzayma, became wives of Muhammad."
],
[
"Sheikh Mohammed bin Rashid Al Maktoum (born 15 July 1949) is the Vice President and Prime Minister of the United Arab Emirates (UAE), and ruler of the Emirate of Dubai.",
"Since his accession in 2006, after the death of his brother Sheikh Maktoum, he has undertaken reforms in the UAE's government, starting with the UAE Federal Government Strategy in April 2007.",
"In 2010, he launched the UAE vision 2021 with the aim of making the UAE 'one of the best countries in the world' by 2021.",
"He is responsible for the growth of Dubai into a global city, as well as the launch of a number of major enterprises including Emirates Airline, DP World, and the Jumeirah Group.",
"Many of these are held by Dubai Holding, a company with multi-diversified businesses and investments.",
"Sheikh Mohammed has overseen the development of numerous projects in Dubai including the creation of a technology park and a free economic zone, Dubai Internet City, Dubai Media City, the Dubai International Finance Centre, the Palm Islands and the Burj Al Arab hotel.",
"He also drove the construction of Burj Khalifa, the tallest building in the world.",
"A keen equestrian, he is the founder of the Maktoum family-owned Godolphin racing stable and the owner of Darley, a thoroughbred breeding operation with operations in six countries.",
"In 2012, he rode the horse \"Madji Du Pont\" 160 km to take the FEI World Endurance Championship.",
"He is a recognised poet in his native Arabic.",
"He shares a close and friendly relationship with Mohammed bin Zayed Al Nahyan, who is the Crown Prince of the Emirate of Abu Dhabi, and the Deputy Supreme Commander of the UAE Armed Forces."
],
[
"Malek Al- Yasseri is a Jordanian footballer who plays for Al- Hussein."
],
[
"Tsandzile Dlamini is a princess of Swaziland who currently serves as Minister of Home Affairs."
]
],
"title": [
"Princess Sumaya bint Hassan",
"Haya bint Hussein",
"Princess Esra",
"Haya bint Abdulaziz Al Saud",
"Yousef Al-Samoui",
"Muhannad Al-Ezza",
"Lubaba bint al-Harith",
"Mohammed bin Rashid Al Maktoum",
"Malek Al-Yasseri",
"Tsandzile Dlamini"
]
},
"supporting_facts": {
"sent_id": [
4,
0
],
"title": [
"Haya bint Hussein",
"Mohammed bin Rashid Al Maktoum"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which country Haya Bint Hussein's husband is from?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"the United Arab Emirates",
"🇦🇪",
"UAE",
"the UAE",
"United Arab Emirates",
"the Emirates",
"the U.A.E.",
"ae",
"U.A.E.",
"Emirates"
]
},
"style": "rule"
}
|
{
"index": 0,
"split": "test"
}
|
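For reference before the next rows, a small sketch of how the `metadata` block in a record such as dev_9260 above fits together: `context.title[i]` names the paragraph whose sentences are `context.content[i]`, and `supporting_facts` pairs a paragraph title with a sentence index into that paragraph. The 0-based indexing convention is inferred from the record above (sentence 4 of "Haya bint Hussein", sentence 0 of "Mohammed bin Rashid Al Maktoum"), not stated by the dataset.

```python
def supporting_sentences(metadata: dict) -> list[str]:
    """Resolve supporting_facts (title, sent_id) pairs to the cited sentences."""
    ctx = metadata["context"]
    by_title = dict(zip(ctx["title"], ctx["content"]))    # title -> list of sentences
    sf = metadata["supporting_facts"]
    return [by_title[title][sent_id]                       # 0-based index (inferred)
            for title, sent_id in zip(sf["title"], sf["sent_id"])]

# For dev_9260 this yields the sentence stating Haya married Sheikh Mohammed bin
# Rashid Al Maktoum and the sentence stating he is Vice President and Prime
# Minister of the United Arab Emirates -- the two hops behind the answer "UAE".
```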
dev_8565
|
Which film has the director born later, The Abyss or Going Hollywood?
|
[
"Abyss",
"The Abyss"
] |
{
"context": {
"content": [
[
"James Francis Cameron( born August 16, 1954) is a Canadian filmmaker, artist, and environmentalist, who is best known for making science fiction and epic films for the Hollywood mainstream.",
"Cameron first gained recognition for directing\" The Terminator\"( 1984).",
"He found further critical and commercial success with\" Aliens\"( 1986),\" The Abyss\"( 1989),( 1991) and\" True Lies\"( 1994).",
"His greatest big-budget productions have been\" Titanic\"( 1997) and\" Avatar\"( 2009), the former earning him Academy Awards in Best Picture, Best Director and Best Film Editing.",
"\" Avatar,\" filmed in 3 D technology, also garnered him nominations in the same categories.",
"He also co-founded Lightstorm Entertainment, Digital Domain and Earthship Productions.",
"In addition to his filmmaking, he is a National Geographic explorer of the sea and has produced a number of documentaries on the subject.",
"Cameron contributed to underwater filming and remote vehicle technologies and helped create the digital 3D Fusion Camera System.",
"In 2012, Cameron became the first person to perform a solo descent to the bottom of the Mariana Trench, the deepest part of the Earth's ocean, in the\" Deepsea Challenger\" submersible.",
"In total, Cameron's films have grossed approximately US$ 2 billion in North America and US$ 6 billion worldwide.",
"Cameron's\" Avatar\" and\" Titanic\" are the second and third highest- grossing films of all time, earning$ 2.78 billion and$ 2.19 billion, respectively.",
"Cameron holds the achievement of having directed the first two of the five films in history to gross over$ 2 billion worldwide.",
"In 2010,\" Time\" magazine named Cameron one of the 100 most influential people in the world."
],
[
"The Abyss is a 1989 American science fiction film written and directed by James Cameron and starring Ed Harris, Mary Elizabeth Mastrantonio, and Michael Biehn.",
"When an American submarine sinks in the Caribbean, the U.S. search and recovery team works with an oil platform crew, racing against Soviet vessels to recover the boat.",
"Deep in the ocean, they encounter something unexpected.",
"The film was released on August 9, 1989, receiving generally positive reviews and grossed$ 90 million.",
"It won the Academy Award for Best Visual Effects and was nominated for three more Academy awards."
],
[
"Going Hollywood is a 1933 American pre-Code musical film directed by Raoul Walsh and starring Marion Davies and Bing Crosby.",
"It was written by Donald Ogden Stewart and based on a story by Frances Marion.",
"\" Going Hollywood\" was released by Metro- Goldwyn- Mayer on December 22, 1933."
],
[
"Fred R. Krug is an American film and television producer- director born in Bern, Switzerland."
],
[
"Claude Weisz is a French film director born in Paris."
],
[
"Yolonda Ross is an American character actress, writer and director born (31 July 1974)."
],
[
"W. Augustus Barratt( 1873- 1947) was a Scottish- born, later American, songwriter and musician."
],
[
"Jacques Décombe is a French author, actor and director born in 1953."
],
[
"Raoul A. Walsh( March 11, 1887 December 31, 1980) was an American film director, actor, founding member of the Academy of Motion Picture Arts and Sciences( AMPAS) and the brother of the silent screen actor George Walsh.",
"He was known for portraying John Wilkes Booth in the silent classic\" The Birth of a Nation\"( 1915) and for directing such films as\" The Big Trail\"( 1930), starring John Wayne,\" High Sierra\"( 1941), starring Ida Lupino and Humphrey Bogart; and\" White Heat\"( 1949), starring James Cagney and Edmond O'Brien.",
"He directed his last film in 1964."
],
[
"Sepideh Farsi is an Iranian film director, born in Tehran in 1965."
]
],
"title": [
"James Cameron",
"The Abyss",
"Going Hollywood",
"Fred Roy Krug",
"Claude Weisz",
"Yolonda Ross",
"W. Augustus Barratt",
"Jacques Décombe",
"Raoul Walsh",
"Sepideh Farsi"
]
},
"supporting_facts": {
"sent_id": [
0,
0,
0,
0
],
"title": [
"The Abyss",
"Going Hollywood",
"James Cameron",
"Raoul Walsh"
]
},
"type": "bridge_comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film has the director born later, The Abyss or Going Hollywood?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Abyss",
"The Abyss"
]
},
"style": "rule"
}
|
{
"index": 1,
"split": "test"
}
|
dev_5183
|
Who is the paternal grandfather of Alexios Komnenos (Son Of Andronikos I)?
|
[
"Isaac Komnenos",
"Isaac Komnenos (son of Alexios I)"
] |
{
"context": {
"content": [
[
"Jiajak Jaqeli was the Empress consort of Alexios II of Trebizond."
],
[
"Rev. John Westley( 1636 – 78) was an English nonconformist minister.",
"He was the grandfather of John Wesley( founder of Methodism)."
],
[
"Abd al- Muttalib Shaybah ibn Hashim( c. 497 – 578) was the grandfather of Islamic prophet Muhammad."
],
[
"Nikephoros Euphorbenos Katakalon was a Byzantine aristocrat and son-in- law of Emperor Alexios I Komnenos.",
"He was the son of the distinguished general Constantine Euphorbenos Katakalon, one of Alexios' most trusted officials.",
"As a token of Alexios' appreciation for Constantine, Nikephoros was married to the\" porphyrogennete\" Maria, the emperor's second daughter, and raised to the rank of\" panhypersebastos\".",
"From the\" Alexiad\", he is known to have participated in combat against the Cumans in 1095, alongside his father, where he distinguished himself by his bravery.",
"It is unknown when he died, but he is recorded as being dead by 1130.",
"With Maria, he had numerous children, but only two sons, Alexios Komnenos and Andronikos, are known by name, having held senior positions later in the century.",
"Another son, John, is known only by his commemoration in the\" typikon\" of the Monastery of Christ Philanthropos.",
"The couple also had an unknown number of daughters( at least three according to Varzos), as their existence is mentioned in the\" typikon\" of the Kecharitomene Monastery."
],
[
"Kaya Alp was, according to Ottoman tradition, the son of Kızıl Buğa and the father of Suleyman Shah, who was, in turn, the grandfather of Ertuğrul, and the great grandfather of the Ottoman Empire founder, Osman I."
],
[
"George (Georgios) Palaiologos or Palaeologus was a Byzantine general, one of the most prominent military commanders and supporters of Emperor Alexios I Komnenos (r. 1081–1118).",
"He was the son of the first known member of the Palaiologos family, the \"strategos\" of Mesopotamia Nikephoros Palaiologos.",
"His wife Anna Doukaina was the sister of Irene Doukaina, the wife of Alexios I Komnenos, making him the emperor's brother-in-law.",
"As Alexios Komnenos’ leading general and close friend, he played an important role in his campaigns, especially the Battle of Dyrrhachium against the Normans or the Battle of Levounion against the Pechenegs.",
"He was the chief source used by Anna Komnene in her \"Alexiad\" on her father’s battles, and is very favourably portrayed by her in her book as both capable and loyal."
],
[
"Lyon Cohen( 1868–1937) was a Polish- born Canadian businessman and a philanthropist.",
"He was the grandfather of singer/ poet Leonard Cohen."
],
[
"Nicholas Kallikles was a prominent physician and a leading court poet active in the Byzantine court in Constantinople during the reigns of Alexios I Komnenos( r. 1081– 1118) and John II Komnenos( r. 1118–1143)."
],
[
"Alexios Komnenos (1170 – 1199) was a natural son of Andronikos I Komnenos, the Byzantine Emperor",
"(r. 1183 – 1185) by his relative and mistress Theodora Komnene, Queen Dowager of Jerusalem.",
"During the reign of Emperor Manuel I Komnenos (r. 1143–1180), Alexios accompanied his father Andronikos in exile, visiting, inter alia, the Kingdom of Georgia.",
"The Georgian king George III, their relative, granted to Andronikos several castles in Kakhetia in the east of Georgia.",
"Andronikos returned to Constantinople and attained to the Byzantine crown in 1183, only to be overthrown and killed in 1185.",
"Alexios then fled to Georgia, where he was restored to his father's Georgian estates.",
"At one point, he was even considered by some Georgian nobles as a candidate to become a consort of the queen regnant Tamar of Georgia.",
"According to the Georgian historical tradition, Alexios's progeny flourished in Georgia, producing the noble family of Andronikashvili, i.e., \"scions of Andronikos\", so named after Alexios's purported son.",
"In spite of the extremely fragmentary nature of the early Andronikashvili pedigree, Professor Cyril Toumanoff (1976) accepted the Komnenian origin as plausible, but the evidence marshaled by Kuršankis (1977) suggests that this may have been only a legend.",
"Toumanoff also assumed that the line of the \"provincial kings\" of Alastani (c. 1230–1348), known from the medieval Georgian sources and including one named Andronikos, might have belonged to the Georgian Komnenoi/Andronikashvili."
],
[
"Andronikos I Komnenos (– 12 September 1185), usually Latinized as Andronicus I Comnenus, was Byzantine Emperor from 1183 to 1185.",
"He was the son of Isaac Komnenos and the grandson of the emperor",
"Alexios I."
]
],
"title": [
"Jiajak Jaqeli",
"John Westley",
"Abd al-Muttalib",
"Nikephoros Katakalon",
"Kaya Alp",
"George Palaiologos",
"Lyon Cohen",
"Nicholas Kallikles",
"Alexios Komnenos (son of Andronikos I)",
"Andronikos I Komnenos"
]
},
"supporting_facts": {
"sent_id": [
0,
1
],
"title": [
"Alexios Komnenos (son of Andronikos I)",
"Andronikos I Komnenos"
]
},
"type": "inference"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Who is the paternal grandfather of Alexios Komnenos (Son Of Andronikos I)?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Isaac Komnenos",
"Isaac Komnenos (son of Alexios I)"
]
},
"style": "rule"
}
|
{
"index": 2,
"split": "test"
}
|
dev_4272
|
Which film has the director who was born later, Alice Of Wonderland In Paris or The Roundabouts Of Handsome Karl?
|
[
"Alice Of Wonderland In Paris",
"Alice of Wonderland in Paris"
] |
{
"context": {
"content": [
[
"Eugene Merril Deitch( born August 8, 1924) is an American illustrator, animator and film director.",
"Based in Prague since 1959, Deitch is known for creating animated cartoons such as\" MunroTom Terrific\", and\" Nudnik\", as well as his work on the\" Popeye\" and\" Tom and Jerry\" series."
],
[
"Alice Maud Fanner, later Alice Maud Taite,( 1865 – 1930) was a British landscape painter."
],
[
"Alice A. Andrews( 1866 – 1946), later Alice A. Parker, was an American pianist and composer, and musical director of the Andrews Family Opera.",
"Later in life, she was a vocal coach and instructor in New York City."
],
[
"Carl August Hugo Froelich",
"( 5 September 1875 – 12 February 1953) was a German film pioneer and film director.",
"He was born and died in Berlin."
],
[
"Hassan Zee is a Pakistani- American film director who was born in Chakwal, Pakistan."
],
[
"W. Augustus Barratt( 1873- 1947) was a Scottish- born, later American, songwriter and musician."
],
[
"Rumbi Katedza is a Zimbabwean Film Producer and Director who was born on 17 January 1974."
],
[
"Alice of Norfolk or Alice of Brotherton( c. 1324 – c. 30 January 1352) was an English noblewoman.",
"She was the daughter of Thomas of Brotherton, and a granddaughter of King Edward I of England.",
"She married Edward Montagu, 1st Baron Montagu."
],
[
"The Roundabouts of Handsome Karl( German: Die Umwege des schönen Karl) is a 1938 German comedy film directed by Carl Froelich and starring Heinz Rühmann, Karin Hardt and Sybille Schmitz.",
"It portrays the experiences of a young waiter during the Great Depression."
],
[
"Alice of Wonderland in Paris or Alice in Paris is a 1966 Czech- American animated film directed by Gene Deitch and produced by William L. Snyder in extreme limited animation."
]
],
"title": [
"Gene Deitch",
"Alice Maud Fanner",
"Alice A. Andrews",
"Carl Froelich",
"Hassan Zee",
"W. Augustus Barratt",
"Rumbi Katedza",
"Alice of Norfolk",
"The Roundabouts of Handsome Karl",
"Alice of Wonderland in Paris"
]
},
"supporting_facts": {
"sent_id": [
0,
0,
0,
1
],
"title": [
"Alice of Wonderland in Paris",
"The Roundabouts of Handsome Karl",
"Gene Deitch",
"Carl Froelich"
]
},
"type": "bridge_comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film has the director who was born later, Alice Of Wonderland In Paris or The Roundabouts Of Handsome Karl?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Alice Of Wonderland In Paris",
"Alice of Wonderland in Paris"
]
},
"style": "rule"
}
|
{
"index": 3,
"split": "test"
}
|
dev_3346
|
Who is Eudocia (Daughter Of Valentinian Iii)'s sibling-in-law?
|
[
"Anicius Olybrius",
"Olybrius",
"Flavius Anicius Olybrius"
] |
{
"context": {
"content": [
[
"Nick Sidi( born 22 February 1966 in Manchester, England) is an English actor.",
"He is the son -in- law of actress Rosalind Knight and brother- in- law of actress Susannah Elliott- Knight.",
"He often plays drama or comedy roles on television and frequently appears in BBC dramas."
],
[
"Eudocia or Eudoxia (439 – 466/474?) was the eldest daughter of Roman emperor Valentinian III and his wife, Licinia Eudoxia.",
"She was thus the granddaughter on her mother's side of Eastern emperor Theodosius II and his wife, the poet Aelia Eudocia; and on her father's side of Western emperor Constantius III and his wife Galla Placida.",
"In the mid-440s, at age 5, Eudocia was betrothed to Huneric, son of the Vandal king Gaiseric (and then a hostage in Italy).",
"This engagement served to strengthen the alliance between the Western court and the Vandal kingdom in Africa.",
"Their marriage did not take place at this time, however, because Eudocia was not yet of age.",
"Eudocia's father was assassinated in 455, and his successor, Petronius Maximus, compelled Eudocia's mother to marry him and Eudocia herself to marry his son, Palladius.",
"In response, the Vandals (reportedly at the request of Eudocia's mother) invaded Italy and captured Eudocia, her mother, and her younger sister, Placidia.",
"After 7 years Eudocia's mother and sister were sent to Constantinople, while Eudocia remained in Africa and married Huneric c. 460.",
"They had a son, Hilderic, who reigned as king of the Vandals, from 523-530.",
"At some time following the birth of Hilderic, Eudocia withdrew to Jerusalem due to religious differences with her Arian husband.",
"She died there and was buried in the sepulcher of her grandmother, Aelia Eudocia."
],
[
"Robert Woodrow Levering( October 3, 1914 – August 11, 1989) was a U.S. Representative from Ohio, son-in- law of Usher L. Burdick and brother- in- law of Quentin N. Burdick."
],
[
"Antony Charles Robert Armstrong- Jones, 1st Earl of Snowdon( 7 March 193013 January 2017), commonly known as Lord Snowdon, was a British photographer and filmmaker.",
"He was the husband of Princess Margaret and brother- in- law of Queen Elizabeth II."
],
[
"Charles Francis Norton( 1807 – 1835) was a 19th- century Member of Parliament for Guildford.",
"Norton was also a Captain of 52nd Light Infantry; son-in- law of Sir Colin Campbell; brother- in- law of writer Caroline Norton"
],
[
"Gertrude Elizabeth, Lady Colin Campbell (\"née\" Blood; 3 May 1857 – 1 November 1911) was an Irish-born journalist, author, playwright, and editor.",
"She was married to Lord Colin Campbell, a brother-in-law of Princess Louise, Queen Victoria's fourth daughter."
],
[
"Wyndham Goold( 1812-1854) was a Member of Parliament( MP) for County Limerick from 1850 to 1854.",
"Goold was educated at Westminster School and Trinity College, Dublin.",
"He was the son in law of Frederick Falkiner Goold, Archdeacon of Raphoe and brother in law of Bishop Hamilton Verschoyle"
],
[
"Placidia was the wife of Olybrius, unrecognized Western Roman Emperor.",
"Her full name is uncertain.",
"The \"Chronicle of the Roman Emperors:",
"The reign by reign record of the rulers of Imperial Rome\" (1995) by Chris Scarre gives her name as Galla Placidia Valentiniana or Galla Placidia the Younger, based on naming conventions for women in ancient Rome."
],
[
"Robert Vadra is an Indian businessman, entrepreneur and the husband of Priyanka Gandhi.",
"He is the son -in- law of Sonia Gandhi and Rajiv Gandhi, and the brother- in- law of Rahul Gandhi."
],
[
"Gregory B. Penner( born December 18, 1969) is an American businessman and venture capitalist.",
"Penner was named the chairman of Walmart on June 5, 2015.",
"He is the son-in- law of S. Robson Walton and the grandson- in- law of Sam Walton, the founder of Walmart."
]
],
"title": [
"Nick Sidi",
"Eudocia (daughter of Valentinian III)",
"Robert W. Levering",
"Antony Armstrong-Jones, 1st Earl of Snowdon",
"Charles Francis Norton",
"Gertrude Elizabeth Blood",
"Wyndham Goold",
"Placidia",
"Robert Vadra",
"Greg Penner"
]
},
"supporting_facts": {
"sent_id": [
6,
0
],
"title": [
"Eudocia (daughter of Valentinian III)",
"Placidia"
]
},
"type": "inference"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Who is Eudocia (Daughter Of Valentinian Iii)'s sibling-in-law?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Anicius Olybrius",
"Olybrius",
"Flavius Anicius Olybrius"
]
},
"style": "rule"
}
|
{
"index": 4,
"split": "test"
}
|
dev_10972
|
Who was born first out of Felix Cicaterri and Rajiv Gandhi?
|
[
"Felix Cicaterri"
] |
{
"context": {
"content": [
[
"John William Joseph McMahon( 28 December 1917 – 8 May 2001) was an Australian- born first- class cricketer who played for Surrey and Somerset in England from 1947 to 1957."
],
[
"Henry Walter Moore( 1849 – 20 August 1916) was an English- born first- class cricketer who spent most of his life in New Zealand."
],
[
"Punjala Shiv Shankar( 10 August 1929 – 27 February 2017) was an Indian politician.",
"He served as the Minister of External Affairs, Law, and Petroleum.",
"He was a very influential minister in Indira Gandhi's and Rajiv Gandhi's cabinets and was one of the most senior politicians in India.",
"He also served as Governor of Sikkim from 1994 to 1995 and Governor of Kerala from 1995 to 1996."
],
[
"Pendekanti Venkatasubbaiah was an Indian politician.",
"He was the Governor of Bihar from 15 March 1985 to 25 February 1988 and the Governor of Karnataka, India from 26 February 1988 to 5 February 1990.",
"He was elected to the Lok Sabha and was the Union Minister of State for Home and Parliamentary Affairs in both Indira Gandhi and Rajiv Gandhi cabinet."
],
[
"Hartley W Lobban (9 May 1926 – 15 October 2004) was a Jamaican-born first-class cricketer who played 17 matches for Worcestershire in the early 1950s."
],
[
"Felix Cicaterri,( S.J.)( July 3, 1804 – July 15, 1873) was the third president of Santa Clara University, California, United States.",
"He was also a one- time rector of Verona's Jesuit College.",
"In 1848 he was made president of the Jesuit College at Vienna now the University of Vienna.",
"After immigrating to the United States, he was a professor of divinity at St. John's College in Fordham, New York.",
"Later he taught philosophy at Georgetown College in the United States.",
"On March 11, 1857, Cicaterri was appointed Santa Clara University's third president and inherited the financial woes created by construction projects.",
"Cicaterri was removed from the University's presidency in 1861 due to Rome's displeasure.",
"In January 1873, he was sent to the College of the Sacred Heart in Woodstock, Maryland, where he died July 15, 1873."
],
[
"Robert Vadra is an Indian businessman, entrepreneur and the husband of Priyanka Gandhi.",
"He is the son -in- law of Sonia Gandhi and Rajiv Gandhi, and the brother- in- law of Rahul Gandhi."
],
[
"Thom Adcox- Hernandez is an American actor and voice actor known for his roles as Brian in the prime- time soap opera,\" Falcon Crest\" and the voices of Felix the Cat in\" The Twisted Tales of Felix the Cat\" and Lexington on\" Gargoyles\"."
],
[
"Wale Adebanwi( born 1969) is a Nigerian- born first Black Rhodes Professor at St Antony's College, Oxford."
],
[
"Rajiv Gandhi( 20 August 1944 – 21 May 1991) was an Indian politician who served as the 6th Prime Minister of India from 1984 to 1989.",
"He took office after the 1984 assassination of his mother, Prime Minister Indira Gandhi, to become the youngest Indian Prime Minister at the age of 40.",
"Gandhi was a scion of the politically powerful Nehru – Gandhi family, which had been associated with the Indian National Congress party.",
"For much of his childhood, his maternal grandfather Jawaharlal Nehru was Prime Minister.",
"Gandhi attended college in the United Kingdom.",
"He returned to India in 1966 and became a professional pilot for the state- owned Indian Airlines.",
"In 1968, he married Sonia Gandhi; the couple settled in Delhi to a domestic life with their children Rahul Gandhi and Priyanka Gandhi Vadra.",
"For much of the 1970s, his mother Indira Gandhi was prime minister and his brother Sanjay Gandhi an MP; despite this, Rajiv Gandhi remained apolitical.",
"After Sanjay's death in a fighter plane crash in 1980, Gandhi reluctantly entered politics at the behest of Indira.",
"The following year he won his brother's Parliamentary seat of Amethi and became a member of the Lok Sabha—the lower house of India's Parliament.",
"As part of his political grooming, Rajiv was made general secretary of the Congress party and given significant responsibility in organising the 1982 Asian Games.",
"On the morning of 31 October 1984, his mother was assassinated by one of her bodyguards; later that day, Gandhi was appointed Prime Minister.",
"His leadership was tested over the next few days as organised mobs rioted against the Sikh community, resulting in anti-Sikh riots in Delhi.",
"That December, Congress party won the largest Lok Sabha majority to date, 411 seats out of 542.",
"Rajiv Gandhi's period in office was mired in controversies; perhaps the greatest crises were the Bhopal disaster, Bofors scandal and the Shah Bano case.",
"In 1988, he reversed the coup in Maldives, antagonising militant Tamil groups such as PLOTE, intervening and then sending peacekeeping troops to Sri Lanka in 1987, leading to open conflict with the Liberation Tigers of Tamil Eelam( LTTE).",
"In mid-1987, the Bofors scandal damaged his corruption- free image and resulted in a major defeat for his party in the 1989 election.",
"Gandhi remained Congress President until the elections in 1991.",
"While campaigning for the elections, he was assassinated by a suicide bomber from the LTTE.",
"His widow Sonia became the president of the Congress party in 1998 and led the party to victory in the 2004 and 2009 parliamentary elections.",
"His son Rahul Gandhi is a Member of Parliament and the current President of Indian National Congress.",
"In 1991, Indian government posthumously awarded Gandhi the Bharat Ratna, the country's highest civilian award.",
"At the India Leadership Conclave in 2009, the\" Revolutionary Leader of Modern India\" award was conferred posthumously on Gandhi."
]
],
"title": [
"John McMahon (Surrey and Somerset cricketer)",
"Henry Moore (cricketer)",
"P. Shiv Shankar",
"Pendekanti Venkatasubbaiah",
"Hartley Lobban",
"Felix Cicaterri",
"Robert Vadra",
"Thom Adcox-Hernandez",
"Wale Adebanwi",
"Rajiv Gandhi"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"Felix Cicaterri",
"Rajiv Gandhi"
]
},
"type": "comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Who was born first out of Felix Cicaterri and Rajiv Gandhi?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Felix Cicaterri"
]
},
"style": "rule"
}
|
{
"index": 5,
"split": "test"
}
|
dev_11734
|
Did Larry Godfrey and Barbara Strang share the same nationality?
|
[
"yes"
] |
{
"context": {
"content": [
[
"Several people share the surname Dugès:"
],
[
"Gregory La Cava( March 10, 1892 – March 1, 1952) was an American film director of Italian descent best known for his films of the 1930s, including\" My Man Godfrey\" and\" Stage Door\", which earned him nominations for Academy Award for Best Director."
],
[
"Godfrey- Proctor,( Godfrey and Proctor), was a British car manufacturer( circa late 1920s- 1928) founded by Henry Ronald Godfrey( H.R. Godfrey) and Stuart Proctor at Richmond, Surrey in England.",
"Godfrey left G.N.( Godfrey- Nash) to form H.R. Godfrey Motors, providing G.N. spares and rebuilds, which evolved into Godfrey and Proctor who both serviced GN's and produced their own car based on the Austin 7.",
"About 10 cars had been produced when the company ceased trading in 1928.",
"Godfrey went on to found the HRG Engineering Company in 1935 and S.R. Proctor joined in 1950 as technical director, replacing original partner Guy Robbins."
],
[
"Cry Wolf is a 1947 American mystery film directed by Peter Godfrey and featuring Errol Flynn and Barbara Stanwyck, based on the novel of the same name by Marjorie Carleton."
],
[
"Unexpected Uncle is a 1941 American comedy- drama film directed by Peter Godfrey and starring Charles Coburn, Anne Shirley, and James Craig."
],
[
"Alf, Bill and Fred is a 1964 short animated film directed by Bob Godfrey and written by Stan Hayward."
],
[
"Several people share the surname Lacordaire:"
],
[
"Malcolm Pointon( died February 2007) was a pianist and lecturer from Thriplow, England, and the subject of the film\" Malcolm and Barbara- A Love Story\" shown in 1999, and, more recently, of an Independent Television program entitled\" Malcolm and Barbara: Love ’s Farewell\", broadcast on Wednesday, 8 August 2007."
],
[
"Barbara Strang née",
"Barbara Mary Hope Carr later Lady Strang (20 April 1925 – 11 April 1982) was a British English language scholar."
],
[
"Laurence Paul Godfrey( born 9 June 1976, in Bristol) is a British archer.",
"He studied for an HND in Mechanical Manufacturing at the University of the West of England from 1997 to 2000."
]
],
"title": [
"Dugès",
"Gregory La Cava",
"Godfrey-Proctor",
"Cry Wolf (1947 film)",
"Unexpected Uncle",
"Alf, Bill and Fred",
"Lacordaire",
"Malcolm Pointon",
"Barbara Strang",
"Larry Godfrey"
]
},
"supporting_facts": {
"sent_id": [
0,
1
],
"title": [
"Larry Godfrey",
"Barbara Strang"
]
},
"type": "comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Did Larry Godfrey and Barbara Strang share the same nationality?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"yes"
]
},
"style": "rule"
}
|
{
"index": 6,
"split": "test"
}
|
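Every prompt in these records describes the same tool-call convention: a query is emitted as `<search> LLM-Name:Your-Query </search>`, where the name must be one of the six listed models. The records include no parser for that convention; the snippet below is only an illustrative sketch of how a harness might validate and split such a call. The allowed model names are copied from the prompt text, while the regex and function are hypothetical.

```python
import re

# Model names copied verbatim from the prompt's allowed list.
ALLOWED_MODELS = {
    "Qwen2.5-7B-Instruct",
    "LLaMA-3.1-8B-Instruct",
    "LLaMA-3.1-70B-Instruct",
    "Mistral-7B-Instruct",
    "Mixtral-8x22B-Instruct",
    "Gemma-2-27B-Instruct",
}

SEARCH_RE = re.compile(
    r"<search>\s*(?P<model>[^:<]+?)\s*:\s*(?P<query>.+?)\s*</search>", re.DOTALL
)


def parse_search_call(text: str) -> tuple[str, str] | None:
    """Return (model, query) for the first well-formed <search> tag, else None."""
    m = SEARCH_RE.search(text)
    if not m:
        return None
    model = m.group("model").strip()
    if model not in ALLOWED_MODELS:
        return None  # placeholder text or unknown model name: reject the call
    return model, m.group("query").strip()


print(parse_search_call(
    "<search> Gemma-2-27B-Instruct: What nationality was Barbara Strang? </search>"
))
```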
dev_5147
|
Where was the director of film A Winter Of Cyclists born?
|
[
"Chill Factor",
"Southern Ireland",
"ie",
"Éire",
"ireland",
"Eire",
"Ireland, Republic of",
"Hibernia",
"Irish",
"IRL",
"Ireland",
"IE",
"Republic of Ireland",
"🇮🇪"
] |
{
"context": {
"content": [
[
"Olav Aaraas( born 10 July 1950) is a Norwegian historian and museum director.",
"He was born in Fredrikstad.",
"From 1982 to 1993 he was the director of Sogn Folk Museum, from 1993 to 2010 he was the director of Maihaugen and from 2001 he has been the director of the Norwegian Museum of Cultural History.",
"In 2010 he was decorated with the Royal Norwegian Order of St. Olav."
],
[
"Ian Barry is an Australian director of film and TV."
],
[
"Jesse Edward Hobson( May 2, 1911 – November 5, 1970) was the director of SRI International from 1947 to 1955.",
"Prior to SRI, he was the director of the Armour Research Foundation."
],
[
"Peter Levin is an American director of film, television and theatre."
],
[
"Mike Prendergast (born 21 May 1977) is an Irish rugby union coach who started his professional career with Munster.",
"He also played a season with both Bourgoin in France and Gloucester in England before returning for one final season with Munster.",
"He played as a scrum-half and represented Young Munster in the All-Ireland League.",
"Having previously been director of rugby for amateur Irish club Young Munster, Prendergast joined French club Grenoble as a skills coach in 2013, where former Leinster and Ireland hooker Bernard Jackman was then defence coach.",
"Ahead of the 2017–18 Top 14 season, Prendergast joined Oyonnax as their attack coach, before joining Stade Français, where former Munster teammate",
"Paul O'Connell was also a coach, as their backs and attack coach for the 2018–19 season, after Oyonnax were relegated from the Top 14.",
"After one season with Stade Français, Prendergast departed the club to join their Parisian rivals, Racing 92, where former Munster teammate Ronan O'Gara once coached and also where former Munster players Donnacha Ryan and Simon Zebo are currently playing, as their backs coach on a long-term contract ahead of the 2019–20 season."
],
[
"S.N. Mathur was the Director of the Indian Intelligence Bureau between September 1975 and February 1980.",
"He was also the Director General of Police in Punjab."
],
[
"A Winter of Cyclists is a 2013 documentary film by Mike Prendergast.",
"The film chronicles a group of Colorado participants as they attempt to complete a 52-day winter bicycle commuting challenge created by Scot Stucky.",
"The challenge, known as “The Icy Bike Winter Commuting Challenge”, was created to encourage people to cycle to work from October to March, during the darker, colder and the snowier months of the year.",
"The film captures the wide diversity in which riders adapt to new commuting logistics, adverse weather conditions and riding in the dark.",
"And when the Colorado riders encounter an unexpected solidarity across the American continent and into Europe via Facebook, the film provides glimpses into a broader global winter cycling community that is just becoming aware of each other.",
"Facebook Group members encourage each other, share weather reports, gear tips, and inspirational and touching personal stories.",
"The film concludes with the results of the challenge and the riders reflection on their winter of cycling."
],
[
"Alix Madigan is an American film producer known for her work on the 2010 film\" Winter's Bone\", for which she was nominated for an Academy Award."
],
[
"Brian Patrick Kennedy( born 5 November 1961) is an Irish- born art museum director who has worked in Ireland and Australia, and now lives and works in the United States.",
"He is currently the director of the Peabody Essex Museum.",
"He was the director of the Toledo Museum of Art in Ohio from 2010 to 2019.",
"He was the director of the Hood Museum of Art from 2005 to 2010, and the National Gallery of Australia( Canberra) from 1997- 2004."
],
[
"Winter in the Woods is a 1956 German drama film directed by Wolfgang Liebeneiner and starring Claus Holm.",
"It is a remake of the 1936 film\" Winter in the Woods\"."
]
],
"title": [
"Olav Aaraas",
"Ian Barry (director)",
"Jesse E. Hobson",
"Peter Levin",
"Mike Prendergast",
"S. N. Mathur",
"A Winter of Cyclists",
"Alix Madigan",
"Brian Kennedy (gallery director)",
"Winter in the Woods"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"A Winter of Cyclists",
"Mike Prendergast"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Where was the director of film A Winter Of Cyclists born?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Chill Factor",
"Southern Ireland",
"ie",
"Éire",
"ireland",
"Eire",
"Ireland, Republic of",
"Hibernia",
"Irish",
"IRL",
"Ireland",
"IE",
"Republic of Ireland",
"🇮🇪"
]
},
"style": "rule"
}
|
{
"index": 7,
"split": "test"
}
|
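Each record's metadata stores evidence in two parallel structures: `context` (lists of sentences grouped under `title`) and `supporting_facts` (paired `title` / `sent_id` lists that index into those sentence lists). The short Python sketch below shows how those indices resolve to the actual supporting sentences under that assumed shape; the record literal is abbreviated from the `dev_5147` entry above and the helper is illustrative, not part of the dataset.

```python
def supporting_sentences(metadata: dict) -> list[str]:
    """Resolve each (title, sent_id) pair in supporting_facts to its sentence."""
    ctx = metadata["context"]
    by_title = dict(zip(ctx["title"], ctx["content"]))  # title -> list of sentences
    sf = metadata["supporting_facts"]
    return [by_title[title][sid] for title, sid in zip(sf["title"], sf["sent_id"])]


# Abbreviated from the dev_5147 record above.
meta = {
    "context": {
        "title": ["A Winter of Cyclists", "Mike Prendergast"],
        "content": [
            ["A Winter of Cyclists is a 2013 documentary film by Mike Prendergast."],
            ["Mike Prendergast (born 21 May 1977) is an Irish rugby union coach who "
             "started his professional career with Munster."],
        ],
    },
    "supporting_facts": {
        "title": ["A Winter of Cyclists", "Mike Prendergast"],
        "sent_id": [0, 0],
    },
}

for sentence in supporting_sentences(meta):
    print(sentence)
```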
dev_3910
|
Which film has the director who is older than the other, Melody Of Murder or City Of Blood?
|
[
"Melody Of Murder"
] |
{
"context": {
"content": [
[
"Dan Milne is a British actor/ director who is possibly best known for his role in\" EastEnders\"."
],
[
"Melody of Death is a 1922 British silent crime film directed by Floyd Martin Thornton and starring Philip Anthony, Enid R. Reed and Dick Sutherd.",
"It is an adaptation of the 1915 novel\" The Melody of Death\" by Edgar Wallace."
],
[
"Elliot Silverstein( born August 3, 1927, in Boston, Massachusetts) is an American director, who is best known for being the director of the movie\" Cat Ballou\"( 1965)."
],
[
"Ben Palmer is a British film and television director who is known for being the director of\" Bo' Selecta\" and\" The Inbetweeners\"."
],
[
"Scott Fox is a pornographic film director who is a member of the AVN Hall of Fame."
],
[
"The Melody of Murder,( Original Title: Mordets Melodi), is a 1944 Danish film noir directed by Bodil Ipsen and starring Gull- Maj Norin and Poul Reichhardt.",
"The dark tale revolves around a sexually ambiguous serial killer whose crimes are committed to a French cabaret song, and the police suspect a chanteuse who sings the same tune during her performances.",
"Produced during the German occupation of Denmark in World War II, the film was praised by critics for its\" stylish cinematography, thematic intensity, and dark vision of Copenhagen;\" and consider it one of the most influential Danish films of that period."
],
[
"Bodil Ipsen( 30 August 1889 – 26 November 1964) was a Danish actress and film director, and is considered one of the great stars of Danish cinematic history.",
"Her acting career, which began in theater and silent films, was marked by leading roles in large folk comedies and melodramas.",
"However, it was as a director that she was most influential: directing the first Danish film noir and making several dark psychological thrillers during the 1940s and 1950s.",
"Ipsen's name along with that of Bodil Kjer is given to Denmark's most celebrated film prize, the Bodil Award."
],
[
"Darrell James Roodt( born in Johannesburg, April 28, 1962) is a South African film director, screenwriter and producer.",
"He is probably most well known for his 1992 film\" Sarafina!\" which starred actress Whoopi Goldberg.",
"Also regarded as South Africa's most prolific film director, Roodt has worked with the late Patrick Swayze in\" Father Hood\", James Earl Jones in\" Cry, the Beloved Country\" and Ice Cube in\" Dangerous Ground\"."
],
[
"City of Blood is a 1983 South African thriller film directed by Darrell Roodt, and starring Joe Stewardson, Ian Yule, Ken Gampu."
],
[
"White: Melody of Death( lit.",
"White: The Melody of the Curse) is a 2011 South Korean horror film by Kim Gok and Kim Sun."
]
],
"title": [
"Dan Milne",
"Melody of Death",
"Elliot Silverstein",
"Ben Palmer",
"Scotty Fox",
"Melody of Murder",
"Bodil Ipsen",
"Darrell Roodt",
"City of Blood",
"White: Melody of Death"
]
},
"supporting_facts": {
"sent_id": [
0,
0,
0,
0
],
"title": [
"Melody of Murder",
"City of Blood",
"Bodil Ipsen",
"Darrell Roodt"
]
},
"type": "bridge_comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film has the director who is older than the other, Melody Of Murder or City Of Blood?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Melody Of Murder"
]
},
"style": "rule"
}
|
{
"index": 8,
"split": "test"
}
|
dev_4351
|
Where was the director of film In The Soup (1936 Film) born?
|
[
"Weston-super-Mare, Somerset",
"Weston-super-Mare",
"Weston super Mare"
] |
{
"context": {
"content": [
[
"Henry Edwards (18 September 1882 – 2 November 1952) was an English actor and film director.",
"He appeared in 81 films between 1915 and 1952.",
"He also directed 67 films between 1915 and 1937.",
"Edwards was married to actress Chrissie White, who co-starred in a number of his films.",
"He was born in Weston-super-Mare, Somerset and died in Chobham, Surrey."
],
[
"Dana Blankstein- Cohen( born March 3, 1981) is the director of the Israeli Academy of Film and Television.",
"She is a film director, and an Israeli culture entrepreneur."
],
[
"Peter Levin is an American director of film, television and theatre."
],
[
"Brian Patrick Kennedy( born 5 November 1961) is an Irish- born art museum director who has worked in Ireland and Australia, and now lives and works in the United States.",
"He is currently the director of the Peabody Essex Museum.",
"He was the director of the Toledo Museum of Art in Ohio from 2010 to 2019.",
"He was the director of the Hood Museum of Art from 2005 to 2010, and the National Gallery of Australia( Canberra) from 1997- 2004."
],
[
"Olav Aaraas( born 10 July 1950) is a Norwegian historian and museum director.",
"He was born in Fredrikstad.",
"From 1982 to 1993 he was the director of Sogn Folk Museum, from 1993 to 2010 he was the director of Maihaugen and from 2001 he has been the director of the Norwegian Museum of Cultural History.",
"In 2010 he was decorated with the Royal Norwegian Order of St. Olav."
],
[
"Jason Moore( born October 22, 1970) is an American director of film, theatre and television."
],
[
"Ian Barry is an Australian director of film and TV."
],
[
"S.N. Mathur was the Director of the Indian Intelligence Bureau between September 1975 and February 1980.",
"He was also the Director General of Police in Punjab."
],
[
"In the Soup is a 1936 British comedy film directed by Henry Edwards and starring Ralph Lynn, Judy Gunn, Morton Selten and Nelson Keys."
],
[
"Jesse Edward Hobson( May 2, 1911 – November 5, 1970) was the director of SRI International from 1947 to 1955.",
"Prior to SRI, he was the director of the Armour Research Foundation."
]
],
"title": [
"Henry Edwards (actor)",
"Dana Blankstein",
"Peter Levin",
"Brian Kennedy (gallery director)",
"Olav Aaraas",
"Jason Moore (director)",
"Ian Barry (director)",
"S. N. Mathur",
"In the Soup (1936 film)",
"Jesse E. Hobson"
]
},
"supporting_facts": {
"sent_id": [
0,
4
],
"title": [
"In the Soup (1936 film)",
"Henry Edwards (actor)"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Where was the director of film In The Soup (1936 Film) born?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Weston-super-Mare, Somerset",
"Weston-super-Mare",
"Weston super Mare"
]
},
"style": "rule"
}
|
{
"index": 9,
"split": "test"
}
|
dev_6484
|
Where does Bernard Tschumi's father work at?
|
[
"École Polytechnique Fédérale de Lausanne",
"ETHL",
"Ecole polytechnique federale de Lausanne",
"EPF Lausanne",
"Swiss Federal Institute of Technology in Lausanne",
"EPFL",
"École polytechnique fédérale de Lausanne"
] |
{
"context": {
"content": [
[
"Fiona McIntosh( born 1960) is an author of adult and children's books who lives in Australia.",
"She was born in Brighton, England and between the ages of three and eight, travelled a lot to Africa due to her father's work.",
"At the age of nineteen, she travelled first to Paris and later to Australia, where she has lived ever since.",
"In 2007, she released a crime novel,\" Bye Bye Baby\", under the pseudonym of Lauren Crow, however the pen name has since been dropped for the republished edition of\" Bye Bye Baby\" and for the sequel,\" Beautiful Death\"."
],
[
"Bernard Tschumi (born 25 January 1944 in Lausanne, Switzerland) is an architect, writer, and educator, commonly associated with deconstructivism.",
"Son of the well-known Swiss architect Jean Tschumi and a French mother, Tschumi is a dual French-Swiss national who works and lives in New York City and Paris.",
"He studied in Paris and at ETH in Zurich, where he received his degree in architecture in 1969."
],
[
"Christopher Shinn is an American playwright.",
"His play\" Dying City\"( 2006) was a finalist for the 2008 Pulitzer Prize for Drama, and\" Where Do We Live\"( 2004) won the 2005 Obie Award, Playwriting."
],
[
"Jean André Tschumi (born February 14, 1904",
"Plainpalais, now a neighborhood of Geneva, Switzerland, died January 25, 1962) was a Swiss architect and professor at the École Polytechnique Fédérale de Lausanne.",
"A member of the Modern Movement, Jean Tschumi is known for his buildings for Sandoz (laboratory, factory), Nestlé (pavilions at various international fairs, headquarters), la Mutuelle Vaudoise (headquarters), as well as for some of his projects (underground Paris souterrain, observation tower, etc.)."
],
[
"Emanuel Glicen Romano or Emanuel Glicenstein( 1897 – 1984) was a painter born in Italy.",
"He emigrated to America and spent some time in Safed in Israel.",
"where he organised a museum for his father's work."
],
[
"Dave Grossman is an American game programmer and game designer, most known for his work at Telltale Games and early work at LucasArts.",
"He has also written several children's books, and a book of\" guy poetry\" called\" Ode to the Stuff in the Sink\"."
],
[
"Andrey Ippolitovich Vilkitsky was a Russian hydrographer and surveyor.",
"He was born in the Minsk Governorate.",
"His son, Boris Vilkitsky followed up his father's work; the Vilkitsky Islands are named after him."
],
[
"David Shute is a British journalist, best known for his work at the BBC."
],
[
"Hernando de Cabezón, (baptized 7 September 1541 – 1 October 1602) was a Spanish composer and organist, son of Antonio de Cabezón.",
"Only a few of his works are extant today, and he is chiefly remembered for publishing the bulk of his father's work."
],
[
"Frank Wildhorn( born November 29, 1958) is an American composer known for both his musicals and popular songs.",
"He is most known for his musical\" Jekyll& Hyde\", which ran four years on Broadway, and for writing the# 1 International hit song\" Where Do Broken Hearts Go\" for Whitney Houston."
]
],
"title": [
"Fiona McIntosh",
"Bernard Tschumi",
"Christopher Shinn",
"Jean Tschumi",
"Emanuel Glicen Romano",
"Dave Grossman (game developer)",
"Andrey Vilkitsky",
"David Shute",
"Hernando de Cabezón",
"Frank Wildhorn"
]
},
"supporting_facts": {
"sent_id": [
1,
1
],
"title": [
"Bernard Tschumi",
"Jean Tschumi"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Where does Bernard Tschumi's father work at?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"École Polytechnique Fédérale de Lausanne",
"ETHL",
"Ecole polytechnique federale de Lausanne",
"EPF Lausanne",
"Swiss Federal Institute of Technology in Lausanne",
"EPFL",
"École polytechnique fédérale de Lausanne"
]
},
"style": "rule"
}
|
{
"index": 10,
"split": "test"
}
|
dev_2144
|
When was Prince Frederick William Of Solms-Braunfels's wife born?
|
[
"3 March 1778"
] |
{
"context": {
"content": [
[
"Amalia of Solms- Braunfels( 31 August 1602, Braunfels – 8 September 1675, The Hague), was Princess consort of Orange by marriage to Frederick Henry, Prince of Orange.",
"She acted as the political adviser of her spouse during his reign, and acted as his de facto deputy and regent during his infirmity from 1640 – 47.",
"She also served as chair of the regency council during the minority of her grandson William III, Prince of Orange from 1650 until 1672.",
"She was the daughter of count John Albert I of Solms- Braunfels and countess Agnes of Sayn- Wittgenstein."
],
[
"Ferdinand Wilhelm Ernst, 2nd Prince of Solms- Braunfels( 8 February 1721 in Braunfels – 2 October 1783, ibid.) was the second Prince of Solms- Braunfels.",
"He was the son of Frederick William, Prince of Solms- Braunfels( 1696 – 1761) by his first wife Princess Magdalena Henrietta of Nassau- Weilburg( 1691 – 1725)."
],
[
"Alexander Frederick, Landgrave of Hesse( 25 January 1863, Copenhagen – 26 March 1945) was a Hessian Prince.",
"He was the son of Prince Frederick William of Hesse- Kassel and Princess Anna of Prussia.",
"From 1888–1925 he was Head of the House of Hesse, but abdicated his position to his brother Prince Frederick Charles of Hesse.",
"He was visually handicapped from birth, and this, in addition to his morganatic marriage, played a part in his decision to abdicate."
],
[
"Christine Charlotte of Solms- Braunfels( 10 November 1690 in Greifenstein – 16 October 1771 in Homburg) was a Countess of Solms- Braunfels by birth and by marriage Landgravine of Hesse- Homburg."
],
[
"Prince Frederick William of Solms- Braunfels( 11 January 1696 in Braunfels – 24 February 1761, Braunfels) was the first Prince of Solms- Braunfels.",
"He was the son of Count William Maurice of Solms- Braunfels( 1651 – 1724) and his wife Magdalene Sophie of Hesse- Homburg( 1660 – 1720), a daughter of William Christoph, Landgrave of Hesse- Homburg, and his first wife."
],
[
"Wilhelm Christian Karl, 3rd Prince of Solms Braunfels (9 January 1759, in Braunfels – 20 March 1837, in Braunfels) was by succession an immediate Prince, then a nobleman and head of the Princely House of Solms-Braunfels, a Prussian major general and Hessian deputy."
],
[
"Prince Frederick William of Solms-Braunfels (22 October 1770 in Braunfels – 13 April 1814 in Slawentzitz) was a Prussian Major General.",
"He was the fourth son of Ferdinand William Ernest, 2nd Prince of Solms-Braunfels (1721–1783) and Countess Sophie Christine Wilhelmine of Solms-Laubach (1741–1772).",
"He became known mainly through his marriage to Princess Frederica of Mecklenburg-Strelitz.",
"She was the widow of Prince Louis Charles of Prussia (1773–1796).",
"When she became pregnant in 1798, he married her in order to avoid a scandal.",
"The daughter died soon after birth.",
"Frederick William was alleged to have been strongly inclined to consume alcohol and had to quit military service in 1805 for health reasons.",
"He also lost his income and even his brother advised Frederica to divorce.",
"She was initially against it, but when in 1813, she met Ernest Augustus, King of Hanover (1771–1851), she, too, wanted the divorce.",
"Before they got around to it, however, Frederick William died, in 1814 in Sławięcice."
],
[
"Frederica of Mecklenburg-Strelitz (\"Friederike Louise Caroline Sophie Charlotte Alexandrine\") (3 March 1778 – 29 June 1841) was a German princess who became, by marriage, princess of Prussia, princess of Solms-Braunfels, Duchess of Cumberland in Britain and Queen of Hanover (in Germany) as the consort of Ernest Augustus, King of Hanover (the fifth son and eighth child of King George III).",
"She was born in the \"Altes Palais\" of Hanover as the fifth daughter of Charles II, Duke of Mecklenburg-Strelitz, and his first wife, Frederica, daughter of Prince George William of Hesse-Darmstadt.",
"Her father assumed the title of Grand Duke of Mecklenburg on 18 June 1815.",
"Duchess Frederica was the niece of her future mother-in-law, Queen Charlotte, through her father."
],
[
"Countess Amalie Henriette Charlotte of Solms- Baruth( Kliczków, 30 January",
"1768 – Karlsruhe, 31 October 1847) was a countess by birth of Solms- Baruth.",
"She was the only child of John Christian II, Count of Solms- Baruth and his wife, Friederike Louise of Reuss- Köstritz."
],
[
"Johanetta of Sayn-Wittgenstein (15 February 1561 – 13 April 1622) was a daughter of Count Louis I of Sayn-Wittgenstein and his first wife, Anna of Solms-Braunfels, who was a relative of Amalia of Solms-Braunfels."
]
],
"title": [
"Amalia of Solms-Braunfels",
"Ferdinand, Prince of Solms-Braunfels",
"Alexander Frederick, Landgrave of Hesse",
"Christine Charlotte of Solms-Braunfels",
"Frederick William, Prince of Solms-Braunfels",
"William, Prince of Solms-Braunfels",
"Prince Frederick William of Solms-Braunfels",
"Frederica of Mecklenburg-Strelitz",
"Countess Amalie Henriette of Solms-Baruth",
"Johannetta of Sayn-Wittgenstein (1561–1622)"
]
},
"supporting_facts": {
"sent_id": [
2,
0
],
"title": [
"Prince Frederick William of Solms-Braunfels",
"Frederica of Mecklenburg-Strelitz"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: When was Prince Frederick William Of Solms-Braunfels's wife born?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"3 March 1778"
]
},
"style": "rule"
}
|
{
"index": 11,
"split": "test"
}
|
dev_11004
|
Where does Andy Smith (Entrepreneur)'s wife work at?
|
[
"Leland Stanford Junior University",
"The Farm",
"Stanford University",
"Stanford",
"University of Stanford"
] |
{
"context": {
"content": [
[
"Rear- Admiral Andrew( Andy) Smith is a former Royal Canadian Navy officer.",
"Smith was the former Chief of Military Personnel and Chief of Transformation, until retiring in 2013 to move to a senior role with Public Works and Government Services Canada."
],
[
"Where Do You Go may refer to:"
],
[
"David Shute is a British journalist, best known for his work at the BBC."
],
[
"Andy Smith( born 6 July 1984) is an English former professional rugby league footballer who played as a.",
"Smith was the first try scorer of the annual magic weekend in 2007."
],
[
"Andrew\" Andy\" Smith( born 25 May 1966 in York, England) is a retired British motorcycle speedway rider."
],
[
"Dave Grossman is an American game programmer and game designer, most known for his work at Telltale Games and early work at LucasArts.",
"He has also written several children's books, and a book of\" guy poetry\" called\" Ode to the Stuff in the Sink\"."
],
[
"Andy Smith (born January 9, 1968) is an American entrepreneur, start up advisor, and author.",
"He is best known for \"The Dragonfly Effect: Quick, Effective, and Powerful Ways To Use Social Media to Drive Social Change,\" which he wrote with his wife, Dr. Jennifer Aaker.",
"Described by the \"New Yorker\" as \"the first book to explain what happens at the intersection of social media like Twitter, marketing and psychologyThe Dragonfly Effect\" examines how people use social media to connect and form groups that create global change.",
"In 2014, he partnered with Jay Adelson to form Center Electric, an early-stage technology venture capital firm designed to leverage the growth of the Internet of Things.",
"Center Electric was an outgrowth of Smith and Adelson's passion project, Point Option, which develops next generation devices for in-home use."
],
[
"Christopher Shinn is an American playwright.",
"His play\" Dying City\"( 2006) was a finalist for the 2008 Pulitzer Prize for Drama, and\" Where Do We Live\"( 2004) won the 2005 Obie Award, Playwriting."
],
[
"Andrew or Andy Smith may refer to:"
],
[
"Jennifer Aaker (born 15 January 1967, California) is an American social psychologist, author, and General Atlantic Professor of Marketing at the Stanford Graduate School of Business.",
"Best known for her research on time, money, and happiness, Aaker also focuses on the transmission of ideas through social networks, the power of story in decision making, and how to build global brands across cultures.",
"She is the recipient of the Distinguished Scientific Achievement Award from the Society for Consumer Psychology and the Stanford Distinguished Teaching Award."
]
],
"title": [
"Andrew Smith (military officer)",
"Where Do You Go",
"David Shute",
"Andy Smith (rugby league)",
"Andy Smith (speedway rider)",
"Dave Grossman (game developer)",
"Andy Smith (entrepreneur)",
"Christopher Shinn",
"Andrew Smith",
"Jennifer Aaker"
]
},
"supporting_facts": {
"sent_id": [
1,
0
],
"title": [
"Andy Smith (entrepreneur)",
"Jennifer Aaker"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Where does Andy Smith (Entrepreneur)'s wife work at?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Leland Stanford Junior University",
"The Farm",
"Stanford University",
"Stanford",
"University of Stanford"
]
},
"style": "rule"
}
|
{
"index": 12,
"split": "test"
}
|
dev_10575
|
Are both On A Magical Night and The Mongols (Film) from the same country?
|
[
"yes"
] |
{
"context": {
"content": [
[
"Date at Midnight is a 1959 British crime film from the Danzigers."
],
[
"Bombay Dreams is a Swedish comedy film from 2004."
],
[
"On a Magical Night is a 2019 French comedy film written and directed by Christophe Honoré and starring Chiara Mastroianni.",
"It was screened in the Un Certain Regard section at the 2019 Cannes Film Festival where Mastroianni won the award for Best Performance."
],
[
"One and Five is a 1969 short film from Egypt."
],
[
"The Mongols is a 1961 Italian/ French international co-production historical adventure film directed by Andre DeToth and Leopoldo Savona, and starring Jack Palance and Anita Ekberg."
],
[
"Shima is a 2007 film from Uzbekistan."
],
[
"A Kind of America is a Hungarian comedy film from 2002."
],
[
"Cayuco is a Spanish documentary film from 2007."
],
[
"21 ème Siècle is a 2004 film from Senegal."
],
[
"California Winter is an American drama film from 2012."
]
],
"title": [
"Date at Midnight",
"Bombay Dreams (film)",
"On a Magical Night",
"One and Five",
"The Mongols (film)",
"Shima (film)",
"A Kind of America",
"Cayuco (film)",
"21ème Siècle",
"California Winter"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"On a Magical Night",
"The Mongols (film)"
]
},
"type": "comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Are both On A Magical Night and The Mongols (Film) from the same country?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"yes"
]
},
"style": "rule"
}
|
{
"index": 13,
"split": "test"
}
|
dev_4915
|
Which film has the director died earlier, L'Arbre, Le Maire Et La Médiathèque or Sette Giorni All'Altro Mondo?
|
[
"Sette Giorni All'Altro Mondo",
"Sette giorni all'altro mondo"
] |
{
"context": {
"content": [
[
"Jacob Le Maire( c. 1585, Antwerp or Amsterdam- 22 December 1616, at sea) was a Dutch mariner who circumnavigated the earth in 1615 and 1616.",
"The strait between Tierra del Fuego and Isla de los Estados was named the Le Maire Strait in his honor, though not without controversy.",
"It was Le Maire himself who proposed to the council aboard\" Eendracht\" that the new passage should be called by his name and the council unanimously agreed with Le Maire.",
"The author or authors of\" The Relation\" took\" Eendracht\" captain Schouten ’s side by proclaiming:\" Eendracht\" then rounded Cape Horn, proving that Tierra del Fuego was not a continent."
],
[
"Sette giorni all' altro mondo is a 1936 Italian comedy film directed by Mario Mattoli."
],
[
"William\" Willy\" le Maire de Warzée d' Hermalle( 7 March 1879 – 1966) was a Belgian tennis player.",
"He has been named as among the best male tennis players in Belgian history.",
"Le Maire de Warzée d' Hermalle reached the quarterfinals of Wimbledon in 1904 and 1908.",
"Le Maire de Warzée d' Hermalle was born into the Belgian nobility, the son of Marie- Léon- Hubert le Maire Warzée d' Hermalle( born 1848) and Noëmie Constance Caroline Georgine Valérie de Warzée d' Hermalle( born in Beaufays in 1850).",
"His elder brother, Marie- Georges- Gérard- Léon le Maire de Warzée d' Hermalle, Baron de Warzée d' Hermalle( 1877 – 1931), was a diplomat and Belgian ambassador to Iran, Japan and China, who married Dorothy Hall, daughter of Irish writer Owen Hall."
],
[
"Le masque de la Méduse is a 2009 fantasy horror film directed by Jean Rollin.",
"The film is a modern-day telling of the Greek mythological tale of the Gorgon and was inspired by the 1964 classic Hammer Horror film of the same name and the 1981 cult classic\" Clash of the Titans\".",
"It was Rollin's final film, as the director died in 2010."
],
[
"Thulasi is a 1987 Tamil romantic drama film directed by Ameerjan.",
"The film features Murali and Seetha in lead roles.",
"The film, produced by P. S. V. Hariharan, had musical score by Sampath Selvam and was released on 27 November 1987.",
"The music director died before the release of the film."
],
[
"Un altro mondo( English title: Another World) is a 2010 Italian drama film directed by and starring Silvio Muccino and based on the novel of the same name by Carla Vangelista."
],
[
"Mario Mattòli( 30 November 1898 – 26 February 1980) was an Italian film director and screenwriter.",
"He directed 86 films between 1934 and 1966.",
"His 1939 film\" Defendant, Stand Up!\" was shown as part of a retrospective on Italian comedy at the 67th Venice International Film Festival."
],
[
"Querelle is a 1982 West German- French English- language drama film directed by Rainer Werner Fassbinder and starring Brad Davis, adapted from French author Jean Genet's 1947 novel\" Querelle of Brest\".",
"It marked Fassbinder's final film as a writer/ director; it was posthumously released just months after the director died of a drug overdose in June 1982."
],
[
"L'Arbre, le maire et la médiathèque ou les sept hasards is a French comedy film directed by Éric Rohmer.",
"It was shown at the 1993 Montreal World Film Festival where it received the FIPRESCI prize.",
"It is known variously by the alternate title\" Les Sept hasards\" and in English translation as\" The Tree, the Mayor and the Mediatheque\"."
],
[
"Jean Marie Maurice Schérer or Maurice Henri Joseph Schérer, known as Éric Rohmer (21 March 192011 January 2010), was a French film director, film critic, journalist, novelist, screenwriter, and teacher.",
"Rohmer was the last of the post-World War II French New Wave directors to become established.",
"He edited the influential film journal, \"Cahiers du cinéma\", from 1957 to 1963, while most of his colleagues—among them Jean-Luc Godard and François Truffaut—were making the transition from film critics to filmmakers and gaining international attention.",
"Rohmer gained international acclaim around 1969 when his film \"My Night at Maud's\" was nominated at the Academy Awards.",
"He won the San Sebastián International Film Festival with \"Claire's Knee\" in 1971 and the Golden Lion at the Venice Film Festival for \"The Green Ray\" in 1986.",
"Rohmer went on to receive the Venice Film Festival's Career Golden Lion in 2001.",
"After Rohmer's death in 2010, his obituary in \"The Daily Telegraph\" described him as \"the most durable filmmaker of the French New Wave\", outlasting his peers and \"still making movies the public wanted to see\" late in his career."
]
],
"title": [
"Jacob Le Maire",
"Sette giorni all'altro mondo",
"William le Maire de Warzée d'Hermalle",
"Le Masque de la Méduse",
"Thulasi (1987 film)",
"Un altro mondo",
"Mario Mattoli",
"Querelle",
"L'Arbre, le maire et la médiathèque",
"Éric Rohmer"
]
},
"supporting_facts": {
"sent_id": [
0,
0,
0,
0
],
"title": [
"L'Arbre, le maire et la médiathèque",
"Sette giorni all'altro mondo",
"Éric Rohmer",
"Mario Mattoli"
]
},
"type": "bridge_comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film has the director died earlier, L'Arbre, Le Maire Et La Médiathèque or Sette Giorni All'Altro Mondo?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Sette Giorni All'Altro Mondo",
"Sette giorni all'altro mondo"
]
},
"style": "rule"
}
|
{
"index": 14,
"split": "test"
}
|
dev_7491
|
Are the directors of both films The Great Profile and Kim Kardashian, Superstar from the same country?
|
[
"yes"
] |
{
"context": {
"content": [
[
"Kim Kardashian, Superstar( also known simply as Kim K Superstar) is a 2007 pornographic film featuring American television personality Kim Kardashian and singer- actor Ray J.",
"It depicts the pair having sexual intercourse in October 2002 while on vacation in Cabo San Lucas, Mexico."
],
[
"Hranush Alexis\" Hrush\" Achemyan( born July 19, 1987) is an Armenian- American celebrity make- up artist and beauty influencer.",
"She is best known for being a personal stylist for Kylie Jenner and Kim Kardashian."
],
[
"Kimberly Noel Kardashian West(\" née\" Kardashian; born October 21, 1980) is an American media personality, model, businesswoman, socialite and actress.",
"Kardashian first gained media attention as a friend and stylist of Paris Hilton but received wider notice after a 2002 sex tape,\" Kim Kardashian, Superstar\", with her then- boyfriend Ray J was released in 2007.",
"Later that year, she and her family began to appear in the E! reality television series\" Keeping Up with the Kardashians\"( 2007 – present).",
"Its success soon led to the creation of spin- offs including\" Kourtney and Kim Take New York\"( 2011–2012) and\" Kourtney and Kim Take Miami\"( 2009 – 2013).",
"In recent years, Kim Kardashian has developed an online and social media presence, including hundreds of millions of followers on Twitter and Instagram.",
"She has released a variety of products tied to her name, including the 2014 mobile game, a variety of clothing and products, the 2015 photo book\" Selfish\" and her eponymous personal app.",
"Her relationship with rapper Kanye",
"West has also received significant media coverage; the couple married in 2014, and they have four children together.",
"Kardashian has appeared in films such as\" Disaster Movie\"( 2008),\" Deep in the Valley\"( 2009) and( 2013).",
"\" Time\" magazine included Kardashian on their list of 2015's 100 most influential people, while\" Vogue\" described her in 2016 as a\" pop culture phenomenon.\"",
"Both critics and admirers have described her as exemplifying the notion of being famous for being famous.",
"She was reported to be the highest- paid reality television personality of 2015, with her estimated total earnings exceeding US$ 53 million."
],
[
"I Want Your",
"Love is the title of both a 2010 short film and a 2012 feature- length film.",
"Both films were directed and written by Travis Mathews.",
"The drama films both revolve around the friends and ex-lovers of Jesse Metzger, a gay man in his mid-thirties who is forced to move back to his hometown from San Francisco due to financial reasons.",
"The actors' own names, along with much of their real- life stories, were used for their characters in both films, which features graphic sexual scenes.",
"The production of both films was aided by the gay pornographic studio NakedSword.",
"This led to the full- length film being refused exemption from classification, which would have allowed it to screen at the Melbourne Queer Film Festival, a decision to which actor James Franco( who invited Mathews to collaborate on Franco's film\" Interior. Leather Bar.\") reacted negatively."
],
[
"William Raymond Norwood Jr.( born January 17, 1981), known professionally as Ray J, is an American rapper, singer, songwriter, television personality, actor, and entrepreneur.",
"Born in McComb, Mississippi and raised in Carson, California, he is the brother of recording artist and actress Brandy Norwood and the first cousin of rapper Snoop Dogg.",
"In January 2017, he competed in the nineteenth season of the UK reality television programme\" Celebrity Big Brother\"."
],
[
"The Great Profile is a 1940 film directed by Walter Lang and starring John Barrymore and John Payne.",
"Barrymore lampoons himself.",
"A famous actor, given to drink, nearly destroys the show, but his leading lady returns to save it.",
"Meanwhile, a young girl tries to reform him."
],
[
"Sophie Theallet( Born 1964 in Bagnères- de- Bigorre, France) is a French fashion designer whose clients include First Lady Michelle Obama, Jennifer Lopez, Jessica Alba, Oprah Winfrey, Gabrielle Union, and Kim Kardashian"
],
[
"Matt Corboy( born June 4, 1973) is an American actor.",
"He has appeared in both films and television series."
],
[
"Walter Lang( August 10, 1896 – February 7, 1972) was an American film director."
],
[
"Leopoldo Salcedo( March 12, 1912 – June 11, 1998) was a two- time FAMAS award- winning Filipino film actor who specialized in portraying dramatic heroes.",
"Dubbed as\" The Great Profile\", he was said to be among the first\" kayumanggi\" or dark- skinned Filipino film stars, in contrast to the lighter- skinned mestizo actors of his generation."
]
],
"title": [
"Kim Kardashian, Superstar",
"Hrush Achemyan",
"Kim Kardashian",
"I Want Your Love (film)",
"Ray J",
"The Great Profile",
"Sophie Theallet",
"Matt Corboy",
"Walter Lang",
"Leopoldo Salcedo"
]
},
"supporting_facts": {
"sent_id": [
0,
0,
0,
0
],
"title": [
"The Great Profile",
"Kim Kardashian, Superstar",
"Walter Lang",
"Ray J"
]
},
"type": "bridge_comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Are the directors of both films The Great Profile and Kim Kardashian, Superstar from the same country?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"yes"
]
},
"style": "rule"
}
|
{
"index": 15,
"split": "test"
}
|
dev_5180
|
Which film whose director is younger, Vendredi Soir or El Santo De La Espada?
|
[
"Vendredi soir",
"Vendredi Soir"
] |
{
"context": {
"content": [
[
"Miloš Zličić( born 29 December 1999) is a Serbian football forward.",
"He is a younger brother of Lazar Zličić."
],
[
"Leopoldo Torre Nilsson( 5 May 1924 – 8 September 1978), also known as Leo Towers and as Babsy, was an Argentine film director, producer and screenwriter.",
"Born as Leopoldo Torres Nilsson( he later changed his paternal surname from Torres to Torre) was the son of Argentine pioneer film director Leopoldo Torres Ríos, with whom he collaborated between 1939 and 1949.",
"He debuted in 1947 with the short\" El muro\".",
"His mother was an Argentinian citizen of Swedish descent.",
"His uncle was cinematographer Carlos Torres Ríos( 1898 – 1956).",
"Torre Nilsson's first full- length film,\" El crimen de Oribe\"( 1950), was an adaptation of Adolfo Bioy Casares's novel\" El perjurio de la nieve\".",
"In 1954 he directed\" Días de odio\", based on Jorge Luis Borges's short story\" Emma Zunz\".",
"In 1956 he directed\" Graciela\", based on Carmen Laforet's novel\" Nada\", winner of Nadal Literary Prize 1944.",
"He also directed films about icons of Argentine history and culture:\" Martín Fierro\"( 1968), about the main character of Argentina's national poem;\" El Santo de la Espada\"( 1970), about General José de San Martín; and( 1971), about Martín Miguel de Güemes.",
"It was entered into the 7th Moscow International Film Festival.",
"His 1973 film\" Los siete locos\" won the Silver Bear at the 23rd Berlin International Film Festival.",
"Torre Nilsson was married to writer Beatriz Guido, whose work served as inspiration and who worked alongside him in many of his scripts.",
"He is acknowledged as the first Argentine film director to be critically acclaimed outside the country, making Argentina's film production known in important international festivals.",
"He died of cancer in his native Buenos Aires in 1978, at the age of 54.",
"He was buried at the Cementerio Británico in Buenos Aires.",
"A novelized biography of Torre Nilsson,\" El Gran Babsy\", by Mónica Martín, was published in 1993.",
"Another biography,\" Leopoldo Torre Nilsson: Imagen y Poesía\" was published in 2006 by the newspaper\" La Nación\" and the Aguilar editorial house."
],
[
"Hugo Arana( born July 23, 1943) is an Argentine film, television and theatre actor.",
"Arana was born in Buenos Aires, grew up in Monte Grande and moved with his family to Lomas de Zamora and then Lanús.",
"He studied acting with Marcello Lavalle and Augusto Fernandez.",
"In his first years as an actor, he was part of a theatre group called\" Errare Humanum Est\" and he acted in films such as\" El Santo de la Espada\"( 1970) and\" La tregua\"( 1974).",
"In the 1980s, he became popular for his part in an advertisement for Crespi wine, and then for his part in the TV sitcom\" Matrimonios y algo más\", created by Hugo Moser, in which he played two characters who were highly acclaimed by the public: the\" Groncho\"( in the comedy sketch\" El Groncho y La Dama\"( The Shabby Man and the Lady)) and Huguito Araña( a stereotypically feminine gay man).",
"He has worked on the Telefé TV series\" Los exitosos Pells\", where he played the director of the fictitious channel\" Mega News\", Franco Andrada."
],
[
"El Santo de la Espada( in English,\" The Saint of the Sword\") is a 1970 Argentine historical epic film directed by Leopoldo Torre Nilsson and starring Alfredo Alcón.",
"It narrates the life of José de San Martín.",
"It was written by Beatriz Guido and Luis Pico Estrada, based on the eponymous novel by Ricardo Rojas.",
"The script was supervised by the Sanmartinian National Institute.",
"It had a great success, and it is the most successful movie about\" El Libertador\" San Martín, although it received some negative reviews due to historical inaccuracies."
],
[
"Jesus Christ Vampire Hunter is a 2001 horror parody film from Odessa Filmworks which deals with Jesus' modern-day struggle to protect the lesbians of Ottawa, Ontario, Canada, from vampires with the help of Mexican wrestler El Santo (based on El Santo, Enmascarado de Plata, and played by actor Jeff Moffet, who starred as El Santo in two other Odessa Filmworks productions).",
"This film earned an honorable mention in the Spirit of Slamdance category at the 2002 Slamdance Film Festival and is considered a cult film."
],
[
"Vendredi soir( Friday Night) is a 2002 drama film directed by Claire Denis.",
"The screenplay was written by Claire Denis and Emmanuèle Bernheim, based upon Bernheim's novel of the same name.",
"The film premiered at the 2002 Venice Film Festival."
],
[
"Claire Denis( born 21 April 1946) is a French film director and writer.",
"Her feature film\" Beau Travail\"( 1999) has been called one of the greatest films of the 1990s.",
"Other acclaimed works include\" Trouble Every Day\"( 2001),\" 35 Shots of Rum\"( 2008), and\" White Material\"( 2009).",
"Her work has dealt with themes of colonial and post-colonial West Africa, as well as issues in modern France, and continues to influence European cinematic identity."
],
[
"Marcos Eusebio Jiménez de la Espada Martín( born 3 November 1985), known simply as Marcos, is a Spanish footballer who plays for Indian club East Bengal as a forward."
],
[
"La Bestia humana is a 1957 Argentine film whose story is based on the novel\" La Bête Humaine\" by the French writer Émile Zola."
],
[
"In 1998 she wrote Vendredi soir( Friday night), a novel that was adapted into film by Claire Denis in 2002.",
"She also worked with Michel Houellebecq on a film adaptation of his novel Platform."
]
],
"title": [
"Miloš Zličić",
"Leopoldo Torre Nilsson",
"Hugo Arana",
"El Santo de la Espada",
"Jesus Christ Vampire Hunter",
"Vendredi soir",
"Claire Denis",
"Marcos Jiménez de la Espada Martín",
"La Bestia humana",
"Emmanuèle Bernheim"
]
},
"supporting_facts": {
"sent_id": [
0,
0,
0,
0
],
"title": [
"Vendredi soir",
"El Santo de la Espada",
"Claire Denis",
"Leopoldo Torre Nilsson"
]
},
"type": "bridge_comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film whose director is younger, Vendredi Soir or El Santo De La Espada?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Vendredi soir",
"Vendredi Soir"
]
},
"style": "rule"
}
|
{
"index": 16,
"split": "test"
}
|
dev_12318
|
Who is the spouse of the director of film Marianne And Juliane?
|
[
"Volker Schlöndorff"
] |
{
"context": {
"content": [
[
"Gertrude of Saxony and Bavaria( 1152/55–1197) was Duchess of Swabia as the spouse of Duke Frederick IV, and Queen of Denmark as the spouse of King Canute VI."
],
[
"Sophia Magdalena of Denmark (3 July 1746 – 21 August 1813) was Queen of Sweden as the spouse of King Gustav III."
],
[
"Marianne and Juliane (lit. \"",
"The Leaden Time\" or \"Leaden Times\"), also called The German Sisters in the United Kingdom, is a 1981 West German film directed by Margarethe von Trotta.",
"The screenplay is a fictionalized account of the true lives of Christiane and Gudrun Ensslin.",
"Gudrun, a member of The Red Army Faction, was found dead in her prison cell in Stammheim in 1977.",
"In the film, Von Trotta depicts the two sisters Juliane (Christine) and Marianne (Gudrun) through their friendship and journey to understanding each other. \"",
"Marianne and Juliane\" was von Trotta's third film and solidified her position as a director of the New German Cinema.",
"\"Marianne and Juliane\" also marked the first time that von Trotta worked with Barbara Sukowa.",
"They would go on to work on six more films together."
],
[
"Heather Denise Gibson is a Scottish economist currently serving as Director- Advisor to the Bank of Greece( since 2011).",
"She is the spouse of Euclid Tsakalotos, former Greek Minister of Finance."
],
[
"Margarethe von Trotta (born 21 February 1942) is a German film director who has been referred to as a \"leading force\" of the New German Cinema movement.",
"Von Trotta's extensive body of work has won awards internationally.",
"She was married to and collaborated with director Volker Schlöndorff.",
"Although they made a successful team, von Trotta felt she was seen as secondary to Schlöndorff.",
"Subsequently, she established a solo career for herself and became \"Germany’s foremost female film director, who has offered the most sustained and successful female variant of \"Autorenkino\" in postwar German film history\".",
"Certain aspects of von Trotta's work have been compared to Ingmar Bergman's features from the 1960s and 1970s.",
"Von Trotta has been called \"the world’s leading feminist filmmaker.",
"\" The predominant aim of her films is to create new representations of women.",
"Her films are concerned with relationships between and among women (sisters, best friends, etc.), as well as with relationships between women and men, and involve political setting.",
"Nevertheless, she rejects the suggestion that she makes \"women’s films\"."
],
[
"Mehdi Abrishamchi is an Iranian People's Mujahedin of Iran( MEK) politician who has been described as\" the right hand man of Massoud Rajavi\".",
"He is the former spouse of the group's current leader, Maryam Rajavi, whom he divorced so that she could become the spouse of Massoud Rajavi."
],
[
"Princess Auguste of Bavaria( 28 April 1877 – 25 June 1964) was a member of the Bavarian Royal House of Wittelsbach and the spouse of Archduke Joseph August of Austria."
],
[
"Maria Teresa, Grand Duchess of Luxembourg( born María Teresa Mestre y Batista; on 22 March 1956), is the spouse of Grand Duke Henri."
],
[
"Marie Louise Coidavid( 1778 – March 11, 1851), was the Queen of the Kingdom of Haiti 1811 – 20 as the spouse of Henri I of Haiti."
],
[
"Adib Kheir was a leading Syrian nationalist of the 1920s.",
"He was the owner of the Librairie Universelle in Damascus.",
"His granddaughter is the spouse of Manaf Tlass."
]
],
"title": [
"Gertrude of Bavaria",
"Sophia Magdalena of Denmark",
"Marianne and Juliane",
"Heather D. Gibson",
"Margarethe von Trotta",
"Mehdi Abrishamchi",
"Princess Auguste of Bavaria (1875–1964)",
"Maria Teresa, Grand Duchess of Luxembourg",
"Marie-Louise Coidavid",
"Adib Kheir"
]
},
"supporting_facts": {
"sent_id": [
1,
2
],
"title": [
"Marianne and Juliane",
"Margarethe von Trotta"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Who is the spouse of the director of film Marianne And Juliane?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Volker Schlöndorff"
]
},
"style": "rule"
}
|
{
"index": 17,
"split": "test"
}
|
dev_1188
|
What is the place of birth of Gwladys's husband?
|
[
"Gwynllwg"
] |
{
"context": {
"content": [
[
"Devisingh Ransingh Shekhawat is the former and first First Gentleman of India as the husband of Pratibha Patil."
],
[
"James Randall Marsh( 1896- 1965) was an American artist and the husband of Anne Steele Marsh."
],
[
"Jonathan Leach( born April 18, 1973) is a former professional tennis player from the United States.",
"He is the husband of Lindsay Davenport."
],
[
"Saint Gwynllyw Milwr or Gwynllyw Farfog, known in English in a corrupted form as Woolos the Warrior or Woolos the Bearded (450 – 500 AD) was a Welsh king and religious figure.",
"He was King of Gwynllwg in South Wales and is the legendary founder and patron saint of the City of Newport living around the 5th century.",
"According to medieval tradition he was a feared warlord and raider who knew King Arthur, but later found religion and became a hermit founding St Woolos Cathedral in Newport.",
"He was the father of one of the most revered Welsh saints, Saint Cadoc the Wise."
],
[
"Andrew Upton( born 1 February 1966) is an Australian playwright, screenwriter, and director.",
"He is the husband of actress Cate Blanchett."
],
[
"Stan Rice( November 7, 1942 – December 9, 2002) was an American poet and artist.",
"He was the husband of author Anne Rice."
],
[
"Stan Marks is an Australian writer and journalist.",
"He is the husband of Holocaust survivor Eva Marks."
],
[
"Saint Gwladys ferch Brychan or St Gladys (Latin-Gladusa), daughter of King Brychan of Brycheiniog, was the queen of the saint-king Gwynllyw Milwr and the mother of Cadoc \"the Wise\", whose \"vita\" may be the earliest saint's life to mention Arthur.",
"Gwladys's other children were Cynidr, Bugi, Cyfyw, Maches, Glywys II and Egwine.",
"Today her main church and associated school is in Bargoed."
],
[
"Stephen Alan Fletcher Pownall( born 30 December 1986) is an English singer- songwriter and the husband of actress Gabriella Wilde."
],
[
"Lapidoth(\" Lappidothtorches\") was the husband of Deborah the fourth Judge of Israel.",
"Lapidoth is also a Hebrew male given name."
]
],
"title": [
"Devisingh Ransingh Shekhawat",
"James Randall Marsh",
"Jon Leach",
"Gwynllyw",
"Andrew Upton",
"Stan Rice",
"Stan Marks",
"Gwladys",
"Alan Pownall",
"Lapidoth"
]
},
"supporting_facts": {
"sent_id": [
0,
1
],
"title": [
"Gwladys",
"Gwynllyw"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: What is the place of birth of Gwladys's husband?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Gwynllwg"
]
},
"style": "rule"
}
|
{
"index": 18,
"split": "test"
}
|
dev_152
|
Where did the director of film My Heidelberg, I Can Not Forget You die?
|
[
"AR",
"Argentine Republic",
"🇦🇷",
"Argentina",
"ARG",
"ar"
] |
{
"context": {
"content": [
[
"Dana Blankstein- Cohen( born March 3, 1981) is the director of the Israeli Academy of Film and Television.",
"She is a film director, and an Israeli culture entrepreneur."
],
[
"Jesse Edward Hobson( May 2, 1911 – November 5, 1970) was the director of SRI International from 1947 to 1955.",
"Prior to SRI, he was the director of the Armour Research Foundation."
],
[
"Olav Aaraas( born 10 July 1950) is a Norwegian historian and museum director.",
"He was born in Fredrikstad.",
"From 1982 to 1993 he was the director of Sogn Folk Museum, from 1993 to 2010 he was the director of Maihaugen and from 2001 he has been the director of the Norwegian Museum of Cultural History.",
"In 2010 he was decorated with the Royal Norwegian Order of St. Olav."
],
[
"Do n't Forget",
"You're Going to Die is a 1995 French drama film directed, co-written by and starring Xavier Beauvois."
],
[
"Brian Patrick Kennedy( born 5 November 1961) is an Irish- born art museum director who has worked in Ireland and Australia, and now lives and works in the United States.",
"He is currently the director of the Peabody Essex Museum.",
"He was the director of the Toledo Museum of Art in Ohio from 2010 to 2019.",
"He was the director of the Hood Museum of Art from 2005 to 2010, and the National Gallery of Australia( Canberra) from 1997- 2004."
],
[
"James Bauer (1884–1940) was a German film director.",
"Following the Nazi takeover of power in 1933, he emigrated first to Spain and later to Argentina, where he directed several films until his death in 1940."
],
[
"Peter Levin is an American director of film, television and theatre."
],
[
"My Heidelberg, I Can Not Forget You",
"(German:",
"Mein Heidelberg, ich kann Dich nicht vergessen) is a 1927 German silent film directed by James Bauer and starring Dorothea Wieck, and Hans Adalbert Schlettow.",
"It was shot at the Emelka Studios in Munich."
],
[
"S.N. Mathur was the Director of the Indian Intelligence Bureau between September 1975 and February 1980.",
"He was also the Director General of Police in Punjab."
],
[
"Ian Barry is an Australian director of film and TV."
]
],
"title": [
"Dana Blankstein",
"Jesse E. Hobson",
"Olav Aaraas",
"Don't Forget You're Going to Die",
"Brian Kennedy (gallery director)",
"James Bauer",
"Peter Levin",
"My Heidelberg, I Can Not Forget You",
"S. N. Mathur",
"Ian Barry (director)"
]
},
"supporting_facts": {
"sent_id": [
2,
1
],
"title": [
"My Heidelberg, I Can Not Forget You",
"James Bauer"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Where did the director of film My Heidelberg, I Can Not Forget You die?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"AR",
"Argentine Republic",
"🇦🇷",
"Argentina",
"ARG",
"ar"
]
},
"style": "rule"
}
|
{
"index": 19,
"split": "test"
}
|
dev_7508
|
Where did the composer of film Rajaratha graduate from?
|
[
"Vidya Vikas Institute of Engineering & Technology (VVIET)",
"Vidya Vikas Institute of Engineering & Technology"
] |
{
"context": {
"content": [
[
"Thomas Morse( born June 30, 1968) is an American composer of film and concert music."
],
[
"Abe Meyer( 1901 – 1969) was an American composer of film scores."
],
[
"Alonso Mudarra( c. 1510 – April 1, 1580) was a Spanish composer of the Renaissance, and also played the vihuela, a guitar- shaped string instrument.",
"He was an innovative composer of instrumental music as well as songs, and was the composer of the earliest surviving music for the guitar."
],
[
"Anup Bhandari is an Indian software engineer, turned to film maker, film director, composer and lyricist who works primarily in Kannada cinema,",
"now debut Kannada film director Anup Bhandari seems to have charted out his career graph well in advance.",
"He was born in Mangaluru to director Sudhakar Bhandary and Rathna Bhandari, raised in Mysuru and has worked in the US, but hopes to settle down in Bengaluru.",
"Anup holds an Engineering degree from Vidya Vikas Institute of Engineering & Technology (VVIET), Mysuru.",
"His directorial début mystery thriller Kannada movie Rangitaranga has been critically acclaimed.",
"Anup also wrote the lyrics and composed music for the movie.",
"His other works include a short movie \"Words\" for which cinematography was done by Lance Kaplan.",
"He is currently working on his next film with\nSudeep"
],
[
"Bert Grund( 1920–1992) was a German composer of film scores."
],
[
"Amedeo Escobar( 1888–1973) was an Italian composer of film scores."
],
[
"Rajaratha is is a 2018 Indian Kannada romantic comedy film written and directed by Anup Bhandari who also composed the film's music.",
"It stars Nirup Bhandari and Avantika Shetty in the lead roles in their second collaboration with the director after the success of \"RangiTaranga\".",
"The film was dubbed in Telugu as \"Rajaratham\" (English: \"Royal Chariot\") with Rana Daggubati replacing Puneeth Rajkumar as the narrator.",
"Arya, who made his debut in Kannada cinema for the first time with the film, features in a supporting role alongside P. Ravishankar and Sruthi Hariharan.",
"While the film score is composed by B. Ajaneesh Loknath, the soundtrack is composed by Anup Bhandari.",
"The cinematography is by William David.",
"The first look of the film was released in the social media by the team on 2 October 2017.",
"The film was released on March 23, 2018"
],
[
"The Seventh Company Outdoors is a 1977 French comedy film directed by Robert Lamoureux.",
"It is a sequel to Now Where Did the 7th Company Get to ?."
],
[
"Walter Ulfig was a German composer of film scores."
],
[
"Henri Verdun( 1895–1977) was a French composer of film scores."
]
],
"title": [
"Thomas Morse",
"Abe Meyer",
"Alonso Mudarra",
"Anup Bhandari",
"Bert Grund",
"Amedeo Escobar",
"Rajaratha",
"The Seventh Company Outdoors",
"Walter Ulfig",
"Henri Verdun"
]
},
"supporting_facts": {
"sent_id": [
0,
3
],
"title": [
"Rajaratha",
"Anup Bhandari"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Where did the composer of film Rajaratha graduate from?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Vidya Vikas Institute of Engineering & Technology (VVIET)",
"Vidya Vikas Institute of Engineering & Technology"
]
},
"style": "rule"
}
|
{
"index": 20,
"split": "test"
}
|
dev_10177
|
What is the place of birth of the director of film Yoogan?
|
[
"Kovilpatti"
] |
{
"context": {
"content": [
[
"Yoogan is a Tamil horror, suspense thriller film directed by Kamal G. Starring Yashmith and Sakshi Agarwal in the lead roles along with Siddhu GRN, Pradeep Balaji, Shyam Kirthivasan, Tarun Chakarvarthy and Suresh Pillai.",
"Rashaanth Arwin composed the music For the film.",
"Alex Premnath composed the background score and cinematography was by Ravi Arumugam.",
"It is a horror story which revolves around the problems that happen in an IT company.",
"The movie got released on 24 April 2015."
],
[
"Kamal G (born 22 July 1989) is an Indian film director, film editor and film producer from Kovilpatti in Thoothukudi District, Tamil Nadu, who works in the South Indian film industry."
],
[
"Dana Blankstein- Cohen( born March 3, 1981) is the director of the Israeli Academy of Film and Television.",
"She is a film director, and an Israeli culture entrepreneur."
],
[
"Peter Levin is an American director of film, television and theatre."
],
[
"Brian Patrick Kennedy( born 5 November 1961) is an Irish- born art museum director who has worked in Ireland and Australia, and now lives and works in the United States.",
"He is currently the director of the Peabody Essex Museum.",
"He was the director of the Toledo Museum of Art in Ohio from 2010 to 2019.",
"He was the director of the Hood Museum of Art from 2005 to 2010, and the National Gallery of Australia( Canberra) from 1997- 2004."
],
[
"Jesse Edward Hobson( May 2, 1911 – November 5, 1970) was the director of SRI International from 1947 to 1955.",
"Prior to SRI, he was the director of the Armour Research Foundation."
],
[
"Ian Barry is an Australian director of film and TV."
],
[
"Olav Aaraas( born 10 July 1950) is a Norwegian historian and museum director.",
"He was born in Fredrikstad.",
"From 1982 to 1993 he was the director of Sogn Folk Museum, from 1993 to 2010 he was the director of Maihaugen and from 2001 he has been the director of the Norwegian Museum of Cultural History.",
"In 2010 he was decorated with the Royal Norwegian Order of St. Olav."
],
[
"Jason Moore( born October 22, 1970) is an American director of film, theatre and television."
],
[
"S.N. Mathur was the Director of the Indian Intelligence Bureau between September 1975 and February 1980.",
"He was also the Director General of Police in Punjab."
]
],
"title": [
"Yoogan",
"Kamal G",
"Dana Blankstein",
"Peter Levin",
"Brian Kennedy (gallery director)",
"Jesse E. Hobson",
"Ian Barry (director)",
"Olav Aaraas",
"Jason Moore (director)",
"S. N. Mathur"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"Yoogan",
"Kamal G"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: What is the place of birth of the director of film Yoogan?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Kovilpatti"
]
},
"style": "rule"
}
|
{
"index": 21,
"split": "test"
}
|
dev_9224
|
Where was the director of film Bulletproof Heart (Film) born?
|
[
"El Cajon, California",
"El Cajon"
] |
{
"context": {
"content": [
[
"Peter Levin is an American director of film, television and theatre."
],
[
"Ian Barry is an Australian director of film and TV."
],
[
"Olav Aaraas( born 10 July 1950) is a Norwegian historian and museum director.",
"He was born in Fredrikstad.",
"From 1982 to 1993 he was the director of Sogn Folk Museum, from 1993 to 2010 he was the director of Maihaugen and from 2001 he has been the director of the Norwegian Museum of Cultural History.",
"In 2010 he was decorated with the Royal Norwegian Order of St. Olav."
],
[
"Jesse Edward Hobson( May 2, 1911 – November 5, 1970) was the director of SRI International from 1947 to 1955.",
"Prior to SRI, he was the director of the Armour Research Foundation."
],
[
"Bulletproof Heart (also known as Killer) is a 1995 independent Canadian-American drama-thriller film directed by Mark Malone.",
"It stars Anthony LaPaglia and Mimi Rogers, alongside a supporting cast featuring Matt Craven, Peter Boyle, Monika Schnarre, Joseph Maher, and Mark Acheson.",
"The film first premiered at the Toronto International Film Festival and Houston International Festival in 1994, and was later released to the United States and the United Kingdom in 1995.",
"The story concerns a professional assassin Anthony LaPaglia who is pressured into accepting a new assignment as a favour to his boss Peter Boyle.",
"He later finds himself in a dismal situation with dire consequences for everyone involved..."
],
[
"S.N. Mathur was the Director of the Indian Intelligence Bureau between September 1975 and February 1980.",
"He was also the Director General of Police in Punjab."
],
[
"Mark M. Malone (born November 22, 1958 in El Cajon, California) is a former American football quarterback in the NFL."
],
[
"Dana Blankstein- Cohen( born March 3, 1981) is the director of the Israeli Academy of Film and Television.",
"She is a film director, and an Israeli culture entrepreneur."
],
[
"Brian Patrick Kennedy( born 5 November 1961) is an Irish- born art museum director who has worked in Ireland and Australia, and now lives and works in the United States.",
"He is currently the director of the Peabody Essex Museum.",
"He was the director of the Toledo Museum of Art in Ohio from 2010 to 2019.",
"He was the director of the Hood Museum of Art from 2005 to 2010, and the National Gallery of Australia( Canberra) from 1997- 2004."
],
[
"Jason Moore( born October 22, 1970) is an American director of film, theatre and television."
]
],
"title": [
"Peter Levin",
"Ian Barry (director)",
"Olav Aaraas",
"Jesse E. Hobson",
"Bulletproof Heart (film)",
"S. N. Mathur",
"Mark Malone",
"Dana Blankstein",
"Brian Kennedy (gallery director)",
"Jason Moore (director)"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"Bulletproof Heart (film)",
"Mark Malone"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Where was the director of film Bulletproof Heart (Film) born?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"El Cajon, California",
"El Cajon"
]
},
"style": "rule"
}
|
{
"index": 22,
"split": "test"
}
|
dev_1638
|
Where did Prince Ferdinand Of Bavaria's mother die?
|
[
"Muenchen",
"München",
"Munchen",
"Monachium",
"Minga",
"Munich"
] |
{
"context": {
"content": [
[
"Princess Victoria of Saxe- Coburg and Gotha( Victoria Franziska Antonia Juliane Luise; 14 February",
"1822 – 10 November 1857) was the daughter of Prince Ferdinand of Saxe- Coburg and Gotha and Princess Maria Antonia Koháry.",
"Her father was the second son of Francis, Duke of Saxe- Coburg- Saalfeld and Countess Augusta Reuss of Ebersdorf."
],
[
"Princess Maria Elisabeth of Bavaria( 9 September 1914 – 13 May 2011) was the eldest daughter of Prince Franz of Bavaria, third son of Ludwig III of Bavaria."
],
[
"Princess Pilar of Bavaria(\" Maria del Pilar Eulalia Antonia Isabella Ludovika Franziska Josepha Rita Euphrasia von Wittelsbach, princess of Bavaria\")",
"( March 13, 1891 – January 29, 1987) was the only daughter of Prince Ludwig Ferdinand of Bavaria and his wife Infanta Maria de la Paz of Spain."
],
[
"Klemens Franz de Paula, Prince of Bavaria (Munich, 19 April 1722 – Munich, 6 August 1770) was the son of the Imperial Field Marshal, Ferdinand of Bavaria (1699–1738), and the grandson of Maximilian II Emanuel, Elector of Bavaria.",
"He married on 17 January 1742 Maria Anna, Pfalzgräfin von der Pfalz (1722–1790), daughter of\nJoseph Charles of Bavaria, Count Palatine and Prince Hereditary of Sulzbach and Countess Palatine Elizabeth Augusta Sophie of Neuburg, They had two sons and two daughters, all of them died shortly after birth:",
"If one of his sons had lived, he would have become Prince-elector of Bavaria in 1777 after the death of Maximilian III Joseph."
],
[
"Infanta María de la Paz of Spain (\"María de la Paz de Borbón y Borbón\"",
"; Madrid, 23 June 1862 – Schloss Nymphenburg, Munich, 4 December 1946) was an infanta of Spain.",
"A daughter of Queen Isabella II of Spain, she married her first cousin Prince Ludwig Ferdinand of Bavaria.",
"She lived for the rest of her life in Germany, dedicating her time to her family, charity work and writing poetry.",
"She wrote a book of memoirs: \"Through Four Revolutions: 1862–1933\"."
],
[
"Prince Ferdinand of Bourbon-Two Sicilies, The Duke of Castro",
"(Ferdinando Maria Andrea Alfonso Marcus; 28 May 1926 – 20 March 2008) was a claimant to the headship of the former Royal House of the Two Sicilies."
],
[
"Chantal Françoise Marie Camille de Chevron-Villette (10 January 1925 in Le Cannet-des-Maures, France – 24 May 2005 in Suresnes, France) was the wife of Prince Ferdinand of Bourbon-Two Sicilies, Duke of Castro, a Bourbon prince and claimant to headship of the Royal House of the Two Sicilies, which was deposed in 1860."
],
[
"Infante Ferdinand of Bavaria",
"(Ferdinand Maria Ludwig Franz von Assisi Isabellus Adalbert Ildefons Martin Bonifaz Joseph Isidro; 10 May 1884 – 5 April 1958) was a Spanish prince, the eldest son and child of Ludwig Ferdinand of Bavaria and his wife, Infanta María de la Paz of Spain.",
"Ferdinand became an Infante of Spain on 20 October 1905 and renounced his rights to the throne of the Kingdom of Bavaria in 1914."
],
[
"Prince Ferdinand of Savoy, 1st Duke of Genoa",
"(Ferdinando Maria Alberto Amedeo Filiberto Vincenzo; 15 November 1822 – 10 February 1855) was the founder of the Genoa branch of the House of Savoy."
],
[
"Prince Ferdinand of Hohenzollern( Ferdinand Maria Fidelis Leopold Meinrad Valentin; born 14 February 1960) is the youngest of three children of Friedrich Wilhelm, Prince of Hohenzollern and his wife Princess Margarita of Leiningen."
]
],
"title": [
"Princess Victoria of Saxe-Coburg and Gotha",
"Princess Maria Elisabeth of Bavaria",
"Princess Pilar of Bavaria",
"Duke Clement Francis of Bavaria",
"Infanta María de la Paz of Spain",
"Prince Ferdinand, Duke of Castro",
"Chantal de Chevron-Villette",
"Prince Ferdinand of Bavaria",
"Prince Ferdinando, Duke of Genoa (1822–1855)",
"Prince Ferdinand of Hohenzollern"
]
},
"supporting_facts": {
"sent_id": [
1,
1
],
"title": [
"Prince Ferdinand of Bavaria",
"Infanta María de la Paz of Spain"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Where did Prince Ferdinand Of Bavaria's mother die?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Muenchen",
"München",
"Munchen",
"Monachium",
"Minga",
"Munich"
]
},
"style": "rule"
}
|
{
"index": 23,
"split": "test"
}
|
dev_1200
|
What is the place of birth of Patricia Kennedy Lawford's father?
|
[
"Bostonia",
"The Hub of the Universe",
"Boston, MA",
"The Athens of America",
"The Walking City",
"The Hub",
"Boston, Massachusetts",
"The Cradle of Liberty",
"Boston",
"Boston, Mass.",
"The Cradle of Modern America",
"Beantown"
] |
{
"context": {
"content": [
[
"He was the father of Obata Masamori."
],
[
"Anacyndaraxes was the father of Sardanapalus, king of Assyria."
],
[
"Cleomenes II( died 309 BC) was Agiad King of Sparta from 369 to 309 BC.",
"The son of Cleombrotus I, he succeeded his brother Agesipolis II.",
"He was the father of Acrotatus I, the father of Areus I, and of Cleonymus, the father of Leonidas II."
],
[
"Christopher Kennedy Lawford( March 29, 1955 – September 4, 2018) was an American author, actor, and activist.",
"He was a member of the prominent Kennedy family, and son of actor Peter Lawford and Patricia\" Pat\" Kennedy Lawford, who was the sister of President John F. Kennedy.",
"He graduated from Tufts University in 1977 and earned a law degree from Boston College in 1983.",
"He later earned a master's certificate in Clinical Psychology from Harvard University, and was a lecturer on drug addiction.",
"After struggling with addiction for 17 years, he became an actor, performing in several movies and television shows for over 20 years.",
"He wrote several books, based on his own experience, about addiction and recovery.",
"He also traveled around the U.S. speaking about his experiences with addiction for 20 years, and was a public health campaigner, working with organizations like the World Health Organization( WHO) and the United Nations( UN), and for the U.S. federal government."
],
[
"Craig Brian Lawford( born 25 November 1972) is an English former professional footballer who played as a midfielder."
],
[
"Patricia Helen Kennedy Lawford (May 6, 1924 – September 17, 2006) was an American socialite and the sixth of nine children of Rose and Joseph P. Kennedy Sr.",
"She was a sister of President John F. Kennedy, Senator Robert F. Kennedy, and Senator Ted Kennedy.",
"Pat wanted to be a film producer, a profession not readily open to young women in her time.",
"She married English actor Peter Lawford in 1954, but they divorced in 1966."
],
[
"John Templeton( 1766–1825) was an early Irish naturalist and botanist.",
"He is often referred to as the\" Father of Irish Botany\".",
"He was the father of naturalist, artist and entomologist Robert Templeton."
],
[
"Joseph Patrick Kennedy Sr.",
"(September 6, 1888 – November 18, 1969) was an American businessman, investor, and politician known for his high-profile positions in United States government and for the political and other achievements of his children.",
"Kennedy was born to a political family in East Boston, Massachusetts.",
"He made a large fortune as a stock market and commodity investor and later rolled over his profits by investing in real estate and a wide range of business industries across the United States.",
"During World War I, he was an assistant general manager of a Boston area Bethlehem Steel shipyard; through this position, he became acquainted with Franklin D. Roosevelt, who was the Assistant Secretary of the Navy.",
"In the 1920s, Kennedy made huge profits by reorganizing and refinancing several Hollywood studios; several acquisitions were ultimately merged into Radio-Keith-Orpheum (RKO) studios.",
"Kennedy increased his fortune with distribution rights for Scotch whisky.",
"He owned the largest privately owned building in the country, Chicago's Merchandise Mart.",
"Kennedy was a leading member of the Democratic Party and of the Irish Catholic community.",
"President Franklin D. Roosevelt appointed Kennedy to be the first chairman of the U.S. Securities and Exchange Commission (SEC), which he led from 1934 to 1935.",
"Kennedy later directed the Maritime Commission.",
"Kennedy served as the United States Ambassador to the United Kingdom from 1938 until late 1940, when he annoyed Roosevelt by his pessimism about Britain's survival.",
"During the Battle of Britain in November 1940, Kennedy publicly suggested that \"Democracy is finished in England.",
"It may be here [in the United States]\".",
"Following this controversy, Kennedy resigned his position.",
"Kennedy was married to Rose Kennedy.",
"During his later life he was heavily involved in the political careers of his sons.",
"Three of the Kennedys' sons attained distinguished political positions: John F. Kennedy (1917–1963) served as a U.S. Senator from Massachusetts and as President of the United States, Robert F. Kennedy (1925–1968) served as Attorney General and as a U.S. Senator from New York, and Ted Kennedy (1932–2009) also served as a U.S. Senator from Massachusetts."
],
[
"Arthur Beauchamp( 1827 – 28 April 1910) was a Member of Parliament from New Zealand.",
"He is remembered as the father of Harold Beauchamp, who rose to fame as chairman of the Bank of New Zealand and was the father of writer Katherine Mansfield."
],
[
"Dorothy Hepworth( 1898 – 8 September 1978) was a British painter and the life partner of Patricia Preece."
]
],
"title": [
"Obata Toramori",
"Anacyndaraxes",
"Cleomenes II",
"Christopher Lawford",
"Craig Lawford",
"Patricia Kennedy Lawford",
"John Templeton (botanist)",
"Joseph P. Kennedy Sr.",
"Arthur Beauchamp",
"Dorothy Hepworth"
]
},
"supporting_facts": {
"sent_id": [
0,
2
],
"title": [
"Patricia Kennedy Lawford",
"Joseph P. Kennedy Sr."
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: What is the place of birth of Patricia Kennedy Lawford's father?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Bostonia",
"The Hub of the Universe",
"Boston, MA",
"The Athens of America",
"The Walking City",
"The Hub",
"Boston, Massachusetts",
"The Cradle of Liberty",
"Boston",
"Boston, Mass.",
"The Cradle of Modern America",
"Beantown"
]
},
"style": "rule"
}
|
{
"index": 24,
"split": "test"
}
|
dev_8808
|
When is the director of film The Last Dragon 's birthday?
|
[
"November 10, 1938"
] |
{
"context": {
"content": [
[
"Brian Saunders( date of birth and death unknown) was a male weightlifter who competed for England."
],
[
"Peter Levin is an American director of film, television and theatre."
],
[
"Terence D. Robinson( date of birth and death unknown) was a male wrestler who competed for England."
],
[
"Etan Boritzer( born 1950) is an American writer of children ’s literature who is best known for his book\" What is God?\" first published in 1989.",
"His best selling\" What is?\" illustrated children's book series on character education and difficult subjects for children is a popular teaching guide for parents, teachers and child- life professionals.",
"Boritzer gained national critical acclaim after\" What is God?\" was published in 1989 although the book has caused controversy from religious fundamentalists for its universalist views.",
"The other current books in the\" What is?\" series include",
"What is Love?,",
"What is Death?,",
"What is Beautiful?,",
"What is Funny?,",
"What is Right?,",
"What is Peace?,",
"What is Money?,",
"What is Dreaming?,",
"What is a Friend?,",
"What is True?,",
"What is a Family?,",
"What is a Feeling?\"",
"The series is now also translated into 15 languages.",
"Boritzer was first published in 1963 at the age of 13 when he wrote an essay in his English class at Wade Junior High School in the Bronx, New York on the assassination of John F. Kennedy.",
"His essay was included in a special anthology by New York City public school children compiled and published by the New York City Department of Education.",
"Boritzer now lives in Venice, California and maintains his publishing office there also.",
"He has helped numerous other authors to get published through\" How to Get Your Book Published!\" programs.",
"Boritzer is also a yoga teacher who teaches regular classes locally and guest- teaches nationally.",
"He is also recognized nationally as an erudite speaker on\" The Teachings of the Buddha.\""
],
[
"Ian Barry is an Australian director of film and TV."
],
[
"Chris D'Lacey( born 15 December 1954) is an English writer of children's fiction, he is best known for writing\" The Last Dragon Chronicles\".",
"He has also written many other books including\" A Dark Inheritance\"."
],
[
"Taimak Guarriello( born June 27, 1964), a.k.a. Taimak, is an American martial arts champion, actor and stuntman best known for his lead role as Leroy Green in the 1985 Sony Pictures cult classic film\" The Last Dragon\"."
],
[
"Michael Schultz (born November 10, 1938) is an American director and producer of film and television."
],
[
"Theodred II was a medieval Bishop of Elmham.",
"The date of Theodred's consecration unknown, but the date of his death was sometime between 995 and 997."
],
[
"The Last Dragon (sometimes listed as Berry Gordy's \"The Last Dragon\") is a 1985 American martial arts comedy film produced by Rupert Hitzig for Berry Gordy and directed by Michael Schultz.",
"The film stars Taimak, Vanity, Julius J. Carry III, Chris Murney, Keshia Knight Pulliam, and Faith Prince.",
"Choreography was done by Lester Wilson and Lawrence Leritz.",
"The Last Dragon takes cues from blaxploitation, writer Venosta admitted his intention: “There really should be a young black kid superhero-type figure who kids can look up to.”",
"It was released in theatres by TriStar Pictures on March 22, 1985.",
"The film was a financial success despite a mixed reception by critics, and is considered a cult classic."
]
],
"title": [
"Brian Saunders (weightlifter)",
"Peter Levin",
"Terence Robinson",
"Etan Boritzer",
"Ian Barry (director)",
"Chris d'Lacey",
"Taimak",
"Michael Schultz",
"Theodred II (Bishop of Elmham)",
"The Last Dragon"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"The Last Dragon",
"Michael Schultz"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: When is the director of film The Last Dragon 's birthday?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"November 10, 1938"
]
},
"style": "rule"
}
|
{
"index": 25,
"split": "test"
}
|
dev_3492
|
Which country Luís Of Portugal, Duke Of Beja's mother is from?
|
[
"España",
"ES",
"ESP",
"Jettatore",
"🇪🇸",
"Kingdom of Spain",
"Spanish",
"Spain"
] |
{
"context": {
"content": [
[
"Afonso, Prince of Portugal was the first son of king John III of Portugal and his queen, Catherine of Austria.",
"He was the Prince of Portugal but died in the same year he was born, in 1526."
],
[
"Infante João of Viseu( c. 1448 1472) was the older son of Infante Fernando, Duke of Viseu and of his wife Infanta Beatrice of Portugal.",
"In 1470, following his father's death, he inherited the titles of Duke of Viseu and Duke of Beja, as well as the Lordships of Covilhã and Moura.",
"He also became Master of the Order of Christ and of the Order of Santiago, becoming also Constable of Portugal.",
"In July 1472, his uncle, King Afonso V granted him the Moroccan city of Anfa( in Portuguese\" Anafé\").",
"He died young and single, without issue.",
"His brother, Infante Diogo, inherited his titles and estates."
],
[
"Infante João, Duke of Beja( 16 March 1842 – 27 December 1861) was a Portuguese\" infante\"( prince) and member of the House of Braganza."
],
[
"Infante Luís of Portugal, Duke of Beja (3 March 1506, in Abrantes – 27 November 1555, in Marvila, in Lisbon) was the second son of King Manuel I of Portugal and his second wife Maria of Aragon (the third daughter of the Catholic Monarchs).",
"He participated in the Conquest of Tunis."
],
[
"D. Luís Filipe, Prince Royal of Portugal, Duke of Braganza,( 21 March 1887 – 1 February 1908) was the eldest son and heir- apparent of King Carlos I of Portugal.",
"He was born in 1887 when his father was still Prince Royal of Portugal and received the usual style of the heirs to the heir of the Portuguese crown: 4th Prince of Beira at birth, with the subsidiary title 14th Duke of Barcelos.",
"After his grandfather King Luís I of Portugal died, he became Prince Royal of Portugal with the subsidiary titles 21st Duke of Braganza, 20th Marquis of Vila Viçosa, 28th count of Barcelos, 25th count of Ourém, 23rd count of Arraiolos and 22nd count of Neiva."
],
[
"Infanta Beatriz of Portugal( 13 June 1430 – 30 September 1506) was a Portuguese infanta, daughter of John, Constable of Portugal( fourth son of King John I of Portugal and his wife Philippa of Lancaster) and Isabella of Barcelos a daughter of Afonso I, Duke of Braganza."
],
[
"Infante Francisco, Duke of Beja , (Lisbon, 25 May 1691 – Lisbon, 21 July 1742) was a Portuguese \"infante\" (prince) son of Peter II, King of Portugal, and his second wife, Maria Sofia of the Palatinate."
],
[
"Dona Maria Francisca of Savoy (Marie Françoise Élisabeth; 21 June 1646 – 27 December 1683) was twice queen consort of Portugal as the spouse of two Portuguese kings:",
"Afonso VI and Peter II of Portugal.",
"She first became queen of Portugal at the age of 20 on the day of her marriage to Afonso VI; because the marriage was never consummated, she was able to obtain an annulment.",
"On 28 March 1668, she married Afonso's brother, the Infante Peter, Duke of Beja, who was appointed prince regent the same year due to Afonso's perceived incompetence.",
"Maria Francisca became queen of Portugal for the second time when Peter succeeded his brother as Peter II in 1683 but died herself later that year."
],
[
"Maria of Aragon (29 June 1482 – 7 March 1517) was a Spanish infanta, and queen consort of Portugal as the second spouse of Portuguese King Manuel I."
],
[
"Ferdinand of Portugal, Duke of Guarda,( 5 June 1507 – 7 November 1534) was a Portuguese\" infante\"( prince), the son of King Manuel I of Portugal and his second wife, Maria of Aragon."
]
],
"title": [
"Afonso, Prince of Portugal (1526)",
"João, Duke of Viseu",
"Infante João, Duke of Beja",
"Luís of Portugal, Duke of Beja",
"Luís Filipe, Prince Royal of Portugal",
"Beatriz, Duchess of Viseu",
"Infante Francisco, Duke of Beja",
"Maria Francisca of Savoy",
"Maria of Aragon, Queen of Portugal",
"Ferdinand of Portugal, Duke of Guarda"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"Luís of Portugal, Duke of Beja",
"Maria of Aragon, Queen of Portugal"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which country Luís Of Portugal, Duke Of Beja's mother is from?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"España",
"ES",
"ESP",
"Jettatore",
"🇪🇸",
"Kingdom of Spain",
"Spanish",
"Spain"
]
},
"style": "rule"
}
|
{
"index": 26,
"split": "test"
}
|
dev_8288
|
When did Cristina Rodríguez (Noble)'s father die?
|
[
"10 July 1099"
] |
{
"context": {
"content": [
[
"Albert Thompson( born 1912, date of death unknown) was a Welsh footballer."
],
[
"Cristina Rodríguez (born c. 1075) was a daughter of El Cid and Jimena Díaz.",
"In 1099 or earlier, she married Ramiro Sánchez, and was mother of King García Ramírez of Navarre and Elvira Ramírez, who married before 1137 Rodrigo Gómez."
],
[
"Cristina Rodríguez Cabral( born 26 May 1959) is an Uruguayan poet, researcher, and Afro -Uruguayan activist."
],
[
"Thomas Scott( 1907- date of death unknown) was an English diver."
],
[
"Rodrigo Díaz de Vivar (10 July 1099) was a Castilian knight and warlord in medieval Spain.",
"The Moors called him El Cid , which meant \"the Lord\" (probably from the original Arabic al-sayyid, السَّيِّد), and the Christians, El Campeador, which stood for \"The Battlefielder,Outstanding Warrior,\" or \"The one who stands out in the battlefield\".",
"He was born in Vivar del Cid, a town near the city of Burgos.",
"After his death, he became Spain's celebrated national hero and the protagonist of the most significant medieval Spanish epic poem, \"El Cantar de Mio Cid\".",
"To this day, El Cid remains a Spanish popular folk-hero and national icon, with his life and deeds remembered in plays, films, folktales, songs, and video games."
],
[
"Etan Boritzer( born 1950) is an American writer of children ’s literature who is best known for his book\" What is God?\" first published in 1989.",
"His best selling\" What is?\" illustrated children's book series on character education and difficult subjects for children is a popular teaching guide for parents, teachers and child- life professionals.",
"Boritzer gained national critical acclaim after\" What is God?\" was published in 1989 although the book has caused controversy from religious fundamentalists for its universalist views.",
"The other current books in the\" What is?\" series include",
"What is Love?,",
"What is Death?,",
"What is Beautiful?,",
"What is Funny?,",
"What is Right?,",
"What is Peace?,",
"What is Money?,",
"What is Dreaming?,",
"What is a Friend?,",
"What is True?,",
"What is a Family?,",
"What is a Feeling?\"",
"The series is now also translated into 15 languages.",
"Boritzer was first published in 1963 at the age of 13 when he wrote an essay in his English class at Wade Junior High School in the Bronx, New York on the assassination of John F. Kennedy.",
"His essay was included in a special anthology by New York City public school children compiled and published by the New York City Department of Education.",
"Boritzer now lives in Venice, California and maintains his publishing office there also.",
"He has helped numerous other authors to get published through\" How to Get Your Book Published!\" programs.",
"Boritzer is also a yoga teacher who teaches regular classes locally and guest- teaches nationally.",
"He is also recognized nationally as an erudite speaker on\" The Teachings of the Buddha.\""
],
[
"Theodred II was a medieval Bishop of Elmham.",
"The date of Theodred's consecration unknown, but the date of his death was sometime between 995 and 997."
],
[
"Cristina Kahlo y Calderón( 1908- 1964) was the sister of artist Frida Kahlo.",
"Frida painted a portrait of Cristina, titled\" Portrait of Cristina, My Sister\", and Diego Rivera, Frida's husband, also portrayed Cristina Kahlo in his work.",
"Rivera painted Cristina in the nude, which hint towards the pair's affair."
],
[
"Harry Wainwright( born 1899; date of death unknown) was an English footballer."
],
[
"William Thomas Smith( born 9 April 1897, date of death unknown) was an English professional footballer."
]
],
"title": [
"Albert Thompson (footballer, born 1912)",
"Cristina Rodríguez (noble)",
"Cristina Rodríguez Cabral",
"Thomas Scott (diver)",
"El Cid",
"Etan Boritzer",
"Theodred II (Bishop of Elmham)",
"Cristina Kahlo",
"Harry Wainwright (footballer)",
"Bill Smith (footballer, born 1897)"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"Cristina Rodríguez (noble)",
"El Cid"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: When did Cristina Rodríguez (Noble)'s father die?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"10 July 1099"
]
},
"style": "rule"
}
|
{
"index": 27,
"split": "test"
}
|
dev_4345
|
Where did the director of film Quarantine 2: Terminal graduate from?
|
[
"Yale",
"Yale College",
"Yale University",
"Collegiate School"
] |
{
"context": {
"content": [
[
"Brian Patrick Kennedy( born 5 November 1961) is an Irish- born art museum director who has worked in Ireland and Australia, and now lives and works in the United States.",
"He is currently the director of the Peabody Essex Museum.",
"He was the director of the Toledo Museum of Art in Ohio from 2010 to 2019.",
"He was the director of the Hood Museum of Art from 2005 to 2010, and the National Gallery of Australia( Canberra) from 1997- 2004."
],
[
"Now Where Did the 7th Company Get To? is a 1973 French- Italian comedy war film directed by Robert Lamoureux.",
"The film portrays the adventures of a French Army squad lost somewhere on the front in May 1940 during the Battle of France."
],
[
"John Pogue is an American film writer, producer, and director.",
"He is an alumnus of Yale University."
],
[
"The Seventh Company Outdoors is a 1977 French comedy film directed by Robert Lamoureux.",
"It is a sequel to Now Where Did the 7th Company Get to ?."
],
[
"Quarantine 2:",
"Terminal is a 2011 American horror film and a sequel to the 2008 film, \"Quarantine\".",
"It was written and directed by John Pogue and produced by Marc Brienstock.",
"The film stars Mercedes Mason, Josh Cooke and Mattie Liptak and revolves around a mutated rabies infection outbreak in a quarantined airport and taking place the same night and nearly same time as the events of the first film.",
"Although the first film was a remake of the Spanish film \"RECQuarantine 2\" has no relation to any of the \"REC\" films and has an entirely different plot and setting."
],
[
"Dana Blankstein- Cohen( born March 3, 1981) is the director of the Israeli Academy of Film and Television.",
"She is a film director, and an Israeli culture entrepreneur."
],
[
"Jason Moore( born October 22, 1970) is an American director of film, theatre and television."
],
[
"Olav Aaraas( born 10 July 1950) is a Norwegian historian and museum director.",
"He was born in Fredrikstad.",
"From 1982 to 1993 he was the director of Sogn Folk Museum, from 1993 to 2010 he was the director of Maihaugen and from 2001 he has been the director of the Norwegian Museum of Cultural History.",
"In 2010 he was decorated with the Royal Norwegian Order of St. Olav."
],
[
"Peter Levin is an American director of film, television and theatre."
],
[
"Ian Barry is an Australian director of film and TV."
]
],
"title": [
"Brian Kennedy (gallery director)",
"Now Where Did the 7th Company Get to?",
"John Pogue",
"The Seventh Company Outdoors",
"Quarantine 2: Terminal",
"Dana Blankstein",
"Jason Moore (director)",
"Olav Aaraas",
"Peter Levin",
"Ian Barry (director)"
]
},
"supporting_facts": {
"sent_id": [
2,
1
],
"title": [
"Quarantine 2: Terminal",
"John Pogue"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Where did the director of film Quarantine 2: Terminal graduate from?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Yale",
"Yale College",
"Yale University",
"Collegiate School"
]
},
"style": "rule"
}
|
{
"index": 28,
"split": "test"
}
|
dev_2170
|
Which film has the director born later, Paris Interlude or Mystery, Alaska?
|
[
"Mystery, Alaska"
] |
{
"context": {
"content": [
[
"Ryan Northcott( born April 9, 1980) is a.",
"He appeared in the film Mystery, Alaska, Ripper, and played Carl Wilson in the Emmy nominated miniseries.",
"He is currently Co- Founder of MEDIAPOP Films and continues to appear in TV and Film."
],
[
"Mathew Jay Roach( born June 14, 1957) is an American film director, producer, and screenwriter, best known for directing the\" Austin Powers\" films and\" Meet the Parents\".",
"He also directed the comedy film\" Dinner for Schmucks\"( 2010), the HBO political dramas\" Recount\"( 2008),\" Game Change\"( 2012) and\" All the Way\"( 2016), and the political comedy\" The Campaign\"( 2012).",
"His twelfth film, the biographical drama\" Trumbo\", was released on November 6, 2015.",
"He produced these films under his Everyman Pictures banner."
],
[
"Mystery, Alaska is a 1999 comedy- drama film directed by Jay Roach about an amateur ice hockey team, from the fictional small- town of Mystery, that plays an exhibition game against the National Hockey League( NHL)'s New York Rangers.",
"It was shot in Banff National Park and in a\" town\" built for the purpose outside Canmore, Alberta."
],
[
"Edwin L. Marin (February 21, 1899 – May 2, 1951) was an American film director who directed 58 films between 1932 and 1951, working with Randolph Scott, Anna May Wong, John Wayne, Peter Lorre, George Raft, Bela Lugosi, Judy Garland, Eddie Cantor, and Hoagy Carmichael, among many others.",
"Marin was born in Jersey City, New Jersey, and died in Los Angeles, California.",
"He was married to actress Ann Morriss.",
"They had three children: Denis Anthony, Reese Andrew and Randi Alexandra.",
"He studied at the University of Pennsylvania and broke into the industry as an assistant director at MGM.",
"He was under contract to MGM for 15 years and RKO for five.",
"When he died he was under contract to Warner Bros."
],
[
"W. Augustus Barratt( 1873- 1947) was a Scottish- born, later American, songwriter and musician."
],
[
"Fred R. Krug is an American film and television producer- director born in Bern, Switzerland."
],
[
"Paris Interlude is a 1934 American drama film directed by Edwin L. Marin and written by Wells Root.",
"The film stars Madge Evans, Otto Kruger, Robert Young, Una Merkel, Ted Healy and Louise Henry.",
"The film was released on July 27, 1934, by Metro- Goldwyn- Mayer."
],
[
"Claude Weisz is a French film director born in Paris."
],
[
"Sepideh Farsi is an Iranian film director, born in Tehran in 1965."
],
[
"Jacques Décombe is a French author, actor and director born in 1953."
]
],
"title": [
"Ryan Northcott",
"Jay Roach",
"Mystery, Alaska",
"Edwin L. Marin",
"W. Augustus Barratt",
"Fred Roy Krug",
"Paris Interlude",
"Claude Weisz",
"Sepideh Farsi",
"Jacques Décombe"
]
},
"supporting_facts": {
"sent_id": [
0,
0,
0,
0
],
"title": [
"Paris Interlude",
"Mystery, Alaska",
"Edwin L. Marin",
"Jay Roach"
]
},
"type": "bridge_comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film has the director born later, Paris Interlude or Mystery, Alaska?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Mystery, Alaska"
]
},
"style": "rule"
}
|
{
"index": 29,
"split": "test"
}
|
dev_5718
|
Which country Behter's father is from?
|
[
"Khamag Mongol"
] |
{
"context": {
"content": [
[
"Behter or Bekter (?",
"- 1180) was the son of Yesugei, chief of the Kiyad clan, and a junior wife named Suchigu in some sources and Ko'agjin in others.",
"He was also half-brother of Genghis Khan, then known as Temujin.",
"On the death of Yesugei, Temujin, his mother Hoelun, his siblings and two half-brothers (including Behter) were abandoned by their tribe and left to fend for themselves.",
"Living off the land, they managed to survive.",
"However, the older half-brothers deprived 14-year-old Temujin and his brother Qasar of their spoils.",
"Temujin and Qasar stalked and killed Behter, for which they were scolded by their mother Hoelun.",
"Behter is an Orkhon Turkic word meaning \"armor\"."
],
[
"Khalid al- Habib also believed to use the nom de guerre Khalid al Harbi is an Al- Qaeda leader who has overseen Al- Qaeda operations in Afghanistan.",
"In 2005 it was reported that al- Habib died in a US missile strike on the Pakistani village of Damadola though it is unclear whether this is the case as Pakistani officials have claimed that no Al- Qaeda leaders died in the attack.",
"al- Habib's nationality is not clear with sources claiming he may be Egyptian or Moroccan."
],
[
"Roberto Savio( born in Rome, Italy, but also holding Argentine nationality) is a journalist, communication expert, political commentator, activist for social and climate justice and advocate of global governance.",
"He has spent most of his career with Inter Press Service( IPS), the news agency which he founded in 1964 along with Argentine journalist Pablo Piacentini.",
"Savio studied Economics at the University of Parma, followed by post -graduate courses in Development Economics under Gunnar Myrdal, History of Art and International Law in Rome.",
"He started his professional career as a research assistant in International Law at the University of Parma."
],
[
"Eleni Zaude Gabre- Madhin, an Ethiopian economist with Swiss nationality, is a former Chief Executive Officer of the Ethiopia Commodity Exchange( ECX).",
"She has had many years of experience working on agricultural markets – particularly in Sub- Saharan Africa – and has held senior positions in the World Bank, the International Food Policy Research Institute( Washington), and United Nations( Geneva)."
],
[
"Nayelly Hernández( born 23 February 1986) is a former Mexican female professional squash player.",
"She has represented Mexico internationally in several international competitions including the Central American and Caribbean Games, Pan American Games, Women's World Team Squash Championships.",
"Nayelly achieved her highest career ranking of 57 in October 2011 during the 2011 PSA World Tour.",
"Her husband Chris Walker whose nationality is English is also a professional squash player.",
"She joined the Trinity College in 2008 as the first Mexican female to join a US college for squash and graduated in 2010."
],
[
"Saint Rumbold( or\" RumoldRomuold\") was an Irish or Scottish Christian missionary, although his true nationality is not known for certain.",
"He was martyred near Mechelen by two men, whom he had denounced for their evil ways.",
"Saint Rumbold's feast day is celebrated by the Roman Catholic Church, and Western Rite Orthodox Churches, on 24 June; and it is celebrated in Ireland on 3 July.",
"He is the patron saint of Mechelen, where St. Rumbold's Cathedral possesses an elaborate golden shrine on its high altar, containing relics attributed to the saint.",
"It is rumoured that his remains are buried inside the cathedral.",
"Twenty- five paintings in the choir illustrate his life."
],
[
"Baglan Mailybayev( in Kazakh Бағлан Асаубайұлы Майлыбаев) was born on 20 May 1975 in Zhambyl region, Kazakhstan.",
"His nationality is Kazakh.",
"He is a politician of the Republic of Kazakhstan, Deputy Head of the Presidential Administration of the Republic of Kazakhstan, Doctor of Law( 2002)( under the supervision of Professor Zimanov S.Z.- scientific advisor and academician of National Academy of Sciences of the Republic of Kazakhstan) and Ph.D. in Political Science( 1998)."
],
[
"Ali Khalifa Rahuma( born May 16, 1982) is a Libyan football midfielder who's nationality is Libyan.",
"He currently plays for Al- Ittihad, and is a member of the Libya national football team."
],
[
"Yesugei Baghatur or Yesükhei",
"(Modern Mongolian: Есүхэй баатар, \"Yesukhei baatar\"), was a major chief of the Khamag Mongol confederation and the father of Temüjin, later known as Genghis Khan.",
"He was of Borjigin family, and his name literally means \"like nine\", meaning he had the auspicious qualities of the number nine, a lucky number to the Mongols."
],
[
"Fulco was the first known missionary Bishop of Estonia.",
"He was appointed in 1165 by Eskil, the Danish Archbishop of Lund.",
"Before his appointment, Fulco was a Benedictine monk in the abbey of Moutier- la- Celle, near Troyes in France.",
"His nationality is not known.",
"After his appointment, Fulco appears in sources only once.",
"In 1171, Pope Alexander III asked the Archbishop of Trondheim to assign an Estonian monk Nicolaus living in Stavanger to go to Fulco's assistance.",
"No further information survives about Fulco's work in Estonia, or whether he ever even got there."
]
],
"title": [
"Behter",
"Khalid al-Habib",
"Roberto Savio",
"Eleni Gabre-Madhin",
"Nayelly Hernández",
"Rumbold of Mechelen",
"Baglan Mailybayev",
"Ali Rahuma",
"Yesugei",
"Fulco"
]
},
"supporting_facts": {
"sent_id": [
1,
1
],
"title": [
"Behter",
"Yesugei"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which country Behter's father is from?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Khamag Mongol"
]
},
"style": "rule"
}
|
{
"index": 30,
"split": "test"
}
|
dev_1127
|
Which film has the director died later, The True Story Of Eskimo Nell or Flame Of Stamboul?
|
[
"The True Story Of Eskimo Nell",
"Dick Down Under",
"True Story of Eskimo Nell",
"The True Story of Eskimo Nell"
] |
{
"context": {
"content": [
[
"Secret of Stamboul, also known as The Spy in White, is a 1936 British thriller film, taken from the novel\" The Eunuch of Stamboul\" by Dennis Wheatley, directed by Andrew Marton and starring Valerie Hobson, James Mason and Frank Vosper.",
"It was made at Shepperton Studios.",
"The screenplay concerns a British agent who tries to thwart a revolution."
],
[
"Eskimo Nell, also known as\" The Ballad of Eskimo Nell\" and as\" The Sexy Saga of Naughty Nell and Big Dick\", is a 1975 British film directed by Martin Campbell and produced by Stanley Long.",
"Though inspired by\" The Ballad of Eskimo Nell\", the movie owes little to the original bawdy song."
],
[
"Trópico de Sangre is a 2010 drama film based on the true story of the Dominican Republic's historic Mirabal sisters."
],
[
"Flame of Stamboul is a 1951 American thriller film directed by Ray Nazarro and starring Richard Denning, Lisa Ferraday and Norman Lloyd.",
"The film's sets were designed by the art director Cary Odell."
],
[
"The Rose of Stamboul( German: Die Rose von Stambul) is a 1919 German silent comedy film directed by Felix Basch and Arthur Wellin and starring Fritzi Massary, Gustav Botz and Ernst Pittschau.",
"It is based on the 1916 operetta\" The Rose of Stamboul.",
"The film's sets were designed by the art director Ernst Stern."
],
[
"Richard Franklin( 15 July 1948 – 11 July 2007) was an Australian film director."
],
[
"The True Story of Eskimo Nell( retitled Dick Down Under in the United Kingdom) is a 1975 Australian western comedy film produced, directed, and written by Richard Franklin, and starring Max Gillies as Deadeye Dick and Serge Lazareff as Mexico Pete.",
"The film was the first film produced by Richard Franklin.",
"Based on and inspired by the bawdy\" Ballad of Eskimo Nell\", which had actually been banned in Australia, with Deadeye Dick and Mexico Pete setting forth through the Australian Outback in search of the infamous prostitute, Eskimo Nell.",
"The film features large amounts of full frontal nudity which was one of the main attractions of the film at the time.",
"The film was part of the Australian New Wave, with most of the film being shot in Ballarat and some in Canada."
],
[
"The film is based upon the true story of Shindo Renmei, a terrorist organization composed of Japanese immigrants in Brazil, at the end of World War II."
],
[
"Ray Nazarro (September 25, 1902 – September 8, 1986) was an American film and television director, producer, and screenwriter.",
"Budd Boetticher called him a \"ten day picture guy\"."
],
[
"The Chameleon is a 2010 film directed by Jean- Paul Salomé, who co-wrote the screenplay with Natalie Carter.",
"The film is based upon the true story of Frédéric Bourdin who impersonated a missing child named Nicholas Barclay in San Antonio, Texas, in the 1990s.",
"Much of the true story was incorporated into the film although the years have been altered and the location was moved to Baton Rouge, Louisiana."
]
],
"title": [
"Secret of Stamboul",
"Eskimo Nell (film)",
"Trópico de Sangre",
"Flame of Stamboul",
"The Rose of Stamboul (1919 film)",
"Richard Franklin (director)",
"The True Story of Eskimo Nell",
"Dirty Hearts",
"Ray Nazarro",
"The Chameleon (2010 film)"
]
},
"supporting_facts": {
"sent_id": [
0,
0,
0,
0
],
"title": [
"The True Story of Eskimo Nell",
"Flame of Stamboul",
"Richard Franklin (director)",
"Ray Nazarro"
]
},
"type": "bridge_comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film has the director died later, The True Story Of Eskimo Nell or Flame Of Stamboul?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"The True Story Of Eskimo Nell",
"Dick Down Under",
"True Story of Eskimo Nell",
"The True Story of Eskimo Nell"
]
},
"style": "rule"
}
|
{
"index": 31,
"split": "test"
}
|
dev_4002
|
Where was the place of death of Dmitri Sheremetev's father?
|
[
"Moscow, Russian Federation",
"City of Moscow",
"Moskva, Russia",
"Moscow",
"Moskva Federal City, Russia",
"Moscow, USSR",
"Moskva",
"Moscow, Russia",
"Moscow, Soviet Union",
"Москва",
"Moscow, Russian SFSR"
] |
{
"context": {
"content": [
[
"Motherland is the place of one's birth, the place of one's ancestors, or the place of origin of an ethnic group.",
"Motherland may also refer to:"
],
[
"Gennadiy Valyukevich (1 June 1958 – 29 December 2019) was a triple jumper who represented the USSR and later Belarus.",
"He won three medals at the European Indoor Championships.",
"He was the father of Dmitrij Valukevic, who currently represents Slovakia."
],
[
"Gennadiy Valyukevich (1 June 1958 – 29 December 2019) was a triple jumper who represented the USSR and later Belarus.",
"He won three medals at the European Indoor Championships.",
"He was the father of Dmitrij Valukevic, who currently represents Slovakia."
],
[
"Beaulieu- sur- Loire is a commune in the Loiret department in north -central France.",
"It is also the place of death of Jacques MacDonald, a French general who served in the Napoleonic Wars"
],
[
"The Ancient Egyptian artisan Sennedjem lived in Set Maat( translated as\" The Place of Truth\"), contemporary Deir el- Medina, on the west bank of the Nile, opposite Thebes, during the reigns of Seti I and Ramesses II.",
"Sennedjem had the title\" Servant in the Place of Truth\".",
"He was buried along with his wife, Iyneferti, and family in a tomb in the village necropolis.",
"His tomb was discovered January 31, 1886.",
"When Sennedjem's tomb was found, in it there was regular furniture from his home, including a stool and a bed, which he actually used when he was alive.",
"His titles included Servant in the Place of Truth, meaning that he worked on the excavation and decoration of the nearby royal tombs."
],
[
"The place of birth( POB) or birthplace is the place where a person was born.",
"This place is often used in legal documents, together with name and date of birth, to uniquely identify a person.",
"As a general rule with respect to passports, the place of birth is determined to be the country that currently has\" sovereignty\" over the actual place of birth, regardless of when the birth actually occurred.",
"The place of birth is not necessarily the place where the parents of the new baby live.",
"If the baby is born in a hospital in another place, that place is the place of birth.",
"In many countries, this also means that the government requires that the birth of the new baby is registered in the place of birth.",
"Some countries place less or no importance on the place of birth, instead using alternative geographical characteristics for the purpose of identity documents.",
"For example, Sweden has used the concept of\" födelsehemortdomicile of birth\") since 1947.",
"This means that the domicile of the baby's mother is the registered place of birth.",
"The location of the maternity ward or other physical birthplace is considered unimportant.",
"Similarly, Switzerland uses the concept of\" Heimatortlieu d'origineluogo d'origineplace of origin\" in German, French and Italian respectively).",
"A child born to Swiss parents is automatically assigned the place of origin of the parent with the same last name, so the child either gets their mother's or father's place of origin.",
"A child born to one Swiss parent and one foreign parent acquires the place of origin of their Swiss parent.",
"In a Swiss passport and identity card, the holder's place of origin is stated, not their place of birth.",
"In some countries( primarily in the Americas), the place of birth automatically determines the nationality of the baby, a practice often referred to by the Latin phrase\" jus soli\".",
"Almost all countries outside the Americas instead attribute nationality based on the nationality( ies) of the baby's parents( referred to as\" jus sanguinis\").",
"There can be some confusion regarding the place of birth if the birth takes place in an unusual way: when babies are born on an airplane or at sea, difficulties can arise.",
"The place of birth of such a person depends on the law of the countries involved, which include the nationality of the plane or ship, the nationality( ies) of the parents and/ or the location of the plane or ship( if the birth occurs in the territorial waters or airspace of a country).",
"Some administrative forms may request the applicant's\" country of birth\".",
"It is important to determine from the requester whether the information requested refers to the applicant's\" place of birth\" or\" nationality at birth\".",
"For US citizens born abroad who under the US Constitution acquire US citizenship at the time of birth, the nationality at birth will be USA( American), while the place of birth would be the country in which the actual birth takes place."
],
[
"Nikolai Petrovich Sheremetev (28 June 1751 - 2 January 1809",
"O.S., 9 July 1751 - 14 January 1809 N.S.) was a Russian count, the son of Petr Borisovich Sheremetev, notable grandee of the epoch of empresses Anna Ivanovna, Elizabeth Petrovna, and Catherine II.",
"He was also the grandson of Boris Petrovich Sheremetev.",
"His father P. B. Sheremetev was passionate about the theatre and transferred this passion to his son.",
"N. P. Sheremetev spent his early youth at court.",
"From the age of 13 to 14 he started to act in private theatricals of his father, and then \"on the big court theatre\".",
"In 1765 he played the role of the god Hymen in the mythological ballet \"",
"Acis and Galathea\", in which his childhood comrade, the future Paul I, had distinguished himself.",
"Having a special passion for music, Nikolai Petrovich masterfully played the cello.",
"From 1769 to 1773 he traveled abroad: he attended lectures in Leiden University in the Netherlands, traveled across England, Germany, and Switzerland, took music lessons, and got acquainted with theatre life.",
"In France he took a great interest in opéra comique.",
"Returning to Moscow and receiving the post of director of the Moscow bank for noblemen, Sheremetev started to reconstruct his father's theatre: he engaged in special education of serf children, \"certain to theatre\" (Russian letters, foreign languages, music, singing, dance, diction, refined manners).",
"Feeling extraordinary talent in one of his students, Parasha Kovaleva (Praskovya Ivanovna Kovaleva), he gave her more and more attention, preparing for her star career as the future \"Praskovya Zhemchugova\", whom he married in 1801.",
"Sheremetev's performances involved Moscow gentlefolk.",
"From a small private theatre of the count P. B. Sheremetev, it grew into a troupe capable of giving \"an opera and allegorical ballets\".",
"From the moment of his return from abroad Sheremetev not only watched closely all events of the Moscow theatre life, but on a regular basis he took out the troupe to performances of theatre Medox, and he invited lead actors of Peter theatre (theatre Medox) to teach serf actors.",
"In the beginning of 1790 Sheremetev decided to transfer the serf theatre from Kuskovo to Ostankino.",
"On July 22, 1795 the theatre was opened with the premiere of the heroic opera",
"\"The Capture of Izmail\"",
"(P. Potemkin's libretto, I. Kozlovski's music).",
"Sheremetev's theatre left far behind numerous other serf troupes (the only exception was the theatre of the Count A. R. Vorontsov).",
"In 1796 Sheremetev was made a senator, and in March he moved to St. Petersburg to Catherine II's court.",
"On November 6, 1796 Catherine died and the throne was inherited by Paul I who gave Sheremetev the title of marshal.",
"Sheremetev's life proceeded in St.",
"Petersburg, Pavlovsk, Gatchina; he was engaged in theatre much less.",
"Because of the incurable illness of Zhemchugova, who was now his wife, the Count closed the theatre in Ostankino, and on November 6, 1801 he secretly married her, having first found in an archive the facts testifying to her \"origin\" from the Polish noble clan of Kovalev.",
"The marriage was made public only after Zhemchugova's death (on February 23, 1803).",
"Sheremetev sent a letter to the tsar, informing him about his marriage with a woman \"whose origin incontestably had a noble beginning\" and about the birth of his son and heir.",
"Sheremetev died six years later, in 1809.",
"In his will he gave instructions for a very simple funeral, in a simple board coffin, with the money intended for a funeral distributed instead to the poor and monasteries."
],
[
"Dance of Death, also called\" Danse Macabre\", is a late- medieval allegory of the universality of death.",
"Dance of Death or The Dance of Death may also refer to:"
],
[
"\" Where Was I?\" may refer to:"
],
[
"Dmitri Nikolaevich Sheremetev (3 January 1803 in Saint Petersburg- 12 September 1871 in Kuskovo) was a Russian aristocratic, member of the Sheremetev family.",
"He was the only son of Nikolai Sheremetev and Praskovia Kovalyova-Zhemchugova and the father of Aleksandr Sheremetev."
]
],
"title": [
"Motherland (disambiguation)",
"Gennadiy Valyukevich",
"Gennadiy Valyukevich",
"Beaulieu-sur-Loire",
"Sennedjem",
"Place of birth",
"Nikolai Sheremetev",
"Dance of Death (disambiguation)",
"Where Was I",
"Dmitri Sheremetev"
]
},
"supporting_facts": {
"sent_id": [
1,
11
],
"title": [
"Dmitri Sheremetev",
"Nikolai Sheremetev"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Where was the place of death of Dmitri Sheremetev's father?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Moscow, Russian Federation",
"City of Moscow",
"Moskva, Russia",
"Moscow",
"Moskva Federal City, Russia",
"Moscow, USSR",
"Moskva",
"Moscow, Russia",
"Moscow, Soviet Union",
"Москва",
"Moscow, Russian SFSR"
]
},
"style": "rule"
}
|
{
"index": 32,
"split": "test"
}
|
dev_6054
|
What is the place of birth of the director of film The Grasp Of Greed?
|
[
"colony of New Brunswick",
"NB",
"province of New Brunswick",
"New Brunswick"
] |
{
"context": {
"content": [
[
"The Grasp of Greed is a 1916 American silent drama film directed by Joe De Grasse and starring Lon Chaney.",
"The plot concerns a marooned man's will tattooed on the back of a woman."
],
[
"S.N. Mathur was the Director of the Indian Intelligence Bureau between September 1975 and February 1980.",
"He was also the Director General of Police in Punjab."
],
[
"Peter Levin is an American director of film, television and theatre."
],
[
"Brian Patrick Kennedy( born 5 November 1961) is an Irish- born art museum director who has worked in Ireland and Australia, and now lives and works in the United States.",
"He is currently the director of the Peabody Essex Museum.",
"He was the director of the Toledo Museum of Art in Ohio from 2010 to 2019.",
"He was the director of the Hood Museum of Art from 2005 to 2010, and the National Gallery of Australia( Canberra) from 1997- 2004."
],
[
"John Greed( born 14 November 1966) is a contemporary jewelry designer and retailer.",
"He was born in Wimbledon, London and graduated from the School of Art and Design in London.",
"He later founded John Greed Design Ltd, which has since become a major UK retailer of jewellery."
],
[
"Olav Aaraas( born 10 July 1950) is a Norwegian historian and museum director.",
"He was born in Fredrikstad.",
"From 1982 to 1993 he was the director of Sogn Folk Museum, from 1993 to 2010 he was the director of Maihaugen and from 2001 he has been the director of the Norwegian Museum of Cultural History.",
"In 2010 he was decorated with the Royal Norwegian Order of St. Olav."
],
[
"House of Greed is a 1933 Soviet film directed by Aleksandr Ivanovsky."
],
[
"Ian Barry is an Australian director of film and TV."
],
[
"Joseph Louis De Grasse (May 4, 1873 – May 25, 1940) was a Canadian film director.",
"Born in Bathurst, New Brunswick, he was the elder brother of actor Sam De Grasse."
],
[
"Jesse Edward Hobson( May 2, 1911 – November 5, 1970) was the director of SRI International from 1947 to 1955.",
"Prior to SRI, he was the director of the Armour Research Foundation."
]
],
"title": [
"The Grasp of Greed",
"S. N. Mathur",
"Peter Levin",
"Brian Kennedy (gallery director)",
"John Greed",
"Olav Aaraas",
"House of Greed",
"Ian Barry (director)",
"Joseph De Grasse",
"Jesse E. Hobson"
]
},
"supporting_facts": {
"sent_id": [
0,
1
],
"title": [
"The Grasp of Greed",
"Joseph De Grasse"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: What is the place of birth of the director of film The Grasp Of Greed?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"colony of New Brunswick",
"NB",
"province of New Brunswick",
"New Brunswick"
]
},
"style": "rule"
}
|
{
"index": 33,
"split": "test"
}
|
dev_4669
|
What nationality is Sukhumala Marasri's father?
|
[
"THA",
"Kingdom of Thailand",
"Yothathep",
"Siam",
"Thailand",
"🇹🇭",
"th",
"Land of Smiles"
] |
{
"context": {
"content": [
[
"Sukhumala Marasri (10 May 1861 – 9 July 1927) was a daughter of King Mongkut (Rama IV) and his concubine, Samli (เจ้าคุณจอมมารดาสำลี).",
"Her given name was Princess Sukhumala Marasri (พระองค์เจ้าสุขุมาลมารศรี).",
"She was later one of the four consorts of her half-brother King Chulalongkorn (Rama V)."
],
[
"Mongkut, also known as King Rama IV, reigning title Phra Chom Klao Chao Yu Hua (18 October 18041 October 1868), was the fourth monarch of Siam (Thailand) under the House of Chakri, ruling from 1851 to 1868.",
"Outside Thailand, he is best known as the king in the 1951 musical and 1956 film \"The King and I\", based on the 1946 film \"Anna and the King of Siam\"in",
"turn based on a 1944 novel by an American missionary about Anna Leonowens' years at his court, from 1862 to 1867.",
"During his reign, the pressure of Western expansionism was felt for the first time in Siam.",
"Mongkut embraced Western innovations and initiated the modernization of his country, both in technology and culture—earning him the nickname \"The Father of Science and Technology\" in Siam.",
"Mongkut was also known for his appointing his brother, Prince Chutamani, as Second King, crowned in 1851 as King Pinklao.",
"Mongkut himself assured the country that Pinklao should be respected with equal honor to himself (as King Naresuan had done with his brother Ekathotsarot in 1583).",
"Mongkut's reign was also the time when the power of the House of Bunnag reached its zenith and became the most powerful noble family of Siam."
],
[
"Roberto Savio( born in Rome, Italy, but also holding Argentine nationality) is a journalist, communication expert, political commentator, activist for social and climate justice and advocate of global governance.",
"He has spent most of his career with Inter Press Service( IPS), the news agency which he founded in 1964 along with Argentine journalist Pablo Piacentini.",
"Savio studied Economics at the University of Parma, followed by post -graduate courses in Development Economics under Gunnar Myrdal, History of Art and International Law in Rome.",
"He started his professional career as a research assistant in International Law at the University of Parma."
],
[
"Baglan Mailybayev( in Kazakh Бағлан Асаубайұлы Майлыбаев) was born on 20 May 1975 in Zhambyl region, Kazakhstan.",
"His nationality is Kazakh.",
"He is a politician of the Republic of Kazakhstan, Deputy Head of the Presidential Administration of the Republic of Kazakhstan, Doctor of Law( 2002)( under the supervision of Professor Zimanov S.Z.- scientific advisor and academician of National Academy of Sciences of the Republic of Kazakhstan) and Ph.D. in Political Science( 1998)."
],
[
"Eleni Zaude Gabre- Madhin, an Ethiopian economist with Swiss nationality, is a former Chief Executive Officer of the Ethiopia Commodity Exchange( ECX).",
"She has had many years of experience working on agricultural markets – particularly in Sub- Saharan Africa – and has held senior positions in the World Bank, the International Food Policy Research Institute( Washington), and United Nations( Geneva)."
],
[
"Nayelly Hernández( born 23 February 1986) is a former Mexican female professional squash player.",
"She has represented Mexico internationally in several international competitions including the Central American and Caribbean Games, Pan American Games, Women's World Team Squash Championships.",
"Nayelly achieved her highest career ranking of 57 in October 2011 during the 2011 PSA World Tour.",
"Her husband Chris Walker whose nationality is English is also a professional squash player.",
"She joined the Trinity College in 2008 as the first Mexican female to join a US college for squash and graduated in 2010."
],
[
"Chumbhotbongs Paribatra, Prince of Nakhon Sawan II( 5 December 1904 – 15 September 1959) was the only son of Prince Paripatra to survive into adulthood.",
"He married Mom Ratchawong Pantip Devakula.",
"Their daughter is the artist, Princess Marsi Paribatra.",
"The only surviving male descendant of royal blood from Queen Sukhumala Marasri and an indirect first cousin of the late King Bhumibol Adulyadej( since his father and Prince Mahidol Adulyadej are half- brothers), Chumbhotpong Paripatra was a potential heir to the Thai throne according to the 1924 Palace Law of Succession.",
"In 1952, Prince Chumbhot converted his private residence into the Suan Pakkad Palace."
],
[
"Ali Khalifa Rahuma( born May 16, 1982) is a Libyan football midfielder who's nationality is Libyan.",
"He currently plays for Al- Ittihad, and is a member of the Libya national football team."
],
[
"Suddha Dibyaratana, Princess of Rattanakosin( 14 September 1877 – 2 January 1922), was the Princess of Siam( later Thailand).",
"She was a member of the Siamese Royal Family.",
"She was given the highest rank of any daughter of Chulalongkorn, King Rama V of Siam.",
"Her mother was Queen Sukhumala Marasri, fourth Queen Consort and half- sister of King Chulalongkorn.",
"She had a younger brother, Paribatra Sukhumbandhu, the Prince of Nakhon Sawan.",
"Her full given name was\" Suddha Dibyaratana Sukhumkhattiya Galyavadi\"(.",
"She was, if not, one of the most beautiful princess out of all the King's daughters.",
"On 9 August 1906, she was given the royal title of The Princess of Rattanakosin, translated in Thai as Krommaluang Si Rattanakosin.",
"She was the only daughter given the rank of\" Krommaluang\", the 3rd level of the\" Krom\" ranks."
],
[
"Khalid al- Habib also believed to use the nom de guerre Khalid al Harbi is an Al- Qaeda leader who has overseen Al- Qaeda operations in Afghanistan.",
"In 2005 it was reported that al- Habib died in a US missile strike on the Pakistani village of Damadola though it is unclear whether this is the case as Pakistani officials have claimed that no Al- Qaeda leaders died in the attack.",
"al- Habib's nationality is not clear with sources claiming he may be Egyptian or Moroccan."
]
],
"title": [
"Sukhumala Marasri",
"Mongkut",
"Roberto Savio",
"Baglan Mailybayev",
"Eleni Gabre-Madhin",
"Nayelly Hernández",
"Chumbhotbongs Paribatra",
"Ali Rahuma",
"Suddha Dibyaratana",
"Khalid al-Habib"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"Sukhumala Marasri",
"Mongkut"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: What nationality is Sukhumala Marasri's father?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"THA",
"Kingdom of Thailand",
"Yothathep",
"Siam",
"Thailand",
"🇹🇭",
"th",
"Land of Smiles"
]
},
"style": "rule"
}
|
{
"index": 34,
"split": "test"
}
|
dev_2584
|
Which country the director of film Vendetta (1986 Film) is from?
|
[
"Dominion of New Zealand",
"NZL",
"NZ",
"Aotearoa",
"nz",
"🇳🇿",
"New Zealand"
] |
{
"context": {
"content": [
[
"Bruce Logan (born 1938) is a New Zealand conservative Christian who has been involved in opposition to liberal social policies within his country for over two decades."
],
[
"Vendetta is a 1986 women in prison film directed by Bruce Logan and starring Karen Chase and Sandy Martin.",
"It was released in September 1986."
],
[
"John Donatich is the Director of Yale University Press."
],
[
"Olav Aaraas( born 10 July 1950) is a Norwegian historian and museum director.",
"He was born in Fredrikstad.",
"From 1982 to 1993 he was the director of Sogn Folk Museum, from 1993 to 2010 he was the director of Maihaugen and from 2001 he has been the director of the Norwegian Museum of Cultural History.",
"In 2010 he was decorated with the Royal Norwegian Order of St. Olav."
],
[
"Brian Patrick Kennedy( born 5 November 1961) is an Irish- born art museum director who has worked in Ireland and Australia, and now lives and works in the United States.",
"He is currently the director of the Peabody Essex Museum.",
"He was the director of the Toledo Museum of Art in Ohio from 2010 to 2019.",
"He was the director of the Hood Museum of Art from 2005 to 2010, and the National Gallery of Australia( Canberra) from 1997- 2004."
],
[
"John Farrell is the director of YouTube in Latin America."
],
[
"Peter Levin is an American director of film, television and theatre."
],
[
"Ian Barry is an Australian director of film and TV."
],
[
"Dana Blankstein- Cohen( born March 3, 1981) is the director of the Israeli Academy of Film and Television.",
"She is a film director, and an Israeli culture entrepreneur."
],
[
"Michael Govan( born 1963) is the director of the Los Angeles County Museum of Art since 2006.",
"Prior to this, Govan worked as the director of the Dia Art Foundation in New York City."
]
],
"title": [
"Bruce Logan",
"Vendetta (1986 film)",
"John Donatich",
"Olav Aaraas",
"Brian Kennedy (gallery director)",
"John Farrell (businessman)",
"Peter Levin",
"Ian Barry (director)",
"Dana Blankstein",
"Michael Govan"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"Vendetta (1986 film)",
"Bruce Logan"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which country the director of film Vendetta (1986 Film) is from?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Dominion of New Zealand",
"NZL",
"NZ",
"Aotearoa",
"nz",
"🇳🇿",
"New Zealand"
]
},
"style": "rule"
}
|
{
"index": 35,
"split": "test"
}
|
dev_7179
|
Which film was released more recently, Girl Slaves Of Morgana Le Fay or Strawberry Shortcake: Rockaberry Roll?
|
[
"Strawberry Shortcake: Rockaberry Roll"
] |
{
"context": {
"content": [
[
"Girl Slaves of Morgana",
"Le Fay is a 1971 French film by Bruno Gantillon, an erotic and parodic tale involving Morgan le Fay and a castle full of women in the French countryside.",
"Morgan rules the island Avalon as if it were Lesbos in a lesbian exploitative movie praised by critics for its setup and characters, and named a\" neglected Eurotrash classic\"."
],
[
"Strawberry Shortcake:",
"Let's Dance is a Direct- to- DVD feature that was released on October 2, 2007 by 20th Century Fox Home Entertainment, although it was given an early release by Kidtoon Films on September 1, 2007 in select Cinemas."
],
[
"Love Slaves of the Amazons is a 1957 adventure film, written and directed by Curt Siodmak."
],
[
"The Strawberry Shortcake Movie: Sky's the Limit (known as The Strawberry Shortcake Movie by viewers) is a 2009 American/Canadian/French CGI-animated direct-to-video film directed by Michael Hack and Mucci Fassett.",
"It serves as the pilot for Strawberry Shortcake's Berry Bitty Adventures, a series that aired one year after the film's release.",
"It also takes place after a pilot cartoon based on it."
],
[
"Strawberry Shortcake:",
"The Sweet Dreams Movie is a 2006 American computer- animated family fantasy adventure film produced by DIC Entertainment Corporation and released theatrically in select cities on October 7, 2006 by Kidtoon Films.",
"It was the first feature- length film to feature the eponymous American Greetings property, and stars the voices of Sarah Heinke, Rachel Ware, Nils Haaland, and Bridget Robbins.",
"The film also includes the first appearance of the villain, the Peculiar Pie Man, since Nelvana's Strawberry Shortcake specials from more than twenty years earlier."
],
[
"Strawberry Shortcake Meets the Berrykins is a 1985 animated television special from Canada's Nelvana Limited, and the last one until DiC's revival to feature the American Greetings character, Strawberry Shortcake (2003–2009).",
"In 1985, the special gained popularity in the United Kingdom when it was used as a theatrical featurette alongside \"The Care Bears Movie\", introducing the Strawberry Shortcake franchise to a British public who at the time were largely unaware of it."
],
[
"The film was released in Japan on 10 October 2009."
],
[
"Strawberry Shortcake: Rockaberry Roll is a Direct-to-DVD feature that was released on August 12, 2008 by 20th Century Fox Home Entertainment, and was later shown theaterically by Kidtoon Films in October 2008."
],
[
"Azaad Desh Ke Gulam(\" Slaves of a Free Nation\") is a 1990 Indian Hindi film released on 6 April, starring Jackie Shroff and Rekha.",
"The film was a remake of Tamil film\" Sudhanthira Naattin Adimaigal\".",
"The film was a hit."
],
[
"Slaves of Babylon is a 1953 American Technicolor adventure film directed by William Castle and starring Richard Conte and Linda Christian"
]
],
"title": [
"Girl Slaves of Morgana Le Fay",
"Strawberry Shortcake: Let's Dance",
"Love Slaves of the Amazons",
"The Strawberry Shortcake Movie: Sky's the Limit",
"Strawberry Shortcake: The Sweet Dreams Movie",
"Strawberry Shortcake Meets the Berrykins",
"The Wonderful World of Captain Kuhio",
"Strawberry Shortcake: Rockaberry Roll",
"Azaad Desh Ke Gulam",
"Slaves of Babylon"
]
},
"supporting_facts": {
"sent_id": [
1,
0
],
"title": [
"Girl Slaves of Morgana Le Fay",
"Strawberry Shortcake: Rockaberry Roll"
]
},
"type": "comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film was released more recently, Girl Slaves Of Morgana Le Fay or Strawberry Shortcake: Rockaberry Roll?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Strawberry Shortcake: Rockaberry Roll"
]
},
"style": "rule"
}
|
{
"index": 36,
"split": "test"
}
|
dev_8900
|
Which film was released more recently, Son Of France or Switchmas?
|
[
"Switchmas",
"All I Want Is Christmas"
] |
{
"context": {
"content": [
[
"Empress of France or Empress of the French may refer to:"
],
[
"Son of France( French: Fils de France) is a 1946 French drama film directed by Pierre Blondy and starring Jean Mercanton, Jimmy Gaillard and Jacques Famery.",
"It was made with the assistance of the French Army Film Service."
],
[
"Chroniques de France or Chronicles of France may refer to:"
],
[
"Isabella of France( c. 1295 – 1358), was the daughter of Philip IV of France and Joan I of Navarre, queen consort of Edward II of England and mother of Edward III of England.",
"Isabella of France( or Isabelle, etc.) may also refer to:"
],
[
"Switchmas( earlier titles Ira Finkelstein's Christmas and All I Want Is Christmas) is an American musical comedy film directed by Sue Corcoran.",
"It tells the story of a Christmas obsessed Jewish boy on his way to sunny Florida who figures out how to get the Christmas of his dreams by trading airline tickets and places with another boy on his way to snowy Christmastown, Washington.",
"This film also features CGI scenes based on Ira's Christmas dreams.",
"It stars Elliott Gould and David DeLuise and is directed and produced by Sue Corcoran.",
"It was originally filmed, in Leavenworth, Washington, under the title\" Ira Finkelstein's Christmas\".",
"At a local screening in 2012 it had the title\" All I want is Christmas\".",
"The film was included in the 2012 Seattle International Film Festival."
],
[
"Coney Island Baby is a 2003 comedy- drama in which film producer Amy Hobby made her directorial debut.",
"Karl Geary wrote the film and Tanya Ryno was the film's producer.",
"The music was composed by Ryan Shore.",
"The film was shot in Sligo, Ireland, which is known locally as\" Coney Island\".",
"The film was screened at the Newport International Film Festival.",
"Hobby won the Jury Award for\" Best First Time Director\".",
"The film made its premiere television broadcast on the Sundance Channel."
],
[
"Louis of France or Louis de France may refer to:"
],
[
"Sonnenallee is a railway station in the Neukölln district of Berlin.",
"It is served by the S- Bahn lines and and is located at the southeastern end of the major street of the same name, about which a film was produced in 1999."
],
[
"Mercœur may refer to the following places in France:",
"Or to the"
],
[
"The film was released in Japan on 10 October 2009."
]
],
"title": [
"Empress of France",
"Son of France",
"Chroniques de France",
"Isabella of France (disambiguation)",
"Switchmas",
"Coney Island Baby (film)",
"Louis of France",
"Berlin Sonnenallee station",
"Mercœur",
"The Wonderful World of Captain Kuhio"
]
},
"supporting_facts": {
"sent_id": [
0,
5
],
"title": [
"Son of France",
"Switchmas"
]
},
"type": "comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film was released more recently, Son Of France or Switchmas?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Switchmas",
"All I Want Is Christmas"
]
},
"style": "rule"
}
|
{
"index": 37,
"split": "test"
}
|
dev_11526
|
Which film has the director who was born later, The Brand Of Hate or Lady Stay Dead?
|
[
"Lady Stay Dead"
] |
{
"context": {
"content": [
[
"W. Augustus Barratt( 1873- 1947) was a Scottish- born, later American, songwriter and musician."
],
[
"Rumbi Katedza is a Zimbabwean Film Producer and Director who was born on 17 January 1974."
],
[
"The Brand of Hate is a 1934 American western film directed by Lewis D. Collins and starring Bob Steele, Lucile Browne and William Farnum."
],
[
"Bars of Hate is a 1935 American crime film directed by Albert Herman."
],
[
"Lady Stay Dead is a 1981 Australian thriller film directed by Terry Bourke about a psychotic handyman."
],
[
"Lewis D. Collins, often known as Lew Collins or Cullen Lewis( 12 January 1899 – 24 August 1954), was an American film director.",
"He is associated particularly with the western genre."
],
[
"Ben Palmer is a British film and television director who is known for being the director of\" Bo' Selecta\" and\" The Inbetweeners\"."
],
[
"Edward J. Yates( September 16, 1918 – June 2, 2006) was an American television director who was the director of the ABC television program\" American Bandstand\" from 1952 until 1969."
],
[
"Hassan Zee is a Pakistani- American film director who was born in Chakwal, Pakistan."
],
[
"Terry Christopher Bourke (Bairnsdale, Victoria 19 April 1940 – 29 June 2002) was an Australian journalist, screenwriter, producer and director.",
"He worked as a show business journalist and production assistant in Hong Kong for a number of years before returning to Australia in 1971.",
"He made several movies and also worked in television.",
"Phil Avalon called hum \"one of Australia's first maverick filmmakers\"."
]
],
"title": [
"W. Augustus Barratt",
"Rumbi Katedza",
"The Brand of Hate",
"Bars of Hate",
"Lady Stay Dead",
"Lewis D. Collins",
"Ben Palmer",
"Edward Yates",
"Hassan Zee",
"Terry Bourke"
]
},
"supporting_facts": {
"sent_id": [
0,
0,
0,
0
],
"title": [
"The Brand of Hate",
"Lady Stay Dead",
"Lewis D. Collins",
"Terry Bourke"
]
},
"type": "bridge_comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film has the director who was born later, The Brand Of Hate or Lady Stay Dead?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Lady Stay Dead"
]
},
"style": "rule"
}
|
{
"index": 38,
"split": "test"
}
|
dev_4956
|
Which film has the director who is older than the other, The Burning Moon or Ana And The Wolves?
|
[
"Ana And The Wolves"
] |
{
"context": {
"content": [
[
"Valley of the Wolves: Homeland is a film of the Turkish TV serial media franchise\" Valley of the Wolves\".",
"The story is about the Turkish coup d'état attempt."
],
[
"Haldun Boysan( born 6 July 1958) is a Turkish actor.",
"One of his notable works are Valley of the Wolves as Tombalacı Mehmet who's one of the councils of the Wolves."
],
[
"Carlos Saura Atarés( born 4 January 1932) is a Spanish film director, photographer and writer.",
"Along with Luis Buñuel and Pedro Almodóvar, he is counted among the Spain ’s three most renowned filmmakers.",
"He has a long and prolific career that spans over half a century.",
"Several of his films have won many international awards.",
"Saura began his career in 1955 making documentary shorts.",
"He quickly gained international prominence when his first feature- length film premiered at Cannes Film Festival in 1960.",
"Although he started filming as a neorealist, Saura quickly switched to films encoded with metaphors and symbolism in order to get around the Spanish censors.",
"In 1966, he was thrust into the international spotlight when his film\" La Caza\" won the Silver Bear at the Berlin International Film Festival.",
"In the following years, he forged an international reputation for his cinematic treatment of emotional and spiritual responses to repressive political conditions.",
"By the 1970s, Saura was the best known filmmaker working in Spain.",
"His films employed complex narrative devices and were frequently controversial.",
"He won Special Jury Awards for\" La Prima Angélica\"( 1973) and\" Cría Cuervos\"( 1975) in Cannes; and an Academy Award for Best Foreign Language Film nomination in 1979 for\" Mama Cumple 100 Años\".",
"In the 1980s, Saura was in the spotlight for his Flamenco trilogy –\" Bodas de SangreCarmen\" and\" El Amor Brujo,\" in which he combined dramatic content and flamenco dance forms.",
"His work continued to be featured in worldwide competitions and earned numerous awards.",
"He received two nominations for Academy Awards for Best Foreign Language Film, for\" Carmen\"( 1983) and\" Tango\"( 1998).",
"His films are sophisticated expression of time and space fusing reality with fantasy, past with present, and memory with hallucination.",
"In the last two decades of the 20th century, Saura has concentrated on works uniting music, dance and images."
],
[
"Ben Palmer is a British film and television director who is known for being the director of\" Bo' Selecta\" and\" The Inbetweeners\"."
],
[
"The Burning Moon is a 1992 German splatter film directed by Olaf Ittenbach.",
"The film depicts a juvenile delinquent reading two highly gruesome bedtime stories to his young sister, the first involving a blind date who turns out to be a serial killer and the second following a raping, murdering priest and the murder of the suspected killer by an angry townsperson, who is then sent to Hell and tortured."
],
[
"Olaf Ittenbach( born 1 April 1969) is a German horror movie director, actor and special effects artist."
],
[
"Ana and the Wolves is a 1973 Spanish drama film directed by Carlos Saura.",
"Starring Geraldine Chaplin as a foreign governess who comes to an isolated house to take care of the children of a convoluted family.",
"The film is encoded with political symbolism of Francoist Spain.",
"Saura ’s 1979",
"Mamá cumple cien años was a sequel of sorts.",
"It was entered into the 1973 Cannes Film Festival."
],
[
"Elliot Silverstein( born August 3, 1927, in Boston, Massachusetts) is an American director, who is best known for being the director of the movie\" Cat Ballou\"( 1965)."
],
[
"Ana María Vignoli or Ana Maria Vignoli( born 27 July 1945) is a Uruguayan former minister of Social Development."
],
[
"Mamá cumple cien años is a 1979 Spanish comedy film written and directed by Carlos Saura.",
"The film is a comedy sequel of the drama Ana and the Wolves directed by Saura in 1973.",
"It was nominated for the Academy Award for Best Foreign Language Film at the 52nd Academy Awards."
]
],
"title": [
"Valley of the Wolves: Homeland",
"Haldun Boysan",
"Carlos Saura",
"Ben Palmer",
"The Burning Moon",
"Olaf Ittenbach",
"Ana and the Wolves",
"Elliot Silverstein",
"Ana María Vignoli",
"Mama Turns 100"
]
},
"supporting_facts": {
"sent_id": [
0,
0,
0,
0
],
"title": [
"The Burning Moon",
"Ana and the Wolves",
"Olaf Ittenbach",
"Carlos Saura"
]
},
"type": "bridge_comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film has the director who is older than the other, The Burning Moon or Ana And The Wolves?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Ana And The Wolves"
]
},
"style": "rule"
}
|
{
"index": 39,
"split": "test"
}
|
dev_10021
|
Which film has the director who was born later, Three Kinds Of Heat or Stop The Wedding?
|
[
"Stop the Wedding",
"Stop The Wedding"
] |
{
"context": {
"content": [
[
"Edward J. Yates( September 16, 1918 – June 2, 2006) was an American television director who was the director of the ABC television program\" American Bandstand\" from 1952 until 1969."
],
[
"After the Wedding is a 1962 Soviet drama film directed by."
],
[
"Stop the Wedding is a 2016 American- Canadian made for television romantic comedy film directed by Anne Wheeler and starring Rachel Boston, Niall Matter and Alan Thicke.",
"It is an adaptation of the novel of the same name by Stephanie Bond.",
"The film was produced by the Hallmark Channel and premiered on their network in the United States on June 11, 2016, as part of their June wedding series.",
"The film earned Anne Wheeler a 2017 Leo Award, in the Best Direction( Television Film) category."
],
[
"Hassan Zee is a Pakistani- American film director who was born in Chakwal, Pakistan."
],
[
"Three Kinds of Heat is a 1987 American action film written and directed by Leslie Stevens and starring Robert Ginty, Victoria Barrett, Shakti Chen, Jeannie Brown, Leslie Clark and Malcolm Connell.",
"It was released on December 4, 1987, by The Cannon Group, Inc."
],
[
"W. Augustus Barratt( 1873- 1947) was a Scottish- born, later American, songwriter and musician."
],
[
"Rumbi Katedza is a Zimbabwean Film Producer and Director who was born on 17 January 1974."
],
[
"Anne Wheeler, OC( born September 23, 1946) is a Canadian film and television writer, producer, and director.",
"Graduating in Mathematics from the University of Alberta she was a computer programmer before traveling abroad.",
"Her years of travels inspired her to become a storyteller and when she returned she joined a group of old friends to form a film collective.",
"From 1975 to 1985 she worked for the NFB where she made her first feature film,\" A War Story\"( 1981), which was about her father, Ben Wheeler and his time as a doctor in a P.O.W. camp during World War II.",
"The war is a common theme in her work and she revisited it later in her films\" Bye Bye Blues\"( 1989) and\" The War Between Us\"( 1995).",
"Her first non-NFB film was\" Loyalties\" in 1986.",
"In addition to her films, Wheeler has directed episodes of\" Cold SquadDa Vinci's InquestMysterious WaysThis Is Wonderland\" and\" The Guard\".",
"Wheeler has been nominated four times for the Genie Award for Best Achievement in Direction for her films\" Loyalties\"( 1986),\" Cowboys Do n't Cry\"( 1988),\" Bye Bye Blues\"( 1989), and\" Suddenly Naked\"( 2001).",
"Her 1998 television miniseries,\" The Sleep Room\", won Gemini awards for best television movie and best direction.",
"In 2017 Wheeler won a Leo Award for Best Direction( Television Film) for the Hallmark movie\" Stop the Wedding\".",
"Wheeler was made an Officer of the Order of Canada in 1995.",
"In 2012 she received the Queen Elizabeth II Diamond Jubilee Medal.",
"Wheeler has also been awarded seven honorary doctorates."
],
[
"Leslie Clark Stevens IV( February 3, 1924 – April 24, 1998) was an American producer, writer, and director.",
"He created two television series for the ABC network,\" The Outer Limits\"( 1963 – 1965) and\" Stoney Burke\"( 1962 – 63), and\" Search\"( 1972 – 73) for NBC.",
"Stevens was the director of the horror film\" Incubus\"( 1966), which stars William Shatner, and was the second film to use the Esperanto language.",
"He wrote an early work of New Age philosophy,( 1970)."
],
[
"Ben Palmer is a British film and television director who is known for being the director of\" Bo' Selecta\" and\" The Inbetweeners\"."
]
],
"title": [
"Edward Yates",
"After the Wedding (1962 film)",
"Stop the Wedding",
"Hassan Zee",
"Three Kinds of Heat",
"W. Augustus Barratt",
"Rumbi Katedza",
"Anne Wheeler",
"Leslie Stevens",
"Ben Palmer"
]
},
"supporting_facts": {
"sent_id": [
0,
0,
0,
0
],
"title": [
"Three Kinds of Heat",
"Stop the Wedding",
"Leslie Stevens",
"Anne Wheeler"
]
},
"type": "bridge_comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film has the director who was born later, Three Kinds Of Heat or Stop The Wedding?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Stop the Wedding",
"Stop The Wedding"
]
},
"style": "rule"
}
|
{
"index": 40,
"split": "test"
}
|
dev_10714
|
What is the award that the director of film Chantilly Lace (Film) got?
|
[
"Emmy Awards",
"Emmys",
"Emmy Award"
] |
{
"context": {
"content": [
[
"John Donatich is the Director of Yale University Press."
],
[
"Olav Aaraas( born 10 July 1950) is a Norwegian historian and museum director.",
"He was born in Fredrikstad.",
"From 1982 to 1993 he was the director of Sogn Folk Museum, from 1993 to 2010 he was the director of Maihaugen and from 2001 he has been the director of the Norwegian Museum of Cultural History.",
"In 2010 he was decorated with the Royal Norwegian Order of St. Olav."
],
[
"Dana Blankstein- Cohen( born March 3, 1981) is the director of the Israeli Academy of Film and Television.",
"She is a film director, and an Israeli culture entrepreneur."
],
[
"Michael Govan( born 1963) is the director of the Los Angeles County Museum of Art since 2006.",
"Prior to this, Govan worked as the director of the Dia Art Foundation in New York City."
],
[
"Chantilly Lace is a 1993 American made-for-television drama film shot in Sundance, Utah, for the Showtime Network and eventually released on video via Columbia TriStar Home Entertainment.",
"The film was directed by Linda Yellen and features dialogue that was largely improvised by its ensemble cast."
],
[
"Peter Levin is an American director of film, television and theatre."
],
[
"Ian Barry is an Australian director of film and TV."
],
[
"Etan Boritzer( born 1950) is an American writer of children ’s literature who is best known for his book\" What is God?\" first published in 1989.",
"His best selling\" What is?\" illustrated children's book series on character education and difficult subjects for children is a popular teaching guide for parents, teachers and child- life professionals.",
"Boritzer gained national critical acclaim after\" What is God?\" was published in 1989 although the book has caused controversy from religious fundamentalists for its universalist views.",
"The other current books in the\" What is?\" series include",
"What is Love?,",
"What is Death?,",
"What is Beautiful?,",
"What is Funny?,",
"What is Right?,",
"What is Peace?,",
"What is Money?,",
"What is Dreaming?,",
"What is a Friend?,",
"What is True?,",
"What is a Family?,",
"What is a Feeling?\"",
"The series is now also translated into 15 languages.",
"Boritzer was first published in 1963 at the age of 13 when he wrote an essay in his English class at Wade Junior High School in the Bronx, New York on the assassination of John F. Kennedy.",
"His essay was included in a special anthology by New York City public school children compiled and published by the New York City Department of Education.",
"Boritzer now lives in Venice, California and maintains his publishing office there also.",
"He has helped numerous other authors to get published through\" How to Get Your Book Published!\" programs.",
"Boritzer is also a yoga teacher who teaches regular classes locally and guest- teaches nationally.",
"He is also recognized nationally as an erudite speaker on\" The Teachings of the Buddha.\""
],
[
"John Farrell is the director of YouTube in Latin America."
],
[
"Linda Yellen is an American director, producer and writer of film and television.",
"As a producer some of her credits are \"Playing for Time\" (1980), \"The Royal Romance of Charles and Diana\" (1982) and \"Second Serve\" (1986).",
"Some of her credits as a director include \"Northern Lights\" (1997), \"The Simian Line\" (2001), (2011) and \"The Last Film Festival\" (2011) which stars Dennis Hopper in his final acting role.",
"Check out information at https://www.facebook.com/TLFFMOVIE \"BLUSH\" (2014)",
"The untold story of a love triangle between Greta Garbo, Marlene Dietrich and Mercedes De Acosta",
"She is the producer/writer of \"Skinny and Cat\" (2014)",
"Linda's projects have received 2 Peabodys, 7 Emmys, including a Primetime Emmy Award as a producer for \"Playing for Time\" in 1981, 1 Silver Nymph and 2 Christophers.",
"She is an alumna of Barnard College and Columbia University.",
"Linda's films have been shown at Cannes, Sundance , New York, Hamptons, Monte Carlo, Aspen, Banff, Palm Springs, Nashville, Santa Fe International, and Deauville Film Festivals.",
"Additionally, Linda's projects have been nominated for a Golden Globe and a Writers Guild Award."
]
],
"title": [
"John Donatich",
"Olav Aaraas",
"Dana Blankstein",
"Michael Govan",
"Chantilly Lace (film)",
"Peter Levin",
"Ian Barry (director)",
"Etan Boritzer",
"John Farrell (businessman)",
"Linda Yellen"
]
},
"supporting_facts": {
"sent_id": [
1,
6
],
"title": [
"Chantilly Lace (film)",
"Linda Yellen"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: What is the award that the director of film Chantilly Lace (Film) got?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Emmy Awards",
"Emmys",
"Emmy Award"
]
},
"style": "rule"
}
|
{
"index": 41,
"split": "test"
}
|
dev_8666
|
Who is Prince Jean Of Luxembourg's mother?
|
[
"Princess Josephine Charlotte of Belgium",
"Princess Joséphine Charlotte of Belgium"
] |
{
"context": {
"content": [
[
"Prince Jean of Orléans, Duke of Guise( Jean Pierre Clément Marie; 4 September 1874 – 25 August 1940), was the third son and youngest child of Prince Robert, Duke of Chartres( 1840 – 1910), grandson of Prince Ferdinand Philippe and great- grandson of Louis Philippe I, King of the French.",
"His mother was Françoise of Orléans, daughter of François, Prince of Joinville, and Princess Francisca of Brazil."
],
[
"May Green Hinckley( May 1, 1881 – May 2, 1943) was the third general president of the Primary of The Church of Jesus Christ of Latter- day Saints( LDS Church) from 1940 until her death.",
"She was the stepmother of Gordon B. Hinckley, fifteenth president of the LDS Church."
],
[
"Dorothy Karolyn Granger( November 21, 1911 – January 4, 1995) was an American actress best known for her roles in short subject comedies in Hollywood.",
"She was also the stepmother of film maker and former record producer Anthony J. Hilder."
],
[
"Peju Ogunmola is a Yoruba film actress who stars in Nollywood movies of Yoruba genres.",
"Her father was the veteran actor Kola Ogunmola.",
"She is the wife of Sunday Omobolanle, a 64- year- old Nigerian comic actor, playwright, film director, and producer.",
"She is the stepmother of Sunkanmi Omobolanle who is also an actor."
],
[
"Jean (Jean Benoît Guillaume Robert Antoine Louis Marie Adolphe Marc d'Aviano; 5 January 1921 – 23 April 2019) was the Grand Duke of Luxembourg from 1964 until his abdication in 2000.",
"He was the first Grand Duke of Luxembourg of French agnatic descent.",
"Jean was the eldest son of Grand Duchess Charlotte and Prince Felix.",
"Jean's primary education was initially in Luxembourg, before attending Ampleforth College in England.",
"In 1938, he was officially named Hereditary Grand Duke as heir to the throne of Luxembourg.",
"While Luxembourg was occupied by Germans during the Second World War, the grand ducal family was abroad in exile.",
"Jean studied at the Université Laval in Quebec City.",
"Jean later volunteered to join the British army's Irish Guards in 1942, and after graduating from the Royal Military Academy Sandhurst, received his commission in 1943.",
"He participated in the Normandy landings and the Battle for Caen, and joined the Allied forces in the liberation of Luxembourg.",
"From 1984 until 2000, he was colonel of the Irish Guards.",
"On 9 April 1953, Jean married Princess Joséphine Charlotte of Belgium with whom he had five children.",
"On 12 November 1964, Grand Duchess Charlotte abdicated and Jean succeeded her as Grand Duke of Luxembourg.",
"He then reigned for 36 years before he himself abdicated on 7 October 2000 and was succeeded by his son, Grand Duke Henri."
],
[
"Edith Irene Bailey Baker( November 17, 1901 – April 2, 1994) was an American politician and a United States Representative from Tennessee.",
"She was the widow of Howard Baker Sr. and the stepmother of Howard Baker Jr."
],
[
"Jane Wigham( née Smeal; 1801–1888) was a leading Scottish abolitionist.",
"She was the stepmother of Eliza Wigham and the second wife of John Wigham."
],
[
"Princess Margaretha of Liechtenstein( born Princess Margaretha of Luxembourg on 15 May 1957) is the fourth child and second and youngest daughter of Grand Duke Jean of Luxembourg and Princess Joséphine- Charlotte of Belgium.",
"As the sister of Grand Duke Henri of Luxembourg and the sister- in- law of Prince Hans- Adam II of Liechtenstein, she is a princess of two current realms and a member of the Luxembourg and Liechtenstein reigning dynasties."
],
[
"Henriette Feuerbach( 13 August 1812 – 5 August 1892) was a German author and arts patron.",
"She was the wife of Joseph Anselm Feuerbach and the stepmother of painter Anselm Feuerbach, whom she supported in his art."
],
[
"Prince Jean of Luxembourg (given names: \"Jean Félix Marie Guillaume\"; born 15 May 1957 at Betzdorf Castle), the second son of Jean, Grand Duke of Luxembourg and Princess Joséphine-Charlotte of Belgium.",
"He is the twin brother of Princess Margaretha.",
"He frequently goes by the name of Jean Nassau.",
"On 26 September 1986, Prince Jean renounced his right of succession to the Luxembourg throne."
]
],
"title": [
"Prince Jean, Duke of Guise",
"May Green Hinckley",
"Dorothy Granger",
"Peju Ogunmola",
"Jean, Grand Duke of Luxembourg",
"Irene Baker",
"Jane Wigham",
"Princess Margaretha of Liechtenstein",
"Henriette Feuerbach",
"Prince Jean of Luxembourg"
]
},
"supporting_facts": {
"sent_id": [
0,
10
],
"title": [
"Prince Jean of Luxembourg",
"Jean, Grand Duke of Luxembourg"
]
},
"type": "inference"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Who is Prince Jean Of Luxembourg's mother?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Princess Josephine Charlotte of Belgium",
"Princess Joséphine Charlotte of Belgium"
]
},
"style": "rule"
}
|
{
"index": 42,
"split": "test"
}
|
dev_128
|
Who is the father of the director of film Red 2 (Film)?
|
[
"Aldo Parisot"
] |
{
"context": {
"content": [
[
"Aldo L. \"Dean\" Parisot is an American film and television director.",
"He won the 1988 Academy Award for Best Live Action Short Film for \"The Appointments of Dennis Jennings\", which was co-written and starred comedian Steven Wright, with whom he shares the award.",
"Among his television credits are episodes of \"Monk\" (including the two-hour pilot \"Mr. Monk and the CandidateNorthern Exposure\" and \"Curb Your Enthusiasm\".",
"Parisot was born in Wilton, Connecticut, to Ellen James (née Lewis), a painter and art teacher, and Aldo Parisot, a Brazilian-born, well-known cellist and pedagogue.",
"He graduated from New York University's Tisch School of the Arts.",
"He took part in the Sundance Institute's June Lab.\n, he was scheduled to direct the third installment of the Bill and Ted franchise.",
"It was confirmed in May 2018 that \"Bill & Ted Face the Music\" was currently in pre-production.",
"Dean Parisot was married to film editor Sally Menke until her death in 2010; the couple had two children."
],
[
"Mia Riverton is an American film actress and producer, best known for her role in the 2005 film\" Red Doors\"."
],
[
"Peter Hamel( 1911–1979) was a German screenwriter and a director of film and television.",
"He appeared as himself in the 1948 comedy\" Film Without a Title\".",
"He is the father of the composer Peter Michael Hamel."
],
[
"Ian Barry is an Australian director of film and TV."
],
[
"Red 2 is 2013 American action comedy film and sequel to the 2010 film \"Red\".",
"It was based on the limited comic book series of the same name, created by Warren Ellis and Cully Hamner, and published by the DC Comics imprint Homage.",
"The film stars Bruce Willis, John Malkovich, Mary-Louise Parker, Catherine Zeta-Jones, Lee Byung-hun, Anthony Hopkins, and Helen Mirren, with Dean Parisot directing a screenplay by Jon and Erich Hoeber.",
"\"Red 2\" was released on July 19, 2013."
],
[
"Paul Brooke( born 22 November 1944) is a retired English actor of film, television and radio.",
"He is the father of actor Tom Brooke."
],
[
"He was the father of Obata Masamori."
],
[
"Georgia Lee is an American independent film director.",
"Her work includes the 2006 film \"Red Doors\"."
],
[
"Viscount was the first Director of Railways in Japan and is known as the\" father of the Japanese railways\"."
],
[
"He is the father of the later Member of Parliament Anna Eliasson."
]
],
"title": [
"Dean Parisot",
"Mia Riverton",
"Peter Hamel",
"Ian Barry (director)",
"Red 2 (film)",
"Paul Brooke",
"Obata Toramori",
"Georgia Lee",
"Inoue Masaru (bureaucrat)",
"Lars Eliasson"
]
},
"supporting_facts": {
"sent_id": [
2,
3
],
"title": [
"Red 2 (film)",
"Dean Parisot"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Who is the father of the director of film Red 2 (Film)?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Aldo Parisot"
]
},
"style": "rule"
}
|
{
"index": 43,
"split": "test"
}
|
dev_10942
|
Who died earlier, Gunnar Kulldorff or Gordon Gibson Sr.?
|
[
"Gordon Gibson Sr.",
"Gordon Gibson, Sr."
] |
{
"context": {
"content": [
[
"Joshua Gibson",
"Jr.( August 11, 1930 – September 10, 2003) was a former African- American baseball infielder in the Negro Leagues.",
"He played in 1949 and 1950 for the Homestead Grays.",
"He also played with the Farnham Pirates in the Provincial League in 1951.",
"His father, Josh Gibson Sr., played in the Negro Leagues, and considered one of the greatest power hitters in baseball history."
],
[
"Gunnar Kulldorff (6 December 1927 – 25 June 2015) was a Swedish statistician, specializing in estimation theory, survey sampling and order statistics.",
"From 1989 to 1991, he was the president of the International Statistical Institute."
],
[
"Arthur Gordon Linkletter( born Arthur Gordon Kelly or Gordon Arthur Kelley( sources differ), July 17, 1912 – May 26, 2010) was a Canadian- born American radio and television personality.",
"He was the host of\" House Party\", which ran on CBS radio and television for 25 years, and\" People Are Funny\", which aired on NBC radio and television for 19 years.",
"He became a naturalized United States citizen in 1942.",
"One popular feature of his\" House Party\" program was the\" Kids Say the Darndest Things\" segments.",
"A series of books followed which contained the humorous comments made on- air by children."
],
[
"John Lambert Gibson( 7 March 1906 – 17 December 1986) was an independent member of the House of Commons of Canada.",
"He was a logger and lumber merchant by career.",
"Gibson was born in Vancouver, British Columbia.",
"With his brother, Gordon Gibson, Sr., he was involved with the logging- based company Gibson Brothers Industries.",
"John Gibson served as the company's president at one point.",
"He was first elected to Parliament at the Comox — Alberni riding in the 1945 general election at which he was classified as an Independent Liberal.",
"He was re-elected for a second term in 1949 as a purely independent member.",
"Gibson left federal office in 1953 and returned to Gibson Brothers Industries.",
"Gibson died on 17 December 1986 at Vancouver, aged 80."
],
[
"Jonathan Catlett Gibson Sr.( 1793 – Dec. 9, 1849) was a nineteenth- century Virginia farmer, lawyer, politician and War of 1812 veteran, whose five sons would fight for the Confederate States of America, including three sons who followed in his footsteps and became lawyers, of which two served in the Virginia House of Delegates and West Virginia House of Delegates."
],
[
"Daniel Gibson Sr.( born February 27, 1986) is a former American professional basketball player who played seven seasons in the National Basketball Association( NBA) for the Cleveland Cavaliers.",
"He was selected by the Cavaliers in the second round of the 2006 NBA draft."
],
[
"Gordon Gibson Sr.( November 28, 1904 – July 17, 1986) was a prominent business leader and politician in British Columbia.",
"He represented Lillooet from 1953 to 1955 and North Vancouver from 1960 to 1966 in the Legislative Assembly of British Columbia as a Liberal.",
"Gibson was a millionaire timber baron whose nickname was\" Bull of the Woods\" due to his loud lumberjack's voice.",
"He was dismissed as a rough, hard- drinking logger who had made it rich, but was loved by many small loggers as being one of the few people to be interested in them over the interests of big business."
],
[
"Gordon Gibson (1 November 1908 – 7 July 1967) was an Australian cricketer.",
"He played six first-class matches for Tasmania between 1928 and 1933."
],
[
"Gordon Gibson\" Gib\" Hutchinson( 25 December 1912 – 30 December 1996}) was a Canadian- British ice hockey goaltender who had a long and successful career in the English National League( ENL).",
"He was inducted into the British Ice Hockey Hall of Fame in 1951.",
"He married Audrey Wells, daughter of boxing champion Bombardier Billy Wells, in 1942."
],
[
"John Neil Bell (October 4, 1921 – August 17, 1998) was a Canadian football player who played for the Regina/Saskatchewan Roughriders from 1946 to 1952.",
"He played junior football for the Parkdale Lions.",
"In 1998, he was inducted into the Roughriders' Plaza of Honour.",
"He died earlier that year of a heart condition, aged 76."
]
],
"title": [
"Josh Gibson Jr.",
"Gunnar Kulldorff",
"Art Linkletter",
"John Lambert Gibson",
"Jonathan C. Gibson, Sr.",
"Daniel Gibson",
"Gordon Gibson Sr.",
"Gordon Gibson (cricketer)",
"Gib Hutchinson",
"Johnny Bell (Canadian football)"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"Gunnar Kulldorff",
"Gordon Gibson Sr."
]
},
"type": "comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Who died earlier, Gunnar Kulldorff or Gordon Gibson Sr.?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Gordon Gibson Sr.",
"Gordon Gibson, Sr."
]
},
"style": "rule"
}
|
{
"index": 44,
"split": "test"
}
|
dev_9086
|
Which film has the director who is older, Stradivari (Film) or Darby'S Rangers?
|
[
"Darby's Rangers",
"Darby'S Rangers"
] |
{
"context": {
"content": [
[
"Scott Fox is a pornographic film director who is a member of the AVN Hall of Fame."
],
[
"Elliot Silverstein( born August 3, 1927, in Boston, Massachusetts) is an American director, who is best known for being the director of the movie\" Cat Ballou\"( 1965)."
],
[
"William Augustus Wellman( February 29, 1896 – December 9, 1975) was an American film director notable for his work in crime, adventure and action genre films, often focusing on aviation themes, a particular passion.",
"He also directed several well- regarded satirical comedies.",
"Beginning his film career as an actor, he went on to direct over 80 films, at times co-credited as producer and consultant.",
"In 1927, Wellman famously directed\" Wings\", which became the first film to win an Academy Award for Best Picture at the 1st Academy Awards ceremony.",
"He also won the Academy Award for Best Story for his film\" A Star Is Born\"."
],
[
"Ben Palmer is a British film and television director who is known for being the director of\" Bo' Selecta\" and\" The Inbetweeners\"."
],
[
"Dan Milne is a British actor/ director who is possibly best known for his role in\" EastEnders\"."
],
[
"Darby's Rangers( released in the UK as The Young Invaders) is a 1958 war film starring James Garner as William Orlando Darby, who organized and led the first units of United States Army Rangers during World War II.",
"The movie was shot by Warner Brothers Studios in black and white to match wartime stock footage included in the production and was directed by William Wellman.",
"The film was based on the 1945 book\" Darby's Rangers: An Illustrated Portrayal of the Original Rangers\" by Major James J. Altieri, himself a veteran of Darby's force."
],
[
"William O. Darby",
"( 8 February 1911 – 30 April 1945) was a career United States Army officer who fought in World War II, where he was killed in action in Italy.",
"He was posthumously promoted to brigadier general.",
"Darby led the famous\" Darby's Rangers\", which evolved into the U.S. Army Rangers."
],
[
"Stradivari is a 1988 Italian biographical drama film directed by Giacomo Battiato.",
"It depicts real life events of luthier Antonio Stradivari.",
"It was filmed in sixteen weeks between Cremona( Stradivari's hometown) and Cinecittà."
],
[
"Giacomo Battiato( born 18 October 1943 in Verona) is an Italian film director and writer.",
"One guide to Italian cinema says he: Born in Verona and married to Anna Zaneva since 1999, he began his career in 1973 on Italian RAI TV and ten years later started in cinema with\" I paladini\".",
"His first book,\" Fuori dal cielo\"( 1996), won the Prize Domenico Rea.",
"He directed two non-fiction films about Pope John Paul II:(\" Karol, un uomo diventato Papa\") in 2005 and(\" Karol, un papa rimasto uomo\") in 2006.",
"He has staged several operas, including Verdi's\" Simon Boccanegra\" in Stuttgart, Germany, and Mozart's\" Così fan tutte\" in Naples, Italy.",
"He has also directed special programmes and documentaries, including:\" Expressionismus\", which won the MIFED Gran Prix; a portrait of Pierre Boulez; a film on Italian Design for the Museum of Modern Art, New York.",
"He won the Prix UNESCO at the International Festival of Art Film in Paris in 1993.",
"He has also directed award- winning commercials.",
"Such prizes include the One Show Gold Award from the Art Directors Club of New York, the Andy Award of Excellence from the Advertising Club in New York, and the Lion d' Or at 20th Cannes International Advertising Film Festival.",
"He has taught courses in film direction at the National School of Cinema in Rome."
],
[
"Logan Sandler is an American writer and director who is best known for his first feature film\" Live Cargo\"."
]
],
"title": [
"Scotty Fox",
"Elliot Silverstein",
"William A. Wellman",
"Ben Palmer",
"Dan Milne",
"Darby's Rangers",
"William Orlando Darby",
"Stradivari (film)",
"Giacomo Battiato",
"Logan Sandler"
]
},
"supporting_facts": {
"sent_id": [
0,
1,
0,
0
],
"title": [
"Stradivari (film)",
"Darby's Rangers",
"Giacomo Battiato",
"William A. Wellman"
]
},
"type": "bridge_comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film has the director who is older, Stradivari (Film) or Darby'S Rangers?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Darby's Rangers",
"Darby'S Rangers"
]
},
"style": "rule"
}
|
{
"index": 45,
"split": "test"
}
|
dev_4905
|
Where was the director of film The Girl With The Dragon Tattoo (2009 Film) born?
|
[
"Denmark proper",
"Danmark",
"TAN",
"DEK",
"DK",
"dk",
"metropolitan Denmark",
"DNK",
"DEN",
"Denmark",
"🇩🇰"
] |
{
"context": {
"content": [
[
"Timothy Miller( born October 10, 1964) is an American film director and visual effects artist.",
"He made his feature- film directing debut with\" Deadpool\"( 2016).",
"He was nominated for the Academy Award for Best Animated Short Film as co-story writer and executive producer of the short animated film\" Gopher Broke\"( 2004).",
"Miller also designed the title sequences of\" The Girl with the Dragon Tattoo\" and."
],
[
"Noomi Rapace( born 28 December 1979) is a Swedish actress.",
"She achieved international fame with her portrayal of Lisbeth Salander in the Swedish film adaptations of the\" Millennium\" series:\" The Girl with the Dragon Tattoo The Girl Who Played with Fire\", and\" The Girl Who Kicked the Hornets' Nest\".",
"In 2011, she was nominated for a BAFTA Award for Best Actress for her performance in\" The Girl with the Dragon Tattoo\", and for an International Emmy Award for Best Actress for the miniseries version of the trilogy.",
"She has also starred as Anna in\" Daisy Diamond\"( 2007), Leena in\" Beyond\"( 2010), Anna in\" The Monitor\"( 2011), Madame Simza Heron in( 2011), Dr. Elizabeth Shaw in\" Prometheus\"( 2012), Beatrice in\" Dead Man Down\"( 2013), Raisa Demidova in\" Child 44\"( 2015), and the seven lead roles in\" What Happened to Monday\"( 2017)."
],
[
"Lisbeth Salander is a fictional character created by Swedish author and journalist Stieg Larsson.",
"She is the lead character in Larsson's award- winning\" Millennium\" series, along with the journalist Mikael Blomkvist.",
"Salander first appeared in the 2005 novel\" The Girl with the Dragon Tattoo\"( original Swedish title,\" Män som hatar kvinnor\", literally\" Men who hate women\" in English).",
"She reappeared in its sequels:\" The Girl Who Played with Fire\"( 2006),\" The Girl Who Kicked the Hornets' Nest\"( 2007),\" The Girl in the Spider's Web\"( 2015),\" The Girl Who Takes an Eye for an Eye\"( 2017) and\" The Girl Who Lived Twice\"( 2019)."
],
[
"The Girl with the Dragon Tattoo (literally \"Men who hate women\") is a 2009 Swedish crime thriller film based on the novel of the same name by Swedish author/journalist Stieg Larsson.",
"It is the first book in the trilogy known as the \"Millennium\" series, published in Sweden in 2005.",
"By August 2009, it had been sold to 25 countries outside Scandinavia and had been seen by more than 6 million people in the countries where it was already released.",
"Directed by Niels Arden Oplev, the film stars Michael Nyqvist and Noomi Rapace."
],
[
"Bo Persson is a sound mixer.",
"On 24 January 2012 he was nominated for an Academy Award for the film\" The Girl with the Dragon Tattoo\"."
],
[
"Niels Arden Oplev (born 26 March 1961) is a writer-director from Denmark."
],
[
"Karl David Sebastian Dencik( born 31 October 1974) is a Swedish- Danish actor.",
"He has acted in both Swedish and Danish films, and has also had major roles in English- language films including\" Tinker, Tailor, Soldier, Spy\" and\" The Girl with the Dragon Tattoo\"."
],
[
"Yorick van Wageningen( born 16 April 1964) is a Dutch actor who has performed in Dutch and American films, including\" The Chronicles of Riddick\" and the 2011 remake of\" The Girl with the Dragon Tattoo\"."
],
[
"Lena Endre( born 8 July 1955) is a Swedish actress of film and television, primarily in the Swedish and Norwegian markets, known for her parts in the Liv Ullmann film\" Trolösa\"( 2000), and the\" Millennium series\" of films( e.g.,\" The Girl with the Dragon Tattoo\"), based on the eponymous trio of Stieg Larsson books.",
"Endre made her English- language debut in 2012, in Paul Thomas Anderson's movie\" The Master\", starring Joaquin Phoenix and Philip Seymour Hoffman."
],
[
"Jeffrey Scott Cronenweth,( born January 14, 1962) is an American cinematographer based in Los Angeles, California who is known for his role as the director of photography on the David Fincher films\" Fight Club The Social Network\", and\" The Girl with the Dragon Tattoo\"( the latter two earned him Academy Award for Best Cinematography nominations).",
"He graduated from the USC School of Cinematic Arts and was invited to join the cinematographers branch of the Academy of Motion Picture Arts and Sciences in 2004.",
"He is the son of Jordan Cronenweth, one of the most influential cinematographers in motion picture history."
]
],
"title": [
"Tim Miller (director)",
"Noomi Rapace",
"Lisbeth Salander",
"The Girl with the Dragon Tattoo (2009 film)",
"Bo Persson",
"Niels Arden Oplev",
"David Dencik",
"Yorick van Wageningen",
"Lena Endre",
"Jeff Cronenweth"
]
},
"supporting_facts": {
"sent_id": [
3,
0
],
"title": [
"The Girl with the Dragon Tattoo (2009 film)",
"Niels Arden Oplev"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Where was the director of film The Girl With The Dragon Tattoo (2009 Film) born?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Denmark proper",
"Danmark",
"TAN",
"DEK",
"DK",
"dk",
"metropolitan Denmark",
"DNK",
"DEN",
"Denmark",
"🇩🇰"
]
},
"style": "rule"
}
|
{
"index": 46,
"split": "test"
}
|
dev_10868
|
Are Transamerica Corporation and World Wonder Ring Stardom both located in the same country?
|
[
"no"
] |
{
"context": {
"content": [
[
"Beatrice St Clere Priestley( born 22 March 1996) is an English professional wrestler, best known by the ring name Bea Priestley.",
"She is currently signed to All Elite Wrestling( AEW) and World Wonder Ring Stardom.",
"She is best known for her appearances in World of Sport Wrestling, Defiant Wrestling and Progress Wrestling."
],
[
"World Wonder Ring Stardom, legal name and often referred to simply as Stardom( stylized as STARD ✪M), is a Japanese\" joshi puroresu\" or women's professional wrestling promotion, founded in September 2010 by former Hyper Visual Fighting Arsion and JDStar promoter Rossy Ogawa, retired wrestler, model and mixed martial artist Fuka Kakimoto and veteran professional wrestler Nanae Takahashi.",
"Stardom quickly became one of the top\" joshi puroresu\" promotions, largely thanks to gravure idol Yuzuki Aikawa becoming the promotion's public face.",
"Much like JDStar, Stardom also places heavy emphasis on the physical attractiveness of its workers, and publishes several modeling photobooks and calendars of its workers per year.",
"Stardom's in- ring style takes influence from mixed martial arts with many workers relying on kicks as the main part of their offense.",
"Since October 2019, Stardom has been owned by Bushiroad, making it a sister promotion to Bushiroad's other promotion New Japan Pro-Wrestling( NJPW)."
],
[
"Xia Brookside is an English professional wrestler and valet currently signed to WWE where she performs on the NXT UK brand.",
"The daughter of Robbie Brookside, she is best known for her appearances in World Wonder Ring Stardom and All Star Wrestling."
],
[
", known mononymously as Konami is a Japanese professional wrestler currently signed to World Wonder Ring Stardom, where she is a two time Artist of Stardom Champion, and currently one half of the reigning World Tag team champions with Jungle Kyona.",
"Takemoto was trained by Kana and Minoru Tanaka, making her debut in February 2015.",
"Konami spent her early career competing for independent promotions such as Pro Wrestling Wave, Reina and JWP, eventually signing to Reina.",
"After leaving the promotion in July 2016, she went on a short hiatus before returning to pro wrestling in November and debuting for World Wonder Ring Stardom in December 2016, going on to become a regular in the promotion before signing full time in May 2018."
],
[
"Paige Wooding( born 23 April 1995) is an English professional wrestler, best known by the ring name Jamie Hayter.",
"She is best known for her work on the independent circuit in England with various promotions including in World Wonder Ring Stardom,, Revolution Pro Wrestling( RevPro) where she is the former Rev Pro British Women's Champion and most recently has appeared on AEW Dynamite."
],
[
"Sadie Gibbs( born 30 March 1992) is an English professional wrestler and gymnast, currently signed to All Elite Wrestling( AEW).",
"She is best known for her appearances in World Wonder Ring Stardom,, The Wrestling League, WrestleGate Pro and British Empire Wrestling( BEW)."
],
[
"As a tarento she belonged to PBB, but now she is a freelancer.",
"Although she is freelance as a professional wrestler, at the time of her debut she belonged to World Wonder Ring Stardom with the ring name of."
],
[
"On January 29, 2012, Bushiroad announced that it had fully acquired New Japan Pro- Wrestling, a major professional wrestling promotion, for ¥ 500 million.",
"Seven years later, it acquired World Wonder Ring Stardom, a women's wrestling promotion.",
"At the 2012 Tokyo Game Show, Bushiroad announced Bushimo, a new social gaming platform for smartphones, which was released in Winter 2012.",
"In March 2013, Bushiroad announced the revival of the media franchise\" Neppu Kairiku Bushi Road\", with an anime television that aired on December 31, 2013 as a collaboration between Bushiroad, Bandai Visual, Nitroplus, and Kinema Citrus.",
"Bushiroad introduced a monthly magazine in September 2013 titled.",
"It features manga serializations of their various trading card games and other franchises.",
"Bushiroad created the media franchise\" BanG Dream!\" in January 2015, which consists of a musical group, several manga series and an anime television series.",
"It also created the media franchise\" Shōjo Kageki Revue Starlight\" in 2017, which consists of a musical and an anime television series."
],
[
"The Transamerica Corporation is an American holding company for various life insurance companies and investment firms operating primarily in the United States, offering life and supplemental health insurance, investments, and retirement services.",
"The company has major offices located in Baltimore, Maryland; Cedar Rapids, Iowa; Denver, Colorado; Exton, Pennsylvania; Harrison, New York; Johns Creek, Georgia; Little Rock, Arkansas; Plano, Texas; and St. Petersburg, Florida.",
"Additional affiliated offices are located throughout the United States.",
"In 1999, it became a subsidiary of Aegon, a European financial services company headquartered in The Hague, Netherlands.",
"Transamerica funds the Transamerica Institute, a nonprofit foundation which comprises the Transamerica Center for Retirement Studies and the Transamerica Center for Health Studies."
],
[
"Iroha was trained by Fuka Kakimoto and made her debut in April 2013 for World Wonder Ring Stardom.",
"She would leave the promotion in 2015, and joined Chigusa Nagayo's Marvelous promotion, where she competes to this day."
]
],
"title": [
"Bea Priestley",
"World Wonder Ring Stardom",
"Xia Brookside",
"Konami (wrestler)",
"Jamie Hayter",
"Sadie Gibbs",
"Haruka Kato",
"Bushiroad",
"Transamerica Corporation",
"Takumi Iroha"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"Transamerica Corporation",
"World Wonder Ring Stardom"
]
},
"type": "comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Are Transamerica Corporation and World Wonder Ring Stardom both located in the same country?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"no"
]
},
"style": "rule"
}
|
{
"index": 47,
"split": "test"
}
|
dev_1697
|
What is the date of death of the director of film Never Fear?
|
[
"3 August 1995"
] |
{
"context": {
"content": [
[
"Etan Boritzer( born 1950) is an American writer of children ’s literature who is best known for his book\" What is God?\" first published in 1989.",
"His best selling\" What is?\" illustrated children's book series on character education and difficult subjects for children is a popular teaching guide for parents, teachers and child- life professionals.",
"Boritzer gained national critical acclaim after\" What is God?\" was published in 1989 although the book has caused controversy from religious fundamentalists for its universalist views.",
"The other current books in the\" What is?\" series include",
"What is Love?,",
"What is Death?,",
"What is Beautiful?,",
"What is Funny?,",
"What is Right?,",
"What is Peace?,",
"What is Money?,",
"What is Dreaming?,",
"What is a Friend?,",
"What is True?,",
"What is a Family?,",
"What is a Feeling?\"",
"The series is now also translated into 15 languages.",
"Boritzer was first published in 1963 at the age of 13 when he wrote an essay in his English class at Wade Junior High School in the Bronx, New York on the assassination of John F. Kennedy.",
"His essay was included in a special anthology by New York City public school children compiled and published by the New York City Department of Education.",
"Boritzer now lives in Venice, California and maintains his publishing office there also.",
"He has helped numerous other authors to get published through\" How to Get Your Book Published!\" programs.",
"Boritzer is also a yoga teacher who teaches regular classes locally and guest- teaches nationally.",
"He is also recognized nationally as an erudite speaker on\" The Teachings of the Buddha.\""
],
[
"William Thomas Smith( born 9 April 1897, date of death unknown) was an English professional footballer."
],
[
"Never Fear 1949 is an American drama film directed and co-written by Ida Lupino, and produced by Lupino and Collier Young.",
"The film is also known as The Young Lovers."
],
[
"Theodred II was a medieval Bishop of Elmham.",
"The date of Theodred's consecration unknown, but the date of his death was sometime between 995 and 997."
],
[
"Ida Lupino (4 February 1918 – 3 August 1995) was an English-American actress, singer, director, and producer.",
"She is widely regarded as one of the most prominent, and one of the only, female filmmakers working during the 1950s in the Hollywood studio system.",
"With her independent production company, she co-wrote and co-produced several social-message films and became the first woman to direct a film noir with \"The Hitch-Hiker\" in 1953.",
"Throughout her 48-year career, she made acting appearances in 59 films and directed eight others, working primarily in the United States, where she became a citizen in 1948.",
"She also directed more than 100 episodes of television productions in a variety of genres including westerns, supernatural tales, situation comedies, murder mysteries, and gangster stories.",
"She was the only woman to direct episodes of the original \"The Twilight Zone\" series (most notably \"The Masks\"), as well as the only director to have starred in the show."
],
[
"Harry Wainwright( born 1899; date of death unknown) was an English footballer."
],
[
"Thomas Scott( 1907- date of death unknown) was an English diver."
],
[
"Ian Barry is an Australian director of film and TV."
],
[
"Peter Levin is an American director of film, television and theatre."
],
[
"Albert Thompson( born 1912, date of death unknown) was a Welsh footballer."
]
],
"title": [
"Etan Boritzer",
"Bill Smith (footballer, born 1897)",
"Never Fear",
"Theodred II (Bishop of Elmham)",
"Ida Lupino",
"Harry Wainwright (footballer)",
"Thomas Scott (diver)",
"Ian Barry (director)",
"Peter Levin",
"Albert Thompson (footballer, born 1912)"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"Never Fear",
"Ida Lupino"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: What is the date of death of the director of film Never Fear?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"3 August 1995"
]
},
"style": "rule"
}
|
{
"index": 48,
"split": "test"
}
|
dev_2200
|
What is the date of death of the director of film Law Of The Border?
|
[
"19 November 2011"
] |
{
"context": {
"content": [
[
"In trigonometry, the law of tangents is a statement about the relationship between the tangents of two angles of a triangle and the lengths of the opposing sides.",
"In Figure 1,,, and are the lengths of the three sides of the triangle, and,, and are the angles\" opposite\" those three respective sides.",
"The law of tangents states that The law of tangents, although not as commonly known as the law of sines or the law of cosines, is equivalent to the law of sines, and can be used in any case where two sides and the included angle, or two angles and a side, are known."
],
[
"The law of reciprocal proportions( also called law of equivalent proportions or law of permanent ratios) is one of the basic laws of stoichiometry.",
"It relates the proportions in which elements combine across a number of different elements.",
"It was first formulated by Jeremias Richter in 1791.",
"A simple statement of the law is :-",
"The acceptance of the law allowed tables of element equivalent weights to be drawn up.",
"These equivalent weights were widely used by chemists in the 19th century.",
"The other laws of stoichiometry are the law of definite proportions and the law of multiple proportions.",
"The law of definite proportions refers to the fixed composition of any compound formed between element A and element B.",
"The law of multiple proportions describes the stoichiometric relationship between two or more different compounds formed between element A and element B."
],
[
"Ewan Gordon McKendrick( born 1960) is Professor of English Private Law at the University of Oxford.",
"He is known for his academic work on the law of contract, as well as publications in the law of unjust enrichment and commercial law."
],
[
"In logic, the law of non-contradiction( LNC)( also known as the law of contradiction, principle of non-contradiction( PNC), or the principle of contradiction) states that contradictory propositions can not both be true in the same sense at the same time, e. g. the two propositions A is B and A is not B are mutually exclusive.",
"Formally this is expressed as the tautology ¬( p ∧ ¬p).",
"One reason to have this law is the principle of explosion, which states that anything follows from a contradiction.",
"The law is employed in a reductio ad absurdum proof.",
"To express the fact that the law is tenseless and to avoid equivocation, sometimes the law is amended to say\" contradictory propositions can not both be true' at the same time and in the same sense'\".",
"It is one of the so called three laws of thought, along with its complement, the law of excluded middle, and the law of identity.",
"The law of noncontradiction is logically equivalent to the law of excluded middle by De Morgan's laws.",
"However, no system of logic is built on just these laws, and none of these laws provide inference rules, such as modus ponens or De Morgan's laws.",
"The law of non contradiction and the law of excluded middle create a dichotomy in\" logical space\", wherein the two parts are\" mutually exclusive\" and\" jointly exhaustive\".",
"The law of non-contradiction is merely an expression of the mutually exclusive aspect of that dichotomy, and the law of excluded middle, an expression of its jointly exhaustive aspect."
],
[
"The Law of Five may refer to:"
],
[
"Law of the Border is a 1966 Turkish drama film, written by Yılmaz Güney and directed by Ömer Lütfi Akad."
],
[
"Etan Boritzer( born 1950) is an American writer of children ’s literature who is best known for his book\" What is God?\" first published in 1989.",
"His best selling\" What is?\" illustrated children's book series on character education and difficult subjects for children is a popular teaching guide for parents, teachers and child- life professionals.",
"Boritzer gained national critical acclaim after\" What is God?\" was published in 1989 although the book has caused controversy from religious fundamentalists for its universalist views.",
"The other current books in the\" What is?\" series include",
"What is Love?,",
"What is Death?,",
"What is Beautiful?,",
"What is Funny?,",
"What is Right?,",
"What is Peace?,",
"What is Money?,",
"What is Dreaming?,",
"What is a Friend?,",
"What is True?,",
"What is a Family?,",
"What is a Feeling?\"",
"The series is now also translated into 15 languages.",
"Boritzer was first published in 1963 at the age of 13 when he wrote an essay in his English class at Wade Junior High School in the Bronx, New York on the assassination of John F. Kennedy.",
"His essay was included in a special anthology by New York City public school children compiled and published by the New York City Department of Education.",
"Boritzer now lives in Venice, California and maintains his publishing office there also.",
"He has helped numerous other authors to get published through\" How to Get Your Book Published!\" programs.",
"Boritzer is also a yoga teacher who teaches regular classes locally and guest- teaches nationally.",
"He is also recognized nationally as an erudite speaker on\" The Teachings of the Buddha.\""
],
[
"Jean- Pierre Lévy( born 1935) is a French lawyer, author and diplomat.",
"He was the director of the United Nations Division for Ocean Affairs and the Law of the Sea in 1985."
],
[
"Theodred II was a medieval Bishop of Elmham.",
"The date of Theodred's consecration unknown, but the date of his death was sometime between 995 and 997."
],
[
"Ömer Lütfi Akad (2 September 1916 – 19 November 2011) was a Turkish film director, who directed movies from 1948 to 1974.",
"In 1949, he debuted as a film director with \"Vurun KahpeyeStrike the Whore\")",
"an adaptation of Halide Edip Adıvar's book of the same title.",
"He became one of the pioneers of the period in the \"Director Generation\".",
"The 1970s trilogy, \"The Bride; The Wedding; and The Sacrifice\", is considered his masterpiece.",
"Afterwards, he withdrew from movie making instead directing adaptations for TV.",
"Ömer Lütfi Akad was born on September 2, 1916.",
"Following his secondary education at French Jeanne d’Arc School and Galatasaray High School, he studied finance at Istanbul Economy and Commerce Higher School.",
"Beside his occupation as financial advisor at Sema Film company, he wrote articles on theatre and cinema.",
"After directing more than 100 movies, Ömer Lütfi Akad taught twenty years at the Mimar Sinan University of Fine Arts.",
"He died on November 19, 2011 at the age of 95 in Istanbul."
]
],
"title": [
"Law of tangents",
"Law of reciprocal proportions",
"Ewan McKendrick",
"Law of noncontradiction",
"Law of Five",
"Law of the Border",
"Etan Boritzer",
"Jean-Pierre Lévy",
"Theodred II (Bishop of Elmham)",
"Ömer Lütfi Akad"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"Law of the Border",
"Ömer Lütfi Akad"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: What is the date of death of the director of film Law Of The Border?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"19 November 2011"
]
},
"style": "rule"
}
|
{
"index": 49,
"split": "test"
}
|
dev_4333
|
What is the place of birth of the composer of song Bullet (Misfits Song)?
|
[
"Lodi",
"Lodi, NJ",
"Lodi, New Jersey"
] |
{
"context": {
"content": [
[
"\"Bullet\" is the second single released by the horror punk band the Misfits.",
"The four tracks comprising the EP were recorded, along with thirteen others, in early 1978 for the proposed \"Static Age\" album.",
"When the band could not find a record label to release the album, they instead released four of the songs as \"Bullet\" on singer Glenn Danzig's label Plan 9 Records.",
"The songs were re-released in different versions over subsequent years, until \"Static Age\" was finally released in its entirety in 1997."
],
[
"Alonso Mudarra( c. 1510 – April 1, 1580) was a Spanish composer of the Renaissance, and also played the vihuela, a guitar- shaped string instrument.",
"He was an innovative composer of instrumental music as well as songs, and was the composer of the earliest surviving music for the guitar."
],
[
"Alexandru Cristea( 1890 – 1942) was the composer of the music for\" Limba Noastră\", current national anthem of Moldova."
],
[
"Michelangelo Faggioli( 1666–1733) was an Italian lawyer and celebrated amateur composer of humorous cantatas in Neapolitan dialect.",
"A founder of a new genre of Neapolitan comedy, he was the composer of the opera buffa\" La Cilla\" in 1706."
],
[
"Glenn Danzig (born Glenn Allen Anzalone; June 23, 1955) is an American singer, songwriter, musician, and record producer from Lodi, New Jersey.",
"He is the founder of the bands Misfits, Samhain and Danzig.",
"He owns the Evilive record label as well as Verotik, an adult-oriented comic book publishing company.",
"Beginning in the mid-1970s, Danzig's musical career has encompassed a number of genres through the years, including punk rock, heavy metal, industrial, blues and classical music.",
"He has also written songs for other musicians, most notably Johnny Cash and Roy Orbison.",
"As a singer, he is noted for his baritone voice and tenor vocal range; his style has been compared to those of Elvis Presley, Jim Morrison, and Howlin' Wolf.",
"Danzig has also cited Bill Medley as a vocal influence."
],
[
"Tarcisio Fusco was an Italian composer of film scores.",
"He was the brother of the composer Giovanni Fusco and the uncle of operatic soprano Cecilia Fusco."
],
[
"Peter Dodds McCormick( 1834?30 October 1916), a Scottish- born schoolteacher, was the composer of the Australian national anthem\" Advance Australia Fair\"."
],
[
"Petrus de Domarto( fl. c. 1445– 1455) was a Franco- Flemish composer of the Renaissance.",
"He was a contemporary and probable acquaintance of Ockeghem, and was the composer of at least one of the first unified mass cycles to be written in continental Europe."
],
[
"Walter Robinson is an African American composer of the late 20th century.",
"He is most notable for his 1977 song\" Harriet Tubman\", which has been recorded by folk musicians such as Holly Near, John McCutcheon, and others.",
"He is also the composer of several operas."
],
[
"Alexander Mair\" Sandy\" Courage Jr.( December 10, 1919",
"May 15, 2008) was an American orchestrator, arranger, and composer of music, primarily for television and film.",
"He is best known as the composer of the theme music for the original\" Star Trek\" series."
]
],
"title": [
"Bullet (Misfits song)",
"Alonso Mudarra",
"Alexandru Cristea",
"Michelangelo Faggioli",
"Glenn Danzig",
"Tarcisio Fusco",
"Peter Dodds McCormick",
"Petrus de Domarto",
"Walter Robinson (composer)",
"Alexander Courage"
]
},
"supporting_facts": {
"sent_id": [
2,
0
],
"title": [
"Bullet (Misfits song)",
"Glenn Danzig"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: What is the place of birth of the composer of song Bullet (Misfits Song)?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Lodi",
"Lodi, NJ",
"Lodi, New Jersey"
]
},
"style": "rule"
}
|
{
"index": 50,
"split": "test"
}
|
dev_1891
|
Where was the director of film 33 Scenes From Life born?
|
[
"Cracovia",
"Krakau",
"Krakkó",
"Cracow",
"Kraków",
"Cracovie",
"Krako",
"Krakova",
"Krakov",
"Krakow",
"Kroke",
"Royal Capital City of Kraków",
"Royal City of Kraków",
"Krakiv",
"Krakuv",
"Krakoy"
] |
{
"context": {
"content": [
[
"33 Scenes From Life is a 2008 Polish film directed by Małgorzata Szumowska."
],
[
"Peter Levin is an American director of film, television and theatre."
],
[
"Brian Patrick Kennedy( born 5 November 1961) is an Irish- born art museum director who has worked in Ireland and Australia, and now lives and works in the United States.",
"He is currently the director of the Peabody Essex Museum.",
"He was the director of the Toledo Museum of Art in Ohio from 2010 to 2019.",
"He was the director of the Hood Museum of Art from 2005 to 2010, and the National Gallery of Australia( Canberra) from 1997- 2004."
],
[
"Olav Aaraas( born 10 July 1950) is a Norwegian historian and museum director.",
"He was born in Fredrikstad.",
"From 1982 to 1993 he was the director of Sogn Folk Museum, from 1993 to 2010 he was the director of Maihaugen and from 2001 he has been the director of the Norwegian Museum of Cultural History.",
"In 2010 he was decorated with the Royal Norwegian Order of St. Olav."
],
[
"Romà Ribera i Cirera( 13 December 1848, Barcelona- 29 May 1935, Barcelona) was a Catalan genre painter.",
"He specialized in contemporary scenes from upper-class social events, rendered in meticulous detail, but also did numerous scenes from life in the 17th and 18th centuries."
],
[
"Jeffry Life( born 1938) is a physician and author from the United States, known for maintaining a youthful body in old age."
],
[
"Ian Barry is an Australian director of film and TV."
],
[
"Jesse Edward Hobson( May 2, 1911 – November 5, 1970) was the director of SRI International from 1947 to 1955.",
"Prior to SRI, he was the director of the Armour Research Foundation."
],
[
"Emma Brownlow( 1832–1905) was a Victorian era artist who is best known for her paintings depicting scenes from life at the Foundling Hospital in London."
],
[
"Małgorzata Szumowska (born 26 February 1973) is a Polish film director, screenwriter and producer, born in Kraków.",
"Her 2013 film \"",
"In the Name Of\" received the Teddy Award for Best Feature at the 63rd Berlin International Film Festival, and received the Grand Prix at the 32nd annual Istanbul Film Festival.",
"Her film \"Body\" was selected to be screened in the main competition section of the 65th Berlin International Film Festival.",
"Szumowska won the Silver Bear for Best Director for \"Body\".",
"In 2018, she received Jury Grand Prix at the 68th Berlin International Film Festival for her film \"Mug\"."
]
],
"title": [
"33 Scenes from Life",
"Peter Levin",
"Brian Kennedy (gallery director)",
"Olav Aaraas",
"Romà Ribera",
"Jeffry Life",
"Ian Barry (director)",
"Jesse E. Hobson",
"Emma Brownlow",
"Małgorzata Szumowska"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"33 Scenes from Life",
"Małgorzata Szumowska"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Where was the director of film 33 Scenes From Life born?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Cracovia",
"Krakau",
"Krakkó",
"Cracow",
"Kraków",
"Cracovie",
"Krako",
"Krakova",
"Krakov",
"Krakow",
"Kroke",
"Royal Capital City of Kraków",
"Royal City of Kraków",
"Krakiv",
"Krakuv",
"Krakoy"
]
},
"style": "rule"
}
|
{
"index": 51,
"split": "test"
}
|
dev_1753
|
Who was born first out of Matteo Mancosu and Yoon Jong-Hoon?
|
[
"Yun Jong-hun",
"Yoon Jong-Hoon",
"Yoon Jong-hoon",
"윤종훈",
"尹仲勳"
] |
{
"context": {
"content": [
[
"Chang Jong- hoon( born April 10, 1968) is the current coach and former player of the Hanwha Eagles of the Korea Baseball Organization."
],
[
"Matteo Mancosu( born 22 December 1984) is an Italian footballer who plays for Virtus Entella as a forward.",
"A right- footed player, although he is usually deployed in the centre as a main striker, he is also capable of playing as a winger on either flank."
],
[
"Yoon Jong- hoon( born February 15, 1984) is a South Korean actor."
],
[
"Yoon Jong-bin( born December 20, 1979) is a South Korean film director."
],
[
"Park Jong- hoon( born 6 May 1965) is a South Korean former gymnast who competed in the 1988 Summer Olympics."
],
[
"Yoon Jong-shin( born October 15, 1969) is a South Korean singer- songwriter, record producer, and the CEO of Mystic Story."
],
[
"Henry Walter Moore( 1849 – 20 August 1916) was an English- born first- class cricketer who spent most of his life in New Zealand."
],
[
"Wale Adebanwi( born 1969) is a Nigerian- born first Black Rhodes Professor at St Antony's College, Oxford."
],
[
"Yoon Jong- Hwan( born 16 February 1973 in Gwangju, South Korea) is a South Korean manager and former football player."
],
[
"Yoon Jong- gyu( born 20 March 1998) is a South Korean football defender who plays for FC Seoul and the South Korea national under- 23 football team."
]
],
"title": [
"Chang Jong-hoon",
"Matteo Mancosu",
"Yoon Jong-hoon",
"Yoon Jong-bin",
"Park Jong-hoon",
"Yoon Jong-shin",
"Henry Moore (cricketer)",
"Wale Adebanwi",
"Yoon Jong-hwan",
"Yoon Jong-gyu"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"Matteo Mancosu",
"Yoon Jong-hoon"
]
},
"type": "comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Who was born first out of Matteo Mancosu and Yoon Jong-Hoon?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Yun Jong-hun",
"Yoon Jong-Hoon",
"Yoon Jong-hoon",
"윤종훈",
"尹仲勳"
]
},
"style": "rule"
}
|
{
"index": 52,
"split": "test"
}
|
dev_12163
|
Which film whose director is younger, Gypped In The Penthouse or A Summer'S Tale?
|
[
"A Summer'S Tale",
"A Summer's Tale",
"Conte d'été",
"Summer's Tale"
] |
{
"context": {
"content": [
[
"A Summer's Tale is a 1996 French romance film directed by Éric Rohmer.",
"It is the third film in his\" Contes des quatre saisons Tales of the Four Seasons\") series, which includes\" A Tale of Springtime\"( 1990),\" Conte d'étéAutumn Tale\"( 1998), and\" A Tale of Winter\"( 1992).",
"\" Conte d'été\" stars Melvil Poupaud, Amanda Langlet, Aurélia Nolin, and Gwenaëlle Simon.",
"The film was screened in the Un Certain Regard section at the 1996 Cannes Film Festival."
],
[
"Suzee Pai( born 8 August 1962) is a former actress and model and was the Penthouse Pet of the Month in January 1981."
],
[
"Miloš Zličić( born 29 December 1999) is a Serbian football forward.",
"He is a younger brother of Lazar Zličić."
],
[
"The Attic( also known as\" The Penthouse\") is a 1962 Italian comedy film directed by Gianni Puccini."
],
[
"Jean Marie Maurice Schérer or Maurice Henri Joseph Schérer, known as Éric Rohmer (21 March 192011 January 2010), was a French film director, film critic, journalist, novelist, screenwriter, and teacher.",
"Rohmer was the last of the post-World War II French New Wave directors to become established.",
"He edited the influential film journal, \"Cahiers du cinéma\", from 1957 to 1963, while most of his colleagues—among them Jean-Luc Godard and François Truffaut—were making the transition from film critics to filmmakers and gaining international attention.",
"Rohmer gained international acclaim around 1969 when his film \"My Night at Maud's\" was nominated at the Academy Awards.",
"He won the San Sebastián International Film Festival with \"Claire's Knee\" in 1971 and the Golden Lion at the Venice Film Festival for \"The Green Ray\" in 1986.",
"Rohmer went on to receive the Venice Film Festival's Career Golden Lion in 2001.",
"After Rohmer's death in 2010, his obituary in \"The Daily Telegraph\" described him as \"the most durable filmmaker of the French New Wave\", outlasting his peers and \"still making movies the public wanted to see\" late in his career."
],
[
"Jules White( born Julius Weiss; 17 September 190030 April 1985) was a Hungarian- born American film director and producer best known for his short- subject comedies starring The Three Stooges."
],
[
"Misty Stone is an American pornographic actress and nude model.",
"She was the\" Penthouse\" Pet of the Month for December 2014."
],
[
"La Bestia humana is a 1957 Argentine film whose story is based on the novel\" La Bête Humaine\" by the French writer Émile Zola."
],
[
"Gypped in the Penthouse is a 1955 short subject directed by Jules White starring American slapstick comedy team The Three Stooges( Moe Howard, Larry Fine and Shemp Howard).",
"It is the 161st entry in the series released by Columbia Pictures starring the comedians, who released 190 shorts for the studio between 1934 and 1959."
],
[
"Briana Banks( born 21 May 1978) is a German American pornographic actress and model.",
"She is the\" Penthouse\" Pet of the Month for June 2001."
]
],
"title": [
"A Summer's Tale",
"Suzee Pai",
"Miloš Zličić",
"The Attic (1962 film)",
"Éric Rohmer",
"Jules White",
"Misty Stone",
"La Bestia humana",
"Gypped in the Penthouse",
"Briana Banks"
]
},
"supporting_facts": {
"sent_id": [
0,
0,
0,
0
],
"title": [
"Gypped in the Penthouse",
"A Summer's Tale",
"Jules White",
"Éric Rohmer"
]
},
"type": "bridge_comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film whose director is younger, Gypped In The Penthouse or A Summer'S Tale?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"A Summer'S Tale",
"A Summer's Tale",
"Conte d'été",
"Summer's Tale"
]
},
"style": "rule"
}
|
{
"index": 53,
"split": "test"
}
|
dev_9064
|
Who is the father of the performer of song Don'T Touch My Hair?
|
[
"Mathew Knowles"
] |
{
"context": {
"content": [
[
"\"Don't Touch My Hair\" is the ninth track on American singer and songwriter",
"Solange's third studio album, \"A Seat at the Table\".",
"It was released by Saint Records and Columbia Records on September 30, 2016 with its music video being released the following week.",
"It was written by Knowles, Bryndon Cook, Sampha Sisay, Dave Andrew Sitek, and Patrick Wimberly."
],
[
"\" Please Do n't Touch\" is the debut single by English rock and roll group Johnny Kidd& the Pirates, released in 1959 reaching number 25 on the UK Singles Chart."
],
[
"Don't Touch Me may refer to:"
],
[
"\" Touch My Body\" is a song by American singer and songwriter Mariah Carey, released as the lead single from her eleventh studio album,\" E= MC ²\"( 2008).",
"Carey co-produced the song with Christopher\" Tricky\" Stewart, and they co-wrote it with Crystal\" Cri$tyle\" Johnson, and Terius\" The Dream\" Nash.",
"\" Touch My Body\" was released as the album's lead single on February 12, 2008.",
"The song's lyrics feature a double message, with the first describing sexual fantasies with her lover, while also jokingly warning him against recording or releasing information regarding their rendezvous.",
"The song received generally positive reviews from music critics, with many highlighting the song's light pop melody and hook; however, it also garnered some backlash from certain music reviewers, who felt the song did not properly represent the singer's acclaimed 5- octave vocal range.",
"\" Touch My Body\" became Carey's eighteenth chart topper on the\" Billboard\"",
"Hot 100, making Carey the solo artist with the most number one singles in United States history, surpassing the record held by Elvis Presley.",
"Additionally, it gave Carey her 79th week atop the chart, tying Presley for most weeks at number one.",
"Throughout Europe, the song also achieved high charting, peaking within the top five on the charts in Italy, Japan, New Zealand, Switzerland, and the United Kingdom.",
"Carey performed\" Touch My Body\" on several live televised events and programs, debuting it on\" Saturday Night Live\" as a two- piece performance.",
"Similarly,",
"Carey opened the\" Good Morning AmericaSummer Concert Series\" on April 25, 2008, singing the song, as well as two other singles from\" E= MC ²\".",
"Similar renditions took place at the season premiere party of\" The Hills\", as well as a mash- up of the song with\" I'll Be Lovin' U Long Time\" at the Teen Choice Awards.",
"Aside from reprising performances of the British programs\" The Sunday Night Project\" and\" The Paul O' Grady Show\", and German talent show\" Deutschland sucht den Superstar\", the song was included on the set- list of Carey's North American Angels Advocate Tour, held throughout 2009 and 2010.",
"The song's music video was directed by filmmaker Brett Ratner, who had previously worked with Carey on five other music videos.",
"It follows a story revolving around a computer employee's fantasy as he visits Carey's home.",
"As he fixes her computer, he enters a fantasy in which the pair perform several activities together, including pillow fights, laser tag, Guitar Hero, slot cars, and frisbee throwing, all while Carey shows off her figure in several revealing outfits.",
"The music video won in the category of\" Best Comedic Video\" at the 2008 BET Awards, and won the\" MTV Video Vanguard Award\" at the 2008 MTV Video Music Awards Japan.",
"Additionally, the video was nominated for\" Best Female Video\" at the 2008 MTV Video Music Awards."
],
[
"The Greatest Hits: Do n't Touch My Moustache is a greatest hits album released by rock band Hoobastank.",
"It was released in August 2009 by Universal Records in Japan.",
"It includes tracks from their first four studio albums, as well as non-album material.",
"The Japanese title is 「どういたしまして」(\" Douitashimashite\"), which is Japanese for\" You're welcome\" and has a similar pronunciation to the phrase\" Do n't touch my moustache\"."
],
[
"\" O Valencia!\" is the fifth single by the indie rock band",
"The Decemberists, and the first released from their fourth studio album,\" The Crane Wife\".",
"The music was written by The Decemberists and the lyrics by Colin Meloy.",
"It tells a story of two star- crossed lovers.",
"The singer falls in love with a person who belongs to an opposing gang.",
"At the end of the song, the singer's lover jumps in to defend the singer, who is confronting his lover's brother( the singer's\" sworn enemy\") and is killed by the bullet intended for the singer."
],
[
"\" Another Girl\" is a song by the English rock band the Beatles from their 1965 album\" Help!\" and included in the film of the same name.",
"The song was written by Paul McCartney but credited to Lennon – McCartney.",
"The song is addressed to the singer's girlfriend, who is informed that the singer has found\" another girl.\""
],
[
"Solange Piaget Knowles (born June 24, 1986), also known mononymously as Solange, is an American singer, songwriter, record producer, performance artist, and actress.",
"Expressing an interest in music from an early age, Knowles had several temporary stints as a backup dancer for Destiny's Child, which featured her elder sister, Beyoncé, among its members, before signing with her father Mathew Knowles's Music World Entertainment label.",
"At age 16, Knowles released her first studio album \"Solo Star\" (2002).",
"Between 2005 and 2007, Knowles had several minor acting roles, including the direct-to-video (2006) and continued writing music for Beyoncé and for former Destiny's Child group members Kelly Rowland and Michelle Williams.",
"In 2007, Knowles began to record music again.",
"Her second studio album \"Sol-Angel and the Hadley St. Dreams\" (2008) deviated from the pop-oriented music of her debut to the Motown Sound of the 1960s and 1970s.",
"It peaked at number nine on US \"Billboard\" 200 and received positive reviews from critics.",
"She followed this up with the 1980s pop and R&B–inspired extended play \"True\" (2012) on Terrible Records and her imprint Saint Records.",
"Her third studio album, \"A Seat at the Table\" (2016) was released to widespread critical acclaim, and became her first number-one album in the United States.",
"The album's lead single, \"Cranes in the Sky\" won the Grammy for Best R&B Performance.",
"Her fourth studio album, \"When I Get Home\", was released in 2019.",
"Knowles was heavily influenced by Motown girl groups and says that her first passion is songwriting.",
"She is frequently compared by the media to her sister, Beyoncé, but Solange insists they have different aspirations and are musically different.",
"She has been ranked by \"Billboard\" as the 100th most successful dance artist of all-time, and in 2017 was honored with the \"Impact Award\" at \"Billboard\" Women in Music.",
"Her other ventures include an endorsement deal with Rimmel London and a line of hip-hop-oriented merchandise for young children.",
"On November 16, 2014, Knowles married music video director Alan Ferguson in New Orleans, Louisiana."
],
[
"\" Panda\" is a song of the Chilean band Astro and is the fifth song of the homonymous album of the year 2011.",
"The song was composed and produced by the singer of the band Andrés Nusser and released like third single of the album on February 1, 2013."
],
[
"Emma Dabiri is an Irish model, BBC television presenter, visual sociology researcher, and teaching fellow in the Africa department at SOAS.",
"She has published in a number of anthologies, and is the author of the book,\" Do n't Touch My Hair\"."
]
],
"title": [
"Don't Touch My Hair",
"Please Don't Touch (song)",
"Don't Touch Me (disambiguation)",
"Touch My Body",
"The Greatest Hits: Don't Touch My Moustache",
"O Valencia!",
"Another Girl",
"Solange Knowles",
"Panda (Astro song)",
"Emma Dabiri"
]
},
"supporting_facts": {
"sent_id": [
1,
1
],
"title": [
"Don't Touch My Hair",
"Solange Knowles"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Who is the father of the performer of song Don'T Touch My Hair?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Mathew Knowles"
]
},
"style": "rule"
}
|
{
"index": 54,
"split": "test"
}
|
dev_2546
|
Which film has the director who died later, Paper Bullets or Karakolda Ayna Var?
|
[
"Karakolda Ayna Var"
] |
{
"context": {
"content": [
[
"Karakolda Ayna Var is a 1966 Turkish comedy film, directed by Halit Refiğ and starring Sadri Alisik, Fatma Girik, and Suphi Tekniker."
],
[
"Paper Bullets is a 1941 American film directed by Phil Rosen and starring Joan Woodbury.",
"It was the first film produced by the King Brothers, launching their career.",
"The film was re-released by Eagle Lion as Gangs, Inc. giving top billing to Alan Ladd, who has a supporting role."
],
[
"Ben Palmer is a British film and television director who is known for being the director of\" Bo' Selecta\" and\" The Inbetweeners\"."
],
[
"Philip E. Rosen( May 8, 1888 – October 22, 1951) was an American film director and cinematographer.",
"He directed 142 films between 1915 and 1949.",
"He was born in Marienburg, German Empire( now, Malbork, Poland), grew up in Machias, Maine, and died in Hollywood, California of a heart attack.",
"He was one of the founders of the American Society of Cinematographers.",
"Rosen was married to model and actress Joyzelle Joyner."
],
[
"Brian Patrick Kennedy( born 5 November 1961) is an Irish- born art museum director who has worked in Ireland and Australia, and now lives and works in the United States.",
"He is currently the director of the Peabody Essex Museum.",
"He was the director of the Toledo Museum of Art in Ohio from 2010 to 2019.",
"He was the director of the Hood Museum of Art from 2005 to 2010, and the National Gallery of Australia( Canberra) from 1997- 2004."
],
[
"Abhishek Saxena is an Indian Bollywood and Punjabi film director who directed the movie Phullu.",
"The Phullu movie was released in theaters on 16 June 2017, in which film Sharib Hashmi is the lead role.",
"Apart from these, he has also directed Patiala Dreamz, this is a Punjabi film.",
"This film was screened in cinemas in 2014."
],
[
"Elliot Silverstein( born August 3, 1927, in Boston, Massachusetts) is an American director, who is best known for being the director of the movie\" Cat Ballou\"( 1965)."
],
[
"Drew Esocoff( born c. 1957) is an American television sports director, who as of 2006 has been the director of NBC Sunday Night Football."
],
[
"G. Marthandan is an Indian film director who works in Malayalam cinema."
],
[
"Halit Refiğ (5 March 1934 – 11 October 2009) was a Turkish film director, film producer, screenwriter and writer.",
"He made around sixty films, including feature films, documentaries and TV serials.",
"He is considered to be the pioneer of the National Cinema movement and the initiator of the production of TV serials in Turkey."
]
],
"title": [
"Karakolda Ayna Var",
"Paper Bullets",
"Ben Palmer",
"Phil Rosen",
"Brian Kennedy (gallery director)",
"Abhishek Saxena",
"Elliot Silverstein",
"Drew Esocoff",
"G. Marthandan",
"Halit Refiğ"
]
},
"supporting_facts": {
"sent_id": [
0,
0,
0,
0
],
"title": [
"Paper Bullets",
"Karakolda Ayna Var",
"Phil Rosen",
"Halit Refiğ"
]
},
"type": "bridge_comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film has the director who died later, Paper Bullets or Karakolda Ayna Var?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Karakolda Ayna Var"
]
},
"style": "rule"
}
|
{
"index": 55,
"split": "test"
}
|
dev_4462
|
Which film came out first, P.K.P. (Pilsudski Bought Petliura) or Nalla Idathu Sammandham?
|
[
"P.K.P.",
"P.K.P. (Pilsudski Bought Petliura)"
] |
{
"context": {
"content": [
[
"P.K.P.( Pilsudski Bought Petliura) is a 1926 Soviet film directed by Aksel Lundin and Georgi Stabovoi."
],
[
"My Little Eye is a 2002 British horror film directed by Marc Evans about five adults who agree to spend six months together in an isolated mansion while being filmed at all times.",
"The idea for the film came from reality television shows such as\" Big Brother\".",
"The title refers to the guessing game I spy."
],
[
"La légion saute sur Kolwezi also known as\" Operation Leopard\" is a French war film directed by Raoul Coutard filmed in French Guiana.",
"The script is based on the true story of the Battle of Kolwezi that happened in 1978.",
"It was diligently described in a book of the same name by former 1 REP Captain Pierre Sergent.",
"He published his book in 1979; the film came out in 1980.",
"Raoul Coutard shot the film in a documentary style."
],
[
"The Night of Nights is a 1939 black- and- white drama film written by Donald Ogden Stewart and directed by Lewis Milestone for Paramount Pictures that starred Pat O' Brien, Olympe Bradna, and Roland Young.",
"The film received positive contemporary reviews from publications such as\" The New York Times\".",
"Director Milestone went on to other successful productions after the film came out, including\" Ocean's 11\" and\" Pork Chop Hill\".",
"The movie has no relation to the use of the phrase\" Night of Nights\" for the night of the Academy Awards ceremony, the album by musical group",
"The Seekers, or the Touhou Project song."
],
[
"Kabani Nadi Chuvannappol(\" When the River Kabani Turned Red\") is a 1975 Malayalam feature film directed by P. A. Backer, produced by Pavithran, and starring T. V. Chandran, Shalini, Raveendran and J. Siddiqui.",
"This leftist political drama film came out during the Emergency period.",
"It was the directorial debut of P. A. Backer, who won that year's awards for Best Director and Second Best Film at the Kerala State Film Awards.",
"Pavithran, who later directed many critically acclaimed Malayalam films produced the film.",
"T. V. Chandran, who also later went on to direct a bevy of award- winning films in Malayalam and Tamil, played the lead role.",
"After certain post-production controversies, the film debuted in theatres on 16 July 1976."
],
[
"The Fabulous Senorita is a 1952 American musical comedy film directed by R.G. Springsteen and starring Estelita Rodriguez, Robert Clarke and Nestor Paiva.",
"The film came at the tail- end of a cycle of Latin American- themed films, through it did introduce a new star Rita Moreno."
],
[
"Nalla Idathu Sammandham is 1958 Indian Tamil- language drama film directed by K. Somu.",
"It was the debut production of writer A. P. Nagarajan, who co-produced the film with V. K. Ramasamy.",
"The story was provided by Ramasamy himself, while screenplay and dialogues were written by Nagarajan.",
"It stars M. R. Radha, Prem Nazir, Sowcar Janaki and M. N. Rajam.",
"The film's soundtrack and background score were composed by K. V. Mahadevan."
],
[
"When The Mountains Tremble is a 1983 documentary film produced by Skylight Pictures about the war between the Guatemalan Military and the Mayan Indigenous population of Guatemala.",
"Footage from this film was used as forensic evidence in the Guatemalan court for crimes against humanity, in the genocide case against Efraín Ríos Montt.",
"The film centers on the experiences of Nobel Prize winner Rigoberta Menchú, a Quiché indigenous woman who won the Nobel Peace Prize in 1992, nine years after the film came out.",
"\" When The Mountains Tremble\" won the Special Jury Award at the Sundance Film Festival, the Blue Ribbon Award at the American Film Festival, and the Grand Coral Award/ Best North American Documentary at the Havana Film Festival.",
"A follow- up film was released in 2011, titled\" Granito: How to Nail a Dictator\"."
],
[
"Life Is What You Make",
"It is a 2017 documentary film which explores the life of award- winning Filipino theatre producer Jhett Tolentino from his migration into the United States and his entry into theatre production in New York.",
"The film came with a soundtrack album entitled\" Life Is What You Make It: Original Motion Picture Soundtrack\"."
],
[
"XX/ XY is a 2002 American romantic drama film written and directed by Austin Chick and starring Mark Ruffalo, Kathleen Robertson, and Maya Stange.",
"The title refers to the different chromosome pairings present in men and women.",
"It was nominated for the Grand Jury Prize at the Sundance Film Festival in the year it was released.",
"Although the funding for the film came from the US, the film was produced by British company Natural Nylon."
]
],
"title": [
"P.K.P. (Pilsudski Bought Petliura)",
"My Little Eye",
"Operation Leopard",
"The Night of Nights",
"Kabani Nadi Chuvannappol",
"The Fabulous Senorita",
"Nalla Idathu Sammandham",
"When the Mountains Tremble",
"Life Is What You Make It (film)",
"XX/XY"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"P.K.P. (Pilsudski Bought Petliura)",
"Nalla Idathu Sammandham"
]
},
"type": "comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film came out first, P.K.P. (Pilsudski Bought Petliura) or Nalla Idathu Sammandham?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"P.K.P.",
"P.K.P. (Pilsudski Bought Petliura)"
]
},
"style": "rule"
}
|
{
"index": 56,
"split": "test"
}
|
dev_4616
|
Who was born later, Norman Hay Forbes or Bruno Foucher?
|
[
"Bruno Foucher"
] |
{
"context": {
"content": [
[
"Norman Hay Forbes of Forbes, FRS, FRSE, FRCSE, JP( 1 March 1863- 27 June 1916) was a British doctor and academic author, often under the name of Li’mach, the war- cry of the Forbes clan.",
"His writing ranges from therapeutic medicine to Scottish history."
],
[
"Edward Norman Hay( 19 April 1889 – 10 September 1943) was a Northern Irish composer and music critic."
],
[
"Bruno Henrique Fortunato Aguiar or Bruno Aguiar, is a Brazilian football player that plays as a Central Defender.",
"He currently plays for Goiás."
],
[
"Robert Jacobus Forbes or Robert James Forbes( 21 April 1900, Breda – 13 January 1973, Haarlem) was a Dutch chemist and historian of science and professor in the history of applied science and technology at the University of Amsterdam.",
"In his days Forbes was internationally one of the best known and respected historian of technology, and recipient of the first Leonardo da Vinci Medal, the highest award by the Society for the History of Technology( SHOT)."
],
[
"John Hay Forbes, Lord Medwyn( 19 September 1776 – 25 July 1854) was a British judge."
],
[
"Bruno Anderson da Silva Sabino, known simply as Bruninho or Bruno Sabino; born 29 September 1989) is a Brazilian football player."
],
[
"Catherine I( born, later known as Marta Samuilovna Skavronskaya; –) was the second wife of Peter the Great and Empress of Russia from 1725 until her death."
],
[
"Bruno Foucher( born 3 October 1960) is a French diplomat.",
"He has been ambassador of France in Lebanon since 2017."
],
[
"W. Augustus Barratt( 1873- 1947) was a Scottish- born, later American, songwriter and musician."
],
[
"Luke Bruno Milanzi( born 4 December 1994), variously known as Luka Milanzi or Bruno Milanzi, is a Malawian footballer."
]
],
"title": [
"Norman Hay Forbes",
"Edward Norman Hay",
"Bruno Henrique Fortunato Aguiar",
"Robert Jacobus Forbes",
"John Hay Forbes, Lord Medwyn",
"Bruninho (footballer, born 1989)",
"Catherine I of Russia",
"Bruno Foucher",
"W. Augustus Barratt",
"Luke Milanzi"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"Norman Hay Forbes",
"Bruno Foucher"
]
},
"type": "comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Who was born later, Norman Hay Forbes or Bruno Foucher?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Bruno Foucher"
]
},
"style": "rule"
}
|
{
"index": 57,
"split": "test"
}
|
dev_9909
|
Which film has the director who died later, Spring In A Small Town or Cannibal Tours?
|
[
"Cannibal Tours"
] |
{
"context": {
"content": [
[
"Ben Palmer is a British film and television director who is known for being the director of\" Bo' Selecta\" and\" The Inbetweeners\"."
],
[
"Spring in a Small Town is a Chinese film released in 1948 and directed by Fei Mu, a director known for having great empathy for women and reflecting that in his films.",
"The film is in black and white and its screenwriter is Li Tianji.",
"The film was produced and funded by Wenhua Film Company, which was facing a great financial deficit while producing this film.",
"In order to solve the financial problem, Wenhua decided to produce this film under a low budget.",
"Therefore, the film was made with a minimalist plot and setting which required only five characters.",
"The original print of the film is today kept at the China Film Archive."
],
[
"Fei Mu( October 10, 1906 — January 31, 1951), also romanised as Fey Mou, was a Chinese film director of the pre-Communist era.",
"His\" Spring in a Small Town\"( 1948) was declared the greatest Chinese film ever made by the Hong Kong Film Critics Society."
],
[
"Spring in My Hometown is a 1998 South Korean film."
],
[
"Cannibal Tours is a 1988 quasi-documentary film by Australian director and cinematographer Dennis O'Rourke.",
"While it borrows heavily from ethnographic modes of representation, the film is a biting commentary on the nature of modernity.",
"The film is also widely celebrated for its depiction of Western touristic desires and exploitation among a' tribal' people.",
"The film follows a number of affluent European and American tourists and ecotourists as they travel from village to village along the middle Sepik River in Papua New Guinea.",
"Most of the villages in the film are inhabited by the Iatmul people.",
"The film shows the tourists driving hard bargains for local handcrafts such as woodcarvings and baskets, relentlessly taking photos of local people, handing out cigarettes, balloons, and perfume, viewing staged dance performances, and offering naive comments on native people living in harmony with nature.",
"The film, too, tacks between the tourists and black- and- white photographs from the era of German colonialism of New Guinea( 1880s- 1914).",
"With some prodding, the tourists unwittingly reveal an unattractive and pervasive ethnocentrism to O'Rourke's cameras.",
"The tourists thus become somewhat dehumanized by the filmmaker's camera even as they themselves dehumanize and exoticize even the most mundane aspects of Sepik River life.",
"The title of the film can be read in at least a couple of ways.",
"At one point early in the film, a German tourist, clearly titillated, describes the bygone practice of raiding and cannibalism.",
"He is obsessed with cannibalism, asking local men about the former practice, and snapping photos of locations where local people once practiced headhunting; other tourists also attempt to discuss the\" symbolic\" meaning of cannibalism.",
"But the narrative plot of the film is to portray the tourists as the real cannibals who consume the world through their arrogance, acquisitiveness, primitivist fantasies of indigenous people, and photography( the cameras in the film double for the guns of past colonial administrators).",
"In short, the film presents the tourists as the people driven by truly bizarre beliefs and behaviors.",
"By contrast, the local people are represented as eminently practical and reasonable.",
"Thus the' natives' display the rationale logic of modernity, while the Western tourists are guilty of the very irrational traits they attribute to the natives.",
"The climax of the film is when a group of tourists, faces painted in' native fashion' by local men from one village( Tambunum), prance, dance, and assume a boxing stance to the music of Mozart.",
"The message is clearly: we have finally succeeded in our quest for the primitive, and he is us."
],
[
"Dennis O'Rourke( 14 August 1945 – 15 June 2013) was an Australian documentary filmmaker."
],
[
"Abhishek Saxena is an Indian Bollywood and Punjabi film director who directed the movie Phullu.",
"The Phullu movie was released in theaters on 16 June 2017, in which film Sharib Hashmi is the lead role.",
"Apart from these, he has also directed Patiala Dreamz, this is a Punjabi film.",
"This film was screened in cinemas in 2014."
],
[
"Wei Wei( born 17 May 1922) is a Chinese actress.",
"She is known for her leading role in the classic film\" Spring in a Small Town\"( 1948), and has continued to work in the 21st Century."
],
[
"Springtime in a Small Town is a 2002 Chinese film directed by Tian Zhuangzhuang.",
"The film is a remake of director Fei Mu's 1948 film and its original screenwriter is Li Tianji( Chinese: 李天濟; pinyin: Lǐ Tiānjì),\" Spring in a Small Town\".",
"Though the two movies are referred to by different English titles, they share the same title in Chinese.",
"\" Springtime in a Small Town\" marks the return of Tian Zhuangzhuang to the director's chair, following a nearly nine- year absence since his last film,\" The Blue Kite\"( 1993).",
"It was funded by several production companies from China( Beijing Film Studio, Beijing Rosart Film), France( Orly Films, Paradis Films), and the Netherlands( Fortissimo Films).",
"Unlike\" Kite\", which deployed a large cast, spanned decades, and carried a political message,\" Springtime\" is a small intimate chamber- piece."
],
[
"A Heartbeat Away is a 2011 Australian musical comedy film about a marching band in a small town."
]
],
"title": [
"Ben Palmer",
"Spring in a Small Town",
"Fei Mu",
"Spring in My Hometown",
"Cannibal Tours",
"Dennis O'Rourke",
"Abhishek Saxena",
"Wei Wei (actress)",
"Springtime in a Small Town",
"A Heartbeat Away"
]
},
"supporting_facts": {
"sent_id": [
0,
0,
0,
0
],
"title": [
"Spring in a Small Town",
"Cannibal Tours",
"Fei Mu",
"Dennis O'Rourke"
]
},
"type": "bridge_comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film has the director who died later, Spring In A Small Town or Cannibal Tours?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Cannibal Tours"
]
},
"style": "rule"
}
|
{
"index": 58,
"split": "test"
}
|
dev_3450
|
Which magazine came out first, Tit-Bits or Illustreret Nyhedsblad?
|
[
"Illustreret Nyhedsblad"
] |
{
"context": {
"content": [
[
"Flagship Magazine was an independent magazine for gamers.",
"Published in the UK, it started in 1983 for Play- by- mail game( PBM) players.",
"In 2002 it extended its coverage to include boardgames, role- playing games, web games and massively multiplayer online games, along with collectible card games and computer games.",
"The magazine came out bimonthly with game reviews, diaries, news columns and an emphasis on reader feedback.",
"Founded in 1983 by Nick Palmer and Chris Harvey,\" Flagship\" was edited by Carol Mulholland from issue 70.",
"Shortly after publishing issue 130 in 2010 Carol Mulholland suffered a serious illness and the magazine ceased publication."
],
[
"Word Up! was an American magazine focusing on teen entertainment and music.",
"The magazine was part of Enoble Media Group.",
"Its main focuses were African- American teen singers, rappers, models, and prodigies.",
"The magazine came out monthly and usually had many posters and contests for fans of famous\" To Die For\" celebrities.",
"It was very popular in the 1980s.",
"The magazine mostly covered information concerning rap, Hip- hop and R&B music.",
"The headquarters was in Paramus, New Jersey.",
"As of 2012,\" Word Up\" ceased publication.",
"The last dated issue was April 2012."
],
[
"Personal Computer World( PCW)( February 1978- June 2009) was the first British computer magazine.",
"Although for at least the last decade it contained a high proportion of Windows PC content( reflecting the state of the IT field), the magazine's title was not intended as a specific reference to this.",
"At its inception in 1978' personal computer' was still a generic term( the Apple II, PET 2001 and TRS- 80 had been launched as personal computers in 1977.)",
"The magazine came out before the Wintel( or IBM PC compatible) platform existed; the original IBM PC itself was introduced in 1981.",
"Similarly, the magazine was unrelated to the Amstrad PCW."
],
[
"Vsyakaya vsyachina( which may be translated as\" Tutti- Frutti\" or All Sorts and Sundries) was a Russian weekly magazine, established in 1769.",
"In 1770, there were 18 issues of the magazine entitled\" Барышек всякия всячин ы\"( Baryshek vsyakiya vsyachiny).",
"Empress Catherine II was a private editor- in- chief of the magazine.",
"\" Vsyakaya vsyachina\" ridiculed the morals and manners of the Russian gentry and protected moderate moralizing satire.",
"At the same time, the magazine came out against oppositionary moods in the society, primarily attacking progressive satiric magazines published by Nikolay Novikov."
],
[
"Illustreret Nyhedsblad was a Norwegian weekly magazine, issued from 1851 to 1866 in Christiania, Norway.",
"Its first editor was Paul Botten- Hansen, who edited the magazine from 1851 to 1864 and from 1865 to 1866, with Frederik Bætzmann being editor from 1864 to 1865.",
"Among its contributors were Bjørnstjerne Bjørnson, Henrik Ibsen, Ernst Sars and Camilla Collett.",
"Ibsen's plays\" Hærmændene paa Helgeland\" and\" Kærlighedens komedie\" were published as supplements to the magazine."
],
[
"Ad Astra is the quarterly magazine of the National Space Society( NSS).",
"The name literally means To the Stars.",
"The magazine came into being following the merger of the L5 Society and the National Space Institute which became the NSS in 1987.",
"The magazine was established in 1989.",
"It is based in Washington DC.",
"Imaginova, publisher of Space.com, published\" Ad Astra\" from 2005 to 2008.",
"From 2008 until the company closed, the magazine was published by MM Publishing."
],
[
"Pearson's Weekly was a British weekly periodical founded in London in 1890 by Arthur Pearson, who had previously worked on\" Tit- Bits\" for George Newnes.",
"The first issue was well advertised and sold a quarter of a million copies.",
"The paper's stated aim was\" To Interest, to Elevate and to Amuse\"."
],
[
"LeftLion is a printed and online culture and listings magazine which covers Nottingham.",
"It was originally set up by three childhood friends, Jared Wilson, Alan Gilby and Tim Bates, and launched as a website on 1 September 2003.",
"The first issue of the printed magazine came out a year later in Autumn 2004.",
"The printed magazine was published bi-monthly from 2004- 2014.",
"Then, following a successful crowdfunding campaign on Kickstarter they raised the funds to publish the magazine monthly, which it has done ever since.",
"The online magazine is updated daily.",
"The publication takes its name from a stone lion in Nottingham's Old Market Square, which since Nottingham Council House was built in the 1920s has served as a meeting point for people in Nottingham City Centre."
],
[
"Railway Transportation was an Australian magazine published by Shennan Publications between 1951 and 1972.",
"It came out monthly."
],
[
"Tit-Bits from all the interesting Books, Periodicals, and Newspapers of the World, more commonly known as Tit-Bits, was a British weekly magazine founded by an early father of popular journalism George Newnes on 22 October 1881.",
"In 1886, the magazine's headquarters moved from Manchester to London where it paved the way for popular journalism — most significantly, the \"Daily Mail\" was founded by Alfred Harmsworth, a contributor to \"Tit-Bits\", and the \"Daily Express\" was launched by Arthur Pearson, who worked at \"Tit-Bits\" for five years after winning a competition to get a job on the magazine.",
"From the outset, the magazine was a mass-circulation commercial publication on cheap newsprint which soon reached sales of between 400,000 and 600,000.",
"Like a mini-encyclopedia it presented a diverse range of tit-bits of information in an easy-to-read format, with the emphasis on human interest stories concentrating on drama and sensation.",
"It also featured short stories and full-length fiction, including works by authors such as Rider Haggard and Isaac Asimov, plus three very early stories by Christopher Priest.",
"Virginia Woolf submitted her first article to the paper in 1890, at the age of eight, but it was turned down.",
"The\nfirst humorous article by P. G. Wodehouse, \"Men Who Missed Their Own Weddings\", appeared in \"Tit-Bits\" in November 1900.",
"During the first world war Ivor Novello won a \"Titbits\" competition to write a song soldiers could sing at the front: he penned \"Keep the Home Fires Burning\".",
"Pin-ups appeared on the magazine's covers from 1939 and by 1955 circulation peaked at 1,150,000.",
"At the beginning of 1973 \"Tit-Bits\" lost the hyphen from its masthead.",
"In 1979 \"Reveille\" (a weekly tabloid with a virtually identical demographic) was merged into \"Titbits\", and the magazine was briefly rebranded as \"Titbits incorporating Reveille\".",
"This however was dropped in July 1981.",
"On 18 July 1984, under its last editor Paul Hopkins, \"Titbits\" was selling only 170,000 copies and was taken over by Associated Newspapers' \"Weekend\".",
"At the time, the \"Financial Times\" described \"Titbits\" as “the 103-year-old progenitor of Britain's popular press”.",
"\"Weekend\" itself closed in 1989."
]
],
"title": [
"Flagship magazine",
"Word Up! (magazine)",
"Personal Computer World",
"Vsyakaya vsyachina",
"Illustreret Nyhedsblad",
"Ad Astra (magazine)",
"Pearson's Weekly",
"LeftLion",
"Railway Transportation",
"Tit-Bits"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"Tit-Bits",
"Illustreret Nyhedsblad"
]
},
"type": "comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which magazine came out first, Tit-Bits or Illustreret Nyhedsblad?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Illustreret Nyhedsblad"
]
},
"style": "rule"
}
|
{
"index": 59,
"split": "test"
}
|
dev_11757
|
Are both Unity Station and Saskatoon Station (Canadian Pacific Railway) located in the same country?
|
[
"yes"
] |
{
"context": {
"content": [
[
"Lacolle station is a former railway station in Lacolle, Quebec.",
"Its address is 21 rue Ste- Marie adjacent to the Canadian Pacific Railway( CPR) Lacolle Subdivision.",
"A large piece of land surrounds it and a long yard is located on the other side of the tracks.",
"Situated close to the Canada – US border, the Canadian Pacific Railway station at Lacolle played a role as both a railway station and a customs and immigration post for Canadian and American border officers.",
"Today the building is abandoned, but acquisition of the station from Canadian Pacific Railway is almost complete as of October 2011 by the municipality of Lacolle.",
"The municipality is planning for the restoration of the station by 2014 at an estimated cost of 1 million dollars.",
"The station is planned to be a municipal museum; the historical society would also have space for a conference room( as told by local newspapers and the mayor of Lacolle)."
],
[
"Edmonton Canadian Pacific Railway Station was the terminal station for passenger services along the Canadian Pacific Railway's subdivision from Calgary to Edmonton shortly after the completion of the High Level Bridge, with services commencing on September 2, 1913.",
"Passenger services across the North Saskatchewan River were discontinued in 1972, and the station building itself was demolished in 1978."
],
[
"The Kenora railway Station is a heritage Canadian Pacific Railway station in Kenora, Ontario."
],
[
"The Portage la Prairie Canadian Pacific Railway station is a former Canadian Pacific Railway station that is being restored as a museum.",
"Passenger rail service in the community is now provided at the Portage la Prairie railway station.",
"This station was designated a historic site in 1992."
],
[
"Empress Canadian Pacific Railway Station is a one- story timber building in Empress, Alberta that formerly served as the community's Canadian Pacific Railway station.",
"It was designated under the Heritage Railway Stations Protection Act in 1991.",
"It was recognized as a Provincial Historic Resource by the government of Alberta on June 28, 2002."
],
[
"Banff station is a railway station in Banff, Alberta, Canada.",
"It is used by the Rocky Mountaineer and Royal Canadian Pacific rail tour services.",
"The station is located on Canadian Pacific line.",
"The station was originally built for the Canadian Pacific Railway.",
"The station was declared a heritage railway station by the federal government in 1991."
],
[
"Don railway station was built in 1896 by the Canadian Pacific Railway( CPR) on the western bank of the Don River at the south side of Queen Street in Toronto."
],
[
"Unity railway station is on the Canadian National Railway mainline in Unity, Saskatchewan.",
"The station is served by Via Rail's The Canadian two times per week as a flag stop( 48 hours advance notice required)."
],
[
"Moose Jaw station is a former railway station in Moose Jaw, Saskatchewan, Canada.",
"It was designed by Hugh G. Jones and built by the Canadian Pacific Railway from 1920 – 1922.",
"The station comprises a two- story waiting area, four storey office block and six- storey Tyndall stone clock tower.",
"The building was designated a historic railway station in 1991.",
"The station was a stop on the Canadian Pacific Railway service.",
"The station was also a transfer point to the Minneapolis, St. Paul and Sault Ste. Marie Railroad, also known as the Soo Line Railroad, which operated from Saint Paul, Minnesota to Portal, North Dakota\" Soo- Pacific\" during the summer.",
"It ran through to Vancouver via a connection with Canadian Pacific Railway's\" The Dominion\" at Moose Jaw.",
"In the winter the\" Soo- Dominion\" terminated in Moose Jaw permitting a transfer to the Dominion.",
"It was discontinued in December 1963."
],
[
"Saskatoon station is a historic railway station building in Saskatoon, Saskatchewan, Canada.",
"It was built in 1908.",
"It was designated a National Historic Site of Canada in 1976, and has also been protected as a Heritage Railway Station of Canada since 1990.",
"Saskatoon earned the nickname\" Hub City\" from the contributions of the CPR, CNR and GTP.",
"The station was situated on the Regina, Saskatchewan to Edmonton, Alberta rail line."
]
],
"title": [
"Lacolle station",
"Edmonton Canadian Pacific Railway Station",
"Kenora railway station",
"Portage la Prairie Canadian Pacific Railway station",
"Empress Canadian Pacific Railway Station",
"Banff station",
"Don station",
"Unity station",
"Moose Jaw station (Canadian Pacific Railway)",
"Saskatoon station (Canadian Pacific Railway)"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"Unity station",
"Saskatoon station (Canadian Pacific Railway)"
]
},
"type": "comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Are both Unity Station and Saskatoon Station (Canadian Pacific Railway) located in the same country?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"yes"
]
},
"style": "rule"
}
|
{
"index": 60,
"split": "test"
}
|
dev_5617
|
Are Nastia Liukin and Alice Gerstenberg both from the same country?
|
[
"yes"
] |
{
"context": {
"content": [
[
"Anastasia Valeryevna\" Nastia\" Liukina( born October 30, 1989) is a Russian American former artistic gymnast.",
"She is the 2008 Olympic all- around champion, a five- time Olympic medalist, the 2005 and 2007 world champion on the balance beam, and the 2005 world champion on the uneven bars.",
"She is also a four- time all- around U.S. national champion, winning twice as a junior and twice as a senior.",
"With nine World Championships medals, seven of them individual, Liukin is tied with Shannon Miller for the third- highest tally of World Championship medals( among U.S. gymnasts).",
"Liukin also tied Miller's record( later equaled by Simone Biles) as the American gymnast having won the most medals in a single non-boycotted Olympic Games.",
"Liukin was a key member of the U.S. senior team.",
"She represented the United States at three World Championships and one Olympic Games.",
"In October 2011, Liukin announced that she was returning to gymnastics with the hopes of making a second Olympic team.",
"Liukin did not make the 2012 Olympic team, after several falls at the Olympic Trials and other pre-Olympic events, and retired from the sport in 2012."
],
[
"Iryna Krasnianska was the defending champion, but did not compete.",
"Nastia Liukin won her second world champion in balance beam after 2005."
],
[
"Natasha Kelley( born January 1, 1990 in Baton Rouge, Louisiana) is a former American artistic gymnast.",
"Kelley was a member of the silver- medal- winning U.S. team at the 2006 World Championships, along with Chellsie Memmel, Alicia Sacramone, Jana Bieger, Nastia Liukin, and Ashley Priess."
],
[
"Godwin Davy( born 15 December 1979) is an Anguillan international footballer who plays as a goalkeeper.",
"He made his international debut in a World Cup qualifying match against El Salvador, resulting in a lopsided match, losing 12- 0 in February 2008, and also played in the 4 – 0 loss to the same country in the return match in March 2008."
],
[
"Natasha Kelley( born January 1, 1990 in Baton Rouge, Louisiana) is a former American artistic gymnast.",
"Kelley was a member of the silver- medal- winning U.S. team at the 2006 World Championships, along with Chellsie Memmel, Alicia Sacramone, Jana Bieger, Nastia Liukin, and Ashley Priess."
],
[
"Alexandra Claire McMurtry( born December 8, 1996) is a retired American artistic gymnast.",
"She is the 2013 Nastia Liukin Cup champion and the 2017 NCAA Champion.",
"She competed in NCAA gymnastics for the Florida Gators and was the 10th person in NCAA history to record a Gym Slam."
],
[
"The Nastia Liukin Cup( known as the Nastia Liukin Supergirl Cup before 2012) is an annual artistic gymnastics competition held in the United States and hosted by Olympic gymnast Nastia Liukin.",
"First held in 2010, the event arose through a partnership between Liukin and USA Gymnastics, and is intended to serve as a debut for pre-elite competitors on the national stage.",
"Eligible to compete are Level 10 female gymnasts in both the junior and senior fields.",
"To be eligible to compete in the Nastia Liukin Cup, gymnast must compete at a qualified competition and place top 3 of their age group.",
"Nastia Liukin Cup 2020 Individual Invitationals California Grand Invitation – January",
"3 – January 5, Anaheim, California Alamo Classic: January",
"10 – January 12, San Antonio, Texas Brestyan ’s Invitational: January",
"17- January 20, Malden, Massachusetts Hill ’s Maryland Classic: January",
"17- January 20, Landover Maryland",
"Tampa Bay Turners invitational: January",
"17- January 20, Bradenton, Florida",
"Kim Zmeskal ’s Texas Prime",
"Meet: January 17- January 20, Irving Texas Atlanta Centennial Classic: January 24 – January 26, Atlanta, Georgia WOGA Classic – January",
"31 – February 2, Frisco Texas Circle of Stars Gymnastics Invitational: January 31 – February 2, Indianapolis, Indiana"
],
[
"Nastia Liukin was the defending champion, but did not compete."
],
[
"Alice Erya Gerstenberg( 2 August 1885 – 28 July 1972) was an American playwright, actress, and activist best known for her experimental, feminist drama and her involvement with the Little Theatre Movement in Chicago."
],
[
"Nastia Liukin was the defending champion, but was limited to uneven bars due to an ankle injury."
]
],
"title": [
"Nastia Liukin",
"2007 World Artistic Gymnastics Championships – Women's balance beam",
"Natasha Kelley",
"Godwin Davy",
"Natasha Kelley",
"Alex McMurtry",
"Nastia Liukin Cup",
"2009 World Artistic Gymnastics Championships – Women's balance beam",
"Alice Gerstenberg",
"2006 World Artistic Gymnastics Championships – Women's balance beam"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"Nastia Liukin",
"Alice Gerstenberg"
]
},
"type": "comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Are Nastia Liukin and Alice Gerstenberg both from the same country?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"yes"
]
},
"style": "rule"
}
|
{
"index": 61,
"split": "test"
}
|
dev_3335
|
Which film was released more recently, Sikuru Hathe or Daughters, Daughters?
|
[
"Sikuru Hathe"
] |
{
"context": {
"content": [
[
"The film was released in Japan on 10 October 2009."
],
[
"The film was released theatrically in the United States in the summer of 1960 by Columbia Pictures."
],
[
"Sikuru Hathe is a 2007 Sri Lankan Sinhala comedy film directed by Giriraj Kaushalya and produced by Hans Anton Vanstarex for New Imperial Talkies.",
"It stars Vijaya Nandasiri, Anarkali Akarsha and Suraj Mapa in lead roles along with Rodney Warnakula and Susila Kottage.",
"Music composed by Tharupathi Munasinghe, and songs composed by Rohana Weerasinghe and Navaratne Gamage.",
"It is the 1092nd Sri Lankan film in the Sinhala cinema.",
"One of best comedy films ever produced in the country,\" Sikuru Hathe\" became the highest- grossing film in 2007 and won many awards at several local film award ceremonies.",
"The film earned a record collection of Rs. 26 millions for 31 days with nearly 300,000 patrons in 25 cinema theaters across the country.",
"It collected Rs. 50 million with 500,000 patrons on October 30 completes 75th day."
],
[
"A Race for Life is a 1928 American silent drama film directed by D. Ross Lederman.",
"Originally, the film was presumed to be lost.",
"However, according to the Library of Congress Database, the film was found in the Netherlands.",
"The film was released with a Vitaphone soundtrack with a synchronised musical score and sound effects."
],
[
"The film was released in 1961 in Japan and was later released in the United States on television.",
"In 1998, the film was featured on an episode of\" Mystery Science Theater 3000\"."
],
[
"The film was released on YouTube and Steam on 14 June 2017."
],
[
"Daughters, Daughters is a 1973 Israeli film directed by Moshé Mizrahi.",
"It was entered into the 1974 Cannes Film Festival."
],
[
"Ronaldo is a 2015 British documentary film directed by Anthony Wonke.",
"It follows the life and career of Portuguese professional footballer Cristiano Ronaldo.",
"The film was released worldwide on 9 November 2015.",
"A trailer for the film was released on 28 September 2015."
],
[
"Coney Island Baby is a 2003 comedy- drama in which film producer Amy Hobby made her directorial debut.",
"Karl Geary wrote the film and Tanya Ryno was the film's producer.",
"The music was composed by Ryan Shore.",
"The film was shot in Sligo, Ireland, which is known locally as\" Coney Island\".",
"The film was screened at the Newport International Film Festival.",
"Hobby won the Jury Award for\" Best First Time Director\".",
"The film made its premiere television broadcast on the Sundance Channel."
],
[
"Lloyd is a 2001 American comedy film.",
"The film was released on May 4, 2001."
]
],
"title": [
"The Wonderful World of Captain Kuhio",
"Battle in Outer Space",
"Sikuru Hathe",
"A Race for Life",
"Invasion of the Neptune Men",
"Rakka (film)",
"Daughters, Daughters",
"Ronaldo (film)",
"Coney Island Baby (film)",
"Lloyd (film)"
]
},
"supporting_facts": {
"sent_id": [
0,
1
],
"title": [
"Sikuru Hathe",
"Daughters, Daughters"
]
},
"type": "comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film was released more recently, Sikuru Hathe or Daughters, Daughters?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Sikuru Hathe"
]
},
"style": "rule"
}
|
{
"index": 62,
"split": "test"
}
|
dev_11263
|
Where was the director of film Politics Of Love born?
|
[
"T-O",
"The Six",
"Toronto",
"Toronto (Ont.)",
"Hogtown",
"The 416",
"Toronto, Canada",
"City of Toronto",
"Toronto, Ontario"
] |
{
"context": {
"content": [
[
"Jamie Love( born 1990) is a New Zealand representative softball player"
],
[
"Judi Love( born 4 June 1980) is an English stand- up comedian and radio presenter."
],
[
"Thomas E. Love( born October 1937) is an American billionaire businessman.",
"He is the owner, founder and executive chairman of Love's Travel Stops& Country Stores."
],
[
"Jordan Love( born November 2, 1998) is an American football quarterback for the Utah State Aggies."
],
[
"James Love( born 20 October 1987) is an English rugby union player who currently plays for London Scottish."
],
[
"Pamela Love( born 1982) is an American jewelry designer based in New York City."
],
[
"Jason Love( born 3 December 1965) is a former Australian rules footballer who played in the Australian Football League( AFL)."
],
[
"Politics of Love (also known as \"Love Barack\") is a 2011 film directed by William Dear.",
"Bollywood actress Mallika Sherawat and American actor Brian J. White play leading roles."
],
[
"Alex Love( born June 2, 1989) is an American boxer in the flyweight class."
],
[
"William Dear (born November 30, 1943) is a Canadian film director, producer and screenwriter known for directing \"Harry and the HendersonsIf Looks Could KillAngels in the OutfieldWild America\" and \"Santa Who?\".",
"He has directed \"Saturday Night LiveTelevision PartsAmazing StoriesDinosaursCovington Cross\" and \"The Wannabes Starring Savvy\".",
"Dear was born in Toronto, Ontario.",
"He is the father of actor and storyboard artist, Oliver Dear."
]
],
"title": [
"Jamie Love (softball)",
"Judi Love",
"Tom Love",
"Jordan Love",
"James Love (rugby union, born 1987)",
"Pamela Love",
"Jason Love (footballer)",
"Politics of Love",
"Alex Love",
"William Dear"
]
},
"supporting_facts": {
"sent_id": [
0,
2
],
"title": [
"Politics of Love",
"William Dear"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Where was the director of film Politics Of Love born?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"T-O",
"The Six",
"Toronto",
"Toronto (Ont.)",
"Hogtown",
"The 416",
"Toronto, Canada",
"City of Toronto",
"Toronto, Ontario"
]
},
"style": "rule"
}
|
{
"index": 63,
"split": "test"
}
|
dev_10391
|
Who is the spouse of the performer of song Everybody Loves You Now?
|
[
"Alexis Roderick"
] |
{
"context": {
"content": [
[
"\"Everybody Loves You Now\" is a song written by Billy Joel.",
"It was first released on his 1971 debut album \"Cold Spring Harbor\" and was also released as a b-side to his singles",
"\"She's Got a Way\" and \"Tomorrow Is Today.\"",
"Live versions were included on the albums \"Songs in the Attic12 Gardens Live\" and ."
],
[
"Adib Kheir was a leading Syrian nationalist of the 1920s.",
"He was the owner of the Librairie Universelle in Damascus.",
"His granddaughter is the spouse of Manaf Tlass."
],
[
"Princess Auguste of Bavaria( 28 April 1877 – 25 June 1964) was a member of the Bavarian Royal House of Wittelsbach and the spouse of Archduke Joseph August of Austria."
],
[
"Mehdi Abrishamchi is an Iranian People's Mujahedin of Iran( MEK) politician who has been described as\" the right hand man of Massoud Rajavi\".",
"He is the former spouse of the group's current leader, Maryam Rajavi, whom he divorced so that she could become the spouse of Massoud Rajavi."
],
[
"Maria Teresa, Grand Duchess of Luxembourg( born María Teresa Mestre y Batista; on 22 March 1956), is the spouse of Grand Duke Henri."
],
[
"Gertrude of Saxony and Bavaria( 1152/55–1197) was Duchess of Swabia as the spouse of Duke Frederick IV, and Queen of Denmark as the spouse of King Canute VI."
],
[
"\" About You Now\" is a song by British girl group Sugababes from their fifth studio album,\" Change\"( 2007).",
"Written and produced by Dr. Luke along with Cathy Dennis and Steven Wolf, it was released on 24 September 2007 by Island Records as the lead single from the album, the first to feature Amelle Berrabah on all tracks.",
"An uptempo pop track that combines heavy elements of pop rock, and dance- pop, it infuses light electronic sounds.",
"Lyrically,\" About You Now\" finds the protagonist thinking deeply over her relationship with her love interest from whom she parted.",
"The song was generally well received by music critics, who complimented its genre mixing production as well as its aurally remarkable US pop influence.",
"Some of them also described the song as a\" pop- electro- rock masterpiece\" and called it the best Sugababes single in years.",
"Upon its release,\" About You Now\" became the group's sixth UK number one hit and highest- charting single since 2005's\" Push the Button\".",
"It remained atop the UK Singles Chart for four weeks.",
"It also reached the top of the charts in Hungary, as well as the top ten in Austria, Belgium, Germany, Ireland, Norway, and Spain.",
"The Sugababes' highest- selling single to date, it has sold 540,500 copies as of October 2015.",
"\" About You Now\" was nominated for a 2008 BRIT Award for Best British Single.",
"In the 2009 edition of\" Guinness World Records About You Now\" was listed as the\" first track by a British pop act to top the singles chart solely on downloads\".",
"The song was also named as the\" biggest chart mover to the number one position in the UK\".",
"In December 2009, it was revealed by BBC that\" About You Now\" was the UK's fifth most- played song of the decade( 2000 – 09).",
"An acoustic version of the song appears as a bonus track on the group's sixth studio album\" Catfights and Spotlights\"( 2008)."
],
[
"William Martin Joel (born May 9, 1949) is an American singer-songwriter, composer, and pianist.",
"Commonly nicknamed the \"Piano Man\" after his first major hit and signature song of the same name, he has led a commercially successful career as a solo artist since the 1970s, having released twelve studio albums from 1971 to 1993 as well as one studio album in 2001.",
"He is one of the best-selling music artists of all time, as well as the sixth best-selling recording artist and the third best-selling solo artist in the United States, with over 150 million records sold worldwide.",
"His 1985 compilation album, \"Greatest Hits Vol.",
"1 & 2\", is one of the best-selling albums in the U.S.",
"Joel was born in 1949 in The Bronx, New York, and grew up on Long Island, both places that influenced his music.",
"Growing up, he took piano lessons at the insistence of his mother.",
"After dropping out of high school to pursue a musical career, Joel took part in two short-lived bands, The Hassles and Attila, before signing a record deal with Family Productions and kicking off a solo career in 1971 with his first release, \"Cold Spring Harbor\".",
"In 1972, Joel caught the attention of Columbia Records after a live radio performance of the song \"Captain Jack\" became popular in Philadelphia, prompting him to sign a new record deal with the company and release his second album, \"Piano Man\", in 1973.",
"After releasing the albums \"Streetlife Serenade\" and \"Turnstiles\" in 1974 and 1976 respectively, Joel released his critical and commercial breakthrough album, \"The Stranger\", in 1977.",
"This album became Columbia's best-selling release, selling over 10 million copies and spawning several hit singles, including \"Just the Way",
"You",
"AreMovin'",
"Out",
"(Anthony's Song)Only the Good Die Young\", and",
"\"She's Always a Woman\"; another song on this album, \"Scenes from an Italian Restaurant\", is Joel's favorite of his own songs and has become a staple of his live shows.",
"Joel's next album, \"52nd Street\", was released in 1978 and became his first album to peak at #1 on the \"Billboard\" 200 chart.",
"Joel released his seventh studio album, \"Glass Houses\", in 1980 in an attempt to further establish himself as a rock and roll artist; this release featured \"",
"It's Still Rock and Roll to Me\" (Joel's first single to top the \"Billboard\" Hot 100 chart),",
"\"You May Be",
"RightDon't Ask Me",
"Why\", and \"Sometimes a Fantasy\".",
"His next album, \"The Nylon Curtain\", was released in 1982, and stemmed from a desire from Joel to create more lyrically and melodically ambitious music. \"",
"An Innocent Man\", released in 1983, served as an homage to genres of music which Joel had grown up with in the 1950s, such as rhythm and blues and doo-wop; this release featured \"Uptown Girl\" and \"The Longest Time\", two of his most well-known songs today.",
"After releasing the albums \"The Bridge\" and \"Storm Front\" in 1986 and 1989 respectively, Joel released his twelfth and to date final studio album, \"River of Dreams\", in 1993.",
"He went on to release \"Fantasies and Delusions\", a 2001 album featuring classical compositions composed by Joel and performed by British-Korean pianist Richard Hyung-ki Joo.",
"Joel also provided voiceover work in 1988 for the animated Disney film, \"Oliver & Company\", in which he played the character Dodger, and contributed to the soundtracks to several different films, including \"Easy MoneyRuthless People\", and \"Honeymoon in Vegas\".",
"Across the 20 years of his solo career, Joel produced 33 Top 40 hits in the U.S., all of which he wrote himself, and three of which (\"It's Still Rock and Roll to MeTell Her About It\", and \"We Didn't Start the Fire\") managed to peak at the top of the Billboard Hot 100 charts.",
"Joel has been nominated for 23 Grammy Awards, winning five of them, including Album of the Year for \"52nd Street\".",
"Joel was inducted into the Songwriters Hall of Fame (1992), the Rock and Roll Hall of Fame (1999), and the Long Island Music Hall of Fame (2006).",
"In 2001, Joel received the Johnny Mercer Award from the Songwriters Hall of Fame.",
"In 2013, Joel received the Kennedy Center Honors, for influencing American culture through the arts.",
"Since the advent of his solo career, Joel has held a successful touring career, holding live performances across the globe in which he sings several of his written songs.",
"In 1987, he became one of the very first artists to hold a rock and roll tour in the Soviet Union following the country's alleviation of the ban on rock and roll music.",
"Despite largely retiring from writing and releasing pop music following the release of \"River of Dreams\", he continues to tour; he frequently performs at Madison Square Garden in Manhattan.",
"Joel has been in several relationships, including marriages to Elizabeth Weber Small, model Christie Brinkley, and chef Katie Lee; since 2015, he has been married to Alexis Roderick, his fourth spouse.",
"He also has three children: daughter Alexa Ray Joel with Brinkley, and daughters Della Rose and Remy Anne with Roderick."
],
[
"Marie Louise Coidavid( 1778 – March 11, 1851), was the Queen of the Kingdom of Haiti 1811 – 20 as the spouse of Henri I of Haiti."
],
[
"Sophia Magdalena of Denmark (3 July 1746 – 21 August 1813) was Queen of Sweden as the spouse of King Gustav III."
]
],
"title": [
"Everybody Loves You Now",
"Adib Kheir",
"Princess Auguste of Bavaria (1875–1964)",
"Mehdi Abrishamchi",
"Maria Teresa, Grand Duchess of Luxembourg",
"Gertrude of Bavaria",
"About You Now",
"Billy Joel",
"Marie-Louise Coidavid",
"Sophia Magdalena of Denmark"
]
},
"supporting_facts": {
"sent_id": [
0,
35
],
"title": [
"Everybody Loves You Now",
"Billy Joel"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Who is the spouse of the performer of song Everybody Loves You Now?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Alexis Roderick"
]
},
"style": "rule"
}
|
{
"index": 64,
"split": "test"
}
|
dev_4325
|
Who was born later, Alena Stolzová or Ibrahim Al-Saqqar?
|
[
"Ibrahim Al-Saqqar"
] |
{
"context": {
"content": [
[
"Ibrahim Al- Ibrahim( born 3 June 1992) is a Saudi football player who plays for Al- Riyadh as a midfielder."
],
[
"Ibrahim or Ibrahim Kura( 1840s- c. 1885) was Shehu of Borno from c. 1884 to c. 1885."
],
[
"Ibrahim al- Abdullah( born around 1924) is a Lebanese journalist and politician."
],
[
"W. Augustus Barratt( 1873- 1947) was a Scottish- born, later American, songwriter and musician."
],
[
"Alena Stolzová( born 6 November 1937) is a Czech sprinter.",
"She competed in the women's 200 metres at the 1960 Summer Olympics."
],
[
"Abraham Nehmé, BC or Ibrahim Naameh( born 8 October 1927 in Kaïtouly, Syria) was Archbishop of the Melkite Greek Catholic Archeparchy of Homs in Syria."
],
[
"Mohammad Ibrahim al- Shaar( born 1950) is a Syrian military leader who was later named Syrian Minister of the Interior."
],
[
"Ibrahim al- Yaziji( Arabic ابراهيم اليازجي,\" Ibrahim al- Yāzijī\"; 1847–1906) was an Arab philologist, poet and journalist.",
"He belonged to the Greek Catholic population of the Mutasarrifate of Mount Lebanon."
],
[
"Ibrahim Al- Saqqar( born January 20, 1984) is a Jordanian footballer who plays as a defender for Al- Baqa'a."
],
[
"Ibrahim Al- Shehri( born 23 February 1996) is a Saudi Arabian footballer who plays as a wineer."
]
],
"title": [
"Ibrahim Al-Ibrahim",
"Ibrahim Kura of Borno",
"Ibrahim al-Abdallah",
"W. Augustus Barratt",
"Alena Stolzová",
"Abraham Nehmé",
"Mohammad al-Shaar",
"Ibrahim al-Yaziji",
"Ibrahim Al-Saqqar",
"Ibrahim Al-Shehri"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"Alena Stolzová",
"Ibrahim Al-Saqqar"
]
},
"type": "comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Who was born later, Alena Stolzová or Ibrahim Al-Saqqar?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Ibrahim Al-Saqqar"
]
},
"style": "rule"
}
|
{
"index": 65,
"split": "test"
}
|
dev_8280
|
Who is the father of George, Emperor Of Trebizond?
|
[
"Manuel I Megas Komnenos",
"Manuel I of Trebizond"
] |
{
"context": {
"content": [
[
"Gerald Rudolff Ford( December 9, 1890 – January 26, 1962) was an American businessman, creator of Ford Industries, and Republican politician who was the stepfather of U.S. President Gerald Ford and for whom Ford legally changed his name."
],
[
"Alexander Komnenos or Skantarios Komnenos was a co-emperor of the Empire of Trebizond, first with his father Alexios IV of Trebizond then, after several years in exile, with his brother John IV."
],
[
"Jerrold Wexler( June 27, 1924 – November 10, 1992) was a noted American businessman and film producer.",
"He was the brother of cinematographer Haskell Wexler and the stepfather of actress Daryl Hannah."
],
[
"Irene of Trebizond( died around 1382) was the bigamous wife of Basil of Trebizond, by whom he had two sons, Alexios and John( later Alexios III of Trebizond), and possibly three daughters Anna, Maria and Theodora."
],
[
"Manuel II",
"Megas Komnenos",
"( Greek: Μανουήλ Β΄ Μέγας Κομνηνός,\" Manouēl II Megas Komnēnos\")( c. 1324–1333) was Emperor of Trebizond for eight months in 1332.",
"He was the only child known to hold the office of Emperor of Trebizond.",
"Manuel was the son of Emperor Andronikos III, whom he succeeded at the age of eight in January 1332."
],
[
"Maria of Gothia was the first wife of David of Trebizond, last Emperor of Trebizond, whom she married in 1426."
],
[
"Helena Kantakouzene( died 1463) was the second wife of David of Trebizond, the last Emperor of Trebizond.",
"Recent research has led to doubts regarding her existence."
],
[
"Irene Syrikaina was the third Empress-consort of Manuel I of Trebizond.",
"She may be the same Irene Syrikaina whom Michael Panaretos mentions was stoned to death in September 1332 in the purges that followed Basil Megas Komnenos's accession to the throne."
],
[
"Héctor Barrantes Sansoni( 1939–1990) was an Argentine polo player.",
"He was the stepfather of Sarah, Duchess of York."
],
[
"George Megas Komnenos (\"Geōrgios Megas Komnēnos\"), (1255 – after 1284), Emperor of Trebizond from 1266 to 1280.",
"He was the elder son of Emperor Manuel I and his third wife, Irene Syrikaina, a Trapezuntine noblewoman.",
"He succeeded his half-brother Andronikos in 1266 and ruled for 14 years."
]
],
"title": [
"Gerald Rudolff Ford",
"Alexander of Trebizond",
"Jerrold Wexler",
"Irene of Trebizond",
"Manuel II of Trebizond",
"Maria of Gothia",
"Helena Kantakouzene, Empress of Trebizond",
"Irene Syrikaina",
"Héctor Barrantes",
"George, Emperor of Trebizond"
]
},
"supporting_facts": {
"sent_id": [
1,
0
],
"title": [
"George, Emperor of Trebizond",
"Irene Syrikaina"
]
},
"type": "inference"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Who is the father of George, Emperor Of Trebizond?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Manuel I Megas Komnenos",
"Manuel I of Trebizond"
]
},
"style": "rule"
}
|
{
"index": 66,
"split": "test"
}
|
dev_8004
|
Which film was released more recently, Unseen Forces or The Twilight Saga: Breaking Dawn – Part 1?
|
[
"Twilight Saga: Breaking Dawn – Part 1",
"the twilight saga:Breaking Dawn",
"The Twilight Saga: Breaking Dawn – Part 1"
] |
{
"context": {
"content": [
[
"The Twilight Saga: Breaking Dawn – Part 2( commonly referred to as Breaking Dawn: Part 2) is a 2012 American romantic drama fantasy film directed by Bill Condon and based on the novel\" Breaking Dawn\" by Stephenie Meyer.",
"Constituting the second of a two- part adaptation of the novel, the film is the fifth and final installment in\" The Twilight Saga\" film series, following 2011's.",
"All three main cast members, Kristen Stewart, Robert Pattinson, and Taylor Lautner, reprise their roles, with Mackenzie Foy portraying Renesmee Cullen.",
"Alongside Pattinson, Lautner, Stewart, and Foy, the film also stars an ensemble cast including Billy Burke, Peter Facinelli, Elizabeth Reaser, Kellan Lutz, Nikki Reed, Jackson Rathbone, Ashley Greene, Michael Sheen, and Dakota Fanning.",
"\" Part 2\" was released on November 16, 2012.",
"The film, despite mixed critical reception, was a box office success, grossing nearly$ 830 million worldwide against production budget of$ 136 million, becoming the sixth- highest- grossing film of 2012 and the highest- grossing film of\" The Twilight Saga\" series."
],
[
"The film was released in Japan on 10 October 2009."
],
[
"The Twilight Saga: Breaking Dawn – Part 1( Original Motion Picture Soundtrack) is the soundtrack album to the film,.",
"It is the fourth soundtrack in the saga's chronology, and was released on November 8, 2011.",
"The soundtrack was once again produced by Alexandra Patsavas, the music director for the previous three films.",
"The track list for the album was revealed on September 26, 2011, followed by the release of the album's lead single the following day."
],
[
"The Twilight Saga: Breaking Dawn – Part 2( commonly referred to as Breaking Dawn: Part 2) is a 2012 American romantic drama fantasy film directed by Bill Condon and based on the novel\" Breaking Dawn\" by Stephenie Meyer.",
"Constituting the second of a two- part adaptation of the novel, the film is the fifth and final installment in\" The Twilight Saga\" film series, following 2011's.",
"All three main cast members, Kristen Stewart, Robert Pattinson, and Taylor Lautner, reprise their roles, with Mackenzie Foy portraying Renesmee Cullen.",
"Alongside Pattinson, Lautner, Stewart, and Foy, the film also stars an ensemble cast including Billy Burke, Peter Facinelli, Elizabeth Reaser, Kellan Lutz, Nikki Reed, Jackson Rathbone, Ashley Greene, Michael Sheen, and Dakota Fanning.",
"\" Part 2\" was released on November 16, 2012.",
"The film, despite mixed critical reception, was a box office success, grossing nearly$ 830 million worldwide against production budget of$ 136 million, becoming the sixth- highest- grossing film of 2012 and the highest- grossing film of\" The Twilight Saga\" series."
],
[
"The Twilight Saga: Breaking Dawn – Part 2( Original Motion Picture Soundtrack) is the soundtrack album to the film, released on November 13, 2012.",
"It is the fifth soundtrack in the saga's chronology, and it was once again produced by Alexandra Patsavas, the music director for the previous three films.",
"The track list for the album was revealed on October 4, 2012, which included the announcement of the album's lead single.",
"The album sold 229,000 copies in the US in 2012, making it the third best- selling soundtrack album of the year.",
"It has sold 303,000 copies as of April 2013."
],
[
"The Twilight Saga: New Moon (commonly referred to as New Moon) is a 2009 American romantic fantasy film based on Stephenie Meyer's 2006 novel \"New Moon\".",
"It is the second film in \"The Twilight Saga\" film series and is the sequel to 2008's \"Twilight\".",
"Summit Entertainment greenlit the sequel in late November 2008, following the early success of \"Twilight\".",
"Directed by Chris Weitz, the film stars Kristen Stewart, Robert Pattinson, and Taylor Lautner, reprising their roles as Bella Swan, Edward Cullen, and Jacob Black, respectively.",
"Melissa Rosenberg, who handed in a draft of the film script during the opening weekend of \"Twilight\", returned as screenwriter for \"New Moon\" as well.",
"Filming began in Vancouver in late March 2009, and in Montepulciano, Italy in late May 2009.",
"The film was released on November 20, 2009 in most countries, and set domestic box office records as the biggest midnight screening, grossing $26.3 million, which was superseded by its sequel, .",
"This led to the highest single-day, domestic gross on an opening day, with $72.7 million, until it was beaten in 2011 by \"Harry Potter and the Deathly Hallows – Part 2\", which made $91.1 million, by nearly $20 million.",
"Furthermore, \"New Moon\" opened with the third-highest domestic opening weekend since 2002 grossing a total of $142,839,137.",
"The film also became the highest-grossing film released by Summit Entertainment, and was the widest independent release, playing in over 4,100 theaters in its theatrical run, until it was surpassed by \"The Twilight Saga: Eclipse\".",
"\"New Moon\" was released on DVD and Blu-ray Disc on March 20, 2010 through midnight release parties.",
"As of July 2012, the film has grossed $184,916,451 in North American DVD sales, selling more than 8,835,501 units, 4 million of which were sold within its first weekend, beating \"Twilight\"s 3.8 million units sold in its first two days.",
"The film was well received by fans, but received generally negative reviews from critics."
],
[
"This is a list of the cast members from",
"The Twilight Saga film series, which is based on the novels by Stephenie Meyer.",
"The main stars of the films are Kristen Stewart as Bella Swan, Robert Pattinson as Edward Cullen, and Taylor Lautner as Jacob Black.",
"\" Twilight\"( 2008) is based on the\" New York Times\" best selling novel of the same name( 2005) and was directed by Catherine Hardwicke.",
"The second film,( 2009) is based on the first book's sequel( 2006).",
"It was directed by Chris Weitz.",
"The third film,, directed by David Slade, was released on June 30, 2010.",
"and is based on the third installment in the series( 2007).",
"The filming of Breaking Dawn pt 1 started on November 1, 2010.",
"The Twilight Saga: Breaking Dawn – Part 1( commonly referred to as Breaking Dawn – Part 1) released in theatres on November 18, 2011, and released to DVD on February 11, 2012 in the United States.",
"The film grossed over$ 712 million worldwide.",
"The Twilight Saga: Breaking Dawn – Part 2( commonly referred to as Breaking Dawn – Part 2) was released on November 16, 2012, by Lionsgate in the United States, in consequence of the merger between Lionsgate and Summit Entertainment.",
"The film( 101 days in release) was a box- office success, grossing over$ 829 million worldwide, becoming the 34th highest- grossing film, the 6th highest- grossing film of 2012 and the highest- grossing film of the Twilight series.",
"Key"
],
[
"The Twilight Saga: Breaking Dawn – Part 1( commonly referred to as Breaking Dawn: Part 1) is a 2011 American romantic fantasy film directed by Bill Condon, based on the novel\" Breaking Dawn\" by Stephenie Meyer.",
"Constituting the first of a two- part adaptation of the novel, the film is the fourth and penultimate installment in\" The Twilight Saga\" film series, the sequel to in 2010, and was followed by in 2012.",
"All three main cast members, Kristen Stewart, Robert Pattinson, and Taylor Lautner, reprised their roles.",
"Wyck Godfrey and Karen Rosenfelt served as producers of the film, along with the author of the series, Stephenie Meyer; the screenplay was written by Melissa Rosenberg, the screenwriter of the first three entries.",
"It was released in theaters on November 18, 2011, and released to DVD on February 11, 2012, in the United States.",
"Although negatively reviewed by critics, the film was commercially successful, grossing over$ 712 million worldwide."
],
[
"The Twilight Saga: Eclipse, commonly referred to as Eclipse, is a 2010 American romantic fantasy film based on Stephenie Meyer's 2007 novel\" Eclipse\".",
"It is the third installment of\" The Twilight Saga\" film series, following 2008's\" Twilight\" and 2009's.",
"Summit Entertainment greenlit the film in February 2009.",
"Directed by David Slade, the film stars Kristen Stewart, Robert Pattinson, and Taylor Lautner, reprising their roles as Bella Swan, Edward Cullen, and Jacob Black, respectively.",
"Melissa Rosenberg, who penned the scripts for both\" Twilight\" and\" New Moon\", returned as screenwriter.",
"Filming began on August 17, 2009, at Vancouver Film Studios, and finished in late October, with post-production began early the following month.",
"Bryce Dallas Howard was cast as Victoria, replacing Rachelle Lefevre who previously played her.",
"The film was released worldwide on June 30, 2010 in theatres, and became the first\" Twilight\" film to be released in IMAX.",
"The film has received mixed reception from critics.",
"It held the record for biggest midnight opening in the United States and Canada in box office history, grossing an estimated$ 30 million, until it was surpassed by\" Harry Potter and the Deathly Hallows – Part 2\" in 2011.",
"The film then scored the biggest Wednesday opening in the United States and Canada history with$ 68,533,840 beating\"'s\"$ 62 million.",
"\" Eclipse\" has also become the film with the widest independent release, playing in over 4,416 theaters, surpassing its predecessor,\" The Twilight Saga: New Moon\", which held the record since November 2009."
],
[
"Unseen Forces is an extant 1920 silent film drama directed by Sidney Franklin and starring Sylvia Breamer and Conrad Nagel.",
"It was long thought to be lost until a copy was located in New Zealand.",
"It was produced by the same production company that produced\" The Miracle Man\" the year before, Mayflower Productions.",
"Excellent print held by New Zealand archive with copy repatriated to Library of Congress."
]
],
"title": [
"The Twilight Saga: Breaking Dawn – Part 2",
"The Wonderful World of Captain Kuhio",
"The Twilight Saga: Breaking Dawn – Part 1 (soundtrack)",
"The Twilight Saga: Breaking Dawn – Part 2",
"The Twilight Saga: Breaking Dawn – Part 2 (soundtrack)",
"The Twilight Saga: New Moon",
"List of Twilight cast members",
"The Twilight Saga: Breaking Dawn – Part 1",
"The Twilight Saga: Eclipse",
"Unseen Forces"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"Unseen Forces",
"The Twilight Saga: Breaking Dawn – Part 1"
]
},
"type": "comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film was released more recently, Unseen Forces or The Twilight Saga: Breaking Dawn – Part 1?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Twilight Saga: Breaking Dawn – Part 1",
"the twilight saga:Breaking Dawn",
"The Twilight Saga: Breaking Dawn – Part 1"
]
},
"style": "rule"
}
|
{
"index": 67,
"split": "test"
}
|
dev_4114
|
Who was born earlier, Kothapalli Jayashankar or Roberto Magris?
|
[
"Kothapalli Jayashankar"
] |
{
"context": {
"content": [
[
"Kothapalli Jayashankar.",
"popularly known as Professor Jayashankar (6 August 1934 – 21 June 2011)was an Indian academic and Social Activist.",
"He was a leading ideologue of Telangana Movement..",
"He has been fighting for a separate state since 1952.",
"He often stated that unequal distribution of river water was the root cause of separate Telangana movement.",
"He was the former Vice-Chancellor of Kakatiya University and an activist in the Separate Telangana Movement.",
"Prof. Jayashankar Telangana State Agricultural University (PJTSAU) named in honour and memory of Professor Jayashankar, is the only Farm University of Telangana state which came into being in the event of the bifurcation from Acharya N. G. Ranga Agricultural University.",
"worked as principal in ckm arts and science college warangal"
],
[
"Roberto de Noya, O.P. or Roberto de Noja( died 1515) was a Roman Catholic prelate who served as Archbishop of Naxos( 1504 – 1515), Bishop of Acerra( 1497 – 1504), and Bishop of Minervino Murge( 1492– 1497)."
],
[
"Roberto Roberti, O.P. or Roberto Vittori( 1575 – 1624) was a Roman Catholic prelate who served as Bishop of Tricarico( 1611 – 1624)."
],
[
"Robert Guiscard( – 17 July 1085) was a Norman adventurer remembered for the conquest of southern Italy and Sicily.",
"Robert was born into the Hauteville family in Normandy, went on to become Count of Apulia and Calabria( 1057 – 1059), and then Duke of Apulia and Calabria and Duke of Sicily( 1059 – 1085), and briefly Prince of Benevento( 1078 – 1081) before returning the title to the Pope.",
"His sobriquet, in contemporary Latin and Old French, is often rendered\" the Resourcefulthe Cunningthe Wilythe Fox\", or\" the Weasel\".",
"In Italian sources he is often Roberto II Guiscardo or Roberto d' Altavilla( from Robert de Hauteville)."
],
[
"Roberto Magris( born June 19, 1959) is an Italian jazz pianist.",
"He was born in Trieste, Italy, but is currently based in the United States.",
"In addition to being a bandleader and arranger, Magris is a composer, concert performer, and recording artist who has issued over 32 albums under his own name or as co-leader and has toured in over 42 different countries in Europe, America, Asia, Africa and Australia, performing at top jazz clubs, theatres and festivals around the world.",
"Magris began performing professionally in the late 1970s.",
"Throughout his career, he has performed and recorded with a list of jazz legends and outstanding musicians, including Art Davis, Herb Geller,\" Tootie\" Heath, Idris Muhammad, Ira Sullivan, Sam Reed, Brian Lynch, Tony Lakatos, Mark Colby, Eric Hochberg, Chuck Bergeron, Paul Carr, Franco Ambrosetti, Philip Catherine, Bill Molenhof, Janusz Muniak, Bosko Petrovic, Kai Winding, Eddie\" Lockjaw\" Davis, Sal Nistico, and with a bunch of rising stars such as Hermon Mehari, Logan Richardson, Kendall Moore, Brandon Lee, Eric Jacobson, Shareef Clayton, Jim Mair, Matt Otto, Elisa Pruett, Peter Schlamb, Dominique Sanders, Brian Steever, Alfredo Chacon, Rodolfo Zuniga, Pablo Sanhueza and others.",
"Since 2006, Magris has mainly focused his musical interests in the United States by becoming the leading artist and musical director of Kansas City- based JMood Records label, releasing 17 CDs under his name.",
"In the United States, Magris has performed in Los Angeles( The Jazz Bakery, Catalina Jazz Club in Hollywood), Kansas City( The Blue Room), Miami( The WDNA Jazz Gallery, FIU Music Festival at Wertheim Performing Arts Center), Chicago( The Jazz Showcase), Milwaukee( The Jazz Estate), St. Louis, Omaha, Des Moines, Ft. Lauderdale.",
"In 2007 he was honored by the City Council of Kansas City for his contribution to jazz music and performance at the American Jazz Museum.",
"In the 1980s he led the jazz trio Gruppo Jazz Marca and recorded three albums,\" Comunicazione Sonora\"( 1982),\" Aria Di Città\"( 1983) and\" Mitteleuropa\"( 1986), re-issued in 2006 by the English Label Arision.",
"In 1987 he founded his Italian quartet, which operated for nearly 20 years, performing tours in Europe, Asia, Australia and South America and recording two albums,\" Life in Israel\"( issued by Jazzis) and\" Maliblues\".",
"In the 1990s he founded the acid- jazz groups DMA Urban Jazz Funk and Alfabeats Nu Jazz, performing in Europe and in America and recording two albums,\" Up To The Beat\"( 2003) and\" Stones\"( 2006).",
"In 1998 Magris formed the Europlane Orchestra( a central- European jazz venture sponsored by INCE- CEI Central European Initiative) including some outstanding European jazz musicians.",
"With the Europlane Orchestra he recorded 3 albums,\" Live At ZooestPlays Kurt Weill\" with Austrian singer Ines Reiger, and\" Current Views\"( issued by Soul Note) with Philip Catherine, Bill Molenhof, Roberto Ottaviano a.o.",
"In 2005 Magris partnered with Hungarian saxophonist Tony Lakatos and Austrian saxophonist Michael Erian on the album\" Check- In\"( issued by Soul Note).",
"In 2006 Magris collaborated with bassist Art Davis and drummer Jimmy\" Junebug\" Jackson on the album\" Kansas City Outbound\"( issued by JMood).",
"That same year, he collaborated with alto saxophonist Herb Geller on the album\" Il Bello del Jazz\"( issued by Soul Note) and some years later, the JMood Records released a more album with Herb Geller entitled\" An Evening With Herb Geller& The Roberto Magris Trio- Live in Europe 2009\".",
"In the United States, Magris has also recorded two albums in tribute to the late trumpeter Lee Morgan,\" Morgan Rewind: A Tribute to Lee Morgan, Volumes 1 and 2\", two albums devoted to the music of pianist Elmo Hope, and another to the legacy of alto saxophonist Julian\" Cannonball\" Adderley.",
"He has also recorded one album in Los Angeles with Idris Muhammad and saxmen Paul Carr and Michael O' Neill,\" Mating Call\", one album with Philadelphia jazz icon Sam Reed,\" Sam Reed Meets Roberto Magris- Ready For Reed\", one double CD set in tribute to the historical be -bop era,\" Aliens In A Bebop Planet\", three albums with his trio from Kansas City including bassist Dominique Sanders, drummer Brian Steever, and percussionist Pablo Sanhueza(\" EnigmatixNeed To Bring Out LoveWorld Gardens\"), the critically acclaimed\" Roberto Magris Sextet Live in Miami@ The WDNA Jazz Gallery\" with a stellar group including Grammy Award winner Brian Lynch.,\" Sun Stone\" featuring jazz legend Ira Sullivan and his most recent masterpiece\" Suite!\", a 2 Cd set recorded in Chicago in 2018, with Mark Colby, Eric Jacobson, Erich Hochberg, Greg Artry and PJ Aubree Collins.",
"In Europe, Magris has been performing at festivals and jazz clubs with the MUH Trio( Roberto Magris/ Frantisek Uhlir/ Jaromir Helesic Trio), which has its base in Prague and has recorded the album\" Prague After Dark\"( issued by JMood).",
"Critic Brian Morton on\" Jazz Journal\" called Magris “ the most interesting bop/ post- bop pianist on the current European scene. ”",
"Critic Edward Blanco at\" All AboutJazz\" called Magris\" one of the finest piano players on the planet.\"",
"Noted jazz critic Ira Gitler wrote, “ As a pianist,[ Magris] reflects some of his most admired models- Wynton Kelly, Tommy Flanagan, Bill Evans, Kenny Drew, Jaki Byard, Randy Weston, McCoy Tyner, Andrew Hill, Paul Bley, Don Pullen and Steve Kuhn-( a varied group indeed) in his own way.",
"There is a quick, supple right hand and the rich harmonies of the two- handed transition in his playing that do n’t necessarily summon up instant comparisons to any of the above or others. ”",
"Magris has performed concerts for jazz piano, orchestra, and strings with the Big Band Ritmo Sinfonica from Verona and the Orchestra Giovanile del Veneto, recording the album\" Restless Spirits\"( issued by Velut Luna)."
],
[
"Antal\" Tony\" Lakatos( born 13 November 1958 in Budapest) is a Hungarian Jazz saxophonist( tenor, soprano saxophone), who currently lives in Frankfurt( Germany).",
"Lakatos attended the Béla Bartók Conservatory in Budapest from 1975 to 1980, then moved to Germany, where he worked with's band, Uwe Kropinski, Jasper van't Hof, and Wolfgang Haffner.",
"He has been leading his own ensembles since the late 1980s, including a group called Things; his sidemen have included JoAnne Brackeen, Terri Lynne Carrington, Al Foster, Billy Hart, and Anthony Jackson.",
"He has also worked with Randy Brecker, Kevin Mahogany, George Mraz, the Mingus Big Band, Chris Hinze, Kirk Lightsey, Dusko Goykovich, Michael Sagmeister, Roberto Magris, Art Farmer, and Kenny Werner."
],
[
"Joël Henry( born 1955) is a French journalist who was born in Strasbourg."
],
[
"Carlos Roberto de Oliveira, known as Roberto or Roberto Dinamite( born 13 April 1954) is a Brazilian former footballer and politician.",
"He was born in Duque de Caxias, Rio de Janeiro state.",
"With a career as centre forward spanning over twenty years, Roberto is Vasco da Gama's player with the most appearances and all- time top-scorer, as well as the overall leading scorer in the Brazilian Série A.",
"At the national level, Roberto played in the 1978 and 1982 FIFA World Cups and the 1972 Olympic Games.",
"He was president of Vasco da Gama from 2008 to 2014."
],
[
"Roberto Costa Cabral or Roberto Costa, as he was known,( born December 8, 1954 in Santos) is a former football player from Brazil.",
"He played goalkeeper( height: 1,85 m), mainly with Atlético Paranaense.",
"He twice received the Bola de Ouro award( Golden Ball in English) given to the best player in the Campeonato Brasileiro( Brazilian championship).",
"He has one cap with the Brazilian national team."
],
[
"Roberto Alejandro Avellanet Negrón( born June 30, 1975 in San Juan), known professionally as Robert Avellanet or Roberto Avellanet, is a Puerto Rican singer who was a member of\" Menudo\"."
]
],
"title": [
"Kothapalli Jayashankar",
"Roberto de Noya",
"Roberto Roberti (bishop)",
"Robert Guiscard",
"Roberto Magris",
"Tony Lakatos",
"Joël Henry (journalist)",
"Roberto Dinamite",
"Roberto Costa",
"Robert Avellanet"
]
},
"supporting_facts": {
"sent_id": [
1,
0
],
"title": [
"Kothapalli Jayashankar",
"Roberto Magris"
]
},
"type": "comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Who was born earlier, Kothapalli Jayashankar or Roberto Magris?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Kothapalli Jayashankar"
]
},
"style": "rule"
}
|
{
"index": 68,
"split": "test"
}
|
dev_832
|
Who is Ailwyn Fellowes, 1St Baron Ailwyn's paternal grandfather?
|
[
"William Fellowes",
"William Henry Fellowes"
] |
{
"context": {
"content": [
[
"Richard John Warburton Lytton(\" né\" Warburton; 26 August 1745 – 29 December 1810) was an English landowner and member of the Lytton family.",
"He was the father of Elizabeth Barbara Lytton, and the grandfather of Henry Bulwer, 1st Baron Dalling and Bulwer and Edward Bulwer- Lytton, 1st Baron Lytton."
],
[
"Lt.- Col. Ronald Townshend Fellowes, 2nd Baron Ailwyn",
"( 7 December 1886 – 30 August 1936) was a British peer, the son of Ailwyn Edward Fellowes, 1st Baron Ailwyn.",
"He succeeded to the Barony on 23 September 1924.",
"He lived at Honingham Hall in Norfolk until he sold it in 1935.",
"He married Mildred King on 21 August 1916.",
"He died, without issue, on 30 August 1936 at age 49, from wounds received in the First World War."
],
[
"William Henry Fellowes, 2nd Baron de Ramsey( 16 May 1848 – 8 May 1925), was a British Conservative politician.",
"De Ramsey was the eldest son of Edward Fellowes, 1st Baron de Ramsey, and Hon. Mary Julia Milles.",
"Ailwyn Fellowes, 1st Baron Ailwyn, was his younger brother.",
"He purchased a commission as a cornet in the Life Guards on 16 March 1867; he retired as a captain on 21 July 1877.",
"He succeeded his father as Member of Parliament for Huntingdonshire in 1880, a seat he held until 1885 when the constituency was abolished, and then represented Ramsey until August 1887, when he inherited the barony on his father's death and entered the House of Lords.",
"In 1890 de Ramsey was appointed a Lord- in- waiting( government whip in the House of Lords) in the Conservative administration of Lord Salisbury, a post he held until the Conservative defeat in the 1892 general election."
],
[
"Edward Fellowes, 1st Baron de Ramsey DL (14 May 1809 – 9 August 1887) was a British Conservative Member of Parliament.",
"De Ramsey was the son of William Henry Fellowes, of Ramsey Abbey in Huntingdonshire, and Emma Benyon.",
"He was elected to the House of Commons for Huntingdonshire in 1837, a seat he held for 43 years, until 1880.",
"In July 1887, only a month before his death, he was raised to the peerage as Baron de Ramsey, of Ramsey Abbey in the County of Huntingdon.",
"His seat was Haveringland Hall.",
"Lord de Ramsey married, in 1845, Hon.",
"Mary Julia Milles, daughter of George Milles, 4th Baron Sondes.",
"He died in August 1887, aged 78, and was succeeded in the barony by his eldest son William Henry Fellowes.",
"The Dowager Lady de Ramsey died 10 April 1901.",
"Their younger son Ailwyn Fellowes was a Conservative politician, elevated to the peerage as Baron Ailwyn in 1921."
],
[
"Carol Arthur Fellowes, 4th Baron Ailwyn, TD( 23 November 1896 – 27 September 1988) was a British peer, the son of Ailwyn Edward Fellowes, 1st Baron Ailwyn.",
"He was known as the Honourable Carol Fellowes from 1921, when his father was raised to the peerage, until he succeeded to the barony on 23 March 1976."
],
[
"Ailwyn Edward Fellowes, 1st Baron Ailwyn (10 November 1855 – 23 September 1924), was a British businessman, farmer and Conservative politician.",
"He was a member of Arthur Balfour's cabinet as President of the Board of Agriculture between March and December 1905."
],
[
"Richard Colley Wesley, 1st Baron Mornington (– 31 January 1758) was an Irish peer, best remembered as the grandfather of Arthur Wellesley, 1st Duke of Wellington."
],
[
"Captain Eric William Edward Fellowes, 3rd Baron Ailwyn( 24 November 1887 – 23 March 1976) was a British peer, the son of Ailwyn Edward Fellowes, 1st Baron Ailwyn.",
"He succeeded to the Barony on 30 August 1936.",
"Fellows was educated at Stubbington House School and at HMS\" Britannia\".",
"He married Cecil Lorna Barclay( d.1976), on 5 June 1935.",
"In 1942 he was a member of British Parliamentary Mission to China, and from 1943 to 1948 he was President of the China Association.",
"He died, without issue, on 23 March 1976."
],
[
"William Henry Fellowes( 15 July 1769 – 23 August 1837), of Ramsey Abbey in Huntingdonshire and Haverland Hall in Norfolk, was a British Member of Parliament.",
"Fellowes was the son of William Fellowes and Lavinia Smyth.",
"He was elected to the House of Commons for Huntingdon in 1796, a seat he held until 1807, and then represented Huntingdonshire from 1807 to 1830.",
"Fellowes married Emma Benyon, daughter of Richard Benyon MP.",
"Their eldest surviving son Edward Fellowes was elevated to the peerage as Baron de Ramsey in 1887.",
"Their third son, Richard, was Conservative MP for Berkshire.",
"Two of their grandsons, William Fellowes, 2nd Baron de Ramsey, and Ailwyn Fellowes, 1st Baron Ailwyn, were both Conservative government ministers.",
"Fellowes died in August 1837, aged 68.",
"Another grandson, James Herbert Benyon, was Lord Lieutenant of Berkshire."
],
[
"John Ailwyn Fellowes, 4th Baron de Ramsey( born 27 February 1942) is a British landowner, agriculturalist, and the first chairman of the Environment Agency."
]
],
"title": [
"Richard Warburton Lytton",
"Ronald Fellowes, 2nd Baron Ailwyn",
"William Fellowes, 2nd Baron de Ramsey",
"Edward Fellowes, 1st Baron de Ramsey",
"Carol Fellowes, 4th Baron Ailwyn",
"Ailwyn Fellowes, 1st Baron Ailwyn",
"Richard Wesley, 1st Baron Mornington",
"Eric Fellowes, 3rd Baron Ailwyn",
"William Henry Fellowes",
"John Fellowes, 4th Baron de Ramsey"
]
},
"supporting_facts": {
"sent_id": [
0,
1
],
"title": [
"Ailwyn Fellowes, 1st Baron Ailwyn",
"Edward Fellowes, 1st Baron de Ramsey"
]
},
"type": "inference"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Who is Ailwyn Fellowes, 1St Baron Ailwyn's paternal grandfather?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"William Fellowes",
"William Henry Fellowes"
]
},
"style": "rule"
}
|
{
"index": 69,
"split": "test"
}
|
dev_1512
|
Are both Pender Harbour Water Aerodrome and Kangiqsujuaq (Wakeham Bay) Airport located in the same country?
|
[
"yes"
] |
{
"context": {
"content": [
[
"Kangiqsujuaq( Wakeham Bay) Airport,, is located southeast of Kangiqsujuaq, Quebec, Canada."
],
[
"Put- in- Bay Airport is a public airport located in Put- in- Bay on South Bass Island in Ottawa County, Ohio, United States.",
"The approach to Runway 21 is complicated by Perry's Victory and International Peace Memorial, which is almost exactly in line with the runway.",
"The airport is located in the center of the southern half of the island.",
"Put-in- Bay's airport is open year- round for arrivals and departures from dawn to dusk.",
"Runways at the Put- in- Bay airport are modern and paved but not lighted so there are no departures after dusk.",
"You may expect light turbulence on approach when winds exceed 10 knots.",
"Be sure to remain at least 1000 feet from Perry's monument located 1.5 miles northeast of the field and having an elevation of 942 feet MSL.",
"There is no fuel service at the Put- in- Bay airport but may be purchased at Erie- Ottawa Regional Airport in Port Clinton.",
"The Put- in- Bay airport provides charter services, refreshments( vending) and rest rooms.",
"Credit cards are accepted."
],
[
"South West Bay Airport is an airport in Southwest Bay, Vanuatu."
],
[
"Bamfield Water Aerodrome,, is an airport located adjacent to Bamfield, British Columbia, Canada."
],
[
"Scammon Bay Airport is a state- owned public- use airport located in Scammon Bay, a city in the Kusilvak Census Area of the U.S. state of Alaska."
],
[
"Streaky Bay Airport is an airport located southeast of Streaky Bay, a town in the state of South Australia in Australia."
],
[
"Goose Bay Airport is a public airport located 14 miles north of the central business district( CBD) of Anchorage, Alaska, USA."
],
[
"Coal Harbour Water Aerodrome is located at the village of Coal Harbour, British Columbia, Canada."
],
[
"Tofino Harbour Water Aerodrome is located in Tofino Harbour adjacent to Tofino, British Columbia, Canada."
],
[
"Pender Harbour Water Aerodrome,, is located adjacent to Pender Harbour, British Columbia, Canada."
]
],
"title": [
"Kangiqsujuaq (Wakeham Bay) Airport",
"Put-in-Bay Airport",
"South West Bay Airport",
"Bamfield Water Aerodrome",
"Scammon Bay Airport",
"Streaky Bay Airport",
"Goose Bay Airport (Alaska)",
"Coal Harbour Water Aerodrome",
"Tofino Harbour Water Aerodrome",
"Pender Harbour Water Aerodrome"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"Pender Harbour Water Aerodrome",
"Kangiqsujuaq (Wakeham Bay) Airport"
]
},
"type": "comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Are both Pender Harbour Water Aerodrome and Kangiqsujuaq (Wakeham Bay) Airport located in the same country?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"yes"
]
},
"style": "rule"
}
|
{
"index": 70,
"split": "test"
}
|
dev_10392
|
Which film came out first, ...And God Created Them or The Little Shop Of Horrors?
|
[
"Little Shop of Horrors",
"The Little Shop of Horrors",
"The Little Shop Of Horrors"
] |
{
"context": {
"content": [
[
"Last Woman on Earth( often referred to as\" The Last Woman on Earth\", but it appeared without the word\" The\" in the film's actual title card) is a 1960 American science fiction film that was produced and directed by Roger Corman.",
"It tells the story of three survivors of a mysterious apocalypse, which appears to have wiped out all human life on earth.",
"The screenplay is by Robert Towne, who also appears in the film under the pseudonym Edward Wain.",
"The music was composed and conducted by Ronald Stein.",
"The film was originally released as a double feature with\" The Little Shop of Horrors\"."
],
[
"And God Created Woman( 1956) is a French drama film directed by Roger Vadim and starring Brigitte Bardot.",
"Though not her first film, it is widely recognized as the vehicle that launched Bardot into the public spotlight and immediately created her\" sex kitten\" persona, making her an overnight sensation.",
"When the film was released in the United States by distributor Kingsley- International Pictures in 1957, it pushed the boundaries of the representation of sexuality in American cinema, and most available prints of the film were heavily edited to conform with the prevailing censorial standards of 1957.",
"An English- language remake also titled\" And God Created Woman\" was directed by Vadim and released in 1988."
],
[
"Please Do n't Eat My Mother is a 1973 exploitation film directed by Carl J. Monson.",
"It is an adult- themed parody of Roger Corman's\" The Little Shop of Horrors\"."
],
[
"The Little Shop of Horrors is a 1960 American comedy film directed by Roger Corman.",
"Written by Charles B. Griffith, the film is a farce about an inadequate florist's assistant who cultivates a plant that feeds on human blood.",
"The film's concept is thought to be based on a 1932 story called \"Green Thoughts\", by John Collier, about a man-eating plant.",
"However, Dennis McDougal suggests that Griffith may have been influenced by Arthur C. Clarke's sci-fi short story from 1956, \"The Reluctant Orchid\" (which was in turn inspired by the 1905 H. G. Wells story \"The Flowering of the Strange Orchid\").",
"The film stars Jonathan Haze, Jackie Joseph, Mel Welles, and Dick Miller, all of whom had worked for Corman on previous films.",
"Produced under the title \"The Passionate People Eater\", the film employs an original style of humor, combining black comedy with farce and incorporating Jewish humor and elements of spoof.",
"\"The Little Shop of Horrors\" was shot on a budget of $28,000 (about $240,000 in 2019), with interiors being shot in two days utilizing sets that had been left standing from \"A Bucket of Blood\".",
"The film slowly gained a cult following through word of mouth when it was distributed as the B movie in a double feature with Mario Bava's \"Black Sunday\" and later with \"Last Woman on Earth\".",
"The film's popularity increased with local television broadcasts, in addition to the presence of a young Jack Nicholson, whose small role in the film has been prominently promoted on home video releases of the film.",
"The film was the basis for an Off-Broadway musical, \"Little Shop of Horrors\", which was notably made into a 1986 feature film and enjoyed a 2003 Broadway revival, all of which have attracted attention to the 1960 film."
],
[
"And God Created Woman is a 1988 American comedy- drama film directed by Roger Vadim and starring Rebecca De Mornay, Vincent Spano and Frank Langella.",
"It has the same title as the 1956 French film\" Et Dieu… créa la femme And God created woman\") starring Brigitte Bardot, also directed by Vadim, but has a completely different story."
],
[
"Chamber of Horrors is a 1929 British silent horror film directed by Walter Summers and starring Frank Stanmore.",
"It was made at Welwyn Studios.",
"A man spends the night in the Chamber of Horrors of Madame Tussauds."
],
[
"God Created",
"Them( Spanish: Dios los cría) is a 1953 Mexican comedy film directed by Gilberto Martínez Solares and starring Germán Valdés, Niní Marshall and Marcelo Chávez."
],
[
"Little Shop of Horrors is a 1986 American horror comedy musical film directed by Frank Oz.",
"It is a film adaptation of the off- Broadway musical comedy of the same name by composer Alan Menken and writer Howard Ashman, which in turn was based on the 1960 film\" The Little Shop of Horrors\", directed by Roger Corman, about a geeky florist shop worker who finds out his Venus flytrap can speak.",
"The film stars Rick Moranis, Ellen Greene, Vincent Gardenia, Steve Martin, and Levi Stubbs as the voice of Audrey II.",
"The film also featured special appearances by Jim Belushi, John Candy, Christopher Guest and Bill Murray.",
"It was produced by David Geffen through The Geffen Company and released by Warner Bros. on December 19, 1986.",
"\" Little Shop of Horrors\" was filmed on the Albert R. Broccoli 007 Stage at the Pinewood Studios in England, where a\" downtown\" set, complete with overhead train track, was constructed.",
"Produced on a budget of$ 25 million, in contrast to the original 1960 film, which, according to Corman, only cost$ 30,000, it was well received by critics and audiences alike, eventually developing a cult following.",
"The film's original 23- minute finale, based on the musical's ending, was rewritten and reshot after test audiences did not react positively to it.",
"For years only available as black- and- white workprint footage, the original ending was fully restored in 2012 by Warner Home Video."
],
[
"Little Shop of Horrors may refer to:"
],
[
"... And God Created",
"Them is a 1979 Puerto Rican comedy film directed by Jacobo Morales.",
"It was screened in the Un Certain Regard section at the 1981 Cannes Film Festival."
]
],
"title": [
"Last Woman on Earth",
"And God Created Woman (1956 film)",
"Please Don't Eat My Mother",
"The Little Shop of Horrors",
"And God Created Woman (1988 film)",
"Chamber of Horrors (1929 film)",
"God Created Them",
"Little Shop of Horrors (film)",
"Little Shop of Horrors",
"...And God Created Them"
]
},
"supporting_facts": {
"sent_id": [
1,
0
],
"title": [
"...And God Created Them",
"The Little Shop of Horrors"
]
},
"type": "comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film came out first, ...And God Created Them or The Little Shop Of Horrors?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Little Shop of Horrors",
"The Little Shop of Horrors",
"The Little Shop Of Horrors"
]
},
"style": "rule"
}
|
{
"index": 71,
"split": "test"
}
|
dev_6939
|
Which film whose director is younger, Sujon Sokhi or Two Thousand Maniacs!?
|
[
"Sujon Sokhi"
] |
{
"context": {
"content": [
[
"2001",
"Maniacs is a 2005 American comedy horror film directed by Tim Sullivan and starring Robert Englund, Lin Shaye, Jay Gillespie, Dylan Edrington, and Matthew Carey.",
"It is a remake of the 1964 film\" Two Thousand Maniacs!\" written and directed by Herschell Gordon Lewis.",
"The film is distributed by Lions Gate Entertainment.",
"It was filmed in Westville, Georgia."
],
[
"Khan Ataur Rahman( known as Khan Ata; 11 December 1928 – 1 December 1997) was a Bangladeshi film actor, director, producer, screenplay writer, music composer, and singer.",
"He became renowned for his role in the film\" Jibon Theke Neya\"( 1970).",
"He received Bangladesh National Film Award for Best Screenplay for the film\" Sujon Sokhi\"( 1975) and\" Danpite Chhele\"( 1980).",
"He was awarded Ekushey Padak posthumously in 2003 by the Government of Bangladesh."
],
[
"Sujon Sokhi is a 1975 Bangladeshi film directed by Khan Ataur Rahman.",
"The film received critical acclaim, particularly for the performance given by Kabori Sarwar, who won Bachsas Award for Best Actress.",
"Both Sabina Yasmin and Abdul Alim earned Bangladesh National Film Awards for the song\" Saat Sokhire Par Korite\"."
],
[
"Two Thousand Maniacs! is a 1964 American splatter and hixploitation film written and directed by Herschell Gordon Lewis and starring 1963\" Playboy\" Playmate Connie Mason.",
"It follows a group of Northern tourists who are savagely tortured and murdered during a Confederate celebration of a small Southern community's centennial.",
"It was the second part of what the director's fans later dubbed\" The Blood Trilogy\", a trio of films starting with 1963's\" Blood Feast\" and ending with 1965's\" Color Me Blood Red\".",
"The film has been noted by critics as an early example of Southern caricature in grindhouse films, as well as for its sensationalizing of national perceptions between the North and South.",
"The film was remade in 2005 as\" 2001 Maniacs\".",
"The story of the film was inspired by the 1947 Lerner and Loewe musical\" Brigadoon\"."
],
[
"La Bestia humana is a 1957 Argentine film whose story is based on the novel\" La Bête Humaine\" by the French writer Émile Zola."
],
[
"Herschell Gordon Lewis( June 15, 1926 – September 26, 2016) was an American filmmaker, best known for creating the\" splatter\" subgenre of horror films.",
"He is often called the\" Godfather of Gore\"( a title also given to Lucio Fulci), though his film career included works in a range of exploitation film genres including juvenile delinquent films, nudie-cuties, two children's films and at least one rural comedy.",
"On Lewis' career, All Movie wrote,\" With his better- known gore films, Herschell Gordon Lewis was a pioneer, going farther than anyone else dared, probing the depths of disgust and discomfort onscreen with more bad taste and imagination than anyone of his era.\""
],
[
"2001",
"Maniacs:",
"Field of Screams is a 2010 comedy horror film written and directed by Tim Sullivan.",
"It is a sequel to his earlier film\" 2001 Maniacs\".",
"The story and characters are based on Herschell Gordon Lewis' original film\" Two Thousand Maniacs\".",
"The film stars Bill Moseley, Lin Shaye, Christa Campbell, Andrea Leon, and Nivek Ogre."
],
[
"Color Me Blood Red is a 1965 splatter film written and directed by Herschell Gordon Lewis about a psychotic painter who murders civilians and uses their blood as red paint.",
"It is the third part of what the director's fans have dubbed\" The Blood Trilogy\", including\" Blood Feast\"( 1963) and\" Two Thousand Maniacs!\"( 1964)."
],
[
"Connie Mason( born August 24, 1937 in Washington, D.C.) is an American model and actress who was\" Playboy\" magazine's Playmate of the Month for its June 1963 issue.",
"Mason then acted in the gore movies pioneered by Herschell Gordon Lewis,\" Blood Feast\" and\" Two Thousand Maniacs!\"",
"Her centerfold was photographed by Pompeo Posar.",
"She was also a Playboy Bunny at the Chicago club."
],
[
"Sushoma Sarkar is a Bangladeshi television and film actress who appeared in films like Sujon Sokhi( 1975), Gunda( 1978), Alangkar( 1979) etc."
]
],
"title": [
"2001 Maniacs",
"Khan Ataur Rahman",
"Sujon Sokhi",
"Two Thousand Maniacs!",
"La Bestia humana",
"Herschell Gordon Lewis",
"2001 Maniacs: Field of Screams",
"Color Me Blood Red",
"Connie Mason",
"Sushoma Sarkar"
]
},
"supporting_facts": {
"sent_id": [
0,
0,
0,
0
],
"title": [
"Sujon Sokhi",
"Two Thousand Maniacs!",
"Khan Ataur Rahman",
"Herschell Gordon Lewis"
]
},
"type": "bridge_comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film whose director is younger, Sujon Sokhi or Two Thousand Maniacs!?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Sujon Sokhi"
]
},
"style": "rule"
}
|
{
"index": 72,
"split": "test"
}
|
dev_4533
|
What is the place of birth of the director of film It'S The Rage (Film)?
|
[
"U.S.A.",
"American",
"US of A",
"US",
"United States of America",
"USA",
"the United States of America",
"United States",
"the US",
"🇺🇸",
"the United States",
"U.S.",
"America",
"the USA"
] |
{
"context": {
"content": [
[
"The Rage is a 2008 Italian film directed by Louis Nero."
],
[
"Peter Levin is an American director of film, television and theatre."
],
[
"Ismail Aden Rage( born 3 January 1953) is a Tanzanian CCM politician and Member of Parliament for Bukene constituency since 2010."
],
[
"Ian Barry is an Australian director of film and TV."
],
[
"It's the Rage is a 1999 film version of Keith Reddin's play \"All The Rage\" about three interconnected stories and how handguns affect each of the nine people involved.",
"The film is the directorial debut of producer James D. Stern."
],
[
"The Rage is a 1997 American action- thriller film directed by Sidney J. Furie."
],
[
"James D. Stern is an American film and Broadway producer.",
"He won a 2003 Tony Award for \"Hairspray\", has been nominated for many other awards, and was a Drama Desk Award Winner for \"Stomp\"."
],
[
"Brian Patrick Kennedy( born 5 November 1961) is an Irish- born art museum director who has worked in Ireland and Australia, and now lives and works in the United States.",
"He is currently the director of the Peabody Essex Museum.",
"He was the director of the Toledo Museum of Art in Ohio from 2010 to 2019.",
"He was the director of the Hood Museum of Art from 2005 to 2010, and the National Gallery of Australia( Canberra) from 1997- 2004."
],
[
"Stasija Rage( born May 9, 1991) is a Latvian former competitive figure skater.",
"She was born in Riga, and was the 2008 Latvian national champion.",
"She started skating at age four."
],
[
"Olav Aaraas( born 10 July 1950) is a Norwegian historian and museum director.",
"He was born in Fredrikstad.",
"From 1982 to 1993 he was the director of Sogn Folk Museum, from 1993 to 2010 he was the director of Maihaugen and from 2001 he has been the director of the Norwegian Museum of Cultural History.",
"In 2010 he was decorated with the Royal Norwegian Order of St. Olav."
]
],
"title": [
"The Rage (2008 film)",
"Peter Levin",
"Ismail Rage",
"Ian Barry (director)",
"It's the Rage (film)",
"The Rage (1997 film)",
"James D. Stern",
"Brian Kennedy (gallery director)",
"Stasija Rage",
"Olav Aaraas"
]
},
"supporting_facts": {
"sent_id": [
1,
0
],
"title": [
"It's the Rage (film)",
"James D. Stern"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: What is the place of birth of the director of film It'S The Rage (Film)?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"U.S.A.",
"American",
"US of A",
"US",
"United States of America",
"USA",
"the United States of America",
"United States",
"the US",
"🇺🇸",
"the United States",
"U.S.",
"America",
"the USA"
]
},
"style": "rule"
}
|
{
"index": 73,
"split": "test"
}
|
dev_722
|
Which film has the director died later, Taipei Story or The Ballad Of Cossack Golota?
|
[
"Taipei Story"
] |
{
"context": {
"content": [
[
"The Ballad of Uhlans(\" Уланская баллада\") is a 2012 Russian historical adventure film directed by Oleg Fesenko, and stars Sergei Bezrukov, Anna Chipovskaya, Anton Sokolov and Vladimir Gostyukhin.",
"\" The Ballad of Uhlans\" was released on 1 November 2012 in Russia."
],
[
"David Chung( September 6, 1946- April 14, 2006) was a Korean- American actor.",
"He was nominated for an Independent Spirit Award for his performance in the 1993 film\" The Ballad of Little Jo\"."
],
[
"Mary Zophres is an American costume designer who has worked in the film industry since 1994.",
"She has been nominated for the Academy Award for Best Costume Design thrice for\" True Grit\"( 2010),\" La La Land\"( 2016), and\" The Ballad of Buster Scruggs\"( 2018).",
"She has also been nominated for the BAFTA Award for Best Costume Design four times for\" Catch Me If You Can\"( 2002),\" True GritLa La Land\", and\" The Ballad of Buster Scruggs\"."
],
[
"Querelle is a 1982 West German- French English- language drama film directed by Rainer Werner Fassbinder and starring Brad Davis, adapted from French author Jean Genet's 1947 novel\" Querelle of Brest\".",
"It marked Fassbinder's final film as a writer/ director; it was posthumously released just months after the director died of a drug overdose in June 1982."
],
[
"Igor Andreyevich Savchenko or Ihor Andriyovych Savchenko( 11 October 1906 – 14 December 1950) was a screenwriter and film director, often cited as one of the great early Soviet filmmakers, alongside Sergei Eisenstein, Vsevolod Pudovkin and Aleksandr Dovzhenko.",
"He is also known for teaching Sergei Parajanov at the Russian film school VGIK, also attended by Parajanov's best friend Mikhail Vartanov."
],
[
"Edward Yang( November 6, 1947 – June 29, 2007) was a Taiwanese filmmaker.",
"Yang, along with fellow auteurs Hou Hsiao -hsien and Tsai Ming- liang, was one of the leading film- makers of the Taiwanese New Wave and Taiwanese Cinema.",
"He won the Best Director Award at Cannes for his 2000 film\" Yi Yi\"."
],
[
"Eskimo Nell, also known as\" The Ballad of Eskimo Nell\" and as\" The Sexy Saga of Naughty Nell and Big Dick\", is a 1975 British film directed by Martin Campbell and produced by Stanley Long.",
"Though inspired by\" The Ballad of Eskimo Nell\", the movie owes little to the original bawdy song."
],
[
"The Ballad of Little Joe is a 2003 American computer- animated western film produced by Big Idea Productions and released theatrically by Classic Media on August 5, 2003.",
"It is the twentieth episode of VeggieTales.",
"The story is based after 1993's\" The Ballad of Little Jo\", which came out ten years before.",
"It was directed by Tim Hodge, produced by David Pitts and written by Robert G. Lee, the film stars the voices of Phil Vischer, Mike Nawrocki, Jim Poole, Jacquelyn Ritz, Mike Sage( in his final acting role), Lisa Vischer, and Tim Hodge."
],
[
"Taipei Story is a 1985 Taiwanese film directed, scored, and co-written by filmmaker Edward Yang — his second full- length feature film and third overall.",
"The film stars Yang's fellow filmmaker Hou Hsiao -hsien, and singer Tsai Chin, whom Yang subsequently married.",
"It is one of the earliest films of the New Taiwanese Cinema.",
"In the United States, Janus Films gave a limited release of the film's 4 K restoration, done by the World Cinema Project, on March 17, 2017."
],
[
"The Ballad of Cossack Golota is a 1937 Soviet action drama film directed by Igor Savchenko."
]
],
"title": [
"The Ballad of Uhlans",
"David Chung (actor)",
"Mary Zophres",
"Querelle",
"Igor Savchenko",
"Edward Yang",
"Eskimo Nell (film)",
"The Ballad of Little Joe",
"Taipei Story",
"The Ballad of Cossack Golota"
]
},
"supporting_facts": {
"sent_id": [
0,
0,
0,
0
],
"title": [
"Taipei Story",
"The Ballad of Cossack Golota",
"Edward Yang",
"Igor Savchenko"
]
},
"type": "bridge_comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film has the director died later, Taipei Story or The Ballad Of Cossack Golota?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Taipei Story"
]
},
"style": "rule"
}
|
{
"index": 74,
"split": "test"
}
|
dev_58
|
Which country the director of film Daadagiri is from?
|
[
"pk",
"🇵🇰",
"Islamic Republic of Pakistan",
"Pakistan",
"PAK"
] |
{
"context": {
"content": [
[
"Ian Barry is an Australian director of film and TV."
],
[
"Daadagiri is a 1997 Bollywood film directed by Arshad Khan starring Mithun Chakraborty, Ayub Khan, Simran, Rituparna Sengupta in lead roles and Puneet Issar, Nishigandha Wad, Shakti Kapoor and Kader Khan."
],
[
"John Donatich is the Director of Yale University Press."
],
[
"Olav Aaraas( born 10 July 1950) is a Norwegian historian and museum director.",
"He was born in Fredrikstad.",
"From 1982 to 1993 he was the director of Sogn Folk Museum, from 1993 to 2010 he was the director of Maihaugen and from 2001 he has been the director of the Norwegian Museum of Cultural History.",
"In 2010 he was decorated with the Royal Norwegian Order of St. Olav."
],
[
"Brian Patrick Kennedy( born 5 November 1961) is an Irish- born art museum director who has worked in Ireland and Australia, and now lives and works in the United States.",
"He is currently the director of the Peabody Essex Museum.",
"He was the director of the Toledo Museum of Art in Ohio from 2010 to 2019.",
"He was the director of the Hood Museum of Art from 2005 to 2010, and the National Gallery of Australia( Canberra) from 1997- 2004."
],
[
"Michael Govan( born 1963) is the director of the Los Angeles County Museum of Art since 2006.",
"Prior to this, Govan worked as the director of the Dia Art Foundation in New York City."
],
[
"Arshad Khan (born March 22, 1971) is a Pakistani Australian cab driver and former cricketer.",
"He is a right-handed batsman and a right-arm off break bowler.",
"Arshad was first picked to play against the West Indies during the 1997–98 season, and the following year, was part of the team which won the Asian Test Championship at Dhaka against Sri Lanka.",
"He captained Pakistan in the 1998 Commonwealth Games Kuala Lumpur.",
"He was a regular inclusion in the Pakistani side until 2001.",
"Four years later, a strong performance in the Pakistani domestic championship meant that Arshad earned a recall for Pakistan's 2005 tour of India.",
"He performed credibly, particularly in the Bangalore Test, which Pakistan won in the last session to draw the series.",
"He toured the Caribbean in May 2005, and has retained his place for the upcoming England series.",
"During the 2005 One Day International series against England, Arshad was used in the second and fifth matches and proved effective at repressing the England batsmen, allowing very few runs to be scored off him and also taking wickets.",
"During the fifth match, his economy was just over 3 runs per over – a very good figure for any bowler, especially a spinner.",
"A tall man at 6'4\", Arshad bowls in a classical off-spinner's mould, preferring a nagging line to any great variation.",
"Khan now lives in Sydney, Australia."
],
[
"Peter Levin is an American director of film, television and theatre."
],
[
"Dana Blankstein- Cohen( born March 3, 1981) is the director of the Israeli Academy of Film and Television.",
"She is a film director, and an Israeli culture entrepreneur."
],
[
"John Farrell is the director of YouTube in Latin America."
]
],
"title": [
"Ian Barry (director)",
"Daadagiri",
"John Donatich",
"Olav Aaraas",
"Brian Kennedy (gallery director)",
"Michael Govan",
"Arshad Khan (cricketer)",
"Peter Levin",
"Dana Blankstein",
"John Farrell (businessman)"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"Daadagiri",
"Arshad Khan (cricketer)"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which country the director of film Daadagiri is from?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"pk",
"🇵🇰",
"Islamic Republic of Pakistan",
"Pakistan",
"PAK"
]
},
"style": "rule"
}
|
{
"index": 75,
"split": "test"
}
|
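The prompt field in each row above describes a tag-based interaction protocol: reasoning inside <think>, tool calls of the form <search> LLM-Name:Your-Query </search> restricted to a fixed roster of models, and a final <answer>. As a point of reference, the following is a minimal Python sketch of how a harness might extract and validate those tags from a single model turn; the function name, the regular expressions, and the return convention are illustrative assumptions, not part of the dataset.

```python
import re

# Roster of callable models, copied from the prompt text above.
ALLOWED_LLMS = {
    "Qwen2.5-7B-Instruct",
    "LLaMA-3.1-8B-Instruct",
    "LLaMA-3.1-70B-Instruct",
    "Mistral-7B-Instruct",
    "Mixtral-8x22B-Instruct",
    "Gemma-2-27B-Instruct",
}

# <search> LLM-Name:Your-Query </search>  and  <answer> ... </answer>
SEARCH_RE = re.compile(r"<search>\s*([^:<]+?)\s*:\s*(.+?)\s*</search>", re.DOTALL)
ANSWER_RE = re.compile(r"<answer>\s*(.+?)\s*</answer>", re.DOTALL)


def parse_turn(text: str):
    """Classify one model turn as a search call, a final answer, or malformed."""
    search = SEARCH_RE.search(text)
    if search and search.group(1) in ALLOWED_LLMS:
        return ("search", search.group(1), search.group(2))
    answer = ANSWER_RE.search(text)
    if answer:
        return ("answer", answer.group(1))
    return None  # neither a valid <search> call nor an <answer>
```

For example, parse_turn("<search> LLaMA-3.1-70B-Instruct:Who directed the film Daadagiri? </search>") would yield ("search", "LLaMA-3.1-70B-Instruct", "Who directed the film Daadagiri?").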
dev_5464
|
Where was the husband of Princess Maria Immaculata Of Bourbon-Two Sicilies (1844–1899) born?
|
[
"Fiuränza",
"Firenzi",
"Florentzia",
"Florentia",
"Florence, Italy",
"Florenz",
"Florencie",
"Florencia",
"Firense",
"Firenca",
"Florence",
"Firenz",
"Florence, Tuscany",
"Firenze"
] |
{
"context": {
"content": [
[
"Princess Maria Immaculata of Bourbon-Two Sicilies (Full Italian name: \"Maria Immacolata Clementina, Principessa di Borbone delle",
"Due Sicilie\")",
"(14 April 1844, Naples, Kingdom of the Two Sicilies – 18 February 1899, Vienna, Austria) was fifth child and second-eldest daughter of Ferdinand II of the Two Sicilies and his wife Maria Theresa of Austria.",
"Through her marriage to Archduke Karl Salvator of Austria, Maria Immaculata became an Austrian Archduchess."
],
[
"Princess Urraca of Bourbon- Two Sicilies( July 14, 1913, Nymphenburg Palace, Munich, Kingdom of Bavaria – May 3, 1999, Sigmaringen, Baden- Württemberg, Germany) was a member of the House of Bourbon- Two Sicilies and a Princess of Bourbon- Two Sicilies."
],
[
"Duchess Elisabeth of Württemberg( born 2 February 1933) is a member of the House of Württemberg and a Duchess of Württemberg by birth and through her marriage to Prince Antoine of Bourbon- Two Sicilies, a member of the House of Bourbon- Two Sicilies and a Princess of Bourbon- Two Sicilies."
],
[
"Prince Carlo of Bourbon- Two Sicilies, Duke of Castro( born 24 February 1963) is a claimant to the headship of the House of Bourbon- Two Sicilies."
],
[
"Archduchess María of Austria( née Princess María of Bourbon- Two Sicilies; born 5 April 1967) is a princess of Bourbon- Two Sicilies and the wife of Archduke Simeon of Austria."
],
[
"Archduke Karl Salvator of Austria, in Italian Carlo Salvatore Maria Giuseppe Giovanni Battista Filippo Jacopo Gennaro Lodovico Gonzaga Raniero, in German Karl Salvator Maria Joseph Johann Baptist",
"Philipp Jakob Januarius Ludwig Gonzaga Ranier (Florence, 30 April 1839 – Vienna, 18 January 1892)",
", was a member of the Tuscan branch of the House of Habsburg."
],
[
"Prince Philip of Bourbon- Two Sicilies( Filippo Maria Alfonso Antonio Ferdinando Francesco di Paola Lodovico Enrico Alberto Taddeo Francesco Saverio Uberto; 10 December 1885 – 9 March 1949) was a member of the House of Bourbon- Two Sicilies and a Prince of Bourbon- Two Sicilies."
],
[
"Leopoldo Giovanni Giuseppe Michele of Bourbon- Two Sicilies, Prince of Salerno( 2 July 1790, Naples, Kingdom of Naples – 10 March 1851, Naples, Kingdom of the Two Sicilies) was a member of the House of Bourbon- Two Sicilies and a Prince of Bourbon- Two Sicilies."
],
[
"Princess Maria Carolina of Bourbon- Two Sicilies, Duchess of Castro( née Countess Maria Carolina Zamoyska)( 22 September 1896 in Kraków, Austria- Hungary – 9 May 1968 in Marseille, France) was the wife of Prince Ranieri, Duke of Castro, claimant to the headship of the House of Bourbon- Two Sicilies.",
"Carolina was the sixth child and youngest daughter of Polish nobleman Andrzej Przemysław Zamoyski, Count Zamoyski and his wife Princess Maria Carolina of Bourbon- Two Sicilies.",
"Carolina's maternal grandparents were Prince Francis of Bourbon- Two Sicilies, Count of Trapani and his wife Archduchess Maria Isabella of Austria, Princess of Tuscany."
],
[
"Princess\" Maria Immaculata\" Cristina Pia Isabella of Bourbon- Two Sicilies( Full Italian name:\" Maria Immacolata Speranza Pia Teresa Cristina Filomena Lucia Anna Isabella Cecilia Apollonia Barbara Agnese Zenobia, Principessa di Borbone delle Due Sicilie\")( 30 October 1874 – 28 November 1947) was the fourth child and eldest daughter of Prince Alfonso of Bourbon- Two Sicilies, Count of Caserta and his wife Princess Maria Antonietta of Bourbon- Two Sicilies."
]
],
"title": [
"Princess Maria Immaculata of Bourbon-Two Sicilies (1844–1899)",
"Princess Urraca of Bourbon-Two Sicilies",
"Duchess Elisabeth of Württemberg (b. 1933)",
"Prince Carlo, Duke of Castro",
"Archduchess María of Austria (b. 1967)",
"Archduke Karl Salvator of Austria",
"Prince Philip of Bourbon-Two Sicilies",
"Leopold, Prince of Salerno",
"Maria Carolina Zamoyska",
"Princess Maria Immaculata of Bourbon-Two Sicilies (1874–1947)"
]
},
"supporting_facts": {
"sent_id": [
3,
1
],
"title": [
"Princess Maria Immaculata of Bourbon-Two Sicilies (1844–1899)",
"Archduke Karl Salvator of Austria"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Where was the husband of Princess Maria Immaculata Of Bourbon-Two Sicilies (1844–1899) born?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Fiuränza",
"Firenzi",
"Florentzia",
"Florentia",
"Florence, Italy",
"Florenz",
"Florencie",
"Florencia",
"Firense",
"Firenca",
"Florence",
"Firenz",
"Florence, Tuscany",
"Firenze"
]
},
"style": "rule"
}
|
{
"index": 76,
"split": "test"
}
|
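Each row's metadata stores the retrieved context as parallel title / content lists and marks the evidence with supporting_facts, given as parallel title / sent_id lists; the indices appear to be 0-based sentence positions within the matching passage. A minimal sketch, assuming Python and that field layout, of resolving the supporting facts back to their sentences (the helper name is an assumption):

```python
def supporting_sentences(metadata: dict) -> list:
    """Resolve supporting_facts (title, sentence index) pairs to the cited sentences.

    Assumes the layout shown in the rows above: metadata["context"] holds
    parallel "title" / "content" lists (one sentence list per passage), and
    metadata["supporting_facts"] holds parallel "title" / "sent_id" lists
    with 0-based sentence indices.
    """
    context = metadata["context"]
    sentences_by_title = dict(zip(context["title"], context["content"]))
    facts = metadata["supporting_facts"]
    return [
        sentences_by_title[title][sent_id]
        for title, sent_id in zip(facts["title"], facts["sent_id"])
    ]
```

Applied to the dev_5464 row above, this would return the sentence stating that Maria Immaculata married Archduke Karl Salvator and the sentence giving his birth in Florence on 30 April 1839, which together support the listed gold answers.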
dev_2143
|
Who is the paternal grandfather of Johann, Count Of Cleves?
|
[
"Dietrich VI, Count of Cleves"
] |
{
"context": {
"content": [
[
"Dietrich VIII( – 7 July 1347) was a German nobleman.",
"He was Count of Cleves from 1310 through 1347.",
"Dietrich was the son of Dietrich VII, Count of Cleves and his second wife Margaret of Habsburg.",
"He succeeded in 1310 his half- brother Otto, Count of Cleves who had died without sons.",
"Dietrich VIII married twice:",
"He was succeeded by his brother John.< br>",
"When his brother also died without a son, the County of Cleves went to his daughter's son Adolph III of the Marck and so to the Counts of Marck."
],
[
"Dietrich IX, Count of Mark( 1374 – 1398) was the Count of Mark from 1393 until 1398.",
"Dietrich was the second son of Count Adolf III of the Marck and Margaret of Jülich.",
"His father had acquired the County of Cleves in 1368 and reserved this title for his eldest son Adolph to succeed him after his death.",
"Dietrich already received the title of Count of Mark in 1393, when his father was still alive.",
"When Dietrich fell in battle in 1398, he was succeeded by his elder brother Adolph, who had become Count of Cleves in 1394.",
"Thus the County of Mark and the County of Cleves were reunited again."
],
[
"Adolph II of the Marck( died 19/20 October 1347, Fröndenberg) was Count of the Marck.",
"He was the eldest son of Engelbert II of the Mark and Mechtild of Arenberg.",
"Adolph was betrothed to Irmgard of Cleves, daughter of Otto, Count of Cleves and his( second) wife Mechtild of Virneburg.",
"After obtaining a papal dispensation Adolf was allowed to marry Margaret of Cleves, the daughter of Dietrich VIII, Count of Cleves and Margaret of Guelders.",
"Adolph and Margaret of Cleves had seven children:"
],
[
"Johann was last Count of Cleves, from 1347 through 1368.",
"Upon his death in 1368, the counties of Cleves and Count of Mark were united.",
"Johann was the youngest son of Dietrich VII, Count of Cleves and his second wife Margaret of Habsburg.",
"He succeeded in 1347 his brother Dietrich VIII, Count of Cleves who had died without sons.",
"He married Mechteld of Guelders (",
"† 1384), daughter of Reginald II, Duke of Guelders, but the marriage remained childless.",
"After his death, the County of Cleves went to Adolf III of the Marck and so to the Counts of Marck."
],
[
"Adolph I of Cleves( 2 August 1373 – 23 September 1448) was the second Count of Cleves and the fourth Count of Mark."
],
[
"Mary of Burgundy, Duchess of Cleves( 1393 – 30 October 1466) was the second child of John the Fearless and Margaret of Bavaria, and an elder sister of Philip the Good.",
"Born in Dijon, she became the second wife of Adolph, Count of Mark in May 1406.",
"He was made the 1st Duke of Cleves in 1417.",
"They were the grandparents of King Louis XII of France and the great- grandparents of John III, Duke of Cleves, father of Anne of Cleves, who was fourth Queen consort of Henry VIII of England.",
"By their daughter, Catherine, they were ancestors of Mary, Queen of Scots.",
"The Duke and Duchess of Cleves lived at Wijnendale Castle in West Flanders.",
"She died in Cleves in present- day Monterberg, Kalkar."
],
[
"John I, Duke of Cleves, Count of Mark( 16 February 1419 – 5 September 1481) was Duke of Cleves and Count of Mark."
],
[
"Karl Harst( fl. 1540) was a sixteenth- century diplomat from the German Duchy of Cleves.",
"He was the Duke of Cleves' representative in London during the marriage of Henry VIII of England and his fourth wife, Anne of Cleves."
],
[
"Dietrich VII (1256–1305) was Count of Cleves from 1275 through 1305.",
"He was the son of Dietrich VI, Count of Cleves and his wife Aleidis von Heinsberg.",
"The County of Cleves was a comital polity of the Holy Roman Empire in present Germany (part of North Rhine-Westphalia) and the Netherlands (parts of Limburg, Noord-Brabant and Gelderland).",
"Its rulers, called counts, had a special and privileged standing in the Empire.",
"The County of Cleves was first mentioned in the 11th century.",
"In 1417, the county became a duchy and its rulers were raised to the status of Dukes.",
"Its history is closely related to that of its neighbours: the Duchies of Jülich, Berg and Guelders and the County of Mark.",
"In 1368, Cleves and Mark were united.",
"In 1521 Jülich, Berg, Cleves and Mark formed the United Duchies of Jülich-Cleves-Berg.",
"The territory was situated on both sides of the river Rhine, around its capital Cleves and roughly covering today's districts of Cleves, Wesel and the city of Duisburg."
],
[
"Elisabeth of Cleves was the daughter of Adolph I, Duke of Cleves and Marie of Burgundy, Duchess of Cleves.",
"She married Count Henry XXVI of Schwarzburg- Blankenburg, on July 15, 1434."
]
],
"title": [
"Dietrich VIII, Count of Cleves",
"Dietrich IX, Count of Mark",
"Adolph II of the Marck",
"Johann, Count of Cleves",
"Adolph I, Duke of Cleves",
"Mary of Burgundy, Duchess of Cleves",
"John I, Duke of Cleves",
"Karl Harst",
"Dietrich VII, Count of Cleves",
"Elisabeth of Cleves, Countess of Schwarzburg-Blankenburg"
]
},
"supporting_facts": {
"sent_id": [
2,
1
],
"title": [
"Johann, Count of Cleves",
"Dietrich VII, Count of Cleves"
]
},
"type": "inference"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Who is the paternal grandfather of Johann, Count Of Cleves?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Dietrich VI, Count of Cleves"
]
},
"style": "rule"
}
|
{
"index": 77,
"split": "test"
}
|
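The reward_model block couples "style": "rule" with a target list that enumerates acceptable surface forms of the gold answer, including alias spellings. Below is a minimal Python sketch of the kind of rule-based string match such a target list suggests; the normalization steps are illustrative assumptions rather than the dataset's official scorer.

```python
import re
import string


def normalize(text: str) -> str:
    """Lowercase, drop punctuation and English articles, collapse whitespace."""
    text = text.lower()
    text = text.translate(str.maketrans("", "", string.punctuation))
    text = re.sub(r"\b(a|an|the)\b", " ", text)
    return " ".join(text.split())


def rule_reward(answer: str, targets: list) -> float:
    """Score 1.0 if the extracted <answer> matches any ground-truth target, else 0.0."""
    answer_norm = normalize(answer)
    return 1.0 if any(normalize(target) == answer_norm for target in targets) else 0.0
```

For the dev_2143 row above, rule_reward("Dietrich VI, Count of Cleves", ["Dietrich VI, Count of Cleves"]) evaluates to 1.0.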
dev_10438
|
Who is the father of the performer of song Theme From New York, New York?
|
[
"Vincente Minnelli",
"Lester Anthony Minnelli"
] |
{
"context": {
"content": [
[
"John Harold Kander( born March 18, 1927) is an American composer, known largely for his work in the musical theater.",
"As part of the songwriting team Kander and Ebb( featuring lyricist Fred Ebb), Kander wrote the scores for 15 musicals, including\" Cabaret\"( 1966) and\" Chicago\"( 1975), both of which were later adapted into acclaimed films.",
"He and Ebb also wrote the standard\" New York, New York\"( also known as\" Theme from\" New York, New York)."
],
[
"Frank H. Fleet( 1848 in New York, New York – June 13, 1900 in New York, New York) was a Major League Baseball player in the 19th century."
],
[
"Chris Duffy( born December 17, 1987 in New York, New York) is a minor league baseball player who is currently a free agent."
],
[
"The title song's theme is\" a girl's motivation\"."
],
[
"Liza May Minnelli (born March 12, 1946) is an American actress and singer.",
"Best known for her Academy Award-winning performance in \"Cabaret\" (1972), she is famous for her energetic stage presence and her powerful mezzo-soprano singing voice.",
"She is the daughter of Judy Garland and Vincente Minnelli.",
"In 1972, she became the only Oscar recipient whose parents also won Oscar awards.",
"Seeking theatrical work, Minnelli moved to New York City in 1961, where she began her career as a musical theatre actress, nightclub performer and traditional pop music artist.",
"She made her professional stage debut in the 1963 Off-Broadway revival of \"Best Foot Forward\" and won a Tony Award for starring in \"Flora the Red Menace\" in 1965, which marked the start of her lifelong collaboration with John Kander and Fred Ebb.",
"They wrote, produced or directed many of Minnelli's future stage acts and TV shows and helped create her stage persona of a stylized survivor, including her career-defining performances of anthems of survival (\"New York, New YorkCabaret\" and \"Maybe This Time\").",
"Along with her roles on stage and screen, this persona and her style of performance added to Minnelli's status as an enduring gay icon.",
"Critically lauded for her early non-musical screen performances—especially \"The Sterile Cuckoo\" (1969)—Minnelli rose to international stardom, starring in \"Cabaret\" and the Emmy Award-winning TV special \"Liza with a Z\" (1972).",
"Most of her following films—including \"Lucky Lady\" (1975), \"New York, New York\" (1977),",
"\"Rent-a-Cop\" (1988) and \"Stepping Out\" (1991)—were panned by the critics and bombed at the box office, and she had no more major movie hits except \"Arthur\" (1981).",
"She returned to Broadway on a number of occasions, including \"The Act\" (1977), \"The Rink\" (1984) and",
"\"Liza's at The Palace...\" (2008), worked on various television formats and has predominantly focused on music hall and nightclub performances since the late 1970s.",
"Her concert performances at Carnegie Hall in 1979 and 1987, and at Radio City Music Hall in 1991 and 1992 are recognized among her most successful.",
"From 1988 to 1990, she toured with Frank Sinatra and Sammy Davis Jr. in \"Frank, Liza & Sammy:",
"The Ultimate Event\".",
"Best known for her renditions of pop standards, Minnelli's early 1960s pop singles were produced to attract a young audience, and her albums from 1968 to 1977 contained much of the contemporary singer-songwriter material.",
"In 1989, she ventured into the contemporary pop scene by collaborating with the Pet Shop Boys on the album \"Results\".",
"After a hiatus due to serious health problems, Minnelli returned to the concert stage in 2002 with \"Liza's Back\" and was an acclaimed guest star in the sitcom \"Arrested Development\" between 2003 and 2013.",
"Since the 2010s, she has avoided huge concert tours in favor of small retrospective performances."
],
[
"Joey Scarbury( born June 7, 1955) is an American singer and songwriter best known for his hit song,\" Theme from The Greatest American Hero( Believe It or Not)\", in 1981."
],
[
"\"Theme from \"New York, New York (or \"New York, New York\") is the theme song from the Martin Scorsese film \"New York, New York\" (1977), composed by John Kander, with lyrics by Fred Ebb.",
"It was written for and performed in the film by Liza Minnelli.",
"It remains one of the best-known songs about New York City.",
"In 2004 it finished #31 on AFI's 100 Years...",
"100 Songs survey of top tunes in American Cinema."
],
[
"Christina Kirk( born November 10, 1970) in New York, New York, is an American actress."
],
[
"\" New York, New York( So Good They Named It Twice)\" is a song performed and composed by singer- songwriter Gerard Kenny in 1978.",
"The song is an ode to his hometown and state New York, New York.",
"Kenny wrote the song about myth versus fact and explains that the city is overall a great place(\" It was a really nice night for a good street fight or a robbery/ But I always knew in my home town that would never happen to me\").",
"The line\" So Good They Named It Twice\" is a play on the fact the name of the city and state are both\" New York\".",
"\" New York, New York\"( as it is sometimes simply called) was the first hit for Kenny after a number of non-charting singles in the United States.",
"In 1978, after moving to England, the song was released.",
"Although it only reached number 42 on the UK Singles Chart, it remained there for two months.",
"It also received some airplay, and was included on Kenny's\" Made It Through The Rain\" album, and the song's popularity helped the collection reach the top 20 in the UK Albums Chart.",
"Shirley Bassey later covered the song, as part of her\" New York Medley\", along with\" Theme from New York, New York\".",
"The song was included on the following albums released by Kenny:",
"The song was also parodied in the Beautiful South song\" Liar's Bar\" in which there was the line\" singing:' whisky, whiskey ... so good they named it twice'\".",
"On the 1959 album by George Russell,\" New York, NY\"( Decca Records), on the first track, Rodgers and Hart's\" Manhattan\", vocalist Jon Hendricks says at 0:44\" ... a city so nice, they had to name it twice ...\""
],
[
"Eric Wong, born in New York, New York, is the Chief Operating Officer of Island Records."
]
],
"title": [
"John Kander",
"Frank Fleet",
"Christopher Duffy (baseball)",
"Ima ni Kitto... In My Life",
"Liza Minnelli",
"Joey Scarbury",
"Theme from New York, New York",
"Christina Kirk",
"New York, New York (So Good They Named It Twice)",
"Eric Wong"
]
},
"supporting_facts": {
"sent_id": [
1,
2
],
"title": [
"Theme from New York, New York",
"Liza Minnelli"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Who is the father of the performer of song Theme From New York, New York?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Vincente Minnelli",
"Lester Anthony Minnelli"
]
},
"style": "rule"
}
|
{
"index": 78,
"split": "test"
}
|
dev_4291
|
Does Shunya Yoshimi have the same nationality as Lily Van Den Broecke?
|
[
"no"
] |
{
"context": {
"content": [
[
"He is a professor at the Interfaculty Initiative in Information Studies, the University of Tokyo.",
"He is one of the most influential cultural sociologists in Japan and best known for helping introduce cultural studies to Japan.",
"His research topics include cultural studies, media studies and urban sociology.",
"Shunya Yoshimi has authored several books on subjects such as cultural theory, urban culture, international exposition, media culture, information technology, the emperor system, and the Americanization of modern Japan and East Asia.",
"Yoshimi's works in Japanese include\" Dramaturgy in the City: A Social History of Popular Entertainments in Modern Tokyo\"( Kobundo, 1987),\" The Politics of Exposition: Imperialism, Commercialism and Popular Entertainment\"( Chuokoronsha, 1992),\" Cultural Sociology in the Media Age\"( Shinyosha, 1994),\" Voice of Capitalism: The Social Construction of Telephone, Gramophone and Radio in Japan\"( Kodansha, 1995),\" Expo Syndrome: Postwar Politics and Cultural Struggle in Postwar Japan\"( Chikuma Shobo, 2005) and\" Pro- America, Anti- America: Political Unconsciousness in Postwar Japan\"( Iwanami Shoten, 2007).",
"He was the dean of the Interfaculty Initiative in Information Studies, the University of Tokyo( 2006- 2008) and currently a vice president of the University of Tokyo."
],
[
"Franciscus van den Enden( c. 5 February 1602 in Antwerp- 27 November 1674 in Paris) was a former Jesuit, Neo- Latin poet, physician, art dealer, philosopher, and plotter against Louis XIV of France, who is mainly known as the teacher of Baruch de Spinoza( 1632- 1677).",
"His name is also written as' Van den Ende',' Van den Eijnde',' Van den Eijnden', etc.",
"At the end of his life he was also known as' Affinius'( Latinized form of' Van den Enden')."
],
[
"Maarten Albert van den Bergh( born 19 April 1942 in New York City) is a Dutch businessman.",
"Van den Bergh is the son of Maria Meijers( 1905 – 1957) and Sidney James van den Bergh, long- term chairman of Unilever and Dutch Minister of Defense in 1959.",
"his grandfather Samuel van den Bergh was one of the founders of Unilever.",
"Van den Bergh spent 32 years at Royal Dutch Shell, where he rose to president, and to vice- president of the committee of managing directors.",
"In 2001, he left Shell to become chairman of Lloyds TSB, a financial services company based in the United Kingdom.",
"Board memberships:",
"In 2005, The Times named him the most powerful businessman in Great Britain."
],
[
"Henri\" Hans\" van den Broek( born 11 December 1936) is a retired Dutch politician and diplomat of the defunct Catholic People's Party( KVP) and later the Christian Democratic Appeal( CDA) party and jurist.",
"He was granted the honorary title of Minister of State on 25 February 2005.",
"Van den Broek applied at the Utrecht University in June 1959 majoring in Law and obtaining an Bachelor of Laws degree in June 1961 before graduating with an Master of Laws degree in July 1965.",
"Van den Broek worked as a lawyer and prosecutor at\" Blom& Dutilh\" in Rotterdam from August 1965 until May 1968 and as a corporate director at the synthetic fiber company in Arnhem from May 1968 until 12 October 1976.",
"Van den Broek served on the Municipal Council of Rheden from 1 September 1970 until 21 August 1974.",
"Van den Broek became a Member of the House of Representatives after the resignation of Theo van Schaik, taking office on 12 October 1976.",
"After the election of 1981 Van den Broek was appointed as State Secretary for Foreign Affairs in the Cabinet Van Agt II, taking office on 11 September 1981.",
"The Cabinet Van Agt II fell just seven months into its term on 12 May 1982 and continued to serve in a demissionary capacity until it was replaced by the caretaker Cabinet Van Agt III with Van den Broek continuing as State Secretary for Foreign Affairs, taking office on 29 May 1982.",
"After the election of 1982 Van den Broek returned as a Member of the House of Representatives, taking office on 16 September 1982.",
"Following the cabinet formation of 1982 Van den Broek appointed as Minister of Foreign Affairs in the Cabinet Lubbers I, taking office on 4 November 1982.",
"After election of 1986 Van den Broek again returned as a Member of the House of Representatives, taking office on 3 June 1986.",
"Following cabinet formation of 1986 Van den Broek continued as Minister of Foreign Affairs in the Cabinet Lubbers II, taking office on 14 July 1986.",
"The Cabinet Lubbers II fell on 3 May 1989 and continued to serve in a demissionary capacity.",
"After the election of 1989 Van den Broek once again returned as a Member of the House of Representatives, taking office on 14 September 1989.",
"Following the cabinet formation of 1989 Van den Broek remained as Minister of Foreign Affairs in the Cabinet Lubbers III, taking office on 7 November 1989.",
"In November 1992 Van den Broek was nominated as the next European Commissioner from the Netherlands.",
"Van den Broek was giving the heavy portfolios of External Relations and Enlargement in the Third Delors Commission, he resigned as a Minister of Foreign Affairs on 3 January 1993 and was installed as European Commissioner, taking office on 6 January 1993.",
"In December 1994 Van den Broek was re-nominated as European Commissioner.",
"He kept the portfolios of External Relations and Enlargement and got the additional portfolio of Neighbourhood Policy in the Santer Commission, serving from 23 January 1995 until 16 September 1999.",
"Van den Broek retired after spending 23 years in national politics and became active in the private sector and public sector and occupied numerous seats as a corporate director and nonprofit director on several boards of directors and supervisory boards( Institute of International Relations Clingendael, Carnegie Foundation, Radio Netherlands Worldwide and the Schiphol Group) and served as an diplomat and lobbyist for several economic delegations on behalf of the government and as an advocate and activist for Human rights, European integration and Nuclear disarmament.",
"Van den Broek is known for his abilities as a negotiator and debater.",
"Van den Broek continued to comment on political affairs as a statesman until his retirement in 2010 and holds the distinction as the second longest- serving Minister of Foreign Affairs with after World War II.",
"His youngest daughter Princess Marilène is married to Prince Maurits, the oldest son of Princess Margriet the younger sister of former Queen Beatrix."
],
[
"Pieter van den Broecke( 25 February 1585, Antwerp – 1 December 1640, Strait of Malacca) was a Dutch cloth merchant in the service of the Dutch East India Company( VOC), and one of the first Dutchmen to taste coffee.",
"He also went to Angola three times.",
"He was one of the first Europeans to describe societies in West and Central Africa and in detail trade strategies along the African coast."
],
[
"Willem van den Broecke, Willem van den Broeck or Guillelmus Paludanus( 1530 – 1579) was a Flemish sculptor, painter, draughtsman and architect.",
"He was a scion of a family of artists, which had its origins in Mechelen and some members of which later moved to Antwerp.",
"Willem was active in Antwerp and also worked and likely trained in Italy for a long period.",
"He was known for his small scale works, many of which were executed in alabaster.",
"He was, along with Cornelis Floris, the leading sculptor in Antwerp in the second half of the 16th century.",
"He was also one of the designers of the Antwerp City Hall.",
"He enjoyed the patronage of an elite international clientele of church institutions, Spanish nobility, princes of Protestant territories as well as patricians from Augsburg."
],
[
"Mavis Akuamoah Boateng( born September 11, 1977), better known as Lily M, is a Ghanaian Afropop musician and an activist."
],
[
"Lily Jacoba van den Broecke( born 8 January 1992) is a British rower who competed at the 2012 Summer Paralympics as the coxswain in the mixed coxed four for Great Britain, and won the gold medal."
],
[
"Jorien van den Herik( born 11 November 1943 in Sliedrecht) is a Dutch business man and was chairman of Feyenoord Rotterdam till 2006.",
"Van den Herik helped Feyenoord to purchase some players in 1984 as chairman of the\" Friends of Feyenoord Group\".",
"In a few years time he had invested 3 million Dutch guldens in the club.",
"In 1989 be became president of the Stadion Feyenoord NV.",
"When Feyenoord almost was bankrupt due to their sponsor HCS, Van den Herik claimed his position in the Feyenoord board by saying\" If I lose my money, I would like to be there myself\".",
"In 1992, he became Feyenoord's chairman by taking over from.",
"In 1998 the Fiscal Intelligence and Investigation Service( FIOD) visited Feyenoord because of suspected fraud.",
"Mainly based on the signings of Aurelio Vidmar, Christian Gyan and Patrick Allotey.",
"This became a continuous subject in the years to come with Van den Herik as the main suspect.",
"After several visits in court Van den Herik and Feyenoord were found to be not guilty in 2002.",
"The Attorney General however appealed, but in 2005, after three years of investigations the results stayed the same.",
"They still did not give up and went into cassation.",
"On 19 December 2006 Van den Herik and Feyenoord came out of court as the winners again and this time it was the final decision.",
"Van den Herik has already said he will take the Attorney General to court to reclaim the damage his personality has received in all these years.",
"The 2005/06 season ended in disappointment for Feyenoord.",
"The team was challenging the Dutch Championship for most of the season, but eventually lost it to PSV Eindhoven.",
"The newly born Dutch play- offs( held for the first time that year) then became fatal for Feyenoord.",
"Ajax Amsterdam which ended on a fair distance in the regular league was Feyenoord's opponent in the play- offs and overclassed them and Feyenoord was eliminated for winning an UEFA Champions League ticket.",
"During the summer break Feyenoord had to sell two of their key players, Salomon Kalou and the very popular Dirk Kuyt.",
"Only a small amount of the money became available to invest in new players.",
"When Angelos Charisteas was signed from arch rivals Ajax to replace Dirk Kuyt hundreds of die- hard fans moved to the Feyenoord board to force Van den Herik to resign.",
"Van den Herik stayed at the team and did not want to give up his position, unless the growing pressure, the underperforming of the team and several other problems that had not been solved yet.",
"In the meanwhile Feyenoord was placed under special guardianship because of their financial problems.",
"A group of some Feyenoord icons, with former chairman Gerard Kerkum as their leader, operated under the name of\" Commissie Kerkum\" to restructure Feyenoord's future.",
"In December 2006 they published their thoughts and came to the conclusion that Van den Herik should leave the club.",
"Still Van den Herik did not want to leave, however after hooligan incidents in Nancy and the growing pressure he announced his retirement in mid December.",
"Van den Herik himself said not the thoughts of the Commissie Kerkum were the key factor for his decision, but the upcoming final result in the fraud case was.",
"After the case was finished in December",
"2006- Van den Herik was acquitted- he stepped down.",
"In the summer transfer window of 2007, Feyenoord surprised friend and foe with the signing of big names like Roy Makaay and Giovanni van Bronckhorst.",
"According to player manager Rob Jansen, Van den Herik played an important role in the realization of the transfers."
],
[
"Kelly Rutherford( November 6, 1968) is an American actress.",
"She is known for her television roles as Stephanie\" Sam\" Whitmore on the NBC daytime soap opera\" Generations\"( 1989 – 1991), as Megan Lewis on the Fox primetime soap opera\" Melrose Place\"( 1996 – 1999), and as Lily van der Woodsen on The CW series\" Gossip Girl\"( 2007 – 2012)."
]
],
"title": [
"Shunya Yoshimi",
"Franciscus van den Enden",
"Maarten van den Bergh",
"Hans van den Broek",
"Pieter van den Broecke",
"Willem van den Broecke",
"Lily M",
"Lily van den Broecke",
"Jorien van den Herik",
"Kelly Rutherford"
]
},
"supporting_facts": {
"sent_id": [
1,
0
],
"title": [
"Shunya Yoshimi",
"Lily van den Broecke"
]
},
"type": "comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Does Shunya Yoshimi have the same nationality as Lily Van Den Broecke?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"no"
]
},
"style": "rule"
}
|
{
"index": 79,
"split": "test"
}
|
dev_2647
|
Who died first, Gustav Philipp Mörl or Giuseppe Diotti?
|
[
"Gustav Philipp Mörl",
"Gustav Philipp Morl",
"Gustavus Philippus Moerl"
] |
{
"context": {
"content": [
[
"Giuseppe Diotti (1 March 1779 – 30 January 1846) was an Italian painter of the Neoclassic style."
],
[
"\" Nobile\" Giuseppe Maria Buonaparte or Giuseppe Maria di Buonaparte( 31 May 1713, Ajaccio – 13 December 1763, Ajaccio) was an Italian politician.",
"He was the son of Sebastiano Nicola Buonaparte and his wife Maria Anna Tusoli( 1690 – 1760).",
"In 1749, Giuseppe was the Delegate who represented the City of Ajaccio in the Council of Corte."
],
[
"Giuseppe Rottario or\"' Giuseppe Roverio\"( 1657–1720) was a Roman Catholic prelate who served as Bishop of Alba( 1697 – 1720)."
],
[
"Joseph or Giuseppe Ruffini (1690 - February 7, 1749) was an Italian-Austrian painter, mainly active in Germany."
],
[
"Johann Sigismund",
"Mörl( March 3, 1710 – February 22, 1791) was a German theologian.",
"Son of Gustav Philipp",
"Mörl, he was born at Nuremberg March 3, 1710, and was educated in his native place until ready for the university at Altdorf, where he studied theology after 1727.",
"In 1735 he was appointed dean of a church at Nuremberg.",
"He preached until 1759, when he was appointed minister and inspector of the\" Egidianum.\"",
"In 1765 he was elected in this gymnasium to the professorship of Greek.",
"Towards the close of 1770 he was called to the position of minister of St. Lawrence's church.",
"In 1773 he accepted the position of first minister at St. Sebald's church, the superintendency of the consistory of Nuremberg, the office of city librarian, and also a professorship of positive divinity and moral philosophy.",
"He died February 22, 1791.",
"Besides several contributions to the\" Hachische Allgeneine Welthistorie\" and the\" Antideistische Bibel\"( Erlangen, 1768), to which he contributed a new computation of time from the exodus of the Jews to the time of Solomon, he published\" Scholia philologica et critica ad selecta S. Codicis loca\"( Norimb. 1737, 8vo; improved ed. by Wilder, ibid. 1793, 8vo): —\" Schediasma philologico- geographicum, in quo Jo. Harduini disquisitio de situ Paradisi terrestris examinatur\"( ibid. 1750, fol.): —\" Oratio de meritis Norimbergensium in Geographiam\"( ibid. 1750, 8 vo)."
],
[
"Giuseppe Pamphilj( also Giuseppe Panfili or Giuseppe Panphili)( 1525 – 20 November 1581) was a Roman Catholic prelate who served as Bishop of Segni( 1570 – 1581)."
],
[
"Gustav Philipp",
"Koerner, also spelled Gustave or Gustavus Koerner( 20 November 1809 – 9 April 1896) was a revolutionary, journalist, lawyer, politician, judge, and statesman in Illinois and Germany and a Colonel of the U.S. Army who was a confessed enemy of slavery.",
"He married on 17 June 1836 in Belleville Sophia Dorothea Engelmann( 16 November 1815 – 1 March 1888), they had 9 children.",
"He belonged to the co-founders and was one of the first members of the Grand Old Party; and he was a close confidant of Abraham Lincoln and his wife Mary Todd and had an essential role in his nomination and election for president in 1860."
],
[
"Gustav Philipp Mörl or Gustav Philipp Morl; Gustavus Philippus",
"Moerl( December 26,",
"1673 – May 7, 1750) was a German theologian, was born at Nuremberg December 26, 1673, and was educated first in the schools of his native place and then at the university in Altdorf, where he studied philosophy and philology from 1690 to 1692, when he was removed to Jena to study theology and the ancient languages.",
"He traveled through Holland, and visited its most important universities.",
"After his return home he was appointed assistant of the philosophic faculty at Halle, and in 1698 became professor and ecclesiastical inspector at Altdorf.",
"He resigned this position in 1703, and was appointed dean of St. Sebald's church at Nuremberg.",
"In 1706 he was appointed minister of the St. Aegidien church, and inspector of the gymnasium; in 1714 minister at St. Lawrence, in connection with which he had the supervision of the ecclesiastical seminary.",
"In 1724 he was appointed minister of the church of St. Sebald, superintendent of the consistory of Nuremberg, city librarian, and professor of divinity of the Aegidische gymnasium.",
"He died May 7, 1750.",
"Besides several dissertations in journals, he published\" Diss. de distinctione essentiali agnoscenda et attributis\"( Jenae, 1694, 4to): —\" Diss. continens theses miscellaneas\"( Halae, 1694, 4to): —\" Diss. continens vindicationem regularem methodi Cartesianae\"( ibid. 1694, 4to): —\" Diss. i et ii de mense humana\"( ibid. 1696, 1697, 4to): —\" Repetitio doctrinae orthodoxae de fundamento fidei, occasione disputationis",
"Halensis de questione:",
"An haeresis sit crimen?\"( ibid. 1696, 4to): —\" Defensio repetitionis hujus adversus Chr. Thomasium\"( ibid. 1697, 4to): —\" Disquisitio defide, occasione epistolae ad Chr. Thomasium scriptae\"( ibid. 1698, 4to): —\" Diss. de modo dirigendi omnes actiones nostras ad gloriam Dei\"( Altdorf, 1701, 4to): —\" Vindiciae doctrinae Lutheranae de gratia proedestinationis\"( Norimb. 1702, 8vo): —\" Die Lehre von der Busse, in\" 122\" Predigten, nach den Lehrsatzen und Texten entworfen\"( Nurnberg, 1711, 4to): —\" Ordnung der Knige in Juda und Israel, in einer Tabelle\"( ibid. 1740, fol.):",
"—\" Sterbeschule; eine Sammlung von Predigtentwurfen\"( ibid. 1743, 1744, 2 volumes, fol.)."
],
[
"Joseph Bossi or Giuseppe Bossi( born 29 August 1911- date of death unknown) was a Swiss footballer who played for Switzerland in the 1934 FIFA World Cup.",
"He also played for FC Lausanne- Sport, FC Bern 1894, and CA Paris- Charenton."
],
[
"Gustav Philipp Zwinger( 3 January 1779 – 15 January 1819), painter and etcher, was born at Nuremberg.",
"He was the son and pupil of the painter and engraver Sigmund Zwinger( 1744 — 1813), now chiefly remembered as a teacher.",
"Gustav completed his studies under Heinrich Füger in Vienna, and returning to his native town, became in turn professor and director of the Art School.",
"He was also known as an historical painter, both in oil and watercolour, and as a designer of book illustrations.",
"He etched a few plates and tried his hand at lithography.",
"He died at Nuremberg."
]
],
"title": [
"Giuseppe Diotti",
"Giuseppe Maria Buonaparte",
"Giuseppe Rottario",
"Joseph Ruffini",
"Johann Sigismund Mörl",
"Giuseppe Pamphilj (bishop of Segni)",
"Gustav Koerner",
"Gustav Philipp Mörl",
"Joseph Bossi",
"Gustav Philipp Zwinger"
]
},
"supporting_facts": {
"sent_id": [
2,
0
],
"title": [
"Gustav Philipp Mörl",
"Giuseppe Diotti"
]
},
"type": "comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Who died first, Gustav Philipp Mörl or Giuseppe Diotti?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Gustav Philipp Mörl",
"Gustav Philipp Morl",
"Gustavus Philippus Moerl"
]
},
"style": "rule"
}
|
{
"index": 80,
"split": "test"
}
|
dev_12144
|
Which film has the director who was born first, Hell Up In Harlem or The Soviet Story?
|
[
"Hell Up In Harlem",
"Hell Up in Harlem"
] |
{
"context": {
"content": [
[
"The Soviet Story is a 2008 documentary film about Soviet Communism and Soviet – German collaboration before 1941 written and directed by Edvīns Šnore and sponsored by the UEN Group in the European Parliament.",
"The film features interviews with western and Russian historians such as Norman Davies and Boris Sokolov, Russian writer Viktor Suvorov, Soviet dissident Vladimir Bukovsky, members of the European Parliament and the participants, as well as survivors of Soviet terror.",
"Using these interviews together with historical footage and documents the film argues that there were close philosophical, political and organizational connections between the Nazi and Soviet systems.",
"It highlights the Great Purge as well as the Great Famine, Molotov – Ribbentrop Pact, Katyn massacre, Gestapo –NKVD collaboration, Soviet mass deportations and medical experiments in the GULAG.",
"The documentary goes on to argue that the successor states to Nazi Germany and the USSR differ in the sense that postwar Germany condemns the actions of Nazi Germany while the opinion in contemporary Russia is summarized by the quote of Vladimir Putin:\" One needs to acknowledge, that the collapse of the Soviet Union was the greatest geopolitical catastrophe of the century\"."
],
[
"Frederick Robert Williamson( born March 5, 1938), also known as The Hammer, is an American actor and former professional American football defensive back who played mainly in the American Football League during the 1960s.",
"Williamson is perhaps best known for his film career, starring as Tommy Gibbs in the 1973 crime drama film\" Black Caesar\" and its sequel\" Hell Up in Harlem\".",
"Williamson also had other notable roles in other 1970s blaxploitation films such as\" Hammer\"( 1972),\" That Man Bolt\"( 1973) and\" Three the Hard Way\"( 1974)."
],
[
"Hartley W Lobban (9 May 1926 – 15 October 2004) was a Jamaican-born first-class cricketer who played 17 matches for Worcestershire in the early 1950s."
],
[
"Hell Up in Harlem is a 1973 blaxploitation American neo- noir film, starring Fred Williamson and Gloria Hendry.",
"Written and directed by Larry Cohen, it is a sequel to the film\" Black Caesar\".",
"The film's soundtrack was recorded by Edwin Starr and released by Motown Records in January 1974."
],
[
"Wale Adebanwi( born 1969) is a Nigerian- born first Black Rhodes Professor at St Antony's College, Oxford."
],
[
"Hassan Zee is a Pakistani- American film director who was born in Chakwal, Pakistan."
],
[
"Lawrence George Cohen( July 15, 1936 – March 23, 2019) was an American screenwriter, producer, and director of film and television, best known as a B-movie auteur of horror and science fiction films — often containing police procedural and satirical elements — during the 1970s and 1980s, such as\" It's Alive\"( 1974),\" God Told Me To\"( 1976),\" It Lives Again\"( 1978),\" The Stuff\"( 1985) and\" A Return to Salem's Lot\"( 1987).",
"After that, he concentrated mainly on screenwriting, including\" Phone Booth\"( 2002),\" Cellular\"( 2004) and\" Captivity\"( 2007).",
"Early in his career, he was a prolific television writer, creating series such as\" BrandedBlue LightCoronet Blue\", and\" The Invaders\".",
"In 2006, Cohen returned to the directing chair for Mick Garris' anthology series\" Masters of Horror\", directing the episode\" Pick Me Up\".",
"In 2017, he was the recipient of a Lifetime Achievement Award from the Fantasia International Film Festival."
],
[
"Henry Walter Moore( 1849 – 20 August 1916) was an English- born first- class cricketer who spent most of his life in New Zealand."
],
[
"Gloria Hendry( born March 3, 1949), also known inaccurately as Gloria Henry, is an American actress and former Playboy model.",
"She is best known for her roles in films from the 1970s, most notably: portraying Rosie Carver in 1973's James Bond film\" Live and Let Die\"; and Helen Bradley in the blaxploitation film\" Black Caesar\", and the sequel,\" Hell Up in Harlem\"."
],
[
"Edvīns Šnore( born 21 March 1974, in Saulkrasti) is a Latvian film director and politician.",
"He was elected to a four- year term in the Latvian Saeima in 2014 and 2018.",
"Šnore's family comes from Kuldīga.",
"He went to high school in Riga.",
"During the Revolutions of 1989 he sympathized with the Popular Front of Latvia.",
"He first studied political science in Norway, and then earned his master's degree at the University of Latvia.",
"In 2013 he completed his Doctoral thesis on the Western European view of Holodomor.",
"Šnore has become known for his 2008 documentary\" The Soviet Story\".",
"For this film, he was awarded with the Latvian Order of the Three Stars in 2008, and the Estonian Order of the Cross of Terra Mariana in 2009.",
"In 2016 Šnore released another documentary called\" The Unknown War: Baltic Resistance\" on the guerrilla war in the Baltic states."
]
],
"title": [
"The Soviet Story",
"Fred Williamson",
"Hartley Lobban",
"Hell Up in Harlem",
"Wale Adebanwi",
"Hassan Zee",
"Larry Cohen",
"Henry Moore (cricketer)",
"Gloria Hendry",
"Edvīns Šnore"
]
},
"supporting_facts": {
"sent_id": [
1,
0,
0,
0
],
"title": [
"Hell Up in Harlem",
"The Soviet Story",
"Larry Cohen",
"Edvīns Šnore"
]
},
"type": "bridge_comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film has the director who was born first, Hell Up In Harlem or The Soviet Story?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Hell Up In Harlem",
"Hell Up in Harlem"
]
},
"style": "rule"
}
|
{
"index": 81,
"split": "test"
}
|
dev_7239
|
Who was born earlier, Agustín Lavezzi or Jean Guillaume Audinet-Serville?
|
[
"Jean Guillaume Audinet-Serville",
"Audinet-Serville",
"Jean-Guillaume Audinet-Serville"
] |
{
"context": {
"content": [
[
"Angela is the type genus of mantises of the family Angelidae.",
"The genus was described by the French entomologist Jean Guillaume Audinet- Serville in 1839."
],
[
"Charles Jean-Baptiste Amyot( 23 September 1799, in Vendreeuv – 13 October 1866, in Paris) was a French lawyer and entomologist especially interested in the Hemiptera.",
"After his father died, Amyot lived with a neighbor, a wealthy merchant, who was also an entomologist, Jean Guillaume Audinet- Serville.",
"They become life- long friends, and Audinet- Serville advised Amyot to specialize in the Hemiptera, which at the time was being ignored by serious entomologists.",
"In 1822, Amyot became a lawyer, but he continued to study the Hemiptera.",
"In 1833, he published a work on civil law,\" Institutes, ou Principes des lois civiles Institutes, or the principles of civil law\").",
"In 1843, together with Audinet- Serville, he published\" Histoire naturelle des insectes hémiptères The Natural History of the Hemiptera Insects\").",
"Amyot was also interested in applied entomology and wrote several publications devoted to insect pests and how to fight them.",
"Amyot later became the president of the Entomological Society of France(\" Société entomologique de France\"), where he argued for his monomial nomenclature( see binomial nomenclature) for classification purposes, to the point where his fellow members withheld the honorary membership usually awarded to past presidents."
],
[
"Derobrachus brevicollis is a species of beetle in the family Cerambycidae.",
"It was described by Jean Guillaume Audinet- Serville in 1832."
],
[
"Jean Guillaume Audinet- Serville( his name, before the Revolution, included a particle: Audinet de Serville) was a French entomologist, born on 11 November 1775 in Paris.",
"He died on 27 March 1858 in La Ferté- sous- Jouarre.",
"He was introduced to entomology by Madame de Grostête- Tigny who was fascinated, like her husband, by chemistry and insects.",
"Through her, Audinet- Serville met Pierre André Latreille( 1762- 1833).",
"Latreille worked with him on the\" Dictionnaire des Insectes de l’ Encyclopédie méthodique The Methodical Encyclopedia Dictionary of Insects\").",
"Then, working with Guillaume- Antoine Olivier( 1756- 1814), he finished the book\" Faune française French Fauna\") in 1830.",
"Audinet- Serville is particularly known for his work on the Orthoptera.",
"He published,\" Revue méthodique de l’ordre des Orthoptères Methodical Review of the Order of Orthoptera\") which appeared in\" Annales des sciences naturelles\" in 1831.",
"Then, in 1839, in the series of works entitled\" les Suites à Buffon\", a volume on the same order,\" Histoire naturelle des Insectes OrthoptèresNatural History of Orthoptera Insects\").",
"He was a friend of Charles Jean- Baptiste Amyot and wrote with him\" Histoire naturelle des insectes HemipteresNatural History of Hemiptera Insects\").",
"Paris,",
"Libraire Encyclopedique de Roret: 1-675( 1843). and with Amédée Louis Michel le Peletier, comte de Saint- Fargeau he contributed a treatise on Hemiptera to Guillaume- Antoine Olivier's Histoire naturelle.",
"\" Entomologie, ou histoire naturelle des Crustacés, des Arachnides et des InsectesEncyclopédie Méthodique\")"
],
[
"Phaneroptera is an Old World genus of katydids in the subfamily Phaneropterinae.",
"It was described by Jean Guillaume Audinet- Serville in 1831.",
"It contains the following species:"
],
[
"Agustín Lavezzi( born 15 January 1996) is an Argentine professional footballer who plays as a midfielder for Deportivo Morón."
],
[
"Neocorus ibidionoides is a species of beetle in the family Cerambycidae.",
"It was described by Jean Guillaume Audinet- Serville in 1834."
],
[
"Euura oblita is a species of sawfly belonging to the family Tenthredinidae( common sawflies).",
"The larvae feed on the leaves of willows(\" Salix\" species) and was first described by Jean Guillaume Audinet- Serville in 1823."
],
[
"Euura proxima is a species of sawfly belonging to the family Tenthredinidae( common sawflies).",
"The larvae feed on the leaves of willows(\" Salix\" species) and was first described by Jean Guillaume Audinet- Serville in 1823."
],
[
"Rhinotragus analis is a species of beetle in the family Cerambycidae.",
"It was described by Jean Guillaume Audinet- Serville in 1833."
]
],
"title": [
"Angela (genus)",
"Charles Jean-Baptiste Amyot",
"Derobrachus brevicollis",
"Jean Guillaume Audinet-Serville",
"Phaneroptera",
"Agustín Lavezzi",
"Neocorus ibidionoides",
"Euura oblita",
"Euura proxima",
"Rhinotragus analis"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"Agustín Lavezzi",
"Jean Guillaume Audinet-Serville"
]
},
"type": "comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Who was born earlier, Agustín Lavezzi or Jean Guillaume Audinet-Serville?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Jean Guillaume Audinet-Serville",
"Audinet-Serville",
"Jean-Guillaume Audinet-Serville"
]
},
"style": "rule"
}
|
{
"index": 82,
"split": "test"
}
|
dev_9038
|
What is the date of death of Christopher Bathurst, 3Rd Viscount Bledisloe's father?
|
[
"17 September 1979"
] |
{
"context": {
"content": [
[
"Etan Boritzer( born 1950) is an American writer of children ’s literature who is best known for his book\" What is God?\" first published in 1989.",
"His best selling\" What is?\" illustrated children's book series on character education and difficult subjects for children is a popular teaching guide for parents, teachers and child- life professionals.",
"Boritzer gained national critical acclaim after\" What is God?\" was published in 1989 although the book has caused controversy from religious fundamentalists for its universalist views.",
"The other current books in the\" What is?\" series include",
"What is Love?,",
"What is Death?,",
"What is Beautiful?,",
"What is Funny?,",
"What is Right?,",
"What is Peace?,",
"What is Money?,",
"What is Dreaming?,",
"What is a Friend?,",
"What is True?,",
"What is a Family?,",
"What is a Feeling?\"",
"The series is now also translated into 15 languages.",
"Boritzer was first published in 1963 at the age of 13 when he wrote an essay in his English class at Wade Junior High School in the Bronx, New York on the assassination of John F. Kennedy.",
"His essay was included in a special anthology by New York City public school children compiled and published by the New York City Department of Education.",
"Boritzer now lives in Venice, California and maintains his publishing office there also.",
"He has helped numerous other authors to get published through\" How to Get Your Book Published!\" programs.",
"Boritzer is also a yoga teacher who teaches regular classes locally and guest- teaches nationally.",
"He is also recognized nationally as an erudite speaker on\" The Teachings of the Buddha.\""
],
[
"Albert Thompson( born 1912, date of death unknown) was a Welsh footballer."
],
[
"William Thomas Smith( born 9 April 1897, date of death unknown) was an English professional footballer."
],
[
"Benjamin Ludlow Bathurst, 2nd Viscount Bledisloe, QC (2 October 1899 – 17 September 1979) was a British barrister."
],
[
"Thomas Scott( 1907- date of death unknown) was an English diver."
],
[
"Harry Johnson( 1903- date of death unknown) was an English wrestler."
],
[
"Charles Bathurst, 1st Viscount Bledisloe,( 21 September 1867 – 3 July 1958) was a British Conservative politician and colonial governor.",
"He was Governor- General of New Zealand from 1930 to 1935."
],
[
"Theodred II was a medieval Bishop of Elmham.",
"The date of Theodred's consecration unknown, but the date of his death was sometime between 995 and 997."
],
[
"Christopher Hiley Ludlow Bathurst, 3rd Viscount Bledisloe, QC (24 June 1934 – 12 May 2009) was a British barrister and politician.",
"Bledisloe was the son of Benjamin Bathurst, 2nd Viscount Bledisloe.",
"He was educated at Eton, and Trinity College, Oxford.",
"He served in the army as a Second Lieutenant of the 11th Hussars from 1954 to 1955 and was called to the Bar at Gray's Inn in 1959.",
"In 1978 he became a Queen's Counsel (QC).",
"He was one of the hereditary peers elected by the other hereditary peers to take a seat in the House of Lords, which most hereditary peers lost by the House of Lords Act 1999.",
"The Bledisloe seat is Lydney Park, Gloucestershire, from which the territorial designation of the peerage was taken.",
"He sat as a crossbencher.",
"Bledisloe married Elizabeth Mary Thompson in 1962.",
"They had two sons and one daughter and divorced in 1986.",
"His elder son and successor, Rupert Bathurst, 4th Viscount Bledisloe, is a noted portrait artist.",
"Bledisloe died on 12 May 2009.",
"Bledisloe was the President of the St. Moritz Tobogganing Club (SMTC) also known as the Cresta."
],
[
"Harry Wainwright( born 1899; date of death unknown) was an English footballer."
]
],
"title": [
"Etan Boritzer",
"Albert Thompson (footballer, born 1912)",
"Bill Smith (footballer, born 1897)",
"Benjamin Bathurst, 2nd Viscount Bledisloe",
"Thomas Scott (diver)",
"Harry Johnson (wrestler)",
"Charles Bathurst, 1st Viscount Bledisloe",
"Theodred II (Bishop of Elmham)",
"Christopher Bathurst, 3rd Viscount Bledisloe",
"Harry Wainwright (footballer)"
]
},
"supporting_facts": {
"sent_id": [
1,
0
],
"title": [
"Christopher Bathurst, 3rd Viscount Bledisloe",
"Benjamin Bathurst, 2nd Viscount Bledisloe"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: What is the date of death of Christopher Bathurst, 3Rd Viscount Bledisloe's father?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"17 September 1979"
]
},
"style": "rule"
}
|
{
"index": 83,
"split": "test"
}
|
dev_11561
|
Where was the place of death of the performer of song Dream-A-Lot'S Theme (I Will Live For Love)?
|
[
"Naples",
"Naples, Florida"
] |
{
"context": {
"content": [
[
"James Robert Morrison, known as Jim Bob, is a British musician and author.",
"He was the singer of indie punk band Carter the Unstoppable Sex Machine."
],
[
"Caspar Babypants is the stage name of children's music artist Chris Ballew, who is also widely known as the singer of The Presidents of the United States of America."
],
[
"Astrid North( Astrid Karina North Radmann; 24 August 1973, Berlin – 25 June 2019, Berlin) was a German soul singer and songwriter.",
"She was the singer of the German band, with whom she released five Albums.",
"As guest singer of the band she published three albums."
],
[
"Kristian Leontiou (born February 1982) is a British singer of Greek Cypriot descent, and is the singer for the indie rock band One eskimO."
],
[
"LaDonna Adrian Gaines (December 31, 1948May 17, 2012), widely known by her stage name based on her married name Donna Summer, was an American singer, songwriter and actress.",
"She gained prominence during the disco era of the late 1970s and became known as the \"Queen of Disco\", while her music gained a global following.",
"While influenced by the counterculture of the 1960s, Summer became the lead singer of a psychedelic rock band named Crow and moved to New York City.",
"Joining a touring version of the musical \"Hair\", she left New York and spent several years living, acting and singing in Europe, where she met music producers Giorgio Moroder and Pete Bellotte in Munich, where they recorded influential disco hits such as \"Love to Love You Baby\" and \"I Feel Love\", marking her breakthrough into an international career.",
"Summer returned to the United States in 1975, and other hits such as \"Last DanceMacArthur ParkHeaven KnowsHot StuffBad GirlsDim",
"All the LightsNo More Tears (Enough Is Enough)\"",
"(duet with Barbra Streisand) and \"On the Radio\" followed.",
"Summer earned a total of 42 hit singles on the US \"Billboard\" Hot 100 in her lifetime, with 14 of those reaching the top-ten.",
"She claimed a top 40 hit every year between 1975 and 1984, and from her first top-ten hit in 1976, to the end of 1982, she had 12 top-ten hits (10 were top-five hits), more than any other act during that time period.",
"She returned to the Hot 100's top-five in 1983, and claimed her final top-ten hit in 1989 with \"This Time",
"I Know",
"It's for Real\".",
"She was the first artist to have three consecutive double albums reach number one on the US \"Billboard\" 200 chart and charted four number-one singles in the US within a 12-month period.",
"She also charted two number-one singles on the R&B Singles chart in the US and a number-one single in the United Kingdom.",
"Her most recent Hot 100 hit came in 1999 with",
"\"I Will Go with You (Con Te Partiro)\".",
"While her fortunes on the Hot 100 waned through those decades, Summer remained a force on the \"Billboard\" Dance Club Songs chart over her entire career.",
"Summer died on May 17, 2012, from lung cancer, at her home in Naples, Florida.",
"She reportedly sold over 100 million records worldwide, making her one of the best-selling music artists of all time.",
"She won five Grammy Awards.",
"In her obituary in \"The Times\", she was described as the \"undisputed queen of the Seventies disco boom\" who reached the status of \"one of the world's leading female singers.\"",
"Giorgio Moroder described Summer's work with them on the song",
"\"I Feel Love\" as \"really the start of electronic dance\" music.",
"In 2013, Summer was inducted into the Rock and Roll Hall of Fame.",
"In December 2016, \"Billboard\" ranked her at No. 6 on its list of the Greatest of All Time Top Dance Club Artists ."
],
[
"\" Panda\" is a song of the Chilean band Astro and is the fifth song of the homonymous album of the year 2011.",
"The song was composed and produced by the singer of the band Andrés Nusser and released like third single of the album on February 1, 2013."
],
[
"\"Dream-A-Lot's Theme (I Will Live for Love)\" is a song recorded by Donna Summer in 2003.",
"The song was not released as a commercial single.",
"It was included as one of three new tracks on Universal's greatest hits package and released as a 12\" promo single to club DJs.",
"\"Dream-A-Lot's Theme\" reached #20 on the US Club Play Chart in 2004.",
"In 2005 the track was included on a second Universal compilation, \"Gold\"."
],
[
"Billy Milano is a Bronx- born heavy metal musician now based in Austin, Texas.",
"He is the singer and- occasionally- guitarist and bassist of crossover thrash band M.O.D., and he was also the singer of its predecessor, Stormtroopers of Death.",
"He was also the singer of United Forces, which also featured his Stormtroopers of Death bandmate Dan Lilker."
],
[
"\" O Valencia!\" is the fifth single by the indie rock band",
"The Decemberists, and the first released from their fourth studio album,\" The Crane Wife\".",
"The music was written by The Decemberists and the lyrics by Colin Meloy.",
"It tells a story of two star- crossed lovers.",
"The singer falls in love with a person who belongs to an opposing gang.",
"At the end of the song, the singer's lover jumps in to defend the singer, who is confronting his lover's brother( the singer's\" sworn enemy\") and is killed by the bullet intended for the singer."
],
[
"Bernard Bonvoisin, known as Bernie Bonvoisin( born 9 July 1956 in Nanterre, Hauts- de- Seine), is a French hard rock singer and film director.",
"He is best known for having been the singer of Trust.",
"He was one of the best friends of Bon Scott the singer of AC/ DC and together they recorded the song\" Ride On\" which was one of the last songs by Bon Scott."
]
],
"title": [
"Jim Bob",
"Caspar Babypants",
"Astrid North",
"Kristian Leontiou",
"Donna Summer",
"Panda (Astro song)",
"Dream-A-Lot's Theme (I Will Live for Love)",
"Billy Milano",
"O Valencia!",
"Bernie Bonvoisin"
]
},
"supporting_facts": {
"sent_id": [
0,
17
],
"title": [
"Dream-A-Lot's Theme (I Will Live for Love)",
"Donna Summer"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Where was the place of death of the performer of song Dream-A-Lot'S Theme (I Will Live For Love)?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Naples",
"Naples, Florida"
]
},
"style": "rule"
}
|
{
"index": 84,
"split": "test"
}
|
dev_7007
|
Where did the director of film Barakat! study?
|
[
"Institut des hautes études cinématographiques",
"Institut des hautes etudes cinematographiques"
] |
{
"context": {
"content": [
[
"The Seventh Company Outdoors is a 1977 French comedy film directed by Robert Lamoureux.",
"It is a sequel to Now Where Did the 7th Company Get to ?."
],
[
"Peter Levin is an American director of film, television and theatre."
],
[
"Dana Blankstein- Cohen( born March 3, 1981) is the director of the Israeli Academy of Film and Television.",
"She is a film director, and an Israeli culture entrepreneur."
],
[
"Ian Barry is an Australian director of film and TV."
],
[
"Djamila Sahraoui (born 1950) is an Algerian filmmaker.",
"Djamila Sahraoui was born in Algiers on 23 October 1950.",
"She studied literature before attending the renowned Parisian film school IDHEC",
"(Institut des hautes études cinématographiques), specialising in scriptwriting and direction.",
"Sahraoui moved to France in 1975 where she began her career as a documentary filmmaker.",
"She made her first short film, \"Houria\" in 1980, then worked as an editor and assistant, before going on to make her own documentaries from the 1990s.",
"Her 1995 documentary, \"La moitié du ciel d'Allah\", featured interviews with Algerian women about work and their struggles for equality and freedom.",
"In 1997, she was named the laureate of la Villa Medicis due to her success.",
"While Sahraoui is best known for her documentaries, she has also achieved success with her fiction films \"Barakat!\"",
"(2006) and \"Yema\" (2013)."
],
[
"Jason Moore( born October 22, 1970) is an American director of film, theatre and television."
],
[
"Barakat!",
"is a 2006 French/Algerian drama film directed by Djamila Sahraoui.",
"It premiered at the Berlin International Film Festival on 16 February 2006."
],
[
"Olav Aaraas( born 10 July 1950) is a Norwegian historian and museum director.",
"He was born in Fredrikstad.",
"From 1982 to 1993 he was the director of Sogn Folk Museum, from 1993 to 2010 he was the director of Maihaugen and from 2001 he has been the director of the Norwegian Museum of Cultural History.",
"In 2010 he was decorated with the Royal Norwegian Order of St. Olav."
],
[
"Now Where Did the 7th Company Get To? is a 1973 French- Italian comedy war film directed by Robert Lamoureux.",
"The film portrays the adventures of a French Army squad lost somewhere on the front in May 1940 during the Battle of France."
],
[
"Brian Patrick Kennedy( born 5 November 1961) is an Irish- born art museum director who has worked in Ireland and Australia, and now lives and works in the United States.",
"He is currently the director of the Peabody Essex Museum.",
"He was the director of the Toledo Museum of Art in Ohio from 2010 to 2019.",
"He was the director of the Hood Museum of Art from 2005 to 2010, and the National Gallery of Australia( Canberra) from 1997- 2004."
]
],
"title": [
"The Seventh Company Outdoors",
"Peter Levin",
"Dana Blankstein",
"Ian Barry (director)",
"Djamila Sahraoui",
"Jason Moore (director)",
"Barakat!",
"Olav Aaraas",
"Now Where Did the 7th Company Get to?",
"Brian Kennedy (gallery director)"
]
},
"supporting_facts": {
"sent_id": [
1,
3
],
"title": [
"Barakat!",
"Djamila Sahraoui"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Where did the director of film Barakat! study?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Institut des hautes études cinématographiques",
"Institut des hautes etudes cinematographiques"
]
},
"style": "rule"
}
|
{
"index": 85,
"split": "test"
}
|
dev_9189
|
Which country the performer of song Sweet Leilani is from?
|
[
"U.S.A.",
"US of A",
"US",
"United States of America",
"USA",
"the United States of America",
"United States",
"the US",
"🇺🇸",
"the United States",
"U.S.",
"America",
"the USA"
] |
{
"context": {
"content": [
[
"Dáithí Sproule( born 23 May 1950) is a guitarist and singer of traditional Irish music.",
"His niece is the singer Claire Sproule."
],
[
"Harold Lillis \"Bing\"",
"Crosby Jr. (May 3, 1903 – October 14, 1977) was an American singer, comedian and actor.",
"The first multimedia star, Crosby was a leader in record sales, radio ratings, and motion picture grosses from 1931 to 1954.",
"His early career coincided with recording innovations that allowed him to develop an intimate singing style that influenced many male singers who followed him, including Perry Como, Frank Sinatra, Dick Haymes, and Dean Martin.",
"\"Yank\" magazine said that he was \"the person who had done the most for the morale of overseas servicemen\" during World War II.",
"In 1948, American polls declared him the \"most admired man alive\", ahead of Jackie Robinson and Pope Pius XII.",
"Also in 1948, \"Music Digest\" estimated that his recordings filled more than half of the 80,000 weekly hours allocated to recorded radio music.",
"Crosby won an Oscar for Best Actor for his role as Father Chuck O'Malley in the 1944 motion picture \"Going My Way\" and was nominated for his reprise of the role in \"The Bells of St. Mary's\" opposite Ingrid Bergman the next year, becoming the first of six actors to be nominated twice for playing the same character.",
"In 1963, Crosby received the first Grammy Global Achievement Award.",
"He is one of 33 people to have three stars on the Hollywood Walk of Fame, in the categories of motion pictures, radio, and audio recording.",
"He was also known for his collaborations with longtime friend Bob Hope, starring in the \"Road to...\" films from 1940 to 1962.",
"Crosby influenced the development of the postwar recording industry.",
"After seeing a demonstration of a German broadcast quality reel-to-reel tape recorder brought to America by John T. Mullin, he invested $50,000 in a California electronics company called Ampex to build copies.",
"He then convinced ABC to allow him to tape his shows.",
"He became the first performer to pre-record his radio shows and master his commercial recordings onto magnetic tape.",
"Through the medium of recording, he constructed his radio programs with the same directorial tools and craftsmanship (editing, retaking, rehearsal, time shifting) used in motion picture production, a practice that became an industry standard.",
"In addition to his work with early audio tape recording, he helped to finance the development of videotape, bought television stations, bred racehorses, and co-owned the Pittsburgh Pirates baseball team."
],
[
"Bernard Bonvoisin, known as Bernie Bonvoisin( born 9 July 1956 in Nanterre, Hauts- de- Seine), is a French hard rock singer and film director.",
"He is best known for having been the singer of Trust.",
"He was one of the best friends of Bon Scott the singer of AC/ DC and together they recorded the song\" Ride On\" which was one of the last songs by Bon Scott."
],
[
"Kristian Leontiou (born February 1982) is a British singer of Greek Cypriot descent, and is the singer for the indie rock band One eskimO."
],
[
"Caspar Babypants is the stage name of children's music artist Chris Ballew, who is also widely known as the singer of The Presidents of the United States of America."
],
[
"Harry Owens( 18 April 1902 – 12 December 1986) was an American composer, bandleader and songwriter best known for his song\" Sweet Leilani.\""
],
[
"\"Sweet Leilani\" is a song featured in the 1937 film, \"Waikiki Wedding\".",
"It won the Academy Award for Best Original Song, and Bing Crosby's record became one of the biggest hits of 1937.",
"Harry Owens wrote the song on October 20, 1934 for his daughter Leilani, who was born the previous day.",
"Leilani is a popular Hawaiian name, meaning \"heavenly garland of flowers\" (not \"heavenly flower\").",
"It also has a figurative meaning: Small Hawaiian children were carried on their parents' shoulders like a \"lei\" (garland), so the name took on the meaning \"heavenly child.\"",
"Prior to \"Waikiki Wedding\", the song had been recorded by Sol Hoʻopiʻi under the title \"Leilani\" as the B-side of \"Hawaiian Honeymoon\" (Brunswick Records 55085).",
"Harry Owens and his Royal Hawaiians performed \"Sweet Leilani\" in the 1938 film \"Cocoanut Grove\" starring Fred MacMurray."
],
[
"\" Panda\" is a song of the Chilean band Astro and is the fifth song of the homonymous album of the year 2011.",
"The song was composed and produced by the singer of the band Andrés Nusser and released like third single of the album on February 1, 2013."
],
[
"\" O Valencia!\" is the fifth single by the indie rock band",
"The Decemberists, and the first released from their fourth studio album,\" The Crane Wife\".",
"The music was written by The Decemberists and the lyrics by Colin Meloy.",
"It tells a story of two star- crossed lovers.",
"The singer falls in love with a person who belongs to an opposing gang.",
"At the end of the song, the singer's lover jumps in to defend the singer, who is confronting his lover's brother( the singer's\" sworn enemy\") and is killed by the bullet intended for the singer."
],
[
"Billy Milano is a Bronx- born heavy metal musician now based in Austin, Texas.",
"He is the singer and- occasionally- guitarist and bassist of crossover thrash band M.O.D., and he was also the singer of its predecessor, Stormtroopers of Death.",
"He was also the singer of United Forces, which also featured his Stormtroopers of Death bandmate Dan Lilker."
]
],
"title": [
"Dáithí Sproule",
"Bing Crosby",
"Bernie Bonvoisin",
"Kristian Leontiou",
"Caspar Babypants",
"Harry Owens",
"Sweet Leilani",
"Panda (Astro song)",
"O Valencia!",
"Billy Milano"
]
},
"supporting_facts": {
"sent_id": [
1,
1
],
"title": [
"Sweet Leilani",
"Bing Crosby"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which country the performer of song Sweet Leilani is from?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"U.S.A.",
"US of A",
"US",
"United States of America",
"USA",
"the United States of America",
"United States",
"the US",
"🇺🇸",
"the United States",
"U.S.",
"America",
"the USA"
]
},
"style": "rule"
}
|
{
"index": 86,
"split": "test"
}
|
dev_158
|
Which film was released first, The Beginning Or The End or Hold Back The Night?
|
[
"Beginning or the End",
"The Beginning or the End",
"The Beginning Or The End"
] |
{
"context": {
"content": [
[
"The Beginning or the End is a 1947 American docudrama film about the development of the atomic bomb in World War II, directed by Norman Taurog, starring Brian Donlevy and Hume Cronyn, and released by Metro- Goldwyn- Mayer.",
"The film dramatizes the creation of the atomic bomb in the Manhattan Project and the bombing of Hiroshima.",
"The film originated in October 1945 as a project of actress Donna Reed and her high school science teacher, Edward R. Tompkins, who was a chemist at the Oak Ridge National Laboratory.",
"Bob Considine wrote the treatment, which was sent to MGM script writers.",
"The title was supplied by President Harry S. Truman.",
"At the time there was a legal requirement that permission be obtained to depict living well- known public figures.",
"Many refused, but others, such as J. Robert Oppenheimer, co-operated.",
"Major General Leslie R. Groves, Jr., the director of the Manhattan Project, was hired as a consultant for$ 10,000.",
"Although the filmmakers put considerable effort into historical accuracy, particularly in details, the film is known for some key distortions of history.",
"An entirely fictional sequence was added in which Truman agonizes over whether to authorize the attack; anti-aircraft shells are shown bursting around the\" Enola Gay\" on its bombing run over Hiroshima; and it is said that leaflets were dropped on Hiroshima for ten days in advance of the mission warning the citizens of the forthcoming raid.",
"The film received generally mixed reviews, and was a box office disappointment."
],
[
"Take Back the Night may refer to:"
],
[
"Hold Back the Night is a 1956 film about the Korean War.",
"Hold Back the Night may also refer to:"
],
[
"Hold Back( original title: Rengaine) is a 2012 French drama film directed and written by Rachid Djaidani."
],
[
"Ca n't Hold Back or I Ca n't Hold Back may refer to:"
],
[
"Hold Back Tomorrow( 1955) is an American feature film released by Universal- International."
],
[
"Hold Back the River may refer to:"
],
[
"Hold Back Your Love may refer to:"
],
[
"Do n't Hold Back may refer to:"
],
[
"Hold Back the Night is a 1956 American war film about the Korean War based on the 1951 novel by Pat Frank, who had been a war correspondent in Korea.",
"The film was directed by Allan Dwan; his third film with John Payne and his third film about the United States Marine Corps, the others being\" Abroad with Two Yanks\"( 1944) and\" Sands of Iwo Jima\"( 1949)."
]
],
"title": [
"The Beginning or the End",
"Take Back the Night",
"Hold Back the Night (disambiguation)",
"Hold Back",
"Can't Hold Back",
"Hold Back Tomorrow",
"Hold Back the River",
"Hold Back Your Love",
"Don't Hold Back",
"Hold Back the Night"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"The Beginning or the End",
"Hold Back the Night"
]
},
"type": "comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film was released first, The Beginning Or The End or Hold Back The Night?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Beginning or the End",
"The Beginning or the End",
"The Beginning Or The End"
]
},
"style": "rule"
}
|
{
"index": 87,
"split": "test"
}
|
dev_1832
|
When was the director of film Gilbert Dying To Die born?
|
[
"11 November 1887"
] |
{
"context": {
"content": [
[
"Ian Barry is an Australian director of film and TV."
],
[
"Dying to Belong is a 1997 American television film directed by William A. Graham."
],
[
"Theodred II was a medieval Bishop of Elmham.",
"The date of Theodred's consecration unknown, but the date of his death was sometime between 995 and 997."
],
[
"Terence D. Robinson( date of birth and death unknown) was a male wrestler who competed for England."
],
[
"Peter Levin is an American director of film, television and theatre."
],
[
"Brian Saunders( date of birth and death unknown) was a male weightlifter who competed for England."
],
[
"Dying To Be Me is a 2015 Telugu short film written and directed by Deva Katta, starring Ravi Varma and Smita.",
"The film highlights woman empowerment\".",
"The film takes its name from a New York Times best selling book\" Dying To Be Me\" by Anita Moorjani.",
"At the 1:20 mark in the video, the cover of this book is shown briefly but the author's name is blacked out.",
"On 15 April 2015, Deadline reported that the New York Times best selling book\" Dying To Be Me\" has been optioned to internationally acclaimed producer Ridley Scott's company Ridley Scott Films.",
"In the book, the author recounts stories of her childhood in Hong Kong, her challenge to establish her career and find true love, as well as how she eventually ended up in that hospital bed where she defied all medical knowledge.",
"As part of a traditional Hindu family residing in a largely Chinese and British society, she had been pushed and pulled by cultural and religious customs since she had been a little girl.",
"After years of struggling to forge her own path while trying to meet everyone else's expectations, she had the realization that she had the power to heal herself ... and that there are miracles in the Universe that she had never even imagined.",
"In\" Dying to Be Me\", Anita freely shares all she has learned about illness, healing, fear,\" being loved,\" and the true magnificence of each and every human being.",
"Deva Katta appears to have based his film script on the synopsis of the book\" Dying To Be Me\" as it is shown on Amazon.com."
],
[
"Etan Boritzer( born 1950) is an American writer of children ’s literature who is best known for his book\" What is God?\" first published in 1989.",
"His best selling\" What is?\" illustrated children's book series on character education and difficult subjects for children is a popular teaching guide for parents, teachers and child- life professionals.",
"Boritzer gained national critical acclaim after\" What is God?\" was published in 1989 although the book has caused controversy from religious fundamentalists for its universalist views.",
"The other current books in the\" What is?\" series include",
"What is Love?,",
"What is Death?,",
"What is Beautiful?,",
"What is Funny?,",
"What is Right?,",
"What is Peace?,",
"What is Money?,",
"What is Dreaming?,",
"What is a Friend?,",
"What is True?,",
"What is a Family?,",
"What is a Feeling?\"",
"The series is now also translated into 15 languages.",
"Boritzer was first published in 1963 at the age of 13 when he wrote an essay in his English class at Wade Junior High School in the Bronx, New York on the assassination of John F. Kennedy.",
"His essay was included in a special anthology by New York City public school children compiled and published by the New York City Department of Education.",
"Boritzer now lives in Venice, California and maintains his publishing office there also.",
"He has helped numerous other authors to get published through\" How to Get Your Book Published!\" programs.",
"Boritzer is also a yoga teacher who teaches regular classes locally and guest- teaches nationally.",
"He is also recognized nationally as an erudite speaker on\" The Teachings of the Buddha.\""
],
[
"Gilbert Dying to Die is a 1915 British silent short comedy film directed by Maurice Elvey and starring Fred Groves.",
"A drunken man named Gilbert attempts to commit suicide and fails.",
"He then discovers that he has inherited a fortune.",
"Another film portraying the same character",
"\"Gilbert Gets Tiger-",
"It is\" was also released the same year."
],
[
"Maurice Elvey (11 November 1887 – 28 August 1967) was the most prolific film director in British history.",
"He directed nearly 200 films between 1913 and 1957.",
"During the silent film era he directed as many as twenty films per year.",
"He also produced more than fifty films - his own as well as films directed by others."
]
],
"title": [
"Ian Barry (director)",
"Dying to Belong",
"Theodred II (Bishop of Elmham)",
"Terence Robinson",
"Peter Levin",
"Brian Saunders (weightlifter)",
"Dying to Be Me",
"Etan Boritzer",
"Gilbert Dying to Die",
"Maurice Elvey"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"Gilbert Dying to Die",
"Maurice Elvey"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: When was the director of film Gilbert Dying To Die born?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"11 November 1887"
]
},
"style": "rule"
}
|
{
"index": 88,
"split": "test"
}
|
dev_1232
|
Who is the spouse of the director of film Soldier (1998 American Film)?
|
[
"Milla",
"Milla Jovovich"
] |
{
"context": {
"content": [
[
"Sophia Magdalena of Denmark (3 July 1746 – 21 August 1813) was Queen of Sweden as the spouse of King Gustav III."
],
[
"Mehdi Abrishamchi is an Iranian People's Mujahedin of Iran( MEK) politician who has been described as\" the right hand man of Massoud Rajavi\".",
"He is the former spouse of the group's current leader, Maryam Rajavi, whom he divorced so that she could become the spouse of Massoud Rajavi."
],
[
"Paul William Scott Anderson (born 4 March 1965) is an English film director, producer, and screenwriter who regularly works in science fiction films and video game adaptations.",
"Anderson made his feature film debut with the British independent film \"Shopping\" (1994), and found commercial success with his second film, the Hollywood-produced \"Mortal Kombat\" (1995), based on the first couple of video games of the same name by Midway Games.",
"Today, he is best known as the creative voice behind the \"Resident Evil\" film series (2002–2016), which stars his eventual wife Milla Jovovich, and is based on the Capcom video game series of the same name.",
"The series consists of six films, of which Anderson directed four, that have collectively grossed over $1 billion worldwide, making it the most commercially successful video game adaptation to this date.",
"Other notable films of Anderson's are \"Event Horizon\" (1997), an initial critical and commercial disappointment that found renewed appreciation on home video; \"Alien vs. Predator\" (2004), based on the crossover concept of the same name between the \"Alien\" and \"Predator\" franchises; and \"Death Race\" (2008), a remake/prequel to 1975's \"Death Race 2000\".",
"Anderson and producer Jeremy Bolt founded Impact Pictures in 1992, under which most of Anderson's films have been made.",
"Critical reception of his films has been mixed to negative.",
"While \"Mortal Kombat\" and some of the \"Resident Evil\" films commonly feature on lists about the best film adaptations of video games, such lists mention that films of the genre are, at best, lackluster.",
"Anderson has repeatedly stated he considers himself a \"populist filmmaker\", who only cares about whether his movies entertain the audience and make them cheer in the cinema, rather than their reception by professional critics."
],
[
"Soldier is a 1998 American science fiction action film directed by Paul W. S. Anderson, written by David Webb Peoples, and starring Kurt Russell, Jason Scott Lee, Jason Isaacs, Connie Nielsen, Sean Pertwee and Gary Busey.",
"The film tells the story of a highly skilled soldier defying his commanders and facing a relentless and brutal genetically-enhanced rival soldier.",
"The film was released worldwide on October 23, 1998.",
"Upon its release, \"Soldier\" received generally negative reviews but many praised the action sequences and Russell's performance.",
"The film was a commercial failure, grossing $14 million worldwide against a production budget of $60 million."
],
[
"Heather Denise Gibson is a Scottish economist currently serving as Director- Advisor to the Bank of Greece( since 2011).",
"She is the spouse of Euclid Tsakalotos, former Greek Minister of Finance."
],
[
"Marie Louise Coidavid( 1778 – March 11, 1851), was the Queen of the Kingdom of Haiti 1811 – 20 as the spouse of Henri I of Haiti."
],
[
"Maria Teresa, Grand Duchess of Luxembourg( born María Teresa Mestre y Batista; on 22 March 1956), is the spouse of Grand Duke Henri."
],
[
"Princess Auguste of Bavaria( 28 April 1877 – 25 June 1964) was a member of the Bavarian Royal House of Wittelsbach and the spouse of Archduke Joseph August of Austria."
],
[
"Adib Kheir was a leading Syrian nationalist of the 1920s.",
"He was the owner of the Librairie Universelle in Damascus.",
"His granddaughter is the spouse of Manaf Tlass."
],
[
"Gertrude of Saxony and Bavaria( 1152/55–1197) was Duchess of Swabia as the spouse of Duke Frederick IV, and Queen of Denmark as the spouse of King Canute VI."
]
],
"title": [
"Sophia Magdalena of Denmark",
"Mehdi Abrishamchi",
"Paul W. S. Anderson",
"Soldier (1998 American film)",
"Heather D. Gibson",
"Marie-Louise Coidavid",
"Maria Teresa, Grand Duchess of Luxembourg",
"Princess Auguste of Bavaria (1875–1964)",
"Adib Kheir",
"Gertrude of Bavaria"
]
},
"supporting_facts": {
"sent_id": [
0,
2
],
"title": [
"Soldier (1998 American film)",
"Paul W. S. Anderson"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Who is the spouse of the director of film Soldier (1998 American Film)?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Milla",
"Milla Jovovich"
]
},
"style": "rule"
}
|
{
"index": 89,
"split": "test"
}
|
dev_11321
|
Which film has the director who is older than the other, A Dangerous Woman (1993 Film) or Emmy Of Stork'S Nest?
|
[
"Emmy Of Stork'S Nest"
] |
{
"context": {
"content": [
[
"Logan Sandler is an American writer and director who is best known for his first feature film\" Live Cargo\"."
],
[
"A Dangerous Woman is a 1993 American romantic drama film directed by Stephen Gyllenhaal.",
"The screenplay was written by his wife Naomi Foner, loosely based on the award- winning novel of the same name by Mary McGarry Morris.",
"The feature was co-produced by Amblin Entertainment and Gramercy Pictures.",
"It stars Debra Winger, Barbara Hershey and Gabriel Byrne.",
"It included Gyllenhaal and Foner's two children, Jake and Maggie, who later developed acting careers.",
"Debra Winger was nominated for a Golden Globe Award for her performance and also won Best Actress at the Tokyo International Film Festival.",
"The film has never been released on Region 1 DVD.",
"It was once released on video in the United Kingdom by First Independent Films."
],
[
"Stephen Roark Gyllenhaal( born October 4, 1949) is an American film director and poet.",
"He is the father of actors Jake Gyllenhaal and Maggie Gyllenhaal."
],
[
"Elliot Silverstein( born August 3, 1927, in Boston, Massachusetts) is an American director, who is best known for being the director of the movie\" Cat Ballou\"( 1965)."
],
[
"William Nigh( October 12, 1881 – November 27, 1955) was an American film director, writer, and actor.",
"His film work sometimes lists him as either\" Will Nigh\" or\" William Nye\"."
],
[
"Scott Fox is a pornographic film director who is a member of the AVN Hall of Fame."
],
[
"Dan Milne is a British actor/ director who is possibly best known for his role in\" EastEnders\"."
],
[
"Emmy of Stork's Nest is a 1915 silent film directed by William Nigh.",
"The film is based on the novel\" Stork's Nest\" by J. Breckenridge Ellis and was shot on location."
],
[
"Ben Palmer is a British film and television director who is known for being the director of\" Bo' Selecta\" and\" The Inbetweeners\"."
],
[
"Zapatos Viejos is a Spanish- language 1993 film directed by Sergio Andrade."
]
],
"title": [
"Logan Sandler",
"A Dangerous Woman (1993 film)",
"Stephen Gyllenhaal",
"Elliot Silverstein",
"William Nigh",
"Scotty Fox",
"Dan Milne",
"Emmy of Stork's Nest",
"Ben Palmer",
"Zapatos Viejos (1993 film)"
]
},
"supporting_facts": {
"sent_id": [
0,
0,
0,
0
],
"title": [
"A Dangerous Woman (1993 film)",
"Emmy of Stork's Nest",
"Stephen Gyllenhaal",
"William Nigh"
]
},
"type": "bridge_comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film has the director who is older than the other, A Dangerous Woman (1993 Film) or Emmy Of Stork'S Nest?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Emmy Of Stork'S Nest"
]
},
"style": "rule"
}
|
{
"index": 90,
"split": "test"
}
|
dev_2442
|
Which country the director of film Hollow (2014 Film) is from?
|
[
"U.S.A.",
"US of A",
"US",
"United States of America",
"USA",
"the United States of America",
"United States",
"the US",
"🇺🇸",
"the United States",
"U.S.",
"America",
"the USA"
] |
{
"context": {
"content": [
[
"Hollow Man 2 is a 2006 American science fiction horror film directed by Claudio Fäh and starring Peter Facinelli, Laura Regan and Christian Slater.",
"It is the stand- alone sequel to the film\" Hollow Man\".",
"It was released direct- to- video on May 23, 2006."
],
[
"Dana Blankstein- Cohen( born March 3, 1981) is the director of the Israeli Academy of Film and Television.",
"She is a film director, and an Israeli culture entrepreneur."
],
[
"Ian Barry is an Australian director of film and TV."
],
[
"Stan Parks is a visual effects artist.",
"He was nominated at the 73rd Academy Awards for his work on the film\" Hollow Man\".",
"He shared his nomination with Scott E. Anderson, Craig Hayes and Scott Stokdyk.",
"This was in the category of Best Visual Effects."
],
[
"John Farrell is the director of YouTube in Latin America."
],
[
"Michael Govan( born 1963) is the director of the Los Angeles County Museum of Art since 2006.",
"Prior to this, Govan worked as the director of the Dia Art Foundation in New York City."
],
[
"Ham Tran (\"Trần Hàm\") is a Vietnamese American film writer, editor, and director.",
"He is of Hoa (Chinese Vietnamese) ancestry, specifically Teochew.",
"He earned an MFA in film directing from the UCLA Film School and is most famous for his thesis film \"The Anniversary\", which was shortlisted for the Academy Award for Live Action Short Film.",
"His most recent effort is the feature film \"How to Fight in Six Inch Heels\" which had its U.S. premiere on 13 March 2014 at CAAMFest in San Francisco.",
"Tran also directed the full-length Vietnamese boat people and re-education camp drama, \"Journey From the Fall\", which was picked up in North America by ImaginAsian Pictures and released nationally on March 23, 2007.",
"In 2009, Tran received the Vilcek Prize for Creative Promise in Filmmaking."
],
[
"Peter Levin is an American director of film, television and theatre."
],
[
"Hollow (Vietnamese: Đoạt Hồn) is a 2014 Vietnamese horror film directed by Ham Tran."
],
[
"John Donatich is the Director of Yale University Press."
]
],
"title": [
"Hollow Man 2",
"Dana Blankstein",
"Ian Barry (director)",
"Stan Parks",
"John Farrell (businessman)",
"Michael Govan",
"Ham Tran",
"Peter Levin",
"Hollow (2014 film)",
"John Donatich"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"Hollow (2014 film)",
"Ham Tran"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which country the director of film Hollow (2014 Film) is from?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"U.S.A.",
"US of A",
"US",
"United States of America",
"USA",
"the United States of America",
"United States",
"the US",
"🇺🇸",
"the United States",
"U.S.",
"America",
"the USA"
]
},
"style": "rule"
}
|
{
"index": 91,
"split": "test"
}
|
dev_8938
|
Who died first, Christopher Monck, 2Nd Duke Of Albemarle or Chang Kyou-Chul?
|
[
"Christopher Monck, 2nd Duke of Albemarle",
"Christopher Monck, 2Nd Duke Of Albemarle",
"Christopher Monck"
] |
{
"context": {
"content": [
[
"Charles Lennox, 2nd Duke of Richmond, 2nd Duke of Lennox, 2nd Duke of Aubigny,( 18 May 17018 August 1750) was a British nobleman, peer, and politician.",
"He was the son of Charles Lennox, 1st Duke of Richmond, an illegitimate son of King Charles II.",
"He held a number of posts in connection with his high office but is best remembered for his patronage of cricket.",
"He has been described as the most important of the sport's early patrons and did much to help its evolution from village cricket to first- class cricket."
],
[
"\" Louis\"- Charles- Élie- Amanien Decazes de Glücksbierg, 2nd Duke Decazes and 2nd Duke of Glücksbierg( 29 May 1819 – 16 September 1886), was a French diplomat and statesman."
],
[
"Diana Cicely Keppel, Countess of Albemarle, DBE( née Grove, 6 August 1909 – 6 May 2013), married Walter Egerton George Lucian Keppel, son of Arnold Allen Cecil Keppel, 8th Earl of Albemarle, as his second wife on 24 February 1931 at St Columba's Church, London.",
"She became Countess of Albemarle from 12 April 1942 when her husband became the 9th Earl of Albemarle."
],
[
"Chang Kyou-chul( a.k.a. Chang Sun-gil; June 19, 1946 – April 19, 2000) was an Olympic boxing bronze medalist in South Korea.",
"He renamed his name Kyou- chul to Sun-gil."
],
[
"James Francis( Jacobo Francisco) Fitz- James Stuart, 2nd Duke of Berwick, 2nd Duke of Liria and Xérica( Saint- Germain- en- Laye, France, 21 October",
"1696 – Naples, Italy, 2 June 1738) was a Jacobite and Spanish nobleman.",
"He inherited titles in the Jacobite and Spanish nobility on the death of his father in battle in 1734 at Philippsburg, during the War of the Polish Succession.",
"He was also Consort Duke of Veragua and of la Vega and a Consort- Marquis of four further titles by his Spanish marriage in 1716.",
"His full title was 2nd Duke of Berwick, 2nd Earl of Tinmouth, 2nd Baron Bosworth, 2nd Duke of Liria and Xerica, Grandee of Spain 1st class( from 1716), Knight of the Order of the Golden Fleece( from 29 September 1714), as well as the holder of several earldoms."
],
[
"William Graham, 2nd Duke of Montrose( 27 August 1712 – 23 September 1790) was the son of James Graham, 1st Duke of Montrose, and his wife, Christian Carnegie.",
"He married Lucy Manners, daughter of John Manners, 2nd Duke of Rutland on 28 October 1742.",
"He died at age 78 at Twickenham, London, England."
],
[
"Edmond de Talleyrand- Périgord, 2nd Duke of Talleyrand, 2nd Duke of Dino( 1 August 1787, Paris – 14 May 1872, Florence), was a French general of the Napoleonic Wars."
],
[
"Elizabeth Monck, Duchess of Albermarle( 22 February",
"1654 – 11 September 1734), later Elizabeth Montagu, Duchess of Montagu, was the eldest daughter of Henry Cavendish, 2nd Duke of Newcastle, and his wife, Frances Pierrepont( 1630 – 1695; daughter of the Hon. William Pierrepont).",
"She was known for most of her life as\" the Mad Duchess of Albemarle\".",
"Lady Elizabeth Cavendish married Christopher Monck( later Duke of Albemarle) on 30 December 1669 at Whitehall, London.",
"She went with her husband to Jamaica when he was appointed Lieutenant Governor in 1687; there Monck amassed a small fortune, which Elizabeth acquired and brought with her back to England upon his death in the following year( 1688).",
"From this time, the widowed Elizabeth declared that she would only marry into royalty and was convinced that the Kangxi Emperor of Qing Dynasty China wished to marry her; suitably dressed as the Emperor of China, her sister- in- law Elizabeth's stepfather, the Duke of Montagu, asked for her hand in marriage and they were wed on 8 September 1692 in Newcastle House, London.",
"Elizabeth died in 1734 and was buried in Westminster Abbey."
],
[
"Christopher Monck, 2nd Duke of Albemarle (14 August 1653 – 6 October 1688) was an English soldier and politician who sat in the House of Commons from 1667 to 1670 when he inherited the Dukedom and sat in the House of Lords."
],
[
"James Douglas, 2nd Duke of Queensberry and 1st Duke of Dover( 18 December 16626 July 1711) was a Scottish nobleman."
]
],
"title": [
"Charles Lennox, 2nd Duke of Richmond",
"Louis, duc Decazes",
"Diana Keppel, Countess of Albemarle",
"Chang Kyou-chul",
"James Fitz-James Stuart, 2nd Duke of Berwick",
"William Graham, 2nd Duke of Montrose",
"Edmond de Talleyrand-Périgord",
"Elizabeth Monck, Duchess of Albemarle",
"Christopher Monck, 2nd Duke of Albemarle",
"James Douglas, 2nd Duke of Queensberry"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"Christopher Monck, 2nd Duke of Albemarle",
"Chang Kyou-chul"
]
},
"type": "comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Who died first, Christopher Monck, 2Nd Duke Of Albemarle or Chang Kyou-Chul?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Christopher Monck, 2nd Duke of Albemarle",
"Christopher Monck, 2Nd Duke Of Albemarle",
"Christopher Monck"
]
},
"style": "rule"
}
|
{
"index": 92,
"split": "test"
}
|
dev_590
|
What is the date of death of Matilda Of Boulogne, Duchess Of Brabant's mother?
|
[
"25 July 1182"
] |
{
"context": {
"content": [
[
"Baldwin V of Flanders( c. 1012 – 1 September 1067) was Count of Flanders from 1035 until his death.",
"He was the son of Baldwin IV, Count of Flanders, who died in 1035."
],
[
"Marie I or Mary (1136 – 25 July 1182 in St Austrebert, Montreuil, France) was the \"suo jure\" Countess of Boulogne from 1159 to 1170.",
"She also held the post of Abbess of Romsey for five years until her abduction by Matthew of Alsace, who forced her to marry him."
],
[
"Matilda of Boulogne (1170 – 16 October 1210) was the younger daughter of Matthew, Count of Boulogne and Marie I, Countess of Boulogne.",
"Matilda became Duchess of Brabant by her marriage to Henry I, Duke of Brabant.",
"Matilda's parents' marriage was annulled the year she was born and her mother became a Benedictine nun at St. Austrebert, Montreuil and died in 1182.",
"Matilde's father continued to reign as Count of Boulogne until his death in 1173, when her older sister Ida became countess.",
"At the age of nine, Matilda married Henry I, Duke of Brabant, in 1179.",
"The couple went on to have:",
"Matilde died in 1210 or 1211.",
"She was buried at St. Peter's in Leuven."
],
[
"Theresa of Portugal( Coimbra, c. 1151; – Veurne, 6 May 1218; or) was Countess of Flanders by marriage to Philip I, Count of Flanders, and Duchess of Burgundy by marriage to Odo III, Duke of Burgundy.",
"She was the daughter of the Portuguese king Afonso I and Matilda of Savoy.",
"She served as co-regent of Portugal with her brother during the illness of their father Afonso I of Portugal from 1172 until 1173, and regent of Flanders during the interim period after the death of her spouse and the accession of his heir."
],
[
"Michèle Kahn Spike is an American lawyer, historian, and lecturer.",
"She graduated from Boston University School of Law in 1976 and became a member of the Bar of the State of New York in 1977, concentrating in international corporate law.",
"From the late 1980s until 2011, she lived in Florence, Italy together with her husband, noted art historian John Spike.",
"Since fall 2012 she has been teaching at the Marshall- Wythe School of Law at the College of William& Mary, and she was appointed as Visiting Professor of the Practice of Law in 2017.",
"She is known for her biography of Matilda of Tuscany,\" Tuscan Countess: The Life and Extraordinary Times of Matilda of Canossa\", which was published to favorable reviews in 2004.",
"She also collaborated with John Spike on his study of Caravaggio and Mattia Preti.",
"Spike has been a member of the Bar of the State of New York since 1977 and during those 40 years engaged in the practice of international corporate law.",
"In the spring of 2015, in honor of the 900th anniversary of the death of Matilda of Canossa, Spike curated the exhibition, Matilda of Canossa and the Origins of the Renaissance, for the Muscarelle Museum of Art in coordination with the Marshall- Wythe School of Law at the College of William and Mary.",
"The catalogue for that exhibition was published by the Muscarelle Museum of Art.",
"In 2015, Spike also published, An Illustrated Guide to the ‘ One Hundred Churches ’ of Matilda of Canossa, Countess of Tuscany( Centro Di, Florence, 2015).",
"In 2016, also in honor of the 900th anniversary of the death of the Countess, she curated the exhibition,\" Matilda di Canossa( 1046- 1115)",
"La donna che mutò il corso della stori/ Matilda of Canossa( 1046- 1115) the Woman who Changed the Course of History\" for the Casa Buonarroti in Florence.",
"The catalogue, available in Italian and in English, was published by Centro Di, Florence.",
"As a member of the Standing Commission on Anglican and International Peace with Justice Concerns of the Episcopal Church( USA) 2003- 2009, Spike participated in delegations to Palestine and Israel as well as to United Nations conferences in New York City and Amman, Jordan.",
"Since 2005, Spike has been Chancellor to the Bishop of the Convocation of Episcopal Churches in Europe.",
"She most recently lectured on the Rule of Law at the Jerusalem Peacebuilders Camp in Brattleboro, Vermont."
],
[
"Judith of Flanders( or Judith of France)( 843 – c. 870) was queen consort of Wessex and countess consort of Flanders.",
"She was the eldest daughter of the West Frankish King and later Holy Roman Emperor Charles the Bald and his wife Ermentrude of Orléans.",
"Through her marriages to two Kings of Wessex, Æthelwulf and Æthelbald, she was twice a queen.",
"Her first two marriages were childless, but through her third marriage to Baldwin, she became the first Countess of Flanders and an ancestor of later Counts of Flanders.",
"One of her sons by Baldwin married Ælfthryth, a daughter of Æthelbald's brother, Alfred the Great.",
"She was also an ancestor of Matilda of Flanders, the consort of William the Conqueror, and thus of later monarchs of England."
],
[
"Margaret I of Flanders( c. 1145- died 15 November 1194) was ruling countess of Flanders\" suo jure\" from 1191 to her death.",
"She was the daughter of Thierry, Count of Flanders, and Sibylla of Anjou, and the heiress of her childless brother, Philip of Flanders."
],
[
"Arnulf of Flanders( c. 890 – 27 March 965), called the Great, was the first Count of Flanders."
],
[
"Matilda of Béthune( died 8 November 1264), was countess of Flanders by marriage to Guy, Count of Flanders, and heir to the titles Lady of Béthune, of Dendermonde, of Richebourg and of Warneton, as well as Advocatess of the Abbey of Saint Vaast at Arras.",
"She was the mother of Robert, Count of Flanders, known as Robert of Béthune after his mother."
],
[
"Margaret I( 1310 – 9 May 1382), was a ruling Countess Palatine of Burgundy and Artois from 1361 and 1382.",
"She was also countess of Flanders, Nevers and Rethel by marriage to Louis I, Count of Flanders, and regent of Flanders during the minority of her son Louis II, Count of Flanders in 1346."
]
],
"title": [
"Baldwin V, Count of Flanders",
"Marie I, Countess of Boulogne",
"Matilda of Boulogne, Duchess of Brabant",
"Theresa of Portugal, Countess of Flanders",
"Michèle Kahn Spike",
"Judith of Flanders",
"Margaret I, Countess of Flanders",
"Arnulf I, Count of Flanders",
"Matilda of Béthune",
"Margaret I, Countess of Burgundy"
]
},
"supporting_facts": {
"sent_id": [
0,
0
],
"title": [
"Matilda of Boulogne, Duchess of Brabant",
"Marie I, Countess of Boulogne"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: What is the date of death of Matilda Of Boulogne, Duchess Of Brabant's mother?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"25 July 1182"
]
},
"style": "rule"
}
|
{
"index": 93,
"split": "test"
}
|
dev_6049
|
Which film has the director who died later, Gang Bullets or Day Of The Idiots?
|
[
"Day of the Idiots",
"Day Of The Idiots"
] |
{
"context": {
"content": [
[
"Elliot Silverstein( born August 3, 1927, in Boston, Massachusetts) is an American director, who is best known for being the director of the movie\" Cat Ballou\"( 1965)."
],
[
"Day of the Idiots is a 1981 West German psychological fantasy film drama directed and written by Werner Schroeter.",
"It stars Carole Bouquet as a disturbed mental patient with an inclination to remove her clothes and Ingrid Caven as Dr. Laura and Christine Kaufmann as Ruth.",
"The film was nominated for a Golden Palm Award at the 1982 Cannes Film Festival and won best film at the 1982 German Film Awards going to director Werner Schroeter."
],
[
"G. Marthandan is an Indian film director who works in Malayalam cinema."
],
[
"Gang Bullets is a 1938 American crime drama film directed by Lambert Hillyer.",
"The film is also known as The Crooked Way in the United Kingdom."
],
[
"Brian Patrick Kennedy( born 5 November 1961) is an Irish- born art museum director who has worked in Ireland and Australia, and now lives and works in the United States.",
"He is currently the director of the Peabody Essex Museum.",
"He was the director of the Toledo Museum of Art in Ohio from 2010 to 2019.",
"He was the director of the Hood Museum of Art from 2005 to 2010, and the National Gallery of Australia( Canberra) from 1997- 2004."
],
[
"Lambert Harwood Hillyer( July 8, 1893 – July 5, 1969) was an American film director and screenwriter."
],
[
"Werner Schroeter (7 April 1945 – 12 April 2010) was a German film director, screenwriter, and opera director known for his stylistic excess.",
"Schroeter was cited by Rainer Werner Fassbinder as an influence both on his own work and on German cinema at large."
],
[
"Abhishek Saxena is an Indian Bollywood and Punjabi film director who directed the movie Phullu.",
"The Phullu movie was released in theaters on 16 June 2017, in which film Sharib Hashmi is the lead role.",
"Apart from these, he has also directed Patiala Dreamz, this is a Punjabi film.",
"This film was screened in cinemas in 2014."
],
[
"Ben Palmer is a British film and television director who is known for being the director of\" Bo' Selecta\" and\" The Inbetweeners\"."
],
[
"Bodil Jørgensen( born 3 March 1961) is a Danish film actress.",
"She has appeared in 40 films since 1992.",
"She starred in\" The Idiots\", which was entered into the 1998 Cannes Film Festival."
]
],
"title": [
"Elliot Silverstein",
"Day of the Idiots",
"G. Marthandan",
"Gang Bullets",
"Brian Kennedy (gallery director)",
"Lambert Hillyer",
"Werner Schroeter",
"Abhishek Saxena",
"Ben Palmer",
"Bodil Jørgensen"
]
},
"supporting_facts": {
"sent_id": [
0,
0,
0,
0
],
"title": [
"Gang Bullets",
"Day of the Idiots",
"Lambert Hillyer",
"Werner Schroeter"
]
},
"type": "bridge_comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film has the director who died later, Gang Bullets or Day Of The Idiots?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Day of the Idiots",
"Day Of The Idiots"
]
},
"style": "rule"
}
|
{
"index": 94,
"split": "test"
}
|
dev_9543
|
Who is the spouse of the director of film One, But A Lion!?
|
[
"Karin Molander"
] |
{
"context": {
"content": [
[
"Mehdi Abrishamchi is an Iranian People's Mujahedin of Iran( MEK) politician who has been described as\" the right hand man of Massoud Rajavi\".",
"He is the former spouse of the group's current leader, Maryam Rajavi, whom he divorced so that she could become the spouse of Massoud Rajavi."
],
[
"Heather Denise Gibson is a Scottish economist currently serving as Director- Advisor to the Bank of Greece( since 2011).",
"She is the spouse of Euclid Tsakalotos, former Greek Minister of Finance."
],
[
"Maria Teresa, Grand Duchess of Luxembourg( born María Teresa Mestre y Batista; on 22 March 1956), is the spouse of Grand Duke Henri."
],
[
"One, But a Lion!",
"(Swedish:",
"En, men ett lejon!)",
"is a 1940 Swedish comedy film directed by Gustaf Molander and starring Håkan Westergren, Annalisa Ericson and Marianne Aminoff.",
"The film's art direction was by Arne Åkermark."
],
[
"Adib Kheir was a leading Syrian nationalist of the 1920s.",
"He was the owner of the Librairie Universelle in Damascus.",
"His granddaughter is the spouse of Manaf Tlass."
],
[
"Princess Auguste of Bavaria( 28 April 1877 – 25 June 1964) was a member of the Bavarian Royal House of Wittelsbach and the spouse of Archduke Joseph August of Austria."
],
[
"Sophia Magdalena of Denmark (3 July 1746 – 21 August 1813) was Queen of Sweden as the spouse of King Gustav III."
],
[
"Gustaf Harald August Molander (18 November 1888 – 19 June 1973) was a Swedish actor and film director.",
"His parents were director Harald Molander, Sr.",
"(1858–1900) and singer and actress Lydia Molander, \"née\" Wessler, and his brother was director Olof Molander (1892–1966).",
"He was the father of director and producer Harald Molander from his first marriage to actress Karin Molander and father to actor Jan Molander from his second marriage to Elsa Fahlberg (1892-1977).",
"Gustaf Molander was born in Helsingfors (now Helsinki) in the Grand Duchy of Finland (in the Russian Empire), where his father was working at the Swedish Theatre.",
"He studied in the school of the Royal Dramatic Theatre in Stockholm 1907–1909, acted at the Swedish theatre in Helsingfors 1909–1913, and then at the Royal Dramatic Theatre from 1913–1926.",
"The last years there he headed the school; his students included Greta Garbo.",
"Molander wrote several screenplays for Victor Sjöström and Mauritz Stiller, and was helped by the latter to get employment as a director for Svensk Filmindustri, where he worked 1923–1956.",
"All in all, he directed sixty-two films.",
"He often worked with Gösta Ekman, and his films include \"Intermezzo\" (1936), which became Ingrid Bergman's breakthrough and paved her way to America, where she starred in the 1939 Hollywood remake of the film."
],
[
"Gertrude of Saxony and Bavaria( 1152/55–1197) was Duchess of Swabia as the spouse of Duke Frederick IV, and Queen of Denmark as the spouse of King Canute VI."
],
[
"Marie Louise Coidavid( 1778 – March 11, 1851), was the Queen of the Kingdom of Haiti 1811 – 20 as the spouse of Henri I of Haiti."
]
],
"title": [
"Mehdi Abrishamchi",
"Heather D. Gibson",
"Maria Teresa, Grand Duchess of Luxembourg",
"One, But a Lion!",
"Adib Kheir",
"Princess Auguste of Bavaria (1875–1964)",
"Sophia Magdalena of Denmark",
"Gustaf Molander",
"Gertrude of Bavaria",
"Marie-Louise Coidavid"
]
},
"supporting_facts": {
"sent_id": [
3,
3
],
"title": [
"One, But a Lion!",
"Gustaf Molander"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Who is the spouse of the director of film One, But A Lion!?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"Karin Molander"
]
},
"style": "rule"
}
|
{
"index": 95,
"split": "test"
}
|
dev_9052
|
Where was the performer of song Old Money (Lana Del Rey Song) born?
|
[
"the five boroughs",
"City of New York",
"New York",
"NY City",
"New York City",
"NYC",
"Big Apple",
"New York, New York"
] |
{
"context": {
"content": [
[
"Royce Lynn Money( born July 13, 1942) is an American academic administrator who served as president of Abilene Christian University) from 1991 to 2010, whereupon he became Chancellor.",
"He was succeeded as president by Phil Schubert."
],
[
"Israel Campbell Money( born 31 August 1960 in Maybole, Ayrshire) is a Scottish former football goalkeeper and former manager of Stranraer, Ayr United and Stenhousemuir."
],
[
"Kristian Leontiou (born February 1982) is a British singer of Greek Cypriot descent, and is the singer for the indie rock band One eskimO."
],
[
"Elizabeth Woolridge Grant (born June 21, 1985), known professionally as Lana Del Rey, is an American singer and songwriter.",
"Her music is noted by critics for its stylized cinematic quality; its themes of tragic romance, glamour, and melancholia; and its references to pop culture, particularly 1950s and 1960s Americana.",
"Born in New York City and raised in Upstate New York, Del Rey returned to New York City in 2005 to begin her music career.",
"Following numerous projects including her debut studio album and the unreleased \"Sirens\", Del Rey's breakthrough came after the viral success of her debut single \"Video Games\" in 2011.",
"She signed with Interscope and Polydor later that year.",
"Her major label debut \"Born to Die\" (2012) proved an international success and spawned her only top-ten single on the \"Billboard\" Hot 100 with the Cedric Gervais remix of \"Summertime Sadness\", as well as the internationally successful singles \"Blue JeansBorn to Die\", and \"National Anthem\".",
"Del Rey released the Grammy-nominated EP, \"Paradise\" in 2012.",
"The next year, Del Rey ventured into film as she wrote, directed and starred in the short music film, \"Tropico\", and released \"Young and Beautiful\" as the lead single for the romantic drama film \"The Great Gatsby\" (2013).",
"Del Rey subsequently issued her sophomore major label effort, \"Ultraviolence\" (2014), to critical success, topping the charts and spawning the single, \"West Coast\".",
"That same year, Del Rey recorded the eponymous theme for the drama film \"Big Eyes\", which garnered her Grammy and Golden Globe nominations.",
"Shortly afterwards, Del Rey released \"Honeymoon\" (2015) and \"Lust for Life\" (2017), the latter of which topped the charts in the U.S. and was nominated for the Grammy Award for Best Pop Vocal Album.",
"Her sixth studio album, \"Norman Fucking Rockwell!\"",
"(2019), received widespread critical acclaim and two Grammy nominations, including Album of the Year.",
"In 2019, Del Rey also released the singles \"Doin' Time\" and \"Don't Call Me Angel\", the latter being a trio with Ariana Grande and Miley Cyrus.",
"As of 2019, Del Rey has sold 16 million albums worldwide.",
"She has sold over 5 million singles in the UK and has sold 5.5 million RIAA-certified units in the US.",
"Her YouTube and Vevo pages have combined lifetime views of 3.5 billion."
],
[
"Billy Milano is a Bronx- born heavy metal musician now based in Austin, Texas.",
"He is the singer and- occasionally- guitarist and bassist of crossover thrash band M.O.D., and he was also the singer of its predecessor, Stormtroopers of Death.",
"He was also the singer of United Forces, which also featured his Stormtroopers of Death bandmate Dan Lilker."
],
[
"Bernard Bonvoisin, known as Bernie Bonvoisin( born 9 July 1956 in Nanterre, Hauts- de- Seine), is a French hard rock singer and film director.",
"He is best known for having been the singer of Trust.",
"He was one of the best friends of Bon Scott the singer of AC/ DC and together they recorded the song\" Ride On\" which was one of the last songs by Bon Scott."
],
[
"\"Old Money\" is a song by American singer Lana Del Rey from her third studio album, \"Ultraviolence\" (2014).",
"It was written by Del Rey, Robbie Fitzsimmons and producer Daniel Heath.",
"The song has drawn comparisons to Nino Rota's \"Love Theme from \"Romeo and Juliet.",
"The song charted at number 190 in France."
],
[
"\" O Valencia!\" is the fifth single by the indie rock band",
"The Decemberists, and the first released from their fourth studio album,\" The Crane Wife\".",
"The music was written by The Decemberists and the lyrics by Colin Meloy.",
"It tells a story of two star- crossed lovers.",
"The singer falls in love with a person who belongs to an opposing gang.",
"At the end of the song, the singer's lover jumps in to defend the singer, who is confronting his lover's brother( the singer's\" sworn enemy\") and is killed by the bullet intended for the singer."
],
[
"Astrid North( Astrid Karina North Radmann; 24 August 1973, Berlin – 25 June 2019, Berlin) was a German soul singer and songwriter.",
"She was the singer of the German band, with whom she released five Albums.",
"As guest singer of the band she published three albums."
],
[
"Caspar Babypants is the stage name of children's music artist Chris Ballew, who is also widely known as the singer of The Presidents of the United States of America."
]
],
"title": [
"Royce Money",
"Campbell Money",
"Kristian Leontiou",
"Lana Del Rey",
"Billy Milano",
"Bernie Bonvoisin",
"Old Money (Lana Del Rey song)",
"O Valencia!",
"Astrid North",
"Caspar Babypants"
]
},
"supporting_facts": {
"sent_id": [
0,
2
],
"title": [
"Old Money (Lana Del Rey song)",
"Lana Del Rey"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Where was the performer of song Old Money (Lana Del Rey Song) born?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"the five boroughs",
"City of New York",
"New York",
"NY City",
"New York City",
"NYC",
"Big Apple",
"New York, New York"
]
},
"style": "rule"
}
|
{
"index": 96,
"split": "test"
}
|
dev_2426
|
Which film has the director died earlier, Il Disertore or The Battle Of Algiers?
|
[
"The Battle of Algiers",
"Battle of Algiers",
"The Battle Of Algiers",
"La Bataille d'Alger"
] |
{
"context": {
"content": [
[
"Henri Antoine Marie Teissier( born 21 July 1929) is a French- Algerian Catholic Bishop of Algiers and Archbishop Emeritus of Algiers."
],
[
"Admiral of the Fleet Sir George Rooke( 1650 – 24 January 1709) was an English naval officer.",
"As a junior officer he saw action at the Battle of Solebay and again at the Battle of Schooneveld during the Third Anglo- Dutch War.",
"As a captain, he conveyed Prince William of Orange to England and took part in the Battle of Bantry Bay during the Williamite War in Ireland.",
"As a flag officer, Rooke commanded a division of the Royal Navy during their defeat at the Battle of Beachy Head.",
"He also commanded a division at the Battle of Barfleur and distinguished himself at the Battle of La Hogue.",
"He was later defeated while escorting a convoy at the Battle of Lagos.",
"Rooke commanded the unsuccessful allied expedition against Cádiz but on the passage home he destroyed the Spanish treasure fleet at the Battle of Vigo Bay in the opening stages of the War of the Spanish Succession.",
"He also commanded the allied naval forces at the capture of Gibraltar and attacked the French fleet at the Battle of Málaga."
],
[
"Maurice Audin( 14 February 1932 – after 11 June 1957) was a French mathematics assistant at the University of Algiers, a member of the Algerian Communist Party and an activist in the anticolonialist cause, who died under torture by the French state during the Battle of Algiers.",
"In the centre of Algiers, beside the university, the intersection of streets bearing the names of several other heroes of the Algerian Revolution is called the Place Maurice- Audin."
],
[
"The Battle of Algiers is a 1966 Italian- Algerian historical war film co-written and directed by Gillo Pontecorvo and starring Jean Martin and Saadi Yacef.",
"It is based on events by rebels during the Algerian War( 1954 – 62) against the French government in North Africa; the most prominent being the titular Battle of Algiers, the capital of Algeria.",
"It was shot on location and the film's score was composed by Ennio Morricone.",
"The film was shot in a Roberto Rossellini- inspired newsreel style: in black and white with documentary- type editing to add to its sense of historical authenticity.",
"It is often associated with Italian neorealist cinema.",
"The film has been critically celebrated.",
"Both insurgent groups and state authorities have considered it to be an important commentary on urban guerrilla warfare.",
"It occupies the 48th place on the\" Critics' Top 250 Films\" of the 2012\" Sight& Sound\" poll, as well as 120th place on\" Empire\" magazine's list of the' 500 greatest movies of all time'.",
"It was selected to enter the list of the\" 100 Italian films to be saved\".",
"The film concentrates on the years between 1954 and 1957 when guerrilla fighters regrouped and expanded into the Casbah, the citadel of Algiers.",
"Their actions were met by French paratroopers attempting to regain territory.",
"The highly dramatic film is about the organization of a guerrilla movement and the illegal methods, such as torture, used by the colonial power to contain it.",
"Algeria succeeded in gaining independence from the French, which Pontecorvo addresses in the film's epilogue.",
"A subject of socio-political controversy, the film was not screened for five years in France; it was released in 1971."
],
[
"Gillo Pontecorvo( 19 November 1919 – 12 October 2006) was an Italian filmmaker.",
"He worked as a film director for more than a decade before his best known film\" La battaglia di Algeri The Battle of Algiers\", 1966) was released.",
"It won the Golden Lion at the Venice Film Festival in 1966.",
"His other films include\" Kapò\"( 1960), which takes place in a World War II concentration camp, and\" Burn! Queimada\", 1969), starring Marlon Brando and loosely based on the failed slave revolution in Guadeloupe.",
"In 2000, he received the Pietro Bianchi Award at the Venice Film Festival.",
"He was also a screenwriter and composer of film scores, and a close friend of Italian President Giorgio Napolitano."
],
[
"Count was a Japanese general in the Imperial Japanese Army.",
"He was the head of the Japanese First Army during the Russo- Japanese War; and his forces enjoyed a series of successes during the Manchurian fighting at the Battle of Yalu River, the Battle of Liaoyang, the Battle of Shaho and the Battle of Mukden."
],
[
"Liao Lei was a KMT general from Guangxi.",
"He participated in the Battle of Shanghai, the Battle of Wuhan, and the Battle of Xuzhou and the Battle of Suixian – Zaoyang in Hubei.",
"He died of a stroke."
],
[
"Giuliana Berlinguer( 23 November 1933 – 15 September 2014) was an Italian director, screenwriter, and novelist."
],
[
"Il disertore( also known as\" The Deserter\") is a 1983 Italian war drama film written and directed by Giuliana Berlinguer.",
"It is based on the novel with the same name by Giuseppe Dessì.",
"It was entered into the competition at the 40th edition of the Venice Film Festival."
],
[
"Hasan Pasha( c. 1517 – 4 July 1572) was the son of Hayreddin Barbarossa and three- times\" Beylerbey\" of the Regency of Algiers.",
"His mother was a Morisca.",
"He succeeded his father as ruler of Algiers, and replaced Barbarossa's deputy Hasan Agha who had been effectively holding the position of ruler of Algiers since 1533."
]
],
"title": [
"Henri Teissier",
"George Rooke",
"Maurice Audin",
"The Battle of Algiers",
"Gillo Pontecorvo",
"Kuroki Tamemoto",
"Liao Lei",
"Giuliana Berlinguer",
"Il disertore",
"Hasan Pasha (son of Barbarossa)"
]
},
"supporting_facts": {
"sent_id": [
0,
0,
0,
0
],
"title": [
"Il disertore",
"The Battle of Algiers",
"Giuliana Berlinguer",
"Gillo Pontecorvo"
]
},
"type": "bridge_comparison"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film has the director died earlier, Il Disertore or The Battle Of Algiers?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"The Battle of Algiers",
"Battle of Algiers",
"The Battle Of Algiers",
"La Bataille d'Alger"
]
},
"style": "rule"
}
|
{
"index": 97,
"split": "test"
}
|
dev_7041
|
When did Bud Roper's father die?
|
[
"April 30, 1971"
] |
{
"context": {
"content": [
[
"Elmo Burns Roper Jr. (July 31, 1900 in Hebron, Nebraska – April 30, 1971 in Redding, Connecticut) was a pollster known for his pioneering work in market research and opinion polling.",
"In 1933, he cofounded Cherington, Wood, and Roper, a marketing research firm.",
"When that partnership fell apart, he founded his own research company, Elmo Roper,",
"Inc. He was hired by Henry Luce in 1935 to run surveys for \"Fortune\", continuing these surveys for 15 years.",
"His prediction of Franklin Delano Roosevelt's electoral victory over Alf Landon in 1936 was correct to within 0.9%; his 1940 prediction of Roosevelt's victory was correct to within 0.5%, further solidifying the reputation of Roper's techniques.",
"In 1940, Roosevelt hired Roper to assess public opinion of Lend-Lease prior to its implementation.",
"In 1942 he was hired by William Joseph Donovan to be the deputy director of the Office of Strategic Services; Roper subsequently worked with the Office of War Information.",
"After leaving the OWI he founded the Roper Center for Public Opinion Research at the University of Connecticut in 1947.",
"Roper Opinion Research Company (the \"Roper Poll\") was later renamed Roper Starch Worldwide Company and eventually acquired by NOP World and then GfK in 2005."
],
[
"William Thomas Smith( born 9 April 1897, date of death unknown) was an English professional footballer."
],
[
"Theodred II was a medieval Bishop of Elmham.",
"The date of Theodred's consecration unknown, but the date of his death was sometime between 995 and 997."
],
[
"Albert Thompson( born 1912, date of death unknown) was a Welsh footballer."
],
[
"Thomas Scott( 1907- date of death unknown) was an English diver."
],
[
"Harry Wainwright( born 1899; date of death unknown) was an English footballer."
],
[
"Etan Boritzer( born 1950) is an American writer of children ’s literature who is best known for his book\" What is God?\" first published in 1989.",
"His best selling\" What is?\" illustrated children's book series on character education and difficult subjects for children is a popular teaching guide for parents, teachers and child- life professionals.",
"Boritzer gained national critical acclaim after\" What is God?\" was published in 1989 although the book has caused controversy from religious fundamentalists for its universalist views.",
"The other current books in the\" What is?\" series include",
"What is Love?,",
"What is Death?,",
"What is Beautiful?,",
"What is Funny?,",
"What is Right?,",
"What is Peace?,",
"What is Money?,",
"What is Dreaming?,",
"What is a Friend?,",
"What is True?,",
"What is a Family?,",
"What is a Feeling?\"",
"The series is now also translated into 15 languages.",
"Boritzer was first published in 1963 at the age of 13 when he wrote an essay in his English class at Wade Junior High School in the Bronx, New York on the assassination of John F. Kennedy.",
"His essay was included in a special anthology by New York City public school children compiled and published by the New York City Department of Education.",
"Boritzer now lives in Venice, California and maintains his publishing office there also.",
"He has helped numerous other authors to get published through\" How to Get Your Book Published!\" programs.",
"Boritzer is also a yoga teacher who teaches regular classes locally and guest- teaches nationally.",
"He is also recognized nationally as an erudite speaker on\" The Teachings of the Buddha.\""
],
[
"Harry Johnson( 1903- date of death unknown) was an English wrestler."
],
[
"David Vaughan (February 3, 1822 – February 28, 1890) was an American farmer, mechanic, and politician.",
"Born in North Wales, Vaughan emigrated with his parents to the United States in 1831 and settled in Remsen, New York.",
"Vaughan went to public school in Remsen, New York.",
"In 1864, he moved to the town of Burns, La Crosse County, Wisconsin.",
"He was a farmer and mechanic.",
"Vaughan served as justice of the peace for the town of Burns.",
"Vaughan served in the Wisconsin State Assembly in 1887 and was a Republican.",
"Vaughan died at his home in the town of Burns, Wisconsin."
],
[
"Burns Worthington \"Bud\" Roper (1925-2003) was an American pollster.",
"He was chairman (1967–93) of Roper ASW, which was founded as Elmo Roper, Inc. by his father Elmo Roper.",
"He was chairman (1970–94) of the Roper Center for Public Opinion Research at the University of Connecticut.",
"He helped develop industry standards for methodology and the wording of poll questions.",
"Two polling controversies occurred during his career; in both cases he showed readiness to admit mistakes."
]
],
"title": [
"Elmo Roper",
"Bill Smith (footballer, born 1897)",
"Theodred II (Bishop of Elmham)",
"Albert Thompson (footballer, born 1912)",
"Thomas Scott (diver)",
"Harry Wainwright (footballer)",
"Etan Boritzer",
"Harry Johnson (wrestler)",
"David Vaughan (Wisconsin politician)",
"Bud Roper"
]
},
"supporting_facts": {
"sent_id": [
1,
0
],
"title": [
"Bud Roper",
"Elmo Roper"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: When did Bud Roper's father die?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"April 30, 1971"
]
},
"style": "rule"
}
|
{
"index": 98,
"split": "test"
}
|
dev_2088
|
Which country the performer of song Gå Och Göm Dig, Åke Tråk is from?
|
[
"se",
"Lili Bech",
"Sweden",
"Kingdom of Sweden",
"SE",
"SWE"
] |
{
"context": {
"content": [
[
"Bernard Bonvoisin, known as Bernie Bonvoisin( born 9 July 1956 in Nanterre, Hauts- de- Seine), is a French hard rock singer and film director.",
"He is best known for having been the singer of Trust.",
"He was one of the best friends of Bon Scott the singer of AC/ DC and together they recorded the song\" Ride On\" which was one of the last songs by Bon Scott."
],
[
"Caspar Babypants is the stage name of children's music artist Chris Ballew, who is also widely known as the singer of The Presidents of the United States of America."
],
[
"Kristian Leontiou (born February 1982) is a British singer of Greek Cypriot descent, and is the singer for the indie rock band One eskimO."
],
[
"\" Det börjar verka kärlek, banne mig It's Beginning to Look Like Love, Damn It\") was the Swedish entry in the Eurovision Song Contest 1968, performed in Swedish by Claes- Göran Hederström, the winner of Melodifestivalen 1968.",
"Composer Peter Himmelstrand, who incidentally also wrote the two songs that finished in shared third place in the 1968 pre-selections,\" Gå och göm dig, Åke Tråk\" and\" Du vet var jag finns\", would go on to write fifteen Melodifestivalen entries as well as many other popular Svensktoppen hits all through the 1960s and 1970s.",
"\" Det börjar verka kärlek, banne mig\" was performed eight on the night, following the Monaco's Line& Willy with\" À chacun sa chanson\" and preceding Finland's Kristina Hautala with\" Kun kello käy\".",
"At the close of voting, it had received 15 points, placing fifth in a field of 17.",
"The song is widely considered a schlager classic in Sweden, mainly because of its- for the genre- highly unconventional title, its use of onomatopoetic expressions and slang and Peter Himmelstrand's humorous lyrics, including phrases like:",
"\" The day before yesterday I suddenly turned down the guys, when they invited me to a party and instead I spent a whole evening conversing with your mother, being polite.\"",
"\" Last night I went with you and watched a tearjerker, although I wanted to watch\" bang bang\".",
"And then I sat listening to a string quartet- me, I only used to like\" twang twang\" It's beginning to look like love, damn it.",
"The kind of thing you read about in books, imagine that.",
"I just do n't get it, I guess I'm a little stupid.",
"But by all means, I could be wrong ...\"",
"The song was succeeded as Swedish representative in the 1969 contest by Tommy Körberg, singing\" Judy, min vän\".",
"An English- language version\" My Time Has Come\", was released in 1980."
],
[
"Dáithí Sproule( born 23 May 1950) is a guitarist and singer of traditional Irish music.",
"His niece is the singer Claire Sproule."
],
[
"Mona Wessman, born 24 March 1948 in Långsele in Sweden, is a Swedish singer.",
"Her breakthrough came together with Nina Lizell and Claes-Göran Hederström in the \"Minishow\" TV show, which aired on Pentecost Saturday 1967.",
"Her definitive breakthrough as a recording artist was \"Gå och göm dej, Åke Tråk\", participating at Melodifestivalen 1968, ending up third, and becoming a Svensktoppen hit peaking at third position.",
"The same year, she toured the folkparks of Sweden together with Nina and Claes-Göran.",
"Sven Lindahl wrote another of her more successful songs, \"Hambostinta i kortkort\".",
"During the autumn of 1968, she toured with Lenne Broberg, Claes-Göran Hederström and Hipp Happy Band, consisting of Mats Westman, Lasse Sandborg, Stefan Möller och Bertil Bodahl.",
"Scoring several Svensktoppen hits by the late 1960s, she also appeared in the 1968 film \"",
"Åsa-Nisse och",
"den stora kalabaliken\".",
"She also participated at Melodifestivalen 1973 where the song she performed, \"Helledudane en sån karl\", ended up last."
],
[
"\"Gå och göm dig, Åke Tråk\" is a song written by Peter Himmelstrand.",
"Mona Wessman participated with the song at Melodifestivalen 1968, where it ended up third.",
"The song also charted at Svensktoppen for five weeks, peaking at third position."
],
[
"Billy Milano is a Bronx- born heavy metal musician now based in Austin, Texas.",
"He is the singer and- occasionally- guitarist and bassist of crossover thrash band M.O.D., and he was also the singer of its predecessor, Stormtroopers of Death.",
"He was also the singer of United Forces, which also featured his Stormtroopers of Death bandmate Dan Lilker."
],
[
"\" Panda\" is a song of the Chilean band Astro and is the fifth song of the homonymous album of the year 2011.",
"The song was composed and produced by the singer of the band Andrés Nusser and released like third single of the album on February 1, 2013."
],
[
"\" O Valencia!\" is the fifth single by the indie rock band",
"The Decemberists, and the first released from their fourth studio album,\" The Crane Wife\".",
"The music was written by The Decemberists and the lyrics by Colin Meloy.",
"It tells a story of two star- crossed lovers.",
"The singer falls in love with a person who belongs to an opposing gang.",
"At the end of the song, the singer's lover jumps in to defend the singer, who is confronting his lover's brother( the singer's\" sworn enemy\") and is killed by the bullet intended for the singer."
]
],
"title": [
"Bernie Bonvoisin",
"Caspar Babypants",
"Kristian Leontiou",
"Det börjar verka kärlek, banne mig",
"Dáithí Sproule",
"Mona Wessman",
"Gå och göm dig, Åke Tråk",
"Billy Milano",
"Panda (Astro song)",
"O Valencia!"
]
},
"supporting_facts": {
"sent_id": [
1,
0
],
"title": [
"Gå och göm dig, Åke Tråk",
"Mona Wessman"
]
},
"type": "compositional"
}
|
2wikimultihopqa
|
[
{
"content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which country the performer of song Gå Och Göm Dig, Åke Tråk is from?\n\n",
"role": "user"
}
] |
fact-reasoning
|
{
"ground_truth": {
"target": [
"se",
"Lili Bech",
"Sweden",
"Kingdom of Sweden",
"SE",
"SWE"
]
},
"style": "rule"
}
|
{
"index": 99,
"split": "test"
}
|