Dataset Viewer
Auto-converted to Parquet

Columns:
- id: string (lengths 6 to 9)
- question: string (lengths 32 to 113)
- golden_answers: list (lengths 1 to 16)
- metadata: dict
- data_source: string (1 distinct value)
- prompt: list (length 1)
- ability: string (1 distinct value)
- reward_model: dict
- extra_info: dict
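For inspection outside the viewer, the Parquet conversion can be loaded directly with the `datasets` library. This is a minimal sketch: the repository id `example-org/2wikimultihopqa-fact-reasoning` and the split name are placeholders, not the actual dataset path.

```python
# Minimal sketch: load the auto-converted Parquet data and inspect one row.
# NOTE: the repo id below is a placeholder; replace it with the real dataset path.
from datasets import load_dataset

ds = load_dataset("example-org/2wikimultihopqa-fact-reasoning", split="test")

row = ds[0]
print(row["id"], "-", row["question"])            # e.g. dev_9260 - Which country ...
print("golden_answers:", row["golden_answers"])   # list of acceptable answer strings
print("data_source:", row["data_source"])         # 2wikimultihopqa
print("ability:", row["ability"])                 # fact-reasoning
print("reward target:", row["reward_model"]["ground_truth"]["target"])
```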
id: dev_9260
question: Which country Haya Bint Hussein's husband is from?
golden_answers: [ "the United Arab Emirates", "🇦🇪", "UAE", "the UAE", "United Arab Emirates", "the Emirates", "the U.A.E.", "ae", "U.A.E.", "Emirates" ]
{ "context": { "content": [ [ "Sumaya bint Hassan( born 14 May 1971) is a princess of Jordan." ], [ "Princess Haya bint Hussein (born 3 May 1974), commonly known as Princess Haya of Jordan, is the daughter of King Hussein of Jordan and his third wife Queen Alia, and the half-sister of King Abdullah II.", "She is a graduate of Oxford University in England and an accomplished equestrian.", "She represented Jordan at the 2000 Summer Olympics in Sydney, Australia and is the two-term President of the International Federation for Equestrian Sports (FEI).", "In addition, she engages in a variety of charitable activities.", "In 2004, Haya became the sixth wife of the ruler of the Emirate of Dubai, Sheikh Mohammed bin Rashid Al Maktoum.", "The Sheikh’s first wife,", "Hind bint Maktoum bin Juma Al Maktoum, remained first lady of Dubai.", "In June 2019, Haya left her husband in Dubai with her two children and is currently residing in her mansion in the United Kingdom, while conducting legal proceedings over custody of her children." ], [ "Princess Qhabl Begum Sahiba Esra Birgen (born 1936) is a princess belonging to the Asaf Jah Dynasty of Hyderabad state by marriage.", "She was married to Prince Mukarram Jah.", "Princess Esra is a princess by marriage not by birth." ], [ "Haya bint Abdulaziz( 1929 – 2 November 2009) was a member of the House of Saud." ], [ "Yousef Al- Samoui is a Jordanian footballer who plays for Al- Hussein." ], [ "Muhannad Al- Ezza is a Jordanian footballer who plays for Al- Hussein." ], [ "Lubaba bint al- Harith( died c. 650), also known as Umm Fadl, was a prominent early Muslim.", "Two of her sisters, Maymuna bint al- Harith and Zaynab bint Khuzayma, became wives of Muhammad." ], [ "Sheikh Mohammed bin Rashid Al Maktoum (born 15 July 1949) is the Vice President and Prime Minister of the United Arab Emirates (UAE), and ruler of the Emirate of Dubai.", "Since his accession in 2006, after the death of his brother Sheikh Maktoum, he has undertaken reforms in the UAE's government, starting with the UAE Federal Government Strategy in April 2007.", "In 2010, he launched the UAE vision 2021 with the aim of making the UAE 'one of the best countries in the world' by 2021.", "He is responsible for the growth of Dubai into a global city, as well as the launch of a number of major enterprises including Emirates Airline, DP World, and the Jumeirah Group.", "Many of these are held by Dubai Holding, a company with multi-diversified businesses and investments.", "Sheikh Mohammed has overseen the development of numerous projects in Dubai including the creation of a technology park and a free economic zone, Dubai Internet City, Dubai Media City, the Dubai International Finance Centre, the Palm Islands and the Burj Al Arab hotel.", "He also drove the construction of Burj Khalifa, the tallest building in the world.", "A keen equestrian, he is the founder of the Maktoum family-owned Godolphin racing stable and the owner of Darley, a thoroughbred breeding operation with operations in six countries.", "In 2012, he rode the horse \"Madji Du Pont\" 160 km to take the FEI World Endurance Championship.", "He is a recognised poet in his native Arabic.", "He shares a close and friendly relationship with Mohammed bin Zayed Al Nahyan, who is the Crown Prince of the Emirate of Abu Dhabi, and the Deputy Supreme Commander of the UAE Armed Forces." ], [ "Malek Al- Yasseri is a Jordanian footballer who plays for Al- Hussein." 
], [ "Tsandzile Dlamini is a princess of Swaziland who currently serves as Minister of Home Affairs." ] ], "title": [ "Princess Sumaya bint Hassan", "Haya bint Hussein", "Princess Esra", "Haya bint Abdulaziz Al Saud", "Yousef Al-Samoui", "Muhannad Al-Ezza", "Lubaba bint al-Harith", "Mohammed bin Rashid Al Maktoum", "Malek Al-Yasseri", "Tsandzile Dlamini" ] }, "supporting_facts": { "sent_id": [ 4, 0 ], "title": [ "Haya bint Hussein", "Mohammed bin Rashid Al Maktoum" ] }, "type": "compositional" }
data_source: 2wikimultihopqa
[ { "content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which country Haya Bint Hussein's husband is from?\n\n", "role": "user" } ]
ability: fact-reasoning
reward_model: { "ground_truth": { "target": [ "the United Arab Emirates", "🇦🇪", "UAE", "the UAE", "United Arab Emirates", "the Emirates", "the U.A.E.", "ae", "U.A.E.", "Emirates" ] }, "style": "rule" }
extra_info: { "index": 0, "split": "test" }
id: dev_8565
question: Which film has the director born later, The Abyss or Going Hollywood?
golden_answers: [ "Abyss", "The Abyss" ]
{ "context": { "content": [ [ "James Francis Cameron( born August 16, 1954) is a Canadian filmmaker, artist, and environmentalist, who is best known for making science fiction and epic films for the Hollywood mainstream.", "Cameron first gained recognition for directing\" The Terminator\"( 1984).", "He found further critical and commercial success with\" Aliens\"( 1986),\" The Abyss\"( 1989),( 1991) and\" True Lies\"( 1994).", "His greatest big-budget productions have been\" Titanic\"( 1997) and\" Avatar\"( 2009), the former earning him Academy Awards in Best Picture, Best Director and Best Film Editing.", "\" Avatar,\" filmed in 3 D technology, also garnered him nominations in the same categories.", "He also co-founded Lightstorm Entertainment, Digital Domain and Earthship Productions.", "In addition to his filmmaking, he is a National Geographic explorer of the sea and has produced a number of documentaries on the subject.", "Cameron contributed to underwater filming and remote vehicle technologies and helped create the digital 3D Fusion Camera System.", "In 2012, Cameron became the first person to perform a solo descent to the bottom of the Mariana Trench, the deepest part of the Earth's ocean, in the\" Deepsea Challenger\" submersible.", "In total, Cameron's films have grossed approximately US$ 2 billion in North America and US$ 6 billion worldwide.", "Cameron's\" Avatar\" and\" Titanic\" are the second and third highest- grossing films of all time, earning$ 2.78 billion and$ 2.19 billion, respectively.", "Cameron holds the achievement of having directed the first two of the five films in history to gross over$ 2 billion worldwide.", "In 2010,\" Time\" magazine named Cameron one of the 100 most influential people in the world." ], [ "The Abyss is a 1989 American science fiction film written and directed by James Cameron and starring Ed Harris, Mary Elizabeth Mastrantonio, and Michael Biehn.", "When an American submarine sinks in the Caribbean, the U.S. search and recovery team works with an oil platform crew, racing against Soviet vessels to recover the boat.", "Deep in the ocean, they encounter something unexpected.", "The film was released on August 9, 1989, receiving generally positive reviews and grossed$ 90 million.", "It won the Academy Award for Best Visual Effects and was nominated for three more Academy awards." ], [ "Going Hollywood is a 1933 American pre-Code musical film directed by Raoul Walsh and starring Marion Davies and Bing Crosby.", "It was written by Donald Ogden Stewart and based on a story by Frances Marion.", "\" Going Hollywood\" was released by Metro- Goldwyn- Mayer on December 22, 1933." ], [ "Fred R. Krug is an American film and television producer- director born in Bern, Switzerland." ], [ "Claude Weisz is a French film director born in Paris." ], [ "Yolonda Ross is an American character actress, writer and director born (31 July 1974)." ], [ "W. Augustus Barratt( 1873- 1947) was a Scottish- born, later American, songwriter and musician." ], [ "Jacques Décombe is a French author, actor and director born in 1953." ], [ "Raoul A. 
Walsh( March 11, 1887 December 31, 1980) was an American film director, actor, founding member of the Academy of Motion Picture Arts and Sciences( AMPAS) and the brother of the silent screen actor George Walsh.", "He was known for portraying John Wilkes Booth in the silent classic\" The Birth of a Nation\"( 1915) and for directing such films as\" The Big Trail\"( 1930), starring John Wayne,\" High Sierra\"( 1941), starring Ida Lupino and Humphrey Bogart; and\" White Heat\"( 1949), starring James Cagney and Edmond O'Brien.", "He directed his last film in 1964." ], [ "Sepideh Farsi is an Iranian film director, born in Tehran in 1965." ] ], "title": [ "James Cameron", "The Abyss", "Going Hollywood", "Fred Roy Krug", "Claude Weisz", "Yolonda Ross", "W. Augustus Barratt", "Jacques Décombe", "Raoul Walsh", "Sepideh Farsi" ] }, "supporting_facts": { "sent_id": [ 0, 0, 0, 0 ], "title": [ "The Abyss", "Going Hollywood", "James Cameron", "Raoul Walsh" ] }, "type": "bridge_comparison" }
data_source: 2wikimultihopqa
[ { "content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film has the director born later, The Abyss or Going Hollywood?\n\n", "role": "user" } ]
ability: fact-reasoning
reward_model: { "ground_truth": { "target": [ "Abyss", "The Abyss" ] }, "style": "rule" }
extra_info: { "index": 1, "split": "test" }
id: dev_5183
question: Who is the paternal grandfather of Alexios Komnenos (Son Of Andronikos I)?
golden_answers: [ "Isaac Komnenos", "Isaac Komnenos (son of Alexios I)" ]
{ "context": { "content": [ [ "Jiajak Jaqeli was the Empress consort of Alexios II of Trebizond." ], [ "Rev. John Westley( 1636 – 78) was an English nonconformist minister.", "He was the grandfather of John Wesley( founder of Methodism)." ], [ "Abd al- Muttalib Shaybah ibn Hashim( c. 497 – 578) was the grandfather of Islamic prophet Muhammad." ], [ "Nikephoros Euphorbenos Katakalon was a Byzantine aristocrat and son-in- law of Emperor Alexios I Komnenos.", "He was the son of the distinguished general Constantine Euphorbenos Katakalon, one of Alexios' most trusted officials.", "As a token of Alexios' appreciation for Constantine, Nikephoros was married to the\" porphyrogennete\" Maria, the emperor's second daughter, and raised to the rank of\" panhypersebastos\".", "From the\" Alexiad\", he is known to have participated in combat against the Cumans in 1095, alongside his father, where he distinguished himself by his bravery.", "It is unknown when he died, but he is recorded as being dead by 1130.", "With Maria, he had numerous children, but only two sons, Alexios Komnenos and Andronikos, are known by name, having held senior positions later in the century.", "Another son, John, is known only by his commemoration in the\" typikon\" of the Monastery of Christ Philanthropos.", "The couple also had an unknown number of daughters( at least three according to Varzos), as their existence is mentioned in the\" typikon\" of the Kecharitomene Monastery." ], [ "Kaya Alp was, according to Ottoman tradition, the son of Kızıl Buğa and the father of Suleyman Shah, who was, in turn, the grandfather of Ertuğrul, and the great grandfather of the Ottoman Empire founder, Osman I." ], [ "George (Georgios) Palaiologos or Palaeologus was a Byzantine general, one of the most prominent military commanders and supporters of Emperor Alexios I Komnenos (r. 1081–1118).", "He was the son of the first known member of the Palaiologos family, the \"strategos\" of Mesopotamia Nikephoros Palaiologos.", "His wife Anna Doukaina was the sister of Irene Doukaina, the wife of Alexios I Komnenos, making him the emperor's brother-in-law.", "As Alexios Komnenos’ leading general and close friend, he played an important role in his campaigns, especially the Battle of Dyrrhachium against the Normans or the Battle of Levounion against the Pechenegs.", "He was the chief source used by Anna Komnene in her \"Alexiad\" on her father’s battles, and is very favourably portrayed by her in her book as both capable and loyal." ], [ "Lyon Cohen( 1868–1937) was a Polish- born Canadian businessman and a philanthropist.", "He was the grandfather of singer/ poet Leonard Cohen." ], [ "Nicholas Kallikles was a prominent physician and a leading court poet active in the Byzantine court in Constantinople during the reigns of Alexios I Komnenos( r. 1081– 1118) and John II Komnenos( r. 1118–1143)." ], [ "Alexios Komnenos (1170 – 1199) was a natural son of Andronikos I Komnenos, the Byzantine Emperor", "(r. 1183 – 1185) by his relative and mistress Theodora Komnene, Queen Dowager of Jerusalem.", "During the reign of Emperor Manuel I Komnenos (r. 
1143–1180), Alexios accompanied his father Andronikos in exile, visiting, inter alia, the Kingdom of Georgia.", "The Georgian king George III, their relative, granted to Andronikos several castles in Kakhetia in the east of Georgia.", "Andronikos returned to Constantinople and attained to the Byzantine crown in 1183, only to be overthrown and killed in 1185.", "Alexios then fled to Georgia, where he was restored to his father's Georgian estates.", "At one point, he was even considered by some Georgian nobles as a candidate to become a consort of the queen regnant Tamar of Georgia.", "According to the Georgian historical tradition, Alexios's progeny flourished in Georgia, producing the noble family of Andronikashvili, i.e., \"scions of Andronikos\", so named after Alexios's purported son.", "In spite of the extremely fragmentary nature of the early Andronikashvili pedigree, Professor Cyril Toumanoff (1976) accepted the Komnenian origin as plausible, but the evidence marshaled by Kuršankis (1977) suggests that this may have been only a legend.", "Toumanoff also assumed that the line of the \"provincial kings\" of Alastani (c. 1230–1348), known from the medieval Georgian sources and including one named Andronikos, might have belonged to the Georgian Komnenoi/Andronikashvili." ], [ "Andronikos I Komnenos (– 12 September 1185), usually Latinized as Andronicus I Comnenus, was Byzantine Emperor from 1183 to 1185.", "He was the son of Isaac Komnenos and the grandson of the emperor", "Alexios I." ] ], "title": [ "Jiajak Jaqeli", "John Westley", "Abd al-Muttalib", "Nikephoros Katakalon", "Kaya Alp", "George Palaiologos", "Lyon Cohen", "Nicholas Kallikles", "Alexios Komnenos (son of Andronikos I)", "Andronikos I Komnenos" ] }, "supporting_facts": { "sent_id": [ 0, 1 ], "title": [ "Alexios Komnenos (son of Andronikos I)", "Andronikos I Komnenos" ] }, "type": "inference" }
data_source: 2wikimultihopqa
[ { "content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Who is the paternal grandfather of Alexios Komnenos (Son Of Andronikos I)?\n\n", "role": "user" } ]
ability: fact-reasoning
reward_model: { "ground_truth": { "target": [ "Isaac Komnenos", "Isaac Komnenos (son of Alexios I)" ] }, "style": "rule" }
extra_info: { "index": 2, "split": "test" }
id: dev_4272
question: Which film has the director who was born later, Alice Of Wonderland In Paris or The Roundabouts Of Handsome Karl?
golden_answers: [ "Alice Of Wonderland In Paris", "Alice of Wonderland in Paris" ]
{ "context": { "content": [ [ "Eugene Merril Deitch( born August 8, 1924) is an American illustrator, animator and film director.", "Based in Prague since 1959, Deitch is known for creating animated cartoons such as\" MunroTom Terrific\", and\" Nudnik\", as well as his work on the\" Popeye\" and\" Tom and Jerry\" series." ], [ "Alice Maud Fanner, later Alice Maud Taite,( 1865 – 1930) was a British landscape painter." ], [ "Alice A. Andrews( 1866 – 1946), later Alice A. Parker, was an American pianist and composer, and musical director of the Andrews Family Opera.", "Later in life, she was a vocal coach and instructor in New York City." ], [ "Carl August Hugo Froelich", "( 5 September 1875 – 12 February 1953) was a German film pioneer and film director.", "He was born and died in Berlin." ], [ "Hassan Zee is a Pakistani- American film director who was born in Chakwal, Pakistan." ], [ "W. Augustus Barratt( 1873- 1947) was a Scottish- born, later American, songwriter and musician." ], [ "Rumbi Katedza is a Zimbabwean Film Producer and Director who was born on 17 January 1974." ], [ "Alice of Norfolk or Alice of Brotherton( c. 1324 – c. 30 January 1352) was an English noblewoman.", "She was the daughter of Thomas of Brotherton, and a granddaughter of King Edward I of England.", "She married Edward Montagu, 1st Baron Montagu." ], [ "The Roundabouts of Handsome Karl( German: Die Umwege des schönen Karl) is a 1938 German comedy film directed by Carl Froelich and starring Heinz Rühmann, Karin Hardt and Sybille Schmitz.", "It portrays the experiences of a young waiter during the Great Depression." ], [ "Alice of Wonderland in Paris or Alice in Paris is a 1966 Czech- American animated film directed by Gene Deitch and produced by William L. Snyder in extreme limited animation." ] ], "title": [ "Gene Deitch", "Alice Maud Fanner", "Alice A. Andrews", "Carl Froelich", "Hassan Zee", "W. Augustus Barratt", "Rumbi Katedza", "Alice of Norfolk", "The Roundabouts of Handsome Karl", "Alice of Wonderland in Paris" ] }, "supporting_facts": { "sent_id": [ 0, 0, 0, 1 ], "title": [ "Alice of Wonderland in Paris", "The Roundabouts of Handsome Karl", "Gene Deitch", "Carl Froelich" ] }, "type": "bridge_comparison" }
data_source: 2wikimultihopqa
[ { "content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film has the director who was born later, Alice Of Wonderland In Paris or The Roundabouts Of Handsome Karl?\n\n", "role": "user" } ]
ability: fact-reasoning
reward_model: { "ground_truth": { "target": [ "Alice Of Wonderland In Paris", "Alice of Wonderland in Paris" ] }, "style": "rule" }
extra_info: { "index": 3, "split": "test" }
id: dev_3346
question: Who is Eudocia (Daughter Of Valentinian Iii)'s sibling-in-law?
golden_answers: [ "Anicius Olybrius", "Olybrius", "Flavius Anicius Olybrius" ]
{ "context": { "content": [ [ "Nick Sidi( born 22 February 1966 in Manchester, England) is an English actor.", "He is the son -in- law of actress Rosalind Knight and brother- in- law of actress Susannah Elliott- Knight.", "He often plays drama or comedy roles on television and frequently appears in BBC dramas." ], [ "Eudocia or Eudoxia (439 – 466/474?) was the eldest daughter of Roman emperor Valentinian III and his wife, Licinia Eudoxia.", "She was thus the granddaughter on her mother's side of Eastern emperor Theodosius II and his wife, the poet Aelia Eudocia; and on her father's side of Western emperor Constantius III and his wife Galla Placida.", "In the mid-440s, at age 5, Eudocia was betrothed to Huneric, son of the Vandal king Gaiseric (and then a hostage in Italy).", "This engagement served to strengthen the alliance between the Western court and the Vandal kingdom in Africa.", "Their marriage did not take place at this time, however, because Eudocia was not yet of age.", "Eudocia's father was assassinated in 455, and his successor, Petronius Maximus, compelled Eudocia's mother to marry him and Eudocia herself to marry his son, Palladius.", "In response, the Vandals (reportedly at the request of Eudocia's mother) invaded Italy and captured Eudocia, her mother, and her younger sister, Placidia.", "After 7 years Eudocia's mother and sister were sent to Constantinople, while Eudocia remained in Africa and married Huneric c. 460.", "They had a son, Hilderic, who reigned as king of the Vandals, from 523-530.", "At some time following the birth of Hilderic, Eudocia withdrew to Jerusalem due to religious differences with her Arian husband.", "She died there and was buried in the sepulcher of her grandmother, Aelia Eudocia." ], [ "Robert Woodrow Levering( October 3, 1914 – August 11, 1989) was a U.S. Representative from Ohio, son-in- law of Usher L. Burdick and brother- in- law of Quentin N. Burdick." ], [ "Antony Charles Robert Armstrong- Jones, 1st Earl of Snowdon( 7 March 193013 January 2017), commonly known as Lord Snowdon, was a British photographer and filmmaker.", "He was the husband of Princess Margaret and brother- in- law of Queen Elizabeth II." ], [ "Charles Francis Norton( 1807 – 1835) was a 19th- century Member of Parliament for Guildford.", "Norton was also a Captain of 52nd Light Infantry; son-in- law of Sir Colin Campbell; brother- in- law of writer Caroline Norton" ], [ "Gertrude Elizabeth, Lady Colin Campbell (\"née\" Blood; 3 May 1857 – 1 November 1911) was an Irish-born journalist, author, playwright, and editor.", "She was married to Lord Colin Campbell, a brother-in-law of Princess Louise, Queen Victoria's fourth daughter." ], [ "Wyndham Goold( 1812-1854) was a Member of Parliament( MP) for County Limerick from 1850 to 1854.", "Goold was educated at Westminster School and Trinity College, Dublin.", "He was the son in law of Frederick Falkiner Goold, Archdeacon of Raphoe and brother in law of Bishop Hamilton Verschoyle" ], [ "Placidia was the wife of Olybrius, unrecognized Western Roman Emperor.", "Her full name is uncertain.", "The \"Chronicle of the Roman Emperors:", "The reign by reign record of the rulers of Imperial Rome\" (1995) by Chris Scarre gives her name as Galla Placidia Valentiniana or Galla Placidia the Younger, based on naming conventions for women in ancient Rome." 
], [ "Robert Vadra is an Indian businessman, entrepreneur and the husband of Priyanka Gandhi.", "He is the son -in- law of Sonia Gandhi and Rajiv Gandhi, and the brother- in- law of Rahul Gandhi." ], [ "Gregory B. Penner( born December 18, 1969) is an American businessman and venture capitalist.", "Penner was named the chairman of Walmart on June 5, 2015.", "He is the son-in- law of S. Robson Walton and the grandson- in- law of Sam Walton, the founder of Walmart." ] ], "title": [ "Nick Sidi", "Eudocia (daughter of Valentinian III)", "Robert W. Levering", "Antony Armstrong-Jones, 1st Earl of Snowdon", "Charles Francis Norton", "Gertrude Elizabeth Blood", "Wyndham Goold", "Placidia", "Robert Vadra", "Greg Penner" ] }, "supporting_facts": { "sent_id": [ 6, 0 ], "title": [ "Eudocia (daughter of Valentinian III)", "Placidia" ] }, "type": "inference" }
data_source: 2wikimultihopqa
[ { "content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Who is Eudocia (Daughter Of Valentinian Iii)'s sibling-in-law?\n\n", "role": "user" } ]
ability: fact-reasoning
reward_model: { "ground_truth": { "target": [ "Anicius Olybrius", "Olybrius", "Flavius Anicius Olybrius" ] }, "style": "rule" }
extra_info: { "index": 4, "split": "test" }
id: dev_10972
question: Who was born first out of Felix Cicaterri and Rajiv Gandhi?
golden_answers: [ "Felix Cicaterri" ]
{ "context": { "content": [ [ "John William Joseph McMahon( 28 December 1917 – 8 May 2001) was an Australian- born first- class cricketer who played for Surrey and Somerset in England from 1947 to 1957." ], [ "Henry Walter Moore( 1849 – 20 August 1916) was an English- born first- class cricketer who spent most of his life in New Zealand." ], [ "Punjala Shiv Shankar( 10 August 1929 – 27 February 2017) was an Indian politician.", "He served as the Minister of External Affairs, Law, and Petroleum.", "He was a very influential minister in Indira Gandhi's and Rajiv Gandhi's cabinets and was one of the most senior politicians in India.", "He also served as Governor of Sikkim from 1994 to 1995 and Governor of Kerala from 1995 to 1996." ], [ "Pendekanti Venkatasubbaiah was an Indian politician.", "He was the Governor of Bihar from 15 March 1985 to 25 February 1988 and the Governor of Karnataka, India from 26 February 1988 to 5 February 1990.", "He was elected to the Lok Sabha and was the Union Minister of State for Home and Parliamentary Affairs in both Indira Gandhi and Rajiv Gandhi cabinet." ], [ "Hartley W Lobban (9 May 1926 – 15 October 2004) was a Jamaican-born first-class cricketer who played 17 matches for Worcestershire in the early 1950s." ], [ "Felix Cicaterri,( S.J.)( July 3, 1804 – July 15, 1873) was the third president of Santa Clara University, California, United States.", "He was also a one- time rector of Verona's Jesuit College.", "In 1848 he was made president of the Jesuit College at Vienna now the University of Vienna.", "After immigrating to the United States, he was a professor of divinity at St. John's College in Fordham, New York.", "Later he taught philosophy at Georgetown College in the United States.", "On March 11, 1857, Cicaterri was appointed Santa Clara University's third president and inherited the financial woes created by construction projects.", "Cicaterri was removed from the University's presidency in 1861 due to Rome's displeasure.", "In January 1873, he was sent to the College of the Sacred Heart in Woodstock, Maryland, where he died July 15, 1873." ], [ "Robert Vadra is an Indian businessman, entrepreneur and the husband of Priyanka Gandhi.", "He is the son -in- law of Sonia Gandhi and Rajiv Gandhi, and the brother- in- law of Rahul Gandhi." ], [ "Thom Adcox- Hernandez is an American actor and voice actor known for his roles as Brian in the prime- time soap opera,\" Falcon Crest\" and the voices of Felix the Cat in\" The Twisted Tales of Felix the Cat\" and Lexington on\" Gargoyles\"." ], [ "Wale Adebanwi( born 1969) is a Nigerian- born first Black Rhodes Professor at St Antony's College, Oxford." 
], [ "Rajiv Gandhi( 20 August 1944 – 21 May 1991) was an Indian politician who served as the 6th Prime Minister of India from 1984 to 1989.", "He took office after the 1984 assassination of his mother, Prime Minister Indira Gandhi, to become the youngest Indian Prime Minister at the age of 40.", "Gandhi was a scion of the politically powerful Nehru – Gandhi family, which had been associated with the Indian National Congress party.", "For much of his childhood, his maternal grandfather Jawaharlal Nehru was Prime Minister.", "Gandhi attended college in the United Kingdom.", "He returned to India in 1966 and became a professional pilot for the state- owned Indian Airlines.", "In 1968, he married Sonia Gandhi; the couple settled in Delhi to a domestic life with their children Rahul Gandhi and Priyanka Gandhi Vadra.", "For much of the 1970s, his mother Indira Gandhi was prime minister and his brother Sanjay Gandhi an MP; despite this, Rajiv Gandhi remained apolitical.", "After Sanjay's death in a fighter plane crash in 1980, Gandhi reluctantly entered politics at the behest of Indira.", "The following year he won his brother's Parliamentary seat of Amethi and became a member of the Lok Sabha—the lower house of India's Parliament.", "As part of his political grooming, Rajiv was made general secretary of the Congress party and given significant responsibility in organising the 1982 Asian Games.", "On the morning of 31 October 1984, his mother was assassinated by one of her bodyguards; later that day, Gandhi was appointed Prime Minister.", "His leadership was tested over the next few days as organised mobs rioted against the Sikh community, resulting in anti-Sikh riots in Delhi.", "That December, Congress party won the largest Lok Sabha majority to date, 411 seats out of 542.", "Rajiv Gandhi's period in office was mired in controversies; perhaps the greatest crises were the Bhopal disaster, Bofors scandal and the Shah Bano case.", "In 1988, he reversed the coup in Maldives, antagonising militant Tamil groups such as PLOTE, intervening and then sending peacekeeping troops to Sri Lanka in 1987, leading to open conflict with the Liberation Tigers of Tamil Eelam( LTTE).", "In mid-1987, the Bofors scandal damaged his corruption- free image and resulted in a major defeat for his party in the 1989 election.", "Gandhi remained Congress President until the elections in 1991.", "While campaigning for the elections, he was assassinated by a suicide bomber from the LTTE.", "His widow Sonia became the president of the Congress party in 1998 and led the party to victory in the 2004 and 2009 parliamentary elections.", "His son Rahul Gandhi is a Member of Parliament and the current President of Indian National Congress.", "In 1991, Indian government posthumously awarded Gandhi the Bharat Ratna, the country's highest civilian award.", "At the India Leadership Conclave in 2009, the\" Revolutionary Leader of Modern India\" award was conferred posthumously on Gandhi." ] ], "title": [ "John McMahon (Surrey and Somerset cricketer)", "Henry Moore (cricketer)", "P. Shiv Shankar", "Pendekanti Venkatasubbaiah", "Hartley Lobban", "Felix Cicaterri", "Robert Vadra", "Thom Adcox-Hernandez", "Wale Adebanwi", "Rajiv Gandhi" ] }, "supporting_facts": { "sent_id": [ 0, 0 ], "title": [ "Felix Cicaterri", "Rajiv Gandhi" ] }, "type": "comparison" }
data_source: 2wikimultihopqa
[ { "content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Who was born first out of Felix Cicaterri and Rajiv Gandhi?\n\n", "role": "user" } ]
ability: fact-reasoning
reward_model: { "ground_truth": { "target": [ "Felix Cicaterri" ] }, "style": "rule" }
extra_info: { "index": 5, "split": "test" }
id: dev_11734
question: Did Larry Godfrey and Barbara Strang share the same nationality?
golden_answers: [ "yes" ]
{ "context": { "content": [ [ "Several people share the surname Dugès:" ], [ "Gregory La Cava( March 10, 1892 – March 1, 1952) was an American film director of Italian descent best known for his films of the 1930s, including\" My Man Godfrey\" and\" Stage Door\", which earned him nominations for Academy Award for Best Director." ], [ "Godfrey- Proctor,( Godfrey and Proctor), was a British car manufacturer( circa late 1920s- 1928) founded by Henry Ronald Godfrey( H.R. Godfrey) and Stuart Proctor at Richmond, Surrey in England.", "Godfrey left G.N.( Godfrey- Nash) to form H.R. Godfrey Motors, providing G.N. spares and rebuilds, which evolved into Godfrey and Proctor who both serviced GN's and produced their own car based on the Austin 7.", "About 10 cars had been produced when the company ceased trading in 1928.", "Godfrey went on to found the HRG Engineering Company in 1935 and S.R. Proctor joined in 1950 as technical director, replacing original partner Guy Robbins." ], [ "Cry Wolf is a 1947 American mystery film directed by Peter Godfrey and featuring Errol Flynn and Barbara Stanwyck, based on the novel of the same name by Marjorie Carleton." ], [ "Unexpected Uncle is a 1941 American comedy- drama film directed by Peter Godfrey and starring Charles Coburn, Anne Shirley, and James Craig." ], [ "Alf, Bill and Fred is a 1964 short animated film directed by Bob Godfrey and written by Stan Hayward." ], [ "Several people share the surname Lacordaire:" ], [ "Malcolm Pointon( died February 2007) was a pianist and lecturer from Thriplow, England, and the subject of the film\" Malcolm and Barbara- A Love Story\" shown in 1999, and, more recently, of an Independent Television program entitled\" Malcolm and Barbara: Love ’s Farewell\", broadcast on Wednesday, 8 August 2007." ], [ "Barbara Strang née", "Barbara Mary Hope Carr later Lady Strang (20 April 1925 – 11 April 1982) was a British English language scholar." ], [ "Laurence Paul Godfrey( born 9 June 1976, in Bristol) is a British archer.", "He studied for an HND in Mechanical Manufacturing at the University of the West of England from 1997 to 2000." ] ], "title": [ "Dugès", "Gregory La Cava", "Godfrey-Proctor", "Cry Wolf (1947 film)", "Unexpected Uncle", "Alf, Bill and Fred", "Lacordaire", "Malcolm Pointon", "Barbara Strang", "Larry Godfrey" ] }, "supporting_facts": { "sent_id": [ 0, 1 ], "title": [ "Larry Godfrey", "Barbara Strang" ] }, "type": "comparison" }
2wikimultihopqa
[ { "content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Did Larry Godfrey and Barbara Strang share the same nationality?\n\n", "role": "user" } ]
fact-reasoning
{ "ground_truth": { "target": [ "yes" ] }, "style": "rule" }
{ "index": 6, "split": "test" }
dev_5147
Where was the director of film A Winter Of Cyclists born?
[ "Chill Factor", "Southern Ireland", "ie", "Éire", "ireland", "Eire", "Ireland, Republic of", "Hibernia", "Irish", "IRL", "Ireland", "IE", "Republic of Ireland", "🇮🇪" ]
{ "context": { "content": [ [ "Olav Aaraas( born 10 July 1950) is a Norwegian historian and museum director.", "He was born in Fredrikstad.", "From 1982 to 1993 he was the director of Sogn Folk Museum, from 1993 to 2010 he was the director of Maihaugen and from 2001 he has been the director of the Norwegian Museum of Cultural History.", "In 2010 he was decorated with the Royal Norwegian Order of St. Olav." ], [ "Ian Barry is an Australian director of film and TV." ], [ "Jesse Edward Hobson( May 2, 1911 – November 5, 1970) was the director of SRI International from 1947 to 1955.", "Prior to SRI, he was the director of the Armour Research Foundation." ], [ "Peter Levin is an American director of film, television and theatre." ], [ "Mike Prendergast (born 21 May 1977) is an Irish rugby union coach who started his professional career with Munster.", "He also played a season with both Bourgoin in France and Gloucester in England before returning for one final season with Munster.", "He played as a scrum-half and represented Young Munster in the All-Ireland League.", "Having previously been director of rugby for amateur Irish club Young Munster, Prendergast joined French club Grenoble as a skills coach in 2013, where former Leinster and Ireland hooker Bernard Jackman was then defence coach.", "Ahead of the 2017–18 Top 14 season, Prendergast joined Oyonnax as their attack coach, before joining Stade Français, where former Munster teammate", "Paul O'Connell was also a coach, as their backs and attack coach for the 2018–19 season, after Oyonnax were relegated from the Top 14.", "After one season with Stade Français, Prendergast departed the club to join their Parisian rivals, Racing 92, where former Munster teammate Ronan O'Gara once coached and also where former Munster players Donnacha Ryan and Simon Zebo are currently playing, as their backs coach on a long-term contract ahead of the 2019–20 season." ], [ "S.N. Mathur was the Director of the Indian Intelligence Bureau between September 1975 and February 1980.", "He was also the Director General of Police in Punjab." ], [ "A Winter of Cyclists is a 2013 documentary film by Mike Prendergast.", "The film chronicles a group of Colorado participants as they attempt to complete a 52-day winter bicycle commuting challenge created by Scot Stucky.", "The challenge, known as “The Icy Bike Winter Commuting Challenge”, was created to encourage people to cycle to work from October to March, during the darker, colder and the snowier months of the year.", "The film captures the wide diversity in which riders adapt to new commuting logistics, adverse weather conditions and riding in the dark.", "And when the Colorado riders encounter an unexpected solidarity across the American continent and into Europe via Facebook, the film provides glimpses into a broader global winter cycling community that is just becoming aware of each other.", "Facebook Group members encourage each other, share weather reports, gear tips, and inspirational and touching personal stories.", "The film concludes with the results of the challenge and the riders reflection on their winter of cycling." ], [ "Alix Madigan is an American film producer known for her work on the 2010 film\" Winter's Bone\", for which she was nominated for an Academy Award." 
], [ "Brian Patrick Kennedy( born 5 November 1961) is an Irish- born art museum director who has worked in Ireland and Australia, and now lives and works in the United States.", "He is currently the director of the Peabody Essex Museum.", "He was the director of the Toledo Museum of Art in Ohio from 2010 to 2019.", "He was the director of the Hood Museum of Art from 2005 to 2010, and the National Gallery of Australia( Canberra) from 1997- 2004." ], [ "Winter in the Woods is a 1956 German drama film directed by Wolfgang Liebeneiner and starring Claus Holm.", "It is a remake of the 1936 film\" Winter in the Woods\"." ] ], "title": [ "Olav Aaraas", "Ian Barry (director)", "Jesse E. Hobson", "Peter Levin", "Mike Prendergast", "S. N. Mathur", "A Winter of Cyclists", "Alix Madigan", "Brian Kennedy (gallery director)", "Winter in the Woods" ] }, "supporting_facts": { "sent_id": [ 0, 0 ], "title": [ "A Winter of Cyclists", "Mike Prendergast" ] }, "type": "compositional" }
2wikimultihopqa
[ { "content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Where was the director of film A Winter Of Cyclists born?\n\n", "role": "user" } ]
fact-reasoning
{ "ground_truth": { "target": [ "Chill Factor", "Southern Ireland", "ie", "Éire", "ireland", "Eire", "Ireland, Republic of", "Hibernia", "Irish", "IRL", "Ireland", "IE", "Republic of Ireland", "🇮🇪" ] }, "style": "rule" }
{ "index": 7, "split": "test" }
dev_3910
Which film has the director who is older than the other, Melody Of Murder or City Of Blood?
[ "Melody Of Murder" ]
{ "context": { "content": [ [ "Dan Milne is a British actor/ director who is possibly best known for his role in\" EastEnders\"." ], [ "Melody of Death is a 1922 British silent crime film directed by Floyd Martin Thornton and starring Philip Anthony, Enid R. Reed and Dick Sutherd.", "It is an adaptation of the 1915 novel\" The Melody of Death\" by Edgar Wallace." ], [ "Elliot Silverstein( born August 3, 1927, in Boston, Massachusetts) is an American director, who is best known for being the director of the movie\" Cat Ballou\"( 1965)." ], [ "Ben Palmer is a British film and television director who is known for being the director of\" Bo' Selecta\" and\" The Inbetweeners\"." ], [ "Scott Fox is a pornographic film director who is a member of the AVN Hall of Fame." ], [ "The Melody of Murder,( Original Title: Mordets Melodi), is a 1944 Danish film noir directed by Bodil Ipsen and starring Gull- Maj Norin and Poul Reichhardt.", "The dark tale revolves around a sexually ambiguous serial killer whose crimes are committed to a French cabaret song, and the police suspect a chanteuse who sings the same tune during her performances.", "Produced during the German occupation of Denmark in World War II, the film was praised by critics for its\" stylish cinematography, thematic intensity, and dark vision of Copenhagen;\" and consider it one of the most influential Danish films of that period." ], [ "Bodil Ipsen( 30 August 1889 – 26 November 1964) was a Danish actress and film director, and is considered one of the great stars of Danish cinematic history.", "Her acting career, which began in theater and silent films, was marked by leading roles in large folk comedies and melodramas.", "However, it was as a director that she was most influential: directing the first Danish film noir and making several dark psychological thrillers during the 1940s and 1950s.", "Ipsen's name along with that of Bodil Kjer is given to Denmark's most celebrated film prize, the Bodil Award." ], [ "Darrell James Roodt( born in Johannesburg, April 28, 1962) is a South African film director, screenwriter and producer.", "He is probably most well known for his 1992 film\" Sarafina!\" which starred actress Whoopi Goldberg.", "Also regarded as South Africa's most prolific film director, Roodt has worked with the late Patrick Swayze in\" Father Hood\", James Earl Jones in\" Cry, the Beloved Country\" and Ice Cube in\" Dangerous Ground\"." ], [ "City of Blood is a 1983 South African thriller film directed by Darrell Roodt, and starring Joe Stewardson, Ian Yule, Ken Gampu." ], [ "White: Melody of Death( lit.", "White: The Melody of the Curse) is a 2011 South Korean horror film by Kim Gok and Kim Sun." ] ], "title": [ "Dan Milne", "Melody of Death", "Elliot Silverstein", "Ben Palmer", "Scotty Fox", "Melody of Murder", "Bodil Ipsen", "Darrell Roodt", "City of Blood", "White: Melody of Death" ] }, "supporting_facts": { "sent_id": [ 0, 0, 0, 0 ], "title": [ "Melody of Murder", "City of Blood", "Bodil Ipsen", "Darrell Roodt" ] }, "type": "bridge_comparison" }
2wikimultihopqa
[ { "content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film has the director who is older than the other, Melody Of Murder or City Of Blood?\n\n", "role": "user" } ]
fact-reasoning
{ "ground_truth": { "target": [ "Melody Of Murder" ] }, "style": "rule" }
{ "index": 8, "split": "test" }
dev_4351
Where was the director of film In The Soup (1936 Film) born?
[ "Weston-super-Mare, Somerset", "Weston-super-Mare", "Weston super Mare" ]
{ "context": { "content": [ [ "Henry Edwards (18 September 1882 – 2 November 1952) was an English actor and film director.", "He appeared in 81 films between 1915 and 1952.", "He also directed 67 films between 1915 and 1937.", "Edwards was married to actress Chrissie White, who co-starred in a number of his films.", "He was born in Weston-super-Mare, Somerset and died in Chobham, Surrey." ], [ "Dana Blankstein- Cohen( born March 3, 1981) is the director of the Israeli Academy of Film and Television.", "She is a film director, and an Israeli culture entrepreneur." ], [ "Peter Levin is an American director of film, television and theatre." ], [ "Brian Patrick Kennedy( born 5 November 1961) is an Irish- born art museum director who has worked in Ireland and Australia, and now lives and works in the United States.", "He is currently the director of the Peabody Essex Museum.", "He was the director of the Toledo Museum of Art in Ohio from 2010 to 2019.", "He was the director of the Hood Museum of Art from 2005 to 2010, and the National Gallery of Australia( Canberra) from 1997- 2004." ], [ "Olav Aaraas( born 10 July 1950) is a Norwegian historian and museum director.", "He was born in Fredrikstad.", "From 1982 to 1993 he was the director of Sogn Folk Museum, from 1993 to 2010 he was the director of Maihaugen and from 2001 he has been the director of the Norwegian Museum of Cultural History.", "In 2010 he was decorated with the Royal Norwegian Order of St. Olav." ], [ "Jason Moore( born October 22, 1970) is an American director of film, theatre and television." ], [ "Ian Barry is an Australian director of film and TV." ], [ "S.N. Mathur was the Director of the Indian Intelligence Bureau between September 1975 and February 1980.", "He was also the Director General of Police in Punjab." ], [ "In the Soup is a 1936 British comedy film directed by Henry Edwards and starring Ralph Lynn, Judy Gunn, Morton Selten and Nelson Keys." ], [ "Jesse Edward Hobson( May 2, 1911 – November 5, 1970) was the director of SRI International from 1947 to 1955.", "Prior to SRI, he was the director of the Armour Research Foundation." ] ], "title": [ "Henry Edwards (actor)", "Dana Blankstein", "Peter Levin", "Brian Kennedy (gallery director)", "Olav Aaraas", "Jason Moore (director)", "Ian Barry (director)", "S. N. Mathur", "In the Soup (1936 film)", "Jesse E. Hobson" ] }, "supporting_facts": { "sent_id": [ 0, 4 ], "title": [ "In the Soup (1936 film)", "Henry Edwards (actor)" ] }, "type": "compositional" }
2wikimultihopqa
[ { "content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Where was the director of film In The Soup (1936 Film) born?\n\n", "role": "user" } ]
fact-reasoning
{ "ground_truth": { "target": [ "Weston-super-Mare, Somerset", "Weston-super-Mare", "Weston super Mare" ] }, "style": "rule" }
{ "index": 9, "split": "test" }
dev_6484
Where does Bernard Tschumi's father work at?
[ "École Polytechnique Fédérale de Lausanne", "ETHL", "Ecole polytechnique federale de Lausanne", "EPF Lausanne", "Swiss Federal Institute of Technology in Lausanne", "EPFL", "École polytechnique fédérale de Lausanne" ]
{ "context": { "content": [ [ "Fiona McIntosh( born 1960) is an author of adult and children's books who lives in Australia.", "She was born in Brighton, England and between the ages of three and eight, travelled a lot to Africa due to her father's work.", "At the age of nineteen, she travelled first to Paris and later to Australia, where she has lived ever since.", "In 2007, she released a crime novel,\" Bye Bye Baby\", under the pseudonym of Lauren Crow, however the pen name has since been dropped for the republished edition of\" Bye Bye Baby\" and for the sequel,\" Beautiful Death\"." ], [ "Bernard Tschumi (born 25 January 1944 in Lausanne, Switzerland) is an architect, writer, and educator, commonly associated with deconstructivism.", "Son of the well-known Swiss architect Jean Tschumi and a French mother, Tschumi is a dual French-Swiss national who works and lives in New York City and Paris.", "He studied in Paris and at ETH in Zurich, where he received his degree in architecture in 1969." ], [ "Christopher Shinn is an American playwright.", "His play\" Dying City\"( 2006) was a finalist for the 2008 Pulitzer Prize for Drama, and\" Where Do We Live\"( 2004) won the 2005 Obie Award, Playwriting." ], [ "Jean André Tschumi (born February 14, 1904", "Plainpalais, now a neighborhood of Geneva, Switzerland, died January 25, 1962) was a Swiss architect and professor at the École Polytechnique Fédérale de Lausanne.", "A member of the Modern Movement, Jean Tschumi is known for his buildings for Sandoz (laboratory, factory), Nestlé (pavilions at various international fairs, headquarters), la Mutuelle Vaudoise (headquarters), as well as for some of his projects (underground Paris souterrain, observation tower, etc.)." ], [ "Emanuel Glicen Romano or Emanuel Glicenstein( 1897 – 1984) was a painter born in Italy.", "He emigrated to America and spent some time in Safed in Israel.", "where he organised a museum for his father's work." ], [ "Dave Grossman is an American game programmer and game designer, most known for his work at Telltale Games and early work at LucasArts.", "He has also written several children's books, and a book of\" guy poetry\" called\" Ode to the Stuff in the Sink\"." ], [ "Andrey Ippolitovich Vilkitsky was a Russian hydrographer and surveyor.", "He was born in the Minsk Governorate.", "His son, Boris Vilkitsky followed up his father's work; the Vilkitsky Islands are named after him." ], [ "David Shute is a British journalist, best known for his work at the BBC." ], [ "Hernando de Cabezón, (baptized 7 September 1541 – 1 October 1602) was a Spanish composer and organist, son of Antonio de Cabezón.", "Only a few of his works are extant today, and he is chiefly remembered for publishing the bulk of his father's work." ], [ "Frank Wildhorn( born November 29, 1958) is an American composer known for both his musicals and popular songs.", "He is most known for his musical\" Jekyll& Hyde\", which ran four years on Broadway, and for writing the# 1 International hit song\" Where Do Broken Hearts Go\" for Whitney Houston." ] ], "title": [ "Fiona McIntosh", "Bernard Tschumi", "Christopher Shinn", "Jean Tschumi", "Emanuel Glicen Romano", "Dave Grossman (game developer)", "Andrey Vilkitsky", "David Shute", "Hernando de Cabezón", "Frank Wildhorn" ] }, "supporting_facts": { "sent_id": [ 1, 1 ], "title": [ "Bernard Tschumi", "Jean Tschumi" ] }, "type": "compositional" }
2wikimultihopqa
[ { "content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Where does Bernard Tschumi's father work at?\n\n", "role": "user" } ]
fact-reasoning
{ "ground_truth": { "target": [ "École Polytechnique Fédérale de Lausanne", "ETHL", "Ecole polytechnique federale de Lausanne", "EPF Lausanne", "Swiss Federal Institute of Technology in Lausanne", "EPFL", "École polytechnique fédérale de Lausanne" ] }, "style": "rule" }
{ "index": 10, "split": "test" }
dev_2144
When was Prince Frederick William Of Solms-Braunfels's wife born?
[ "3 March 1778" ]
{ "context": { "content": [ [ "Amalia of Solms- Braunfels( 31 August 1602, Braunfels – 8 September 1675, The Hague), was Princess consort of Orange by marriage to Frederick Henry, Prince of Orange.", "She acted as the political adviser of her spouse during his reign, and acted as his de facto deputy and regent during his infirmity from 1640 – 47.", "She also served as chair of the regency council during the minority of her grandson William III, Prince of Orange from 1650 until 1672.", "She was the daughter of count John Albert I of Solms- Braunfels and countess Agnes of Sayn- Wittgenstein." ], [ "Ferdinand Wilhelm Ernst, 2nd Prince of Solms- Braunfels( 8 February 1721 in Braunfels – 2 October 1783, ibid.) was the second Prince of Solms- Braunfels.", "He was the son of Frederick William, Prince of Solms- Braunfels( 1696 – 1761) by his first wife Princess Magdalena Henrietta of Nassau- Weilburg( 1691 – 1725)." ], [ "Alexander Frederick, Landgrave of Hesse( 25 January 1863, Copenhagen – 26 March 1945) was a Hessian Prince.", "He was the son of Prince Frederick William of Hesse- Kassel and Princess Anna of Prussia.", "From 1888–1925 he was Head of the House of Hesse, but abdicated his position to his brother Prince Frederick Charles of Hesse.", "He was visually handicapped from birth, and this, in addition to his morganatic marriage, played a part in his decision to abdicate." ], [ "Christine Charlotte of Solms- Braunfels( 10 November 1690 in Greifenstein – 16 October 1771 in Homburg) was a Countess of Solms- Braunfels by birth and by marriage Landgravine of Hesse- Homburg." ], [ "Prince Frederick William of Solms- Braunfels( 11 January 1696 in Braunfels – 24 February 1761, Braunfels) was the first Prince of Solms- Braunfels.", "He was the son of Count William Maurice of Solms- Braunfels( 1651 – 1724) and his wife Magdalene Sophie of Hesse- Homburg( 1660 – 1720), a daughter of William Christoph, Landgrave of Hesse- Homburg, and his first wife." ], [ "Wilhelm Christian Karl, 3rd Prince of Solms Braunfels (9 January 1759, in Braunfels – 20 March 1837, in Braunfels) was by succession an immediate Prince, then a nobleman and head of the Princely House of Solms-Braunfels, a Prussian major general and Hessian deputy." ], [ "Prince Frederick William of Solms-Braunfels (22 October 1770 in Braunfels – 13 April 1814 in Slawentzitz) was a Prussian Major General.", "He was the fourth son of Ferdinand William Ernest, 2nd Prince of Solms-Braunfels (1721–1783) and Countess Sophie Christine Wilhelmine of Solms-Laubach (1741–1772).", "He became known mainly through his marriage to Princess Frederica of Mecklenburg-Strelitz.", "She was the widow of Prince Louis Charles of Prussia (1773–1796).", "When she became pregnant in 1798, he married her in order to avoid a scandal.", "The daughter died soon after birth.", "Frederick William was alleged to have been strongly inclined to consume alcohol and had to quit military service in 1805 for health reasons.", "He also lost his income and even his brother advised Frederica to divorce.", "She was initially against it, but when in 1813, she met Ernest Augustus, King of Hanover (1771–1851), she, too, wanted the divorce.", "Before they got around to it, however, Frederick William died, in 1814 in Sławięcice." 
], [ "Frederica of Mecklenburg-Strelitz (\"Friederike Louise Caroline Sophie Charlotte Alexandrine\") (3 March 1778 – 29 June 1841) was a German princess who became, by marriage, princess of Prussia, princess of Solms-Braunfels, Duchess of Cumberland in Britain and Queen of Hanover (in Germany) as the consort of Ernest Augustus, King of Hanover (the fifth son and eighth child of King George III).", "She was born in the \"Altes Palais\" of Hanover as the fifth daughter of Charles II, Duke of Mecklenburg-Strelitz, and his first wife, Frederica, daughter of Prince George William of Hesse-Darmstadt.", "Her father assumed the title of Grand Duke of Mecklenburg on 18 June 1815.", "Duchess Frederica was the niece of her future mother-in-law, Queen Charlotte, through her father." ], [ "Countess Amalie Henriette Charlotte of Solms- Baruth( Kliczków, 30 January", "1768 – Karlsruhe, 31 October 1847) was a countess by birth of Solms- Baruth.", "She was the only child of John Christian II, Count of Solms- Baruth and his wife, Friederike Louise of Reuss- Köstritz." ], [ "Johanetta of Sayn-Wittgenstein (15 February 1561 – 13 April 1622) was a daughter of Count Louis I of Sayn-Wittgenstein and his first wife, Anna of Solms-Braunfels, who was a relative of Amalia of Solms-Braunfels." ] ], "title": [ "Amalia of Solms-Braunfels", "Ferdinand, Prince of Solms-Braunfels", "Alexander Frederick, Landgrave of Hesse", "Christine Charlotte of Solms-Braunfels", "Frederick William, Prince of Solms-Braunfels", "William, Prince of Solms-Braunfels", "Prince Frederick William of Solms-Braunfels", "Frederica of Mecklenburg-Strelitz", "Countess Amalie Henriette of Solms-Baruth", "Johannetta of Sayn-Wittgenstein (1561–1622)" ] }, "supporting_facts": { "sent_id": [ 2, 0 ], "title": [ "Prince Frederick William of Solms-Braunfels", "Frederica of Mecklenburg-Strelitz" ] }, "type": "compositional" }
2wikimultihopqa
[ { "content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: When was Prince Frederick William Of Solms-Braunfels's wife born?\n\n", "role": "user" } ]
fact-reasoning
{ "ground_truth": { "target": [ "3 March 1778" ] }, "style": "rule" }
{ "index": 11, "split": "test" }
dev_11004
Where does Andy Smith (Entrepreneur)'s wife work at?
[ "Leland Stanford Junior University", "The Farm", "Stanford University", "Stanford", "University of Stanford" ]
{ "context": { "content": [ [ "Rear- Admiral Andrew( Andy) Smith is a former Royal Canadian Navy officer.", "Smith was the former Chief of Military Personnel and Chief of Transformation, until retiring in 2013 to move to a senior role with Public Works and Government Services Canada." ], [ "Where Do You Go may refer to:" ], [ "David Shute is a British journalist, best known for his work at the BBC." ], [ "Andy Smith( born 6 July 1984) is an English former professional rugby league footballer who played as a.", "Smith was the first try scorer of the annual magic weekend in 2007." ], [ "Andrew\" Andy\" Smith( born 25 May 1966 in York, England) is a retired British motorcycle speedway rider." ], [ "Dave Grossman is an American game programmer and game designer, most known for his work at Telltale Games and early work at LucasArts.", "He has also written several children's books, and a book of\" guy poetry\" called\" Ode to the Stuff in the Sink\"." ], [ "Andy Smith (born January 9, 1968) is an American entrepreneur, start up advisor, and author.", "He is best known for \"The Dragonfly Effect: Quick, Effective, and Powerful Ways To Use Social Media to Drive Social Change,\" which he wrote with his wife, Dr. Jennifer Aaker.", "Described by the \"New Yorker\" as \"the first book to explain what happens at the intersection of social media like Twitter, marketing and psychologyThe Dragonfly Effect\" examines how people use social media to connect and form groups that create global change.", "In 2014, he partnered with Jay Adelson to form Center Electric, an early-stage technology venture capital firm designed to leverage the growth of the Internet of Things.", "Center Electric was an outgrowth of Smith and Adelson's passion project, Point Option, which develops next generation devices for in-home use." ], [ "Christopher Shinn is an American playwright.", "His play\" Dying City\"( 2006) was a finalist for the 2008 Pulitzer Prize for Drama, and\" Where Do We Live\"( 2004) won the 2005 Obie Award, Playwriting." ], [ "Andrew or Andy Smith may refer to:" ], [ "Jennifer Aaker (born 15 January 1967, California) is an American social psychologist, author, and General Atlantic Professor of Marketing at the Stanford Graduate School of Business.", "Best known for her research on time, money, and happiness, Aaker also focuses on the transmission of ideas through social networks, the power of story in decision making, and how to build global brands across cultures.", "She is the recipient of the Distinguished Scientific Achievement Award from the Society for Consumer Psychology and the Stanford Distinguished Teaching Award." ] ], "title": [ "Andrew Smith (military officer)", "Where Do You Go", "David Shute", "Andy Smith (rugby league)", "Andy Smith (speedway rider)", "Dave Grossman (game developer)", "Andy Smith (entrepreneur)", "Christopher Shinn", "Andrew Smith", "Jennifer Aaker" ] }, "supporting_facts": { "sent_id": [ 1, 0 ], "title": [ "Andy Smith (entrepreneur)", "Jennifer Aaker" ] }, "type": "compositional" }
2wikimultihopqa
[ { "content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Where does Andy Smith (Entrepreneur)'s wife work at?\n\n", "role": "user" } ]
fact-reasoning
{ "ground_truth": { "target": [ "Leland Stanford Junior University", "The Farm", "Stanford University", "Stanford", "University of Stanford" ] }, "style": "rule" }
{ "index": 12, "split": "test" }
dev_10575
Are both On A Magical Night and The Mongols (Film) from the same country?
[ "yes" ]
{ "context": { "content": [ [ "Date at Midnight is a 1959 British crime film from the Danzigers." ], [ "Bombay Dreams is a Swedish comedy film from 2004." ], [ "On a Magical Night is a 2019 French comedy film written and directed by Christophe Honoré and starring Chiara Mastroianni.", "It was screened in the Un Certain Regard section at the 2019 Cannes Film Festival where Mastroianni won the award for Best Performance." ], [ "One and Five is a 1969 short film from Egypt." ], [ "The Mongols is a 1961 Italian/ French international co-production historical adventure film directed by Andre DeToth and Leopoldo Savona, and starring Jack Palance and Anita Ekberg." ], [ "Shima is a 2007 film from Uzbekistan." ], [ "A Kind of America is a Hungarian comedy film from 2002." ], [ "Cayuco is a Spanish documentary film from 2007." ], [ "21 ème Siècle is a 2004 film from Senegal." ], [ "California Winter is an American drama film from 2012." ] ], "title": [ "Date at Midnight", "Bombay Dreams (film)", "On a Magical Night", "One and Five", "The Mongols (film)", "Shima (film)", "A Kind of America", "Cayuco (film)", "21ème Siècle", "California Winter" ] }, "supporting_facts": { "sent_id": [ 0, 0 ], "title": [ "On a Magical Night", "The Mongols (film)" ] }, "type": "comparison" }
2wikimultihopqa
[ { "content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Are both On A Magical Night and The Mongols (Film) from the same country?\n\n", "role": "user" } ]
fact-reasoning
{ "ground_truth": { "target": [ "yes" ] }, "style": "rule" }
{ "index": 13, "split": "test" }
dev_4915
Which film has the director died earlier, L'Arbre, Le Maire Et La Médiathèque or Sette Giorni All'Altro Mondo?
[ "Sette Giorni All'Altro Mondo", "Sette giorni all'altro mondo" ]
{ "context": { "content": [ [ "Jacob Le Maire( c. 1585, Antwerp or Amsterdam- 22 December 1616, at sea) was a Dutch mariner who circumnavigated the earth in 1615 and 1616.", "The strait between Tierra del Fuego and Isla de los Estados was named the Le Maire Strait in his honor, though not without controversy.", "It was Le Maire himself who proposed to the council aboard\" Eendracht\" that the new passage should be called by his name and the council unanimously agreed with Le Maire.", "The author or authors of\" The Relation\" took\" Eendracht\" captain Schouten ’s side by proclaiming:\" Eendracht\" then rounded Cape Horn, proving that Tierra del Fuego was not a continent." ], [ "Sette giorni all' altro mondo is a 1936 Italian comedy film directed by Mario Mattoli." ], [ "William\" Willy\" le Maire de Warzée d' Hermalle( 7 March 1879 – 1966) was a Belgian tennis player.", "He has been named as among the best male tennis players in Belgian history.", "Le Maire de Warzée d' Hermalle reached the quarterfinals of Wimbledon in 1904 and 1908.", "Le Maire de Warzée d' Hermalle was born into the Belgian nobility, the son of Marie- Léon- Hubert le Maire Warzée d' Hermalle( born 1848) and Noëmie Constance Caroline Georgine Valérie de Warzée d' Hermalle( born in Beaufays in 1850).", "His elder brother, Marie- Georges- Gérard- Léon le Maire de Warzée d' Hermalle, Baron de Warzée d' Hermalle( 1877 – 1931), was a diplomat and Belgian ambassador to Iran, Japan and China, who married Dorothy Hall, daughter of Irish writer Owen Hall." ], [ "Le masque de la Méduse is a 2009 fantasy horror film directed by Jean Rollin.", "The film is a modern-day telling of the Greek mythological tale of the Gorgon and was inspired by the 1964 classic Hammer Horror film of the same name and the 1981 cult classic\" Clash of the Titans\".", "It was Rollin's final film, as the director died in 2010." ], [ "Thulasi is a 1987 Tamil romantic drama film directed by Ameerjan.", "The film features Murali and Seetha in lead roles.", "The film, produced by P. S. V. Hariharan, had musical score by Sampath Selvam and was released on 27 November 1987.", "The music director died before the release of the film." ], [ "Un altro mondo( English title: Another World) is a 2010 Italian drama film directed by and starring Silvio Muccino and based on the novel of the same name by Carla Vangelista." ], [ "Mario Mattòli( 30 November 1898 – 26 February 1980) was an Italian film director and screenwriter.", "He directed 86 films between 1934 and 1966.", "His 1939 film\" Defendant, Stand Up!\" was shown as part of a retrospective on Italian comedy at the 67th Venice International Film Festival." ], [ "Querelle is a 1982 West German- French English- language drama film directed by Rainer Werner Fassbinder and starring Brad Davis, adapted from French author Jean Genet's 1947 novel\" Querelle of Brest\".", "It marked Fassbinder's final film as a writer/ director; it was posthumously released just months after the director died of a drug overdose in June 1982." ], [ "L'Arbre, le maire et la médiathèque ou les sept hasards is a French comedy film directed by Éric Rohmer.", "It was shown at the 1993 Montreal World Film Festival where it received the FIPRESCI prize.", "It is known variously by the alternate title\" Les Sept hasards\" and in English translation as\" The Tree, the Mayor and the Mediatheque\"." 
], [ "Jean Marie Maurice Schérer or Maurice Henri Joseph Schérer, known as Éric Rohmer (21 March 192011 January 2010), was a French film director, film critic, journalist, novelist, screenwriter, and teacher.", "Rohmer was the last of the post-World War II French New Wave directors to become established.", "He edited the influential film journal, \"Cahiers du cinéma\", from 1957 to 1963, while most of his colleagues—among them Jean-Luc Godard and François Truffaut—were making the transition from film critics to filmmakers and gaining international attention.", "Rohmer gained international acclaim around 1969 when his film \"My Night at Maud's\" was nominated at the Academy Awards.", "He won the San Sebastián International Film Festival with \"Claire's Knee\" in 1971 and the Golden Lion at the Venice Film Festival for \"The Green Ray\" in 1986.", "Rohmer went on to receive the Venice Film Festival's Career Golden Lion in 2001.", "After Rohmer's death in 2010, his obituary in \"The Daily Telegraph\" described him as \"the most durable filmmaker of the French New Wave\", outlasting his peers and \"still making movies the public wanted to see\" late in his career." ] ], "title": [ "Jacob Le Maire", "Sette giorni all'altro mondo", "William le Maire de Warzée d'Hermalle", "Le Masque de la Méduse", "Thulasi (1987 film)", "Un altro mondo", "Mario Mattoli", "Querelle", "L'Arbre, le maire et la médiathèque", "Éric Rohmer" ] }, "supporting_facts": { "sent_id": [ 0, 0, 0, 0 ], "title": [ "L'Arbre, le maire et la médiathèque", "Sette giorni all'altro mondo", "Éric Rohmer", "Mario Mattoli" ] }, "type": "bridge_comparison" }
2wikimultihopqa
[ { "content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film has the director died earlier, L'Arbre, Le Maire Et La Médiathèque or Sette Giorni All'Altro Mondo?\n\n", "role": "user" } ]
fact-reasoning
{ "ground_truth": { "target": [ "Sette Giorni All'Altro Mondo", "Sette giorni all'altro mondo" ] }, "style": "rule" }
{ "index": 14, "split": "test" }
dev_7491
Are the directors of both films The Great Profile and Kim Kardashian, Superstar from the same country?
[ "yes" ]
{ "context": { "content": [ [ "Kim Kardashian, Superstar( also known simply as Kim K Superstar) is a 2007 pornographic film featuring American television personality Kim Kardashian and singer- actor Ray J.", "It depicts the pair having sexual intercourse in October 2002 while on vacation in Cabo San Lucas, Mexico." ], [ "Hranush Alexis\" Hrush\" Achemyan( born July 19, 1987) is an Armenian- American celebrity make- up artist and beauty influencer.", "She is best known for being a personal stylist for Kylie Jenner and Kim Kardashian." ], [ "Kimberly Noel Kardashian West(\" née\" Kardashian; born October 21, 1980) is an American media personality, model, businesswoman, socialite and actress.", "Kardashian first gained media attention as a friend and stylist of Paris Hilton but received wider notice after a 2002 sex tape,\" Kim Kardashian, Superstar\", with her then- boyfriend Ray J was released in 2007.", "Later that year, she and her family began to appear in the E! reality television series\" Keeping Up with the Kardashians\"( 2007 – present).", "Its success soon led to the creation of spin- offs including\" Kourtney and Kim Take New York\"( 2011–2012) and\" Kourtney and Kim Take Miami\"( 2009 – 2013).", "In recent years, Kim Kardashian has developed an online and social media presence, including hundreds of millions of followers on Twitter and Instagram.", "She has released a variety of products tied to her name, including the 2014 mobile game, a variety of clothing and products, the 2015 photo book\" Selfish\" and her eponymous personal app.", "Her relationship with rapper Kanye", "West has also received significant media coverage; the couple married in 2014, and they have four children together.", "Kardashian has appeared in films such as\" Disaster Movie\"( 2008),\" Deep in the Valley\"( 2009) and( 2013).", "\" Time\" magazine included Kardashian on their list of 2015's 100 most influential people, while\" Vogue\" described her in 2016 as a\" pop culture phenomenon.\"", "Both critics and admirers have described her as exemplifying the notion of being famous for being famous.", "She was reported to be the highest- paid reality television personality of 2015, with her estimated total earnings exceeding US$ 53 million." ], [ "I Want Your", "Love is the title of both a 2010 short film and a 2012 feature- length film.", "Both films were directed and written by Travis Mathews.", "The drama films both revolve around the friends and ex-lovers of Jesse Metzger, a gay man in his mid-thirties who is forced to move back to his hometown from San Francisco due to financial reasons.", "The actors' own names, along with much of their real- life stories, were used for their characters in both films, which features graphic sexual scenes.", "The production of both films was aided by the gay pornographic studio NakedSword.", "This led to the full- length film being refused exemption from classification, which would have allowed it to screen at the Melbourne Queer Film Festival, a decision to which actor James Franco( who invited Mathews to collaborate on Franco's film\" Interior. Leather Bar.\") reacted negatively." 
], [ "William Raymond Norwood Jr.( born January 17, 1981), known professionally as Ray J, is an American rapper, singer, songwriter, television personality, actor, and entrepreneur.", "Born in McComb, Mississippi and raised in Carson, California, he is the brother of recording artist and actress Brandy Norwood and the first cousin of rapper Snoop Dogg.", "In January 2017, he competed in the nineteenth season of the UK reality television programme\" Celebrity Big Brother\"." ], [ "The Great Profile is a 1940 film directed by Walter Lang and starring John Barrymore and John Payne.", "Barrymore lampoons himself.", "A famous actor, given to drink, nearly destroys the show, but his leading lady returns to save it.", "Meanwhile, a young girl tries to reform him." ], [ "Sophie Theallet( Born 1964 in Bagnères- de- Bigorre, France) is a French fashion designer whose clients include First Lady Michelle Obama, Jennifer Lopez, Jessica Alba, Oprah Winfrey, Gabrielle Union, and Kim Kardashian" ], [ "Matt Corboy( born June 4, 1973) is an American actor.", "He has appeared in both films and television series." ], [ "Walter Lang( August 10, 1896 – February 7, 1972) was an American film director." ], [ "Leopoldo Salcedo( March 12, 1912 – June 11, 1998) was a two- time FAMAS award- winning Filipino film actor who specialized in portraying dramatic heroes.", "Dubbed as\" The Great Profile\", he was said to be among the first\" kayumanggi\" or dark- skinned Filipino film stars, in contrast to the lighter- skinned mestizo actors of his generation." ] ], "title": [ "Kim Kardashian, Superstar", "Hrush Achemyan", "Kim Kardashian", "I Want Your Love (film)", "Ray J", "The Great Profile", "Sophie Theallet", "Matt Corboy", "Walter Lang", "Leopoldo Salcedo" ] }, "supporting_facts": { "sent_id": [ 0, 0, 0, 0 ], "title": [ "The Great Profile", "Kim Kardashian, Superstar", "Walter Lang", "Ray J" ] }, "type": "bridge_comparison" }
2wikimultihopqa
[ { "content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Are the directors of both films The Great Profile and Kim Kardashian, Superstar from the same country?\n\n", "role": "user" } ]
fact-reasoning
{ "ground_truth": { "target": [ "yes" ] }, "style": "rule" }
{ "index": 15, "split": "test" }
dev_5180
Which film whose director is younger, Vendredi Soir or El Santo De La Espada?
[ "Vendredi soir", "Vendredi Soir" ]
{ "context": { "content": [ [ "Miloš Zličić( born 29 December 1999) is a Serbian football forward.", "He is a younger brother of Lazar Zličić." ], [ "Leopoldo Torre Nilsson( 5 May 1924 – 8 September 1978), also known as Leo Towers and as Babsy, was an Argentine film director, producer and screenwriter.", "Born as Leopoldo Torres Nilsson( he later changed his paternal surname from Torres to Torre) was the son of Argentine pioneer film director Leopoldo Torres Ríos, with whom he collaborated between 1939 and 1949.", "He debuted in 1947 with the short\" El muro\".", "His mother was an Argentinian citizen of Swedish descent.", "His uncle was cinematographer Carlos Torres Ríos( 1898 – 1956).", "Torre Nilsson's first full- length film,\" El crimen de Oribe\"( 1950), was an adaptation of Adolfo Bioy Casares's novel\" El perjurio de la nieve\".", "In 1954 he directed\" Días de odio\", based on Jorge Luis Borges's short story\" Emma Zunz\".", "In 1956 he directed\" Graciela\", based on Carmen Laforet's novel\" Nada\", winner of Nadal Literary Prize 1944.", "He also directed films about icons of Argentine history and culture:\" Martín Fierro\"( 1968), about the main character of Argentina's national poem;\" El Santo de la Espada\"( 1970), about General José de San Martín; and( 1971), about Martín Miguel de Güemes.", "It was entered into the 7th Moscow International Film Festival.", "His 1973 film\" Los siete locos\" won the Silver Bear at the 23rd Berlin International Film Festival.", "Torre Nilsson was married to writer Beatriz Guido, whose work served as inspiration and who worked alongside him in many of his scripts.", "He is acknowledged as the first Argentine film director to be critically acclaimed outside the country, making Argentina's film production known in important international festivals.", "He died of cancer in his native Buenos Aires in 1978, at the age of 54.", "He was buried at the Cementerio Británico in Buenos Aires.", "A novelized biography of Torre Nilsson,\" El Gran Babsy\", by Mónica Martín, was published in 1993.", "Another biography,\" Leopoldo Torre Nilsson: Imagen y Poesía\" was published in 2006 by the newspaper\" La Nación\" and the Aguilar editorial house." ], [ "Hugo Arana( born July 23, 1943) is an Argentine film, television and theatre actor.", "Arana was born in Buenos Aires, grew up in Monte Grande and moved with his family to Lomas de Zamora and then Lanús.", "He studied acting with Marcello Lavalle and Augusto Fernandez.", "In his first years as an actor, he was part of a theatre group called\" Errare Humanum Est\" and he acted in films such as\" El Santo de la Espada\"( 1970) and\" La tregua\"( 1974).", "In the 1980s, he became popular for his part in an advertisement for Crespi wine, and then for his part in the TV sitcom\" Matrimonios y algo más\", created by Hugo Moser, in which he played two characters who were highly acclaimed by the public: the\" Groncho\"( in the comedy sketch\" El Groncho y La Dama\"( The Shabby Man and the Lady)) and Huguito Araña( a stereotypically feminine gay man).", "He has worked on the Telefé TV series\" Los exitosos Pells\", where he played the director of the fictitious channel\" Mega News\", Franco Andrada." 
], [ "El Santo de la Espada( in English,\" The Saint of the Sword\") is a 1970 Argentine historical epic film directed by Leopoldo Torre Nilsson and starring Alfredo Alcón.", "It narrates the life of José de San Martín.", "It was written by Beatriz Guido and Luis Pico Estrada, based on the eponymous novel by Ricardo Rojas.", "The script was supervised by the Sanmartinian National Institute.", "It had a great success, and it is the most successful movie about\" El Libertador\" San Martín, although it received some negative reviews due to historical inaccuracies." ], [ "Jesus Christ Vampire Hunter is a 2001 horror parody film from Odessa Filmworks which deals with Jesus' modern-day struggle to protect the lesbians of Ottawa, Ontario, Canada, from vampires with the help of Mexican wrestler El Santo (based on El Santo, Enmascarado de Plata, and played by actor Jeff Moffet, who starred as El Santo in two other Odessa Filmworks productions).", "This film earned an honorable mention in the Spirit of Slamdance category at the 2002 Slamdance Film Festival and is considered a cult film." ], [ "Vendredi soir( Friday Night) is a 2002 drama film directed by Claire Denis.", "The screenplay was written by Claire Denis and Emmanuèle Bernheim, based upon Bernheim's novel of the same name.", "The film premiered at the 2002 Venice Film Festival." ], [ "Claire Denis( born 21 April 1946) is a French film director and writer.", "Her feature film\" Beau Travail\"( 1999) has been called one of the greatest films of the 1990s.", "Other acclaimed works include\" Trouble Every Day\"( 2001),\" 35 Shots of Rum\"( 2008), and\" White Material\"( 2009).", "Her work has dealt with themes of colonial and post-colonial West Africa, as well as issues in modern France, and continues to influence European cinematic identity." ], [ "Marcos Eusebio Jiménez de la Espada Martín( born 3 November 1985), known simply as Marcos, is a Spanish footballer who plays for Indian club East Bengal as a forward." ], [ "La Bestia humana is a 1957 Argentine film whose story is based on the novel\" La Bête Humaine\" by the French writer Émile Zola." ], [ "In 1998 she wrote Vendredi soir( Friday night), a novel that was adapted into film by Claire Denis in 2002.", "She also worked with Michel Houellebecq on a film adaptation of his novel Platform." ] ], "title": [ "Miloš Zličić", "Leopoldo Torre Nilsson", "Hugo Arana", "El Santo de la Espada", "Jesus Christ Vampire Hunter", "Vendredi soir", "Claire Denis", "Marcos Jiménez de la Espada Martín", "La Bestia humana", "Emmanuèle Bernheim" ] }, "supporting_facts": { "sent_id": [ 0, 0, 0, 0 ], "title": [ "Vendredi soir", "El Santo de la Espada", "Claire Denis", "Leopoldo Torre Nilsson" ] }, "type": "bridge_comparison" }
2wikimultihopqa
[ { "content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Which film whose director is younger, Vendredi Soir or El Santo De La Espada?\n\n", "role": "user" } ]
fact-reasoning
{ "ground_truth": { "target": [ "Vendredi soir", "Vendredi Soir" ] }, "style": "rule" }
{ "index": 16, "split": "test" }
dev_12318
Who is the spouse of the director of film Marianne And Juliane?
[ "Volker Schlöndorff" ]
{ "context": { "content": [ [ "Gertrude of Saxony and Bavaria( 1152/55–1197) was Duchess of Swabia as the spouse of Duke Frederick IV, and Queen of Denmark as the spouse of King Canute VI." ], [ "Sophia Magdalena of Denmark (3 July 1746 – 21 August 1813) was Queen of Sweden as the spouse of King Gustav III." ], [ "Marianne and Juliane (lit. \"", "The Leaden Time\" or \"Leaden Times\"), also called The German Sisters in the United Kingdom, is a 1981 West German film directed by Margarethe von Trotta.", "The screenplay is a fictionalized account of the true lives of Christiane and Gudrun Ensslin.", "Gudrun, a member of The Red Army Faction, was found dead in her prison cell in Stammheim in 1977.", "In the film, Von Trotta depicts the two sisters Juliane (Christine) and Marianne (Gudrun) through their friendship and journey to understanding each other. \"", "Marianne and Juliane\" was von Trotta's third film and solidified her position as a director of the New German Cinema.", "\"Marianne and Juliane\" also marked the first time that von Trotta worked with Barbara Sukowa.", "They would go on to work on six more films together." ], [ "Heather Denise Gibson is a Scottish economist currently serving as Director- Advisor to the Bank of Greece( since 2011).", "She is the spouse of Euclid Tsakalotos, former Greek Minister of Finance." ], [ "Margarethe von Trotta (born 21 February 1942) is a German film director who has been referred to as a \"leading force\" of the New German Cinema movement.", "Von Trotta's extensive body of work has won awards internationally.", "She was married to and collaborated with director Volker Schlöndorff.", "Although they made a successful team, von Trotta felt she was seen as secondary to Schlöndorff.", "Subsequently, she established a solo career for herself and became \"Germany’s foremost female film director, who has offered the most sustained and successful female variant of \"Autorenkino\" in postwar German film history\".", "Certain aspects of von Trotta's work have been compared to Ingmar Bergman's features from the 1960s and 1970s.", "Von Trotta has been called \"the world’s leading feminist filmmaker.", "\" The predominant aim of her films is to create new representations of women.", "Her films are concerned with relationships between and among women (sisters, best friends, etc.), as well as with relationships between women and men, and involve political setting.", "Nevertheless, she rejects the suggestion that she makes \"women’s films\"." ], [ "Mehdi Abrishamchi is an Iranian People's Mujahedin of Iran( MEK) politician who has been described as\" the right hand man of Massoud Rajavi\".", "He is the former spouse of the group's current leader, Maryam Rajavi, whom he divorced so that she could become the spouse of Massoud Rajavi." ], [ "Princess Auguste of Bavaria( 28 April 1877 – 25 June 1964) was a member of the Bavarian Royal House of Wittelsbach and the spouse of Archduke Joseph August of Austria." ], [ "Maria Teresa, Grand Duchess of Luxembourg( born María Teresa Mestre y Batista; on 22 March 1956), is the spouse of Grand Duke Henri." ], [ "Marie Louise Coidavid( 1778 – March 11, 1851), was the Queen of the Kingdom of Haiti 1811 – 20 as the spouse of Henri I of Haiti." ], [ "Adib Kheir was a leading Syrian nationalist of the 1920s.", "He was the owner of the Librairie Universelle in Damascus.", "His granddaughter is the spouse of Manaf Tlass." ] ], "title": [ "Gertrude of Bavaria", "Sophia Magdalena of Denmark", "Marianne and Juliane", "Heather D. 
Gibson", "Margarethe von Trotta", "Mehdi Abrishamchi", "Princess Auguste of Bavaria (1875–1964)", "Maria Teresa, Grand Duchess of Luxembourg", "Marie-Louise Coidavid", "Adib Kheir" ] }, "supporting_facts": { "sent_id": [ 1, 2 ], "title": [ "Marianne and Juliane", "Margarethe von Trotta" ] }, "type": "compositional" }
2wikimultihopqa
[ { "content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Who is the spouse of the director of film Marianne And Juliane?\n\n", "role": "user" } ]
fact-reasoning
{ "ground_truth": { "target": [ "Volker Schlöndorff" ] }, "style": "rule" }
{ "index": 17, "split": "test" }
dev_1188
What is the place of birth of Gwladys's husband?
[ "Gwynllwg" ]
{ "context": { "content": [ [ "Devisingh Ransingh Shekhawat is the former and first First Gentleman of India as the husband of Pratibha Patil." ], [ "James Randall Marsh( 1896- 1965) was an American artist and the husband of Anne Steele Marsh." ], [ "Jonathan Leach( born April 18, 1973) is a former professional tennis player from the United States.", "He is the husband of Lindsay Davenport." ], [ "Saint Gwynllyw Milwr or Gwynllyw Farfog, known in English in a corrupted form as Woolos the Warrior or Woolos the Bearded (450 – 500 AD) was a Welsh king and religious figure.", "He was King of Gwynllwg in South Wales and is the legendary founder and patron saint of the City of Newport living around the 5th century.", "According to medieval tradition he was a feared warlord and raider who knew King Arthur, but later found religion and became a hermit founding St Woolos Cathedral in Newport.", "He was the father of one of the most revered Welsh saints, Saint Cadoc the Wise." ], [ "Andrew Upton( born 1 February 1966) is an Australian playwright, screenwriter, and director.", "He is the husband of actress Cate Blanchett." ], [ "Stan Rice( November 7, 1942 – December 9, 2002) was an American poet and artist.", "He was the husband of author Anne Rice." ], [ "Stan Marks is an Australian writer and journalist.", "He is the husband of Holocaust survivor Eva Marks." ], [ "Saint Gwladys ferch Brychan or St Gladys (Latin-Gladusa), daughter of King Brychan of Brycheiniog, was the queen of the saint-king Gwynllyw Milwr and the mother of Cadoc \"the Wise\", whose \"vita\" may be the earliest saint's life to mention Arthur.", "Gwladys's other children were Cynidr, Bugi, Cyfyw, Maches, Glywys II and Egwine.", "Today her main church and associated school is in Bargoed." ], [ "Stephen Alan Fletcher Pownall( born 30 December 1986) is an English singer- songwriter and the husband of actress Gabriella Wilde." ], [ "Lapidoth(\" Lappidothtorches\") was the husband of Deborah the fourth Judge of Israel.", "Lapidoth is also a Hebrew male given name." ] ], "title": [ "Devisingh Ransingh Shekhawat", "James Randall Marsh", "Jon Leach", "Gwynllyw", "Andrew Upton", "Stan Rice", "Stan Marks", "Gwladys", "Alan Pownall", "Lapidoth" ] }, "supporting_facts": { "sent_id": [ 0, 1 ], "title": [ "Gwladys", "Gwynllyw" ] }, "type": "compositional" }
2wikimultihopqa
[ { "content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: What is the place of birth of Gwladys's husband?\n\n", "role": "user" } ]
fact-reasoning
{ "ground_truth": { "target": [ "Gwynllwg" ] }, "style": "rule" }
{ "index": 18, "split": "test" }
dev_152
Where did the director of film My Heidelberg, I Can Not Forget You die?
[ "AR", "Argentine Republic", "🇦🇷", "Argentina", "ARG", "ar" ]
{ "context": { "content": [ [ "Dana Blankstein- Cohen( born March 3, 1981) is the director of the Israeli Academy of Film and Television.", "She is a film director, and an Israeli culture entrepreneur." ], [ "Jesse Edward Hobson( May 2, 1911 – November 5, 1970) was the director of SRI International from 1947 to 1955.", "Prior to SRI, he was the director of the Armour Research Foundation." ], [ "Olav Aaraas( born 10 July 1950) is a Norwegian historian and museum director.", "He was born in Fredrikstad.", "From 1982 to 1993 he was the director of Sogn Folk Museum, from 1993 to 2010 he was the director of Maihaugen and from 2001 he has been the director of the Norwegian Museum of Cultural History.", "In 2010 he was decorated with the Royal Norwegian Order of St. Olav." ], [ "Do n't Forget", "You're Going to Die is a 1995 French drama film directed, co-written by and starring Xavier Beauvois." ], [ "Brian Patrick Kennedy( born 5 November 1961) is an Irish- born art museum director who has worked in Ireland and Australia, and now lives and works in the United States.", "He is currently the director of the Peabody Essex Museum.", "He was the director of the Toledo Museum of Art in Ohio from 2010 to 2019.", "He was the director of the Hood Museum of Art from 2005 to 2010, and the National Gallery of Australia( Canberra) from 1997- 2004." ], [ "James Bauer (1884–1940) was a German film director.", "Following the Nazi takeover of power in 1933, he emigrated first to Spain and later to Argentina, where he directed several films until his death in 1940." ], [ "Peter Levin is an American director of film, television and theatre." ], [ "My Heidelberg, I Can Not Forget You", "(German:", "Mein Heidelberg, ich kann Dich nicht vergessen) is a 1927 German silent film directed by James Bauer and starring Dorothea Wieck, and Hans Adalbert Schlettow.", "It was shot at the Emelka Studios in Munich." ], [ "S.N. Mathur was the Director of the Indian Intelligence Bureau between September 1975 and February 1980.", "He was also the Director General of Police in Punjab." ], [ "Ian Barry is an Australian director of film and TV." ] ], "title": [ "Dana Blankstein", "Jesse E. Hobson", "Olav Aaraas", "Don't Forget You're Going to Die", "Brian Kennedy (gallery director)", "James Bauer", "Peter Levin", "My Heidelberg, I Can Not Forget You", "S. N. Mathur", "Ian Barry (director)" ] }, "supporting_facts": { "sent_id": [ 2, 1 ], "title": [ "My Heidelberg, I Can Not Forget You", "James Bauer" ] }, "type": "compositional" }
2wikimultihopqa
[ { "content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Where did the director of film My Heidelberg, I Can Not Forget You die?\n\n", "role": "user" } ]
fact-reasoning
{ "ground_truth": { "target": [ "AR", "Argentine Republic", "🇦🇷", "Argentina", "ARG", "ar" ] }, "style": "rule" }
{ "index": 19, "split": "test" }
dev_7508
Where did the composer of film Rajaratha graduate from?
[ "Vidya Vikas Institute of Engineering & Technology (VVIET)", "Vidya Vikas Institute of Engineering & Technology" ]
{ "context": { "content": [ [ "Thomas Morse( born June 30, 1968) is an American composer of film and concert music." ], [ "Abe Meyer( 1901 – 1969) was an American composer of film scores." ], [ "Alonso Mudarra( c. 1510 – April 1, 1580) was a Spanish composer of the Renaissance, and also played the vihuela, a guitar- shaped string instrument.", "He was an innovative composer of instrumental music as well as songs, and was the composer of the earliest surviving music for the guitar." ], [ "Anup Bhandari is an Indian software engineer, turned to film maker, film director, composer and lyricist who works primarily in Kannada cinema,", "now debut Kannada film director Anup Bhandari seems to have charted out his career graph well in advance.", "He was born in Mangaluru to director Sudhakar Bhandary and Rathna Bhandari, raised in Mysuru and has worked in the US, but hopes to settle down in Bengaluru.", "Anup holds an Engineering degree from Vidya Vikas Institute of Engineering & Technology (VVIET), Mysuru.", "His directorial début mystery thriller Kannada movie Rangitaranga has been critically acclaimed.", "Anup also wrote the lyrics and composed music for the movie.", "His other works include a short movie \"Words\" for which cinematography was done by Lance Kaplan.", "He is currently working on his next film with\nSudeep" ], [ "Bert Grund( 1920–1992) was a German composer of film scores." ], [ "Amedeo Escobar( 1888–1973) was an Italian composer of film scores." ], [ "Rajaratha is is a 2018 Indian Kannada romantic comedy film written and directed by Anup Bhandari who also composed the film's music.", "It stars Nirup Bhandari and Avantika Shetty in the lead roles in their second collaboration with the director after the success of \"RangiTaranga\".", "The film was dubbed in Telugu as \"Rajaratham\" (English: \"Royal Chariot\") with Rana Daggubati replacing Puneeth Rajkumar as the narrator.", "Arya, who made his debut in Kannada cinema for the first time with the film, features in a supporting role alongside P. Ravishankar and Sruthi Hariharan.", "While the film score is composed by B. Ajaneesh Loknath, the soundtrack is composed by Anup Bhandari.", "The cinematography is by William David.", "The first look of the film was released in the social media by the team on 2 October 2017.", "The film was released on March 23, 2018" ], [ "The Seventh Company Outdoors is a 1977 French comedy film directed by Robert Lamoureux.", "It is a sequel to Now Where Did the 7th Company Get to ?." ], [ "Walter Ulfig was a German composer of film scores." ], [ "Henri Verdun( 1895–1977) was a French composer of film scores." ] ], "title": [ "Thomas Morse", "Abe Meyer", "Alonso Mudarra", "Anup Bhandari", "Bert Grund", "Amedeo Escobar", "Rajaratha", "The Seventh Company Outdoors", "Walter Ulfig", "Henri Verdun" ] }, "supporting_facts": { "sent_id": [ 0, 3 ], "title": [ "Rajaratha", "Anup Bhandari" ] }, "type": "compositional" }
2wikimultihopqa
[ { "content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: Where did the composer of film Rajaratha graduate from?\n\n", "role": "user" } ]
fact-reasoning
{ "ground_truth": { "target": [ "Vidya Vikas Institute of Engineering & Technology (VVIET)", "Vidya Vikas Institute of Engineering & Technology" ] }, "style": "rule" }
{ "index": 20, "split": "test" }
dev_10177
What is the place of birth of the director of film Yoogan?
[ "Kovilpatti" ]
{ "context": { "content": [ [ "Yoogan is a Tamil horror, suspense thriller film directed by Kamal G. Starring Yashmith and Sakshi Agarwal in the lead roles along with Siddhu GRN, Pradeep Balaji, Shyam Kirthivasan, Tarun Chakarvarthy and Suresh Pillai.", "Rashaanth Arwin composed the music For the film.", "Alex Premnath composed the background score and cinematography was by Ravi Arumugam.", "It is a horror story which revolves around the problems that happen in an IT company.", "The movie got released on 24 April 2015." ], [ "Kamal G (born 22 July 1989) is an Indian film director, film editor and film producer from Kovilpatti in Thoothukudi District, Tamil Nadu, who works in the South Indian film industry." ], [ "Dana Blankstein- Cohen( born March 3, 1981) is the director of the Israeli Academy of Film and Television.", "She is a film director, and an Israeli culture entrepreneur." ], [ "Peter Levin is an American director of film, television and theatre." ], [ "Brian Patrick Kennedy( born 5 November 1961) is an Irish- born art museum director who has worked in Ireland and Australia, and now lives and works in the United States.", "He is currently the director of the Peabody Essex Museum.", "He was the director of the Toledo Museum of Art in Ohio from 2010 to 2019.", "He was the director of the Hood Museum of Art from 2005 to 2010, and the National Gallery of Australia( Canberra) from 1997- 2004." ], [ "Jesse Edward Hobson( May 2, 1911 – November 5, 1970) was the director of SRI International from 1947 to 1955.", "Prior to SRI, he was the director of the Armour Research Foundation." ], [ "Ian Barry is an Australian director of film and TV." ], [ "Olav Aaraas( born 10 July 1950) is a Norwegian historian and museum director.", "He was born in Fredrikstad.", "From 1982 to 1993 he was the director of Sogn Folk Museum, from 1993 to 2010 he was the director of Maihaugen and from 2001 he has been the director of the Norwegian Museum of Cultural History.", "In 2010 he was decorated with the Royal Norwegian Order of St. Olav." ], [ "Jason Moore( born October 22, 1970) is an American director of film, theatre and television." ], [ "S.N. Mathur was the Director of the Indian Intelligence Bureau between September 1975 and February 1980.", "He was also the Director General of Police in Punjab." ] ], "title": [ "Yoogan", "Kamal G", "Dana Blankstein", "Peter Levin", "Brian Kennedy (gallery director)", "Jesse E. Hobson", "Ian Barry (director)", "Olav Aaraas", "Jason Moore (director)", "S. N. Mathur" ] }, "supporting_facts": { "sent_id": [ 0, 0 ], "title": [ "Yoogan", "Kamal G" ] }, "type": "compositional" }
2wikimultihopqa
[ { "content": "\nAnswer the given question. Every time you receive new information, you must first conduct reasoning inside <think> ... </think>. After reasoning, if you find you lack some knowledge, you can call a specialized LLM by writing a query inside <search> LLM-Name:Your-Query </search>. \n!!! STRICT FORMAT RULES for <search>: !!!\n + You MUST replace LLM-Name with the EXACT name of a model selected from [Qwen2.5-7B-Instruct, LLaMA-3.1-8B-Instruct, LLaMA-3.1-70B-Instruct, Mistral-7B-Instruct, Mixtral-8x22B-Instruct, Gemma-2-27B-Instruct]. + You MUST replace Your-Query with a CONCRETE QUESTION that helps answer the original question below. + NEVER copy or paste model descriptions into <search>.\n + NEVER output the placeholder format <search> LLM-Name:Your-Query </search>. Always replace both parts correctly. \nBefore each LLM call, you MUST explicitly reason inside <think> ... </think> about: + Why external information is needed. + Which model is best suited for answering it, based on the LLMs' abilities (described below). \nWhen you call an LLM, the response will be returned between <information> and </information>. You must not limit yourself to repeatedly calling a single LLM (unless its provided information is consistently the most effective and informative). You are encouraged to explore and utilize different LLMs to better understand their respective strengths and weaknesses. It is also acceptable—and recommended—to call different LLMs multiple times for the same input question to gather more comprehensive information. \n\n#### The Descriptions of Each LLM \nQwen2.5-7B-Instruct:Qwen2.5-7B-Instruct is a powerful Chinese-English instruction-tuned large language model designed for tasks in language, coding, mathematics, and reasoning. As part of the Qwen2.5 series, it features enhanced knowledge, stronger coding and math abilities, improved instruction following, better handling of long and structured texts, and supports up to 128K context tokens. It also offers multilingual capabilities across over 29 languages.\n\nLLaMA-3.1-8B-Instruct:LLaMA-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model optimized for multilingual dialogue. It provides strong language understanding, reasoning, and text generation performance, outperforming many open-source and closed-source models on standard industry benchmarks.\n\nLLaMA-3.1-70B-Instruct:LLaMA-3.1-70B-Instruct is a 70-billion-parameter state-of-the-art language model designed for advanced multilingual dialogue tasks. It excels in language comprehension, complex reasoning, and high-quality text generation, setting a new standard against both open and closed models in benchmark evaluations.\n\nMistral-7B-Instruct:Mistral-7B-Instruct is a fine-tuned version of the Mistral-7B-v0.3 language model designed to follow instructions, complete user requests, and generate creative text. It was trained on diverse public conversation datasets to enhance its ability to handle interactive tasks effectively.\n\nMixtral-8x22B-Instruct:Mixtral-8x22B-Instruct is a cutting-edge sparse Mixture-of-Experts (SMoE) large language model from MistralAI. It efficiently uses 39B active parameters out of 141B total, delivering high performance at lower costs. The model excels at following instructions, completing tasks, and generating creative text, with strong skills in multiple languages (English, French, Italian, German, Spanish), mathematics, and coding. 
It also supports native function calling and handles long contexts up to 64K tokens for better information recall.\n\nGemma-2-27B-Instruct:Gemma-2-27B-Instruct is a cutting-edge, instruction-tuned text generation model developed by Google. Built using the same technology as Gemini, it excels at text understanding, transformation, and code generation. As a lightweight, decoder-only model with open weights, it is ideal for tasks like question answering, summarization, and reasoning. Its compact size enables deployment on laptops, desktops, or private cloud setups, making powerful AI more accessible.\n\nIf you find that no further external knowledge is needed, you can directly provide your final answer inside <answer> ... </answer>, without additional explanation or illustration. For example: <answer> Beijing </answer>. + Important: You must not output the placeholder text \"<answer> and </answer>\" alone. + You must insert your actual answer between <answer> and </answer>, following the correct format. Question: What is the place of birth of the director of film Yoogan?\n\n", "role": "user" } ]
fact-reasoning
{ "ground_truth": { "target": [ "Kovilpatti" ] }, "style": "rule" }
{ "index": 21, "split": "test" }