update to prompt and length
utils/haystack.py  +9 -10
@@ -14,25 +14,23 @@ def start_haystack():
 You may go into some detail about what topics they tend to like tweeting about. Please also mention their overall tone, for example: positive,
 negative, political, sarcastic or something else.

-
+Use the following format:

 Twitter stream: Many people in our community asked how to utilize LLMs in their NLP pipelines and how to modify prompts for their tasks.…
 RT @deepset_ai: We use parts of news articles from The Guardian as documents and create custom prompt templates to categorize these article

-
-
-Example:
+Summary: This person has lately been tweeting about NLP and LLMs. Their tweets have been in English

 Twitter stream: I've directed my team to set sharper rules on how we deal with unidentified objects.\n\nWe will inventory, improve ca…
 the incursion by China’s high-altitude balloon, we enhanced radar to pick up slower objects.\n \nBy doing so, w…
 I gave an update on the United States’ response to recent aerial objects.

-
+Summary: This person has lately been tweeting about an unidentified object and an incursion by China with a high-altitude balloon.
 They have been tweeting about the USA. They have had a political tone. They mostly post in English.

-Twitter stream: $tweets
+Twitter stream: $tweets

-
+Summary:
 """)
 return prompt_node, twitter_template
@@ -50,8 +48,9 @@ def query(username):
         response = requests.request("GET", url, headers = headers)
         twitter_stream = ""
         for tweet in response.json():
-            twitter_stream +=
-        result = prompter.prompt(prompt_template=template, tweets=twitter_stream[0:
-    except:
+            twitter_stream += tweet["text"]
+        result = prompter.prompt(prompt_template=template, tweets=twitter_stream[0:10000])
+    except Exception as e:
+        print(e)
         result = ["Please make sure you are providing a correct, public twitter account"]
     return result
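The first hunk rewrites the few-shot prompt built in start_haystack(): it adds an explicit "Use the following format:" instruction, labels each example answer with "Summary:", and ends the template with a bare "Summary:" cue for the model to complete, replacing the old "Example:" wording. For context, a minimal sketch of how such a template and prompt node are typically wired up with Haystack 1.x (the PromptTemplate and PromptNode classes and the $tweets placeholder match what the diff shows; the template name, model choice, and API key below are assumptions, not taken from this file):

from haystack.nodes import PromptNode, PromptTemplate

# Sketch only: the real prompt_text is the full few-shot template shown in the diff above.
# The template name and model are illustrative assumptions.
twitter_template = PromptTemplate(
    name="twitter-stream-summary",
    prompt_text="Describe the person's tweets.\n\nTwitter stream: $tweets\n\nSummary:",
)

prompt_node = PromptNode(
    model_name_or_path="text-davinci-003",  # assumed; the Space may use a different model
    api_key="YOUR_OPENAI_API_KEY",          # placeholder
    max_length=256,
)

# $tweets is filled in at prompt time, the same way query() fills it below:
summary = prompt_node.prompt(prompt_template=twitter_template, tweets="example tweet text")

PromptNode.prompt() returns a list of generated strings, which is presumably why the error fallback in query() is also a one-element list.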
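The second hunk tightens query(): each tweet's text is appended to twitter_stream, the concatenated text is capped at 10,000 characters before prompting (the "length" part of the commit message), and the bare except becomes except Exception as e with a print(e), so failures are logged while the caller still receives the fallback message. A minimal sketch of how the whole function appears to fit together; the url, headers, and the prompter/template objects live elsewhere in utils/haystack.py, so they appear here only as placeholders and parameters to keep the sketch self-contained:

import requests
from haystack.nodes import PromptNode, PromptTemplate

def query(username: str, prompter: PromptNode, template: PromptTemplate):
    # Placeholders: the real endpoint and auth are defined outside the lines shown in the diff,
    # and the real function takes only username, using module-level prompter and template.
    url = f"https://example.invalid/user-tweets/{username}"
    headers = {"Authorization": "Bearer <TWITTER_BEARER_TOKEN>"}
    try:
        response = requests.request("GET", url, headers=headers)
        twitter_stream = ""
        for tweet in response.json():
            twitter_stream += tweet["text"]
        # Cap the concatenated tweets so the filled prompt stays within the model's context window.
        result = prompter.prompt(prompt_template=template, tweets=twitter_stream[0:10000])
    except Exception as e:
        print(e)
        result = ["Please make sure you are providing a correct, public twitter account"]
    return result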