Pickle files under data/ and their sizes:

czech.pickle        1.12 MB
danish.pickle       1.19 MB
dutch.pickle        694 kB
english.pickle      407 kB
estonian.pickle     1.5 MB
finnish.pickle      1.85 MB
french.pickle       554 kB
german.pickle       1.46 MB
greek.pickle        876 kB
italian.pickle      615 kB
malayalam.pickle    221 kB
norwegian.pickle    1.18 MB
polish.pickle       1.74 MB
portuguese.pickle   612 kB
russian.pickle      33 kB
slovene.pickle      734 kB
spanish.pickle      562 kB
swedish.pickle      980 kB
turkish.pickle      1.02 MB

Each file reports the same seven detected Pickle imports:
- "nltk.tokenize.punkt.PunktSentenceTokenizer"
- "nltk.tokenize.punkt.PunktParameters"
- "nltk.tokenize.punkt.PunktLanguageVars"
- "nltk.tokenize.punkt.PunktToken"
- "collections.defaultdict"
- "builtins.int"
- "builtins.set"

The one exception is malayalam.pickle, which lists "__builtin__.int" and "__builtin__.set" in place of "builtins.int" and "builtins.set".
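The import lists above are what the NLTK Punkt sentence-tokenizer models contain: each pickle deserializes into an nltk.tokenize.punkt.PunktSentenceTokenizer (the "__builtin__" entries in malayalam.pickle suggest that file was pickled under Python 2). Below is a minimal usage sketch, assuming NLTK is installed (unpickling needs the nltk.tokenize.punkt module importable) and assuming the hypothetical local path data/english.pickle matches where this repository is checked out.

```python
import pickle

# Unpickling runs arbitrary code by design, so only load these files from a
# source you trust; the import list shown above is what a benign Punkt model
# is expected to reference.
with open("data/english.pickle", "rb") as f:  # hypothetical local path
    tokenizer = pickle.load(f)  # -> nltk.tokenize.punkt.PunktSentenceTokenizer

# Split raw text into sentences with the loaded model.
print(tokenizer.tokenize("This is one sentence. This is another one."))
# ['This is one sentence.', 'This is another one.']
```

When the same files are installed into an nltk_data directory, the usual route is nltk.data.load("tokenizers/punkt/english.pickle"), which resolves the path and unpickles the model in one step.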