How to use "Xenova/grok-1-tokenizer" with "AutoTokenizer"
#1
by jeril - opened
The following code works fine:

from transformers import LlamaTokenizerFast
tokenizer = LlamaTokenizerFast.from_pretrained('Xenova/grok-1-tokenizer')
How can I make it work with AutoTokenizer, like the following?

from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("Xenova/grok-1-tokenizer")
Currently, when I use AutoTokenizer, I get the following error:
ValueError: Tokenizer class Grok1Tokenizer does not exist or is not currently imported.
Please advise.
If you'd like to use it with AutoTokenizer, you can clone the repo and simply change the tokenizer_class value in tokenizer_config.json from Grok1Tokenizer to LlamaTokenizer.
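The edit above can be scripted. This is a minimal sketch, assuming the repo has been cloned into a local directory named grok-1-tokenizer (the path is an assumption; adjust it to wherever you cloned https://huggingface.co/Xenova/grok-1-tokenizer). For illustration, the sketch writes a minimal stand-in config if no clone is present:

```python
import json
from pathlib import Path

# Assumed local clone path of Xenova/grok-1-tokenizer -- adjust as needed.
config_path = Path("grok-1-tokenizer/tokenizer_config.json")

# For illustration only: create a minimal stand-in config if no clone exists.
if not config_path.exists():
    config_path.parent.mkdir(parents=True, exist_ok=True)
    config_path.write_text(json.dumps({"tokenizer_class": "Grok1Tokenizer"}))

# Point tokenizer_class at a class that transformers actually ships.
config = json.loads(config_path.read_text())
config["tokenizer_class"] = "LlamaTokenizer"
config_path.write_text(json.dumps(config, indent=2))

print(json.loads(config_path.read_text())["tokenizer_class"])  # LlamaTokenizer
```

After the edit, loading from the local directory with AutoTokenizer.from_pretrained("grok-1-tokenizer") should resolve the class without the ValueError.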