anybody know the size of the context window for Nanbeige4.1-3B?

#20
by test333333 - opened

I heard somewhere that setting the context length in Ollama to a number larger than the model's context window can dampen the model's intelligence. Is this true?
In the quickstart section they recommend setting the max tokens to 131072. So maybe the context window is 128k?

Nanbeige LLM Lab org

The context window is 256k, and we recommend setting max output tokens to 128k.
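For anyone wiring this up in Ollama, those numbers would map to a Modelfile roughly like the sketch below. The model tag `nanbeige4.1-3b` is a placeholder (check the actual name in your local registry); `num_ctx` sets the context window and `num_predict` caps the output tokens:

```
# Hypothetical Modelfile -- the tag "nanbeige4.1-3b" is an assumption,
# substitute whatever name the model has in your Ollama registry.
FROM nanbeige4.1-3b

# Full 256k context window (256 * 1024 = 262144 tokens)
PARAMETER num_ctx 262144

# Recommended max output tokens: 128k (128 * 1024 = 131072)
PARAMETER num_predict 131072
```

Then build and run it with `ollama create nanbeige-256k -f Modelfile` followed by `ollama run nanbeige-256k`. Note that a 256k context at full length needs a lot of RAM/VRAM for the KV cache, so you may want a smaller `num_ctx` on constrained hardware.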
