Hi,
Is there any vLLM support planned in the near future? I'd like to try this model locally, but it doesn't seem possible at the moment.
Thanks