Clarification on X-Omni VQ Tokenizer Licensing in GLM-Image

#4
by JosephusCheung - opened

Hi,

I noticed that the VQ tokenizer weights in this repo appear to be identical to those in X-Omni/X-Omni-En, which is licensed under Apache 2.0.

Since your model is released under MIT, could you clarify whether the VQ weights were explicitly re-licensed? If not, should the project license or model card be updated to reflect the Apache 2.0 terms for these weights?

Thanks!

Sign up or log in to comment