---
language:
- multilingual
license: apache-2.0
tags:
- trankit
---

trankit pretrained weights

This repository hosts the decompressed model weights from the open-source nlp-uoregon/trankit project. All archives in the original Hugging Face model uonlp/trankit have been extracted into their respective directories, so the parameters can be consumed without an additional unzip step.

The upstream project distributes the assets under the Apache License 2.0. The same license therefore applies to this copy of the weights—make sure every downstream consumer is aware of the original terms.

The directory layout mirrors the upstream release:

models/
  v1.0.0/
    xlm-roberta-base/
      <language-or-corpus>/
        *.mdl, *.pt, *.json, …
    xlm-roberta-large/
      <language-or-corpus>/
        …
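
As a quick sanity check, the sketch below walks that layout from a local checkout and counts the extracted parameter files per language/corpus directory. The checkout path `./trankit-weights` is an assumption, not part of the upstream release; the script relies only on the structure shown above.

```python
from pathlib import Path

# Assumed path to a local checkout of this repository (adjust as needed).
repo_root = Path("./trankit-weights")

# Walk every <language-or-corpus> directory under the base encoder and
# count the extracted parameter files (*.mdl, *.pt, *.json, ...).
base = repo_root / "models" / "v1.0.0" / "xlm-roberta-base"
for lang_dir in sorted(p for p in base.iterdir() if p.is_dir()):
    n_files = sum(1 for f in lang_dir.iterdir() if f.is_file())
    print(f"{lang_dir.name}: {n_files} files")
```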

No additional code changes were made; if you need the training or inference library, please refer to the upstream GitHub repository.
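
If you do want to consume the weights through the upstream library, here is a minimal sketch. The `lang`, `gpu`, `embedding`, and `cache_dir` arguments are documented in the trankit README; whether pointing `cache_dir` at a checkout of this repository lets trankit reuse the pre-extracted files (instead of downloading and unzipping its own copy) depends on its cache layout and is an assumption here.

```python
# pip install trankit
from trankit import Pipeline

# lang, gpu, embedding, and cache_dir are Pipeline arguments documented in
# the upstream README. Pointing cache_dir at a checkout of this repository
# assumes trankit's cache layout matches the extracted directories here;
# if it does not, trankit will simply download its own copy into cache_dir.
p = Pipeline(lang="english",
             embedding="xlm-roberta-base",
             gpu=False,
             cache_dir="./trankit-weights")

# The pipeline is callable on raw text and returns a nested Python dict.
doc = p("Trankit is a light-weight transformer-based toolkit.")
print(doc["sentences"][0]["tokens"][:3])
```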
