Update README.md
README.md CHANGED
@@ -71,7 +71,25 @@ generation you should look at model like GPT2.
### How to use
If you haven't already, you can install the [Transformers.js](https://huggingface.co/docs/transformers.js) JavaScript library from [NPM](https://www.npmjs.com/package/@huggingface/transformers) using:

```bash
npm i @huggingface/transformers
```
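If you would rather run the library directly in the browser without a bundler, Transformers.js can also be imported as an ES module from a CDN. The jsDelivr URL below is one common option and is an assumption here rather than part of the original instructions; pin an exact version in real use:

```js
// Inside a <script type="module"> block in the browser.
// Assumes jsDelivr's NPM mirror; pin a specific version (e.g. @3.x) in practice.
import { pipeline } from "https://cdn.jsdelivr.net/npm/@huggingface/transformers";
```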
You can then use this model directly with a pipeline for masked language modeling:

```js
import { pipeline } from "@huggingface/transformers";

const unmasker = await pipeline("fill-mask", "onnx-community/bert-base-uncased-ONNX");
const result = await unmasker("The capital of France is [MASK].", { top_k: 3 });
console.log(result);
// [
//   { score: 0.41678285598754883, token: 3000, token_str: 'paris', sequence: 'the capital of france is paris.' },
//   { score: 0.07141835987567902, token: 22479, token_str: 'lille', sequence: 'the capital of france is lille.' },
//   { score: 0.06339213997125626, token: 10241, token_str: 'lyon', sequence: 'the capital of france is lyon.' }
// ]
```
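The call above uses the library's default loading settings. Transformers.js also accepts loading options such as `dtype` and `device` when the pipeline is created; the sketch below is a minimal example, and the specific values ("q8", "webgpu") are illustrative assumptions rather than part of this model card:

```js
import { pipeline } from "@huggingface/transformers";

// Load with an explicit quantization level and execution device.
const unmasker = await pipeline("fill-mask", "onnx-community/bert-base-uncased-ONNX", {
  dtype: "q8",       // quantization level, e.g. "fp32", "fp16", "q8", "q4"
  device: "webgpu",  // or e.g. "wasm" where WebGPU is not available
});

const result = await unmasker("Paris is the [MASK] of France.", { top_k: 3 });
console.log(result);
```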
### Limitations and bias