#### M3 Ultra 512GB RAM using [Inferencer app v1.7.3](https://inferencer.com)

* Expect ~16.5 tokens/s @ 1000 tokens
* Memory usage: ~450 GB
* For a larger context window (>11k tokens) you can raise the GPU wired-memory limit:

```bash
sudo sysctl iogpu.wired_limit_mb=507000
```
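As a sanity check before raising the limit, it can help to see how much memory the new value leaves for macOS itself. A minimal sketch, assuming the 512 GB M3 Ultra configuration above (the variable names are illustrative, not part of Inferencer):

```shell
# The M3 Ultra above has 512 GB of unified memory; wiring 507000 MB for
# the GPU leaves the remainder for macOS and other processes.
TOTAL_MB=$(( 512 * 1024 ))   # 524288 MB total RAM
LIMIT_MB=507000              # proposed iogpu.wired_limit_mb value
echo "Left for the system: $(( TOTAL_MB - LIMIT_MB )) MB"
# prints "Left for the system: 17288 MB" (~17 GB headroom)
```

Note that `sysctl` changes like this do not persist across reboots, so the command needs to be rerun after a restart.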