inferencerlabs committed
Commit 718faee · verified · 1 Parent(s): b6f5844

Update README.md

Files changed (1):
  1. README.md +1 -1
README.md CHANGED
@@ -22,7 +22,7 @@ pipeline_tag: text-generation
 #### M3 Ultra 512GB RAM using [Inferencer app v1.7.3](https://inferencer.com)
 * Expect ~16.5 tokens/s @ 1000 tokens
 * Memory usage: ~450 GB
-* For a larger context window (11k tokens) you can expand the RAM limit:
+* For a larger context window (>11k tokens) you can expand the RAM limit:
 ```bash
 sudo sysctl iogpu.wired_limit_mb=507000
 ```
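The `sysctl` call in the hunk above raises the cap on GPU-wired memory on Apple Silicon Macs. A minimal sketch of how it might be used on the 512 GB machine described in the README (the `507000` value is the one from the diff, in megabytes, i.e. ~507 GB, leaving roughly 5 GB for the rest of the system; the setting applied this way typically does not survive a reboot):

```shell
# Inspect the current GPU wired-memory limit in MB
# (0 means macOS is using its built-in default cap).
sysctl iogpu.wired_limit_mb

# Raise the cap to ~507 GB so the ~450 GB model plus a >11k-token
# context can stay in wired (GPU-accessible) unified memory.
sudo sysctl iogpu.wired_limit_mb=507000
```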