---
title: README
emoji: ✨
colorFrom: gray
colorTo: red
sdk: static
pinned: false
---
<p>
<img src="https://huggingface.co/datasets/loubnabnl/repo-images/resolve/main/bigcode_light.png" alt="drawing" width="440"/>
</p>
<p>
BigCode is an open scientific collaboration working on the responsible training of large language models for coding applications.
You can find more information on the main website at <a href="https://www.bigcode-project.org/" class="underline">https://www.bigcode-project.org</a>. You can also follow BigCode on Twitter at <a href="https://twitter.com/BigCodeProject" class="underline">https://twitter.com/BigCodeProject</a>.
In this organization, you can find <a href="https://huggingface.co/datasets/bigcode/the-stack" class="underline">The Stack</a>, a 6.4 TB dataset of permissively licensed source code in 358 programming languages.
You can also find <a href="https://huggingface.co/bigcode/santacoder" class="underline">SantaCoder</a>, a strong 1.1B-parameter code generation model trained on Java, JavaScript, and Python, along with several data governance tools.
</p>