Instructions to use codesage/codesage-small with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use codesage/codesage-small with Transformers (a fuller embedding sketch follows the lists below):

```py
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("codesage/codesage-small", trust_remote_code=True, dtype="auto")
```

- Notebooks
- Google Colab
- Kaggle
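The README (shown in the diff below) describes this checkpoint as an encoder (130M model) that produces 1024-dimensional code embeddings and uses the Starcoder tokenizer. The snippet below is a minimal sketch of that workflow, not official usage: the example input string, the assumption that the first element of the model output holds the token-level hidden states, and the mean-pooling step are all illustrative choices.

```py
# A minimal sketch, assuming the encoder's first output element is the token-level
# hidden states; mean pooling into a single vector is an illustrative choice.
import torch
from transformers import AutoModel, AutoTokenizer

checkpoint = "codesage/codesage-small"
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(checkpoint, trust_remote_code=True)
model = AutoModel.from_pretrained(checkpoint, trust_remote_code=True).to(device)

source = "def print_hello_world():\n    print('Hello World!')"  # illustrative input
input_ids = tokenizer(source, return_tensors="pt").input_ids.to(device)

with torch.no_grad():
    token_embeddings = model(input_ids)[0]     # assumed shape: (1, seq_len, 1024)

code_embedding = token_embeddings.mean(dim=1)  # pool over tokens -> (1, 1024)
print(code_embedding.shape)
```

If the model's custom code returns a different output structure, the indexing step would need to be adjusted accordingly.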
Fix MD format for python code
#4
by mrm8488 - opened
README.md
CHANGED
@@ -24,7 +24,7 @@ This checkpoint is first trained on code data via masked language modeling (MLM)
 ### How to use
 This checkpoint consists of an encoder (130M model), which can be used to extract code embeddings of 1024 dimension. It can be easily loaded using the AutoModel functionality and employs the Starcoder tokenizer (https://arxiv.org/pdf/2305.06161.pdf).
 
-```
+```py
 from transformers import AutoModel, AutoTokenizer
 
 checkpoint = "codesage/codesage-small"