Instructions for using predibase/magicoder with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- PEFT
How to use predibase/magicoder with PEFT:
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

base_model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-v0.1")
model = PeftModel.from_pretrained(base_model, "predibase/magicoder")
```
- Notebooks
- Google Colab
- Kaggle
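The PEFT call above attaches a low-rank (LoRA-style) adapter: the frozen base weight matrix W is augmented with a small learned update B·A rather than being retrained. A toy NumPy sketch of that idea, with illustrative shapes only (not the actual predibase/magicoder weights):

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 8, 2  # toy hidden size and adapter rank

W = rng.normal(size=(d, d))        # frozen base weight
A = rng.normal(size=(r, d)) * 0.1  # trainable low-rank factor
B = np.zeros((d, r))               # B starts at zero, so the adapter begins as a no-op

def forward(x: np.ndarray) -> np.ndarray:
    # base path plus low-rank adapter path: (W + B @ A) @ x
    return W @ x + (B @ A) @ x

x = rng.normal(size=d)
# with B = 0 the adapted model matches the base model exactly
assert np.allclose(forward(x), W @ x)
```

Only B and A are trained, which is why an adapter like this one is a few megabytes instead of a full 7B-parameter checkpoint.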
---
Sample output:

```python
def strlen(string: str) -> int:
    return len(string)
```

---
Try using this adapter yourself!

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
```
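For reference, the sampled `strlen` completion shown above is valid Python on its own; a quick standalone check that needs no model:

```python
def strlen(string: str) -> int:
    # the completion sampled from the adapter: string length via len()
    return len(string)

print(strlen("magicoder"))  # → 9
```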