Instructions to use ShacklesLay/Deberta4task2 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use ShacklesLay/Deberta4task2 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("token-classification", model="ShacklesLay/Deberta4task2")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("ShacklesLay/Deberta4task2")
model = AutoModelForTokenClassification.from_pretrained("ShacklesLay/Deberta4task2")
```
- Notebooks
- Google Colab
- Kaggle
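A token-classification pipeline returns one dict per tagged token, which usually needs to be merged into whole entity spans. As a sketch of that post-processing step, assuming a hypothetical BIO-tagged output (the model card does not list this model's actual labels):

```python
# Hypothetical pipeline output for illustration only; the real tags and
# scores depend on this model's label config, which the card does not list.
sample_output = [
    {"word": "New", "entity": "B-LOC", "score": 0.99},
    {"word": "York", "entity": "I-LOC", "score": 0.98},
    {"word": "is", "entity": "O", "score": 0.99},
]

def group_entities(tokens):
    """Merge B-/I- tagged tokens into whole entity spans."""
    spans, current = [], None
    for tok in tokens:
        tag = tok["entity"]
        if tag.startswith("B-"):
            # A B- tag starts a new span, closing any open one.
            if current:
                spans.append(current)
            current = {"label": tag[2:], "text": tok["word"]}
        elif tag.startswith("I-") and current and current["label"] == tag[2:]:
            # An I- tag with a matching label continues the open span.
            current["text"] += " " + tok["word"]
        else:
            # O tags (or mismatched I- tags) close the open span.
            if current:
                spans.append(current)
            current = None
    if current:
        spans.append(current)
    return spans

print(group_entities(sample_output))  # → [{'label': 'LOC', 'text': 'New York'}]
```

In practice, `pipeline("token-classification", ..., aggregation_strategy="simple")` performs this grouping for you; the manual version above only shows what that aggregation does.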
- Xet hash: 9adfd1e0332846352bad50bbc8472c0a9ffddccfd6ddd4e5e9c5d8b6dbb5ada2
- Size of remote file: 1.52 GB
- SHA256: 8022203b16eff3b87da758170f0490895e9155b48a3cada87547e28ea7ae11c5
Xet efficiently stores large files inside Git by splitting them into unique chunks, accelerating uploads and downloads.
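The SHA256 listed above can be used to check the integrity of a downloaded copy of the weights. A minimal sketch using Python's standard `hashlib` (the local filename is an assumption, not something the card specifies):

```python
import hashlib

# SHA256 from the model card's file listing.
EXPECTED_SHA256 = "8022203b16eff3b87da758170f0490895e9155b48a3cada87547e28ea7ae11c5"

def sha256_of(path, chunk_size=1 << 20):
    """Hash a file in 1 MiB chunks so a 1.52 GB file is never fully in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical local path for the downloaded file:
# assert sha256_of("model.safetensors") == EXPECTED_SHA256
```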