Instructions to use vespa-engine/colbert-medium with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use vespa-engine/colbert-medium with Transformers:
```python
# Load the model directly. Transformers has no dedicated ColBERT class,
# so the BERT-based checkpoint is loaded with AutoModel.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("vespa-engine/colbert-medium")
model = AutoModel.from_pretrained("vespa-engine/colbert-medium")
```
- Notebooks
- Google Colab
- Kaggle
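Once the model has produced per-token embeddings for a query and a document, ColBERT ranks documents with late interaction (MaxSim): each query token is matched against its most similar document token, and the maxima are summed. A minimal sketch, using small hand-made vectors in place of real model output (the function name and inputs here are illustrative, not part of the model's API):

```python
# Sketch of ColBERT-style late-interaction (MaxSim) scoring.
# query_emb and doc_emb stand in for per-token embedding vectors
# that the model would produce; the values below are toy examples.

def dot(a, b):
    # Plain dot product between two equal-length vectors.
    return sum(x * y for x, y in zip(a, b))

def maxsim_score(query_emb, doc_emb):
    # For each query token, keep only its best-matching document
    # token, then sum those maxima over all query tokens.
    return sum(max(dot(q, d) for d in doc_emb) for q in query_emb)

query = [[1.0, 0.0], [0.0, 1.0]]
doc_a = [[0.9, 0.1], [0.2, 0.8]]   # matches both query tokens well
doc_b = [[0.1, 0.1], [0.1, 0.2]]   # matches neither well

print(maxsim_score(query, doc_a))  # higher score, ranked first
print(maxsim_score(query, doc_b))
```

In practice the embeddings are normalized and the document list is large, but the ranking logic is exactly this per-token max followed by a sum.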
Commit History
- allow flax (20b55ea)
- Update with dynamic axes (63ecbef), Jo Kristian Bergum
- more details on colbert model (44cb111), Jo Kristian Bergum
- Import model weights and tokenizer configuration (df12733), Jo Kristian Bergum
- Clean readme (e7e3092), Jo Kristian Bergum
- Add initial Readme (ec9f05f), Jo Kristian Bergum