Fill-Mask
Transformers
Safetensors
English
bert
protein
protbert
masked-language-modeling
bioinformatics
sequence-prediction
Instructions for using faceless-void/protbert-sequence-unmasking with libraries and notebooks. Follow the links below to get started.
- Libraries
- Transformers
How to use faceless-void/protbert-sequence-unmasking with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="faceless-void/protbert-sequence-unmasking")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("faceless-void/protbert-sequence-unmasking")
model = AutoModelForMaskedLM.from_pretrained("faceless-void/protbert-sequence-unmasking")
```
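The pipeline returns ranked candidates for each masked position. Below is a minimal usage sketch; the space-separated amino-acid input with a `[MASK]` placeholder is an assumption based on the model's protbert tag (ProtBert-family tokenizers expect one letter per residue, separated by spaces), so check the tokenizer config if results look off.

```python
# Minimal usage sketch. The input format (space-separated residues plus
# [MASK]) is an assumption based on the ProtBert convention, not something
# stated for this specific checkpoint.
sequence = "M K T A Y I A K Q R [MASK] I S F V K"

for prediction in pipe(sequence, top_k=5):
    print(prediction["token_str"], round(prediction["score"], 4))
```

The directly loaded model and tokenizer support the same prediction with more control over the forward pass:

```python
import torch

# Reuses `tokenizer` and `model` from the snippet above. Finds the [MASK]
# position and prints the five highest-probability residues for it.
inputs = tokenizer("M K T A Y I A K Q R [MASK] I S F V K", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top5 = logits[0, mask_pos].softmax(dim=-1).topk(5)
for score, token_id in zip(top5.values[0], top5.indices[0]):
    print(tokenizer.convert_ids_to_tokens(int(token_id)), float(score))
```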
- Notebooks
- Google Colab
- Kaggle