Instructions to use nlpie/bio-mobilebert with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use nlpie/bio-mobilebert with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="nlpie/bio-mobilebert")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("nlpie/bio-mobilebert")
model = AutoModelForMaskedLM.from_pretrained("nlpie/bio-mobilebert")
```

- Notebooks
- Google Colab
- Kaggle
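As a minimal sketch of what the fill-mask pipeline returns, the snippet below masks one token in an example clinical-style sentence (the sentence itself is illustrative, not from the model card) and prints the top predictions. It assumes the model uses the standard BERT-style `[MASK]` token:

```python
from transformers import pipeline

# Load the fill-mask pipeline with the bio-mobilebert checkpoint
pipe = pipeline("fill-mask", model="nlpie/bio-mobilebert")

# Hypothetical example sentence; [MASK] marks the token to predict
results = pipe("The patient was diagnosed with [MASK] disease.")

# Each result is a dict with the predicted token and its score
for r in results:
    print(r["token_str"], r["score"])
```

Each entry in `results` also carries the full completed `sequence`, which is convenient when ranking candidate completions rather than single tokens.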
Commit History
- Update README.md · 65d9f2f (verified)
- Update README.md · e6de740
- Update README.md · d02dc01
- Update README.md · 2898f9f
- Update README.md · ace5e33
- Update README.md · edbb1ae
- Create README.md · 32366ad
- Third version of the MobileBioBERT model. · 447bc44 (Mojtaba aka Omid Rohanian committed)
- Second version of the BioMobileBERT model. · fa386ab (Mojtaba aka Omid Rohanian committed)
- First version of the Bio-MobileBERT model and tokenizer. · b420287 (Mojtaba aka Omid Rohanian committed)