Tags: Text Classification · Transformers · PyTorch · bert · feature-extraction · custom_code · text-embeddings-inference
How to use Wellcome/WellcomeBertMesh with the Transformers library:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="Wellcome/WellcomeBertMesh", trust_remote_code=True)

# Or load the model directly
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("Wellcome/WellcomeBertMesh", trust_remote_code=True)
model = AutoModel.from_pretrained("Wellcome/WellcomeBertMesh", trust_remote_code=True)
```
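MeSH tagging is typically a multi-label task, so predictions are usually obtained by applying a sigmoid to each output logit and keeping the labels whose probability clears a threshold. As a standalone sketch of that post-processing step (the logits, label names, and 0.5 threshold below are made-up illustrations, not values from the actual WellcomeBertMesh custom code):

```python
import math

def predict_labels(logits, labels, threshold=0.5):
    """Multi-label decoding: sigmoid each logit, keep labels above threshold."""
    probs = [1.0 / (1.0 + math.exp(-x)) for x in logits]
    return [lab for lab, p in zip(labels, probs) if p >= threshold]

# Hypothetical logits for three example MeSH terms
example_logits = [2.1, -1.3, 0.4]
example_labels = ["Humans", "Mice", "Malaria"]
print(predict_labels(example_logits, example_labels))  # → ['Humans', 'Malaria']
```

The real model's head and label mapping live in its `custom_code`; this only illustrates the standard sigmoid-threshold decoding pattern.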
Error with sample Code
#2
by martinWatzinger - opened
Hi,
thank you for providing this model. When I try to run the sample code, I get the error
"AttributeError: 'BertConfig' object has no attribute 'pretrained_model'"
Unfortunately I am a novice, so this might be an easy fix, but I do not know how to resolve it.
Can you help?
Thanks
Hi Martin,
Sorry for the late reply. We also noticed this issue recently; we just have not had time to fix it for the latest transformers version. A version we know works is 4.30, so installing that version should resolve it. Let me know if you have further issues. I would also be keen to hear what application you are using the model for 🙂
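Pinning the library to the known-good release can be done with pip. The reply above only specifies 4.30; the exact patch release shown here (4.30.2, the last 4.30.x release) is an assumption, and any 4.30.x should behave the same:

```shell
# Pin transformers to the 4.30 series known to work with this model's custom code
pip install "transformers==4.30.2"
```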