How to use AntoineBlanot/roberta-large-seq-classif with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="AntoineBlanot/roberta-large-seq-classif")
```

```python
# Load the model and tokenizer directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("AntoineBlanot/roberta-large-seq-classif")
model = AutoModelForSequenceClassification.from_pretrained("AntoineBlanot/roberta-large-seq-classif")
```
roberta-large-3way
This is a checkpoint of roberta-large trained on a variety of tasks and datasets. The datasets used were transformed into a binary setting with two labels: non-entailment and entailment.
It can be used directly as an NLI inference model or as a zero-shot classifier.
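The zero-shot recipe implied here can be sketched as follows: score each candidate label with the model's entailment head, then normalize the per-label entailment scores to pick the most likely label. The label names, hypothesis template, and example logits below are illustrative assumptions, not part of the model card; the actual scores would come from running the model on each premise/hypothesis pair.

```python
import math

def zero_shot_probs(entailment_scores):
    """Softmax over per-label entailment scores (one score per candidate label)."""
    m = max(entailment_scores.values())  # subtract max for numerical stability
    exps = {label: math.exp(s - m) for label, s in entailment_scores.items()}
    total = sum(exps.values())
    return {label: e / total for label, e in exps.items()}

# Hypothetical entailment logits, one per candidate label, e.g. obtained by
# scoring hypotheses like "This example is about {label}." against the premise.
scores = {"sports": 2.1, "politics": -0.3, "cooking": -1.0}
probs = zero_shot_probs(scores)
best = max(probs, key=probs.get)
```

This mirrors how NLI-based zero-shot pipelines typically rank candidate labels; with a binary non-entailment/entailment model, only the entailment score per label is needed.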