Tags: Summarization, Transformers, PyTorch, TensorFlow, JAX, Rust, Safetensors, English, bart, text2text-generation, Eval Results (legacy)
Instructions for using facebook/bart-large-cnn with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use facebook/bart-large-cnn with Transformers (a runnable end-to-end sketch follows the list below):
```python
# Use a pipeline as a high-level helper
# Warning: Pipeline type "summarization" is no longer supported in transformers v5.
# You must load the model directly (see below) or downgrade to v4.x with:
#   pip install "transformers<5.0.0"
from transformers import pipeline

pipe = pipeline("summarization", model="facebook/bart-large-cnn")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("facebook/bart-large-cnn")
model = AutoModelForSeq2SeqLM.from_pretrained("facebook/bart-large-cnn")
```
- Inference
- Notebooks
- Google Colab
- Kaggle
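For context, here is a minimal end-to-end sketch of summarizing a passage with the directly loaded model. The sample text and the generation settings (`num_beams`, `max_length`, `min_length`) are illustrative assumptions, not values prescribed by the model card:

```python
# Minimal usage sketch: summarize a passage with the directly loaded model.
# The input text and generation settings below are illustrative assumptions.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("facebook/bart-large-cnn")
model = AutoModelForSeq2SeqLM.from_pretrained("facebook/bart-large-cnn")

text = (
    "The tower is 324 metres tall, about the same height as an 81-storey "
    "building, and was the tallest man-made structure in the world for 41 years."
)

# Tokenize with truncation; BART's encoder accepts up to 1024 tokens.
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=1024)

# Beam search is a common decoding choice for summarization; these values
# override the model's built-in generation defaults for a shorter summary.
summary_ids = model.generate(
    **inputs,
    num_beams=4,
    max_length=60,
    min_length=10,
    early_stopping=True,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```

Without the explicit arguments, `model.generate` falls back to the generation defaults shipped with the checkpoint; passing them here just makes the decoding choices visible.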
Getting "Could not load model" errors when using the Inference API
#73
by breisa - opened
Why am I getting this error when using the Inference API? My code used to work; I didn't change anything.
```
500 Internal Server Error: {"error":"Could not load model facebook/bart-large-cnn with any of the following classes: (<class 'transformers.models.bart.modeling_bart.BartForConditionalGeneration'>, <class 'transformers.models.bart.modeling_tf_bart.TFBartForConditionalGeneration'>)."}
```
This is a "500 Internal Server Error". I just tried loading the model and it worked, so retry after some time.
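If the failure really is transient, a retry loop with backoff is a reasonable client-side workaround. Below is a minimal sketch against the hosted Inference API, assuming an API token in the `HF_TOKEN` environment variable; the retry count and delays are arbitrary illustrative choices, not official guidance:

```python
# Retry-with-backoff sketch for transient Inference API failures.
# Assumes a token in the HF_TOKEN environment variable; the retry count,
# delays, and error handling are illustrative choices.
import os
import time

import requests

API_URL = "https://api-inference.huggingface.co/models/facebook/bart-large-cnn"
HEADERS = {"Authorization": f"Bearer {os.environ['HF_TOKEN']}"}


def summarize(text: str, retries: int = 5, delay: float = 2.0) -> str:
    for attempt in range(retries):
        response = requests.post(API_URL, headers=HEADERS, json={"inputs": text})
        if response.status_code == 200:
            # Summarization responses have the shape [{"summary_text": "..."}].
            return response.json()[0]["summary_text"]
        # 500/503 often mean the model is loading or temporarily unavailable;
        # wait and retry with exponential backoff.
        time.sleep(delay * 2 ** attempt)
    raise RuntimeError(f"Inference API failed after {retries} retries: {response.text}")
```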
Yeah, the issue still persists.
Confirmed, the issue still persists.
```
Could not load model facebook/bart-large-cnn with any of the following classes: (<class 'transformers.models.bart.modeling_bart.BartForConditionalGeneration'>, <class 'transformers.models.bart.modeling_tf_bart.TFBartForConditionalGeneration'>).
```
The above issue still persists.
Hi, does anyone know how to fix the "Could not load model" error?
+1
+1
+1
+1
