Instructions to use google/flan-t5-base with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use google/flan-t5-base with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-base")
model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-base")
```

- Notebooks
- Google Colab
- Kaggle
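Once the tokenizer and model are loaded as above, the model can be run end to end. A minimal sketch, assuming the `transformers` library (with a PyTorch backend) is installed; the prompt is only an illustrative example:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-base")
model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-base")

# Flan-T5 is instruction-tuned, so the input is a plain natural-language instruction
inputs = tokenizer("Translate English to German: How old are you?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```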
Use from DJL (Deep Java Library)?
#34
by kwalcock - opened
Has anyone been able to use this model from Java, with DJL, for example? When I run

```shell
djl-import -m google/flan-t5-base
```

the response is the error message

```
Unsupported model architecture: T5ForConditionalGeneration for google/flan-t5-base.
model not found: Namespace(limit=1, output_dir='.', output_format='PyTorch', retry_failed=False, cpu_only=False, optimize=None, device=None, dtype=None, category=None, model_name='google/flan-t5-base', trust_remote_code=False, min_version=None)
finished.
```
even though I've been able to import other models.
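For reference, the architecture string the importer rejects does come from the model's own configuration, which can be confirmed from Python. A small sketch, assuming the `transformers` library is installed:

```python
from transformers import AutoConfig

# Inspect the architecture declared in the model's config.json
config = AutoConfig.from_pretrained("google/flan-t5-base")
print(config.architectures)  # ['T5ForConditionalGeneration']
```

So the failure is on the DJL import side (the converter not supporting that architecture), not a problem with the model repository itself.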