Instructions for using google/flan-t5-base with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use google/flan-t5-base with Transformers:

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-base")
model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-base")
```

- Notebooks
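Once loaded, the model can be run end to end; the sketch below is a minimal example (the translation prompt and generation settings are illustrative choices, not taken from the model card):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-base")
model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-base")

# Illustrative instruction-style prompt; FLAN-T5 is instruction-tuned,
# so tasks are phrased directly in natural language.
inputs = tokenizer("translate English to German: How old are you?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same pattern works for other FLAN-style prompts (summarization, question answering), changing only the input text.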
- Google Colab
- Kaggle
Maximum sequence length of flan-t5 base, large, xl, xxl?
#20
by ali-issa - opened
Hello, could someone provide more information regarding the maximum input and output size of the Flan-T5 models? While reading the paper, I noticed it was trained on 1024 input length and 256 output length, but I also saw conflicting information. Can someone please clarify? Thank you.
Hi @ali-issa
I think there should be no hard limit on the maximum length; please have a look at my comment here: https://huggingface.co/google/flan-t5-xxl/discussions/41#65117d0d33ddefa58fee136f and let me know if this makes sense.
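One way to check this in practice: T5 uses relative position embeddings rather than fixed positional embeddings, so there is no hard architectural cutoff, although quality may degrade well beyond the lengths seen in training. The sketch below assumes the `transformers` tokenizer; the 512 value it reports is only the tokenizer's default `model_max_length`, not an encoder limit:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-base")

# The tokenizer reports a default maximum (typically 512 for T5 checkpoints)
print(tokenizer.model_max_length)

# Encoding a much longer input still works when truncation is disabled,
# because T5's relative position embeddings impose no fixed size limit.
long_text = "Summarize: " + "the quick brown fox jumps over the lazy dog. " * 200
ids = tokenizer(long_text, truncation=False).input_ids
print(len(ids))
```

The tokenizer may emit a warning about exceeding `model_max_length`, but the token IDs are produced in full and can be passed to the model.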