Dataset: FiscalNote/billsum
How to use nvbAI/my_awesome_billsum_model with Transformers:

# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("nvbAI/my_awesome_billsum_model")
model = AutoModelForSeq2SeqLM.from_pretrained("nvbAI/my_awesome_billsum_model")

This model is a fine-tuned version of t5-small on the billsum dataset. It achieves the following results on the evaluation set:
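Once loaded, the model can be used for summarization. The sketch below is a minimal usage example, not part of the original card: the input text is invented for illustration, and the `"summarize: "` task prefix follows the standard T5 convention. The short generation length is consistent with the Gen Len of 19.0 reported in the results table.

```python
# Minimal summarization sketch (illustrative input text, assumed checkpoint).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

checkpoint = "nvbAI/my_awesome_billsum_model"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# T5 checkpoints expect a task prefix on the input text.
text = (
    "summarize: The bill amends existing law to require annual reporting "
    "on infrastructure spending and establishes an oversight committee."
)
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)

# Generation length is kept short, matching the ~19-token Gen Len above.
output_ids = model.generate(**inputs, max_new_tokens=40, num_beams=4)
summary = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(summary)
```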
Training results (the training hyperparameters were not reported):
| Training Loss | Epoch | Step | Validation Loss | ROUGE-1 | ROUGE-2 | ROUGE-L | ROUGE-Lsum | Gen Len |
|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 62 | 2.8026 | 0.1286 | 0.0367 | 0.1075 | 0.1075 | 19.0 |
| No log | 2.0 | 124 | 2.5917 | 0.1368 | 0.0469 | 0.1129 | 0.1130 | 19.0 |
| No log | 3.0 | 186 | 2.5262 | 0.1440 | 0.0532 | 0.1199 | 0.1199 | 19.0 |
| No log | 4.0 | 248 | 2.5092 | 0.1436 | 0.0535 | 0.1198 | 0.1199 | 19.0 |
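To interpret the ROUGE-1 column: it measures unigram-overlap F1 between a generated summary and the reference summary. The sketch below is a deliberately simplified illustration with invented example strings; real evaluations (e.g. the rouge_score package commonly used in summarization fine-tuning scripts) also apply stemming and proper tokenization, whereas this version just splits on whitespace.

```python
from collections import Counter

def rouge1_f1(candidate: str, reference: str) -> float:
    """Simplified ROUGE-1 F1: unigram overlap between candidate and reference.

    Illustrative only: plain whitespace tokenization, no stemming.
    """
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

# Invented example pair to show the computation.
score = rouge1_f1(
    "the bill amends the tax code",
    "this bill amends the internal revenue code",
)
print(round(score, 4))  # → 0.6154
```

A score around 0.14, as in the table, means roughly one in seven unigrams of the reference is recovered, which is typical for a lightly fine-tuned t5-small with very short generations.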
Base model: google-t5/t5-small