How to use `baptiste-pasquier/camembert-allocine` with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="baptiste-pasquier/camembert-allocine")
```
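The pipeline can then be called directly on raw French text. A quick sanity check might look like the following; the exact label strings come from the model's config and may differ from those shown:

```python
# Quick check: classify a French review.
# The label names printed here are illustrative; the real ones come
# from the fine-tuned model's config.
result = pipe("Un film magnifique, je le recommande vivement !")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99}]
```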
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("baptiste-pasquier/camembert-allocine")
model = AutoModelForSequenceClassification.from_pretrained("baptiste-pasquier/camembert-allocine")
```
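With the tokenizer and model loaded this way, inference is a tokenize, forward pass, softmax sequence. A minimal sketch, assuming PyTorch and the id2label mapping stored in the model's config:

```python
# Minimal manual inference: tokenize, run the model, take the softmax
# over the two sentiment classes, and read the predicted label.
import torch

inputs = tokenizer("Quel ennui, j'ai quitté la salle avant la fin.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
probs = logits.softmax(dim=-1)
pred = probs.argmax(dim=-1).item()
print(model.config.id2label[pred], round(probs[0, pred].item(), 4))
```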
This model is a fine-tuned version of camembert-base on the allocine dataset. Evaluation metrics at each checkpoint are reported in the training results table below.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|---|---|---|---|---|---|---|---|
| 0.1276 | 0.2 | 500 | 0.1187 | 0.9623 | 0.9622 | 0.9462 | 0.9787 |
| 0.1013 | 0.4 | 1000 | 0.0917 | 0.9683 | 0.9675 | 0.9725 | 0.9625 |
| 0.1254 | 0.6 | 1500 | 0.0889 | 0.9701 | 0.9698 | 0.9597 | 0.9801 |
| 0.1004 | 0.8 | 2000 | 0.0792 | 0.9716 | 0.9709 | 0.9727 | 0.9691 |
| 0.1149 | 1.0 | 2500 | 0.0762 | 0.9727 | 0.9723 | 0.9673 | 0.9773 |
| 0.0574 | 1.2 | 3000 | 0.0849 | 0.9733 | 0.9729 | 0.9679 | 0.9780 |
| 0.0394 | 1.4 | 3500 | 0.1026 | 0.9718 | 0.9715 | 0.9595 | 0.9839 |
| 0.0401 | 1.6 | 4000 | 0.1065 | 0.9698 | 0.9697 | 0.9528 | 0.9872 |
| 0.0458 | 1.8 | 4500 | 0.0834 | 0.9744 | 0.9739 | 0.9715 | 0.9764 |
| 0.0554 | 2.0 | 5000 | 0.0873 | 0.9719 | 0.9717 | 0.9594 | 0.9844 |
| 0.0516 | 2.2 | 5500 | 0.0928 | 0.9754 | 0.9749 | 0.9723 | 0.9775 |
| 0.0355 | 2.4 | 6000 | 0.1017 | 0.9744 | 0.9741 | 0.9642 | 0.9842 |
| 0.0227 | 2.6 | 6500 | 0.0983 | 0.9748 | 0.9743 | 0.9729 | 0.9757 |
| 0.0359 | 2.8 | 7000 | 0.0990 | 0.9747 | 0.9743 | 0.9665 | 0.9823 |
| 0.0384 | 3.0 | 7500 | 0.1001 | 0.9746 | 0.9742 | 0.9662 | 0.9824 |
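For reference, checkpoint metrics like those above can be recomputed on the allocine test split. A sketch using the `datasets` and `evaluate` libraries, assuming the dataset exposes `review`/`label` columns and using the model config's label2id to map pipeline labels back to dataset ids:

```python
# Sketch: score the model on the allocine test split.
# Assumes the dataset exposes `review` (text) and `label` (0/1) columns.
from datasets import load_dataset
from transformers import pipeline
import evaluate

ds = load_dataset("allocine", split="test")
pipe = pipeline("text-classification",
                model="baptiste-pasquier/camembert-allocine",
                truncation=True)

# Map the pipeline's string labels back to the dataset's integer ids.
label2id = pipe.model.config.label2id
preds = [label2id[out["label"]] for out in pipe(ds["review"], batch_size=32)]

metrics = evaluate.combine(["accuracy", "f1", "precision", "recall"])
print(metrics.compute(predictions=preds, references=ds["label"]))
```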