A fine-tuned DistilBERT model for extractive question answering. Given a context paragraph and a question, it extracts the answer directly from the text.

How to use Kapilydv6/my-qa-model with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("question-answering", model="Kapilydv6/my-qa-model")

# Or load the model directly
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

tokenizer = AutoTokenizer.from_pretrained("Kapilydv6/my-qa-model")
model = AutoModelForQuestionAnswering.from_pretrained("Kapilydv6/my-qa-model")
```
```python
from transformers import pipeline

qa = pipeline("question-answering", model="Kapilydv6/my-qa-model")
result = qa(
    question="Who created Python?",
    context="Python is a programming language created by Guido van Rossum."
)
print(result["answer"])  # "Guido van Rossum"
```
| Parameter | Value |
|---|---|
| Base model | distilbert-base-uncased |
| Dataset | SQuAD v1.1 (3000 samples) |
| Epochs | 3 |
| Learning rate | 3e-5 |
| Max sequence length | 384 |
| Framework | PyTorch + Transformers |
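The 384-token maximum matters because many SQuAD contexts are longer than that; standard SQuAD preprocessing splits an over-long context into overlapping windows and runs the model on each. A minimal sketch of that windowing, assuming a stride of 128 (the Transformers default for SQuAD; the stride is not stated in the table above):

```python
def sliding_windows(tokens, max_len=384, stride=128):
    """Split a long token list into overlapping windows, as SQuAD
    preprocessing does when a context exceeds max_len. Consecutive
    windows overlap by `stride` tokens so no answer span is cut."""
    windows = []
    start = 0
    while True:
        windows.append(tokens[start:start + max_len])
        if start + max_len >= len(tokens):
            break
        start += max_len - stride
    return windows

# Stand-in for 1000 context token ids.
wins = sliding_windows(list(range(1000)))
print(len(wins), len(wins[0]))  # 4 384
```

At inference time, each window is scored separately and the best-scoring span across windows is returned; the Transformers pipeline handles this automatically.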
This is my second Hugging Face model! It was trained on a small subset of SQuAD (3,000 examples) for learning purposes; for production use, fine-tune on the full dataset (~87k examples).