How to use pere/roberta-base-exp-8 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="pere/roberta-base-exp-8")
```
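A masked sentence can then be passed straight to the pipeline. This is a minimal sketch; the example sentence is illustrative only, and it assumes the tokenizer exposes the usual `mask_token` attribute:

```python
# Fill in the masked token and print the candidate tokens with their scores.
masked = f"The capital of Norway is {pipe.tokenizer.mask_token}."
for prediction in pipe(masked):
    print(prediction["token_str"], prediction["score"])
```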
```python
# Load the model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("pere/roberta-base-exp-8")
model = AutoModelForMaskedLM.from_pretrained("pere/roberta-base-exp-8")
```
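With the model and tokenizer loaded directly, mask filling can also be done by hand. The sketch below assumes a PyTorch backend; the example sentence and the top-5 cutoff are arbitrary choices:

```python
import torch

# Tokenize an illustrative sentence containing the mask token.
text = f"The capital of Norway is {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the mask position and take the five highest-scoring tokens.
mask_index = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_tokens = logits[0, mask_index].topk(5).indices[0]
print(tokenizer.convert_ids_to_tokens(top_tokens.tolist()))
```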
This model is still being trained. Do not use it yet.
Lowering the learning rate from 1e-4 to 5e-5 because of some training instability. Restarting on Nov 6. Training for only 500k steps.