How to use swaubhik/LoRA-simple with PEFT:
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

base_model = AutoModelForCausalLM.from_pretrained("bigscience/bloom-3b")
model = PeftModel.from_pretrained(base_model, "swaubhik/LoRA-simple")
```
Training hyperparameters:

```python
per_device_train_batch_size=4,
gradient_accumulation_steps=4,
warmup_steps=100,
max_steps=100,
learning_rate=1e-3,
fp16=True,
logging_steps=1,
output_dir='outputs'
```
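These keyword arguments match the signature of `transformers.TrainingArguments`; a sketch of how they would be passed in, with the surrounding `Trainer` wiring shown as comments, is below. The `tokenized_dataset` and `data_collator` names are placeholders for your own data pipeline, not something this repo provides:

```python
from transformers import Trainer, TrainingArguments

# Hyperparameters from above, passed to TrainingArguments.
training_args = TrainingArguments(
    per_device_train_batch_size=4,
    gradient_accumulation_steps=4,
    warmup_steps=100,
    max_steps=100,
    learning_rate=1e-3,
    fp16=True,            # mixed-precision training; requires a CUDA GPU
    logging_steps=1,
    output_dir="outputs",
)

# Hypothetical wiring: `model` is the PEFT-wrapped model loaded earlier;
# `tokenized_dataset` and `data_collator` are assumed placeholders.
# trainer = Trainer(
#     model=model,
#     args=training_args,
#     train_dataset=tokenized_dataset,
#     data_collator=data_collator,
# )
# trainer.train()
```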