How to use vishalgimhan/uber-assistant with Transformers:

```python
# Load the adapter directly; Transformers resolves the base model from the
# adapter config, so the `peft` package must be installed
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("vishalgimhan/uber-assistant", dtype="auto")
```

This is a LoRA adapter finetuned on the Uber Annual Report 2024. The base model is meta-llama/Llama-3.1-8B-Instruct, finetuned using the Uber Annual Report 2024 Dataset (vishalgimhan/uber-report-2024-dataset).
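A LoRA adapter does not store a full copy of the model: it freezes the base weights and learns only small low-rank update matrices. A minimal numpy sketch of the idea (illustrative only, not the PEFT internals; the dimensions and rank are made up for the example):

```python
# LoRA idea: instead of updating a full weight matrix W (d_out x d_in),
# learn two small matrices A (r x d_in) and B (d_out x r) with rank r << d.
# The effective weight is W + (alpha / r) * B @ A.
import numpy as np

rng = np.random.default_rng(0)
d_out, d_in, r, alpha = 64, 64, 8, 16   # toy sizes, not Llama's

W = rng.standard_normal((d_out, d_in))  # frozen base weight
A = rng.standard_normal((r, d_in))      # trainable, rank r
B = np.zeros((d_out, r))                # zero-initialized: adapter starts as a no-op

x = rng.standard_normal(d_in)
base_out = (W) @ x
adapted_out = (W + (alpha / r) * B @ A) @ x

# With B = 0 the adapter changes nothing; training updates only A and B,
# which is far fewer parameters than the full matrix.
print(np.allclose(base_out, adapted_out))            # True
print(A.size + B.size, "adapter params vs", W.size)  # 1024 vs 4096
```

This is why the adapter repository is tiny compared to the 8B base model, and why the loading code below fetches both pieces separately.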
```python
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import PeftModel
import torch

model_id = "vishalgimhan/uber-assistant"

# Load the base model in 4-bit NF4 with double quantization to reduce memory use
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
)

base_model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-3.1-8B-Instruct",
    quantization_config=bnb_config,
    device_map="auto",
)

# Attach the LoRA adapter and load its tokenizer
model = PeftModel.from_pretrained(base_model, model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)
```
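The BitsAndBytesConfig above loads the 8B base model in 4-bit NF4, while the small LoRA adapter stays in higher precision. A rough back-of-the-envelope memory estimate (the parameter count is approximate, and real usage adds activations, KV cache, and framework overhead):

```python
# Rough weight-memory arithmetic for the base model (illustrative only)
params = 8.0e9                     # ~8B parameters in the base model

bf16_gb = params * 2 / 1024**3     # bfloat16: 2 bytes per parameter
nf4_gb = params * 0.5 / 1024**3    # NF4: 4 bits = 0.5 bytes per parameter
# Double quantization additionally compresses the per-block quantization
# constants, shaving a further fraction of a bit per parameter.

print(f"bf16 weights: ~{bf16_gb:.1f} GB, nf4 weights: ~{nf4_gb:.1f} GB")
```

This ~4x reduction is what makes it practical to load the 8B base plus adapter on a single consumer GPU.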
This adapter inherits the licenses of the base model and the dataset. Check both before use or redistribution.