Hugging Face documentation on loading PEFT
Tutorials
Choosing the Right PEFT Method
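A minimal LoRA setup for a causal language model looks like this: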
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Load the base model to be adapted.
model = AutoModelForCausalLM.from_pretrained("facebook/opt-1.3b")

# Configure LoRA: low-rank adapters injected into the attention projections.
lora_config = LoraConfig(
    r=8,                                  # rank of the low-rank update matrices
    lora_alpha=16,                        # scaling factor applied to the adapter output
    target_modules=["q_proj", "v_proj"],  # which modules receive adapters
    lora_dropout=0.05,                    # dropout on the adapter path
    bias="none",                          # leave bias parameters frozen
    task_type="CAUSAL_LM",
)

# Wrap the base model; only the adapter weights are trainable.
model = get_peft_model(model, lora_config)

Optimising Adapter Hyperparameters
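The most influential knobs are the rank r, the scaling factor lora_alpha, the dropout rate, and which modules receive adapters. Below is a minimal sketch of a higher-capacity variant of the configuration above; the values are chosen purely for illustration, not as tuned recommendations.

from peft import LoraConfig

# Illustrative values only: a larger-capacity variant of the config above.
lora_config = LoraConfig(
    r=16,                     # doubling the rank doubles the adapter's trainable parameters
    lora_alpha=32,            # alpha/r sets the effective scale of the adapter update
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # adapt all attention projections
    lora_dropout=0.1,         # stronger regularisation on the adapter path
    bias="none",
    task_type="CAUSAL_LM",
)

A common heuristic is to keep the lora_alpha / r ratio fixed while sweeping r, so that the effective scale of the adapter update stays comparable across runs.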
Efficient Storage and Sharing of Adapters
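Because only the adapter weights are saved, checkpoints are typically a few megabytes rather than the size of the full model. A minimal sketch, assuming a trained PEFT model named model and a hypothetical local directory:

from transformers import AutoModelForCausalLM
from peft import PeftModel

# Save only the adapter weights, not the 1.3B-parameter base model.
model.save_pretrained("opt-1.3b-lora")  # hypothetical path

# Later (or on another machine): reload the base model and attach the adapter.
base_model = AutoModelForCausalLM.from_pretrained("facebook/opt-1.3b")
model = PeftModel.from_pretrained(base_model, "opt-1.3b-lora")

Sharing works the same way: model.push_to_hub("your-repo-id") uploads just the adapter files to the Hub (repo id hypothetical).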
Combining Multiple Adapters
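Several adapters can be attached to one base model under distinct names and switched at inference time. A sketch with hypothetical adapter paths and names:

from transformers import AutoModelForCausalLM
from peft import PeftModel

base_model = AutoModelForCausalLM.from_pretrained("facebook/opt-1.3b")

# Attach a first adapter under an explicit name, then load a second one.
model = PeftModel.from_pretrained(
    base_model, "adapters/summarisation", adapter_name="summarise"  # hypothetical paths
)
model.load_adapter("adapters/translation", adapter_name="translate")

# Activate whichever adapter the current request needs.
model.set_adapter("translate")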
Fine-Tuning Additional Layers with PEFT Adapters
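Layers outside the adapters themselves, such as a freshly initialised classification head, can be trained in full alongside the adapters via modules_to_save. A sketch, noting that the head's module name depends on the architecture ("score" is what OPT's sequence-classification head is called):

from transformers import AutoModelForSequenceClassification
from peft import LoraConfig, get_peft_model

model = AutoModelForSequenceClassification.from_pretrained(
    "facebook/opt-1.3b", num_labels=2
)
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],
    modules_to_save=["score"],  # fully fine-tune the classification head as well
    task_type="SEQ_CLS",
)
model = get_peft_model(model, lora_config)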
Monitoring Adapter Training
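A quick sanity check before training is to confirm that only the intended parameters are trainable; PEFT models expose print_trainable_parameters() for this:

# Prints counts in the form:
# trainable params: ... || all params: ... || trainable%: ...
model.print_trainable_parameters()

During training itself, the usual transformers.Trainer logging (loss, learning rate, evaluation metrics) applies unchanged, since the wrapped PEFT model behaves like a regular PyTorch module.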