Hugging Face Forums
How do I merge my PEFT adapter with the base model and get a single complete model?
🤗Transformers
John6666
February 5, 2025, 1:33pm
It seems that the problem can be avoided in some cases by using the merge function on the PEFT side, i.e. `PeftModel.merge_and_unload()`, which folds the adapter weights into the base model and returns a plain model you can save as a whole.