Combining Multiple MiniCPM-o adapters #809

Open

nogifeet opened this issue Feb 3, 2025 · 0 comments
Comments

nogifeet commented Feb 3, 2025

Hello, I have fine-tuned MiniCPM-o for two specific downstream tasks using LoRA. But when I try to merge the two adapters into the same model, I get an error. The issue is that both adapters are trying to update the same

    modules_to_save = ['embed_tokens', 'resampler']

which Hugging Face's PEFT library does not allow, I believe. My question is: if I remove the modules_to_save option and set it to None, how will that affect the performance of the final fine-tuned adapter? Is this recommended when fine-tuning MiniCPM-o?
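
For context, here is a minimal sketch of the failing merge, assuming both adapters were trained with Hugging Face PEFT. The adapter paths and adapter names are hypothetical, and the exact point where the error surfaces depends on the PEFT version:

```python
import torch
from transformers import AutoModel
from peft import PeftModel

# Base model (trust_remote_code is needed for MiniCPM-o's custom code).
base = AutoModel.from_pretrained(
    "openbmb/MiniCPM-o-2_6",
    trust_remote_code=True,
    torch_dtype=torch.bfloat16,
)

# First adapter: because it was trained with
# modules_to_save=['embed_tokens', 'resampler'], it stores full copies
# of those modules, not low-rank LoRA deltas.
model = PeftModel.from_pretrained(base, "adapters/task_a", adapter_name="task_a")

# Second adapter declares the same modules_to_save, so it targets the
# already-wrapped embed_tokens and resampler a second time.
model.load_adapter("adapters/task_b", adapter_name="task_b")

# The LoRA matrices themselves can be combined, but the overlapping
# modules_to_save entries are full weight copies that PEFT refuses to
# merge, which is where the error appears.
model.add_weighted_adapter(
    adapters=["task_a", "task_b"],
    weights=[0.5, 0.5],
    adapter_name="merged",
    combination_type="linear",
)
```

Without modules_to_save, each adapter would contain only LoRA matrices and should merge cleanly, but embed_tokens and the resampler would stay frozen during fine-tuning, which is the trade-off the question above is asking about.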
