When the architecture in the config file is changed to `MiniCPM3ForCausalLM`, the error is:

```
File "/home/.local/lib/python3.10/site-packages/sglang/srt/models/minicpm3.py", line 650, in load_weights
    param = params_dict[name]
KeyError: 'embed_tokens.weight'
, detoken_init_state: init ok
```

When it is changed to `MiniCPMForCausalLM` instead, the error is:

```
RuntimeError: The size of tensor a (16) must match the size of tensor b (32) at non-singleton dimension 0
, detoken_init_state: init ok
```
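For context, a `KeyError` like this typically means the weight names in the merged checkpoint do not match the keys the serving engine's parameter dict expects (e.g. a missing or extra `model.` prefix). The sketch below is illustrative only, not sglang's actual `load_weights` code; the function and names are assumptions that reproduce the failure mode:

```python
# Illustrative sketch (NOT sglang's real implementation): a load loop
# raises KeyError when checkpoint names lack the prefix the module uses.
def load_weights(params_dict, checkpoint_items):
    loaded = []
    for name, tensor in checkpoint_items:
        # Hypothetical fix: remap bare checkpoint names onto prefixed keys.
        if name not in params_dict and f"model.{name}" in params_dict:
            name = f"model.{name}"
        if name not in params_dict:
            raise KeyError(name)  # the error seen in the traceback above
        params_dict[name] = tensor
        loaded.append(name)
    return loaded

# Parameter dict as the model registers it (prefixed keys):
params = {"model.embed_tokens.weight": None, "model.lm_head.weight": None}
# Names as they might appear in a merged checkpoint (unprefixed):
ckpt = [("embed_tokens.weight", 1), ("lm_head.weight", 2)]
print(load_weights(params, ckpt))  # → ['model.embed_tokens.weight', 'model.lm_head.weight']
```

Without the remapping branch, the first lookup fails with exactly `KeyError: 'embed_tokens.weight'`.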
Is there an existing issue?
Describe the bug
After LoRA fine-tuning and merging/saving the model per the official tutorial, `vllm serve` fails to load the model and raises the errors shown above.
To Reproduce
Following the official LLaMA Factory tutorial, apply LoRA fine-tuning on the example DPO data, merge and save the model as the tutorial describes, then serve it with `vllm serve`; the errors above are raised.
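The steps above correspond roughly to the following command sequence. The YAML file names and model path are placeholders, not taken from the original report:

```shell
# 1. LoRA DPO fine-tuning with LLaMA Factory (config path is a placeholder)
llamafactory-cli train examples/train_lora/minicpm3_lora_dpo.yaml

# 2. Merge the LoRA adapter into the base model and save it
llamafactory-cli export examples/merge_lora/minicpm3_lora_merge.yaml

# 3. Serve the merged model; loading fails here with the errors above
vllm serve /path/to/merged-minicpm3 --trust-remote-code
```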
Expected behavior
The fine-tuned model loads and runs normally.
Screenshots
No response
Environment
Additional context
No response