update: use the Tongyi Qianwen (Qwen) LLM #8
Conversation
QwenLLM.py (Outdated)

    LLMMetadata,
)

DEFAULT_MODEL = "qwen-turobo"
Is "qwen-turobo" a typo?
Yes, but it is never used, so it can be removed.
Kudos for taking the initiative 👍. You could move QwenLLM.py into the custom/llms/ folder and then create a new config_qwen.yml; I'll add support for Qwen in the update.
Moving it into custom/llms is done, but I don't understand the new config_qwen.yml part — what should go in it?
The current config.yaml configures OpenAI's models; config_qwen.yml would configure Qwen as the LLM instead.
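A minimal sketch of what config_qwen.yml might contain, mirroring the OpenAI-based config.yaml described above. The actual schema is not shown in this thread, so every key and value below is an assumption, not taken from the repository:

```yaml
# Hypothetical config_qwen.yml — all keys are assumptions, not the repo's real schema.
llm:
  type: qwen                      # select the custom QwenLLM from custom/llms/
  model: qwen-turbo               # note: "qwen-turbo", not the "qwen-turobo" typo
  api_key: ${DASHSCOPE_API_KEY}   # read the DashScope key from the environment
```

The point is simply that the Qwen-specific settings live in their own file, so users can switch LLM backends by pointing the app at a different config.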
I'll merge it first and take care of the rest myself.
I tried several of the domestic Chinese providers, and only Tongyi Qianwen returned correct results.
Looking at the LLMs that llama_index supports, none of the Chinese models are on the list.
So I implemented a QwenLLM against llama_index's interface; it can call the Tongyi Qianwen API directly.
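The repository's QwenLLM.py is not reproduced in this thread, so as an illustration only, here is a hedged sketch of the HTTP-call side such a wrapper needs: a helper that builds a request for DashScope's text-generation endpoint (the endpoint URL, header names, and payload shape are assumptions based on DashScope's public HTTP API, and the `api_key` default is a placeholder). The llama_index interface glue (`LLMMetadata`, `complete`, etc., hinted at by the diff above) is omitted:

```python
import json

# Correct model name; the diff above contains the "qwen-turobo" typo.
DEFAULT_MODEL = "qwen-turbo"


def build_qwen_request(prompt: str, model: str = DEFAULT_MODEL,
                       api_key: str = "sk-..."):
    """Build (url, headers, body) for DashScope's text-generation endpoint.

    The URL and payload fields are assumptions based on DashScope's public
    HTTP API, not taken from this repository's QwenLLM.py.
    """
    url = ("https://dashscope.aliyuncs.com/api/v1/"
           "services/aigc/text-generation/generation")
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"model": model, "input": {"prompt": prompt}})
    return url, headers, body
```

A QwenLLM class would call a helper like this inside its `complete` method and unpack the response text into llama_index's completion-response type.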
Usage:
Issues:
Example: