Keywords AI Integration #5130
base: main
Conversation
The latest updates on your projects. Learn more about Vercel for Git ✅
Hey @ishaan-jaff, just want to follow up. Would appreciate any updates here.
Hey @krrishdholakia @ishaan-jaff, is there any update on this? Do I need to change anything for this PR?
@ishaan-jaff Hey, just following up again.
Hey @ishaan-jaff, just want to follow up again. Would appreciate any update on this.
Hey @ishaan-jaff, just want to follow up again. Any updates here?
@krrishdholakia Hey, any updates here would be appreciated.
"Content-Type": "application/json",
}

keywordsai_params = kwargs.get("extra_body", {}).pop("keywordsai_params", {})
this looks wrong. you should be taking this information from metadata. extra_body will be sent straight to the llm provider.
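The reviewer's point is that `extra_body` is forwarded verbatim to the LLM provider, so integration-specific settings belong in litellm's metadata instead. A minimal sketch of that suggestion, assuming the usual `kwargs` shape litellm passes to logging callbacks (the helper name is hypothetical):

```python
def extract_keywordsai_params(kwargs: dict) -> dict:
    """Hypothetical helper: read keywordsai_params from litellm metadata
    rather than extra_body, so nothing leaks to the LLM provider."""
    metadata = kwargs.get("litellm_params", {}).get("metadata") or {}
    return metadata.get("keywordsai_params", {})
```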
] = original_response.model_dump()

def commit_to_keywordsai(kwargs, start_time, end_time, success=True):
can you refactor this, to work like langsmith logging - specifically how batching logs works - litellm/litellm/integrations/langsmith.py, line 364 in a3d4bf6:

async def async_send_batch(self):
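The batching pattern the reviewer points to can be sketched as follows: queue each log payload, then flush the whole queue in one call. This is an assumption-laden stand-in, not litellm's actual `langsmith.py` code; the transport is stubbed with a list where a real logger would `await` an HTTP POST:

```python
import asyncio

class BatchingLogger:
    """Sketch of the queue-then-flush batching pattern used by litellm's
    langsmith integration. Class and attribute names are illustrative."""

    def __init__(self, batch_size: int = 512):
        self.log_queue: list = []
        self.batch_size = batch_size
        self.sent: list = []  # stand-in for the HTTP transport

    def add(self, payload: dict) -> None:
        # A real integration would trigger a flush once the queue
        # reaches batch_size, or on a timer.
        self.log_queue.append(payload)

    async def async_send_batch(self) -> None:
        if not self.log_queue:
            return
        # Swap the queue out atomically, then send the batch in one call.
        batch, self.log_queue = self.log_queue, []
        self.sent.append(batch)  # real code: await http_client.post(...)
```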
from litellm.integrations.keywordsai import KeywordsAILogger
# litellm.set_verbose = True
litellm.api_base = None
litellm.callbacks = [KeywordsAILogger()]
this is not how logging integrations on litellm work.
A user should just be able to do `litellm.callbacks = ["keywords_ai"]`
and it should just work.
elif logging_integration == "langsmith":
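The string-based registration the reviewer asks for implies that `"keywords_ai"` resolves to the logger class somewhere in litellm's dispatch. A hypothetical sketch of such a name-to-logger registry (the registry mechanics are an assumption; only the `KeywordsAILogger` name comes from this PR):

```python
class KeywordsAILogger:
    """Stand-in for the PR's logger class."""
    pass

def resolve_callback(name: str):
    """Hypothetical dispatch: map a callback string to a logger instance,
    mirroring how names like "langsmith" are resolved."""
    registry = {"keywords_ai": KeywordsAILogger}
    try:
        return registry[name]()
    except KeyError:
        raise ValueError(f"unknown logging integration: {name}")
```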
```python
from litellm.integrations.keywordsai import KeywordsAILogger
litellm.callbacks = [KeywordsAILogger]
```
please fix this to work as expected: `litellm.callbacks = ["keywords_ai"]`
### Supported LLM Providers

Keywords AI can log requests across [various LLM providers](https://docs.keywordsai.co/integration/supported-models)
I'm confused, why does your logging integration not work across all litellm models?
You don't handle any of the calling logic - it should just be a destination to receive the logged request/response (similar to otel/langfuse/etc.)
Title
Keywords AI Integration
Type
New Feature
Documentation
Infrastructure
Test
Changes
[REQUIRED] Testing - Attach a screenshot of any new tests passing locally
If UI changes, send a screenshot/GIF of working UI fixes
Here are all the outputs of test/test_keywordsai_integrations.py.
There are 9 tests:
1 for using Keywords AI as a proxy
8 for combinations of sync/async, stream/non-stream, tools/no-tools.
The result shows no errors, and the logging is non-blocking.