
Keywords AI Integration #5130

Open · wants to merge 10 commits into main

Conversation

Raymond1415926
Contributor

Title

Keywords AI Integration

Type

πŸ†• New Feature
πŸ“– Documentation
πŸš„ Infrastructure
βœ… Test

Changes

  • integrations/keywordsai.py
  • tests/test_keywordsai_integration.py
  • docs/observability/keywordsai_integration.md

[REQUIRED] Testing - Attach a screenshot of any new tests passing locally

If UI changes, send a screenshot/GIF of working UI fixes
[screenshot: test output]
Here are all the outputs of tests/test_keywordsai_integration.py.
There are 9 tests:
1 for using Keywords AI as a proxy
8 for combinations of sync/async, stream/non-stream, tools/no-tools.

The result shows no errors, and the logging is non-blocking.
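For illustration, the eight sync/async × stream/non-stream × tools/no-tools combinations described above could be enumerated like this (a hypothetical sketch; the names and structure are illustrative and not taken from the actual test file):

```python
import itertools

# Hypothetical sketch: enumerate the 8 combinations the PR says are
# covered (use_async x stream x with_tools).
COMBOS = list(itertools.product(
    [False, True],   # use_async
    [False, True],   # stream
    [False, True],   # with_tools
))

def run_all_cases(run_case):
    """Run a callable for every combination; run_case stands in for the
    real per-case test body (a litellm completion call + log assertion)."""
    return [run_case(a, s, t) for a, s, t in COMBOS]
```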


vercel bot commented Aug 9, 2024

The latest updates on your projects. Learn more about Vercel for Git.

| Name | Status | Preview | Comments | Updated (UTC) |
| --- | --- | --- | --- | --- |
| litellm | ✅ Ready | Visit Preview | 💬 Add feedback | Sep 5, 2024 9:28pm |

@Raymond1415926
Contributor Author

Hey @ishaan-jaff, just want to follow up. Would appreciate any updates here.

@Raymond1415926
Contributor Author

Hey @krrishdholakia @ishaan-jaff, is there any update on this? Do I need to change anything for this PR?

@Raymond1415926
Contributor Author

Hey @ishaan-jaff, just following up again.

@Raymond1415926
Contributor Author

Hey @ishaan-jaff , just want to follow up again. Would appreciate any update on this.

@Raymond1415926
Contributor Author

Hey @ishaan-jaff , just want to follow up again. Any updates here?

@Raymond1415926
Contributor Author

Hey @krrishdholakia, any updates here would be appreciated.

```python
        "Content-Type": "application/json",
    }

    keywordsai_params = kwargs.get("extra_body", {}).pop("keywordsai_params", {})
```
Contributor

this looks wrong. you should be taking this information from metadata. extra_body will be sent straight to the llm provider.
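The reviewer's suggestion could be sketched roughly as follows (the nesting of `litellm_params` → `metadata` is an assumption about litellm's logging kwargs, and `extract_keywordsai_params` is a hypothetical helper name, not code from this PR):

```python
def extract_keywordsai_params(kwargs: dict) -> dict:
    """Read keywordsai_params from litellm metadata instead of extra_body,
    so nothing integration-specific is forwarded to the LLM provider.
    Sketch only: the key layout below is assumed, not verified."""
    litellm_params = kwargs.get("litellm_params") or {}
    metadata = litellm_params.get("metadata") or {}
    return metadata.get("keywordsai_params") or {}
```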

```python
    ] = original_response.model_dump()


def commit_to_keywordsai(kwargs, start_time, end_time, success=True):
```
Contributor
can you refactor this to work like langsmith logging - specifically how batching logs works:

```python
async def async_send_batch(self):
```
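The batching shape the reviewer points to could look roughly like this (a loose sketch of the langsmith-style pattern; the endpoint, payload field names, and the lazy `httpx` import are assumptions, not the real Keywords AI API):

```python
class BatchingLogger:
    """Queue log payloads and flush them in one request, in the spirit of
    litellm's langsmith integration. Endpoint and payload shape assumed."""

    def __init__(self, endpoint: str, flush_size: int = 100):
        self.endpoint = endpoint
        self.flush_size = flush_size
        self.log_queue: list = []

    def add(self, payload: dict) -> None:
        # Non-blocking: just enqueue; the batch is sent separately.
        self.log_queue.append(payload)

    async def async_send_batch(self) -> None:
        if not self.log_queue:
            return
        batch, self.log_queue = self.log_queue, []
        import httpx  # lazy import, as litellm integrations commonly do
        async with httpx.AsyncClient() as client:
            await client.post(self.endpoint, json={"logs": batch})
```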

```python
from litellm.integrations.keywordsai import KeywordsAILogger
# litellm.set_verbose = True
litellm.api_base = None
litellm.callbacks = [KeywordsAILogger()]
```
Contributor
this is not how logging integrations on litellm work.

A user should just be able to do `litellm.callbacks = ["keywords_ai"]` and it should just work.

```python
elif logging_integration == "langsmith":
```


```python
from litellm.integrations.keywordsai import KeywordsAILogger
litellm.callbacks = [KeywordsAILogger]
```
Contributor
please fix this to work as expected: `litellm.callbacks = ["keywords_ai"]`


### Supported LLM Providers

Keywords AI can log requests across [various LLM providers](https://docs.keywordsai.co/integration/supported-models).
Contributor
I'm confused, why does your logging integration not work across all litellm models?

You don't handle any of the calling logic - it should just be a destination to receive the logged request/response (similar to otel/langfuse/etc.)
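The reviewer's point amounts to something like the following destination-only shape (the hook signature follows litellm's documented `CustomLogger` interface; the payload fields are illustrative, not this PR's code):

```python
class DestinationOnlyLogger:
    """A logging integration as a pure sink for finished requests (like
    otel/langfuse): it never touches calling logic, so it works for every
    model litellm can call. Field names below are illustrative."""

    def log_success_event(self, kwargs, response_obj, start_time, end_time):
        # Build the record purely from what litellm hands us;
        # no provider-specific calling logic.
        return {
            "model": kwargs.get("model"),
            "latency_s": (end_time - start_time).total_seconds(),
        }
```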

@krrishdholakia krrishdholakia self-assigned this Sep 20, 2024