
LiteLLM Proxy Startup Error: TypeError in check_view_exists() #5702

Closed
kishan-getstarted opened this issue Sep 14, 2024 · 9 comments · Fixed by #5723 or #5731

kishan-getstarted commented Sep 14, 2024

When attempting to start the LiteLLM proxy server following the quick start guide, I encountered an error during the application startup process. The error occurs in the check_view_exists() function and seems to be related to handling a None value.

Steps to Reproduce

1. Follow the quick start guide at https://docs.litellm.ai/docs/proxy/quick_start
2. Attempt to start the LiteLLM proxy server

Error Message

```
ERROR:    Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/starlette/routing.py", line 732, in lifespan
    async with self.lifespan_context(app) as maybe_state:
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/starlette/routing.py", line 608, in __aenter__
    await self._router.startup()
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/starlette/routing.py", line 709, in startup
    await handler()
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/proxy/proxy_server.py", line 2920, in startup_event
    create_view_response = await prisma_client.check_view_exists()
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/backoff/_async.py", line 151, in retry
    ret = await target(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/proxy/utils.py", line 995, in check_view_exists
    if required_view not in ret[0]["view_names"]:
       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: argument of type 'NoneType' is not iterable

ERROR:    Application startup failed. Exiting.
```
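
For context, the crash happens because `ret[0]["view_names"]` comes back as `None` when the database reports no views, and Python cannot run a `not in` membership test against `None`. The following is a minimal sketch, not LiteLLM's actual code, of the failure mode and of the kind of None-safe guard that avoids it; the result shape and view names are illustrative assumptions only.

```python
# Minimal sketch (not LiteLLM's actual code) of why the membership test in
# check_view_exists() can raise, and how a None-safe guard avoids it.
# The result shape and view names below are illustrative assumptions.
from typing import Optional


def missing_views(ret: list[dict], required_views: list[str]) -> list[str]:
    """Return the required views not reported by a raw DB query result.

    `ret` mimics a result like [{"view_names": ["SomeView", ...]}]. When the
    database has no views yet, the aggregate is often NULL, so
    ret[0]["view_names"] is None and `x not in None` raises:
        TypeError: argument of type 'NoneType' is not iterable
    """
    existing: Optional[list[str]] = ret[0]["view_names"] if ret else None
    existing = existing or []  # treat a NULL / empty result as "no views yet"
    return [view for view in required_views if view not in existing]


# A NULL aggregate no longer crashes the check:
print(missing_views([{"view_names": None}], ["SomeRequiredView"]))
# -> ['SomeRequiredView']
```

Presumably the fix referenced below takes a similar None-safe approach, downgrading the startup crash to a warning about missing views.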
ksquarekumar commented Sep 14, 2024

I am also facing the same issue on a Kubernetes deployment. Running prisma commands such as `push` and `validate` from the pod shell works with `DATABASE_URL` set, but starting the proxy with `litellm --port 4000 --config /app/proxy_server_config.yaml` leads to this error and the workers fail to start.

I am running `ghcr.io/berriai/litellm:main-latest`.

@ishaan-jaff (Contributor)

@kishan-getstarted @ksquarekumar the fix will be in v1.46.0

@agileben

@ishaan-jaff I tried a fresh deploy of 1.46.0 on Railway, but I think I am hitting the same error:

```
INFO:     Started server process [1]
INFO:     Waiting for application startup.
ERROR:    Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 732, in lifespan
    async with self.lifespan_context(app) as maybe_state:
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 608, in __aenter__
    await self._router.startup()
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 709, in startup
    await handler()
  File "/usr/local/lib/python3.11/site-packages/litellm/proxy/proxy_server.py", line 2905, in startup_event
    create_view_response = await prisma_client.check_view_exists()
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/backoff/_async.py", line 151, in retry
    ret = await target(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/proxy/utils.py", line 995, in check_view_exists
    if required_view not in ret[0]["view_names"]:
       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: argument of type 'NoneType' is not iterable

ERROR:    Application startup failed. Exiting.
```

@kishan-getstarted (Author)

I have pulled the latest from main and updated litellm, but I am still facing the same issue. I am just running `docker-compose up`.

Am I doing something wrong here?

@krrishdholakia (Contributor) commented Sep 17, 2024

Hi everyone, thank you for trying this so far. We believe we have a fix for this on main now.

It's not published yet due to some CI/CD issues that we're working through. Hoping to have this fixed by EOD.

@krrishdholakia (Contributor)

@kishan-getstarted I can confirm this works for me on main-latest with a new DB. This is the new warning you should now be seeing:

"\n\n\033[93mNot all views exist in db, needed for UI 'Usage' tab. Missing={}.\nRun 'create_views.py' from https://github.com/BerriAI/litellm/tree/main/db_scripts to create missing views.\033[0m\n".format(

[Screenshot 2024-09-17 at 1:57:12 PM]
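
If you want to verify which views your database already has before running `create_views.py`, one option is to query Postgres's catalog directly. This is a hedged sketch, not part of LiteLLM; it assumes `DATABASE_URL` is a libpq-style Postgres URL and that `psycopg2` is installed.

```python
# Hedged helper sketch (not part of LiteLLM): list the views that already exist
# in the public schema of the database the proxy points at, to compare against
# the "Missing={}" set printed in the warning above.
# Assumes DATABASE_URL is a libpq-style Postgres URL and psycopg2 is installed.
import os

import psycopg2

conn = psycopg2.connect(os.environ["DATABASE_URL"])
try:
    with conn, conn.cursor() as cur:
        cur.execute(
            "SELECT viewname FROM pg_catalog.pg_views WHERE schemaname = 'public';"
        )
        existing_views = sorted(row[0] for row in cur.fetchall())
finally:
    conn.close()

print("Existing views:", existing_views)
```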

@kishan-getstarted (Author) commented Sep 17, 2024

Yeah, okay. So far I had to change the docker-compose file to change the DB. I can see it should pick it up from the .env, but it was not. After a few retries I managed to resolve the previous error; now a new one is coming up, as below.

[Screenshot 2024-09-18 at 3:02:17 AM]
[Screenshot 2024-09-18 at 3:02:47 AM]

I can confirm my PG is running.

Maybe it's an issue on my system, to be honest I don't know.

@krrishdholakia (Contributor)

This looks like an unrelated issue, @kishan-getstarted.

It looks like your DB is either not running or the URL is incorrect.
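
One quick way to separate "DB unreachable or wrong URL" from a LiteLLM-side problem is a bare connectivity check against the same `DATABASE_URL` the proxy uses. This is a sketch, not part of LiteLLM, and assumes `psycopg2` is available in the container or virtualenv.

```python
# Hedged connectivity check (not part of LiteLLM): try connecting with the same
# DATABASE_URL the proxy uses, so "DB not running / wrong URL" shows up as a
# clear error instead of a startup traceback.
import os
import sys

import psycopg2

url = os.environ.get("DATABASE_URL")
if not url:
    sys.exit("DATABASE_URL is not set in this environment")

try:
    conn = psycopg2.connect(url, connect_timeout=5)
except psycopg2.OperationalError as exc:
    sys.exit(f"Could not connect: {exc}")

with conn.cursor() as cur:
    cur.execute("SELECT version();")
    print("Connected to:", cur.fetchone()[0])
conn.close()
```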

@kishan-getstarted (Author)

@krrishdholakia it works fine now. I had to run `docker compose down && docker compose up --build -d`, and then `docker-compose up`.

Now I am able to run the app locally. I am happy to close this thread :)

Thanks for your response and help!
