
The ollama plugin does not work well in the latest version. #2

Closed
BKLFIRST opened this issue Jan 25, 2024 · 4 comments

Comments

@BKLFIRST

Hello,

I'd like to ask about a problem I encountered while using Rivet with Ollama.

I recently installed Rivet and Ollama on my Apple Silicon MacBook Pro.

After confirming that `ollama serve` was working at http://127.0.0.1:11434, I entered the address in the plugin settings, restarted Rivet, and added an Ollama Chat node.

The LLM model was llama2.

I tried entering a simple prompt and checking the results, but only the message "Error from Ollama: Load failed" appeared, and it would not run.

After confirming that `ollama run llama2` works in the terminal, I added the Get Ollama Model node, entered "llama2" as the model name, and ran it.

All I get is the error "TypeError: Load failed".
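As an aside (not part of the original report), one quick way to narrow this down is to query the Ollama HTTP API directly; its `/api/tags` endpoint lists the locally installed models. If this returns JSON, the server itself is fine and the failure is on the Rivet side:

```shell
# Sanity check: a JSON response listing llama2 means the Ollama
# server is reachable, so the "Load failed" error is likely a
# browser-side (CORS) issue rather than a server problem.
curl http://127.0.0.1:11434/api/tags
```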

@abrenneke
Owner

You have to either run ollama with `OLLAMA_ORIGINS=*` or use the Node executor in Rivet.

@BKLFIRST
Author

> You have to either run ollama with `OLLAMA_ORIGINS=*` or use the Node executor in Rivet

Thank you for the answer, but it still does not work.

[Screenshot attached: 2024-01-25, 1:17 PM]

@BKLFIRST
Author

Solved.
After stopping the Ollama instance that was already running, I ran the command you gave me again, launched Rivet, and it worked.
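For anyone hitting the same thing, the working sequence from this thread can be sketched as follows (the stop command is an assumption; on macOS you can also just quit the Ollama menu-bar app):

```shell
# Stop the Ollama instance that is already running, otherwise the
# relaunch below will fail to bind the port.
pkill ollama

# Relaunch with all CORS origins allowed, per the maintainer's reply;
# alternatively, switch Rivet to the Node executor instead.
OLLAMA_ORIGINS='*' ollama serve
```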


abrenneke pushed a commit that referenced this issue Apr 22, 2024
feat: Implement batch processing for Ollama embeddings