Vectorizer for Ollama: Add Ollama support for vectorizer #230
Comments
We are on it already 😄
Wait, so if I want to use a local embedding model, does that mean I can't right now? I was trying to make it work with nomic-embed-text but simply could not get it working.
You can actually, I am using it via the
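For anyone trying nomic-embed-text locally in the meantime, here is a minimal sketch (an assumption for illustration, not necessarily the setup mentioned in the comment above) of requesting a single embedding from a self-hosted Ollama server's native endpoint on its default port:

```python
# Sketch (assumption): single-text embedding from a local Ollama server
# via its native /api/embeddings endpoint, using nomic-embed-text.
import requests

OLLAMA_URL = "http://localhost:11434"  # default Ollama port; adjust for your host

def embed(text: str, model: str = "nomic-embed-text") -> list[float]:
    """Request an embedding for `text` from a local Ollama instance."""
    resp = requests.post(
        f"{OLLAMA_URL}/api/embeddings",
        json={"model": model, "prompt": text},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["embedding"]

if __name__ == "__main__":
    vector = embed("a test sentence for a local embedding model")
    print(len(vector))  # nomic-embed-text produces 768-dimensional vectors
```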
What problem does the new feature solve?
Add Ollama support for vectorizer
What does the feature do?
Allows using a self-hosted Ollama instance to generate embeddings (see the sketch below).
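For illustration only, a rough sketch of how a vectorizer-side embedding step could route batches of chunks through a self-hosted Ollama instance. The host, function name, and batching strategy are assumptions for this example, not the project's actual API; the batch endpoint assumes a reasonably recent Ollama version.

```python
# Illustrative sketch only: batching chunk embeddings through a self-hosted
# Ollama server via its /api/embed endpoint. The embed_chunks helper is
# hypothetical and not part of the vectorizer's real interface.
import requests

OLLAMA_HOST = "http://localhost:11434"  # replace with your self-hosted instance

def embed_chunks(chunks: list[str], model: str = "nomic-embed-text") -> list[list[float]]:
    """Send a list of text chunks to Ollama in one request and return their vectors."""
    resp = requests.post(
        f"{OLLAMA_HOST}/api/embed",
        json={"model": model, "input": chunks},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["embeddings"]

# The returned vectors could then be written into a pgvector column by the vectorizer.
vectors = embed_chunks(["first document chunk", "second document chunk"])
print(len(vectors), len(vectors[0]))  # e.g. 2 vectors, 768 dims each for nomic-embed-text
```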
Implementation challenges
No response
Are you going to work on this feature?
None