
Local Ollama guideline #32

Open
MiikaMatias opened this issue Jan 22, 2025 · 1 comment
Labels
bug Something isn't working

Comments


MiikaMatias commented Jan 22, 2025

I'm running Ollama on http://localhost:11434/, but the application can't reach it. It says that Ollama is not running and that models cannot be found, even though I start it with `ollama serve` and the browser returns the correct response at that URL. Any ideas why this could be?

ECuiDev added the bug label Jan 23, 2025
ECuiDev (Owner) commented Jan 23, 2025

That's strange. When I run `ollama serve` I don't have any issues. Are you using the correct base URL in Obsidian?

[image attached]
