Failure to connect to local ollama model #993
Labels: bug (Something isn't working)

Comments
Can you check your version of Ollama? `ollama -v`
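The version check above can be scripted as well. The following is a minimal sketch (not from the thread) that shells out to the Ollama CLI and returns its version string, or `None` if the CLI isn't on `PATH`:

```python
import shutil
import subprocess
from typing import Optional

def ollama_version() -> Optional[str]:
    """Return the installed Ollama version string, or None if the CLI is absent."""
    exe = shutil.which("ollama")  # locate the binary without raising
    if exe is None:
        return None
    out = subprocess.run([exe, "-v"], capture_output=True, text=True)
    return out.stdout.strip() or None

if __name__ == "__main__":
    print(ollama_version())
```

A `None` result means the connection failure is upstream of any version mismatch: the CLI itself isn't installed or isn't on `PATH`.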
**Feedback:**
The issue you're encountering appears to be related to the connection between your script and the Ollama model. The runtime successfully downloads the model, but it fails to connect during execution with an internal server error.
Here are a few steps you can take to troubleshoot and potentially resolve this issue:
1. **Model Compatibility**: Ensure that the `ollama:phi3.5:latest` model is compatible with the GenAIScript runtime environment. There might be specific requirements or configurations needed for models hosted on Ollama.
2. **Environment Variables**: Verify if all necessary environment variables are correctly set up in your local environment. These might include API keys, authentication tokens, or other configuration settings required to interact with the Ollama model.
3. **Network and Firewall**: Check your network settings and firewall rules to ensure that there are no restrictions preventing the runtime from connecting to the Ollama model server.
4. **Logs and Diagnostics**: Refer to the detailed trace provided (the image). Look for any additional error messages or logs that might provide more context about why the connection is failing.
5. **Update Dependencies**: Ensure that all dependencies and libraries used by GenAIScript are up to date. Sometimes, issues can be resolved with a simple update.
6. **Contact Support**: If none of the above steps resolve the issue, consider reaching out to Ollama support or checking for any known issues on their platform.
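Steps 2 and 3 above can be checked directly. The sketch below (an illustration, not part of the thread) resolves the Ollama base URL the way the server documents it — the `OLLAMA_HOST` environment variable, defaulting to `http://localhost:11434` — and probes the `/api/version` endpoint to confirm the local server is reachable before any model call is attempted:

```python
import os
import urllib.request

def ollama_base_url() -> str:
    # OLLAMA_HOST is Ollama's documented override; 11434 is the default port.
    host = os.environ.get("OLLAMA_HOST", "http://localhost:11434")
    if not host.startswith("http"):
        host = "http://" + host
    return host.rstrip("/")

def check_server(timeout: float = 3.0) -> bool:
    """Return True if the local Ollama server answers on /api/version."""
    url = ollama_base_url() + "/api/version"
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        # Connection refused, firewall block, or timeout all land here.
        return False

if __name__ == "__main__":
    print("Ollama endpoint:", ollama_base_url())
    print("Reachable:", check_server())
```

If this reports the server as unreachable while `ollama -v` works, the problem is network/firewall or a non-default `OLLAMA_HOST`, not the model itself.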
If you provide more details about the specific error messages or logs, I might be able to offer more targeted advice.
@bzorn could you try a smaller model, like
Original issue description:

I am running a trivial script from VS Code on my Windows desktop. When I run this from GitHub Copilot, the runtime successfully downloads the model, but I get an internal server error when it tries to connect. Here's the trace (attached as an image in the original issue).