
Failure to connect to local ollama model #993

Open
bzorn opened this issue Jan 8, 2025 · 3 comments
Labels
bug Something isn't working

Comments

@bzorn (Collaborator) commented Jan 8, 2025

I am running a trivial script from VS code on my Windows desktop:

```js
script({
    title: "test1",
    model: "ollama:phi3.5:latest",
    group: "text",
    system: ["system"],
})

$`Summarize an imaginary story where the characters find out
that dinosaurs still exist. Write only a newspaper headline announcing their discovery.`
```

When I run this from GitHub Copilot:

```
@genaiscript /run test1
```

The runtime successfully downloads the model.

```
PS C:\Users\zorn> ollama list
NAME            ID              SIZE    MODIFIED
llama3:latest   365c0bd3c000    4.7 GB  7 months ago
phi3:latest     a2c89ceaed85    2.3 GB  8 months ago
phi3.5:latest   61819fb370a3    2.2 GB  2 days ago
PS C:\Users\zorn>
```

However, I get an internal server error when the runtime tries to connect:

```
project: found 93 scripts
run 317761433b05: starting test1
genaiscript: test1
 large: azure:gpt-4o_2024-08-06 (env)
 small: azure:gpt-4o-mini_2024-07-18 (env)
 vision: azure:gpt-4o_2024-08-06 (env)
 reasoning: azure:o1-preview_2024-09-12 (env)
 reasoning_small: azure:o1-mini_2024-09-12 (env)
chat: sending 2 messages to ollama:phi3.5:latest (~43 tokens)

retry #1 in 15.5s, 500, Internal Server Error
retry #2 in 23.1s, 500, Internal Server Error
retry #3 in 35.0s, 500, Internal Server Error
```

Here's the trace:

[trace screenshot attached]

@bzorn bzorn added the bug Something isn't working label Jan 8, 2025
@pelikhan (Member) commented Jan 8, 2025

Can you check your version of Ollama?

```
ollama -v
```

github-actions bot commented Jan 8, 2025

**Feedback:**

The issue you're encountering appears to be related to the connection between your script and the Ollama model. The runtime successfully downloads the model, but it fails to connect during execution with an internal server error.

Here are a few steps you can take to troubleshoot and potentially resolve this issue:

1. **Model Compatibility**: Ensure that the `ollama:phi3.5:latest` model is compatible with the GenAI script runtime environment. There might be specific requirements or configurations needed for models hosted on Ollama.

2. **Environment Variables**: Verify if all necessary environment variables are correctly set up in your local environment. These might include API keys, authentication tokens, or other configuration settings required to interact with the Ollama model.

3. **Network and Firewall**: Check your network settings and firewall rules to ensure that there are no restrictions preventing the runtime from connecting to the Ollama model server.

4. **Logs and Diagnostics**: Refer to the detailed trace provided (the image). Look for any additional error messages or logs that might provide more context about why the connection is failing.

5. **Update Dependencies**: Ensure that all dependencies and libraries used by GenAI script are up to date. Sometimes, issues can be resolved with a simple update.

6. **Contact Support**: If none of the above steps resolve the issue, consider reaching out to Ollama support or checking for any known issues on their platform.

If you provide more details about the specific error messages or logs, I might be able to offer more targeted advice.
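
The first few steps above can be checked against the Ollama REST API directly, which takes GenAIScript out of the loop. A minimal sketch in Node.js (18+, for built-in `fetch`), assuming a stock install listening on the default port 11434; the helper names `buildGeneratePayload` and `checkOllama` are illustrative, not part of any library:

```js
// Probe the local Ollama server directly to rule out the GenAIScript
// client as the source of the 500 errors.
const OLLAMA_URL = "http://localhost:11434" // default Ollama port

// Build the JSON body for Ollama's /api/generate endpoint, streaming disabled.
function buildGeneratePayload(model, prompt) {
    return JSON.stringify({ model, prompt, stream: false })
}

// Send a one-shot prompt; return the generated text, or throw with the
// server's error body. A 500 here reproduces the failure outside
// GenAIScript and points at the Ollama server itself.
async function checkOllama(model = "phi3.5:latest") {
    const res = await fetch(`${OLLAMA_URL}/api/generate`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: buildGeneratePayload(model, "Say hi in one word."),
    })
    if (!res.ok) {
        throw new Error(`${res.status}: ${await res.text()}`)
    }
    const data = await res.json()
    return data.response
}
```

If `checkOllama()` also fails with a 500, the problem is on the Ollama side rather than in the script or the extension.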

AI-generated content by issue-reviewer may be incorrect

@pelikhan (Member) commented Jan 8, 2025

@bzorn could you try a smaller model, like `llama3.2:1b`?
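
For reference, swapping the model in the original script is a one-line change in the GenAIScript metadata (a configuration fragment, not standalone JavaScript; it assumes the model was pulled first with `ollama pull llama3.2:1b`):

```js
script({
    title: "test1-small",
    model: "ollama:llama3.2:1b", // smaller model than phi3.5
    group: "text",
    system: ["system"],
})

$`Summarize an imaginary story where the characters find out
that dinosaurs still exist. Write only a newspaper headline announcing their discovery.`
```

If the smaller model succeeds, the 500s likely relate to loading phi3.5 itself (e.g. memory pressure) rather than the connection.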
