
Ollama LLM Response Parsing Fails (TypeError: null is not an object) #3993

Open

thewhitewizard opened this issue Mar 19, 2025 · 1 comment
Labels: bug (Something isn't working)

@thewhitewizard commented:

Describe the bug

When using Ollama as the LLM engine in ElizaOS (v1.0.0-beta.2), the response parsing fails with a TypeError: null is not an object (evaluating 'responseObject.providers'). The issue occurs because parseJSONObjectFromText(response) returns null, which suggests that the response from Ollama is not valid JSON.
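For illustration, here is a minimal sketch of the failing path and the guard that would surface the real problem, assuming parseJSONObjectFromText behaves as described above (returns null when its input is not valid JSON). The import path, function name extractProviders, and call site are assumptions for the sketch, not the actual ElizaOS source:

```ts
import { parseJSONObjectFromText } from "@elizaos/core"; // import path is an assumption

function extractProviders(response: string): unknown {
  // parseJSONObjectFromText returns null when `response` is not valid JSON
  // (per the behavior described above).
  const responseObject = parseJSONObjectFromText(response) as
    | { providers?: unknown }
    | null;

  // Without a guard, a plain-text Ollama reply makes the `.providers` access throw:
  // TypeError: null is not an object (evaluating 'responseObject.providers')
  if (responseObject === null) {
    throw new Error(
      "Ollama response was not valid JSON: " + response.slice(0, 200)
    );
  }

  return responseObject.providers;
}
```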

To Reproduce

Configure ElizaOS to use Ollama with the following environment variables (.env):

```
USE_OLLAMA_TEXT_MODELS=true

OLLAMA_SERVER_URL=http://localhost:11434
OLLAMA_MODEL=llama3.2:1b
SMALL_OLLAMA_MODEL=llama3.2:1b
MEDIUM_OLLAMA_MODEL=llama3.2:1b
LARGE_OLLAMA_MODEL=llama3.2:1b
```
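To check what the model actually returns (and whether it is valid JSON) independently of ElizaOS, a quick probe against the Ollama HTTP API can help. This is only a diagnostic sketch; the prompt text is arbitrary:

```ts
// Diagnostic sketch: ask Ollama directly and try to parse the reply as JSON.
const res = await fetch("http://localhost:11434/api/generate", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "llama3.2:1b",
    prompt: "Reply with a JSON object containing a 'providers' array.",
    stream: false,
  }),
});

const { response } = await res.json(); // Ollama puts the generated text in `response`
console.log("raw model output:", response);

try {
  JSON.parse(response);
  console.log("model output is valid JSON");
} catch {
  console.log("model output is NOT valid JSON -> parseJSONObjectFromText would return null");
}
```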

Screenshots

[screenshot of the error]

Additional context

  • Ollama works fine when tested via the Ollama WebUI.
Contributor commented:

Hello @thewhitewizard! Welcome to the elizaOS community. Thank you for opening your first issue; we appreciate your contribution. You are now an elizaOS contributor!
