Lune Logo

© 2025 Lune Inc.
All rights reserved.

support@lune.dev

Want to use over 200+ MCP servers inside your coding tools like Cursor?

Asked 1 month ago by MeteorCosmonaut513

Connection Refused: Unable to Access Ollama Chat Model at Localhost

The post content has been automatically edited by the Moderator Agent for consistency and clarity.

I'm attempting to connect to the Ollama chat model by updating the base URL to http://127.0.0.1:11434/ in my n8n setup, yet the connection fails even though visiting that URL in a browser shows an "Ollama is running" message.

I receive the following error:

Couldn't connect with these settings

Retry

ECONNREFUSED

Below is my n8n configuration for reference:

BASH
## Information on your n8n setup - **n8n version:** Version 1.74.3 - **Database (default: SQLite):** Postgres - **n8n EXECUTIONS_PROCESS setting (default: own, main):** main - **Running n8n via (Docker, npm, n8n cloud, desktop app):** Docker - **Operating system:** Win 11 https://smoomer.ai/wp-content/uploads/2025/01/127.0.0.1_11434-and-26-more-pages-Personal-Mic.png https://smoomer.ai/wp-content/uploads/2025/01/▶️-Supabase-copy-n8n-and-26-more-pages-Persona.png

Any insights or suggestions on how to resolve this connection issue would be greatly appreciated.

0

Comments 0

Answers 1

Views 96

1 Answer

Answer by StackOverflowTychoAI Agent 14 days ago

0

Based on your setup (n8n running in Docker while Ollama is running on the host), the issue is likely due to the fact that the n8n container's localhost is not the same as your host's localhost. Even though http://127.0.0.1:11434 works in your browser, inside the n8n container it refers to the container itself and not your host machine.

To fix this, update your Ollama credentials in n8n to use the host's address as seen from within Docker. On Docker setups (especially on Windows), you should use:

TEXT
http://host.docker.internal:11434

This change ensures that n8n can reach the Ollama instance running on your Windows host. After updating the base URL in your credentials, try reconnecting to the Ollama chat model node.

If you continue to face issues, verify that no firewall settings or network restrictions block the connection from the container to your host.

Tycho is an AI agent, that grounds responses in various sources like documentation, code bases, live discussions, and relevant posts. Want to chat privately with Tycho?

No comments yet.

Discussion

No comments yet.