<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Proxy on ErrorVault — Developer Error Code Dictionary</title><link>https://errorvault.dev/tags/proxy/</link><description>Recent content in Proxy on ErrorVault — Developer Error Code Dictionary</description><generator>Hugo</generator><language>en-us</language><lastBuildDate>Sat, 18 Apr 2026 11:19:13 +0800</lastBuildDate><atom:link href="https://errorvault.dev/tags/proxy/feed.xml" rel="self" type="application/rss+xml"/><item><title>Fix clw-llm-unreachable: OpenClaw LLM service unreachable connection error</title><link>https://errorvault.dev/openclaw/openclaw-clw-llm-unreachable-llm-service-down/</link><pubDate>Sat, 18 Apr 2026 11:19:13 +0800</pubDate><guid>https://errorvault.dev/openclaw/openclaw-clw-llm-unreachable-llm-service-down/</guid><description>&lt;h2 id="1-symptoms">1. Symptoms&lt;/h2>
&lt;p>The &lt;code>clw-llm-unreachable&lt;/code> error in OpenClaw manifests when the client cannot establish a connection to the configured Large Language Model (LLM) service endpoint. It blocks every inference request, CLI command, and API call that relies on the LLM backend.&lt;/p>
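&lt;p>Before changing any configuration, it can help to reproduce the failure outside of OpenClaw. The sketch below is a minimal, standard-library-only reachability probe; the &lt;code>probe_llm&lt;/code> helper and any URL passed to it are our illustration, not part of the OpenClaw API.&lt;/p>
&lt;div class="highlight">&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#282a36;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code class="language-python" data-lang="python">&lt;span style="display:flex;">&lt;span># Minimal reachability probe for an LLM endpoint (standard library only).
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span># Pass in the endpoint URL you have configured for OpenClaw.
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>import urllib.error
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>import urllib.request
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>def probe_llm(url, timeout=5):
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>    """Return True if the endpoint answers at all, False if unreachable."""
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>    try:
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>        with urllib.request.urlopen(url, timeout=timeout):
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>            return True
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>    except urllib.error.HTTPError:
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>        return True  # server responded (even 4xx/5xx), so it is reachable
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>    except (urllib.error.URLError, OSError):
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>        return False  # ECONNREFUSED, DNS failure, TLS error, or timeout
&lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/div>
&lt;p>A &lt;code>False&lt;/code> result points at networking or service availability rather than request formatting.&lt;/p>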
&lt;h2 id="typical-error-output-from-clw-cli">Typical error output from &lt;code>clw&lt;/code> CLI:&lt;/h2>
&lt;div class="highlight">&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#282a36;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code class="language-fallback" data-lang="fallback">&lt;span style="display:flex;">&lt;span>$ clw infer --prompt "Hello world"
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>clw-llm-unreachable: Failed to reach LLM service at https://llm.example.com:443/v1.
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>Connection refused (error code: ECONNREFUSED). Retries exhausted after 3 attempts.
&lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/div>
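&lt;p>The "Retries exhausted after 3 attempts" line reflects a bounded retry loop before the error is raised. The following is a hedged sketch of that behavior; only the &lt;code>LLMUnreachableError&lt;/code> name is borrowed from the log output below, and the retry logic is our illustration, not clw's actual implementation.&lt;/p>
&lt;div class="highlight">&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#282a36;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code class="language-python" data-lang="python">&lt;span style="display:flex;">&lt;span># Illustrative retry loop matching the CLI output above: N attempts,
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span># then a terminal error. A sketch, not OpenClaw's own client code.
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>import socket
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>class LLMUnreachableError(ConnectionError):
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>    pass
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>def connect_with_retries(host, port, attempts=3, timeout=5):
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>    """Try a TCP connection up to `attempts` times before giving up."""
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>    last_err = None
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>    for _ in range(attempts):
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>        try:
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>            return socket.create_connection((host, port), timeout=timeout)
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>        except OSError as err:  # ECONNREFUSED, timeouts, DNS failures
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>            last_err = err
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>    raise LLMUnreachableError(
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>        f"Failed to reach LLM service: {last_err} "
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>        f"(retries exhausted after {attempts} attempts)"
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>    )
&lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/div>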
&lt;div class="highlight">&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#282a36;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code class="language-fallback" data-lang="fallback">&lt;span style="display:flex;">&lt;span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>In application logs (e.g., from a Python integration):
&lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;p>ERROR:clw.client: LLM endpoint unreachable: https://localhost:8080/v1/models
Traceback (most recent call last):
File &amp;ldquo;/usr/local/lib/python3.11/site-packages/openclaw/client.py&amp;rdquo;, line 245, in _connect
raise LLMUnreachableError(f&amp;quot;Failed to reach LLM at {self.endpoint}&amp;quot;)
clw-llm-unreachable: Connection timeout after 30s&lt;/p></description></item></channel></rss>