This article describes how to run Caddy on OpenWrt while keeping access to LuCI.

tl;dr

To run LuCI on Caddy, prepare a Caddyfile:

{
    order file_server last
}

:80 {
    cgi /cgi-bin/luci* /www/cgi-bin/luci { script_name /cgi-bin/luci }
    cgi /ubus* ubus.sh { script_name /ubus }
    file_server /luci-static* { root /www }
    redir / /cgi-bin/luci
}

Download ubus.sh from yurt-page/cgi-ubus and put it in the same directory as the Caddyfile.
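
For example (the script's exact path & branch in the repo are assumptions; adjust to wherever it actually lives):

wget -O ubus.sh https://raw.githubusercontent.com/yurt-page/cgi-ubus/master/ubus.sh   # assumed raw URL
chmod +x ubus.sh   # the CGI handler must be executable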

You also need a Caddy executable with CGI support for the router.
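
One way, assuming the CGI module is github.com/aksdb/caddy-cgi/v2 and the target is a mipsle router (adjust GOOS/GOARCH/GOMIPS to your hardware), is to cross-compile with xcaddy on a PC and copy the binary over:

# Cross-compile Caddy with the CGI module, then copy it to the router
GOOS=linux GOARCH=mipsle GOMIPS=softfloat xcaddy build --with github.com/aksdb/caddy-cgi/v2
scp caddy root@192.168.1.1:/usr/bin/caddy

Then caddy run --config ./Caddyfile on the router starts the server with the config above.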

Intro

Caddy is a powerful HTTP server with lots of handy & easily configurable modules.

OpenWrt is a popular Linux distro mainly designed for network devices (home routers) & embedded devices. It has a built-in HTTP server called uhttpd. It is also “powerful”, but the prefix u (i.e. μ, micro) suggests that it lacks some features; for example, it does not pass some headers like Authorization to CGI backends. OpenWrt runs LuCI, its web management interface for routers, on uhttpd.

I developed some web applications running on my router with Lua WSAPI CGI. What I want is for these applications & LuCI to share port 80 for HTTP. But two or more applications cannot bind to the same TCP port (strictly speaking, the same IP address + port). So a natural way to do this is to bind uhttpd to a loopback port instead, and add a reverse proxy in Caddy.

I removed the HTTPS listeners, since HTTPS will be managed by Caddy, and changed the wildcard bindings to loopback port 8000:

--- /etc/config/uhttpd
+++ /etc/config/uhttpd
@@ -1,5 +1,3 @@
 config uhttpd 'main'
-    list listen_http '0.0.0.0:80'
-    list listen_http '[::]:80'
-    list listen_https '0.0.0.0:443'
-    list listen_https '[::]:443'
+    list listen_http '127.0.0.1:8000'
+    list listen_http '[::1]:8000'
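
Then restart uhttpd so it rebinds to the new addresses:

/etc/init.d/uhttpd restart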

I also prepared a Caddyfile to run Caddy:

:80 { reverse_proxy 127.0.0.1:8000 }

The setup above was expected to just work, until I actually tested it.

RPC Failure "No related RPC reply"

LuCI fetches device information (wireless list, interfaces, DHCP leases, etc.) through a ubus JSONRPC interface. These are the details of the failing RPC response:

RPC Failure Response

Evidently the response was aborted, and thus truncated, for some reason. I found no solution after trying to adjust dial_timeout, flush_interval, etc. I could only treat it as a feature. :(
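
For context, LuCI's frontend talks to /ubus using JSON-RPC 2.0 calls roughly like the following (the object/method & the all-zero anonymous session ID are only illustrative):

# params are [session, object, method, arguments]; real calls use a token obtained via session login
curl -s -H 'Content-Type: application/json' http://192.168.1.1/ubus -d '{
  "jsonrpc": "2.0", "id": 1, "method": "call",
  "params": ["00000000000000000000000000000000", "system", "board", {}]
}'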

Solution

The ubus JSONRPC interface is itself just implemented by a part of LuCI. So it might be possible to reimplement the ubus RPC in shell, or any other language that can act as a CGI backend; then everything could be handled by Caddy alone. I found a post and its repo yurt-page/cgi-ubus, and used the following Caddyfile:

{
    order file_server last
}

:80 {
    cgi /cgi-bin/luci* /www/cgi-bin/luci { script_name /cgi-bin/luci }
    cgi /ubus* ubus.sh { script_name /ubus }
    file_server /luci-static* { root /www }
    redir / /cgi-bin/luci
}

However, the script did not support multiple JSONRPC requests at once (i.e. a request body that is an array of several JSONRPC requests). After patching ubus.sh, it started working; I submitted the patch as a pull request.

JSONRPC Requests
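
For illustration, a batched request simply carries a JSON array of such calls in one body (the session token & calls below are made up):

# Two JSON-RPC calls sent as an array in a single POST body
curl -s -H 'Content-Type: application/json' http://192.168.1.1/ubus -d '[
  {"jsonrpc": "2.0", "id": 1, "method": "call",
   "params": ["00000000000000000000000000000000", "session", "access", {}]},
  {"jsonrpc": "2.0", "id": 2, "method": "call",
   "params": ["00000000000000000000000000000000", "system", "board", {}]}
]'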