Deploying a Loco for Rust Chat Room Application
Introduction
Loco is a new framework that aims to be the Rust equivalent of Rails. This is an ambitious goal, given Rust's different paradigms and ecosystem. In this post, I'll walk you through my experience of deploying a Loco-built chat room application on a Linux (Ubuntu) server.
Getting Started
Forking and Setting Up the Project
I started by forking the chat-rooms example repository from Loco. After cloning it to my local machine, I ran into a couple of issues with Cargo dependencies, which I resolved by running `cargo update` to bring all dependencies up to their latest compatible versions.
With this completed, I had the project running locally and was ready to tackle deployment.
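For anyone following along, getting it running locally boiled down to something like this (the fork URL is a placeholder for your own; `cargo run -- start` invokes the same Loco CLI subcommand the deployed binary takes later in this post):

```sh
# Clone your fork and bring dependencies up to date
git clone https://github.com/<your-fork>/chat-rooms.git
cd chat-rooms
cargo update

# Run the app locally; Loco apps take a subcommand such as `start`
cargo run -- start
```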
Deployment Process
Creating a Systemd Service
To run our chat room application as a service on Ubuntu, we'll use systemd. Here's how to create a new service file:
- Create and edit the service file with `sudo systemctl edit --force --full chatroom.service`, which will bring up an editor. So I started with this:-
```ini
[Unit]
Description=ChatRooms Daemon
Requires=multi-user.target
After=network.target auditd.service
After=multi-user.target

[Service]
WorkingDirectory=/usr/share/chatroom/
ExecStart=/usr/share/chatroom/chat_rooms-cli

[Install]
WantedBy=multi-user.target
```
- Setup the folders used by the service. Now I have to create the `/usr/share/chatroom` folder and get the build in there. A quick `sudo mkdir /usr/share/chatroom` gets us started, but I will also need a folder for the static elements of the frontend build, so `sudo mkdir /usr/share/chatroom/static` will get the folders ready.
- Building the app and putting it in place. Back in the codebase, from the root folder, run `cargo build --release`, then dig the app out of the `target/release` folder and copy it to the folder set up for the service above. The config folder from the root of the project also needs to be copied to the service folder: `sudo cp -r config/ /usr/share/chatroom/`.
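Putting that together, the build-and-copy step looks roughly like this (the binary name matches the `ExecStart` path in the service file above):

```sh
# Build an optimised release binary
cargo build --release

# Copy the binary and the runtime config into the service folder
sudo cp target/release/chat_rooms-cli /usr/share/chatroom/
sudo cp -r config/ /usr/share/chatroom/
```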
- Building the frontend and putting it in place. We will need to build this first. I use pnpm for such things, so a `pnpm i` in the frontend folder installs the required packages, and then a `pnpm build` builds the frontend into the `frontend/dist` folder. Then copy the contents of the dist folder across to the static folder we created for the frontend earlier: `sudo cp -R dist/. /usr/share/chatroom/static`.
- Testing the service. We should now have all of the files in place to start our service and see if it works. Let's start it off with `sudo systemctl enable --now chatroom.service` and then see whether it is working with `systemctl status chatroom.service`. On the first run I got an error because I had not defined what I want the app to do when it runs: Loco apps require a subcommand such as `start` or `task`. I want it to start, so I need to change my service definition with `sudo systemctl edit chatroom.service`.
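As an aside, when a unit fails like this, the quickest way to see why is the journal; these are standard systemd commands, nothing Loco-specific:

```sh
# Check the current state of the unit
systemctl status chatroom.service

# Follow the service's log output to see the actual error
journalctl -u chatroom.service -f
```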
- Debugging and fixing. Changed the service definition file to:
```ini
[Unit]
Description=ChatRooms Daemon
Requires=multi-user.target
After=network.target auditd.service
After=multi-user.target

[Service]
WorkingDirectory=/usr/share/chatroom/
ExecStart=/usr/share/chatroom/chat_rooms-cli start

[Install]
WantedBy=multi-user.target
```
So I restarted the service with `sudo systemctl restart chatroom.service`. This still fails, but this time because the config file looks for the frontend of the site in `frontend/dist`, so we need to change the config file. The config file that ships with the app is for development, so I created a `production.yaml` config file with some changes; I then need to put the app into production mode by passing `-e production` when starting it, which means changing the service definition file again.
So my final version of the service definition file is:-

```ini
[Unit]
Description=ChatRooms Daemon
Requires=multi-user.target
After=network.target auditd.service
After=multi-user.target

[Service]
WorkingDirectory=/usr/share/chatroom/
ExecStart=/usr/share/chatroom/chat_rooms-cli -e production start

[Install]
WantedBy=multi-user.target
```
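One thing to keep in mind: `systemctl edit` reloads units for you, but if you ever change the file directly under /etc/systemd/system you need to tell systemd to re-read it before restarting:

```sh
# Re-read unit files, then restart and check the service
sudo systemctl daemon-reload
sudo systemctl restart chatroom.service
systemctl status chatroom.service
```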
And my production config file is:-
```yaml
# Loco configuration file documentation

# Application logging configuration
logger:
  # Enable or disable logging.
  enable: true
  # Log level, options: trace, debug, info, warn or error.
  level: debug
  # Define the logging format. Options: compact, pretty or json.
  format: compact
  # By default the logger filters to only logs that come from your code or from the `loco` framework.
  # Uncomment the line below to override the logger filters and see all third-party libraries.
  # override_filter: trace

# Web server configuration
server:
  # Port on which the server will listen. The server binding is 0.0.0.0:{PORT}.
  port: 3001
  # The UI hostname or IP address that mailers will point to.
  host: http://chat.dave-gill.co.uk
  # Out of the box middleware configuration. To disable a middleware, change its `enable` field to `false` or comment out the middleware block.
  middlewares:
    # Enable Etag cache header middleware.
    etag:
      enable: true
    # Limits the request payload size. Payloads bigger than this limit will block the request.
    limit_payload:
      # Enable/Disable the middleware.
      enable: true
      # The limit size. Can be b, kb, kib, mb, mib, gb, gib.
      body_limit: 5mb
    # Generates a unique request ID and enhances logging with additional information such as the start and completion of request processing, latency, status code, and other request details.
    logger:
      # Enable/Disable the middleware.
      enable: true
    # When your code panics, the request still returns a 500 status code.
    catch_panic:
      # Enable/Disable the middleware.
      enable: true
    # Timeout middleware for incoming requests. Requests that take longer than the configured duration will be cut off and a 408 status code returned.
    timeout_request:
      # Enable/Disable the middleware.
      enable: false
      # Duration time in milliseconds.
      timeout: 5000
    static:
      enable: true
      must_exist: true
      folder:
        uri: "/"
        path: "static"
      fallback: "static/index.html"
```
My service is now running:-
```
dave@system76-pc:/usr/share/chatroom$ systemctl status chatroom.service
● chatroom.service - ChatRooms Daemon
     Loaded: loaded (/etc/systemd/system/chatroom.service; enabled; vendor preset: enabled)
     Active: active (running) since Tue 2024-09-03 21:18:26 BST; 3s ago
   Main PID: 751333 (chat_rooms-cli)
      Tasks: 9 (limit: 37981)
     Memory: 3.1M
        CPU: 7ms
     CGroup: /system.slice/chatroom.service
             └─751333 /usr/share/chatroom/chat_rooms-cli -e production start

Sep 03 21:18:26 system76-pc chat_rooms-cli[751333]: ██████ █████ ███ █████ ▄▄▄ █████ ███ █████
Sep 03 21:18:26 system76-pc chat_rooms-cli[751333]: ██████ █████ ███ ████ ███ █████ ███ ████▀
Sep 03 21:18:26 system76-pc chat_rooms-cli[751333]: ▀▀▀██▄ ▀▀▀▀▀▀▀▀▀▀ ▀▀▀▀▀▀▀▀▀▀ ▀▀▀▀▀▀▀▀▀▀ ██▀
Sep 03 21:18:26 system76-pc chat_rooms-cli[751333]: ▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
Sep 03 21:18:26 system76-pc chat_rooms-cli[751333]: https://loco.rs
Sep 03 21:18:26 system76-pc chat_rooms-cli[751333]: environment: production
Sep 03 21:18:26 system76-pc chat_rooms-cli[751333]: logger: debug
Sep 03 21:18:26 system76-pc chat_rooms-cli[751333]: compilation: release
Sep 03 21:18:26 system76-pc chat_rooms-cli[751333]: modes: server
Sep 03 21:18:26 system76-pc chat_rooms-cli[751333]: listening on [::]:3001
```
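Before exposing it, it's worth confirming the app answers locally; the port comes from the `server` section of `production.yaml`:

```sh
# The server binds 0.0.0.0:3001 per the production config,
# so a HEAD request should come back with the static index page's headers
curl -I http://localhost:3001/
```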
- Making it public. Now I need to get it set up so that the outside world can see it and so it becomes part of my website.
I have added a new DNS entry to point chat.dave-gill.co.uk to my web server's IP address. Next I need my reverse proxy to pass that traffic through to the service I have just created; I won't show my exact configuration so that I don't expose anything I don't want to, but a generic sketch follows below. I then used certbot to create a certificate for the domain, and the whole thing was set up and running. Except for the fact that the frontend did not work and hit CORS issues. Looking at the error, it seemed to be because the frontend was set up to look for localhost, which is a bit odd.
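For anyone reproducing this, here is a generic sketch of what the proxy entry might look like, assuming nginx as the reverse proxy. This is not my actual config; the WebSocket upgrade headers are the standard boilerplate a chat app needs:

```nginx
server {
    server_name chat.dave-gill.co.uk;

    location / {
        proxy_pass http://127.0.0.1:3001;

        # Required for the WebSocket upgrade handshake
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";

        # Pass the original host and client address through
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```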
After digging around in the frontend code I spotted that the domain name for the WebSocket was hardcoded, so I changed the frontend code to take an environment variable defining the WebSocket URL to use. That was more difficult than I anticipated because Loco uses Vite for the frontend, which I had never used before. And now it is working and available here, and my repo with the code changes is here.
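For reference, the Vite mechanism here is an environment variable prefixed with `VITE_`, read through `import.meta.env` at build time. The variable name and the WebSocket path below are my own illustrative choices, not necessarily what the repo uses:

```ts
// .env.production (picked up automatically by `pnpm build`):
//   VITE_WS_URL=wss://chat.dave-gill.co.uk/websocket

// Read the env variable, falling back to localhost for local development.
// Only variables prefixed with VITE_ are exposed to client code.
const wsUrl: string =
  import.meta.env.VITE_WS_URL ?? "ws://localhost:3001/websocket";

const socket = new WebSocket(wsUrl);
socket.addEventListener("open", () => console.log(`connected to ${wsUrl}`));
```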