How to run Discord bot on Cloud Run

One of my hobbies is playing Pokemon Go – yeah, I’m not a kid, but I grew up on Pokemon, so this game is my guilty pleasure. Of course, it’s a cooperative game, so some kind of communicator is very useful to make sure the local community can talk to each other. Most Pokemon Go communities use Discord as their main way of communication, and so do we – the local groups from the city I live in. As you can guess, I’m that “tech guy” who makes the necessary changes in the Discord configuration and so on. Moreover, I decided to create a Discord bot for our server to provide necessary information for players and to automate new member management (like assigning roles and so on). And I didn’t even think that I would deploy a Discord bot on Cloud Run in the future.

In the beginning, my bot was running on a VPS, in the simplest possible way – as a service. But recently I decided to move my toys from the VPS to more… “fully managed services”, so I had to sort out the Discord bot problem. I did some research and finally decided to try Google Cloud Platform’s Cloud Run. I thought that it should be very easy to deploy and run my bot there, and I was right. Now maintenance is much easier than before, on the VPS. In particular, the deployment process is more flexible, safer, and more enjoyable.

Why Discord Bot on Cloud Run?

I needed a service as simple as possible. Moreover, the price was also a very important decision factor. I checked some dedicated services (like Glitch), but I was mostly focused on Google Cloud Platform. I work with this cloud provider on a daily basis, so I’m familiar with their services, the way they work, and so on. Also, Cloud Functions was one of those services I already knew, so it seemed to be one of the best options. This opinion was challenged very quickly – a Discord bot should be running the whole time, and Cloud Functions was created for completely different purposes.

Then I recalled Google Cloud Run – I’d had one meeting with this service, but it was a very… unsatisfying meeting, so I was not convinced at all. Anyway, I decided to try, because the situation in which I had used Cloud Run before was slightly different from my current needs. So why not give it another chance? And do you know what? It was a fantastic decision. Of course, I needed to add one piece of functionality that is unnecessary from the bot’s point of view, but… generally, it was a very good decision. So why did I choose this service?

  • very cheap (2 million requests free, then $0.40 per 1 million requests)
  • fully managed (I don’t need to maintain it)
  • very easy to deploy and release a new version
  • based on the Docker images
  • low entry threshold

How did I migrate from the VPS?

On my VPS the whole bot was very… primitive. It’s written in JavaScript (it uses Discord.js – please don’t ask why not Python, I have no idea, it just happened, but I have a migration in my plans too). The whole service was built from the following parts:

  • a service configuration file enabled on system startup
  • a directory in the /opt directory with main files
  • a testing area in the home directory
  • repository with the code
  • auth.json files (with the Discord tokens and so on) in the testing area and the /opt directory

The new release process was limited to fetching a new branch from the remote repository, checking out that branch in the testing area, doing some tests on my private Discord server, and using rsync to sync the changes. Did I mention that it was a very primitive process?

The first problem was the configuration process. Of course, I didn’t have any Ansible playbook with all the necessary steps, or at least a bash script. As you know, it was mine – mine, not my company’s or anything like that – and I’m a couch potato. So moving the whole configuration to a Docker image was exactly what I needed. Secondly, I didn’t have any sensible process for a new release. Again – I’m a lazy guy, and it is a community project. This time I decided to grab all those things, put them into a Docker image, and create at least a simple bash script to automate the new release process – both “test” and “prod”.

You wanna run a Discord bot? Great, make it listen on a port!

Wait, what? Why the heck should a Discord bot listen on a port? There is no reason for doing that, right? It just listens for events on a Discord server and communicates directly with the Discord API. Yeah, but there is one small thing that I missed during the preparation. And I really, really wanted to use Cloud Run (did I mention I am a couch potato?), so I had to do a very ugly thing. But first, why did I have to do it?

When you look at the container runtime contract for Cloud Run, you will find the following requirement:

“The container must listen for requests on 0.0.0.0 on the port to which requests are sent. By default, requests are sent to 8080, but you can configure Cloud Run to send requests to the port of your choice.”

Yeah, Cloud Run was not created for things like Discord bots – it assumes that you want to host some web service. But you know, it’s a requirement, so if you try to omit it, your service won’t deploy properly. How did I solve this problem? In a simple way:
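The bot simply starts a dummy HTTP server next to the Discord client, just to satisfy the port requirement. A minimal sketch of the idea (assuming Discord.js v12 and Node’s built-in http module; the !ping handler and the token field in auth.json are only placeholders):

```javascript
// poke-main.js (simplified sketch)
const http = require('http');
const Discord = require('discord.js');
const auth = require('./auth.json'); // Discord token etc., not committed to git

// The "ugly" part: a dummy HTTP server whose only job is to make
// Cloud Run happy by listening on the port it expects.
const port = process.env.PORT || 8080;
http.createServer((req, res) => {
  res.writeHead(200);
  res.end('OK');
}).listen(port, '0.0.0.0');

// The actual bot: connect to Discord and react to events as before.
const client = new Discord.Client();

client.on('ready', () => {
  console.log(`Logged in as ${client.user.tag}`);
});

client.on('message', (message) => {
  if (message.content === '!ping') {
    message.reply('pong');
  }
});

client.login(auth.token);
```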

It’s not a clean solution, but… it works. Of course, I would never do that in a real production environment – but this is just… my bot for a small community. Let’s treat this as a curiosity, not a proper solution.

How to build a Discord Bot on Cloud Run?

Ok, so I made this “necessary” change and could take the next step: preparing the rest of the files. My first directory structure looked like this:
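Roughly this – just three files in the root of the repository:

```
.
├── auth.json
├── package.json
└── poke-main.js
```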

Here package.json contains all the necessary dependencies for the Node.js build, poke-main.js is the main file with all the bot logic, and auth.json, as you can probably guess, holds all the sensitive data like the API token – this file is not committed to the git repository.

But now it looks like this:
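Something like this (I show the two auth-*.json files as auth-test.json and auth-prod.json – the exact names don’t matter, and auth.json itself is generated by the script just before the build):

```
.
├── Dockerfile
├── build_and_deploy.sh
├── auth-prod.json
├── auth-test.json
└── data
    ├── package.json
    └── poke-main.js
```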

As you can see, all files have been moved to the additional “data” directory, and in the root of the repository we have two new files:

  • Dockerfile
  • build_and_deploy.sh

Two new magic files

As you can guess, the Dockerfile is needed for building the Docker image. The second file is just a script for building and deploying the whole thing to two different environments – test and prod. There are also two additional auth-*.json files that contain all the needed tokens and other configuration that should not be pushed to the remote repository – I also added them to .gitignore with a simple “auth*.json” pattern.

Ok, now you need to know that I assumed this is not a crucial service. I did the whole thing with that risk accepted, so please do not copy my solution as is, especially to production in your company! You should use it only as a very rough template, because you will see some bad practices below!

Ok, so what is the content of the Dockerfile?
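A minimal sketch of it, assuming the image is built from the repository root and a Node 14 base image (the exact version doesn’t matter):

```dockerfile
# Official Node.js base image.
FROM node:14-alpine

WORKDIR /usr/src/app

# Install dependencies first, so this layer is cached between builds.
COPY data/package*.json ./
RUN npm install --production

# Copy the rest of the files – including auth.json, which is the bad
# practice mentioned below: the secrets end up baked into the image.
COPY data/ ./

CMD ["node", "poke-main.js"]
```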

Simple, right? It just takes the package.json file, downloads all the needed dependencies, and then copies the rest of the files. And there we have one of the bad practices – it ALSO copies auth.json INTO the Docker image, so the whole secrets file resides inside the image. Fast, but not recommended for production. Still, it’s enough to run a Discord bot on Cloud Run.

The second file is a very primitive bash script that does the following:

  1. Activate proper GCP configuration.
  2. Prepare an auth.json file.
  3. Build a docker image.
  4. Push the docker image to the Google Container Registry.
  5. Deploy a new Cloud Run revision.
  6. Create a scheduler job that makes requests to the Cloud Run service URL.

And here is its content:
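(A simplified sketch – the configuration names, project ID, service name, and region below are placeholders that you would need to adjust to your own setup.)

```bash
#!/usr/bin/env bash
set -euo pipefail

# Usage: ./build_and_deploy.sh test|prod
ENV="${1:-test}"

# Placeholder names – adjust to your own project.
GCP_CONFIG="poke-bot-${ENV}"
PROJECT_ID="my-gcp-project-${ENV}"
SERVICE_NAME="poke-bot-${ENV}"
REGION="europe-west4"
IMAGE="gcr.io/${PROJECT_ID}/${SERVICE_NAME}"

# 1. Activate the proper GCP configuration.
gcloud config configurations activate "${GCP_CONFIG}"

# 2. Prepare the auth.json file for the chosen environment.
cp "auth-${ENV}.json" data/auth.json

# 3. Build the Docker image.
docker build -t "${IMAGE}" .

# 4. Push the Docker image to Google Container Registry.
docker push "${IMAGE}"

# 5. Deploy a new Cloud Run revision.
gcloud run deploy "${SERVICE_NAME}" \
  --image "${IMAGE}" \
  --platform managed \
  --region "${REGION}" \
  --allow-unauthenticated

# 6. Create a scheduler job that pings the service URL every minute.
SERVICE_URL=$(gcloud run services describe "${SERVICE_NAME}" \
  --platform managed --region "${REGION}" --format 'value(status.url)')

gcloud scheduler jobs delete "${SERVICE_NAME}-keepalive" --quiet || true
gcloud scheduler jobs create http "${SERVICE_NAME}-keepalive" \
  --schedule "* * * * *" \
  --uri "${SERVICE_URL}"
```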

Wait, a scheduler job that makes requests?

You are probably surprised. Unfortunately, it’s quite… mandatory. During testing, it turned out that Cloud Run doesn’t want to stay alive all the time. In theory, the GCP documentation says that:

“When a revision does not receive any traffic, it is scaled down to the minimum container instances configured (zero by default).”

So according to those words, setting minimum instances should solve the problem. But it doesn’t. For me, it was much easier to just create a scheduler job that tries to reach the URL every minute, so I’m sure that my revision gets some traffic all the time. And it just works for me. It’s probably a bug in Cloud Run (or I misunderstand something), but the current solution is ok for me.

And that’s all. Simply run the script and wait until the deployment process is done. Now you have your own Discord bot on Cloud Run!

What can be improved?

Many things, literally. Here are just a few of them:

  • Move all secret data outside of the Docker image.
  • Pass secret data to Cloud Run in a secure way (see the sketch after this list).
  • Make a better deployment process.
  • Fix the problem with the open port (make a better workaround).
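For the secrets part, the direction I have in mind is Secret Manager – something along these lines (a sketch, assuming a secret called discord-bot-auth and a reasonably recent gcloud; the bot would then read the token from an environment variable instead of auth.json):

```bash
# Store the auth file in Secret Manager (one-time setup).
gcloud secrets create discord-bot-auth --data-file=auth-prod.json

# Mount it into the Cloud Run service as an environment variable
# instead of baking auth.json into the image.
gcloud run deploy poke-bot-prod \
  --image gcr.io/my-gcp-project-prod/poke-bot-prod \
  --platform managed \
  --region europe-west4 \
  --set-secrets "DISCORD_AUTH=discord-bot-auth:latest"
```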

I will do all those things for sure, but as long as it’s only my private project, I’d prefer to spend my time on other projects. You know… “if it works, don’t touch it” and so on… 😀 But please feel free to use this skeleton for your own purposes and change (and improve) whatever you want!

8 Replies to “How to run Discord bot on Cloud Run”

  1. This was great,

    I had done something similar with GKE; Cloud Run seems to be a few clicks less and a little more straightforward. I’ve been running my bot on this infrastructure for a couple of weeks now. If you’re on Discord I’d love to chat more – your site/blog has great content.

    1. Thanks! GKE would be something like shooting sparrows with a cannon. 😀 So I decided to use Cloud Run, but if I had more toys to run, I would use GKE as well (probably). Unfortunately, I don’t have my own Discord server, but generally, I use this communicator, so I join any great server. 😀

  2. What about pricing? Setting a minimum instance straight up seems to be working for me. I am just a bit concerned as this bot is supposed to be running on multiple servers. Pricing looks generally fine to me but I am still confused about certain parts. The only thing is that I never ran a Cloud Run instance 24/7. It seems like you are doing pretty much the same. Can you share a ballpark of your monthly costs for GCP?

    1. Last month I paid $0.02 for all the services I use in GCP. 😀 So not only Cloud Run (two Discord bots running 24/7), but also a few Cloud Functions (simple crawlers), Stackdriver Logging for all my services, and occasionally Compute Engine for a few hours.

      Here you can see the billing details for Cloud Run on my account from February 2021 – each month looks similar: https://storage.googleapis.com/ew-public-stuff/cloud_run_february_2021.png

      Those “discounts” just cover the free part of each service (not the Free Tier, as my Free Tier ended last year), so as you can see, I can’t even use up the free resources. 😀

      Does it answer your question?

  3. Hey this was pretty cool!

    Did you ever find out how to do something similar for a Python Discord bot on Cloud Run? So far I’ve been trying to mess with threads so that it can spin up a Flask instance to listen on the port without blocking the Discord bot, but no luck so far.

    Also do you have the JS Cloud Run source on github by any chance?

    1. Hi! Thanks a lot! No, I haven’t tried to run Python, but I have good news – I’m in the middle of a migration from Node.js to Python, so for sure I will face the same problem. 😀

      No, I don’t – I keep it on Bitbucket in a private repository. Unfortunately I was keeping credentials in the repository (yeah, a very bad thing) and I didn’t rewrite the repository history, so I’d prefer not to push everything to a public repo… 😀

  4. I utilized a strategy similar to what you described, but my bot takes 60+ seconds to respond to commands when running on Google Cloud, whereas it takes 1-3 seconds to respond when running on my local machine. It would be great if you could publish your source code or offer more details on the deployment process.

    1. My code is very simple, but I was also facing this problem. The solution was to move Cloud Run to another region… Probably you need to check which one would be the best for you, but I can say that in my case europe-west4 is the “fastest” one. 😀
