Single Docker Container Generated 20 GB of Log. Here's How I Reclaimed Disk Space

by: Abhishek Prakash
Fri, 29 Aug 2025 12:40:58 +0530


I use Ghost CMS for my websites. Recently, they changed their self-hosting deployment recommendation to Docker instead of native installation.

I took the plunge and switched to the Docker version of Ghost. Things were smooth until I got notified that the disk was running out of space.

When I investigated which folders were taking up the most space, I was surprised to see a Docker container taking around 21 GB of disk storage. And it was a container, not a Docker volume, image, or overlay.

Looking closer, I saw a single JSON log file that took up 20 GB of disk space.

Docker container logs taking huge disk space
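If you want to run the same kind of investigation on your own server, here is a minimal sketch, assuming the default Docker data root at /var/lib/docker:

# Rank the largest directories under Docker's data root
sudo du -h --max-depth=1 /var/lib/docker | sort -rh | head

# Check the size of each container's JSON log file (default json-file driver)
sudo du -sh /var/lib/docker/containers/*/*-json.log | sort -rh | head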

Unusual, right? I mean, I was expecting logs to play a role here, but I didn't expect to find them here. I certainly needed to brush up on my Docker logging concepts.

Let me share how I fixed the issue.

Using log rotation in Docker Compose

The solution was to define log rotation in the Docker Compose file like this:

logging:
  driver: "json-file"
  options:
    max-size: "100m"
    max-file: "5"

This basically tells Docker to use JSON log files for the container, with each file capped at 100 MB. When a file reaches that size, a new log file is created, and at most 5 log files are kept at a time, with the oldest one removed. You can change these values as desired.
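For context, here is a minimal sketch of where that block goes in a Compose file; the ghost service name and image tag are placeholders, not my actual setup:

services:
  ghost:                  # placeholder service name
    image: ghost:5        # placeholder image
    restart: always
    logging:
      driver: "json-file"
      options:
        max-size: "100m"
        max-file: "5"

The logging key goes under each individual service you want to rotate logs for, not at the top level of the file.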

But there is more to it than that. You need to identify which service/container was generating the huge log files. Alternatively, you could put this logging section under all the services and make the whole setup "log-eating-disk-space-proof".

My setup involves a Docker Compose file. I believe most deployments running on Docker use Compose, as they often involve more than one service: a server, a database, and other components.

My first challenge was to find which Docker container was causing the issue. If there are multiple containers running, you can identify the culprit by running:

docker ps

It will show the container IDs. The first few characters of the folder names in /var/lib/docker/containers should match these container IDs. This way, you can identify which container is producing such huge log output.

Identifying docker container

As you can see in the two screenshots I shared, the problem was with c702a1916f62ed6b67588f1f244a2d590fae41658c17bd3ef7b298babbc4dbb0, which is the Caddy container with ID c702a1916f62.
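If you prefer to do this mapping from the command line, here is a rough sketch; it assumes the default json-file logging driver:

# List running containers with their full (untruncated) IDs and names
docker ps --no-trunc --format '{{.ID}}  {{.Names}}'

Matching the full IDs against the folder names under /var/lib/docker/containers tells you which container owns the oversized log file.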

So, I modified the compose.yml file with the logging section I shared earlier. Here's what the compose file looked like afterwards:

Adding log rotate in docker compose

Then I restarted the services with:

docker compose up -d

This had two effects. Since new containers were created, the older container c702a1916f62 got destroyed and its disk space was freed automatically. And the new container produced smaller, rotated log files.

Containers taking less disk space after log rotation in place

As you can see in the screenshot above, there are now five log files, each less than 100 MB in size.
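To double-check that a recreated container actually picked up the new settings, you can inspect its log configuration; the container name below is just a placeholder:

# Show the logging driver and options the container was created with
docker inspect --format '{{json .HostConfig.LogConfig}}' my-container

This should show the json-file driver along with the max-size and max-file options you configured.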

What if you do not use compose?

If you are not using Docker Compose and have a single container, you can run the container with these added parameters:

--log-opt max-size=100m --log-opt max-file=5

Here's an example command for reference, with my-container and debian as placeholders:

sudo docker run -ti --name my-container --log-opt max-size=100m --log-opt max-file=5 debian /bin/bash

Set it up for all Docker containers, system-wide

You could also configure the Docker daemon to use the same logging policy for all containers by creating/editing the /etc/docker/daemon.json file:

{
  "log-driver": "json-file",
  "log-opts": {"max-size": "100m", "max-file": "5"}
}

Once you do that, restart the Docker service:

systemctl restart docker

Keep in mind that these daemon-level defaults only apply to containers created after the restart; existing containers keep the logging configuration they were created with.
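To confirm the daemon picked up the new default, you can query it directly; note that this only reports the default log driver for new containers, not the per-driver options:

# Check the default logging driver the daemon will use for new containers
docker info --format '{{.LoggingDriver}}'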

I am sure there are other, perhaps better, ways to handle this situation. For now, this setup works for me, so I am not going to experiment any further, especially not in a production environment. I hope it teaches you a few new things and helps you reclaim precious disk space on your Linux server.
