Simultaneous logging to a single file causing log corruption in Logback


I am currently creating log files every hour using RollingFileAppender and TimeBasedRollingPolicy in Logback. The log files, located in the specified path, store log entries in the form of JSON objects.

Example: { "a": "test", "b": "good", "c": "nice" }

Normally, I write one log entry per transaction, but a single JSON-formatted record can be as large as 12 MB. Logs are usually written without issues, but under high traffic the JSON records become corrupted.

Example: { "a": "test", "b": { "d": "test2", "e": "good2", "f": "nice2" } }

Records like the one above are appearing in the file: part of one transaction's JSON is spliced into the middle of another's.

I suspect the records are so large that another thread (transaction) writes its own log to the same file before the first write has finished.
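If each JSON record is assembled in memory and handed to the logger in a single call, the appender writes it as one unit; interleaving of this shape typically appears when a record reaches the file through several separate writes. A minimal sketch of the one-write-per-record pattern, using plain `java.io` rather than Logback (the class name and record contents are illustrative only):

```java
import java.io.ByteArrayOutputStream;
import java.io.PrintStream;
import java.util.Arrays;
import java.util.List;

public class SingleWriteDemo {

    // Write each complete record with exactly one println call, so
    // concurrent threads can never interleave inside a record.
    static String writeConcurrently(List<String> records) throws Exception {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        PrintStream out = new PrintStream(buf, true, "UTF-8");

        Thread[] threads = new Thread[records.size()];
        for (int i = 0; i < records.size(); i++) {
            String rec = records.get(i);
            // One call per record; PrintStream serializes each call internally.
            threads[i] = new Thread(() -> out.println(rec));
        }
        for (Thread t : threads) t.start();
        for (Thread t : threads) t.join();

        return buf.toString("UTF-8");
    }

    public static void main(String[] args) throws Exception {
        String logText = writeConcurrently(Arrays.asList(
            "{ \"a\": \"test\", \"b\": \"good\", \"c\": \"nice\" }",
            "{ \"d\": \"test2\", \"e\": \"good2\", \"f\": \"nice2\" }"));
        System.out.print(logText);
    }
}
```

The records may come out in either order, but each line stays intact; the same principle applies when the sink is a Logback logger instead of a `PrintStream`.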

Below is my Logback configuration:

<appender name="ApiRollingFile" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
        <fileNamePattern>logs/%d{yyyyMMddHH}.jsonLogfile.log</fileNamePattern>
        <maxHistory>1</maxHistory>
    </rollingPolicy>
    <encoder class="ch.qos.logback.core.encoder.LayoutWrappingEncoder">
        <layout class="ch.qos.logback.classic.PatternLayout">
            <pattern>%m%n</pattern>
        </layout>
    </encoder>
</appender>
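Within a single JVM, one Logback appender serializes events, so interleaved records usually mean that more than one appender, or more than one JVM process, is writing to the same file. For the multi-process case, Logback offers prudent mode, which takes an exclusive OS file lock around each write. A hedged variant of the appender above (note the documented restrictions: prudent mode forbids the `<file>` property and a compressed `fileNamePattern`, and it costs throughput):

```xml
<appender name="ApiRollingFile" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <!-- prudent mode: each event is appended under an exclusive file lock,
         so several JVMs can safely share one log file -->
    <prudent>true</prudent>
    <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
        <fileNamePattern>logs/%d{yyyyMMddHH}.jsonLogfile.log</fileNamePattern>
        <maxHistory>1</maxHistory>
    </rollingPolicy>
    <encoder class="ch.qos.logback.core.encoder.LayoutWrappingEncoder">
        <layout class="ch.qos.logback.classic.PatternLayout">
            <pattern>%m%n</pattern>
        </layout>
    </encoder>
</appender>
```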

I want the JSON-formatted logs to be written sequentially without corruption.

{ "a": "test", "b": "good", "c": "nice" }
{ "d": "test2", "e": "good2", "f": "nice2" }

Can anyone provide a solution or advice on how to achieve this?
