I'm getting this error in production after receiving many messages (the code works fine for the first 5k+ messages), after which it starts throwing the error below:
(node:36) UnhandledPromiseRejectionWarning: RangeError: Maximum call stack size exceeded
at Date.[Symbol.toPrimitive] (<anonymous>)
at Date.toJSON (<anonymous>)
at JSON.stringify (<anonymous>)
at Format.jsonFormatter [as transform] (/data/packages/nodes-base/src/Logging.ts:67:30)
at DerivedLogger._transform (/data/packages/nodes-base/node_modules/winston/lib/winston/logger.js:313:29)
at DerivedLogger.Transform._read (/data/packages/nodes-base/node_modules/readable-stream/lib/_stream_transform.js:177:10)
Once this error starts, we get the same RangeError: Maximum call stack size exceeded on every other operation as well.
My code doesn't make any recursive calls, but I do log inside an async lock, and once the stack-exceeded error starts, that log line never appears again. This is the call; I'm just logging a string plus a small metadata object:

logger.info("kafka message received", {"MsgTs": kafkaMsgTs});
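For context on the top of that stack trace: JSON.stringify itself throws this exact RangeError when it is handed a deeply nested object, even when no user code recurses. A minimal illustration (not our production code):

// Illustration only: JSON.stringify recurses once per nesting level, so a
// deeply nested value overflows the call stack with no recursion in user code.
let root: any = {};
let cursor = root;
for (let i = 0; i < 100000; i++) {
    cursor.child = {};
    cursor = cursor.child;
}
JSON.stringify(root); // RangeError: Maximum call stack size exceeded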
This is my code flow whenever I receive a new message:
await CheckMemoryConsumptionAndTriggerCallback.call(this, consumerGroup, payload, logger, callback, module)

export async function CheckMemoryConsumptionAndTriggerCallback(this: ITriggerFunctions, payload: EachMessagePayload, logger: winston.Logger, callback: Function, listenerType: string) {
    let intervalObj: NodeJS.Timeout
    // isMemoryLimitReached reads memory usage from a system file (a simplified sketch follows this snippet)
    const memoryLimitReached = await isMemoryLimitReached(logger);
    if (memoryLimitReached !== true) {
        const retVal = await callback(payload, logger);
        if (retVal === KAFKA_MSG.EMITTED) {
            // save the message timestamp in the DB when the message is emitted
            await setKafkaDetailsDb(this, payload.message.timestamp, this.getWorkflow().executionUUID, logger)
        }
    }
}
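For completeness, isMemoryLimitReached works along these lines. This is a simplified sketch, not the exact implementation: the cgroup file path and the hard-coded limit are illustrative stand-ins.

import { promises as fs } from 'fs';

// Simplified sketch: read current memory usage from a system (cgroup) file and
// compare it against a limit. Path and threshold are illustrative only.
const MEMORY_LIMIT_BYTES = 1024 * 1024 * 1024;

async function isMemoryLimitReached(logger: winston.Logger): Promise<boolean> {
    const raw = await fs.readFile('/sys/fs/cgroup/memory/memory.usage_in_bytes', 'utf8');
    const usage = parseInt(raw.trim(), 10);
    logger.debug('memory usage check', { usage });
    return usage >= MEMORY_LIMIT_BYTES;
}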
async function setKafkaDetailsDb(caller: ITriggerFunctions, kafkaMsgTs: string, executionID: string | undefined, logger: winston.Logger) {
    const lock_key = "key_" + caller.getWorkflow().id!;
    GetAsyncLockInstance().acquire(lock_key, async function() {
        // Some code
        await SaveStaticDataInDb.call(caller, caller.getWorkflow().id!, staticData)
        // this log below doesn't appear anymore once we start receiving the warning
        logger.info("kafka message received", {"MsgTs": kafkaMsgTs});
    }, {});
}
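While writing this up I also noticed the acquire() call above is fire-and-forget. Assuming GetAsyncLockInstance() returns an instance of the async-lock package, acquire(key, fn, opts) with no callback returns a promise, so a throw inside the locked function would surface as an unhandled rejection, which would at least explain the UnhandledPromiseRejectionWarning prefix above. A sketch of the awaited form (setKafkaDetailsDbSafe is just an illustrative name):

async function setKafkaDetailsDbSafe(caller: ITriggerFunctions, kafkaMsgTs: string, executionID: string | undefined, logger: winston.Logger) {
    const lock_key = "key_" + caller.getWorkflow().id!;
    // With async-lock, acquire(key, fn, opts) returns a promise when no
    // callback is passed; awaiting it propagates errors to the caller instead
    // of leaving them as unhandled rejections.
    await GetAsyncLockInstance().acquire(lock_key, async function() {
        // Some code
        await SaveStaticDataInDb.call(caller, caller.getWorkflow().id!, staticData)
        logger.info("kafka message received", {"MsgTs": kafkaMsgTs});
    }, {});
}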
Later in the code, after the log above, we also use setInterval() to limit how many processes run concurrently. I have my doubts about this setInterval() usage as well (more on that after the snippet).
const reCheckProcessCount = async () => {
    if (WorkflowHelpers.GetActiveProcessCount() <= concurrentProcessCount) {
        // removing the interval
        clearInterval(intervalObj);
        WorkflowHelpers.IncrActiveProcessCount()
        return this.runSubprocess(data, loadStaticData);
    }
    return ""
};

if (WorkflowHelpers.GetActiveProcessCount() >= concurrentProcessCount) {
    console.log("Total allowed concurrent process count limit reached.")
    intervalObj = setInterval(reCheckProcessCount, 30000);
} else {
    WorkflowHelpers.IncrActiveProcessCount()
    return this.runSubprocess(data, loadStaticData);
}
None of the methods above are called recursively. They are called repeatedly, once per new message, but they are not in any recursive loop.
This error appears in production only. I can't pinpoint its exact location or cause, and I haven't been able to reproduce it locally: I called the above flow in a loop 11k times and the error never occurred.
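The repro attempt was essentially the loop below (simplified; makeTestPayload is a hypothetical stand-in for however the test message gets built):

// Simplified local repro attempt: drive the same flow 11k times in a loop.
for (let i = 0; i < 11000; i++) {
    const payload = makeTestPayload(i); // hypothetical test-payload builder
    await CheckMemoryConsumptionAndTriggerCallback.call(this, consumerGroup, payload, logger, callback, module);
}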
Any help or guidance will be much appreciated, thanks.