I'm using kafkajs version 2.2.4. My code is below.
I set the producer's `timeout` to 300ms and added logging to measure elapsed time.
One day, during a period of network congestion, the reported elapsed time for successfully sent messages exceeded 50000ms, and for failed messages it was typically close to 100000ms.
So the timeout does not appear to be taking effect. Am I using it incorrectly?
```typescript
import { Kafka, CompressionTypes } from 'kafkajs';

export const kafka = new Kafka({
  clientId: 'test',
  brokers: [process.env.KAFKA_BROKER],
  ssl: {
    rejectUnauthorized: false,
  },
  sasl: {
    mechanism: 'scram-sha-256',
    username: process.env.KAFKA_USERNAME,
    password: process.env.KAFKA_PASSWORD,
  },
  connectionTimeout: 3000,
  requestTimeout: 5000,
});

export const kafkaProducer = kafka.producer({
  retry: {
    retries: 2,
    initialRetryTime: 100,
    maxRetryTime: 1000,
  },
});

function sendMsg(logBatch) {
  const beginTime = Date.now();
  kafkaProducer.sendBatch({
    compression: CompressionTypes.GZIP,
    topicMessages: logBatch.map(item => ({
      topic: process.env.KAFKA_TOPIC,
      messages: [{
        value: JSON.stringify([item]),
        key: `${Date.now()}`,
      }],
    })),
    timeout: 300,
  }).then((result) => {
    reportKafkaCostTime('success', Date.now() - beginTime);
  })
  .catch((err: any) => {
    reportKafkaCostTime('error', Date.now() - beginTime);
  });
}
```
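For context, the `timeout` option on `send`/`sendBatch` is forwarded to the broker as the produce-request timeout; it does not bound the total client-side wait, which also includes connection time, retries, and retry backoff. One way to guarantee an upper bound on how long the caller waits is to race the send against a local timer. The `withTimeout` helper below is a hypothetical sketch, not part of kafkajs, and note that it only rejects the caller early; it does not cancel the in-flight request, which kafkajs may continue to retry in the background.

```typescript
// Hypothetical helper (not part of kafkajs): enforce an overall
// client-side deadline by racing a promise against a local timer.
function withTimeout<T>(promise: Promise<T>, ms: number): Promise<T> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const deadline = new Promise<never>((_, reject) => {
    timer = setTimeout(() => reject(new Error(`timed out after ${ms}ms`)), ms);
  });
  // Whichever settles first wins; always clear the timer afterwards.
  return Promise.race([promise, deadline]).finally(() => clearTimeout(timer));
}
```

Usage would look like `withTimeout(kafkaProducer.sendBatch({ /* ... */ }), 300)`, so the `.catch` handler fires within roughly 300ms even if the underlying request is still pending.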