Kafka Streams application opening too many files on Kafka servers









I've been working on an application based on the Java Kafka Streams API. Its goal is to consume a stream of data from one Kafka topic, process it, and produce the result into another topic.



Whenever the Kafka Streams application starts producing messages, file handles keep opening on the Kafka brokers and are never closed. Eventually the brokers run into "too many open files", and the Kafka and ZooKeeper daemons crash.



I'm using the kafka-streams 1.0.1 jar for Java, running on JDK 11. The Kafka cluster runs Kafka 1.0.0.



My application sets the following Kafka producer configs (a configuration sketch follows the list):




  • batch.size: 100,000
  • linger.ms: 1,000 (milliseconds)
  • buffer.memory: the byte equivalent of 5 MB
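For reference, here is roughly how these settings are passed to the Streams configuration. This is a minimal sketch, not my exact setup: the application id, the bootstrap server, and the precise buffer.memory byte value are placeholders.

import java.util.Properties;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.streams.StreamsConfig;

Properties props = new Properties();
props.put(StreamsConfig.APPLICATION_ID_CONFIG, "my-streams-app");   // placeholder id
props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092");  // placeholder broker

// Producer settings are forwarded to the internal producer via the
// "producer." prefix helper.
props.put(StreamsConfig.producerPrefix(ProducerConfig.BATCH_SIZE_CONFIG), 100_000);
props.put(StreamsConfig.producerPrefix(ProducerConfig.LINGER_MS_CONFIG), 1_000);
// Note: Kafka interprets both batch.size and buffer.memory in bytes.
props.put(StreamsConfig.producerPrefix(ProducerConfig.BUFFER_MEMORY_CONFIG), 5 * 1024 * 1024);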

The stream processing itself is very simple:



stream.map((k, v) -> handle(k, v)).filter((k, v) -> v != null).to(outgoingTopic);
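For completeness, the wiring around that line looks roughly like this. It is a sketch under assumptions: the input topic name is a placeholder, and handle() is my own method that may return a record with a null value.

import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.KStream;

StreamsBuilder builder = new StreamsBuilder();
KStream<String, String> stream = builder.stream("incoming-topic"); // placeholder topic

// handle() may produce a null value; filter() drops those records before
// they are written to the outgoing topic.
stream.map((k, v) -> handle(k, v))
      .filter((k, v) -> v != null)
      .to(outgoingTopic);

KafkaStreams streams = new KafkaStreams(builder.build(), props);
streams.start();

// Close the Streams client on shutdown so its broker connections are released.
Runtime.getRuntime().addShutdownHook(new Thread(streams::close));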


I would appreciate any suggestions.










java apache-kafka apache-kafka-streams

asked yesterday by Stav Saad
  • The recommendation from Confluent is to configure the host kernel to allow 100,000+ open file handles; the default is normally far below that.
    – daniu
    yesterday











  • It is configured to that level, and files still keep opening.
    – Stav Saad
    yesterday










  • I don't think Java 11 is officially supported or tested by the Kafka community. Not sure if that's the problem, though.
    – cricket_007
    yesterday










  • Java 11 is the runtime of my own application; it shouldn't affect what happens on the Kafka servers.
    – Stav Saad
    18 hours ago













