Kafka server shuts down after continuously writing data to a topic




When I am writing data to a Kafka topic using Spark Direct Streaming 2.1, Kafka shuts down after some time.


import java.util.HashMap
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerConfig, ProducerRecord}

val props = new HashMap[String, Object]()
props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "ip:9092")
props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
  "org.apache.kafka.common.serialization.StringSerializer")
props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
  "org.apache.kafka.common.serialization.StringSerializer")
val producer = new KafkaProducer[String, String](props)

val record = new ProducerRecord[String, String](topic, partition, key, message)
producer.send(record)
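If a producer like the one above is constructed inside the streaming job (for example once per micro-batch or per record), every instance opens its own sockets and file handles, which can accumulate until the process hits "Too many open files". A common mitigation, sketched here with a hypothetical `ProducerHolder` (the factory closure would build the real `KafkaProducer` shown above), is to create one producer per JVM/executor and reuse it:

```scala
// Minimal sketch, assuming one producer per JVM should be shared across
// micro-batches. `create` stands in for building the KafkaProducer above;
// the holder guarantees the factory runs at most once per JVM.
object ProducerHolder {
  private var instance: Option[AnyRef] = None
  private var creations = 0

  def getOrCreate(create: () => AnyRef): AnyRef = synchronized {
    if (instance.isEmpty) {
      creations += 1            // track how often the factory actually ran
      instance = Some(create())
    }
    instance.get
  }

  def creationCount: Int = synchronized { creations }
}
```

Inside the streaming job, calling `ProducerHolder.getOrCreate(() => new KafkaProducer[String, String](props))` from `foreachPartition` lets every partition processed on an executor reuse the same connection instead of opening a new one each time.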





What is the exception?
– Popeye
Jun 29 at 19:07





java.io.IOException: Too many open files, and a log retention exception. Using ulimit -a | grep "open files" and setting a new limit with "sudo ulimit -n 4096" did not work for me. Thank you
– shrikrishna utpat
Jun 30 at 9:18
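A note on why "sudo ulimit -n 4096" cannot work: ulimit is a shell builtin, so sudo runs it in a throwaway shell whose limit change disappears immediately. A hedged sketch of raising the limit instead (the "kafka" user name and the 100000 value are typical examples, not values from this thread):

```shell
# Check the soft limit on open files for the current shell
ulimit -n

# Raise the soft limit up to the hard limit; no sudo needed, and the new
# limit applies to processes started from this shell
ulimit -n "$(ulimit -Hn)"
ulimit -n

# For a persistent, system-wide change (assuming the broker runs as user
# "kafka"), add lines like these to /etc/security/limits.conf:
#   kafka  soft  nofile  100000
#   kafka  hard  nofile  100000
```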





You're confusing Scala with Spark. This code is not Spark. You need a much higher file-handle limit than 4096.
– cricket_007
Jun 30 at 14:24






The message to the above Kafka producer comes from Spark Direct Streaming. The producer has been embedded in the Spark Direct Streaming application.
– shrikrishna utpat
Jun 30 at 17:35










