Getting buffer limit exceeded exception inside Kryo

0 votes

Hi. I am running a Spark application and facing a problem during serialization. Kryo throws the following error:

buffer limit exceeded
Mar 7, 2019 in Apache Spark by Firoz
749 views

1 answer to this question.

0 votes

It seems the object being serialized exceeds Kryo's buffer size. By default, the maximum allowed size is 64MiB. To increase it, you can do either of the following:

// In your application code, set the property before creating the SparkContext:
val conf = new SparkConf().set("spark.kryoserializer.buffer.max", "<new size>")
val sc = new SparkContext(conf)

// Or pass it on the command line with --conf:
./bin/spark-submit <all your existing options> --conf spark.kryoserializer.buffer.max=<new size>
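For illustration, a concrete invocation might look like the sketch below. The 512m value and the application details are hypothetical, not from the question; choose a size larger than your largest serialized object (Spark rejects values of 2048m or more, since Kryo's buffer is limited to under 2 GiB):

```shell
# Hypothetical example: raise Kryo's max buffer to 512 MiB for this job.
# MyApp, my-app.jar, and 512m are placeholders for your own values.
./bin/spark-submit \
  --class MyApp \
  --conf spark.serializer=org.apache.spark.serializer.KryoSerializer \
  --conf spark.kryoserializer.buffer.max=512m \
  my-app.jar
```

If the error persists even at large values, the object itself may simply be too big to serialize in one piece, and restructuring the data (e.g. splitting a huge broadcast variable) is the better fix.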
answered Mar 7, 2019 by Pavitra
