How to get the number of elements in a partition

0 votes
I'm exploring Apache Spark and wanted to know if there's any way to get the number of elements in a particular RDD partition using the partition ID?

Any help would be appreciated, as it would make my task much easier.

Thanks in advance
May 8, 2018 in Apache Spark by Data_Nerd
• 2,390 points
2,190 views

1 answer to this question.

0 votes
rdd.mapPartitions(iter => Iterator(iter.size), preservesPartitioning = true)

This gives you a new RDD with one element per partition: the number of elements in that partition.
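Since the question asks about a particular partition ID, here is a minimal sketch (assuming rdd is your RDD; the names sizesByPartition and countInPartition3 are just illustrative) that uses mapPartitionsWithIndex to pair each partition's ID with its element count and collects the result to the driver:

val sizesByPartition: Map[Int, Int] = rdd
  .mapPartitionsWithIndex { (partitionId, iter) =>
    Iterator((partitionId, iter.size))  // one (id, count) pair per partition
  }
  .collect()
  .toMap

// Look up the count for a specific partition, e.g. partition 3
val countInPartition3 = sizesByPartition.getOrElse(3, 0)

Note that this scans every partition once, so it triggers a job; if you only need the sizes for debugging, run it once and keep the resulting map around.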

answered May 8, 2018 by kurt_cobain
• 9,390 points
