Scala: 30: error: value partitions is not a member of String

0 votes
val emp_data = spark.sparkContext.textFile("dataset")

val emp_header = emp_data.first()

val emp_data_without_header = emp_header.filter(x => !x.equals(emp_header))

println("No.ofpartition=" + emp_data_without_header.partitions.size)
30: error: value partitions is not a member of String
emp_data_without_header.partitions.size
                        ^

How do I check the number of partitions for the above code?

Jul 29 in Apache Spark by Kamal
87 views

1 answer to this question.

0 votes

Try this code. partitions is defined on an RDD, not on a String, so it has to be called on the RDD itself:

val rdd = sc.textFile("file.txt", 5)

rdd.partitions.size

Output = 5

Here the second argument to textFile is the minimum number of partitions, so the RDD is created with at least 5 partitions.

answered Jul 29 by Nijit
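For reference, the compile error in the question itself comes from the filter being applied to emp_header: first() returns the header line as a plain String, so filtering it yields another String, and String has no partitions member. A minimal sketch of the corrected pipeline, assuming the same "dataset" path and the spark session from the question:

// Read the file as an RDD[String]; "dataset" is the path used in the question
val emp_data = spark.sparkContext.textFile("dataset")

// first() returns the header line as a plain String
val emp_header = emp_data.first()

// Filter the RDD (not the header String) to drop the header line
val emp_data_without_header = emp_data.filter(x => !x.equals(emp_header))

// partitions is defined on RDDs, so this now compiles and prints the partition count
println("No.ofpartition=" + emp_data_without_header.partitions.size)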

Related Questions In Apache Spark

0 votes
1 answer

Scala: error: value unary_+ is not a member of (Int, Int)

All prefix operators' symbols are predefined: +, -, ...READ MORE

answered Jul 22 in Apache Spark by karan
122 views
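To illustrate that related answer: the prefix operators +, -, ! and ~ only work on types whose class defines the matching unary_ method, which tuples do not. A short sketch (the Point class below is a made-up example):

val n = -5                 // fine: Int defines unary_-
// val t = +(1, 2)         // fails: value unary_+ is not a member of (Int, Int)

// A type can opt in by defining unary_+ itself (hypothetical example class)
case class Point(x: Int, y: Int) {
  def unary_+ : Point = Point(math.abs(x), math.abs(y))
}

val p = +Point(-1, 2)      // Point(1,2)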
+1 vote
1 answer

Error: value textfile is not a member of org.apache.spark.SparkContext

Hi, Regarding this error, you just need to change ...READ MORE

answered Jul 4 in Apache Spark by Gitika
• 25,420 points
333 views
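The truncated answer above most likely refers to the method name's case: Scala is case-sensitive and SparkContext defines textFile, not textfile. A one-line sketch (the file name is only illustrative):

// sc.textfile("input.txt")            // fails: value textfile is not a member of org.apache.spark.SparkContext
val lines = sc.textFile("input.txt")   // textFile, with a capital F, is the actual method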
0 votes
1 answer

Error: split value is not a member of org.apache.spark.sql.Row

spark.read.csv is used when loading into a ...READ MORE

answered Jul 10 in Apache Spark by Rishi
718 views
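For context on that error: spark.read.csv returns a DataFrame whose records are Row objects, and split is a String method, so it has to be called on a field extracted from the Row (or the file has to be read as plain text). A brief sketch with a hypothetical people.csv path:

val df = spark.read.csv("people.csv")            // DataFrame of Row objects
// df.rdd.map(row => row.split(","))             // fails: value split is not a member of org.apache.spark.sql.Row
val parts = df.rdd.map(row => row.getString(0).split(","))                  // split a String field taken from the Row
val parts2 = spark.sparkContext.textFile("people.csv").map(_.split(","))   // or read plain text, so each record is already a String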
0 votes
1 answer

Hadoop Mapreduce word count Program

Firstly you need to understand the concept ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 10,800 points
3,576 views
0 votes
1 answer

hadoop.mapred vs hadoop.mapreduce?

org.apache.hadoop.mapred is the old API; org.apache.hadoop.mapreduce is the ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 10,800 points
454 views
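As a quick pointer for that distinction, the two APIs live in separate packages; the imports below use their usual entry-point classes:

import org.apache.hadoop.mapred.JobConf   // old (mapred) API
import org.apache.hadoop.mapreduce.Job    // new (mapreduce) API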
+1 vote
11 answers

hadoop fs -put command?

put syntax: put <localSrc> <dest>; copy syntax: copyFr ...READ MORE

answered Dec 7, 2018 in Big Data Hadoop by Aditya
18,432 views
0 votes
1 answer

Appending " to a string in Scala

1) Use the concat() function. Refer to the below ...READ MORE

answered Jul 23 in Apache Spark by Ritu
27 views
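To round out that last snippet: a double quote can be appended by escaping it as \" and using either + or String.concat, and triple-quoted literals avoid the escape entirely. A short illustration:

val withQuote1 = "hello" + "\""             // escape the quote and append with +
val withQuote2 = "hello".concat("\"")       // or use String.concat
val withQuote3 = """he said "hi" to me"""   // triple quotes allow a literal " without escaping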