Error: value textfile is not a member of org.apache.spark.SparkContext

+1 vote

While creating an RDD from an external file source, I got this error. Can anyone help me resolve it?

Jul 4, 2019 in Apache Spark by amrita
3,932 views

1 answer to this question.

+1 vote

Hi,

Regarding this error, you just need a small syntax fix: the method name is textFile (with a capital F), not textfile. Use it as shown below:

scala> var test = sc.textFile("hdfs://localhost:9000/example/sample")

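As a quick sanity check, here is a minimal sketch (assuming a readable file really exists at that HDFS path): run an action on the RDD so Spark actually evaluates it.

scala> test.count()    // triggers a job and returns the number of lines
scala> test.first()    // returns the first line of the file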

answered Jul 4, 2019 by Gitika
• 65,910 points

Related Questions In Apache Spark

0 votes
2 answers

Error: split value is not a member of org.apache.spark.sql.Row

var d=rdd2col.rdd.map(x=>x.split(",")) or val names=rd ...READ MORE

answered Aug 5, 2020 in Apache Spark by Ramkumar Ramasamy.
11,105 views
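A minimal sketch of the usual fix, reusing the rdd2col name from the preview above and assuming its first column is a String: a Row has no split method, so extract the String from the Row (or flatten the Row) before splitting.

scala> val d = rdd2col.rdd.map(row => row.getString(0).split(","))       // split one String column
scala> val names = rdd2col.rdd.map(row => row.mkString(",").split(","))  // or flatten the whole Row first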
0 votes
1 answer

Error: split value is not a member of org.apache.spark.sql.Row

spark.read.csv is used when loading into a ...READ MORE

answered Jul 22, 2019 in Apache Spark by Firoz
2,796 views
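To complete the thought above (the path and the header option below are placeholders): spark.read.csv loads the file straight into a DataFrame, so there is no need to split raw lines by hand.

scala> val df = spark.read.option("header", "true").csv("hdfs://localhost:9000/example/sample.csv")
scala> df.printSchema()   // columns arrive as DataFrame fields, not one raw String per line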
0 votes
1 answer

Scala: error: value unary_+ is not a member of (Int, Int)

All prefix operators' symbols are predefined: +, -, ...READ MORE

answered Jul 22, 2019 in Apache Spark by karan
2,911 views
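For illustration, a small sketch with a hypothetical Point class: the prefix operators +, -, ! and ~ desugar to methods named unary_+, unary_-, unary_! and unary_~, and a tuple such as (Int, Int) defines none of them, hence the error.

case class Point(x: Int, y: Int) {
  def unary_+ : Point = this              // enables +Point(1, 2)
  def unary_- : Point = Point(-x, -y)     // enables -Point(1, 2)
}
val p = -Point(1, 2)                      // Point(-1, -2)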
0 votes
1 answer

Scala: 30: error: value partitions is not a member of String

Try this code: val rdd = sc.textFile("file.txt", 5); rdd.partitions.size Output ...READ MORE

answered Jul 29, 2019 in Apache Spark by Nijit
2,779 views
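Spelled out as a tiny spark-shell sketch (file.txt is a placeholder path): partitions is defined on the RDD, not on the path String, so build the RDD first and then ask for its partition count.

scala> val rdd = sc.textFile("file.txt", 5)   // 5 is the minimum number of partitions
scala> rdd.partitions.size                    // actual number of partitions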
+1 vote
1 answer

Hadoop MapReduce word count program

Firstly you need to understand the concept ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 11,380 points
10,597 views
0 votes
1 answer

hadoop.mapred vs hadoop.mapreduce?

org.apache.hadoop.mapred is the old API; org.apache.hadoop.mapreduce is the ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 11,380 points
2,203 views
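As a concrete illustration (JobConf and Job are just two well-known classes from each package): the old API lives under org.apache.hadoop.mapred, the new one under org.apache.hadoop.mapreduce.

import org.apache.hadoop.mapred.JobConf      // old (mapred) API
import org.apache.hadoop.mapreduce.Job       // new (mapreduce) API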
+2 votes
11 answers

hadoop fs -put command?

Hi, You can create one directory in HDFS ...READ MORE

answered Mar 16, 2018 in Big Data Hadoop by nitinrawat895
• 11,380 points
104,691 views
0 votes
1 answer

What is RDD in Apache Spark?

Hi, RDD in Spark stands for Resilient Distributed ...READ MORE

answered Jul 1, 2019 in Apache Spark by Gitika
• 65,910 points
1,163 views
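A minimal spark-shell sketch of the idea: an RDD can be built from an in-memory collection (or from a file, as in the question at the top of this page) and transformed lazily until an action is called.

scala> val nums = sc.parallelize(1 to 5)    // distribute a local collection
scala> val doubled = nums.map(_ * 2)        // lazy transformation
scala> doubled.collect()                    // Array(2, 4, 6, 8, 10)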