Error: value textFile is not a member of org.apache.spark.SparkContext
While creating an RDD from an external file source, I got this error. Can anyone help me resolve it? This is the command I used, and the error I got:
Hi,
Regarding this error, you just need to fix a small syntax issue, as shown below:
scala> val test = sc.textFile("hdfs://localhost:9000/example/sample")
You can see the output below:
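If it helps, here is a minimal standalone sketch of the same fix outside the spark-shell. It assumes a local master for testing and reuses the HDFS path from the answer above; adjust both to your own environment. The key point is that textFile is called on the SparkContext instance (sc), written as sc.textFile(...) with no stray space.

import org.apache.spark.{SparkConf, SparkContext}

object TextFileExample {
  def main(args: Array[String]): Unit = {
    // Assumption: run locally for testing; point master at your cluster otherwise.
    val conf = new SparkConf()
      .setAppName("TextFileExample")
      .setMaster("local[*]")
    val sc = new SparkContext(conf)

    // textFile is a method on the SparkContext instance, not on the
    // org.apache.spark.SparkContext companion object.
    val test = sc.textFile("hdfs://localhost:9000/example/sample")

    // Print a few lines to confirm the RDD was created.
    test.take(5).foreach(println)

    sc.stop()
  }
}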