How can I write a text file to HDFS (not from an RDD) in a Spark program?

0 votes
I am running a Spark job where I use saveAsTextFile to save my RDD. I can save the RDD both to my local filesystem and to HDFS on my cluster, and both work fine.

Now I also have to write some additional files that are generated during processing. At the moment I write them to the local filesystem, but I want to send them to my HDFS cluster instead.

Can anyone help me out?
May 29, 2018 in Apache Spark by code799
1,773 views

1 answer to this question.

0 votes

Yes, you can write a text file to HDFS directly from a Spark program using the Hadoop FileSystem API:

import java.io.BufferedOutputStream
import org.apache.hadoop.fs.{FileSystem, Path}

// "file" holds the target HDFS path as a String
val filesystem = FileSystem.get(sparkContext.hadoopConfiguration)
val output_stream = filesystem.create(new Path(file))
val buffered_output = new BufferedOutputStream(output_stream)
buffered_output.write("My Text".getBytes("UTF-8"))
buffered_output.close()

An important thing to note is that FSDataOutputStream (returned by filesystem.create) is a byte-oriented output stream, not a text output stream, which is why the string is converted to bytes with getBytes("UTF-8") before writing.
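If you would rather write text without converting strings to bytes yourself, you can wrap the stream in a character writer. Below is a minimal sketch under the same assumptions as above (sparkContext is your SparkContext, and the HDFS path shown is only an illustrative placeholder):

import java.io.{BufferedWriter, OutputStreamWriter}
import java.nio.charset.StandardCharsets
import org.apache.hadoop.fs.{FileSystem, Path}

// Hypothetical target path on HDFS; adjust it to your cluster layout
val hdfsPath = new Path("/user/example/notes.txt")

val fs = FileSystem.get(sparkContext.hadoopConfiguration)
val writer = new BufferedWriter(
  new OutputStreamWriter(fs.create(hdfsPath), StandardCharsets.UTF_8))
try {
  writer.write("My Text")  // the writer handles the character encoding
  writer.newLine()
} finally {
  writer.close()           // close() flushes and releases the file on HDFS
}

For the additional files that are already on your local filesystem, FileSystem also provides copyFromLocalFile, so something like fs.copyFromLocalFile(new Path("file:///tmp/extra.txt"), new Path("/user/example/extra.txt")) (paths again just placeholders) can push them to HDFS without rewriting their contents in your code.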

answered May 29, 2018 by Shubham
• 13,350 points
