How can I write a text file to HDFS, not from an RDD, in a Spark program?

0 votes
I am running a Spark job in which I use saveAsTextFile to save my RDD. I can save the RDD both to my local filesystem and to HDFS on my cluster, and both work fine.

Now I also have some additional files generated during processing, which I am currently writing to the local filesystem. I want to send them to my HDFS cluster.

Can anyone help me out?
May 29, 2018 in Apache Spark by code799
1,386 views

1 answer to this question.

0 votes

Yes, you can write a text file to HDFS directly from a Spark program by using the Hadoop FileSystem API:

import java.io.BufferedOutputStream
import org.apache.hadoop.fs.{FileSystem, Path}

// Obtain an HDFS handle from the Hadoop configuration that Spark already carries
val fileSystem = FileSystem.get(sparkContext.hadoopConfiguration)
val outputStream = fileSystem.create(new Path(file))   // `file` is the target HDFS path
val bufferedOutput = new BufferedOutputStream(outputStream)
bufferedOutput.write("My Text".getBytes("UTF-8"))
bufferedOutput.close()

An important thing to note is that fileSystem.create returns an FSDataOutputStream, which is a byte-oriented output stream, not a text output stream. That is why the example encodes the string with getBytes("UTF-8") before writing it.
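If you would rather write text directly, a minimal sketch (reusing the fileSystem and file names from the snippet above, so those are assumptions carried over from that code) is to wrap the byte stream in a character-oriented Writer:

import java.io.{BufferedWriter, OutputStreamWriter}
import java.nio.charset.StandardCharsets
import org.apache.hadoop.fs.Path

// Wrap the byte stream in a Writer so Strings and line breaks can be written directly
val writer = new BufferedWriter(
  new OutputStreamWriter(fileSystem.create(new Path(file)), StandardCharsets.UTF_8))
writer.write("My Text")
writer.newLine()
writer.close()

For the extra files you have already written to the local filesystem, one possible approach (a sketch, not the only option; both paths below are placeholders) is to copy them into HDFS with FileSystem.copyFromLocalFile:

import org.apache.hadoop.fs.{FileSystem, Path}

val fs = FileSystem.get(sparkContext.hadoopConfiguration)
// Copy a locally generated file into HDFS; replace both paths with your own
fs.copyFromLocalFile(new Path("/tmp/extra-output.txt"),          // local source (placeholder)
                     new Path("/user/hadoop/extra-output.txt"))  // HDFS destination (placeholder)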

answered May 29, 2018 by Shubham
• 13,300 points

Related Questions In Apache Spark

0 votes
1 answer

How to read data from a text file in Spark?

Hey, You can try this: from pyspark import SparkContext SparkContext.stop(sc) sc ...READ MORE

answered Aug 6 in Apache Spark by Gitika
• 25,340 points
156 views
0 votes
1 answer

How is an RDD in Spark different from Distributed Storage Management? Can anyone help me with this?

Some of the key differences between an RDD and ...READ MORE

answered Jul 26, 2018 in Apache Spark by zombie
• 3,690 points
162 views
0 votes
1 answer

Where can I get spark-terasort.jar (and not the .scala file) to run Spark TeraSort on Windows?

Hi! I found 2 links on github where ...READ MORE

answered Feb 13 in Apache Spark by Omkar
• 67,600 points
118 views

Hadoop MapReduce word count program

Firstly you need to understand the concept ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 10,690 points
3,034 views
0 votes
10 answers

hadoop fs -put command?

put syntax: put <localSrc> <dest> copy syntax: copyFr ...READ MORE

answered Dec 7, 2018 in Big Data Hadoop by Aditya
14,980 views
0 votes
1 answer

Hadoop dfs -ls command?

In your case there is no difference ...READ MORE

answered Mar 16, 2018 in Big Data Hadoop by kurt_cobain
• 9,260 points
1,114 views
0 votes
2 answers

In a Spark DataFrame how can I flatten the struct?

// Collect data from input avro file ...READ MORE

answered Jul 4 in Apache Spark by Dhara dhruve
1,043 views
0 votes
1 answer

How to save and retrieve the Spark RDD from HDFS?

You can save the RDD using saveAsObjectFile and saveAsTextFile method. ...READ MORE

answered May 29, 2018 in Apache Spark by Shubham
• 13,300 points
2,293 views