How to read records that span multiple lines each in Spark?

0 votes
I have a file in which each record spans multiple lines. I want to read each of these multi-line records as a single element in Spark. How can I do that?
Dec 12, 2018 in Big Data Hadoop by digger

1 answer to this question.

0 votes

Try this:

val new_records = sc.newAPIHadoopRDD(hadoopConf, classOf[NLineInputFormat], classOf[LongWritable], classOf[Text])
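
For context, here is a minimal, self-contained sketch of how that call is typically wired up with Hadoop's new-API NLineInputFormat. The input path, the number of lines per split, and the local master setting below are illustrative assumptions, not part of the original answer:

import org.apache.hadoop.fs.Path
import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.hadoop.mapreduce.Job
import org.apache.hadoop.mapreduce.lib.input.{FileInputFormat, NLineInputFormat}
import org.apache.spark.{SparkConf, SparkContext}

// Local master is only for testing; drop it when submitting to a cluster
val sc = new SparkContext(new SparkConf().setAppName("multiline-read").setMaster("local[*]"))

// Build a Hadoop Job object purely to hold the input configuration
val job = Job.getInstance(sc.hadoopConfiguration)
FileInputFormat.addInputPath(job, new Path("hdfs:///data/multiline_input"))  // assumed input path
NLineInputFormat.setNumLinesPerSplit(job, 4)                                 // assumed: 4 lines per split

val hadoopConf = job.getConfiguration

// Each record comes back as a (byte offset, line) pair wrapped in Hadoop writables
val new_records = sc.newAPIHadoopRDD(hadoopConf, classOf[NLineInputFormat], classOf[LongWritable], classOf[Text])

// Convert the Hadoop Text values to plain Strings before working with them
val lines = new_records.map { case (_, text) => text.toString }
lines.take(5).foreach(println)

Note that NLineInputFormat controls how many lines go into each input split rather than merging lines into one value. If you need each multi-line record to arrive as a single RDD element separated by a custom delimiter, setting textinputformat.record.delimiter in hadoopConf and reading with a plain TextInputFormat is a common alternative.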

answered Dec 12, 2018 by Omkar

