Spark Scala: Load a custom-delimited file

0 votes
I have a DAT file that is pipe (|) delimited. How can I load this custom-delimited file into a DataFrame?
Jul 16, 2019 in Apache Spark by Esha
8,954 views

1 answer to this question.

0 votes

You can load a pipe-delimited DAT file into a DataFrame using the command below:

val df = sqlContext.read.format("csv").option("delimiter", "|").load("emp_pipeline.DAT")
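On Spark 2.x and later you can do the same read through a SparkSession instead of the older SQLContext. Here is a minimal sketch; the session name, the file name, and the header/inferSchema settings are assumptions to adjust for your own data:

import org.apache.spark.sql.SparkSession

// Build (or reuse) a SparkSession - the entry point in Spark 2.x+
val spark = SparkSession.builder()
  .appName("LoadPipeDelimitedDat")
  .getOrCreate()

// Read the pipe-delimited DAT file with the CSV reader and a custom delimiter
val empDf = spark.read
  .format("csv")
  .option("delimiter", "|")      // fields are separated by the pipe character
  .option("header", "false")     // set to "true" if the first line contains column names
  .option("inferSchema", "true") // let Spark guess column types (does an extra pass over the data)
  .load("emp_pipeline.DAT")      // assumed file name from the question

empDf.show(5)                    // preview the first few rows

If the file has no header row, passing an explicit schema with .schema(...) instead of inferSchema avoids the extra pass over the data and gives you meaningful column names.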


Hope this helps!

To know more about Spark and Scala, it's recommended to join a Spark training course online.

Thanks!!

answered Jul 16, 2019 by Shri

Related Questions In Apache Spark

0 votes
1 answer

Where can I get spark-terasort.jar (and not a .scala file) to run Spark TeraSort on Windows?

Hi! I found 2 links on GitHub where ...READ MORE

answered Feb 13, 2019 in Apache Spark by Omkar
• 69,210 points
1,140 views
0 votes
1 answer

Load/save a text file in Spark

The reason you are able to load ...READ MORE

answered Jul 22, 2019 in Apache Spark by Giri
3,193 views
0 votes
1 answer

Load .xlsx files to Hive tables with Spark Scala

This should work: def readExcel(file: String): DataFrame = ...READ MORE

answered Jul 22, 2019 in Apache Spark by Kishan
4,055 views
+1 vote
1 answer

Hadoop MapReduce word count program

Firstly you need to understand the concept ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 11,380 points
10,601 views
0 votes
1 answer

hadoop.mapred vs hadoop.mapreduce?

org.apache.hadoop.mapred is the old API; org.apache.hadoop.mapreduce is the ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 11,380 points
2,207 views
+2 votes
11 answers

hadoop fs -put command?

Hi, You can create one directory in HDFS ...READ MORE

answered Mar 16, 2018 in Big Data Hadoop by nitinrawat895
• 11,380 points
104,772 views
0 votes
1 answer

Load custom delimited file in Spark

Refer to the following code: val sqlContext = ...READ MORE

answered Jul 24, 2019 in Apache Spark by Ritu
1,692 views
0 votes
1 answer

Scala join comma delimited file as tables

DataFrame creation commands: Now we will register them ...READ MORE

answered Jul 9, 2019 in Apache Spark by Suraj
745 views