Error while reading multiline Json

0 votes

Hi, 

I am getting the below error while reading JSON data:

scala> val peopleDF = spark.read.option("multiLine", "true").json("/user/edureka_311477/sampjson.json")
peopleDF: org.apache.spark.sql.DataFrame = [_corrupt_record: string]
May 23 in Apache Spark by Ritu
40 views

1 answer to this question.

0 votes

peopleDF: org.apache.spark.sql.DataFrame = [_corrupt_record: string]

What you are seeing is not an error message. It is just the REPL output confirming that your DataFrame was created; however, the JSON in the file is malformed, so Spark's JSON reader has put its contents into a single `_corrupt_record` column. Two things to check: first, the option key and value must be passed as strings, i.e. `option("multiLine", "true")`; second, for a multiline read the file must contain one valid JSON document (a single object, or an array of objects), not one JSON object per line.
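As a sketch of a working read (the file path and the sample records are illustrative, not from your data):

```scala
// sampjson.json — one valid JSON document spanning multiple lines:
// [
//   { "name": "Alice", "age": 28 },
//   { "name": "Bob",   "age": 31 }
// ]

val peopleDF = spark.read
  .option("multiLine", "true")          // key and value passed as strings
  .json("/tmp/sampjson.json")           // illustrative path

peopleDF.printSchema()
// With well-formed JSON, the schema shows the actual columns
// (here age and name) instead of a lone _corrupt_record column.
```

If you still see `_corrupt_record` after this, validate the file itself (for example with a JSON linter): a stray comma or unquoted key is enough to make Spark treat the whole document as corrupt.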


answered May 23 by Conny
