Error while reading multiline JSON

Hi, 

I am getting the below error while reading JSON data.

scala> val peopleDF = spark.read.option("multiLine", true).json("/user/edureka_311477/sampjson.json")
peopleDF: org.apache.spark.sql.DataFrame = [_corrupt_record: string]
May 23, 2019 in Apache Spark by Ritu

1 answer to this question.

peopleDF: org.apache.spark.sql.DataFrame = [_corrupt_record: string]

What you are getting above is not an error message. It simply shows that your DataFrame was created, but the JSON in the file is not well formed, so the JSON reader loads the content as a corrupt record (the _corrupt_record column).
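To see what was actually parsed, you can display the DataFrame; with multiline mode a malformed file typically ends up as one corrupt record. A minimal check, assuming the peopleDF from your question:

peopleDF.show(false)   // prints the raw text that landed in _corrupt_record, untruncated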

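To fix it, make sure the file is valid JSON (for multiline mode, typically a single JSON array or one well-formed object spanning several lines) and that the option name and path are passed as quoted strings. A minimal sketch, assuming file contents like the ones shown in the comments (the actual contents of your file are not shown in the question):

// /user/edureka_311477/sampjson.json
// [
//   { "name": "Ritu",  "age": 30 },
//   { "name": "Conny", "age": 25 }
// ]

val peopleDF = spark.read
  .option("multiLine", true)
  .json("/user/edureka_311477/sampjson.json")

peopleDF.printSchema()   // should now list age and name instead of _corrupt_record
peopleDF.show()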

answered May 23, 2019 by Conny
