Error reading Avro dataset in Spark

0 votes

I'm trying to read an Avro file into a Spark DataFrame, but I get the error below:

scala> var df = sqc.read.avro("sqoop/order")
<console>:28: error: value avro is not a member of org.apache.spark.sql.DataFrameReader
       var df = sqc.read.avro("sqoop/order")
                         ^

Tab completion after import com. does not even list a databricks package:

scala> import com.
amazonaws  clearspring  cloudera  codahale  esotericsoftware  facebook  fasterxml  github  google  jamesmurty  jcraft  jolbox  microsoft  ning  oracle  squareup  sun  thoughtworks  twitter  univocity  yammer

scala> import com.databrics.spark.com
<console>:25: error: object databrics is not a member of package com
       import com.databrics.spark.com
Feb 4, 2019 in Apache Spark by Rohit
2,281 views

1 answer to this question.

0 votes

To read Avro data, you need to include the spark-avro package provided by Databricks in your packages list when starting Spark; it is not bundled with Spark itself, which is why the avro method is missing. You can use this version:

com.databricks:spark-avro_2.11:3.2.0
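For example, a minimal way to pull it in is the --packages flag, which downloads the artifact from Maven when the shell starts (adjust the version to match your Spark/Scala build):

spark-shell --packages com.databricks:spark-avro_2.11:3.2.0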

After including the package, load the file as shown below:

val df = spark.read.format("com.databricks.spark.avro").load("sqoop/order")
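Alternatively, spark-avro ships implicits, so after importing them the read.avro style from your question works as well (a short sketch, reusing the sqc context from the question):

import com.databricks.spark.avro._   // note the spelling: databricks, not databrics
val df = sqc.read.avro("sqoop/order")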
answered Feb 4, 2019 by Omkar
• 69,220 points
