Error reading avro dataset in spark

0 votes

I'm trying to read an Avro file into a Spark DataFrame, but I'm getting the error below:

scala> var df = sqc.read.avro("sqoop/order")
<console>:28: error: value avro is not a member of org.apache.spark.sql.DataFrameReader
       var df = sqc.read.avro("sqoop/order")
                         ^

scala> import com.
amazonaws cloudera esotericsoftware fasterxml google jcraft microsoft oracle sun twitter yammer 
clearspring codahale facebook github jamesmurty jolbox ning squareup thoughtworks univocity 

scala> import com.databrics.spark.com
<console>:25: error: object databrics is not a member of package com
import com.databrics.spark.com
Feb 4 in Apache Spark by Rohit
351 views

1 answer to this question.

0 votes

For Avro, you need to include the spark-avro package provided by Databricks in the packages list when launching the shell. You can use this version:

com.databricks:spark-avro_2.11:3.2.0

After including the package, use the code shown below. Note that the "header" option applies to CSV sources, not Avro, so it can be dropped:

val df = spark.read.format("com.databricks.spark.avro").load("sqoop/order")
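One common way to pull the package in is to pass it on the spark-shell command line with --packages, which downloads it from Maven Central automatically. A minimal sketch (the _2.11 suffix and version are assumptions; they must match your Scala and Spark build):

```shell
# Launch spark-shell with the Databricks spark-avro package on the classpath.
# Spark resolves and downloads the artifact from Maven Central on startup.
spark-shell --packages com.databricks:spark-avro_2.11:3.2.0
```

Once the shell starts with the package resolved, spark.read.format("com.databricks.spark.avro").load(...) should work without the "value avro is not a member" error.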
answered Feb 4 by Omkar
• 67,660 points
