How to add package com.databricks.spark.avro in spark?

0 votes

Hi, I am trying to read an Avro file stored on HDFS, but I am getting an error. I tried to add the package. How do I read an Avro file in the web-console spark2-shell?
Code:

val df = sqlContext.read.format("com.databricks.spark.avro").load("hdfs://ip-20-0-21-161.ec2.internal:8020/user/edureka_315701/blogspot/category/part-m-00000.avro")
Jul 23 in Apache Spark by Jimmy
260 views

1 answer to this question.

0 votes

Start the Spark shell using the below command:

$ spark2-shell --packages com.databricks:spark-avro_2.11:4.0.0 
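
If the node cannot reach Maven Central to resolve the package, a rough alternative (assuming you have downloaded the spark-avro jar yourself; the local path below is hypothetical) is to pass it with --jars instead:

$ spark2-shell --jars /path/to/spark-avro_2.11-4.0.0.jar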

Now try the below line of code, changing the path to your exact path:

import com.databricks.spark.avro._

val df = spark.read.format("com.databricks.spark.avro").load("hdfs://nameservice1/user/edureka_315701/blogspot/category/part-m-00000.avro")  // Load the Avro file into a DataFrame
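
To confirm the load worked, you can inspect the schema and preview a few rows (a minimal sketch using standard DataFrame calls; the view name "category" is just an example):

df.printSchema()   // print the schema Spark inferred from the Avro file
df.show(5)         // preview the first five records

df.createOrReplaceTempView("category")              // register a temp view (name is arbitrary)
spark.sql("SELECT * FROM category LIMIT 5").show()  // query it with Spark SQL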

answered Jul 23 by Ritu
