Unresolved dependency issue on sbt package command

0 votes

Hi, after a long break I have started practicing Spark and Scala again.
I have written a word count program that runs successfully in Eclipse, but when I try to build it with sbt it fails and throws the exception below:

sbt.ResolveException: unresolved dependency: org.scala-lang#scala-library;2.11.8: not found
[error] unresolved dependency: org.apache.spark#spark-core;2.1.1: not found
[error] unresolved dependency: org.scala-lang#scala-compiler;2.11.8: not found

My Spark installation reports version 2.1.1 built with Scala version 2.11.8.
I have set the Scala version to 2.11.8 in my build, but it still fails. Please tell me how to solve this issue.

Jan 3 in Apache Spark by slayer
• 29,050 points

edited Jan 3 by Omkar • 225 views

1 answer to this question.

0 votes

First check that the machine has internet access; dependency resolution fails when sbt cannot reach the remote repositories.
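
If connectivity looks fine, it can also be worth declaring the repository explicitly so sbt definitely resolves against Maven Central. A minimal sketch to add to build.sbt (the resolver label "central" is just an arbitrary name; sbt already uses Maven Central by default, this only rules out a resolver misconfiguration):

// explicitly add Maven Central as a resolver
resolvers += "central" at "https://repo1.maven.org/maven2/"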

If you have an internet connection and the build still fails, try changing the Scala version, as in the file below.

build.sbt 

name := "WordcountFirstapp" 
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.1"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.1.1"


answered Jan 3 by Omkar
• 67,480 points
