Where can I get spark-terasort.jar (not the .scala file) to run Spark TeraSort on Windows?

0 votes

Hello, is there any way I could get a spark-terasort.jar file online in order to run Spark TeraSort? Everywhere I look only provides spark-terasort.scala, but I need a jar file to run TeraSort in Spark on Windows. It would be really helpful if you could help me with this.

Feb 12, 2019 in Apache Spark by anonymous
442 views

1 answer to this question.

0 votes
Hi!

I found two links on GitHub where spark-terasort is available:

https://github.com/ehiggs/spark-terasort

https://github.com/nexr/spark-terasort

I have not used them, so I don't know how good they are. If a pre-built jar isn't included, you can build one from the source (for example, with mvn package or sbt package) and then run it with spark-submit. Try them and let me know if it works.
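If you would rather put together a jar yourself, here is a minimal sketch of a TeraSort-style job in Scala. The object name SimpleTeraSort and the line-based record handling are my own assumptions, not the official spark-terasort implementation (the real one preserves the 100-byte binary TeraGen format via Hadoop input/output formats); it is only meant to show the shape of a job you could package with sbt or Maven and run with spark-submit on Windows.

```scala
// Minimal sketch of a TeraSort-style job (hypothetical class name, not the
// official spark-terasort code). Package it into a jar with `sbt package`
// or `mvn package`, then run the jar with spark-submit.
import org.apache.spark.sql.SparkSession

object SimpleTeraSort {
  def main(args: Array[String]): Unit = {
    // args(0): input directory, args(1): output directory
    val Array(input, output) = args

    val spark = SparkSession.builder()
      .appName("SimpleTeraSort")
      .getOrCreate()
    val sc = spark.sparkContext

    // TeraGen records are 100 bytes: a 10-byte key followed by a 90-byte value.
    // For this simple sketch we treat each line as text and split it the same way;
    // the real spark-terasort reads the binary format with Hadoop record readers.
    val records = sc.textFile(input)
      .map(line => (line.take(10), line.drop(10)))

    // Sort by key and write the sorted records back out.
    records.sortByKey()
      .map { case (key, value) => key + value }
      .saveAsTextFile(output)

    spark.stop()
  }
}
```

You would then run it with something like spark-submit --class SimpleTeraSort target\simple-terasort.jar <input-dir> <output-dir> (paths and jar name here are placeholders, adjust them to your build).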
answered Feb 13, 2019 by Omkar
• 69,030 points
