Is there any platform where we can submit a Spark application?

0 votes
I am looking for a platform where I can run Spark jobs on a cluster for practice purposes.

Also, where can I get Kafka streaming data? A sketch of the kind of job I have in mind is below.
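For reference, this is roughly the Kafka-to-Spark streaming job I would like to run (a minimal sketch in Scala; the broker address localhost:9092 and topic test-topic are placeholders, and the spark-sql-kafka-0-10 package has to be supplied at submit time):

import org.apache.spark.sql.SparkSession

object KafkaPracticeJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("KafkaPracticeJob")
      .getOrCreate()

    // Subscribe to a Kafka topic; broker and topic names are placeholders
    val stream = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092")
      .option("subscribe", "test-topic")
      .load()

    // Kafka delivers keys and values as binary, so cast the value to a string
    val messages = stream.selectExpr("CAST(value AS STRING)")

    // Print incoming messages to the console as they arrive
    messages.writeStream
      .format("console")
      .outputMode("append")
      .start()
      .awaitTermination()
  }
}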
May 12 in Apache Spark by anonymous
• 120 points
69 views

No answer to this question. Be the first to respond.


Related Questions In Apache Spark

0 votes
1 answer

Where can I get spark-terasort.jar (not the .scala file) to run Spark TeraSort on Windows?

Hi! I found 2 links on GitHub where ...READ MORE

answered Feb 13, 2019 in Apache Spark by Omkar
• 69,060 points
298 views
0 votes
1 answer

What do we mean by an RDD in Spark?

The full form of RDD is a ...READ MORE

answered Jun 18, 2018 in Apache Spark by nitinrawat895
• 10,920 points
845 views
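For a quick illustration of the concept (a sketch assuming a spark-shell session, where sc is the predefined SparkContext):

// An RDD (Resilient Distributed Dataset) is an immutable, partitioned
// collection that Spark can process in parallel across the cluster.
val rdd = sc.parallelize(Seq(1, 2, 3, 4))

// Transformations like map are lazy; collect() triggers the computation
println(rdd.map(_ * 2).collect().mkString(", "))  // 2, 4, 6, 8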
0 votes
1 answer

Spark YARN: Changing the maximum number of times to submit an application

By default, the maximum number of times ...READ MORE

answered Mar 28, 2019 in Apache Spark by Raj
466 views
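For context, the setting involved is spark.yarn.maxAppAttempts. A hedged sketch (the value 1 is only an example; in cluster mode this property normally has to be passed at submit time, e.g. --conf spark.yarn.maxAppAttempts=1, because driver code runs only after submission):

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Cap YARN application attempts at 1 so a failed application is not retried.
// The cluster-wide ceiling comes from yarn.resourcemanager.am.max-attempts.
val conf = new SparkConf().set("spark.yarn.maxAppAttempts", "1")
val spark = SparkSession.builder().config(conf).getOrCreate()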
0 votes
1 answer

How can we use the Spark shell for Scala without a cluster?

You can run the Spark shell for ...READ MORE

answered Apr 28, 2019 in Apache Spark by Giri
90 views
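As a hint at the approach (a sketch of one common way, not necessarily what the linked answer describes): Spark can run with an embedded local[*] master, so no cluster is needed.

import org.apache.spark.sql.SparkSession

// master("local[*]") runs Spark inside this JVM using all available cores,
// the same mode spark-shell uses when started without a cluster
val spark = SparkSession.builder()
  .appName("LocalPractice")
  .master("local[*]")
  .getOrCreate()

println(spark.range(10).count())  // 10
spark.stop()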
0 votes
1 answer

How can we iterate over a collection using the "foreach" function in Scala?

Hi, Yes, "foreach" function you use because it will ...READ MORE

answered Jul 5, 2019 in Apache Spark by Gitika
• 29,690 points
431 views
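A minimal example of the idiom (plain Scala, independent of the linked answer):

val nums = List(1, 2, 3)

// foreach applies the given function to every element for its side effect;
// unlike map, it returns Unit rather than a new collection
nums.foreach(n => println(n * n))  // prints 1, 4, 9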
+1 vote
2 answers

Spark: Can we add a column to a dataframe?

Yes, we can add columns to the ...READ MORE

answered Oct 24, 2019 in Apache Spark by Siva
• 160 points
760 views
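One common way to do this is withColumn (a sketch; the column name "country" and the literal value are placeholders):

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.lit

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

val df = Seq(("alice", 1), ("bob", 2)).toDF("name", "id")

// withColumn returns a new DataFrame with the extra column appended
val withCountry = df.withColumn("country", lit("US"))
withCountry.show()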
0 votes
1 answer

Where can I get the best Spark tutorials for beginners?

Hi @akhtar, there are lots of online courses available ...READ MORE

answered May 14 in Apache Spark by MD
• 24,500 points
74 views
0 votes
1 answer

Is there any way to check the Spark version?

There are 2 ways to check the ...READ MORE

answered Apr 19, 2018 in Apache Spark by nitinrawat895
• 10,920 points
2,727 views
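For quick reference, here are two one-liners (a sketch assuming a spark-shell session, where spark and sc are predefined; not necessarily the two ways the linked answer lists):

// Either the session or the context reports the running Spark version
println(spark.version)
println(sc.version)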
0 votes
1 answer

Spark: Kill a running application

You can copy the application ID from ...READ MORE

answered Apr 25, 2018 in Apache Spark by kurt_cobain
• 9,310 points
452 views
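A common approach on YARN (a sketch, not necessarily the linked answer's exact steps; assumes a spark-shell session where spark is predefined):

// Inside the job, the application ID is available from the SparkContext
val appId = spark.sparkContext.applicationId
println(appId)

// From a shell on the cluster, that ID can then be passed to YARN:
//   yarn application -kill <application_id>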