Monitoring a Spark application

0 votes
Hi. In Hadoop MapReduce, we used to run the jar file from the edge node. For Spark, on which node do we execute spark-submit?
1) How do we monitor the Spark application? Is there a UI?
2) If there is a UI, which node of the cluster does it represent?
Aug 9, 2019 in Apache Spark by Rishi
1,021 views

1 answer to this question.

0 votes

spark-submit jobs are also run from the client/edge node, but the execution environment depends on the --master and --deploy-mode parameters: if --deploy-mode is cluster, the driver part of the code is executed inside the Spark cluster, and if it is client, the driver runs on the edge node itself.
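For illustration, a sketch of the two invocation styles described above. The jar path, main class, and YARN master are placeholder assumptions, not taken from the question:

```shell
# Deploy-mode "cluster": the driver runs inside the cluster
# (e.g. in a YARN container), not on the edge node.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.MyApp \
  /path/to/my-app.jar

# Deploy-mode "client": the driver runs on the edge node itself,
# so its logs and the driver UI appear where you launched it.
spark-submit \
  --master yarn \
  --deploy-mode client \
  --class com.example.MyApp \
  /path/to/my-app.jar
```

In both cases the command itself is issued from the edge node; only where the driver process ends up differs.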

1. To monitor the Spark application, there is the Spark Web UI, from which you can see the application's current status as well as its history. The screenshot below illustrates this.

[Image: Spark Web UI screenshot]

2. The Spark Web UI shows the jobs running across all worker nodes as a whole; it does not single out a specific node, because Spark automatically decides which worker node each job's tasks are assigned to. Only the master node's logs have those details.
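As a quick reference for finding that UI, these are Spark's default endpoints; the hostnames are placeholders, and the ports are the stock defaults, which your cluster may have reconfigured:

```shell
# Driver UI for a currently running application (port 4040 by default;
# in client mode this is the edge node, in cluster mode the driver's host):
#   http://<driver-host>:4040

# Standalone master UI, showing the cluster and all applications:
#   http://<master-host>:8080

# History server for completed applications, if event logging is enabled:
#   http://<history-server-host>:18080
```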

answered Aug 9, 2019 by Umesh
