What is the command to check the number of cores in Spark?

0 votes

My spark.cores.max property is 24 and I have 3 worker nodes. Once I log into a worker node, I can see one process running that is consuming CPU, so I don't think it is using all 8 cores. How can I check the number of cores?

May 16, 2018 in Big Data Hadoop by code799
2,277 views

1 answer to this question.

0 votes

Go to your Spark Web UI (the master UI, by default on port 8080); the number of cores allocated to and used by each worker is shown in the Cores column of the Workers table.

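If you prefer the command line, you can also check the hardware core count directly on a worker node after logging in. A quick sketch using standard Linux tools:

```shell
# Number of logical cores available on this node
nproc

# More detail: sockets, cores per socket, threads per core
lscpu | grep -E '^(CPU\(s\)|Thread|Core|Socket)'
```

Comparing this against your spark.cores.max and spark.executor.cores settings tells you whether Spark has actually been granted all the cores on the machine.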

answered May 16, 2018 by Shubham
• 13,450 points
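You can also check from inside a Python process running on the node (for example, one launched by an executor) using only the standard library; a minimal sketch:

```python
import os

# Logical CPU count visible to this process (matches `nproc` on Linux)
cores = os.cpu_count()
print(f"Logical cores on this node: {cores}")
```

Note this reports what the OS exposes to the process, not what Spark has scheduled; compare it with the Cores column in the Web UI to spot a mismatch.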
