What is the command to check the number of cores in Spark?


My spark.cores.max property is set to 24 and I have 3 worker nodes. When I log into a worker node, I can see only one process running, and it is consuming CPU. I don't think it is using all 8 cores. How can I check the number of cores being used?

May 16, 2018 in Big Data Hadoop by code799

1 answer to this question.

Go to your Spark Web UI and you can see the number of cores for each worker there:

[Screenshot: Spark Web UI (master page) listing each worker with its core count]
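If you'd rather check from a shell than the Web UI, a quick sketch: run this small Python snippet on each worker node to confirm the physical core count the machine exposes (this shows what the OS reports, which is separate from what Spark is configured to use):

```python
import os

# Number of logical CPU cores visible on this machine.
# Run on each worker node to verify the 8 cores you expect per node.
cores = os.cpu_count()
print(f"Logical cores on this node: {cores}")
```

Inside a running Spark application you can also inspect `sc.defaultParallelism` and the `spark.cores.max` entry from `sc.getConf()`, which reflect the total cores the application is allowed to use across the cluster.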

answered May 16, 2018 by Shubham
