How to find the number of null values in a dataframe?

0 votes
May 3, 2019 in Apache Spark by anonymous
• 120 points

edited May 3, 2019 by Omkar

1 answer to this question.

0 votes

Hey there!

You can use the select method of the DataFrame, together with the count, when, and isnull functions, to count the null values in every column:

from pyspark.sql.functions import count, when, isnull

df.select([count(when(isnull(c), c)).alias(c) for c in df.columns]).show()

This will display a table with the column names and the number of null values in each column.

If you want to count the null values in a single column, you can use the code below:

from pyspark.sql.functions import col

df.where(col("<Enter column name here>").isNull()).count()
answered May 3, 2019 by Omkar
• 69,000 points

I am getting an error with this command and it says "illegal start of simple expression". Please help.

df.select([count(when(isnull(c), c)).alias(c) for c in df.columns]).show()
