Cannot resolve error in Spark when filtering records with two where conditions

+1 vote

SPARK 1.6, SCALA, MAVEN

I have created a DataFrame from an RDD and am trying to filter out all records where cola is null or an empty string and colb is 2 or 3.

I tried something like this:

df.filter($"cola in(null or '') where colb ='01' & '05'")

but I am getting a "cannot resolve" error. Can someone please help? Am I missing anything? Kindly suggest.

Oct 1, 2019 in Apache Spark by anonymous
• 130 points
2,713 views
Can you post the error logs? Cannot resolve could be for multiple reasons.

1 answer to this question.

+1 vote
Try

df.where($"cola".isNotNull && $"cola" =!= "" && !$"colb".isin(2,3))

Your syntax is incorrect: filter/where expects a Column expression (or a valid SQL expression string), not the mixed SQL/Scala string you passed.
answered Dec 13, 2019 by Alexandru
• 510 points

edited Dec 13, 2019 by Alexandru
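The Column expression above needs a live SparkSession to run, but the predicate logic it encodes can be sketched on a plain Scala collection. The Row case class and sample values below are hypothetical stand-ins for the question's cola/colb columns, not from the original post:

```scala
// Stand-in for a DataFrame row with the two columns from the question.
case class Row(cola: String, colb: Int)

val rows = Seq(
  Row(null, 2),    // dropped: cola is null
  Row("", 3),      // dropped: cola is empty
  Row("keep", 1),  // kept: non-null, non-empty, colb not in (2, 3)
  Row("gone", 2)   // dropped: colb is 2
)

// Same logic as df.where($"cola".isNotNull && $"cola" =!= "" && !$"colb".isin(2,3))
val kept = rows.filter(r =>
  r.cola != null && r.cola != "" && !Set(2, 3).contains(r.colb)
)

println(kept) // List(Row(keep,1))
```

Each `&&`-joined clause maps one-to-one onto a clause in the Spark Column expression, which is why the whole condition goes into a single where call rather than a SQL-style string.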
