Scala Spark SQL DataFrame filter issue: data type mismatch

My problem is that I have code that receives the filter column and its values as a list parameter. For example:

val myInnerList = List("entered_user", "2015-01-01", "2017-01-01")

and I filter with:

df2.filter("age IN ('0') AND" + df2(myInnerList(0)).between(myInnerList(1), myInnerList(2)))
but it throws this error:
org.apache.spark.sql.AnalysisException: cannot resolve '(`entered_user` >= ((2014 - 5) - 5))' due to data type mismatch: differing types in '(`entered_user` >= ((2014 - 5) - 5))' (date and int).; line 1 pos 18;
'Filter (cast(age#17 as string) IN (cast(0 as string)) AND ((entered_user#74 >= ((2014 - 5) - 5)) AND (entered_user#74 <= ((2016 - 10) - 10))))
The same query works when the filter column is anything other than a date,

and it also works fine when I use only one filter column (for example, only the date filter without age).
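For context, a likely cause: concatenating a Column onto a SQL string with `+` calls the Column's `toString`, so the date literals end up unquoted in the SQL text and get parsed as integer subtraction (`2015-01-01` becomes `(2015 - 1) - 1`). A minimal sketch that builds both predicates as Column expressions instead (`filterByAgeAndRange` is a hypothetical helper name, not from the original post):

```scala
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.col

// Build the whole filter from Column expressions rather than
// string-concatenating a Column onto a SQL fragment, so the
// date strings stay quoted literals instead of being parsed
// as arithmetic.
def filterByAgeAndRange(df: DataFrame, inner: List[String]): DataFrame =
  df.filter(
    col("age").isin("0") &&
    col(inner(0)).between(inner(1), inner(2))
  )

// Usage sketch:
// val filtered = filterByAgeAndRange(df2, List("entered_user", "2015-01-01", "2017-01-01"))
```

This is only a sketch of one way to combine the two predicates; `isin`, `between`, and `&&` are standard Spark Column operations.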

Any suggestions, please?
Mar 24, 2022 in Apache Spark by Hamza

edited Mar 4

No answer to this question yet.