My problem is that I have code that receives the filter column and its values as a list of parameters. Here is my code:
myInnerList = List("entered_user", "2015-01-01", "2017-01-01") for example
and I use:
df2.filter("age IN ('0') AND" + df2(myInnerList(0)).between(myInnerList(1), myInnerList(2)))
but it throws an error:
org.apache.spark.sql.AnalysisException: cannot resolve '(`entered_user` >= ((2014 - 5) - 5))' due to data type mismatch: differing types in '(`entered_user` >= ((2014 - 5) - 5))' (date and int).; line 1 pos 18;
'Filter (cast(age#17 as string) IN (cast(0 as string)) AND ((entered_user#74 >= ((2014 - 5) - 5)) AND (entered_user#74 <= ((2016 - 10) - 10))))
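For reference, here is a minimal self-contained sketch that reproduces this kind of error; the schema and the sample rows are simplified stand-ins for my real data, so the numbers in the failing expression differ slightly from the trace above:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.to_date

val spark = SparkSession.builder().master("local[*]").appName("filter-repro").getOrCreate()
import spark.implicits._

// Simplified stand-in for my real DataFrame: a string age column and a date column
val df2 = Seq(("0", "2015-06-15"), ("1", "2016-03-20"), ("0", "2018-02-01"))
  .toDF("age", "entered_user")
  .withColumn("entered_user", to_date($"entered_user"))

// The filter column name and the bounds arrive as a plain list of strings
val myInnerList = List("entered_user", "2015-01-01", "2017-01-01")

// This is the line that throws the AnalysisException
df2.filter("age IN ('0') AND" +
  df2(myInnerList(0)).between(myInnerList(1), myInnerList(2))).show()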
The same query works when I use another column instead of the date column, and it also works fine when I use only one filter column (for example, only the date without age); two working variants are shown below.
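Either of these alone runs without problems on the same DataFrame as in the sketch above:

// Works: the date range on its own, passed as a Column rather than spliced into a SQL string
df2.filter(df2(myInnerList(0)).between(myInnerList(1), myInnerList(2))).show()

// Works: the age condition on its own as a SQL string
df2.filter("age IN ('0')").show()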
Any suggestions, please?