Hive query data dates issue

0 votes

Hello All,

I am facing an issue. My project runs a daily ETL job, and once the job finishes, it pushes the data to another SQL server. Until last month, the data we pushed started from 2013, but from this month it starts from 2016. I will share my query here; it is confusing. If anyone has any idea, please let me know. Thanks in advance.

Query:

SELECT ACCOUNT.account,
         DENORM.account_id,
         DENORM.incident_number,
         DENORM.incident_id,
         DENORM.casenumber,
         DENORM.incident_type,
         DENORM.incident_status,
         DENORM.comm_pref_code,
         DENORM.complexity,
         DENORM.current_severity,
         DENORM.initial_severity,
         DENORM.max_severity,
         DENORM.bug_cnt,
         DENORM.outage,
         DENORM.initial_portfolio_name,
         DENORM.entry_channel,
         DENORM.creation_date,
         DENORM.closed_date,
         DENORM.current_serial_number,
         DENORM.router_node_name,
         DENORM.summary,
         DENORM.customer_ticket_number,
         DENORM.incident_contact_email,
         DENORM.problem_code,
         DENORM.resolution_code,
         DENORM.sr_create_pfg,
         DENORM.install_at_site_id,
         DENORM.solution_release,
         DENORM.nlp_status,
         DENORM.b2b_flag,
         DENORM.install_at_site_key,
         DENORM.portfolio_number,
         DENORM.portfolio_desc,
         DENORM.contact_party_name,
         DENORM.contact_details,
         DENORM.org_party_name,
         DENORM.cco_id,
         DENORM.contract_number,
         DENORM.contract_service_line,
         DENORM.contract_line_status,
         DENORM.coverage_template_desc,
         DENORM.contract_start_date,
         DENORM.contract_end_date,
         DENORM.contract_expire_date,
         DENORM.tech_name,
         DENORM.hw_part_number,
         DENORM.hw_family,
         DENORM.hw_platform,
         DENORM.hw_business_unit,
         DENORM.sw_part_number,
         DENORM.sw_version,
         DENORM.sw_part_type,
         DENORM.sw_business_unit,
         DENORM.sw_family,
         DENORM.producttable_item_name,
         DENORM.producttable_item_description,
         DENORM.producttable_business_unit,
         DENORM.producttable_family,
         DENORM.bl_last_update_date,
         DENORM.sub_tech_name,
         DENORM.change_done_by_cco_id
FROM   csp_tsbi.curated_input ACCOUNT
INNER JOIN service_request_transformed_tsbi.sr_denorm_incidents DENORM
       ON ACCOUNT.contract = DENORM.contract_number
WHERE  COALESCE(To_date(closed_date), To_date(From_unixtime(Unix_timestamp())))
       BETWEEN Date_sub(To_date(From_unixtime(Unix_timestamp())), 1095)
           AND To_date(From_unixtime(Unix_timestamp()))
Aug 21 in Big Data Hadoop by anonymous

edited Aug 22 by Omkar 47 views
Hi. The query looks fine. Have you checked whether the data for the years 2014 and 2015 has been generated?

Yes, it has been generated. If you don't mind, can you explain this part:

WHERE  COALESCE(To_date(closed_date), To_date(From_unixtime(Unix_timestamp())))
       BETWEEN Date_sub(To_date(From_unixtime(Unix_timestamp())), 1095)
           AND To_date(From_unixtime(Unix_timestamp()))

I couldn't figure out this condition; it is confusing. Thanks in advance.

I'll break down the query and explain.

  1. The COALESCE function returns the first non-NULL value. That means it selects closed_date if it has a value; if closed_date is NULL, it falls back to the current date (To_date(From_unixtime(Unix_timestamp()))).
  2. The date selected by that logic then has to fall within a range, which is the next condition, the BETWEEN clause.
  3. Date_sub(To_date(From_unixtime(Unix_timestamp())), 1095) is the date 1095 days ago, because 1095 days is equal to 3 years.
  4. To_date(From_unixtime(Unix_timestamp())) is the current date.
  5. So basically, the date value obtained in the first step must be between 3 years ago and today. Note that this is a rolling window: the lower bound moves forward every day, so rows whose closed_date is more than 3 years old drop out of the extract over time. That is likely why your output now starts around 2016 rather than 2013.
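To make the window logic concrete, here is a quick Python sketch (not Hive, just an illustration of the same COALESCE + BETWEEN filter); the pinned "today" and the sample dates are made up for the example:

```python
from datetime import date, timedelta

def passes_filter(closed_date, today=None):
    """Mimic the Hive WHERE clause:
    COALESCE(to_date(closed_date), today) BETWEEN today - 1095 days AND today.
    closed_date may be None, mirroring a NULL column value."""
    today = today or date.today()
    effective = closed_date if closed_date is not None else today  # COALESCE
    lower = today - timedelta(days=1095)                           # Date_sub(..., 1095)
    return lower <= effective <= today                             # BETWEEN

# Pin "today" to an arbitrary example date: a 2013 close date falls
# outside the 3-year window, while a 2017 one falls inside it.
today = date(2019, 8, 21)
print(passes_filter(date(2013, 5, 1), today))   # False: older than 1095 days
print(passes_filter(date(2017, 1, 1), today))   # True: within the window
print(passes_filter(None, today))               # True: NULL falls back to today
```

Because the window is computed from Unix_timestamp() at run time, each daily run shifts both bounds forward by one day; nothing in the query pins the start of the range to a fixed year.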
