Brief Introduction to Oozie


Dec 17, 2016

Oozie is a workflow scheduler system for managing Apache Hadoop jobs. It is integrated with the rest of the Hadoop stack and supports several types of Hadoop jobs, such as Java MapReduce, Streaming MapReduce, Pig, Hive and Sqoop. Oozie is a scalable, reliable and extensible system, used in production at Yahoo! to run more than 200,000 jobs every day.

Features of Oozie:

  • Execute and monitor workflows in Hadoop
  • Periodic scheduling of workflows (see the coordinator sketch after this list)
  • Trigger execution on data availability
  • HTTP and command-line interfaces, plus a web console
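
Periodic scheduling and data-triggered execution are expressed with Oozie coordinators rather than in the workflow itself. As a rough sketch only (the frequency, dates and application path below are made-up values), a coordinator that runs the word-count workflow once a day could look like this:

<coordinator-app name='wordcount-coord' frequency='${coord:days(1)}'
                 start='2016-12-01T00:00Z' end='2017-12-01T00:00Z'
                 timezone='UTC' xmlns='uri:oozie:coordinator:0.2'>
  <action>
    <workflow>
      <app-path>hdfs://bar.com:9000/usr/abc/wordcount</app-path>
    </workflow>
  </action>
</coordinator-app>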

Oozie Workflow – Directed Acyclic Graph of Jobs:

Oozie Workflow Example:


<!-- Word-count workflow: a single map-reduce action with ok/error transitions -->
<workflow-app name='wordcount-wf' xmlns='uri:oozie:workflow:0.1'>
  <start to='wordcount'/>
  <action name='wordcount'>
    <map-reduce>
      <job-tracker>foo.com:9001</job-tracker>
      <name-node>hdfs://bar.com:9000</name-node>
      <configuration>
        <property>
          <name>mapred.input.dir</name>
          <value>${inputDir}</value>
        </property>
        <property>
          <name>mapred.output.dir</name>
          <value>${outputDir}</value>
        </property>
      </configuration>
    </map-reduce>
    <ok to='end'/>
    <error to='kill'/>
  </action>
  <kill name='kill'>
    <message>Wordcount failed</message>
  </kill>
  <end name='end'/>
</workflow-app>

Workflow Definition:

A workflow definition is a DAG of control flow nodes and action nodes, where the nodes are connected by transition arrows.

Control Flow Nodes:     

Control flow nodes define the path that the workflow execution follows. Flow control operations within a workflow application are performed through the following nodes:

  • Start/end/kill – mark the entry point, normal end and error end of a workflow
  • Decision – choose an execution path based on a predicate
  • Fork/join – run several paths in parallel and wait for all of them to finish (see the sketch after this list)
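
As an illustration, a decision node and a fork/join pair might look like the sketch below; the node names, transition targets and the file-size predicate are invented for this example and are not part of the word-count workflow above:

<decision name='size-check'>
  <switch>
    <case to='big-wordcount'>${fs:fileSize(inputDir) gt 1073741824}</case>
    <default to='wordcount'/>
  </switch>
</decision>

<fork name='parallel-steps'>
  <path start='wordcount'/>
  <path start='cleanup-logs'/>
</fork>
<join name='parallel-done' to='end'/>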

Action Nodes:

  • Map-reduce – run a Java MapReduce job
  • Pig – run a Pig script (a Pig action is sketched after this list)
  • HDFS – file system operations such as move, delete and mkdir
  • Sub-workflow – run another Oozie workflow
  • Java – run custom Java code
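
For example, a Pig action follows the same pattern as the map-reduce action shown earlier; the script name and parameters here are illustrative:

<action name='pig-wordcount'>
  <pig>
    <job-tracker>foo.com:9001</job-tracker>
    <name-node>hdfs://bar.com:9000</name-node>
    <script>wordcount.pig</script>
    <param>INPUT=${inputDir}</param>
    <param>OUTPUT=${outputDir}</param>
  </pig>
  <ok to='end'/>
  <error to='kill'/>
</action>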

Oozie Workflow Application:

A workflow application is a self-contained bundle, deployed as a directory on HDFS, that includes the workflow definition and the files needed to run all of its actions. It contains the following files (a sample layout is sketched after the list):

  • Workflow definition – workflow.xml
  • Configuration file – config-default.xml
  • App files – lib/ directory with JAR and SO files
  • Pig scripts
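
Under these conventions, the deployed application directory might look roughly like this (file names other than workflow.xml and config-default.xml are illustrative):

wordcount-wf/
    workflow.xml
    config-default.xml
    lib/
        wordcount-mr.jar
        wordcount-native.so
    wordcount.pig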

Application Deployment:

$ hadoop fs -put wordcount-wf hdfs://bar.com:9000/usr/abc/wordcount
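
To confirm the copy, the deployed directory can be listed with the same paths used above:

$ hadoop fs -ls hdfs://bar.com:9000/usr/abc/wordcount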

Workflow Job Parameters:

$ cat job.properties
oozie.wf.application.path=hdfs://bar.com:9000/usr/abc/wordcount
inputDir=/usr/abc/input-data
outputDir=/usr/abc/output-data

Job Execution:

$ oozie job -run -config job.properties
job: 1-20090525161321-oozie-xyz-W
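
The returned ID can then be used to check the job's status. The -info option is part of the standard Oozie CLI; the command below assumes the OOZIE_URL environment variable points at the Oozie server (otherwise add -oozie <server-url>):

$ oozie job -info 1-20090525161321-oozie-xyz-W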

Got a question for us? Mention it in the comments section and we will get back to you.

