How can an Oozie workflow be parameterised?

0 votes
Can anyone explain how an Oozie workflow can be parameterised, with an example?
Jun 11 in Big Data Hadoop by Kishor
17 views

1 answer to this question.

0 votes

Hi,

Oozie workflows can be parameterized. The parameters come from a configuration file called a property file.

We can run several jobs from the same workflow definition simply by supplying different .properties files.

Suppose we want to change the JobTracker URL, the script name, or the value of a parameter. We can put those values in a config file (.properties) and pass it while running the workflow, as in the command sketched below.
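
A minimal sketch of submitting such a workflow with the Oozie command-line client; the Oozie server URL here is an assumption, and job.properties refers to the property file shown further below:

oozie job -oozie http://localhost:11000/oozie -config job.properties -run

Running the same workflow again with a different .properties file changes the parameter values without editing workflow.xml.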

Following is an example of a property file (job.properties) that we will use in our workflow example:

nameNode=hdfs://rootname
jobTracker=xyz.com:8088
script_name_external=hdfs_path_of_script/external.hive
script_name_orc=hdfs_path_of_script/orc.hive
script_name_copy=hdfs_path_of_script/Copydata.hive
database=database_name
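
Inside workflow.xml, these properties are referenced with ${} expressions. Below is a minimal sketch of a workflow with a single Hive action that uses the values above; the workflow name, node names, and schema versions are illustrative assumptions:

<workflow-app name="parameterized-hive-wf" xmlns="uri:oozie:workflow:0.4">
    <start to="create-external-table"/>
    <action name="create-external-table">
        <hive xmlns="uri:oozie:hive-action:0.2">
            <!-- values substituted from the property file at submission time -->
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <script>${script_name_external}</script>
            <!-- pass the database name as a parameter into the Hive script -->
            <param>database=${database}</param>
        </hive>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Hive action failed</message>
    </kill>
    <end name="end"/>
</workflow-app>

At submission time Oozie substitutes each ${...} placeholder with the matching value from the property file, so the workflow definition itself never needs to change.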

answered Jun 11 by Gitika
• 17,130 points

