While writing this post, I am assuming that you have:
- Installed Oozie on your Linux machine
- Installed Hadoop 0.20.1+
If not, I have covered Oozie installation in my previous post.
Steps to get an Oozie app running
Having started Hadoop and the Oozie server, follow the steps below to get a sample Oozie application running:
- If Oozie was installed using the Debian package, you can find the examples tar.gz at /etc/oozie/doc/oozie; otherwise it is located in the Oozie setup folder.
- Extract it; the resulting examples folder contains the apps, input-data and src sub-directories.
- Add the following properties to conf/core-site.xml of your Hadoop setup:
<property>
    <name>hadoop.proxyuser.oozie.hosts</name>
    <value>*</value>
</property>
<property>
    <name>hadoop.proxyuser.oozie.groups</name>
    <value>*</value>
</property>
- In order to run any of the apps, remember to edit the JobTracker and NameNode port numbers in the app's job.properties file, depending upon your Hadoop configuration.
The JobTracker port number is set in conf/mapred-site.xml,
and the NameNode port number is set in conf/core-site.xml.
Accordingly, replace 'JTPortNo' and 'NNPortNo' in job.properties as below:
oozie.wf.application.path=hdfs://localhost:NNPortNo/path_to_examples/apps/map-reduce
jobTracker=localhost:JTPortNo
nameNode=hdfs://localhost:NNPortNo
queueName=default
outputDir=map-reduce
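For reference, the relevant Hadoop entries look something like the following. The port values here are just common defaults I am assuming for illustration, not something this post prescribes, so use whatever your own files contain:
<!-- conf/core-site.xml: the NameNode address and port -->
<property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
</property>
<!-- conf/mapred-site.xml: the JobTracker address and port -->
<property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
</property>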
- Now it's time to copy the examples directory to HDFS. If there is already an examples directory in HDFS, you must delete it, else the files are not copied. Here's the copy command:
/path_to_hadoopdir/bin/hadoop fs -put /path_to_egdir/examples examples
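If a previous examples directory is already there, a command along these lines should remove it first (using the same hypothetical paths as above):
/path_to_hadoopdir/bin/hadoop fs -rmr examples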
To confirm, you can check whether the copy succeeded at
http://localhost:50070
- Run the following command to get the example running.
If Oozie was installed from the Debian package:
/usr/lib/oozie/bin/oozie job -oozie http://localhost:11000/oozie -config /path_to_egdir/examples/apps/map-reduce/job.properties -run
otherwise:
/path_to_oozie/bin/oozie job -oozie http://localhost:11000/oozie -config /path_to_egdir/examples/apps/map-reduce/job.properties -run
Note that you need to specify the local filesystem path to job.properties in the command, not an HDFS path.
If the application has started successfully, a job id is returned in response to the above command, something like this:
job: 14-20090525161321-oozie-tucu
If you have the web console installed, you can view the status of the job on
http://localhost:11000/oozie
otherwise the following command will do:
/path_to_oozie/bin/oozie job -oozie http://localhost:11000/oozie -info 14-20090525161321-oozie-tucu
That's it! You can apply the same steps to run any of the documented examples. And if things have not worked as smoothly as they should, my next post, on errors while installing and running Oozie, could be the answer.
oozie@ubuntu:/opt/oozie/bin$ ./oozie job -oozie http://localhost:8080/oozie -config /opt/oozie/examples/apps/map-reduce/job.properties -run
Error: HTTP error code: 500 : Internal Server Error
Any clue?
Could you please check if your conf/oozie-site.xml contains the following:
<property>
    <name>hadoop.proxyuser.oozie.hosts</name>
    <value>*</value>
</property>
<property>
    <name>hadoop.proxyuser.oozie.groups</name>
    <value>*</value>
</property>
Also, you are using port 8080, whereas the Oozie server runs on port 11000 by default. Have you made some changes there?
That is, do you have an entry similar to the one below in your conf/oozie-site.xml?
<property>
    <name>oozie.base.url</name>
    <value>http://localhost:8080/oozie</value>
    <description>Base Oozie URL.</description>
</property>
I emailed you the entire details of the problem. Thanks.
It seems to be a problem with the user and the group. So firstly, keep your "hadoop.proxyuser.oozie.hosts" and "hadoop.proxyuser.oozie.groups" properties set to * only. Next, I feel there is some problem in the job.properties file too. It could be something as simple as:
oozie.wf.application.path=hdfs://localhost:9000/user/${user.name}/examples/apps/map-reduce
jobTracker=localhost:9001
nameNode=hdfs://localhost:9000
queueName=default
outputDir=map-reduce
Specifying user.name as you have in your job.properties is not required. Moreover, setting it to "hadoop" is faulty; it should be "oozie". Also, you have specified the port numbers 54310 and 54311; is that what is set in conf/mapred-site.xml and conf/core-site.xml of your Hadoop setup?
Also try running the oozie server at port 11000 since this is the default port it uses. Do make the change in "oozie-site.xml".
See if these changes help.
Jayati
That didn't work, and I get the same type of errors as before. I have the correct port numbers in job.properties, as set in the Hadoop config files.
How is the user/group setup for Oozie and Hadoop done? I can successfully run map-reduce jobs in Hadoop (with hadoop as its own user/group). Does any permission or user need to be added to the Hadoop cluster?
No, we don't need any extra configuration in the Hadoop cluster. But one more check that came to my mind: do you have the following entry in conf/oozie-site.xml?
<property>
    <name>oozie.service.HadoopAccessorService.kerberos.enabled</name>
    <value>false</value>
</property>
See if this helps
Jayati
I downloaded the Cloudera version of Oozie as well as Hadoop, and it worked. But I am still troubleshooting the Apache version I have, to see if I can fix it. I'm seeing an AuthorizationException (permission issue) while running the job. The error messages are not very clear in the non-Cloudera version.
I am getting the following error:
Error: E0902 : E0902: Exception occured: [java.net.ConnectException: Call to localhost/127.0.0.1:54310 failed on connection exception: java.net.ConnectException: Connection refused]
With reference to this: http://www.mail-archive.com/oozie-users@incubator.apache.org/msg00243.html
I tried replacing localhost with 127.0.0.1 in the job.properties file. Now when I run the job through Oozie, it takes forever to return; in fact it does not even return a status, it just hangs.
Worked for me. Thanks Jayati!
Here is the fix:
1) Do this: https://github.com/yahoo/oozie/issues/710
2) Edit /home/cass-hadoop/oozie-3.0.2/examples/apps/map-reduce/job.properties:
nameNode=hdfs://localhost:54310
jobTracker=localhost:54311
queueName=default
examplesRoot=examples/
oozie.wf.application.path=${nameNode}/user/${user.name}/${examplesRoot}/apps/map-reduce
outputDir=map-reduce
Hi Yogesh,
When I run the Oozie example
/usr/lib/oozie/bin/oozie job -oozie http://localhost:11000/oozie -config Desktop/examples/apps/map-reduce/job.properties -run
I am getting this error:
Error: E0902 : E0902: Exception occured: [java.net.ConnectException: Call to localhost/127.0.0.1:54310 failed on connection exception: java.net.ConnectException: Connection refused]
How do I find the NameNode and JobTracker port numbers?
Please help.
Thanks,
Venu
By the way, the above worked for a normal Oozie installation, not Cloudera's.
oozie job -oozie http://localhost:11000/oozie -config /usr/local/hadoop/temp/examples/apps/map-reduce/job.properties -run
Error: E0902 : E0902: Exception occured: [java.io.IOException: Call to localhost/127.0.0.1:54310 failed on local exception: java.io.EOFException]
[root@avinash ~]# oozie job -oozie http://192.168.1.137:11000/oozie -config /usr/share/doc/oozie-2.3.2+27.2/examples/apps/map-reduce/job.properties -run
Error: HTTP error code: 500 : Internal Server Error
Did you resolve the problem?
root@ECH-DES-001:/opt/oozie/bin# sudo -u oozie /opt/oozie/bin/oozie job -oozie http://192.168.101.128:11000/oozie/ -config /opt/oozie/examples/apps/map-reduce/job.properties -run
Error: E0902 : E0902: Exception occured: [java.net.ConnectException: Call to localhost/127.0.0.1:54310 failed on connection exception: java.net.ConnectException: Connection refused]
Could you please check if your conf/oozie-site.xml contains the following:
<property>
    <name>hadoop.proxyuser.oozie.hosts</name>
    <value>*</value>
</property>
<property>
    <name>hadoop.proxyuser.oozie.groups</name>
    <value>*</value>
</property>
Hi Jayati,
I'm getting the below error while running an Oozie workflow.
Error: E0901 : E0901: Namenode [localhost:50070] not allowed, not in Oozies whitelist
job.properties file :
nameNode=hdfs://localhost:50070
jobTracker=localhost:50030
queueName=default
examplesRoot=Workflow1
oozie.wf.application.path=${nameNode}/user/${user.name}/${examplesRoot}
inputDir=/tmp/Hadoop.txt
outputDir=/tmp/OutputWF
-----------------
I have added the below entries to oozie-site.xml:
<property>
    <name>oozie.service.HadoopAccessorService.nameNode.whitelist</name>
    <value>localhost:50070</value>
</property>
<property>
    <name>oozie.service.HadoopAccessorService.jobTracker.whitelist</name>
    <value>localhost:50030</value>
</property>
------------------
I have placed a folder with workflow.xml and a lib folder on HDFS.
Can you suggest anything here?
Best Regards,
Sandeep
Hi Jayati,
I wanted to know if it is possible for an Oozie job to call other Oozie job(s). I want to have a master job which calls other jobs in Oozie.
-Rakesh
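Oozie's sub-workflow action is meant for exactly this: a master workflow can invoke other workflow applications as actions. A minimal sketch, where the action name and the child's HDFS path are hypothetical:
<action name="call-child">
    <sub-workflow>
        <!-- hypothetical path to an already-deployed child workflow app -->
        <app-path>${nameNode}/user/${user.name}/apps/child-wf</app-path>
        <!-- hand the master's configuration down to the child -->
        <propagate-configuration/>
    </sub-workflow>
    <ok to="end"/>
    <error to="fail"/>
</action>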
Hi, I was also facing the same problem and was unable to submit jobs. The problem was the hostname in the "/user/USER_NAME/examples/apps/map-reduce/job.properties" file.
I used the hostname "xxx_system" for Hadoop, not "localhost", and the IP addresses in /etc/hosts for "xxx_system" and "localhost" were different:
nameNode=hdfs://localhost:8020
jobTracker=localhost:8021
queueName=default
examplesRoot=examples
So after changing the hostname from "localhost" to "xxx_system", it worked for me:
nameNode=hdfs://xxx_system:8020
jobTracker=xxx_system:8021
queueName=default
examplesRoot=examples
I am running a map-reduce Oozie example and I get this error after submitting the job:
JA017: Error submitting launcher for action 'jobID'
Did you solve the issue? I have the same one.
Please mention the path to your job.properties.
I'm using Hadoop 0.21.0 and downloaded the Oozie tarball from Cloudera. Can you explain in detail where to change the job.properties file, and the steps after doing that?
ReplyDeleteI have tried
/home/oozie/oozie/bin/oozie job -oozie http://localhost:11000/oozie -config /home/hduser/Downloads/examples/apps/map-reduce/job.properties -run
Error: HTTP error code: 404 : Not Found
The job.properties file is located in the root directory of your app. The path "/home/hduser/Downloads/examples/apps/map-reduce/job.properties" seems correct.
DeleteHave you copied your application to hdfs?
You need to have the something like this in the job.properties file :
oozie.wf.application.path=hdfs://localhost:9000/hdfs_path_to_your_app
jobTracker=localhost:9001
nameNode=hdfs://localhost:9000
queueName=default
where 9001 and 9000 are the NameNode port number and JobTracker port number. Change it according to your hadoop configuration.
I have got this weird problem with my co-ordinator application.
====================================
Case 1 :
For :
start = "2012-09-07 13:00Z" end="2012-09-07 16:00Z" frequency="coord:hour(1)"
No of actions : 1 (expected is 3)
Nominal Times :
1) 2012-09-07 13:00Z (Two more are expected. 2012-09-07 14:00Z,2012-09-07 15:00Z)
====================================
====================================
Case 2 :
For :
start = "2012-09-07 13:00Z" end="2012-09-07 16:00Z" frequency="coord:minutes(10)"
No of actions : 6 (expected is 18)
Nominal Times :
1) 2012-09-07 13:00Z
2) 2012-09-07 13:10Z
3) 2012-09-07 13:20Z
4) 2012-09-07 13:30Z
5) 2012-09-07 13:40Z
6) 2012-09-07 13:50Z (12 more are expected. 2012-09-07 14:00Z,2012-09-07 14:10Z and so on..)
====================================
Generalization based on observation :
For any frequency from coord:minutes(1) to coord:minutes(59), the nominal times are calculated perfectly, but only up to one hour.
Please suggest if I am missing anything here. I am using Oozie 2.0, trying a basic coordinator app, which works fine for:
start = "2012-09-07 13:00Z" end = "2012-09-07 13:30Z" frequency = "coord:minutes(10)"
This seems very strange. Are you sure you have the 'oozie' file in the bin folder of your Oozie setup?
Yes, I'm sure it is there in the bin directory of my Oozie setup.
But even in that case, the error which you have reported should not have occurred.
ReplyDeleteHi Jayati,
I have a doubt: I am scheduling an Oozie job to run at a particular frequency, say 10 minutes, and running it for, say, 2 hours, so the output should contain 12 folders. The problem is that only the first run (the first 10 minutes) creates output at the specified folder; the remaining 11 runs are killed because the output path already exists. Can you please tell me how to change the output folder name dynamically? Hoping for a reply as soon as possible.
Did you try something like this:
${baseFsURI}/${YEAR}/${MONTH}/${DAY}/${HOUR}/${MINUTE}
See the use cases at https://github.com/yahoo/oozie/wiki/Oozie-Coord-Use-Cases
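For context, a sketch of how that template is usually wired up: an output dataset in the coordinator resolves to a different time-stamped directory for every nominal time, and the resolved path is handed to the workflow as its output property. All names below (baseFsURI, workflowAppPath, outputDir) are assumptions, not something from this thread:
<coordinator-app name="every-10-min" frequency="${coord:minutes(10)}"
                 start="2012-09-07T13:00Z" end="2012-09-07T15:00Z"
                 timezone="UTC" xmlns="uri:oozie:coordinator:0.1">
    <datasets>
        <dataset name="out" frequency="${coord:minutes(10)}"
                 initial-instance="2012-09-07T13:00Z" timezone="UTC">
            <uri-template>${baseFsURI}/${YEAR}/${MONTH}/${DAY}/${HOUR}/${MINUTE}</uri-template>
        </dataset>
    </datasets>
    <output-events>
        <data-out name="output" dataset="out"><instance>${coord:current(0)}</instance></data-out>
    </output-events>
    <action>
        <workflow>
            <app-path>${workflowAppPath}</app-path>
            <configuration>
                <property>
                    <!-- resolves to a new time-stamped directory on each run -->
                    <name>outputDir</name>
                    <value>${coord:dataOut('output')}</value>
                </property>
            </configuration>
        </workflow>
    </action>
</coordinator-app>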
Yes i have tried its not working
DeleteHi Jayati,
I am using the Oozie version from CDH3 and Hadoop version 1.0.3. I am able to start Oozie successfully, but running a job throws the following error:
Error: E0902 : E0902: Exception occured: [org.apache.hadoop.ipc.RPC$VersionMismatch: Protocol org.apache.hadoop.hdfs.protocol.ClientProtocol version mismatch. (client = 63, server = 61)]
Have you seen this before ? Your help would be appreciated. Thanks!
This seems to be a version mismatch problem with Hadoop and Oozie. Try using some other version of CDH3 distribution.
Deletein the fs action move you can move only whole directories? Is it possible to move only files in that dir?
ReplyDeleteYou can move files as well.
DeleteThis comment has been removed by the author.
ReplyDeletehi Jayati,
ReplyDeleteI am facing this problem, can you please help me.
2013-01-15 19:16:09,535 INFO CoordActionUpdateCommand:525 - USER[hadoop_oozie] GROUP[users] TOKEN[] APP[bds-wf] JOB[0000001-130115185934511-oozie-hado-W] ACTION[0000001-130115185934511-oozie-hado-W@mr-node] ENDED CoordActionUpdateCommand for wfId=0000001-130115185934511-oozie-hado-W, coord action is null
2013-01-15 19:16:17,824 WARN AuthenticationFilter:341 - AuthenticationToken ignored: AuthenticationToken expired
Thank you.
Hi Jayati
I am using oozie-3.3.0 (the latest version). When I run a job, e.g. "oozie job -oozie http://localhost:8080/oozie -config map-reduce-job.properties -run", I get the error "org.apache.oozie.oozieCLI" not found.
I am also unable to create the distro; I executed "./mkdistro.sh" but it says "BUILD FAILURE".
Also, oozie-3.3.0.tar.gz does not contain oozie.war by default.
Can you please help me figure out where I went wrong? My email id is grpraveen147@gmail.com
Hi Jayati,
When I run the Oozie example
./oozie job -oozie http://localhost:11000/oozie -config ../examples/apps/map-reduce/job.properties -run
I am getting the error:
Error: E0501 : E0501: Could not perform authorization operation, Call From hadoop-VirtualBox/127.0.1.2 to localhost:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Please help. Thank you.
Hi Jayati,
When I run the example MR job through Oozie, I am getting this error:
WARN MapReduceActionExecutor:542 - USER[hadoop] GROUP[-] TOKEN[] APP[map-reduce-wf] JOB[0000000-130429103619149-oozie-hado-W] ACTION[0000000-130429103619149-oozie-hado-W@mr-node] credentials is null for the action
Hi Jayati,
When I run the Oozie example
/usr/lib/oozie/bin/oozie job -oozie http://localhost:11000/oozie -config Desktop/examples/apps/map-reduce/job.properties -run
I am getting this error:
Error: E0902 : E0902: Exception occured: [java.net.ConnectException: Call to localhost/127.0.0.1:54310 failed on connection exception: java.net.ConnectException: Connection refused]
Please help.
Thanks,
Venu
Hi Jayati,
I'm trying to run the Oozie example but I keep getting the below error. Could you please help?
[mathon_k@nlabigrvr01 examples]$ oozie job -oozie http://localhost:11000/oozie -config coordinator.properties -run
Error: E0501 : E0501: Could not perform authorization operation, Call From nlabigrvr01.mtn.co.za/10.211.178.16 to localhost:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
-------------------------------------------------------------------------------------
To solve the problem, I tried setting these properties:
<property>
    <name>hadoop.proxyuser.myOwnUserName.hosts</name>
    <value>localhost</value>
</property>
<property>
    <name>hadoop.proxyuser.myOwnUserName.groups</name>
    <value>staff</value>
</property>
------------------------------------------------------------------------------------
Hi,
Could you please help?
I am running an Oozie workflow to delete all directories older than 5 days. HDFS receives new directories every hour.
Thanks
To do that I use:
<fs>
    <delete path="hdfs://localhost:8020/user/ma_username_k/*"/>
</fs>
I did that in workflow.xml but it is not working.
I put * because I want to delete files of any name that are older than 5 days.
Hi,
This seems to be a connectivity issue between Oozie and the Hadoop cluster.
Can you please check whether the port your Hadoop NameNode is running on is the one you have specified?
Jayati
Some more things to try for this error, from the Hadoop wiki; if you have not tried them already, please consider them:
1. Check that the hostname the client is using is correct.
2. Check that the IP address the client gets for the hostname is correct.
3. Check that there isn't an entry for your hostname mapped to 127.0.0.1 or 127.0.1.1 in /etc/hosts (Ubuntu is notorious for this).
4. Check that the port the client is using matches the one the server is offering the service on.
On the server, try a telnet to localhost to see if the port is open there.
On the client, try a telnet to the server to see if the port is accessible remotely.
5. Try connecting to the server/port from a different machine, to see if it is just the single client misbehaving.
6. If you are using a Hadoop-based product from a third party, including those from Cloudera, Hortonworks, Intel, EMC and others, please use the support channels provided by the vendor.
Jayati
Thank you very much, but the problem is not the connection in this case.
When I try the statement below, it works:
<fs>
    <delete path="hdfs://localhost:8020/user/ma_username_k"/>
</fs>
The above action deletes everything in HDFS under /user/ma_username_k.
But what I want is to delete only the oldest files, those older than 5 days. I do not want to specify the name of the actual file to delete; that is the reason I tried to use *.
Alternatively: delete 10 files every hour. Since you don't know the names of the files you are deleting, how do you make sure that 10 files are deleted every hour?
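For what it's worth, the fs action's delete takes a literal path and knows nothing about file age, so age-based cleanup is normally pushed into a script run outside the workflow (from cron, or via a shell action on Oozie versions that have one). A rough, untested sketch, assuming the hadoop CLI is on the PATH and GNU date is available; the directory is the one from this thread:
# Delete HDFS files under the given directory that are older than 5 days.
now=$(date +%s)
hadoop fs -ls /user/ma_username_k | grep '^-' | \
while read perm repl owner group size day time path; do
    ts=$(date -d "$day $time" +%s)               # file modification time, epoch seconds
    if [ $(( (now - ts) / 86400 )) -ge 5 ]; then
        hadoop fs -rmr "$path"                   # use 'fs -rm' on newer Hadoop releases
    fi
done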
Hi Jayati,
I am facing the below error when running the examples. Can you please help me with this?
Error: E0803 : E0803: IO error, The transaction has been rolled back. See the nested exceptions for details on the errors that occurred.
$./oozie job -oozie http://localhost:11000/oozie -config /home/hadoop/hadoop/oozie-3.3.1/examples/target/oozie-examples-3.3.1-examples/examples/apps/map-reduce/job.properties -run
Error: E0501 : E0501: Could not perform authorization operation, Call From java.net.UnknownHostException: hadoop-Satellite-C665: hadoop-Satellite-C665 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
job.properties
nameNode=hdfs://localhost:9000
jobTracker=localhost:9001
queueName=default
examplesRoot=examples
My /etc/hosts
127.0.0.1 localhost
#127.0.1.1 hadoop-Satellite-C665
#127.0.1.1 masternode
# The following lines are desirable for IPv6 capable hosts
::1 ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters
Jayati, waiting for your response.
If you want to mail me:
vinaykumar.shetty6@gmail.com
Hi Vinay,
Can you please help with the below issue?
I'm getting the below error while running an Oozie workflow.
Error: E0901 : E0901: Namenode [localhost:50070] not allowed, not in Oozies whitelist
job.properties file :
nameNode=hdfs://localhost:50070
jobTracker=localhost:50030
queueName=default
examplesRoot=Workflow1
oozie.wf.application.path=${nameNode}/user/${user.name}/${examplesRoot}
inputDir=/tmp/Hadoop.txt
outputDir=/tmp/OutputWF
Hello,
Can someone please help me out with the below issue?
Error: E0501 : E0501: Could not perform authorization operation, Failed on local exception: java.io.EOFException; Host Details : local host is: "localhost.localdomain/127.0.0.1"; destination host is: ""localhost":50000;
Hi Jayati, I am using the Cloudera QuickStart VM, but I am getting a 500 internal error.
[cloudera@localhost map-reduce]$ hadoop fs -rmr map-reduce
rmr: DEPRECATED: Please use 'rm -r' instead.
Moved: 'hdfs://localhost.localdomain:8020/user/cloudera/map-reduce' to trash at: hdfs://localhost.localdomain:8020/user/cloudera/.Trash/Current
[cloudera@localhost map-reduce]$ cat ~/map_reduce/job.properties
cat: /home/cloudera/map_reduce/job.properties: No such file or directory
[cloudera@localhost map-reduce]$ cat ~/map-reduce/job.properties
nameNode=localhost.localdomain:8020 # or use a remote-server url. eg: hdfs://abc.xyz.yahoo.com:8020
jobTracker=localhost.localdomain:8021 # or use a remote-server url. eg: abc.xyz.yahoo.com:50300
queueName=default
examplesRoot=map-reduce
oozie.wf.application.path=${nameNode}/user/cloudera/${examplesRoot}
inputDir=input-data
outputDir=map-reduce
[cloudera@localhost map-reduce]$ hadoop fs -put ~/map-reduce map-reduce
[cloudera@localhost map-reduce]$ oozie job -oozie http://localhost.localdomain:11000/oozie/ -config ~/map-reduce/job.properties -run
Error: HTTP error code: 500 : Internal Server Error
[cloudera@localhost map-reduce]$ oozie job -oozie 192.168.253.130:11000/oozie -config ~/map-reduce/job.properties -run
Error: IO_ERROR : java.net.MalformedURLException: no protocol: 192.168.253.130:11000/oozie/versions
[cloudera@localhost map-reduce]$ oozie job -oozie http://192.168.253.130:11000/oozie -config ~/map-reduce/job.properties -run
Error: HTTP error code: 500 : Internal Server Error
[cloudera@localhost map-reduce]$
My requirement is: whenever a file of a particular format (say "x") enters HDFS, it should automatically trigger a Java job. If the input file is of another format (say "y"), a Pig job should be triggered. If five such files of a particular format ("x" or "y") enter at a time, then five jobs should be triggered.
Triggering should not be based on frequency; instead, as soon as the file(s) enter HDFS, the jobs should be triggered. Is this possible in Oozie? If so, can you please suggest how?
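Oozie does not have a pure file-arrival trigger, but a coordinator with an input dataset dependency comes close: actions are materialized on a schedule, and each one then waits until its input instance actually exists before launching, so in effect the workflow fires when the file lands. A rough sketch of the dependency part, where every name, path and frequency is an assumption; a second coordinator watching the "y" pattern would launch the Pig workflow the same way:
<coordinator-app name="on-x-arrival" frequency="${coord:minutes(5)}"
                 start="2014-01-01T00:00Z" end="2015-01-01T00:00Z"
                 timezone="UTC" xmlns="uri:oozie:coordinator:0.1">
    <datasets>
        <dataset name="xfiles" frequency="${coord:minutes(5)}"
                 initial-instance="2014-01-01T00:00Z" timezone="UTC">
            <!-- hypothetical landing directory, one per 5-minute slot -->
            <uri-template>${nameNode}/landing/x/${YEAR}${MONTH}${DAY}${HOUR}${MINUTE}</uri-template>
            <done-flag>_SUCCESS</done-flag>
        </dataset>
    </datasets>
    <input-events>
        <data-in name="input" dataset="xfiles"><instance>${coord:current(0)}</instance></data-in>
    </input-events>
    <action>
        <workflow>
            <app-path>${nameNode}/user/${user.name}/apps/java-wf</app-path>
        </workflow>
    </action>
</coordinator-app>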
Oozie job failed:
2014-08-17 13:17:25,913 INFO BaseJobServlet:536 - USER[?] GROUP[users] TOKEN[-] APP[-] JOB[-] ACTION[-] AuthorizationException
org.apache.oozie.service.AuthorizationException: E0902: Exception occured: [java.io.IOException: failure to login]
at org.apache.oozie.service.AuthorizationService.authorizeForApp(AuthorizationService.java:320)
at org.apache.oozie.servlet.BaseJobServlet.checkAuthorizationForApp(BaseJobServlet.java:185)
at org.apache.oozie.servlet.BaseJobsServlet.doPost(BaseJobsServlet.java:89)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:637)
at org.apache.oozie.servlet.JsonRestServlet.service(JsonRestServlet.java:281)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:298)
at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:859)
at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:588)
at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
at java.lang.Thread.run(Thread.java:679)
Caused by: org.apache.oozie.service.HadoopAccessorException: E0902: Exception occured: [java.io.IOException: failure to login]
at org.apache.oozie.service.HadoopAccessorService.createFileSystem(HadoopAccessorService.java:144)
at org.apache.oozie.service.AuthorizationService.authorizeForApp(AuthorizationService.java:285)
... 17 more
My job.properties file:
nameNode=hdfs://localhost:50000
jobTracker=localhost:50001
queueName=default
oozie.wf.application.path=${nameNode}/user/${user.name}/mrwordcount_oozie
I have added this to core-site.xml:
<property>
    <name>hadoop.proxyuser.oozie.hosts</name>
    <value>*</value>
</property>
<property>
    <name>hadoop.proxyuser.oozie.groups</name>
    <value>*</value>
</property>
If I try to add the same two properties to oozie-site.xml, then the Oozie web URL goes down with a 404 HTTP error.
Please, can somebody help resolve this error:
Error: E0902 : E0902: Exception occured: [java.io.IOException: failure to login]
ubuntu@ubuntu:~/Pari/oozie_examples/mrwordcount_oozie$ oozie job -oozie http://localhost:11000/oozie -config job.properties -run
Error: E0902 : E0902: Exception occured: [java.io.IOException: failure to login]
ubuntu@ubuntu:~/Pari/oozie_examples/mrwordcount_oozie$ oozie job -oozie http://localhost:11000/oozie -config job.properties -run
Error: E0902 : E0902: Exception occured: [java.io.IOException: failure to login]
ubuntu@ubuntu:~/Pari/oozie_examples/mrwordcount_oozie$ oozie job -oozie http://localhost:11000/oozie -config job.properties -submit
Error: E0902 : E0902: Exception occured: [java.io.IOException: failure to login]
ubuntu@ubuntu:~/Pari/oozie_examples/mrwordcount_oozie$ oozie job -oozie http://localhost:11000/oozie -config job.properties -run
Error: E0902 : E0902: Exception occured: [java.io.IOException: failure to login]
Have you resolved it? If you, Guide me also..
DeleteThis comment has been removed by the author.
ReplyDelete