Although the Oozie installation steps are written up quite legibly and make it look like a matter of a few minutes, in my experience it's not that easy. So, to make it easier for you, one of my previous posts is a gist of all those docs that are meant to help you install Oozie. Here, in this post, I am providing solutions to the 5 most common errors that you might have unfortunately encountered.
Error 1 :
Cannot create /var/run/oozie/oozie.pid: Directory nonexistent
Solution :
Change the permissions of the run folder, as in:
sudo chmod -cR 777 ./run
sudo chown root:root -R /var/run/
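As the first comment on this post rightly warns, handing rwx on /var/run to the whole world is a security flaw. A narrower alternative (a sketch, assuming the daemon runs as a dedicated oozie user; substitute whichever user actually runs the Oozie server):
sudo mkdir -p /var/run/oozie          # the directory the pid file lives in
sudo chown oozie:oozie /var/run/oozie # "oozie" user/group is an assumption
sudo chmod 755 /var/run/oozie         # owner-only write, no 777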
Error 2 :
put: org.apache.hadoop.security.AccessControlException: Permission denied: user=jt, access=WRITE, inode="user":root:supergroup:rwxr-xr-x
Solution :
Add the following entry to your Hadoop setup's conf/hdfs-site.xml:
<property>
<name>dfs.permissions</name>
<value>false</value>
</property>
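Remember to restart the NameNode after editing hdfs-site.xml, or the new value won't be picked up. Also note that this switches permission checking off for the entire cluster; a gentler alternative (a sketch, run as the HDFS superuser, with jt being the user from the error message above) is to give that user a home directory it owns:
hadoop fs -mkdir /user/jt    # create the user's home directory in HDFS
hadoop fs -chown jt /user/jt # make jt its owner so writes succeed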
Error 3 :
put: org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot create directory /user/jt/examples. Name node is in safe mode.
Solution :
Use "hadoop dfsadmin -safemode leave" command to make the namenode leave safe mode forcefully.
Or use "hadoop dfsadmin -safemode wait" to block till NN leaves by itself.
If you need to get your cluster up and running quickly, you can manipulate the parameter dfs.namenode.threshold.percent.
If you set it to 0, NN will not enter safe mode.
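For reference, the entry goes into conf/hdfs-site.xml in the same style as under Error 2 (a sketch; 0 here means the NameNode never waits in safe mode on startup):
<property>
<name>dfs.safemode.threshold.pct</name>
<value>0</value>
</property>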
Error 4 :
E0902: Exception occured: [java.io.IOException: Call to localhost/127.0.0.1:9000 failed on local exception: java.io.EOFException]
Solution :
Check whether the ports of the JobTracker and NameNode are correctly set in the job.properties file of the application you are running.
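For reference, the relevant lines in the examples' job.properties look something like this (a sketch; the host and ports are assumptions and must match fs.default.name in core-site.xml and mapred.job.tracker in mapred-site.xml):
nameNode=hdfs://localhost:9000
jobTracker=localhost:9001
As one of the commenters below notes, using the machine's actual hostname instead of localhost can also make a difference.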
Error 5 :
Hadoop startup issue: hadoop fs commands not working and the datanode is not running
Solution :
localpath_to_hadoop_data_store/dfs/data/current/VERSION and localpath_to_hadoop_data_store/dfs/name/current/VERSION should carry the same ids; if they don't, change that of the datanode(s) to match.
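The id in question is the namespaceID recorded in those VERSION files; a quick way to compare them (a sketch, with localpath_to_hadoop_data_store standing for your dfs.name.dir / dfs.data.dir as above):
grep namespaceID localpath_to_hadoop_data_store/dfs/name/current/VERSION
grep namespaceID localpath_to_hadoop_data_store/dfs/data/current/VERSION
# if they differ, edit the datanode's VERSION to carry the namenode's namespaceID,
# then restart the datanode with bin/hadoop-daemon.sh start datanode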
If one of these was a point where you were stuck, I hope to have helped you. All the very best with Oozie!
You should never ever change the default permissions on the /var/run directory.
Much less give rwx to the whole world on the machine; that is a security flaw.
What you need to do is change the permissions on the specific directory used by Oozie, and only for the specific user that runs Oozie (if it is root, only give rwx access to the root user).
quite helpful :)
helped me a lot :)
Error: E0902 : E0902: Exception occured: [java.io.IOException: Call to localhost/127.0.0.1:54310 failed on local exception: java.io.EOFException]
The same error is repeated, despite all the changes.
Actually I'm using two users: oozie for Oozie, installed in /usr/lib, and hduser for Hadoop, installed in /usr/local.
So is there anything I should do to enable all the permissions?
please help!!!
Thank you very much! Helped me a lot!!
Error: HTTP error code: 500 : Internal Server Error
I am getting the above error when running the following command:
bin/oozie job -oozie http://localhost:11000/oozie -config examples/apps/map-reduce/job.properties -run
Can you please help me?
OK, the fix to my error is simple. Do not use localhost or a port number in examples/apps/map-reduce/job.properties; use the hostname. Do not ask me why or how, but the fix worked. The blog was useful. Thanks.
Can you publish the job.properties that you used for this?
Thanks
Abhijit
Glad that it worked for you!!!
While running a Hive script in Oozie I got a job id, but when I check in the web console it is showing as killed. When I check the job log it is showing the following error:
Launcher ERROR, reason: Main class [org.apache.oozie.action.hadoop.HiveMain], main() threw exception, java.io.IOException: Permission denied
I am using Cloudera 4.1.1 and Oozie 3.2.0.
please help me out
Hi, I am getting the following error while running the example:
Error: E0902 : E0902: Exception occured: [org.apache.hadoop.ipc.RemoteException: Unauthorized connection for super-user: hduser from IP 127.0.0.1]