Hadoop streaming job failed: "Unable to load realm info from SCDynamicStore" / "env: ruby\r: No such file or directory"


While running a Hadoop streaming job with Ruby mapper and reducer functions, I get the following error:

packageJobJar: [summarymapper.rb, wcreducer.rb, /var/lib/hadoop/hadoop-unjar6514686449101598265/] [] /var/folders/md/0ww65qrx1_n1nlhrr7hrs8d00000gn/t/streamjob9165241112855689376.jar tmpDir=null
14/06/25 19:54:35 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/06/25 19:54:35 WARN snappy.LoadSnappy: Snappy native library not loaded
14/06/25 19:54:35 INFO mapred.FileInputFormat: Total input paths to process : 1
14/06/25 19:54:35 INFO streaming.StreamJob: getLocalDirs(): [/var/lib/hadoop/mapred/local]
14/06/25 19:54:35 INFO streaming.StreamJob: Running job: job_201406251944_0005
14/06/25 19:54:35 INFO streaming.StreamJob: To kill this job, run:
14/06/25 19:54:35 INFO streaming.StreamJob: /Users/oladotunopasina/hadoop-1.2.1/libexec/../bin/hadoop job -Dmapred.job.tracker=localhost:8021 -kill job_201406251944_0005
14/06/25 19:54:35 INFO streaming.StreamJob: Tracking URL: http://localhost:50030/jobdetails.jsp?jobid=job_201406251944_0005
14/06/25 19:54:36 INFO streaming.StreamJob:  map 0%  reduce 0%
14/06/25 19:55:18 INFO streaming.StreamJob:  map 100%  reduce 100%
14/06/25 19:55:18 INFO streaming.StreamJob: To kill this job, run:
14/06/25 19:55:18 INFO streaming.StreamJob: /Users/oladotunopasina/hadoop-1.2.1/libexec/../bin/hadoop job -Dmapred.job.tracker=localhost:8021 -kill job_201406251944_0005
14/06/25 19:55:18 INFO streaming.StreamJob: Tracking URL: http://localhost:50030/jobdetails.jsp?jobid=job_201406251944_0005
14/06/25 19:55:18 ERROR streaming.StreamJob: Job not successful. Error: # of failed Map Tasks exceeded allowed limit. FailedCount: 1. LastFailedTask: task_201406251944_0005_m_000001
14/06/25 19:55:18 INFO streaming.StreamJob: killJob...
Streaming Command Failed!

On checking the log files produced, I see:

stderr logs
2014-06-25 19:54:38.332 java[8468:1003] Unable to load realm info from SCDynamicStore
env: ruby\r: No such file or directory
java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 127
    at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:362)
    at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:576)
    at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:135)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)
    at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:36)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:366)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:394)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
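The `env: ruby\r: No such file or directory` line is the real clue: the kernel reads the script's shebang (`#!/usr/bin/env ruby`) including a trailing Windows carriage return, so `env` looks for an interpreter literally named `ruby\r` and the subprocess exits with code 127. A minimal check for this (the helper name is mine, not from the question):

```ruby
# Returns true if a script's first line (the shebang) carries a hidden
# carriage return, which makes "env" search for "ruby\r" instead of "ruby".
def shebang_has_cr?(path)
  File.open(path, &:readline).include?("\r")
end
```

Running this against the mapper and reducer scripts shows whether they were saved with Windows (CRLF) line endings.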

I have tried the suggestions from the thread on Hadoop environment variables, but still have no success. Kindly help.

I solved the problem by re-saving the .rb files on my Mac. It seems the version I downloaded had been saved as a PC (Windows) file, so a hidden "\r" carriage-return character was present in the mapper and reducer scripts.
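Instead of re-saving by hand, the carriage returns can be stripped programmatically. A small sketch (the helper name is mine; the filenames are the mapper and reducer from the question's job jar listing):

```ruby
# Strip hidden Windows carriage returns ("\r") from a script file, in place,
# turning CRLF line endings into plain LF so the shebang resolves correctly.
def strip_crs(path)
  File.write(path, File.read(path).delete("\r"))
end

%w[summarymapper.rb wcreducer.rb].each { |f| strip_crs(f) if File.exist?(f) }
```

The same fix can be done from the shell with `tr -d '\r' < in.rb > out.rb` or the `dos2unix` utility, or by re-saving the files with Unix line endings in any editor.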

