Hive Hadoop task fails with an error — please help!

养猪 posted on 2013/07/22 17:43
I ran this Hive query: select * from nginx_log where remote_address='10.10.126.126';
The nginx log is a single file of a bit over 200 MB. The job has been running for 10 minutes and the map progress has stayed at 0% the whole time.

The error log is as follows:
2013-07-22 15:12:52,720 WARN org.apache.hadoop.conf.Configuration: /pvdata/hadoopdata/mapred/local/taskTracker/sohuplus/jobcache/job_201307111249_24350/job.xml:a attempt to override final parameter: mapred.child.java.opts;  Ignoring.
2013-07-22 15:12:52,723 WARN org.apache.hadoop.conf.Configuration: /pvdata/hadoopdata/mapred/local/taskTracker/sohuplus/jobcache/job_201307111249_24350/job.xml:a attempt to override final parameter: mapred.fairscheduler.poolnameproperty;  Ignoring.
2013-07-22 15:12:52,755 INFO org.apache.hadoop.util.NativeCodeLoader: Loaded the native-hadoop library
2013-07-22 15:12:52,881 INFO org.apache.hadoop.mapred.TaskRunner: Creating symlink: /pvdata/hadoopdata/mapred/local/taskTracker/distcache/-6884914047590614184_-1326683603_87560619/zw-hadoop-master/tmp/hive-sohuplus/hive_2013-07-22_15-11-34_591_4400134296393853099/-mr-10003/8cef1b0b-e699-4253-b361-57df0e58814e <- /pvdata/hadoopdata/mapred/local/taskTracker/sohuplus/jobcache/job_201307111249_24350/attempt_201307111249_24350_m_000000_0/work/HIVE_PLAN8cef1b0b-e699-4253-b361-57df0e58814e
2013-07-22 15:12:52,886 INFO org.apache.hadoop.filecache.TrackerDistributedCacheManager: Creating symlink: /pvdata/hadoopdata/mapred/local/taskTracker/sohuplus/jobcache/job_201307111249_24350/jars/job.jar <- /pvdata/hadoopdata/mapred/local/taskTracker/sohuplus/jobcache/job_201307111249_24350/attempt_201307111249_24350_m_000000_0/work/job.jar
2013-07-22 15:12:52,887 INFO org.apache.hadoop.filecache.TrackerDistributedCacheManager: Creating symlink: /pvdata/hadoopdata/mapred/local/taskTracker/sohuplus/jobcache/job_201307111249_24350/jars/.job.jar.crc <- /pvdata/hadoopdata/mapred/local/taskTracker/sohuplus/jobcache/job_201307111249_24350/attempt_201307111249_24350_m_000000_0/work/.job.jar.crc
2013-07-22 15:12:52,932 INFO org.apache.hadoop.metrics.jvm.JvmMetrics: Initializing JVM Metrics with processName=MAP, sessionId=
2013-07-22 15:12:52,990 WARN org.apache.hadoop.conf.Configuration: /pvdata/hadoopdata/mapred/local/taskTracker/sohuplus/jobcache/job_201307111249_24350/job.xml:a attempt to override final parameter: mapred.child.java.opts;  Ignoring.
2013-07-22 15:12:52,990 WARN org.apache.hadoop.conf.Configuration: /pvdata/hadoopdata/mapred/local/taskTracker/sohuplus/jobcache/job_201307111249_24350/job.xml:a attempt to override final parameter: dfs.hosts.exclude;  Ignoring.
2013-07-22 15:12:52,991 WARN org.apache.hadoop.conf.Configuration: /pvdata/hadoopdata/mapred/local/taskTracker/sohuplus/jobcache/job_201307111249_24350/job.xml:a attempt to override final parameter: mapred.fairscheduler.poolnameproperty;  Ignoring.
2013-07-22 15:12:53,265 INFO com.hadoop.compression.lzo.GPLNativeCodeLoader: Loaded native gpl library
2013-07-22 15:12:53,269 INFO com.hadoop.compression.lzo.LzoCodec: Successfully loaded & initialized native-lzo library [hadoop-lzo rev 4537f94556c9ba71ffca316514e0c0101f76b63b]
2013-07-22 15:12:53,491 INFO org.apache.hadoop.mapred.MapTask: numReduceTasks: 0
2013-07-22 15:12:53,507 INFO ExecMapper: maximum memory = 954466304
2013-07-22 15:12:53,507 INFO ExecMapper: conf classpath = [file:/pvdata/sohuhadoop/conf/, file:/opt/java/jdk1.6.0_26/lib/tools.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/hadoop-core-0.20.2-cdh3u1.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/ant-contrib-1.0b3.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/aspectjrt-1.6.5.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/aspectjtools-1.6.5.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/commons-beanutils-1.8.0.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/commons-cli-1.2.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/commons-codec-1.4.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/commons-collections-3.2.1.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/commons-daemon-1.0.1.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/commons-el-1.0.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/commons-httpclient-3.0.1.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/commons-lang-2.4.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/commons-logging-1.0.4.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/commons-logging-api-1.0.4.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/commons-net-1.4.1.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/core-3.1.1.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/guava-r06.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/hadoop-fairscheduler-0.20.2-cdh3u1.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/hadoop-lzo-0.4.12.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/hsqldb-1.8.0.10.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/jackson-core-asl-1.5.2.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/jackson-mapper-asl-1.5.2.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/jasper-compiler-5.5.12.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/jasper-runtime-5.5.12.jar, 
file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/jets3t-0.6.1.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/jetty-6.1.26.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/jetty-servlet-tester-6.1.26.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/jetty-util-6.1.26.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/jsch-0.1.42.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/json-lib-2.3.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/junit-4.5.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/kfs-0.2.2.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/libfb303.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/libthrift.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/log4j-1.2.15.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/mockito-all-1.8.2.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/mysql-connector-java-5.0.8-bin.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/oro-2.0.8.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/servlet-api-2.5-20081211.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/servlet-api-2.5-6.1.14.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/slf4j-api-1.4.3.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/slf4j-log4j12-1.4.3.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/xmlenc-0.52.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/zookeeper-3.3.3-cdh3u1.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/jsp-2.1/jsp-2.1.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/jsp-2.1/jsp-api-2.1.jar, file:/pvdata/hadoopdata/mapred/local/taskTracker/sohuplus/jobcache/job_201307111249_24350/jars/classes, file:/pvdata/hadoopdata/mapred/local/taskTracker/sohuplus/jobcache/job_201307111249_24350/jars/job.jar, 
file:/pvdata/hadoopdata/mapred/local/taskTracker/sohuplus/distcache/3713554128706075054_2102202947_87562139/zw-hadoop-master./pvdata/hadoopdata/tmp/hadoop-hadoopmc/mapred/staging/sohuplus/.staging/job_201307111249_24350/libjars/hive-contrib-0.7.1-cdh3u1.jar, file:/pvdata/hadoopdata/mapred/local/taskTracker/sohuplus/jobcache/job_201307111249_24350/attempt_201307111249_24350_m_000000_0/work/]
2013-07-22 15:12:53,508 INFO ExecMapper: thread classpath = [file:/pvdata/sohuhadoop/conf/, file:/opt/java/jdk1.6.0_26/lib/tools.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/hadoop-core-0.20.2-cdh3u1.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/ant-contrib-1.0b3.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/aspectjrt-1.6.5.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/aspectjtools-1.6.5.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/commons-beanutils-1.8.0.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/commons-cli-1.2.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/commons-codec-1.4.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/commons-collections-3.2.1.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/commons-daemon-1.0.1.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/commons-el-1.0.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/commons-httpclient-3.0.1.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/commons-lang-2.4.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/commons-logging-1.0.4.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/commons-logging-api-1.0.4.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/commons-net-1.4.1.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/core-3.1.1.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/guava-r06.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/hadoop-fairscheduler-0.20.2-cdh3u1.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/hadoop-lzo-0.4.12.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/hsqldb-1.8.0.10.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/jackson-core-asl-1.5.2.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/jackson-mapper-asl-1.5.2.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/jasper-compiler-5.5.12.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/jasper-runtime-5.5.12.jar, 
file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/jets3t-0.6.1.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/jetty-6.1.26.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/jetty-servlet-tester-6.1.26.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/jetty-util-6.1.26.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/jsch-0.1.42.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/json-lib-2.3.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/junit-4.5.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/kfs-0.2.2.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/libfb303.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/libthrift.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/log4j-1.2.15.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/mockito-all-1.8.2.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/mysql-connector-java-5.0.8-bin.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/oro-2.0.8.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/servlet-api-2.5-20081211.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/servlet-api-2.5-6.1.14.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/slf4j-api-1.4.3.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/slf4j-log4j12-1.4.3.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/xmlenc-0.52.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/zookeeper-3.3.3-cdh3u1.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/jsp-2.1/jsp-2.1.jar, file:/pvdata/sohuhadoop/hadoop-0.20.2-cdh3u1/lib/jsp-2.1/jsp-api-2.1.jar, file:/pvdata/hadoopdata/mapred/local/taskTracker/sohuplus/jobcache/job_201307111249_24350/jars/classes, file:/pvdata/hadoopdata/mapred/local/taskTracker/sohuplus/jobcache/job_201307111249_24350/jars/job.jar, 
file:/pvdata/hadoopdata/mapred/local/taskTracker/sohuplus/distcache/3713554128706075054_2102202947_87562139/zw-hadoop-master./pvdata/hadoopdata/tmp/hadoop-hadoopmc/mapred/staging/sohuplus/.staging/job_201307111249_24350/libjars/hive-contrib-0.7.1-cdh3u1.jar, file:/pvdata/hadoopdata/mapred/local/taskTracker/sohuplus/jobcache/job_201307111249_24350/attempt_201307111249_24350_m_000000_0/work/]
2013-07-22 15:12:53,591 INFO org.apache.hadoop.hive.ql.exec.MapOperator: Adding alias nginx_log to work list for file hdfs://zw-hadoop-master:9000/user/sohuplus/hive/warehouse/nginx_log
2013-07-22 15:12:53,593 INFO org.apache.hadoop.hive.ql.exec.MapOperator: dump TS struct<passport_sid:string,userid:string,x_forwarded_for:string,remote_address:string,host:string,space:string,remote_user:string,time:string,request:string,status:string,body_bytes_sent:string,referer:string,user_agent:string,response_time:string>
2013-07-22 15:12:53,593 INFO ExecMapper: 
<MAP>Id =3
  <Children>
    <TS>Id =0
      <Children>
        <SEL>Id =1
          <Children>
            <FS>Id =2
              <Parent>Id = 1 null<\Parent>
            <\FS>
          <\Children>
          <Parent>Id = 0 null<\Parent>
        <\SEL>
      <\Children>
      <Parent>Id = 3 null<\Parent>
    <\TS>
  <\Children>
<\MAP>
2013-07-22 15:12:53,593 INFO org.apache.hadoop.hive.ql.exec.MapOperator: Initializing Self 3 MAP
2013-07-22 15:12:53,594 INFO org.apache.hadoop.hive.ql.exec.TableScanOperator: Initializing Self 0 TS
2013-07-22 15:12:53,594 INFO org.apache.hadoop.hive.ql.exec.TableScanOperator: Operator 0 TS initialized
2013-07-22 15:12:53,594 INFO org.apache.hadoop.hive.ql.exec.TableScanOperator: Initializing children of 0 TS
2013-07-22 15:12:53,594 INFO org.apache.hadoop.hive.ql.exec.SelectOperator: Initializing child 1 SEL
2013-07-22 15:12:53,594 INFO org.apache.hadoop.hive.ql.exec.SelectOperator: Initializing Self 1 SEL
2013-07-22 15:12:53,597 INFO org.apache.hadoop.hive.ql.exec.SelectOperator: SELECT struct<passport_sid:string,userid:string,x_forwarded_for:string,remote_address:string,host:string,space:string,remote_user:string,time:string,request:string,status:string,body_bytes_sent:string,referer:string,user_agent:string,response_time:string>
2013-07-22 15:12:53,597 INFO org.apache.hadoop.hive.ql.exec.SelectOperator: Operator 1 SEL initialized
2013-07-22 15:12:53,597 INFO org.apache.hadoop.hive.ql.exec.SelectOperator: Initializing children of 1 SEL
2013-07-22 15:12:53,597 INFO org.apache.hadoop.hive.ql.exec.FileSinkOperator: Initializing child 2 FS
2013-07-22 15:12:53,597 INFO org.apache.hadoop.hive.ql.exec.FileSinkOperator: Initializing Self 2 FS
2013-07-22 15:12:53,621 INFO org.apache.hadoop.hive.ql.exec.FileSinkOperator: Operator 2 FS initialized
2013-07-22 15:12:53,621 INFO org.apache.hadoop.hive.ql.exec.FileSinkOperator: Initialization Done 2 FS
2013-07-22 15:12:53,621 INFO org.apache.hadoop.hive.ql.exec.SelectOperator: Initialization Done 1 SEL
2013-07-22 15:12:53,621 INFO org.apache.hadoop.hive.ql.exec.TableScanOperator: Initialization Done 0 TS
2013-07-22 15:12:53,621 INFO org.apache.hadoop.hive.ql.exec.MapOperator: Initialization Done 3 MAP
2013-07-22 15:12:53,626 INFO org.apache.hadoop.hive.ql.exec.MapOperator: Processing path /user/sohuplus/hive/warehouse/nginx_log/access.log
2013-07-22 15:12:53,626 INFO org.apache.hadoop.hive.ql.exec.MapOperator: Processing alias nginx_log for file hdfs://zw-hadoop-master:9000/user/sohuplus/hive/warehouse/nginx_log
2013-07-22 15:12:53,628 INFO org.apache.hadoop.hive.ql.exec.MapOperator: 3 forwarding 1 rows
2013-07-22 15:12:53,628 INFO org.apache.hadoop.hive.ql.exec.TableScanOperator: 0 forwarding 1 rows
2013-07-22 15:12:53,628 INFO org.apache.hadoop.hive.ql.exec.SelectOperator: 1 forwarding 1 rows
2013-07-22 15:12:53,628 INFO org.apache.hadoop.hive.ql.exec.FileSinkOperator: Final Path: FS hdfs://zw-hadoop-master:9000/tmp/hive-sohuplus/hive_2013-07-22_15-11-34_591_4400134296393853099/_tmp.-ext-10001/000000_0
2013-07-22 15:12:53,629 INFO org.apache.hadoop.hive.ql.exec.FileSinkOperator: Writing to temp file: FS hdfs://zw-hadoop-master:9000/tmp/hive-sohuplus/hive_2013-07-22_15-11-34_591_4400134296393853099/_tmp.-ext-10001/_tmp.000000_0
2013-07-22 15:12:53,629 INFO org.apache.hadoop.hive.ql.exec.FileSinkOperator: New Final Path: FS hdfs://zw-hadoop-master:9000/tmp/hive-sohuplus/hive_2013-07-22_15-11-34_591_4400134296393853099/_tmp.-ext-10001/000000_0
2013-07-22 15:12:53,746 INFO ExecMapper: ExecMapper: processing 1 rows: used memory = 51655480
2013-07-22 15:12:53,751 INFO org.apache.hadoop.hive.ql.exec.MapOperator: 3 forwarding 10 rows
2013-07-22 15:12:53,751 INFO org.apache.hadoop.hive.ql.exec.TableScanOperator: 0 forwarding 10 rows
2013-07-22 15:12:53,751 INFO org.apache.hadoop.hive.ql.exec.SelectOperator: 1 forwarding 10 rows
2013-07-22 15:12:53,751 INFO ExecMapper: ExecMapper: processing 10 rows: used memory = 51655480
2013-07-22 15:12:53,830 INFO org.apache.hadoop.hive.ql.exec.MapOperator: 3 forwarding 100 rows
2013-07-22 15:12:53,830 INFO org.apache.hadoop.hive.ql.exec.TableScanOperator: 0 forwarding 100 rows
2013-07-22 15:12:53,830 INFO org.apache.hadoop.hive.ql.exec.SelectOperator: 1 forwarding 100 rows
2013-07-22 15:12:53,830 INFO ExecMapper: ExecMapper: processing 100 rows: used memory = 53629784
2013-07-22 15:22:59,783 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception: org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException: No lease on /tmp/hive-sohuplus/hive_2013-07-22_15-11-34_591_4400134296393853099/_tmp.-ext-10001/_tmp.000000_0 File does not exist. Holder DFSClient_attempt_201307111249_24350_m_000000_0 does not have any open files.
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkLease(FSNamesystem.java:1557)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkLease(FSNamesystem.java:1548)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1464)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:653)
	at sun.reflect.GeneratedMethodAccessor18.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:557)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1434)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1430)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1428)

	at org.apache.hadoop.ipc.Client.call(Client.java:1107)
	at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:226)
	at $Proxy6.addBlock(Unknown Source)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
	at $Proxy6.addBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:3178)
	at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:3047)
	at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$1900(DFSClient.java:2305)
	at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2500)

2013-07-22 15:22:59,783 WARN org.apache.hadoop.hdfs.DFSClient: Error Recovery for block null bad datanode[0] nodes == null
2013-07-22 15:22:59,783 WARN org.apache.hadoop.hdfs.DFSClient: Could not get block locations. Source file "/tmp/hive-sohuplus/hive_2013-07-22_15-11-34_591_4400134296393853099/_tmp.-ext-10001/_tmp.000000_0" - Aborting...
2013-07-22 15:22:59,784 ERROR org.apache.hadoop.hdfs.DFSClient: Exception closing file /tmp/hive-sohuplus/hive_2013-07-22_15-11-34_591_4400134296393853099/_tmp.-ext-10001/_tmp.000000_0 : org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException: No lease on /tmp/hive-sohuplus/hive_2013-07-22_15-11-34_591_4400134296393853099/_tmp.-ext-10001/_tmp.000000_0 File does not exist. Holder DFSClient_attempt_201307111249_24350_m_000000_0 does not have any open files.
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkLease(FSNamesystem.java:1557)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkLease(FSNamesystem.java:1548)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1464)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:653)
	at sun.reflect.GeneratedMethodAccessor18.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:557)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1434)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1430)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1428)

org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException: No lease on /tmp/hive-sohuplus/hive_2013-07-22_15-11-34_591_4400134296393853099/_tmp.-ext-10001/_tmp.000000_0 File does not exist. Holder DFSClient_attempt_201307111249_24350_m_000000_0 does not have any open files.
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkLease(FSNamesystem.java:1557)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkLease(FSNamesystem.java:1548)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1464)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:653)
	at sun.reflect.GeneratedMethodAccessor18.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:557)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1434)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1430)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1428)

	at org.apache.hadoop.ipc.Client.call(Client.java:1107)
	at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:226)
	at $Proxy6.addBlock(Unknown Source)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
	at $Proxy6.addBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:3178)
	at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:3047)
	at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$1900(DFSClient.java:2305)
	at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2500)
文文木
You haven't even pasted the full log...
荔枝壳
Can't tell anything from this — what's the situation?
liunkor
I'd like to know how the problem above was eventually solved.
养猪
It turned out in the end to be a serialization (SerDe) error in my Hive table.
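For anyone hitting something similar: a map task stuck at 0% until the HDFS lease expires is consistent with the row deserializer spinning or failing on the input. As a hypothetical illustration only (the column names come from the `dump TS struct` line in the log above; the regex and the sample line are my assumptions, not the poster's actual SerDe configuration), this Python sketch shows why a RegexSerDe-style pattern has to match the log format exactly — a pattern that does not match yields no fields at all:

```python
import re

# Hypothetical combined-log-style pattern covering a few of the nginx_log
# columns (remote_address, time, request, status, body_bytes_sent).
# A real Hive RegexSerDe pattern needs one capture group per table column.
PATTERN = re.compile(
    r'(?P<remote_address>\S+) \S+ \S+ '
    r'\[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" '
    r'(?P<status>\d{3}) '
    r'(?P<body_bytes_sent>\d+|-)'
)

def parse_line(line):
    """Return a dict of fields, or None when the pattern does not match.

    In Hive, a RegexSerDe pattern that fails to match the input produces
    NULL columns or deserialization errors instead of usable rows.
    """
    m = PATTERN.match(line)
    return m.groupdict() if m else None

# A made-up access-log line in the format the pattern above expects:
sample = ('10.10.126.126 - - [22/Jul/2013:15:12:53 +0800] '
          '"GET /index.html HTTP/1.1" 200 512')
print(parse_line(sample))             # all five named fields extracted
print(parse_line("not an nginx line"))  # None: the SerDe gets nothing
```

Checking the SerDe pattern against a few real lines of the log outside Hive, as above, is a quick way to rule the serializer in or out before digging into HDFS-level errors like the LeaseExpiredException in this log.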