HBase Import fails with org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException

非洲大妹子 · Posted on 2015/06/05 17:16
Hi everyone,

I'm importing data with

hbase org.apache.hadoop.hbase.mapreduce.Import Point /bak/2015-05-14/point_2015-05-14

and the job fails with the error shown below. The cluster has 5 nodes, and every node reports the error now and then; it looks like a problem while writing the data. A few days ago I imported the same data without any issue.

From searching around, the common explanation is that the region servers are splitting regions, but I haven't found a workable fix. Load balancing is already turned off. My region size is 1 GB and the table dump on HDFS is a few tens of GB. The errors by themselves would be tolerable if I could simply rerun the job a little later, but the map phase fails at around 25%. What should I be checking for this kind of problem? Please advise, full points gladly awarded!
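To confirm whether the table really is splitting while the job runs, I figure I can count its regions before and during the import, roughly like this (just a sketch against the old HTable client API that shows up in the stack trace below; only the table name "Point" comes from my actual setup):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.util.Pair;

public class CountRegions {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        HTable table = new HTable(conf, "Point");
        try {
            // getStartEndKeys() returns one start/end key pair per region,
            // so the array length is the current number of regions.
            Pair<byte[][], byte[][]> keys = table.getStartEndKeys();
            System.out.println("Regions in 'Point': " + keys.getFirst().length);
        } finally {
            table.close();
        }
    }
}

If the region count keeps climbing while the map tasks run, that would at least confirm the splitting theory.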
The machines run Linux with 64 GB of RAM.
The HBase configuration file is as follows:
<configuration>
<property>
    <name>hbase.rootdir</name>
    <value>hdfs://namenode01:9000/hbase</value>
</property>
<property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
</property>
<property>
    <name>hbase.master</name>
    <value>hdfs://mongo-02:60000</value>
</property>
<property>
    <name>hbase.zookeeper.quorum</name>
    <value>mongo-01,mongo-02,mongo-03</value>
</property>
<!-- Region split threshold: 1 GB -->
<property>
    <name>hbase.hregion.max.filesize</name>
    <value>1073741824</value>
</property>
<!-- Client-side write buffer: ~100 MB -->
<property>
    <name>hbase.client.write.buffer</name>
    <value>107374180</value>
</property>
<!-- RPC handler threads per region server -->
<property>
    <name>hbase.regionserver.handler.count</name>
    <value>40</value>
</property>
<!-- Automatic major compactions disabled -->
<property>
    <name>hbase.hregion.majorcompaction</name>
    <value>0</value>
</property>
<!-- Max concurrent transfer threads per DataNode -->
<property>
    <name>dfs.datanode.max.xcievers</name>
    <value>8192</value>
</property>
</configuration>
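One thing I notice in my own settings above: hbase.hregion.max.filesize is 1073741824 (1 GB), so writing a table of several tens of GB will force many splits while the job runs. If that is the cause, I assume something like the following would raise the split threshold for just this table before re-running the import (a sketch only; the 10 GB value is an arbitrary number I picked, not something I have tested):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.util.Bytes;

public class RaiseMaxFileSize {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        HBaseAdmin admin = new HBaseAdmin(conf);
        try {
            byte[] tableName = Bytes.toBytes("Point");
            HTableDescriptor desc = admin.getTableDescriptor(tableName);
            // A per-table MAX_FILESIZE overrides hbase.hregion.max.filesize;
            // 10 GB here is only meant to stop splits during the import.
            desc.setMaxFileSize(10L * 1024 * 1024 * 1024);
            admin.disableTable(tableName);
            admin.modifyTable(tableName, desc);
            admin.enableTable(tableName);
        } finally {
            admin.close();
        }
    }
}

Is that a reasonable way to go about it, or is there a better approach?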

15/06/05 16:51:20 INFO mapred.JobClient: Task Id : attempt_201503142143_0924_m_000093_1, Status : FAILED
org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 23560 actions: servers with issues: datanode-01:60020, 
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.processBatchCallback(HConnectionManager.java:1642)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.processBatch(HConnectionManager.java:1418)
        at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:918)
        at org.apache.hadoop.hbase.client.HTable.close(HTable.java:955)
        at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.close(TableOutputFormat.java:109)
        at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:650)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:767)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:364)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)
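Since the failure happens in HTable.flushCommits() when the mapper closes (the buffered puts destined for datanode-01:60020 run out of retries), I'm also wondering whether giving the client more retries and a longer pause would let it ride out the splits. Something along these lines is what I have in mind; again just a sketch, and the retry/pause/buffer numbers are guesses rather than values I know to be right:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;

public class ImportClientTuning {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        // More retries and a longer pause between retries, so the client
        // waits out regions that are briefly offline while splitting.
        conf.setInt("hbase.client.retries.number", 20);
        conf.setLong("hbase.client.pause", 2000);

        HTable table = new HTable(conf, "Point");
        // Buffer puts on the client and flush them in larger batches.
        table.setAutoFlush(false);
        table.setWriteBufferSize(8 * 1024 * 1024);
        // ... do the puts, then flush and close as usual ...
        table.flushCommits();
        table.close();
    }
}

I assume the same keys could also be passed on the Import command line with -D, since the tool goes through GenericOptionsParser, but I haven't verified that. Does this direction make sense, or is something else on the cluster the real thing to check?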