Exceptions encountered in Hive

Posted by goldenMoon on 2018/03/07 11:34


On a Linux Hadoop cluster, data is written into Hive via Storm.
Hive 2.3.2 frequently throws the following exceptions:

  • ERROR [pool-7-thread-1718] metastore.RetryingHMSHandler: Retrying HMSHandler after 2000 ms (attempt 1 of 3) with error: javax.jdo.JDOUserException: One or more instances could not be made persistent.
    org.datanucleus.exceptions.NucleusDataStoreException: Insert of object "org.apache.hadoop.hive.metastore.model.MPartition@3f8a96cf" using statement "INSERT INTO `PARTITIONS` (`PART_ID`,`CREATE_TIME`,`LAST_ACCESS_TIME`,`PART_NAME`,`SD_ID`,`TBL_ID`) VALUES (?,?,?,?,?,?)" failed : Duplicate entry 'time=2018-03-07-10-0-56' for key 'UNIQUEPARTITION'
    Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Duplicate entry 'time=2018-03-07-10-0-56' for key 'UNIQUEPARTITION'
      at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:932)
  • java.sql.SQLException: Timed out waiting for a free available connection
  • java.sql.SQLException: java.lang.OutOfMemoryError: Java heap space
      at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:957) ~[mysql-connector-java-5.1.38-bin.jar:5.1.38]
      at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:896) ~[mysql-connector-java-5.1.38-bin.jar:5.1.38]
      at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:885) ~[mysql-connector-java-5.1.38-bin.jar:5.1.38]
      at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:860) ~[mysql-connector-java-5.1.38-bin.jar:5.1.38]
      at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:877) ~[mysql-connector-java-5.1.38-bin.jar:5.1.38]
      at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:873) ~[mysql-connector-java-5.1.38-bin.jar:5.1.38]
      at com.mysql.jdbc.Util.handleNewInstance(Util.java:422) ~[mysql-connector-java-5.1.38-bin.jar:5.1.38]
      at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:400) ~[mysql-connector-java-5.1.38-bin.jar:5.1.38]
      at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:327) ~[mysql-connector-java-5.1.38-bin.jar:5.1.38]
      at java.sql.DriverManager.getConnection(Dri
    2018-03-07T02:58:09,990 WARN [cbdcfdda-437d-4145-8975-917b4c5069eb main] metadata.Hive: Failed to register all functions.
  • compactor.Worker: Caught an exception in the main loop of compactor worker ZTest-master.hadoop-50, java.lang.OutOfMemoryError: Java heap space
  • ERROR [pool-7-thread-23] metastore.RetryingHMSHandler: java.lang.OutOfMemoryError: GC overhead limit exceeded
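The first error suggests concurrent writers (e.g. parallel Storm bolts) racing to create the same partition: two metastore calls both try to INSERT the same row into `PARTITIONS`, and the loser hits the `UNIQUEPARTITION` constraint. One common way to make partition creation idempotent is HiveQL's `IF NOT EXISTS` clause; the table name below is hypothetical, for illustration only:

```sql
-- Hypothetical table name; the partition value is taken from the error above.
-- IF NOT EXISTS makes the statement a no-op when the partition already
-- exists, so concurrent callers no longer trip the UNIQUEPARTITION key.
ALTER TABLE storm_events ADD IF NOT EXISTS PARTITION (`time`='2018-03-07-10-0-56');
```

If the Storm topology creates partitions through the metastore API instead of HiveQL, the analogous fix is to treat an "already exists" response as success rather than retrying the insert.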

Most of the errors are GC-related. In hive-env.sh: export HADOOP_HEAPSIZE=4096
Hadoop configuration:
    HADOOP_OPTS=.... -Xmx512m ....
    HADOOP_DATANODE_OPTS=-Xmx2g -Xms2g -XX:+UseParNewGC ..
    HADOOP_CLIENT_OPTS=-Xmx512m .....
    mapreduce.reduce.memory.mb=1536
    mapreduce.reduce.java.opts=-Xmx1200m -XX:+UseConcMarkSweepGC -XX:-UseGCOverheadLimit
    mapreduce.map.memory.mb=1024
    mapreduce.map.java.opts=-Xmx800m -XX:+UseConcMarkSweepGC
    mapred.child.java.opts=-Xmx2048m -XX:+UseOverheadLimit
The settings are as above. Which of these are undersized and need tuning? Any advice from those passing by would be much appreciated, thanks!
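Since the OutOfMemoryError and "GC overhead limit exceeded" errors come from the metastore/compactor JVMs rather than from MapReduce tasks, the heap that matters most here is the one the Hive services run with. A minimal sketch of what hive-env.sh might look like; the sizes are assumptions to be adjusted against available RAM, not definitive values:

```shell
# hive-env.sh -- sketch only; heap sizes are illustrative assumptions.
# HADOOP_HEAPSIZE (in MB) caps the heap of JVMs launched via the hadoop
# scripts, which includes HiveServer2 and the metastore service.
export HADOOP_HEAPSIZE=8192

# HADOOP_CLIENT_OPTS also applies to Hive client-side JVMs; the
# cluster-wide -Xmx512m shown above is a plausible culprit for the
# metastore's "GC overhead limit exceeded" errors, so raise it here.
export HADOOP_CLIENT_OPTS="-Xmx4g ${HADOOP_CLIENT_OPTS}"
```

After changing these, the metastore and HiveServer2 processes need to be restarted, and the effective -Xmx can be confirmed with `ps` or `jinfo` on the running process.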

With the default BoneCP connection pool, connections to MySQL keep producing Duplicate entry errors. How can this exception be avoided?
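The "Timed out waiting for a free available connection" message is BoneCP reporting pool exhaustion, which is separate from the Duplicate entry constraint violation (that one comes from the concurrent partition inserts). The pool itself can be sized in hive-site.xml; a sketch, where the pool size of 30 is an assumption to be tuned against MySQL's max_connections limit:

```xml
<!-- hive-site.xml: illustrative values, not a definitive sizing. -->
<property>
  <name>datanucleus.connectionPool.maxPoolSize</name>
  <value>30</value>
  <description>Max metastore DB connections per pool; keep the total
  across all metastore instances below MySQL's max_connections.</description>
</property>
<property>
  <name>datanucleus.connectionPoolingType</name>
  <value>BoneCP</value>
  <description>BoneCP is the default pool implementation in Hive 2.3.</description>
</property>
```

The Duplicate entry itself is not a pool problem and will persist regardless of pool settings until the concurrent partition creation is made idempotent.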
