Setting Up an Eclipse + Maven HBase Development Environment on Linux
Since Hive does not support updating data, I had to consider using HBase to meet the requirement of updating data daily. I ran into quite a few problems while setting up the environment, so I am writing this post as a reference for anyone about to use HBase, and as a memo for myself.
- Environment and versions: HBase runs in cluster mode and does not use the ZooKeeper bundled with HBase; a standalone ZooKeeper cluster is set up instead. None of this should affect development.
zookeeper-3.4.6 cluster:
| host   | ip           |
| ------ | ------------ |
| spark1 | 192.168.4.31 |
| spark2 | 192.168.4.32 |
| spark3 | 192.168.4.33 |
| spark4 | 192.168.4.34 |
| spark5 | 192.168.4.35 |

hbase-0.98.13 cluster:
| host    | ip           |
| ------- | ------------ |
| hadoop1 | 192.168.4.21 |
| hadoop2 | 192.168.4.22 |
| hadoop3 | 192.168.4.23 |
| hadoop4 | 192.168.4.24 |
| hadoop5 | 192.168.4.25 |

hadoop-2.4.1 cluster:
| host    | ip           | node     |
| ------- | ------------ | -------- |
| hadoop1 | 192.168.4.21 | datanode |
| hadoop2 | 192.168.4.22 | datanode |
| hadoop3 | 192.168.4.23 | datanode |
| hadoop4 | 192.168.4.24 | datanode |
| hadoop5 | 192.168.4.25 | datanode |
| hadoop6 | 192.168.5.19 | namenode |
| hadoop7 | 192.168.4.26 | datanode |
- HBase configuration file hbase-site.xml
```xml
<configuration>
  <property>
    <name>hbase.rootdir</name>
    <value>hdfs://hadoop6:9000/hbase</value>
  </property>
  <property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
  </property>
  <property>
    <name>hbase.zookeeper.quorum</name>
    <value>spark1,spark2,spark3,spark4,spark5</value>
  </property>
  <property>
    <name>hbase.zookeeper.property.clientPort</name>
    <value>2181</value>
  </property>
</configuration>
```
- Hosts configuration on the development machine; be sure to add every machine of both the HBase and ZooKeeper clusters:
```
192.168.4.21 hadoop1
192.168.4.22 hadoop2
192.168.4.23 hadoop3
192.168.4.24 hadoop4
192.168.4.25 hadoop5
192.168.5.19 hadoop6
192.168.4.26 hadoop7
192.168.4.31 spark1
192.168.4.32 spark2
192.168.4.33 spark3
192.168.4.34 spark4
192.168.4.35 spark5
```
- Eclipse setup: create a standard Java Maven project
Directory structure
```
HbaseStudy-0.98.13/
├── pom.xml
└── src
    ├── main
    │   ├── java
    │   └── resources
    │       ├── hbase-site.xml
    │       └── log4j.properties
    └── test
        └── java
            └── com
                └── hua
                    └── hbase
                        └── test
                            └── TestHbase.java
```
pom.xml
```xml
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>HbaseStudy-0.98.13</groupId>
  <artifactId>HbaseStudy-0.98.13</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <dependencies>
    <dependency>
      <groupId>org.apache.hbase</groupId>
      <artifactId>hbase-client</artifactId>
      <version>0.98.13-hadoop2</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hbase</groupId>
      <artifactId>hbase-examples</artifactId>
      <version>0.98.13-hadoop2</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-hdfs</artifactId>
      <version>2.2.0</version>
    </dependency>
    <dependency>
      <groupId>log4j</groupId>
      <artifactId>log4j</artifactId>
      <version>1.2.17</version>
    </dependency>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.12</version>
      <scope>test</scope>
    </dependency>
  </dependencies>
  <build>
    <plugins>
      <plugin>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.3</version>
        <configuration>
          <source>1.7</source>
          <target>1.7</target>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>
```
Create a new hbase-site.xml under src/main/resources and copy in the contents of the cluster's HBase configuration file. The ZooKeeper client port setting can be omitted because 2181 is the default.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <property>
    <name>hbase.rootdir</name>
    <value>hdfs://hadoop6:9000/hbase</value>
  </property>
  <property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
  </property>
  <property>
    <name>hbase.zookeeper.quorum</name>
    <value>spark1,spark2,spark3,spark4,spark5</value>
  </property>
</configuration>
```

Test program TestHbase.java
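Instead of shipping an hbase-site.xml on the classpath, the same client settings can also be supplied programmatically. A minimal sketch against the HBase 0.98 client API, using this cluster's hostnames (the class name `ConfExample` is mine, not from the project):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;

public class ConfExample {
    public static void main(String[] args) {
        // HBaseConfiguration.create() loads hbase-default.xml and, if present,
        // hbase-site.xml from the classpath; explicit set() calls override both.
        Configuration conf = HBaseConfiguration.create();
        conf.set("hbase.zookeeper.quorum", "spark1,spark2,spark3,spark4,spark5");
        conf.set("hbase.zookeeper.property.clientPort", "2181"); // 2181 is the default
        System.out.println(conf.get("hbase.zookeeper.quorum"));
    }
}
```

This is handy for quick experiments, but keeping the settings in hbase-site.xml is the usual choice, since the same file works for every tool on the classpath.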
```java
package com.hua.hbase.test;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.log4j.Logger;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;

public class TestHbase {

    private final Logger logger = Logger.getLogger(TestHbase.class);

    private Configuration conf = null;
    private HBaseAdmin admin = null;

    @Before
    public void before() throws IOException {
        conf = HBaseConfiguration.create();
        admin = new HBaseAdmin(conf);
    }

    @After
    public void after() throws IOException {
        admin.close();
    }

    @Test
    public void testCreate() throws Exception {
        TableName tableName = TableName.valueOf("user1");
        HTableDescriptor tableDesc = new HTableDescriptor(tableName);
        tableDesc.addFamily(new HColumnDescriptor(Bytes.toBytes("basic")));
        tableDesc.addFamily(new HColumnDescriptor(Bytes.toBytes("advance")));
        if (admin.tableExists(tableName)) {
            admin.disableTable(tableName);
            admin.deleteTable(tableName);
        }
        admin.createTable(tableDesc);
        logger.info("Create table success!");
    }
}
```

After running the test, check whether the user1 table was created successfully.
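Once the table exists, a put/get round trip confirms that reads and writes work end to end. A sketch using the HTable API from the 0.98 era (deprecated in later HBase releases); it assumes the same hbase-site.xml is on the classpath, and the class name, row key, and cell value are made up for illustration:

```java
package com.hua.hbase.test;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.util.Bytes;

public class PutGetExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        HTable table = new HTable(conf, "user1");
        try {
            // Write one cell into the "basic" column family created by the test.
            Put put = new Put(Bytes.toBytes("row1"));
            put.add(Bytes.toBytes("basic"), Bytes.toBytes("name"), Bytes.toBytes("hua"));
            table.put(put);

            // Read the same cell back and print its value.
            Get get = new Get(Bytes.toBytes("row1"));
            Result result = table.get(get);
            byte[] value = result.getValue(Bytes.toBytes("basic"), Bytes.toBytes("name"));
            System.out.println(Bytes.toString(value));
        } finally {
            table.close();
        }
    }
}
```

Alternatively, `list` and `scan 'user1'` in the HBase shell serve the same verification purpose without any code.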
Execution log
[org.apache.hadoop.metrics2.lib.MutableMetricsFactory]field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(value=[Rate of successful kerberos logins and latency (milliseconds)], about=, valueName=Time, type=DEFAULT, always=false, sampleName=Ops) [org.apache.hadoop.metrics2.lib.MutableMetricsFactory]field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(value=[Rate of failed kerberos logins and latency (milliseconds)], about=, valueName=Time, type=DEFAULT, always=false, sampleName=Ops) [org.apache.hadoop.metrics2.impl.MetricsSystemImpl]UgiMetrics, User and group related metrics [org.apache.hadoop.security.authentication.util.KerberosName]Kerberos krb5 configuration not found, setting default realm to empty [org.apache.hadoop.security.Groups] Creating new Groups object [org.apache.hadoop.util.NativeCodeLoader]Trying to load the custom-built native-hadoop library... [org.apache.hadoop.util.NativeCodeLoader]Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path [org.apache.hadoop.util.NativeCodeLoader]java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib [org.apache.hadoop.util.NativeCodeLoader]Unable to load native-hadoop library for your platform... 
using builtin-java classes where applicable [org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback]Falling back to shell based [org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback]Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping [org.apache.hadoop.security.Groups]Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000 [org.apache.hadoop.security.UserGroupInformation]hadoop login [org.apache.hadoop.security.UserGroupInformation]hadoop login commit [org.apache.hadoop.security.UserGroupInformation]using local user:UnixPrincipal: hadoop [org.apache.hadoop.security.UserGroupInformation]UGI loginUser:hadoop (auth:SIMPLE) [org.apache.hadoop.util.Shell]setsid exited with exit code 0 [org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper]Process identifier=hconnection-0x18e4d5ba connecting to ZooKeeper ensemble=spark4:2181,spark3:2181,spark2:2181,spark1:2181,spark5:2181 [org.apache.zookeeper.ZooKeeper]Client environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09 GMT [org.apache.zookeeper.ZooKeeper]Client environment:host.name=hadoop-nenusoul [org.apache.zookeeper.ZooKeeper]Client environment:java.version=1.7.0_75 [org.apache.zookeeper.ZooKeeper]Client environment:java.vendor=Oracle Corporation [org.apache.zookeeper.ZooKeeper]Client environment:java.home=/home/hadoop/work/DevelopTools/jdk1.7.0_75/jre [org.apache.zookeeper.ZooKeeper]Client 
environment:java.class.path=/home/hadoop/workspace/HbaseStudy-0.98.13/target/test-classes:/home/hadoop/workspace/HbaseStudy-0.98.13/target/classes:/home/hadoop/.m2/repository/org/apache/hbase/hbase-client/0.98.13-hadoop2/hbase-client-0.98.13-hadoop2.jar:/home/hadoop/.m2/repository/org/apache/hbase/hbase-annotations/0.98.13-hadoop2/hbase-annotations-0.98.13-hadoop2.jar:/home/hadoop/work/DevelopTools/jdk1.7.0_75/lib/tools.jar:/home/hadoop/.m2/repository/org/apache/hbase/hbase-common/0.98.13-hadoop2/hbase-common-0.98.13-hadoop2.jar:/home/hadoop/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar:/home/hadoop/.m2/repository/org/apache/hbase/hbase-protocol/0.98.13-hadoop2/hbase-protocol-0.98.13-hadoop2.jar:/home/hadoop/.m2/repository/commons-codec/commons-codec/1.7/commons-codec-1.7.jar:/home/hadoop/.m2/repository/commons-io/commons-io/2.4/commons-io-2.4.jar:/home/hadoop/.m2/repository/commons-lang/commons-lang/2.6/commons-lang-2.6.jar:/home/hadoop/.m2/repository/commons-logging/commons-logging/1.1.1/commons-logging-1.1.1.jar:/home/hadoop/.m2/repository/com/google/guava/guava/12.0.1/guava-12.0.1.jar:/home/hadoop/.m2/repository/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar:/home/hadoop/.m2/repository/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar:/home/hadoop/.m2/repository/io/netty/netty/3.6.6.Final/netty-3.6.6.Final.jar:/home/hadoop/.m2/repository/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6.jar:/home/hadoop/.m2/repository/org/slf4j/slf4j-api/1.6.1/slf4j-api-1.6.1.jar:/home/hadoop/.m2/repository/org/slf4j/slf4j-log4j12/1.6.1/slf4j-log4j12-1.6.1.jar:/home/hadoop/.m2/repository/org/cloudera/htrace/htrace-core/2.04/htrace-core-2.04.jar:/home/hadoop/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.8.8/jackson-mapper-asl-1.8.8.jar:/home/hadoop/.m2/repository/org/jruby/jcodings/jcodings/1.0.8/jcodings-1.0.8.jar:/home/hadoop/.m2/repository/org/jruby/joni/joni/2.1.2/joni-2.1.2.jar:/home/hadoop/.m2/
repository/org/apache/hadoop/hadoop-auth/2.2.0/hadoop-auth-2.2.0.jar:/home/hadoop/.m2/repository/org/apache/hadoop/hadoop-common/2.2.0/hadoop-common-2.2.0.jar:/home/hadoop/.m2/repository/org/apache/hadoop/hadoop-annotations/2.2.0/hadoop-annotations-2.2.0.jar:/home/hadoop/.m2/repository/org/apache/commons/commons-math/2.1/commons-math-2.1.jar:/home/hadoop/.m2/repository/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar:/home/hadoop/.m2/repository/commons-net/commons-net/3.1/commons-net-3.1.jar:/home/hadoop/.m2/repository/com/sun/jersey/jersey-json/1.9/jersey-json-1.9.jar:/home/hadoop/.m2/repository/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar:/home/hadoop/.m2/repository/stax/stax-api/1.0.1/stax-api-1.0.1.jar:/home/hadoop/.m2/repository/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar:/home/hadoop/.m2/repository/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar:/home/hadoop/.m2/repository/javax/activation/activation/1.1/activation-1.1.jar:/home/hadoop/.m2/repository/org/codehaus/jackson/jackson-xc/1.8.3/jackson-xc-1.8.3.jar:/home/hadoop/.m2/repository/commons-el/commons-el/1.0/commons-el-1.0.jar:/home/hadoop/.m2/repository/net/java/dev/jets3t/jets3t/0.6.1/jets3t-0.6.1.jar:/home/hadoop/.m2/repository/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar:/home/hadoop/.m2/repository/commons-digester/commons-digester/1.8/commons-digester-1.8.jar:/home/hadoop/.m2/repository/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar:/home/hadoop/.m2/repository/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar:/home/hadoop/.m2/repository/org/apache/avro/avro/1.7.4/avro-1.7.4.jar:/home/hadoop/.m2/repository/com/thoughtworks/paranamer/paranamer/2.3/paranamer-2.3.jar:/home/hadoop/.m2/repository/org/xerial/snappy/snappy-java/1.0.4.1/snappy-java-1.0.4.1.jar:/home/hadoop/.m2/repository/com/jcraft/jsch/0.1.42/jsch-0.1.42.jar:/home/hadoop/.m2/repository/org/apache/commons/commons-compre
ss/1.4.1/commons-compress-1.4.1.jar:/home/hadoop/.m2/repository/org/tukaani/xz/1.0/xz-1.0.jar:/home/hadoop/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-core/2.2.0/hadoop-mapreduce-client-core-2.2.0.jar:/home/hadoop/.m2/repository/org/apache/hadoop/hadoop-yarn-common/2.2.0/hadoop-yarn-common-2.2.0.jar:/home/hadoop/.m2/repository/org/apache/hadoop/hadoop-yarn-api/2.2.0/hadoop-yarn-api-2.2.0.jar:/home/hadoop/.m2/repository/com/google/inject/guice/3.0/guice-3.0.jar:/home/hadoop/.m2/repository/javax/inject/javax.inject/1/javax.inject-1.jar:/home/hadoop/.m2/repository/aopalliance/aopalliance/1.0/aopalliance-1.0.jar:/home/hadoop/.m2/repository/com/sun/jersey/contribs/jersey-guice/1.9/jersey-guice-1.9.jar:/home/hadoop/.m2/repository/com/google/inject/extensions/guice-servlet/3.0/guice-servlet-3.0.jar:/home/hadoop/.m2/repository/com/github/stephenc/findbugs/findbugs-annotations/1.3.9-1/findbugs-annotations-1.3.9-1.jar:/home/hadoop/.m2/repository/org/apache/hbase/hbase-examples/0.98.13-hadoop2/hbase-examples-0.98.13-hadoop2.jar:/home/hadoop/.m2/repository/org/apache/hbase/hbase-server/0.98.13-hadoop2/hbase-server-0.98.13-hadoop2.jar:/home/hadoop/.m2/repository/org/apache/hbase/hbase-prefix-tree/0.98.13-hadoop2/hbase-prefix-tree-0.98.13-hadoop2.jar:/home/hadoop/.m2/repository/org/apache/hbase/hbase-common/0.98.13-hadoop2/hbase-common-0.98.13-hadoop2-tests.jar:/home/hadoop/.m2/repository/org/apache/hbase/hbase-hadoop-compat/0.98.13-hadoop2/hbase-hadoop-compat-0.98.13-hadoop2.jar:/home/hadoop/.m2/repository/org/apache/hbase/hbase-hadoop2-compat/0.98.13-hadoop2/hbase-hadoop2-compat-0.98.13-hadoop2.jar:/home/hadoop/.m2/repository/com/yammer/metrics/metrics-core/2.2.0/metrics-core-2.2.0.jar:/home/hadoop/.m2/repository/com/github/stephenc/high-scale-lib/high-scale-lib/1.1.1/high-scale-lib-1.1.1.jar:/home/hadoop/.m2/repository/org/mortbay/jetty/jetty-sslengine/6.1.26/jetty-sslengine-6.1.26.jar:/home/hadoop/.m2/repository/org/mortbay/jetty/jsp-2.1/6.1.14/jsp-2.1-6.1.14.jar
:/home/hadoop/.m2/repository/org/mortbay/jetty/jsp-api-2.1/6.1.14/jsp-api-2.1-6.1.14.jar:/home/hadoop/.m2/repository/org/mortbay/jetty/servlet-api-2.5/6.1.14/servlet-api-2.5-6.1.14.jar:/home/hadoop/.m2/repository/org/codehaus/jackson/jackson-jaxrs/1.8.8/jackson-jaxrs-1.8.8.jar:/home/hadoop/.m2/repository/tomcat/jasper-compiler/5.5.23/jasper-compiler-5.5.23.jar:/home/hadoop/.m2/repository/org/jamon/jamon-runtime/2.3.1/jamon-runtime-2.3.1.jar:/home/hadoop/.m2/repository/org/apache/hadoop/hadoop-client/2.2.0/hadoop-client-2.2.0.jar:/home/hadoop/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-app/2.2.0/hadoop-mapreduce-client-app-2.2.0.jar:/home/hadoop/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-common/2.2.0/hadoop-mapreduce-client-common-2.2.0.jar:/home/hadoop/.m2/repository/org/apache/hadoop/hadoop-yarn-client/2.2.0/hadoop-yarn-client-2.2.0.jar:/home/hadoop/.m2/repository/com/sun/jersey/jersey-test-framework/jersey-test-framework-grizzly2/1.9/jersey-test-framework-grizzly2-1.9.jar:/home/hadoop/.m2/repository/com/sun/jersey/jersey-test-framework/jersey-test-framework-core/1.9/jersey-test-framework-core-1.9.jar:/home/hadoop/.m2/repository/javax/servlet/javax.servlet-api/3.0.1/javax.servlet-api-3.0.1.jar:/home/hadoop/.m2/repository/com/sun/jersey/jersey-client/1.9/jersey-client-1.9.jar:/home/hadoop/.m2/repository/com/sun/jersey/jersey-grizzly2/1.9/jersey-grizzly2-1.9.jar:/home/hadoop/.m2/repository/org/glassfish/grizzly/grizzly-http/2.1.2/grizzly-http-2.1.2.jar:/home/hadoop/.m2/repository/org/glassfish/grizzly/grizzly-framework/2.1.2/grizzly-framework-2.1.2.jar:/home/hadoop/.m2/repository/org/glassfish/gmbal/gmbal-api-only/3.0.0-b023/gmbal-api-only-3.0.0-b023.jar:/home/hadoop/.m2/repository/org/glassfish/external/management-api/3.0.0-b012/management-api-3.0.0-b012.jar:/home/hadoop/.m2/repository/org/glassfish/grizzly/grizzly-http-server/2.1.2/grizzly-http-server-2.1.2.jar:/home/hadoop/.m2/repository/org/glassfish/grizzly/grizzly-rcm/2.1.2/grizzly-rc
m-2.1.2.jar:/home/hadoop/.m2/repository/org/glassfish/grizzly/grizzly-http-servlet/2.1.2/grizzly-http-servlet-2.1.2.jar:/home/hadoop/.m2/repository/org/glassfish/javax.servlet/3.1/javax.servlet-3.1.jar:/home/hadoop/.m2/repository/org/apache/hadoop/hadoop-yarn-server-common/2.2.0/hadoop-yarn-server-common-2.2.0.jar:/home/hadoop/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-shuffle/2.2.0/hadoop-mapreduce-client-shuffle-2.2.0.jar:/home/hadoop/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-jobclient/2.2.0/hadoop-mapreduce-client-jobclient-2.2.0.jar:/home/hadoop/.m2/repository/org/apache/hbase/hbase-thrift/0.98.13-hadoop2/hbase-thrift-0.98.13-hadoop2.jar:/home/hadoop/.m2/repository/org/apache/thrift/libthrift/0.9.0/libthrift-0.9.0.jar:/home/hadoop/.m2/repository/org/apache/httpcomponents/httpclient/4.1.3/httpclient-4.1.3.jar:/home/hadoop/.m2/repository/org/apache/httpcomponents/httpcore/4.1.3/httpcore-4.1.3.jar:/home/hadoop/.m2/repository/org/apache/hadoop/hadoop-hdfs/2.2.0/hadoop-hdfs-2.2.0.jar:/home/hadoop/.m2/repository/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar:/home/hadoop/.m2/repository/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar:/home/hadoop/.m2/repository/com/sun/jersey/jersey-core/1.9/jersey-core-1.9.jar:/home/hadoop/.m2/repository/com/sun/jersey/jersey-server/1.9/jersey-server-1.9.jar:/home/hadoop/.m2/repository/asm/asm/3.1/asm-3.1.jar:/home/hadoop/.m2/repository/commons-cli/commons-cli/1.2/commons-cli-1.2.jar:/home/hadoop/.m2/repository/commons-daemon/commons-daemon/1.0.13/commons-daemon-1.0.13.jar:/home/hadoop/.m2/repository/javax/servlet/jsp/jsp-api/2.1/jsp-api-2.1.jar:/home/hadoop/.m2/repository/javax/servlet/servlet-api/2.5/servlet-api-2.5.jar:/home/hadoop/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.8.8/jackson-core-asl-1.8.8.jar:/home/hadoop/.m2/repository/tomcat/jasper-runtime/5.5.23/jasper-runtime-5.5.23.jar:/home/hadoop/.m2/repository/xmlenc/xmlenc/0.52/xmlenc-0.52.jar:/home/hadoop/.m2/repository/log4j
/log4j/1.2.17/log4j-1.2.17.jar:/home/hadoop/.m2/repository/junit/junit/4.12/junit-4.12.jar:/home/hadoop/.m2/repository/org/hamcrest/hamcrest-core/1.3/hamcrest-core-1.3.jar:/home/hadoop/work/DevelopTools/eclipse/configuration/org.eclipse.osgi/380/0/.cp/:/home/hadoop/work/DevelopTools/eclipse/configuration/org.eclipse.osgi/379/0/.cp/ [org.apache.zookeeper.ZooKeeper]Client environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib [org.apache.zookeeper.ZooKeeper]Client environment:java.io.tmpdir=/tmp [org.apache.zookeeper.ZooKeeper]Client environment:java.compiler=<NA> [org.apache.zookeeper.ZooKeeper]Client environment:os.name=Linux [org.apache.zookeeper.ZooKeeper]Client environment:os.arch=amd64 [org.apache.zookeeper.ZooKeeper]Client environment:os.version=3.13.0-32-generic [org.apache.zookeeper.ZooKeeper]Client environment:user.name=hadoop [org.apache.zookeeper.ZooKeeper]Client environment:user.home=/home/hadoop [org.apache.zookeeper.ZooKeeper]Client environment:user.dir=/home/hadoop/workspace/HbaseStudy-0.98.13 [org.apache.zookeeper.ZooKeeper]Initiating client connection, connectString=spark4:2181,spark3:2181,spark2:2181,spark1:2181,spark5:2181 sessionTimeout=90000 watcher=hconnection-0x18e4d5ba0x0, quorum=spark4:2181,spark3:2181,spark2:2181,spark1:2181,spark5:2181, baseZNode=/hbase [org.apache.zookeeper.ClientCnxn]zookeeper.disableAutoWatchReset is false [org.apache.zookeeper.ClientCnxn]Opening socket connection to server spark1/192.168.4.31:2181. 
Will not attempt to authenticate using SASL (unknown error) [org.apache.zookeeper.ClientCnxn]Socket connection established to spark1/192.168.4.31:2181, initiating session [org.apache.zookeeper.ClientCnxn]Session establishment request sent on spark1/192.168.4.31:2181 [org.apache.zookeeper.ClientCnxn]Session establishment complete on server spark1/192.168.4.31:2181, sessionid = 0x14f878806d5000c, negotiated timeout = 40000 [org.apache.hadoop.hbase.zookeeper.ZooKeeperWatcher]hconnection-0x18e4d5ba0x0, quorum=spark4:2181,spark3:2181,spark2:2181,spark1:2181,spark5:2181, baseZNode=/hbase Received ZooKeeper Event, type=None, state=SyncConnected, path=null [org.apache.hadoop.hbase.zookeeper.ZooKeeperWatcher]hconnection-0x18e4d5ba-0x14f878806d5000c connected [org.apache.zookeeper.ClientCnxn]Reading reply sessionid:0x14f878806d5000c, packet:: clientPath:null serverPath:null finished:false header:: 1,3 replyHeader:: 1,47244640502,0 request:: '/hbase/hbaseid,F response:: s{42949673661,47244640264,1438670836817,1441088022742,4,0,0,0,67,0,42949673661} [org.apache.zookeeper.ClientCnxn]Reading reply sessionid:0x14f878806d5000c, packet:: clientPath:null serverPath:null finished:false header:: 2,4 replyHeader:: 2,47244640502,0 request:: '/hbase/hbaseid,F response:: #ffffffff000146d61737465723a3630303030ffffffa5fffffff1ffffffe0fffffff3481346ffffffa450425546a2435313637666262392d356133622d346634382d386532302d366530623761383562373063,s{42949673661,47244640264,1438670836817,1441088022742,4,0,0,0,67,0,42949673661} [org.apache.hadoop.hdfs.BlockReaderLocal]dfs.client.use.legacy.blockreader.local = false [org.apache.hadoop.hdfs.BlockReaderLocal]dfs.client.read.shortcircuit = false [org.apache.hadoop.hdfs.BlockReaderLocal]dfs.client.domain.socket.data.traffic = false [org.apache.hadoop.hdfs.BlockReaderLocal]dfs.domain.socket.path = [org.apache.hadoop.metrics2.impl.MetricsSystemImpl]StartupProgress, NameNode startup progress [org.apache.hadoop.io.retry.RetryUtils]multipleLinearRandomRetry = 
null [org.apache.hadoop.ipc.Server]rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@52474d7f [org.apache.hadoop.hdfs.BlockReaderLocal]Both short-circuit local reads and UNIX domain socket are disabled. [org.apache.hadoop.ipc.RpcClient]Codec=org.apache.hadoop.hbase.codec.KeyValueCodec@62f73ff7, compressor=null, tcpKeepAlive=true, tcpNoDelay=true, maxIdleTime=10000, maxRetries=0, fallbackAllowed=false, ping interval=60000ms, bind address=null [org.apache.hadoop.conf.Configuration.deprecation]hadoop.native.lib is deprecated. Instead, use io.native.lib.available [org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper]Process identifier=catalogtracker-on-hconnection-0x18e4d5ba connecting to ZooKeeper ensemble=spark4:2181,spark3:2181,spark2:2181,spark1:2181,spark5:2181 [org.apache.zookeeper.ZooKeeper]Initiating client connection, connectString=spark4:2181,spark3:2181,spark2:2181,spark1:2181,spark5:2181 sessionTimeout=90000 watcher=catalogtracker-on-hconnection-0x18e4d5ba0x0, quorum=spark4:2181,spark3:2181,spark2:2181,spark1:2181,spark5:2181, baseZNode=/hbase [org.apache.zookeeper.ClientCnxn]Opening socket connection to server spark4/192.168.4.34:2181. 
Will not attempt to authenticate using SASL (unknown error) [org.apache.zookeeper.ClientCnxn]Socket connection established to spark4/192.168.4.34:2181, initiating session [org.apache.hadoop.hbase.catalog.CatalogTracker]Starting catalog tracker org.apache.hadoop.hbase.catalog.CatalogTracker@21b7e734 [org.apache.zookeeper.ClientCnxn]Session establishment request sent on spark4/192.168.4.34:2181 [org.apache.zookeeper.ClientCnxn]Session establishment complete on server spark4/192.168.4.34:2181, sessionid = 0x44f8787f7de000c, negotiated timeout = 40000 [org.apache.hadoop.hbase.zookeeper.ZooKeeperWatcher]catalogtracker-on-hconnection-0x18e4d5ba0x0, quorum=spark4:2181,spark3:2181,spark2:2181,spark1:2181,spark5:2181, baseZNode=/hbase Received ZooKeeper Event, type=None, state=SyncConnected, path=null [org.apache.hadoop.hbase.zookeeper.ZooKeeperWatcher]catalogtracker-on-hconnection-0x18e4d5ba-0x44f8787f7de000c connected [org.apache.zookeeper.ClientCnxn]Reading reply sessionid:0x44f8787f7de000c, packet:: clientPath:null serverPath:null finished:false header:: 1,3 replyHeader:: 1,47244640503,0 request:: '/hbase/meta-region-server,T response:: s{47244640306,47244640306,1441088028291,1441088028291,0,0,0,0,61,0,47244640306} [org.apache.hadoop.hbase.zookeeper.ZKUtil]catalogtracker-on-hconnection-0x18e4d5ba-0x44f8787f7de000c, quorum=spark4:2181,spark3:2181,spark2:2181,spark1:2181,spark5:2181, baseZNode=/hbase Set watcher on existing znode=/hbase/meta-region-server [org.apache.zookeeper.ClientCnxn]Reading reply sessionid:0x44f8787f7de000c, packet:: clientPath:null serverPath:null finished:false header:: 2,4 replyHeader:: 2,47244640503,0 request:: '/hbase/meta-region-server,T response:: #ffffffff0001a726567696f6e7365727665723a3630303230ffffff86ffffffeeffffffdd7f19ffffff85ffffffda1050425546a14a76861646f6f703110fffffff4ffffffd4318ffffffd4ffffffc8ffffffa8ffffffbcfffffff829100183,s{47244640306,47244640306,1441088028291,1441088028291,0,0,0,0,61,0,47244640306} 
[org.apache.zookeeper.ClientCnxn]Reading reply sessionid:0x14f878806d5000c, packet:: clientPath:null serverPath:null finished:false header:: 3,4 replyHeader:: 3,47244640503,0 request:: '/hbase/meta-region-server,F response:: #ffffffff0001a726567696f6e7365727665723a3630303230ffffff86ffffffeeffffffdd7f19ffffff85ffffffda1050425546a14a76861646f6f703110fffffff4ffffffd4318ffffffd4ffffffc8ffffffa8ffffffbcfffffff829100183,s{47244640306,47244640306,1441088028291,1441088028291,0,0,0,0,61,0,47244640306} [org.apache.hadoop.ipc.RpcClient]Use SIMPLE authentication for service ClientService, sasl=false [org.apache.hadoop.ipc.RpcClient]Connecting to hadoop1/192.168.4.21:60020 [org.apache.hadoop.ipc.RpcClient]IPC Client (791957027) connection to hadoop1/192.168.4.21:60020 from hadoop: starting, connections 1 [org.apache.hadoop.ipc.RpcClient]IPC Client (791957027) connection to hadoop1/192.168.4.21:60020 from hadoop: got response header call_id: 0, totalSize: 12 bytes [org.apache.hadoop.ipc.RpcClient]IPC Client (791957027) connection to hadoop1/192.168.4.21:60020 from hadoop: wrote request header call_id: 0 method_name: "Scan" request_param: true [org.apache.hadoop.ipc.RpcClient]IPC Client (791957027) connection to hadoop1/192.168.4.21:60020 from hadoop: wrote request header call_id: 1 method_name: "Scan" request_param: true priority: 100 [org.apache.hadoop.ipc.RpcClient]IPC Client (791957027) connection to hadoop1/192.168.4.21:60020 from hadoop: got response header call_id: 1 cell_block_meta { length: 877 }, totalSize: 900 bytes [org.apache.hadoop.ipc.RpcClient]IPC Client (791957027) connection to hadoop1/192.168.4.21:60020 from hadoop: wrote request header call_id: 2 method_name: "Scan" request_param: true [org.apache.hadoop.ipc.RpcClient]IPC Client (791957027) connection to hadoop1/192.168.4.21:60020 from hadoop: got response header call_id: 2, totalSize: 8 bytes [org.apache.hadoop.hbase.catalog.CatalogTracker]Stopping catalog tracker 
org.apache.hadoop.hbase.catalog.CatalogTracker@21b7e734 [org.apache.zookeeper.ZooKeeper]Closing session: 0x44f8787f7de000c [org.apache.zookeeper.ClientCnxn]Closing client for session: 0x44f8787f7de000c [org.apache.zookeeper.ClientCnxn]Reading reply sessionid:0x44f8787f7de000c, packet:: clientPath:null serverPath:null finished:false header:: 3,-11 replyHeader:: 3,47244640504,0 request:: null response:: null [org.apache.zookeeper.ClientCnxn]Disconnecting client for session: 0x44f8787f7de000c [org.apache.zookeeper.ZooKeeper]Session: 0x44f8787f7de000c closed [org.apache.zookeeper.ClientCnxn]EventThread shut down [org.apache.zookeeper.ClientCnxn]An exception was thrown while closing send thread for session 0x44f8787f7de000c : Unable to read additional data from server sessionid 0x44f8787f7de000c, likely server has closed socket [org.apache.zookeeper.ClientCnxn]Reading reply sessionid:0x14f878806d5000c, packet:: clientPath:null serverPath:null finished:false header:: 4,3 replyHeader:: 4,47244640504,0 request:: '/hbase,F response:: s{42949673651,42949673651,1438670836103,1438670836103,0,33,0,0,0,15,47244640306} [org.apache.zookeeper.ClientCnxn]Reading reply sessionid:0x14f878806d5000c, packet:: clientPath:null serverPath:null finished:false header:: 5,4 replyHeader:: 5,47244640504,0 request:: '/hbase/master,F response:: #ffffffff000146d61737465723a3630303030ffffffecffffffa5ffffffc86effffff96fffffffcffffffa24f50425546a14a76861646f6f703110ffffffe0ffffffd4318ffffff98ffffffc3ffffffa8ffffffbcfffffff829100,s{47244640263,47244640263,1441088021642,1441088021642,0,0,0,310615917360709632,53,0,47244640263} [org.apache.hadoop.ipc.RpcClient]Use SIMPLE authentication for service MasterService, sasl=false [org.apache.hadoop.ipc.RpcClient]Connecting to hadoop1/192.168.4.21:60000 [org.apache.hadoop.ipc.RpcClient]IPC Client (791957027) connection to hadoop1/192.168.4.21:60000 from hadoop: wrote request header call_id: 3 method_name: "IsMasterRunning" request_param: true 
[org.apache.hadoop.ipc.RpcClient]IPC Client (791957027) connection to hadoop1/192.168.4.21:60000 from hadoop: starting, connections 2 [org.apache.hadoop.ipc.RpcClient]IPC Client (791957027) connection to hadoop1/192.168.4.21:60000 from hadoop: got response header call_id: 3, totalSize: 6 bytes [org.apache.hadoop.hbase.client.HBaseAdmin]Started disable of user1 [org.apache.hadoop.ipc.RpcClient]IPC Client (791957027) connection to hadoop1/192.168.4.21:60000 from hadoop: wrote request header call_id: 4 method_name: "DisableTable" request_param: true [org.apache.hadoop.ipc.RpcClient]IPC Client (791957027) connection to hadoop1/192.168.4.21:60000 from hadoop: got response header call_id: 4, totalSize: 4 bytes [org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper]Process identifier=catalogtracker-on-hconnection-0x18e4d5ba connecting to ZooKeeper ensemble=spark4:2181,spark3:2181,spark2:2181,spark1:2181,spark5:2181 [org.apache.zookeeper.ZooKeeper]Initiating client connection, connectString=spark4:2181,spark3:2181,spark2:2181,spark1:2181,spark5:2181 sessionTimeout=90000 watcher=catalogtracker-on-hconnection-0x18e4d5ba0x0, quorum=spark4:2181,spark3:2181,spark2:2181,spark1:2181,spark5:2181, baseZNode=/hbase [org.apache.hadoop.hbase.catalog.CatalogTracker]Starting catalog tracker org.apache.hadoop.hbase.catalog.CatalogTracker@3c5d0d01 [org.apache.zookeeper.ClientCnxn]Opening socket connection to server spark3/192.168.4.33:2181. 
Will not attempt to authenticate using SASL (unknown error) [org.apache.zookeeper.ClientCnxn]Socket connection established to spark3/192.168.4.33:2181, initiating session [org.apache.zookeeper.ClientCnxn]Session establishment request sent on spark3/192.168.4.33:2181 [org.apache.zookeeper.ClientCnxn]Session establishment complete on server spark3/192.168.4.33:2181, sessionid = 0x34f8787fd3c0011, negotiated timeout = 40000 [org.apache.hadoop.hbase.zookeeper.ZooKeeperWatcher]catalogtracker-on-hconnection-0x18e4d5ba0x0, quorum=spark4:2181,spark3:2181,spark2:2181,spark1:2181,spark5:2181, baseZNode=/hbase Received ZooKeeper Event, type=None, state=SyncConnected, path=null [org.apache.hadoop.hbase.zookeeper.ZooKeeperWatcher]catalogtracker-on-hconnection-0x18e4d5ba-0x34f8787fd3c0011 connected [org.apache.zookeeper.ClientCnxn]Reading reply sessionid:0x34f8787fd3c0011, packet:: clientPath:null serverPath:null finished:false header:: 1,3 replyHeader:: 1,47244640509,0 request:: '/hbase/meta-region-server,T response:: s{47244640306,47244640306,1441088028291,1441088028291,0,0,0,0,61,0,47244640306} [org.apache.hadoop.hbase.zookeeper.ZKUtil]catalogtracker-on-hconnection-0x18e4d5ba-0x34f8787fd3c0011, quorum=spark4:2181,spark3:2181,spark2:2181,spark1:2181,spark5:2181, baseZNode=/hbase Set watcher on existing znode=/hbase/meta-region-server [org.apache.zookeeper.ClientCnxn]Reading reply sessionid:0x34f8787fd3c0011, packet:: clientPath:null serverPath:null finished:false header:: 2,4 replyHeader:: 2,47244640509,0 request:: '/hbase/meta-region-server,T response:: #ffffffff0001a726567696f6e7365727665723a3630303230ffffff86ffffffeeffffffdd7f19ffffff85ffffffda1050425546a14a76861646f6f703110fffffff4ffffffd4318ffffffd4ffffffc8ffffffa8ffffffbcfffffff829100183,s{47244640306,47244640306,1441088028291,1441088028291,0,0,0,0,61,0,47244640306} [org.apache.zookeeper.ClientCnxn]Reading reply sessionid:0x14f878806d5000c, packet:: clientPath:null serverPath:null finished:false header:: 6,4 
replyHeader:: 6,47244640509,0 request:: '/hbase/meta-region-server,F response:: #ffffffff0001a726567696f6e7365727665723a3630303230ffffff86ffffffeeffffffdd7f19ffffff85ffffffda1050425546a14a76861646f6f703110fffffff4ffffffd4318ffffffd4ffffffc8ffffffa8ffffffbcfffffff829100183,s{47244640306,47244640306,1441088028291,1441088028291,0,0,0,0,61,0,47244640306}

后面的 DEBUG 日志非常长,基本上是同一套动作的反复:客户端向 ZooKeeper 重新建立会话、扫描 meta 表、关闭会话,HBaseAdmin 按 100ms、200ms、300ms、500ms 的退避间隔等待 user1 表的所有 region 被禁用,禁用成功后再向 master(hadoop1:60000)发起 DeleteTable 和 CreateTable 请求。下面只摘录各阶段的关键日志,重复的 ZooKeeper 连接、Scan 日志以及十六进制 packet 内容从略:

[org.apache.hadoop.hbase.client.HBaseAdmin]Sleeping= 100ms, waiting for all regions to be disabled in user1
[org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper]Process identifier=catalogtracker-on-hconnection-0x18e4d5ba connecting to ZooKeeper ensemble=spark4:2181,spark3:2181,spark2:2181,spark1:2181,spark5:2181
[org.apache.zookeeper.ClientCnxn]Session establishment complete on server spark5/192.168.4.35:2181, sessionid = 0x54f8787fd180008, negotiated timeout = 40000
[org.apache.hadoop.ipc.RpcClient]IPC Client (791957027) connection to hadoop1/192.168.4.21:60020 from hadoop: wrote request header call_id: 8 method_name: "Scan" request_param: true
[org.apache.hadoop.hbase.catalog.CatalogTracker]Stopping catalog tracker org.apache.hadoop.hbase.catalog.CatalogTracker@2356cab0
[org.apache.zookeeper.ZooKeeper]Session: 0x54f8787fd180008 closed
[org.apache.hadoop.hbase.client.HBaseAdmin]Sleeping= 200ms, waiting for all regions to be disabled in user1
[org.apache.hadoop.hbase.client.HBaseAdmin]Sleeping= 300ms, waiting for all regions to be disabled in user1
[org.apache.hadoop.hbase.client.HBaseAdmin]Sleeping= 500ms, waiting for all regions to be disabled in user1
[org.apache.hadoop.hbase.client.HBaseAdmin]Disabled user1
[org.apache.hadoop.ipc.RpcClient]IPC Client (791957027) connection to hadoop1/192.168.4.21:60000 from hadoop: wrote request header call_id: 21 method_name: "DeleteTable" request_param: true
[org.apache.hadoop.hbase.client.HBaseAdmin]Deleted user1
[org.apache.hadoop.ipc.RpcClient]IPC Client (791957027) connection to hadoop1/192.168.4.21:60000 from hadoop: wrote request header call_id: 29 method_name: "CreateTable" request_param: true
[org.apache.hadoop.hbase.catalog.CatalogTracker]Starting catalog tracker org.apache.hadoop.hbase.catalog.CatalogTracker@20828fe4
[org.apache.zookeeper.ClientCnxn]Opening socket connection to server spark1/192.168.4.31:2181. Will not attempt to authenticate using SASL (unknown error) [org.apache.zookeeper.ClientCnxn]Socket connection established to spark1/192.168.4.31:2181, initiating session [org.apache.zookeeper.ClientCnxn]Session establishment request sent on spark1/192.168.4.31:2181 [org.apache.zookeeper.ClientCnxn]Session establishment complete on server spark1/192.168.4.31:2181, sessionid = 0x14f878806d5000d, negotiated timeout = 40000 [org.apache.hadoop.hbase.zookeeper.ZooKeeperWatcher]catalogtracker-on-hconnection-0x18e4d5ba0x0, quorum=spark4:2181,spark3:2181,spark2:2181,spark1:2181,spark5:2181, baseZNode=/hbase Received ZooKeeper Event, type=None, state=SyncConnected, path=null [org.apache.zookeeper.ClientCnxn]Reading reply sessionid:0x14f878806d5000d, packet:: clientPath:null serverPath:null finished:false header:: 1,3 replyHeader:: 1,47244640538,0 request:: '/hbase/meta-region-server,T response:: s{47244640306,47244640306,1441088028291,1441088028291,0,0,0,0,61,0,47244640306} [org.apache.hadoop.hbase.zookeeper.ZKUtil]catalogtracker-on-hconnection-0x18e4d5ba0x0, quorum=spark4:2181,spark3:2181,spark2:2181,spark1:2181,spark5:2181, baseZNode=/hbase Set watcher on existing znode=/hbase/meta-region-server [org.apache.hadoop.hbase.zookeeper.ZooKeeperWatcher]catalogtracker-on-hconnection-0x18e4d5ba-0x14f878806d5000d connected [org.apache.zookeeper.ClientCnxn]Reading reply sessionid:0x14f878806d5000d, packet:: clientPath:null serverPath:null finished:false header:: 2,4 replyHeader:: 2,47244640538,0 request:: '/hbase/meta-region-server,T response:: #ffffffff0001a726567696f6e7365727665723a3630303230ffffff86ffffffeeffffffdd7f19ffffff85ffffffda1050425546a14a76861646f6f703110fffffff4ffffffd4318ffffffd4ffffffc8ffffffa8ffffffbcfffffff829100183,s{47244640306,47244640306,1441088028291,1441088028291,0,0,0,0,61,0,47244640306} [org.apache.zookeeper.ClientCnxn]Reading reply sessionid:0x14f878806d5000c, 
packet:: clientPath:null serverPath:null finished:false header:: 24,4 replyHeader:: 24,47244640538,0 request:: '/hbase/meta-region-server,F response:: #ffffffff0001a726567696f6e7365727665723a3630303230ffffff86ffffffeeffffffdd7f19ffffff85ffffffda1050425546a14a76861646f6f703110fffffff4ffffffd4318ffffffd4ffffffc8ffffffa8ffffffbcfffffff829100183,s{47244640306,47244640306,1441088028291,1441088028291,0,0,0,0,61,0,47244640306} [org.apache.hadoop.ipc.RpcClient]IPC Client (791957027) connection to hadoop1/192.168.4.21:60020 from hadoop: wrote request header call_id: 48 method_name: "Scan" request_param: true [org.apache.hadoop.ipc.RpcClient]IPC Client (791957027) connection to hadoop1/192.168.4.21:60020 from hadoop: got response header call_id: 48, totalSize: 13 bytes [org.apache.hadoop.ipc.RpcClient]IPC Client (791957027) connection to hadoop1/192.168.4.21:60020 from hadoop: wrote request header call_id: 49 method_name: "Scan" request_param: true priority: 100 [org.apache.hadoop.ipc.RpcClient]IPC Client (791957027) connection to hadoop1/192.168.4.21:60020 from hadoop: got response header call_id: 49 cell_block_meta { length: 877 }, totalSize: 901 bytes [org.apache.hadoop.ipc.RpcClient]IPC Client (791957027) connection to hadoop1/192.168.4.21:60020 from hadoop: wrote request header call_id: 50 method_name: "Scan" request_param: true [org.apache.hadoop.ipc.RpcClient]IPC Client (791957027) connection to hadoop1/192.168.4.21:60020 from hadoop: got response header call_id: 50, totalSize: 9 bytes [org.apache.hadoop.hbase.catalog.CatalogTracker]Stopping catalog tracker org.apache.hadoop.hbase.catalog.CatalogTracker@20828fe4 [org.apache.zookeeper.ZooKeeper]Closing session: 0x14f878806d5000d [org.apache.zookeeper.ClientCnxn]Closing client for session: 0x14f878806d5000d [org.apache.zookeeper.ClientCnxn]Reading reply sessionid:0x14f878806d5000d, packet:: clientPath:null serverPath:null finished:false header:: 3,-11 replyHeader:: 3,47244640539,0 request:: null response:: null 
[org.apache.zookeeper.ClientCnxn]Disconnecting client for session: 0x14f878806d5000d [org.apache.zookeeper.ZooKeeper]Session: 0x14f878806d5000d closed [org.apache.zookeeper.ClientCnxn]EventThread shut down [org.apache.zookeeper.ClientCnxn]Reading reply sessionid:0x14f878806d5000c, packet:: clientPath:null serverPath:null finished:false header:: 25,4 replyHeader:: 25,47244640539,0 request:: '/hbase/table/user1,F response:: #ffffffff000146d61737465723a36303030304b2fffffffc42632b4fffffffa75042554680,s{47244640530,47244640533,1441093183046,1441093184242,2,0,0,0,31,0,47244640530} [com.hua.hbase.test.TestHbase]Create table success! [org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation]Closing master protocol: MasterService [org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation]Closing zookeeper sessionid=0x14f878806d5000c [org.apache.zookeeper.ZooKeeper]Closing session: 0x14f878806d5000c [org.apache.zookeeper.ClientCnxn]Closing client for session: 0x14f878806d5000c [org.apache.zookeeper.ClientCnxn]Reading reply sessionid:0x14f878806d5000c, packet:: clientPath:null serverPath:null finished:false header:: 26,-11 replyHeader:: 26,47244640540,0 request:: null response:: null [org.apache.zookeeper.ClientCnxn]Disconnecting client for session: 0x14f878806d5000c [org.apache.zookeeper.ClientCnxn]An exception was thrown while closing send thread for session 0x14f878806d5000c : Unable to read additional data from server sessionid 0x14f878806d5000c, likely server has closed socket [org.apache.zookeeper.ClientCnxn]EventThread shut down [org.apache.zookeeper.ZooKeeper]Session: 0x14f878806d5000c closed [org.apache.hadoop.ipc.RpcClient]Stopping rpc client [org.apache.hadoop.ipc.RpcClient]IPC Client (791957027) connection to hadoop1/192.168.4.21:60000 from hadoop: closed [org.apache.hadoop.ipc.RpcClient]IPC Client (791957027) connection to hadoop1/192.168.4.21:60020 from hadoop: closed [org.apache.hadoop.ipc.RpcClient]IPC Client (791957027) 
connection to hadoop1/192.168.4.21:60020 from hadoop: stopped, connections 0 [org.apache.hadoop.ipc.RpcClient]IPC Client (791957027) connection to hadoop1/192.168.4.21:60000 from hadoop: stopped, connections 0 [org.apache.hadoop.ipc.Client]Stopping client
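The log shows the expected client flow: connect to the ZooKeeper quorum, look up /hbase/meta-region-server, scan hbase:meta over RPC, and finally log "Create table success!" after the /hbase/table/user1 znode appears. The original TestHbase.java is not reproduced here; the following is only a minimal sketch of what a table-creation test against the HBase 0.98 client API could look like. The table name user1 comes from the znode in the log; the column family "info" is an illustrative assumption. It relies on hbase-site.xml from src/main/resources being on the classpath and on a reachable cluster.

```java
package com.hua.hbase.test;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.junit.Test;

public class TestHbase {

    @Test
    public void testCreateTable() throws Exception {
        // Picks up hbase-site.xml from the classpath, so the zookeeper
        // quorum and client port do not need to be set in code.
        Configuration conf = HBaseConfiguration.create();

        // HBaseAdmin(Configuration) is the 0.98-era admin API
        // (replaced by Connection/Admin in HBase 1.x).
        HBaseAdmin admin = new HBaseAdmin(conf);
        try {
            TableName tableName = TableName.valueOf("user1");
            if (!admin.tableExists(tableName)) {
                HTableDescriptor desc = new HTableDescriptor(tableName);
                // "info" is a hypothetical column-family name, not from the post
                desc.addFamily(new HColumnDescriptor("info"));
                admin.createTable(desc);
            }
            System.out.println("Create table success!");
        } finally {
            admin.close();
        }
    }
}
```

Run it with `mvn test` from the project root; if the hosts file entries for the hbase and zookeeper machines are missing, the client will hang resolving region server hostnames, which is why the hosts configuration step above matters.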