Failed
org.dataone.service.cn.replication.v2.ReplicationManagerTestUnit.testCreateAndQueueTasks (from org.dataone.service.cn.replication.v2.ReplicationManagerSuiteTest)
Failing for the past 1 build (since #24)
Error Message
The number of tasks created should equal the replication policy numberOfReplicas. expected:<2> but was:<0>
Stacktrace
java.lang.AssertionError: The number of tasks created should equal the replication policy numberOfReplicas. expected:<2> but was:<0>
    at org.junit.Assert.fail(Assert.java:91)
    at org.junit.Assert.failNotEquals(Assert.java:645)
    at org.junit.Assert.assertEquals(Assert.java:126)
    at org.junit.Assert.assertEquals(Assert.java:470)
    at org.dataone.service.cn.replication.v2.ReplicationManagerTestUnit.testCreateAndQueueTasks(ReplicationManagerTestUnit.java:189)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
    at org.springframework.test.context.junit4.statements.RunBeforeTestMethodCallbacks.evaluate(RunBeforeTestMethodCallbacks.java:74)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
    at org.springframework.test.context.junit4.statements.RunAfterTestMethodCallbacks.evaluate(RunAfterTestMethodCallbacks.java:83)
    at org.springframework.test.context.junit4.statements.SpringRepeat.evaluate(SpringRepeat.java:72)
    at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:231)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
    at org.springframework.test.context.junit4.statements.RunBeforeTestClassCallbacks.evaluate(RunBeforeTestClassCallbacks.java:61)
    at org.springframework.test.context.junit4.statements.RunAfterTestClassCallbacks.evaluate(RunAfterTestClassCallbacks.java:71)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
    at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.run(SpringJUnit4ClassRunner.java:174)
    at org.junit.runners.Suite.runChild(Suite.java:128)
    at org.junit.runners.Suite.runChild(Suite.java:24)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
    at org.dataone.test.apache.directory.server.integ.ApacheDSSuiteRunner.run(ApacheDSSuiteRunner.java:208)
    at org.apache.maven.surefire.junit4.JUnit4TestSet.execute(JUnit4TestSet.java:53)
    at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:123)
    at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:104)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.maven.surefire.util.ReflectionUtils.invokeMethodWithArray(ReflectionUtils.java:164)
    at org.apache.maven.surefire.booter.ProviderFactory$ProviderProxy.invoke(ProviderFactory.java:110)
    at org.apache.maven.surefire.booter.SurefireStarter.invokeProvider(SurefireStarter.java:175)
    at org.apache.maven.surefire.booter.SurefireStarter.runSuitesInProcessWhenForked(SurefireStarter.java:107)
    at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:68)
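The frame at ReplicationManagerTestUnit.java:189 is a JUnit 4 assertEquals comparing the replication policy's numberOfReplicas against the number of tasks the manager actually queued. The test source is not included in this report, so the sketch below is a hypothetical reconstruction of that comparison (the class and method names are assumed, not taken from the report); it only shows how JUnit 4 builds the "expected:<2> but was:<0>" message seen above.

```java
// Hypothetical sketch of the check that fails at ReplicationManagerTestUnit.java:189.
// Names here (ReplicaTaskCountCheck, check) are illustrative, not from DataONE code.
public class ReplicaTaskCountCheck {

    // Mirrors JUnit 4's Assert.assertEquals(message, expected, actual) failure format.
    static String check(int numberOfReplicas, int tasksCreated) {
        if (tasksCreated == numberOfReplicas) {
            return "ok";
        }
        return "The number of tasks created should equal the replication policy "
                + "numberOfReplicas. expected:<" + numberOfReplicas
                + "> but was:<" + tasksCreated + ">";
    }

    public static void main(String[] args) {
        // The policy requested 2 replicas, but no replication tasks were created,
        // producing exactly the failure message reported above.
        System.out.println(check(2, 0));
    }
}
```

Since the assertion compares against the policy value rather than a literal, a result of 0 suggests task creation was skipped entirely rather than partially completed.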
Standard Output
20170216-20:22:59: [INFO]: @TestExecutionListeners is not present for class [class org.dataone.service.cn.replication.v2.ReplicationManagerTestUnit]: using defaults. [org.springframework.test.context.TestContextManager]
20170216-20:22:59: [INFO]: Registered pre-bundled control factory: 1.3.6.1.4.1.18060.0.0.1 [org.apache.directory.api.ldap.codec.osgi.DefaultLdapCodecService]
20170216-20:22:59: [INFO]: Registered pre-bundled control factory: 2.16.840.1.113730.3.4.7 [org.apache.directory.api.ldap.codec.osgi.DefaultLdapCodecService]
20170216-20:22:59: [INFO]: Registered pre-bundled control factory: 2.16.840.1.113730.3.4.2 [org.apache.directory.api.ldap.codec.osgi.DefaultLdapCodecService]
20170216-20:22:59: [INFO]: Registered pre-bundled control factory: 2.16.840.1.113730.3.4.18 [org.apache.directory.api.ldap.codec.osgi.DefaultLdapCodecService]
20170216-20:22:59: [INFO]: Registered pre-bundled control factory: 1.2.840.113556.1.4.319 [org.apache.directory.api.ldap.codec.osgi.DefaultLdapCodecService]
20170216-20:22:59: [INFO]: Registered pre-bundled control factory: 2.16.840.1.113730.3.4.3 [org.apache.directory.api.ldap.codec.osgi.DefaultLdapCodecService]
20170216-20:22:59: [INFO]: Registered pre-bundled control factory: 1.3.6.1.4.1.4203.1.10.1 [org.apache.directory.api.ldap.codec.osgi.DefaultLdapCodecService]
20170216-20:22:59: [INFO]: Registered pre-bundled control factory: 1.3.6.1.4.1.18060.0.0.1 [org.apache.directory.api.ldap.codec.standalone.CodecFactoryUtil]
20170216-20:22:59: [INFO]: Registered pre-bundled control factory: 2.16.840.1.113730.3.4.7 [org.apache.directory.api.ldap.codec.standalone.CodecFactoryUtil]
20170216-20:22:59: [INFO]: Registered pre-bundled control factory: 2.16.840.1.113730.3.4.2 [org.apache.directory.api.ldap.codec.standalone.CodecFactoryUtil]
20170216-20:22:59: [INFO]: Registered pre-bundled control factory: 2.16.840.1.113730.3.4.18 [org.apache.directory.api.ldap.codec.standalone.CodecFactoryUtil]
20170216-20:22:59: [INFO]: Registered pre-bundled control factory: 1.2.840.113556.1.4.319 [org.apache.directory.api.ldap.codec.standalone.CodecFactoryUtil]
20170216-20:22:59: [INFO]: Registered pre-bundled control factory: 2.16.840.1.113730.3.4.3 [org.apache.directory.api.ldap.codec.standalone.CodecFactoryUtil]
20170216-20:22:59: [INFO]: Registered pre-bundled control factory: 1.3.6.1.4.1.4203.1.10.1 [org.apache.directory.api.ldap.codec.standalone.CodecFactoryUtil]
20170216-20:22:59: [INFO]: Registered pre-bundled control factory: 1.3.6.1.4.1.42.2.27.8.5.1 [org.apache.directory.api.ldap.codec.standalone.CodecFactoryUtil]
20170216-20:22:59: [INFO]: Registered pre-bundled control factory: 2.16.840.1.113730.3.4.9 [org.apache.directory.api.ldap.codec.standalone.CodecFactoryUtil]
20170216-20:22:59: [INFO]: Registered pre-bundled control factory: 2.16.840.1.113730.3.4.10 [org.apache.directory.api.ldap.codec.standalone.CodecFactoryUtil]
20170216-20:22:59: [INFO]: Registered pre-bundled control factory: 1.3.6.1.4.1.4203.1.9.1.3 [org.apache.directory.api.ldap.codec.standalone.CodecFactoryUtil]
20170216-20:22:59: [INFO]: Registered pre-bundled control factory: 1.3.6.1.4.1.4203.1.9.1.4 [org.apache.directory.api.ldap.codec.standalone.CodecFactoryUtil]
20170216-20:22:59: [INFO]: Registered pre-bundled control factory: 1.3.6.1.4.1.4203.1.9.1.1 [org.apache.directory.api.ldap.codec.standalone.CodecFactoryUtil]
20170216-20:22:59: [INFO]: Registered pre-bundled control factory: 1.3.6.1.4.1.4203.1.9.1.2 [org.apache.directory.api.ldap.codec.standalone.CodecFactoryUtil]
20170216-20:22:59: [INFO]: Registered pre-bundled control factory: 1.2.840.113556.1.4.473 [org.apache.directory.api.ldap.codec.standalone.CodecFactoryUtil]
20170216-20:22:59: [INFO]: Registered pre-bundled control factory: 1.2.840.113556.1.4.474 [org.apache.directory.api.ldap.codec.standalone.CodecFactoryUtil]
20170216-20:22:59: [INFO]: Registered pre-bundled control factory: 1.2.840.113556.1.4.841 [org.apache.directory.api.ldap.codec.standalone.CodecFactoryUtil]
20170216-20:22:59: [INFO]: Registered pre-bundled control factory: 1.2.840.113556.1.4.417 [org.apache.directory.api.ldap.codec.standalone.CodecFactoryUtil]
20170216-20:22:59: [INFO]: Registered pre-bundled extended operation factory: 1.3.6.1.1.8 [org.apache.directory.api.ldap.codec.standalone.CodecFactoryUtil]
20170216-20:22:59: [INFO]: Registered pre-bundled extended operation factory: 1.3.6.1.4.1.18060.0.1.8 [org.apache.directory.api.ldap.codec.standalone.CodecFactoryUtil]
20170216-20:22:59: [INFO]: Registered pre-bundled extended operation factory: 1.3.6.1.4.1.18060.0.1.3 [org.apache.directory.api.ldap.codec.standalone.CodecFactoryUtil]
20170216-20:22:59: [INFO]: Registered pre-bundled extended operation factory: 1.3.6.1.4.1.18060.0.1.6 [org.apache.directory.api.ldap.codec.standalone.CodecFactoryUtil]
20170216-20:22:59: [INFO]: Registered pre-bundled extended operation factory: 1.3.6.1.4.1.18060.0.1.5 [org.apache.directory.api.ldap.codec.standalone.CodecFactoryUtil]
20170216-20:22:59: [INFO]: Registered pre-bundled extended operation factory: 1.3.6.1.4.1.4203.1.11.1 [org.apache.directory.api.ldap.codec.standalone.CodecFactoryUtil]
20170216-20:22:59: [INFO]: Registered pre-bundled extended operation factory: 1.3.6.1.4.1.4203.1.11.3 [org.apache.directory.api.ldap.codec.standalone.CodecFactoryUtil]
20170216-20:22:59: [INFO]: Registered pre-bundled extended operation factory: 1.3.6.1.4.1.1466.20037 [org.apache.directory.api.ldap.codec.standalone.CodecFactoryUtil]
20170216-20:23:00: [INFO]: no custom cache configuration was set, loading the default cache configuration [org.apache.directory.server.core.api.CacheService]
20170216-20:23:00: [INFO]: Schema directory '/tmp/server-work-org/partitions/schema' does NOT exist: extracted state set to false. [org.apache.directory.api.ldap.schema.extractor.impl.DefaultSchemaLdifExtractor]
20170216-20:23:01: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:01: [INFO]: Loading system enabled schema: Schema Name: system Disabled: false Owner: uid=admin,ou=system Dependencies: [] [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: system schema has already been loaded [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: Loading apachemeta enabled schema: Schema Name: apachemeta Disabled: false Owner: uid=admin,ou=system Dependencies: [system] [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: system schema has already been loaded [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: system schema has already been loaded [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: Loading core enabled schema: Schema Name: core Disabled: false Owner: uid=admin,ou=system Dependencies: [system] [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: Loading apache enabled schema: Schema Name: apache Disabled: false Owner: uid=admin,ou=system Dependencies: [system, core] [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: core schema has already been loaded [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: Loading other enabled schema: Schema Name: other Disabled: false Owner: uid=admin,ou=system Dependencies: [system, apachemeta, apache, core] [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: system schema has already been loaded [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: core schema has already been loaded [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: Loading cosine enabled schema: Schema Name: cosine Disabled: false Owner: uid=admin,ou=system Dependencies: [system, core] [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: system schema has already been loaded [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: core schema has already been loaded [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: Loading collective enabled schema: Schema Name: collective Disabled: false Owner: uid=admin,ou=system Dependencies: [system, core] [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: system schema has already been loaded [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: cosine schema has already been loaded [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: core schema has already been loaded [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: Loading inetorgperson enabled schema: Schema Name: inetorgperson Disabled: false Owner: uid=admin,ou=system Dependencies: [system, cosine, core] [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: apache schema has already been loaded [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: system schema has already been loaded [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: system schema has already been loaded [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: Loading pwdpolicy enabled schema: Schema Name: pwdpolicy Disabled: false Owner: uid=admin,ou=system Dependencies: [system] [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: system schema has already been loaded [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: core schema has already been loaded [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: Loading krb5kdc enabled schema: Schema Name: krb5kdc Disabled: false Owner: uid=admin,ou=system Dependencies: [system, core] [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: system schema has already been loaded [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: core schema has already been loaded [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: Loading java enabled schema: Schema Name: java Disabled: false Owner: uid=admin,ou=system Dependencies: [system, core] [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: core schema has already been loaded [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: apachemeta schema has already been loaded [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: system schema has already been loaded [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: core schema has already been loaded [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: apache schema has already been loaded [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: Loading adsconfig enabled schema: Schema Name: adsconfig Disabled: false Owner: uid=admin,ou=system Dependencies: [system, core, apache] [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:01: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:01: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:01: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:01: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:01: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:01: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:01: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:01: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:01: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:01: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:01: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:02: [WARN]: ApacheDS shutdown hook has NOT been registered with the runtime. This default setting for standalone operation has been overriden. [org.apache.directory.server.core.DefaultDirectoryService]
20170216-20:23:02: [INFO]: fetching the cache named dnCache [org.apache.directory.server.core.api.CacheService]
20170216-20:23:02: [INFO]: fetching the cache named alias [org.apache.directory.server.core.api.CacheService]
20170216-20:23:02: [INFO]: No cache with name alias exists, creating one [org.apache.directory.server.core.api.CacheService]
20170216-20:23:02: [INFO]: fetching the cache named piar [org.apache.directory.server.core.api.CacheService]
20170216-20:23:02: [INFO]: No cache with name piar exists, creating one [org.apache.directory.server.core.api.CacheService]
20170216-20:23:02: [INFO]: fetching the cache named entryDn [org.apache.directory.server.core.api.CacheService]
20170216-20:23:02: [INFO]: No cache with name entryDn exists, creating one [org.apache.directory.server.core.api.CacheService]
20170216-20:23:02: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: corbaIor [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: corbaObject [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: corbaRepositoryId [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: automountInformation [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaShareName [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaDomainName [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaNTPassword [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaTrustFlags [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaPwdLastSet [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaSID [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: gidNumber [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaGroupType [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaSID [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaSIDList [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: gidNumber [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: uidNumber [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaSID [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: gidNumber [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: uidNumber [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaDomainName [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaSID [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaAlgorithmicRidBase [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaForceLogoff [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaLockoutDuration [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaLockoutObservationWindow [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaLockoutThreshold [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaLogonToChgPwd [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaMaxPwdAge [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaMinPwdAge [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaMinPwdLength [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaNextGroupRid [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaNextRid [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaNextUserRid [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaPwdHistoryLength [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaRefuseMachinePwdChange [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaSID [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaOptionName [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaBoolOption [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaIntegerOption [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaStringListOption [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaStringOption [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaSID [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaPrivilegeList [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaSID [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaAcctFlags [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaBadPasswordCount [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaBadPasswordTime [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaDomainName [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaHomeDrive [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaHomePath [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaKickoffTime [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaLMPassword [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaLogoffTime [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaLogonHours [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaLogonScript [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaLogonTime [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaMungedDial [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaNTPassword [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaPasswordHistory [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaPrimaryGroupSID [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaProfilePath [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaPwdCanChange [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaPwdLastSet [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaPwdMustChange [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: sambaUserWorkstations [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:02: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: nisPublicKey [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: nisSecretKey [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: uidNumber [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: nisDomain [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:02: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:02: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle
NameAndOptionalUID normalization with objects of class: caseExactIA5SubstringsMatch [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader] 20170216-20:23:02: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: caseExactIA5SubstringsMatch [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: xmozillanickname [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: xmozillausehtmlmail [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: mozillaSecondEmail [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: mozillaPostalAddress2 [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: mozillaHomePostalAddress2 
[org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: mozillaHomeLocalityName [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: mozillaHomeState [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: mozillaHomePostalCode [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: mozillaHomeCountryName [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: mozillaHomeFriendlyCountryName [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: homeurl [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: workurl [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize 
the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: custom1 [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: custom2 [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: custom3 [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: custom4 [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: nsAIMid [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: nisMapName [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: gidNumber [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: memberUid [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: 
ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: shadowLastChange [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: shadowMin [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: shadowMax [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: shadowWarning [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: shadowInactive [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: shadowExpire [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: shadowFlag [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: macAddress 
[org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: nisNetgroupTriple [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: memberNisNetgroup [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: uidNumber [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: gidNumber [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: homeDirectory [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: loginShell [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: gecos [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to 
handle NameAndOptionalUID normalization with objects of class: ipNetworkNumber [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: ipNetmaskNumber [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: nisMapEntry [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: nisMapName [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: oncRpcNumber [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: bootFile [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: bootParameter [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: ipProtocolNumber [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: 
ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: ipHostNumber [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: ipServicePort [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: ipServiceProtocol [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:02: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: caseExactIA5SubstringsMatch [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: caseExactIA5SubstringsMatch [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: caseExactIA5SubstringsMatch [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: apacheDnsSoaMName [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle 
NameAndOptionalUID normalization with objects of class: apacheDnsSoaRName [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: apacheDnsSoaMinimum [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: apacheDnsAbstractRecord [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: apacheDnsClass [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: apacheDnsSoaSerial [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: apacheDnsSoaRefresh [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: apacheDnsSoaRetry [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: apacheDnsSoaExpire [org.apache.directory.api.ldap.model.entry.AbstractValue] 
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: apacheDnsDomainName [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: apacheDnsAbstractRecord [org.apache.directory.api.ldap.model.entry.AbstractValue] 20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID nor ...[truncated 36835 chars]... ager... [org.dataone.client.utils.HttpConnectionMonitorService] 20170216-20:23:31: [INFO]: RestClient.doRequestNoBody, thread(1) call Info: GET https://cn-dev-ucsb-1.test.dataone.org/cn/v2/node [org.dataone.client.rest.RestClient] 20170216-20:23:31: [INFO]: response httpCode: 200 [org.dataone.service.util.ExceptionHandler] 20170216-20:23:31: [INFO]: Refreshing org.springframework.context.annotation.AnnotationConfigApplicationContext@712344d8: startup date [Thu Feb 16 20:23:31 UTC 2017]; root of context hierarchy [org.springframework.context.annotation.AnnotationConfigApplicationContext] 20170216-20:23:31: [INFO]: Overriding bean definition for bean 'replicationTaskRepository': replacing [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] with [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; 
destroyMethodName=null] [org.springframework.beans.factory.support.DefaultListableBeanFactory] 20170216-20:23:31: [INFO]: Overriding bean definition for bean 'replicationAttemptHistoryRepository': replacing [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] with [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] [org.springframework.beans.factory.support.DefaultListableBeanFactory] 20170216-20:23:31: [INFO]: Overriding bean definition for bean 'dataSource': replacing [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationPostgresRepositoryFactory; factoryMethodName=dataSource; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [org/dataone/cn/data/repository/ReplicationPostgresRepositoryFactory.class]] with [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationH2RepositoryFactory; factoryMethodName=dataSource; initMethodName=null; destroyMethodName=(inferred); defined in class org.dataone.cn.data.repository.ReplicationH2RepositoryFactory] [org.springframework.beans.factory.support.DefaultListableBeanFactory] 20170216-20:23:31: [INFO]: Overriding bean definition for bean 'jpaVendorAdapter': replacing [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; 
primary=false; factoryBeanName=replicationPostgresRepositoryFactory; factoryMethodName=jpaVendorAdapter; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [org/dataone/cn/data/repository/ReplicationPostgresRepositoryFactory.class]] with [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationH2RepositoryFactory; factoryMethodName=jpaVendorAdapter; initMethodName=null; destroyMethodName=(inferred); defined in class org.dataone.cn.data.repository.ReplicationH2RepositoryFactory] [org.springframework.beans.factory.support.DefaultListableBeanFactory] 20170216-20:23:31: [INFO]: Creating embedded database 'testdb' [org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseFactory] 20170216-20:23:31: [INFO]: Building JPA container EntityManagerFactory for persistence unit 'default' [org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean] 20170216-20:23:31: [INFO]: Processing PersistenceUnitInfo [ name: default ...] [org.hibernate.ejb.Ejb3Configuration] 20170216-20:23:31: [INFO]: Binding entity from annotated class: org.dataone.cn.data.repository.ReplicationTask [org.hibernate.cfg.AnnotationBinder] 20170216-20:23:31: [INFO]: Bind entity org.dataone.cn.data.repository.ReplicationTask on table replication_task_queue [org.hibernate.cfg.annotations.EntityBinder] 20170216-20:23:31: [INFO]: Binding entity from annotated class: org.dataone.cn.data.repository.ReplicationAttemptHistory [org.hibernate.cfg.AnnotationBinder] 20170216-20:23:31: [INFO]: Bind entity org.dataone.cn.data.repository.ReplicationAttemptHistory on table replication_try_history [org.hibernate.cfg.annotations.EntityBinder] 20170216-20:23:31: [INFO]: Hibernate Validator not found: ignoring [org.hibernate.cfg.Configuration] 20170216-20:23:31: [INFO]: Unable to find org.hibernate.search.event.FullTextIndexEventListener on the classpath. 
Hibernate Search is not enabled. [org.hibernate.cfg.search.HibernateSearchEventListenerRegister] 20170216-20:23:31: [INFO]: Initializing connection provider: org.hibernate.ejb.connection.InjectedDataSourceConnectionProvider [org.hibernate.connection.ConnectionProviderFactory] 20170216-20:23:31: [INFO]: Using provided datasource [org.hibernate.ejb.connection.InjectedDataSourceConnectionProvider] 20170216-20:23:31: [INFO]: Using dialect: org.hibernate.dialect.H2Dialect [org.hibernate.dialect.Dialect] 20170216-20:23:31: [INFO]: Disabling contextual LOB creation as JDBC driver reported JDBC version [3] less than 4 [org.hibernate.engine.jdbc.JdbcSupportLoader] 20170216-20:23:31: [INFO]: Database -> name : H2 version : 1.3.163 (2011-12-30) major : 1 minor : 3 [org.hibernate.cfg.SettingsFactory] 20170216-20:23:31: [INFO]: Driver -> name : H2 JDBC Driver version : 1.3.163 (2011-12-30) major : 1 minor : 3 [org.hibernate.cfg.SettingsFactory] 20170216-20:23:31: [INFO]: Using default transaction strategy (direct JDBC transactions) [org.hibernate.transaction.TransactionFactoryFactory] 20170216-20:23:31: [INFO]: No TransactionManagerLookup configured (in JTA environment, use of read-write or transactional second-level cache is not recommended) [org.hibernate.transaction.TransactionManagerLookupFactory] 20170216-20:23:31: [INFO]: Automatic flush during beforeCompletion(): disabled [org.hibernate.cfg.SettingsFactory] 20170216-20:23:31: [INFO]: Automatic session close at end of transaction: disabled [org.hibernate.cfg.SettingsFactory] 20170216-20:23:31: [INFO]: JDBC batch size: 15 [org.hibernate.cfg.SettingsFactory] 20170216-20:23:31: [INFO]: JDBC batch updates for versioned data: disabled [org.hibernate.cfg.SettingsFactory] 20170216-20:23:31: [INFO]: Scrollable result sets: enabled [org.hibernate.cfg.SettingsFactory] 20170216-20:23:31: [INFO]: JDBC3 getGeneratedKeys(): enabled [org.hibernate.cfg.SettingsFactory] 20170216-20:23:31: [INFO]: Connection release mode: auto 
[org.hibernate.cfg.SettingsFactory] 20170216-20:23:31: [INFO]: Default batch fetch size: 1 [org.hibernate.cfg.SettingsFactory] 20170216-20:23:31: [INFO]: Generate SQL with comments: disabled [org.hibernate.cfg.SettingsFactory] 20170216-20:23:31: [INFO]: Order SQL updates by primary key: disabled [org.hibernate.cfg.SettingsFactory] 20170216-20:23:31: [INFO]: Order SQL inserts for batching: disabled [org.hibernate.cfg.SettingsFactory] 20170216-20:23:31: [INFO]: Query translator: org.hibernate.hql.ast.ASTQueryTranslatorFactory [org.hibernate.cfg.SettingsFactory] 20170216-20:23:31: [INFO]: Using ASTQueryTranslatorFactory [org.hibernate.hql.ast.ASTQueryTranslatorFactory] 20170216-20:23:31: [INFO]: Query language substitutions: {} [org.hibernate.cfg.SettingsFactory] 20170216-20:23:31: [INFO]: JPA-QL strict compliance: enabled [org.hibernate.cfg.SettingsFactory] 20170216-20:23:31: [INFO]: Second-level cache: enabled [org.hibernate.cfg.SettingsFactory] 20170216-20:23:31: [INFO]: Query cache: disabled [org.hibernate.cfg.SettingsFactory] 20170216-20:23:31: [INFO]: Cache region factory : org.hibernate.cache.impl.NoCachingRegionFactory [org.hibernate.cfg.SettingsFactory] 20170216-20:23:31: [INFO]: Optimize cache for minimal puts: disabled [org.hibernate.cfg.SettingsFactory] 20170216-20:23:31: [INFO]: Structured second-level cache entries: disabled [org.hibernate.cfg.SettingsFactory] 20170216-20:23:31: [INFO]: Statistics: disabled [org.hibernate.cfg.SettingsFactory] 20170216-20:23:31: [INFO]: Deleted entity synthetic identifier rollback: disabled [org.hibernate.cfg.SettingsFactory] 20170216-20:23:31: [INFO]: Default entity-mode: pojo [org.hibernate.cfg.SettingsFactory] 20170216-20:23:31: [INFO]: Named query checking : enabled [org.hibernate.cfg.SettingsFactory] 20170216-20:23:31: [INFO]: Check Nullability in Core (should be disabled when Bean Validation is on): enabled [org.hibernate.cfg.SettingsFactory] 20170216-20:23:31: [INFO]: building session factory 
[org.hibernate.impl.SessionFactoryImpl]
20170216-20:23:31: [INFO]: Type registration [materialized_blob] overrides previous : org.hibernate.type.MaterializedBlobType@45a98cca [org.hibernate.type.BasicTypeRegistry]
20170216-20:23:31: [INFO]: Type registration [wrapper_characters_clob] overrides previous : org.hibernate.type.CharacterArrayClobType@7e5a4580 [org.hibernate.type.BasicTypeRegistry]
20170216-20:23:31: [INFO]: Type registration [wrapper_materialized_blob] overrides previous : org.hibernate.type.WrappedMaterializedBlobType@5889174e [org.hibernate.type.BasicTypeRegistry]
20170216-20:23:31: [INFO]: Type registration [clob] overrides previous : org.hibernate.type.ClobType@10592f4b [org.hibernate.type.BasicTypeRegistry]
20170216-20:23:31: [INFO]: Type registration [java.sql.Clob] overrides previous : org.hibernate.type.ClobType@10592f4b [org.hibernate.type.BasicTypeRegistry]
20170216-20:23:31: [INFO]: Type registration [blob] overrides previous : org.hibernate.type.BlobType@4f2fed4f [org.hibernate.type.BasicTypeRegistry]
20170216-20:23:31: [INFO]: Type registration [java.sql.Blob] overrides previous : org.hibernate.type.BlobType@4f2fed4f [org.hibernate.type.BasicTypeRegistry]
20170216-20:23:31: [INFO]: Type registration [materialized_clob] overrides previous : org.hibernate.type.MaterializedClobType@53850626 [org.hibernate.type.BasicTypeRegistry]
20170216-20:23:31: [INFO]: Type registration [characters_clob] overrides previous : org.hibernate.type.PrimitiveCharacterArrayClobType@4256d3a0 [org.hibernate.type.BasicTypeRegistry]
20170216-20:23:31: [INFO]: Not binding factory to JNDI, no JNDI name configured [org.hibernate.impl.SessionFactoryObjectFactory]
20170216-20:23:31: [INFO]: Running hbm2ddl schema update [org.hibernate.tool.hbm2ddl.SchemaUpdate]
20170216-20:23:31: [INFO]: fetching database metadata [org.hibernate.tool.hbm2ddl.SchemaUpdate]
20170216-20:23:31: [INFO]: updating schema [org.hibernate.tool.hbm2ddl.SchemaUpdate]
20170216-20:23:31: [INFO]: table found: TESTDB.PUBLIC.REPLICATION_TASK_QUEUE [org.hibernate.tool.hbm2ddl.TableMetadata]
20170216-20:23:31: [INFO]: columns: [id, nextexecution, status, pid, trycount] [org.hibernate.tool.hbm2ddl.TableMetadata]
20170216-20:23:31: [INFO]: foreign keys: [] [org.hibernate.tool.hbm2ddl.TableMetadata]
20170216-20:23:31: [INFO]: indexes: [index_pid_task, index_exec_task, primary_key_1] [org.hibernate.tool.hbm2ddl.TableMetadata]
20170216-20:23:31: [INFO]: table found: TESTDB.PUBLIC.REPLICATION_TRY_HISTORY [org.hibernate.tool.hbm2ddl.TableMetadata]
20170216-20:23:31: [INFO]: columns: [id, lastreplicationattemptdate, pid, replicationattempts, nodeid] [org.hibernate.tool.hbm2ddl.TableMetadata]
20170216-20:23:31: [INFO]: foreign keys: [] [org.hibernate.tool.hbm2ddl.TableMetadata]
20170216-20:23:31: [INFO]: indexes: [index_pid, primary_key_b] [org.hibernate.tool.hbm2ddl.TableMetadata]
20170216-20:23:31: [INFO]: schema update complete [org.hibernate.tool.hbm2ddl.SchemaUpdate]
20170216-20:23:31: [INFO]: Pre-instantiating singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@1b3e8303: defining beans [org.springframework.context.annotation.internalConfigurationAnnotationProcessor,org.springframework.context.annotation.internalAutowiredAnnotationProcessor,org.springframework.context.annotation.internalRequiredAnnotationProcessor,org.springframework.context.annotation.internalCommonAnnotationProcessor,org.springframework.context.annotation.internalPersistenceAnnotationProcessor,replicationH2RepositoryFactory,org.springframework.context.annotation.ConfigurationClassPostProcessor.importAwareProcessor,replicationPostgresRepositoryFactory,org.springframework.data.repository.core.support.RepositoryInterfaceAwareBeanPostProcessor#0,replicationAttemptHistoryRepository,replicationTaskRepository,org.springframework.data.repository.core.support.RepositoryInterfaceAwareBeanPostProcessor#1,dataSource,jpaVendorAdapter,entityManagerFactory,transactionManager]; root of factory hierarchy [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170216-20:23:31: [INFO]: Refreshing org.springframework.context.annotation.AnnotationConfigApplicationContext@5f1a797e: startup date [Thu Feb 16 20:23:31 UTC 2017]; root of context hierarchy [org.springframework.context.annotation.AnnotationConfigApplicationContext]
20170216-20:23:31: [INFO]: Overriding bean definition for bean 'replicationTaskRepository': replacing [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] with [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170216-20:23:31: [INFO]: Overriding bean definition for bean 'replicationAttemptHistoryRepository': replacing [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] with [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170216-20:23:31: [INFO]: Overriding bean definition for bean 'dataSource': replacing [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationH2RepositoryFactory; factoryMethodName=dataSource; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [org/dataone/cn/data/repository/ReplicationH2RepositoryFactory.class]] with [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationPostgresRepositoryFactory; factoryMethodName=dataSource; initMethodName=null; destroyMethodName=(inferred); defined in class org.dataone.cn.data.repository.ReplicationPostgresRepositoryFactory] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170216-20:23:31: [INFO]: Overriding bean definition for bean 'jpaVendorAdapter': replacing [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationH2RepositoryFactory; factoryMethodName=jpaVendorAdapter; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [org/dataone/cn/data/repository/ReplicationH2RepositoryFactory.class]] with [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationPostgresRepositoryFactory; factoryMethodName=jpaVendorAdapter; initMethodName=null; destroyMethodName=(inferred); defined in class org.dataone.cn.data.repository.ReplicationPostgresRepositoryFactory] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170216-20:23:31: [INFO]: Building JPA container EntityManagerFactory for persistence unit 'default' [org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean]
20170216-20:23:31: [INFO]: Processing PersistenceUnitInfo [ name: default ...] [org.hibernate.ejb.Ejb3Configuration]
20170216-20:23:31: [INFO]: Binding entity from annotated class: org.dataone.cn.data.repository.ReplicationTask [org.hibernate.cfg.AnnotationBinder]
20170216-20:23:31: [INFO]: Bind entity org.dataone.cn.data.repository.ReplicationTask on table replication_task_queue [org.hibernate.cfg.annotations.EntityBinder]
20170216-20:23:31: [INFO]: Binding entity from annotated class: org.dataone.cn.data.repository.ReplicationAttemptHistory [org.hibernate.cfg.AnnotationBinder]
20170216-20:23:31: [INFO]: Bind entity org.dataone.cn.data.repository.ReplicationAttemptHistory on table replication_try_history [org.hibernate.cfg.annotations.EntityBinder]
20170216-20:23:31: [INFO]: Hibernate Validator not found: ignoring [org.hibernate.cfg.Configuration]
20170216-20:23:31: [INFO]: Unable to find org.hibernate.search.event.FullTextIndexEventListener on the classpath. Hibernate Search is not enabled. [org.hibernate.cfg.search.HibernateSearchEventListenerRegister]
20170216-20:23:31: [INFO]: Initializing connection provider: org.hibernate.ejb.connection.InjectedDataSourceConnectionProvider [org.hibernate.connection.ConnectionProviderFactory]
20170216-20:23:31: [INFO]: Using provided datasource [org.hibernate.ejb.connection.InjectedDataSourceConnectionProvider]
20170216-20:23:31: [INFO]: Using dialect: org.hibernate.dialect.PostgreSQLDialect [org.hibernate.dialect.Dialect]
20170216-20:23:31: [INFO]: Disabling contextual LOB creation as JDBC driver reported JDBC version [3] less than 4 [org.hibernate.engine.jdbc.JdbcSupportLoader]
20170216-20:23:31: [INFO]: Database -> name : H2 version : 1.3.163 (2011-12-30) major : 1 minor : 3 [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Driver -> name : H2 JDBC Driver version : 1.3.163 (2011-12-30) major : 1 minor : 3 [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Using default transaction strategy (direct JDBC transactions) [org.hibernate.transaction.TransactionFactoryFactory]
20170216-20:23:31: [INFO]: No TransactionManagerLookup configured (in JTA environment, use of read-write or transactional second-level cache is not recommended) [org.hibernate.transaction.TransactionManagerLookupFactory]
20170216-20:23:31: [INFO]: Automatic flush during beforeCompletion(): disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Automatic session close at end of transaction: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: JDBC batch size: 15 [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: JDBC batch updates for versioned data: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Scrollable result sets: enabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: JDBC3 getGeneratedKeys(): enabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Connection release mode: auto [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Default batch fetch size: 1 [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Generate SQL with comments: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Order SQL updates by primary key: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Order SQL inserts for batching: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Query translator: org.hibernate.hql.ast.ASTQueryTranslatorFactory [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Using ASTQueryTranslatorFactory [org.hibernate.hql.ast.ASTQueryTranslatorFactory]
20170216-20:23:31: [INFO]: Query language substitutions: {} [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: JPA-QL strict compliance: enabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Second-level cache: enabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Query cache: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Cache region factory : org.hibernate.cache.impl.NoCachingRegionFactory [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Optimize cache for minimal puts: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Structured second-level cache entries: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Statistics: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Deleted entity synthetic identifier rollback: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Default entity-mode: pojo [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Named query checking : enabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Check Nullability in Core (should be disabled when Bean Validation is on): enabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: building session factory [org.hibernate.impl.SessionFactoryImpl]
20170216-20:23:31: [INFO]: Type registration [materialized_blob] overrides previous : org.hibernate.type.MaterializedBlobType@45a98cca [org.hibernate.type.BasicTypeRegistry]
20170216-20:23:31: [INFO]: Not binding factory to JNDI, no JNDI name configured [org.hibernate.impl.SessionFactoryObjectFactory]
20170216-20:23:31: [INFO]: Running hbm2ddl schema update [org.hibernate.tool.hbm2ddl.SchemaUpdate]
20170216-20:23:31: [INFO]: fetching database metadata [org.hibernate.tool.hbm2ddl.SchemaUpdate]
20170216-20:23:31: [ERROR]: could not get database metadata [org.hibernate.tool.hbm2ddl.SchemaUpdate]
org.h2.jdbc.JdbcSQLException: Table "PG_CLASS" not found; SQL statement: select relname from pg_class where relkind='S' [42102-163]
    at org.h2.message.DbException.getJdbcSQLException(DbException.java:329)
    at org.h2.message.DbException.get(DbException.java:169)
    at org.h2.message.DbException.get(DbException.java:146)
    at org.h2.command.Parser.readTableOrView(Parser.java:4758)
    at org.h2.command.Parser.readTableFilter(Parser.java:1080)
    at org.h2.command.Parser.parseSelectSimpleFromPart(Parser.java:1686)
    at org.h2.command.Parser.parseSelectSimple(Parser.java:1793)
    at org.h2.command.Parser.parseSelectSub(Parser.java:1680)
    at org.h2.command.Parser.parseSelectUnion(Parser.java:1523)
    at org.h2.command.Parser.parseSelect(Parser.java:1511)
    at org.h2.command.Parser.parsePrepared(Parser.java:405)
    at org.h2.command.Parser.parse(Parser.java:279)
    at org.h2.command.Parser.parse(Parser.java:251)
    at org.h2.command.Parser.prepareCommand(Parser.java:217)
    at org.h2.engine.Session.prepareLocal(Session.java:415)
    at org.h2.engine.Session.prepareCommand(Session.java:364)
    at org.h2.jdbc.JdbcConnection.prepareCommand(JdbcConnection.java:1121)
    at org.h2.jdbc.JdbcStatement.executeQuery(JdbcStatement.java:70)
    at org.apache.commons.dbcp.DelegatingStatement.executeQuery(DelegatingStatement.java:208)
    at org.hibernate.tool.hbm2ddl.DatabaseMetadata.initSequences(DatabaseMetadata.java:151)
    at org.hibernate.tool.hbm2ddl.DatabaseMetadata.<init>(DatabaseMetadata.java:69)
    at org.hibernate.tool.hbm2ddl.DatabaseMetadata.<init>(DatabaseMetadata.java:62)
    at org.hibernate.tool.hbm2ddl.SchemaUpdate.execute(SchemaUpdate.java:170)
    at org.hibernate.impl.SessionFactoryImpl.<init>(SessionFactoryImpl.java:375)
    at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:1872)
    at org.hibernate.ejb.Ejb3Configuration.buildEntityManagerFactory(Ejb3Configuration.java:906)
    at org.hibernate.ejb.HibernatePersistence.createContainerEntityManagerFactory(HibernatePersistence.java:74)
    at org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean.createNativeEntityManagerFactory(LocalContainerEntityManagerFactoryBean.java:287)
    at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.afterPropertiesSet(AbstractEntityManagerFactoryBean.java:310)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1514)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1452)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:519)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
    at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
    at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
    at org.springframework.context.support.AbstractApplicationContext.getBean(AbstractApplicationContext.java:1105)
    at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:915)
    at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:472)
    at org.springframework.context.annotation.AnnotationConfigApplicationContext.<init>(AnnotationConfigApplicationContext.java:73)
    at org.dataone.cn.model.repository.D1BaseJpaRepositoryConfiguration.initContext(D1BaseJpaRepositoryConfiguration.java:39)
    at org.dataone.cn.data.repository.ReplicationPostgresRepositoryFactory.getReplicationTaskRepository(ReplicationPostgresRepositoryFactory.java:68)
    at org.dataone.service.cn.replication.ReplicationFactory.getReplicationTaskRepository(ReplicationFactory.java:74)
    at org.dataone.service.cn.replication.ReplicationTaskProcessor.<clinit>(ReplicationTaskProcessor.java:20)
    at org.dataone.service.cn.replication.ReplicationManager.startReplicationTaskProcessing(ReplicationManager.java:1028)
    at org.dataone.service.cn.replication.ReplicationManager.<init>(ReplicationManager.java:183)
    at org.dataone.service.cn.replication.v2.ReplicationManagerTestUnit.testCreateAndQueueTasks(ReplicationManagerTestUnit.java:175)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
    at org.springframework.test.context.junit4.statements.RunBeforeTestMethodCallbacks.evaluate(RunBeforeTestMethodCallbacks.java:74)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
    at org.springframework.test.context.junit4.statements.RunAfterTestMethodCallbacks.evaluate(RunAfterTestMethodCallbacks.java:83)
    at org.springframework.test.context.junit4.statements.SpringRepeat.evaluate(SpringRepeat.java:72)
    at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:231)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
    at org.springframework.test.context.junit4.statements.RunBeforeTestClassCallbacks.evaluate(RunBeforeTestClassCallbacks.java:61)
    at org.springframework.test.context.junit4.statements.RunAfterTestClassCallbacks.evaluate(RunAfterTestClassCallbacks.java:71)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
    at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.run(SpringJUnit4ClassRunner.java:174)
    at org.junit.runners.Suite.runChild(Suite.java:128)
    at org.junit.runners.Suite.runChild(Suite.java:24)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
    at org.dataone.test.apache.directory.server.integ.ApacheDSSuiteRunner.run(ApacheDSSuiteRunner.java:208)
    at org.apache.maven.surefire.junit4.JUnit4TestSet.execute(JUnit4TestSet.java:53)
    at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:123)
    at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:104)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.maven.surefire.util.ReflectionUtils.invokeMethodWithArray(ReflectionUtils.java:164)
    at org.apache.maven.surefire.booter.ProviderFactory$ProviderProxy.invoke(ProviderFactory.java:110)
    at org.apache.maven.surefire.booter.SurefireStarter.invokeProvider(SurefireStarter.java:175)
    at org.apache.maven.surefire.booter.SurefireStarter.runSuitesInProcessWhenForked(SurefireStarter.java:107)
    at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:68)
20170216-20:23:31: [ERROR]: could not complete schema update [org.hibernate.tool.hbm2ddl.SchemaUpdate]
org.h2.jdbc.JdbcSQLException: Table "PG_CLASS" not found; SQL statement: select relname from pg_class where relkind='S' [42102-163]
20170216-20:23:31: [INFO]: Pre-instantiating singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@38517f22: defining beans [org.springframework.context.annotation.internalConfigurationAnnotationProcessor,org.springframework.context.annotation.internalAutowiredAnnotationProcessor,org.springframework.context.annotation.internalRequiredAnnotationProcessor,org.springframework.context.annotation.internalCommonAnnotationProcessor,org.springframework.context.annotation.internalPersistenceAnnotationProcessor,replicationPostgresRepositoryFactory,org.springframework.context.annotation.ConfigurationClassPostProcessor.importAwareProcessor,replicationH2RepositoryFactory,org.springframework.data.repository.core.support.RepositoryInterfaceAwareBeanPostProcessor#0,replicationAttemptHistoryRepository,replicationTaskRepository,org.springframework.data.repository.core.support.RepositoryInterfaceAwareBeanPostProcessor#1,dataSource,jpaVendorAdapter,entityManagerFactory,transactionManager]; root of factory hierarchy [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170216-20:23:31: [INFO]: selectSession: Using client certificate location: /etc/dataone/client/certs/null [org.dataone.client.auth.CertificateManager]
20170216-20:23:31: [WARN]: SQL Error:
42102, SQLState: 42S02 [org.hibernate.util.JDBCExceptionReporter] 20170216-20:23:31: [ERROR]: Table "REPLICATION_TASK_QUEUE" not found; SQL statement: select replicatio0_.id as id48_, replicatio0_.nextExecution as nextExec2_48_, replicatio0_.pid as pid48_, replicatio0_.status as status48_, replicatio0_.tryCount as tryCount48_ from replication_task_queue replicatio0_ [42102-163] [org.hibernate.util.JDBCExceptionReporter] 20170216-20:23:31: [INFO]: selectSession: Using client certificate location: /etc/dataone/client/certs/null [org.dataone.client.auth.CertificateManager] 20170216-20:23:31: [INFO]: Did not find a certificate for the subject specified: null [org.dataone.client.auth.CertificateManager] 20170216-20:23:31: [INFO]: ...trying SSLContext protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager] 20170216-20:23:31: [INFO]: ...setting SSLContext with protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager] 20170216-20:23:31: [INFO]: creating custom TrustManager [org.dataone.client.auth.CertificateManager] 20170216-20:23:31: [INFO]: getSSLConnectionSocketFactory: using allow-all hostname verifier [org.dataone.client.auth.CertificateManager] 20170216-20:23:31: [WARN]: registering ConnectionManager... 
[org.dataone.client.utils.HttpConnectionMonitorService] 20170216-20:23:31: [INFO]: ReplicationManager is using an X509 certificate from /etc/dataone/client/certs/null [org.dataone.service.cn.replication.ReplicationManager] 20170216-20:23:31: [INFO]: initialization [org.dataone.service.cn.replication.ReplicationManager] 20170216-20:23:31: [INFO]: ReplicationManager D1Client base_url is: https://cn-dev-ucsb-1.test.dataone.org/cn/v2 [org.dataone.service.cn.replication.ReplicationManager] 20170216-20:23:31: [INFO]: selectSession: Using client certificate location: /etc/dataone/client/certs/null [org.dataone.client.auth.CertificateManager] 20170216-20:23:31: [INFO]: ReplicationManager is using an X509 certificate from /etc/dataone/client/certs/null [org.dataone.service.cn.replication.ReplicationManager] 20170216-20:23:31: [INFO]: initialization [org.dataone.service.cn.replication.ReplicationManager] 20170216-20:23:31: [INFO]: ReplicationManager D1Client base_url is: https://cn-dev-ucsb-1.test.dataone.org/cn/v2 [org.dataone.service.cn.replication.ReplicationManager] 20170216-20:23:31: [WARN]: SQL Error: 42102, SQLState: 42S02 [org.hibernate.util.JDBCExceptionReporter] 20170216-20:23:31: [ERROR]: Table "REPLICATION_TASK_QUEUE" not found; SQL statement: select replicatio0_.id as id48_, replicatio0_.nextExecution as nextExec2_48_, replicatio0_.pid as pid48_, replicatio0_.status as status48_, replicatio0_.tryCount as tryCount48_ from replication_task_queue replicatio0_ where replicatio0_.status=? and replicatio0_.nextExecution<? order by replicatio0_.nextExecution asc limit ? 
[42102-163] [org.hibernate.util.JDBCExceptionReporter]
20170216-20:23:33: [INFO]: testCreateAndQueueTask replicationManager.createAndQueueTask [org.dataone.service.cn.replication.v2.ReplicationManagerTestUnit]
20170216-20:23:41: [INFO]: selectSession: Using client certificate location: /etc/dataone/client/certs/null [org.dataone.client.auth.CertificateManager]
20170216-20:23:41: [INFO]: Did not find a certificate for the subject specified: null [org.dataone.client.auth.CertificateManager]
20170216-20:23:41: [INFO]: ...trying SSLContext protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20170216-20:23:41: [INFO]: ...setting SSLContext with protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20170216-20:23:41: [INFO]: creating custom TrustManager [org.dataone.client.auth.CertificateManager]
20170216-20:23:41: [INFO]: getSSLConnectionSocketFactory: using allow-all hostname verifier [org.dataone.client.auth.CertificateManager]
20170216-20:23:41: [WARN]: registering ConnectionManager... [org.dataone.client.utils.HttpConnectionMonitorService]
20170216-20:23:41: [WARN]: In Replication Manager, task that should exist 'in process' does not exist. Creating new task for pid: 42 [org.dataone.service.cn.replication.ReplicationManager]
20170216-20:23:41: [ERROR]: Unhandled Exception for pid: 42. Error is : Could not retreive sysmeta from map for pid 42 [org.dataone.service.cn.replication.ReplicationManager]
org.dataone.service.exceptions.NotFound: Could not retreive sysmeta from map for pid 42
    at org.dataone.service.cn.replication.ReplicationManager.processPid(ReplicationManager.java:409)
    at org.dataone.service.cn.replication.ReplicationManager.createAndQueueTasks(ReplicationManager.java:366)
    at org.dataone.service.cn.replication.v2.ReplicationManagerTestUnit.testCreateAndQueueTasks(ReplicationManagerTestUnit.java:186)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
    at org.springframework.test.context.junit4.statements.RunBeforeTestMethodCallbacks.evaluate(RunBeforeTestMethodCallbacks.java:74)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
    at org.springframework.test.context.junit4.statements.RunAfterTestMethodCallbacks.evaluate(RunAfterTestMethodCallbacks.java:83)
    at org.springframework.test.context.junit4.statements.SpringRepeat.evaluate(SpringRepeat.java:72)
    at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:231)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
    at org.springframework.test.context.junit4.statements.RunBeforeTestClassCallbacks.evaluate(RunBeforeTestClassCallbacks.java:61)
    at org.springframework.test.context.junit4.statements.RunAfterTestClassCallbacks.evaluate(RunAfterTestClassCallbacks.java:71)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
    at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.run(SpringJUnit4ClassRunner.java:174)
    at org.junit.runners.Suite.runChild(Suite.java:128)
    at org.junit.runners.Suite.runChild(Suite.java:24)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
    at org.dataone.test.apache.directory.server.integ.ApacheDSSuiteRunner.run(ApacheDSSuiteRunner.java:208)
    at org.apache.maven.surefire.junit4.JUnit4TestSet.execute(JUnit4TestSet.java:53)
    at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:123)
    at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:104)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.maven.surefire.util.ReflectionUtils.invokeMethodWithArray(ReflectionUtils.java:164)
    at org.apache.maven.surefire.booter.ProviderFactory$ProviderProxy.invoke(ProviderFactory.java:110)
    at org.apache.maven.surefire.booter.SurefireStarter.invokeProvider(SurefireStarter.java:175)
    at org.apache.maven.surefire.booter.SurefireStarter.runSuitesInProcessWhenForked(SurefireStarter.java:107)
    at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:68)
20170216-20:23:41: [INFO]: Added 0 MNReplicationTasks to the queue for 42 [org.dataone.service.cn.replication.ReplicationManager]
20170216-20:23:41: [INFO]: RestClient.doRequestNoBody, thread(90) call Info: GET https://cn-dev-ucsb-1.test.dataone.org/cn/v1/node [org.dataone.client.rest.RestClient]
20170216-20:23:41: [INFO]: response httpCode: 200 [org.dataone.service.util.ExceptionHandler]
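The NotFound above is consistent with a timing hazard: `createAndQueueTasks` reads the `hzSystemMetadata` entry for pid 42 before the test's put has settled (the `REDO_TARGET_WRONG` warnings below show that put being retried during partition migration). The sketch that follows is not the project's code: the class and method names are hypothetical, and a plain `ConcurrentHashMap` stands in for the Hazelcast map. It only illustrates the general mitigation of polling for a map entry up to a deadline instead of reading once.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical sketch: poll a shared map for a key until it appears or a
// deadline passes, rather than failing on the first miss. A ConcurrentHashMap
// stands in for the distributed map; all names here are illustrative.
public class AwaitMapEntry {

    /** Poll `map` for `key`; return its value, or null once `timeoutMillis` elapses. */
    public static <K, V> V awaitEntry(Map<K, V> map, K key, long timeoutMillis) {
        long deadline = System.currentTimeMillis() + timeoutMillis;
        while (true) {
            V value = map.get(key);
            if (value != null) {
                return value;
            }
            if (System.currentTimeMillis() >= deadline) {
                return null; // caller treats this as not-found
            }
            try {
                Thread.sleep(25); // brief back-off between polls
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return null;
            }
        }
    }

    public static void main(String[] args) {
        Map<String, String> sysmetaMap = new ConcurrentHashMap<>();
        // Simulate the put landing after the reader has already started polling.
        new Thread(() -> {
            try {
                Thread.sleep(200);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
            sysmetaMap.put("42", "sysmeta-for-pid-42");
        }).start();

        String value = awaitEntry(sysmetaMap, "42", 5000);
        System.out.println(value == null ? "NotFound" : value);
    }
}
```

A single `map.get()` at the wrong moment returns null even though the put succeeds shortly after; bounding the wait keeps the test deterministic without hiding a genuinely missing entry.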
Standard Error
Feb 16, 2017 8:23:10 PM com.hazelcast.config.ClasspathXmlConfig INFO: Configuring Hazelcast from 'org/dataone/configuration/hazelcast.xml'.
Feb 16, 2017 8:23:10 PM com.hazelcast.impl.AddressPicker INFO: Interfaces is disabled, trying to pick one address from TCP-IP config addresses: [127.0.0.1]
Feb 16, 2017 8:23:10 PM com.hazelcast.impl.AddressPicker WARNING: Picking loopback address [127.0.0.1]; setting 'java.net.preferIPv4Stack' to true.
Feb 16, 2017 8:23:10 PM com.hazelcast.impl.AddressPicker INFO: Picked Address[127.0.0.1]:5730, using socket ServerSocket[addr=/0:0:0:0:0:0:0:0,localport=5730], bind any local is true
Feb 16, 2017 8:23:11 PM com.hazelcast.system INFO: [127.0.0.1]:5730 [DataONEBuildTest] Hazelcast Community Edition 2.4.1 (20121213) starting at Address[127.0.0.1]:5730
Feb 16, 2017 8:23:11 PM com.hazelcast.system INFO: [127.0.0.1]:5730 [DataONEBuildTest] Copyright (C) 2008-2012 Hazelcast.com
Feb 16, 2017 8:23:11 PM com.hazelcast.impl.LifecycleServiceImpl INFO: [127.0.0.1]:5730 [DataONEBuildTest] Address[127.0.0.1]:5730 is STARTING
Feb 16, 2017 8:23:11 PM com.hazelcast.impl.TcpIpJoiner INFO: [127.0.0.1]:5730 [DataONEBuildTest] Connecting to possible member: Address[127.0.0.1]:5732
Feb 16, 2017 8:23:11 PM com.hazelcast.impl.TcpIpJoiner INFO: [127.0.0.1]:5730 [DataONEBuildTest] Connecting to possible member: Address[127.0.0.1]:5731
Feb 16, 2017 8:23:11 PM com.hazelcast.nio.SocketConnector INFO: [127.0.0.1]:5730 [DataONEBuildTest] Could not connect to: /127.0.0.1:5731. Reason: ConnectException[Connection refused]
Feb 16, 2017 8:23:11 PM com.hazelcast.nio.SocketConnector INFO: [127.0.0.1]:5730 [DataONEBuildTest] Could not connect to: /127.0.0.1:5732. Reason: ConnectException[Connection refused]
Feb 16, 2017 8:23:12 PM com.hazelcast.nio.SocketConnector INFO: [127.0.0.1]:5730 [DataONEBuildTest] Could not connect to: /127.0.0.1:5732. Reason: ConnectException[Connection refused]
Feb 16, 2017 8:23:12 PM com.hazelcast.nio.SocketConnector INFO: [127.0.0.1]:5730 [DataONEBuildTest] Could not connect to: /127.0.0.1:5731. Reason: ConnectException[Connection refused]
Feb 16, 2017 8:23:12 PM com.hazelcast.impl.TcpIpJoiner INFO: [127.0.0.1]:5730 [DataONEBuildTest] Members [1] { Member [127.0.0.1]:5730 this }
Feb 16, 2017 8:23:12 PM com.hazelcast.impl.LifecycleServiceImpl INFO: [127.0.0.1]:5730 [DataONEBuildTest] Address[127.0.0.1]:5730 is STARTED
Feb 16, 2017 8:23:12 PM com.hazelcast.impl.AddressPicker INFO: Interfaces is disabled, trying to pick one address from TCP-IP config addresses: [127.0.0.1]
Feb 16, 2017 8:23:12 PM com.hazelcast.impl.AddressPicker INFO: Picked Address[127.0.0.1]:5731, using socket ServerSocket[addr=/0:0:0:0:0:0:0:0,localport=5731], bind any local is true
Feb 16, 2017 8:23:12 PM com.hazelcast.system INFO: [127.0.0.1]:5731 [DataONEBuildTest] Hazelcast Community Edition 2.4.1 (20121213) starting at Address[127.0.0.1]:5731
Feb 16, 2017 8:23:12 PM com.hazelcast.system INFO: [127.0.0.1]:5731 [DataONEBuildTest] Copyright (C) 2008-2012 Hazelcast.com
Feb 16, 2017 8:23:12 PM com.hazelcast.impl.LifecycleServiceImpl INFO: [127.0.0.1]:5731 [DataONEBuildTest] Address[127.0.0.1]:5731 is STARTING
Feb 16, 2017 8:23:12 PM com.hazelcast.impl.TcpIpJoiner INFO: [127.0.0.1]:5731 [DataONEBuildTest] Connecting to possible member: Address[127.0.0.1]:5730
Feb 16, 2017 8:23:12 PM com.hazelcast.impl.TcpIpJoiner INFO: [127.0.0.1]:5731 [DataONEBuildTest] Connecting to possible member: Address[127.0.0.1]:5732
Feb 16, 2017 8:23:12 PM com.hazelcast.nio.SocketConnector INFO: [127.0.0.1]:5731 [DataONEBuildTest] Could not connect to: /127.0.0.1:5732. Reason: ConnectException[Connection refused]
Feb 16, 2017 8:23:12 PM com.hazelcast.nio.SocketAcceptor INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 is accepting socket connection from /127.0.0.1:33554
Feb 16, 2017 8:23:12 PM com.hazelcast.nio.ConnectionManager INFO: [127.0.0.1]:5731 [DataONEBuildTest] 33554 accepted socket connection from /127.0.0.1:5730
Feb 16, 2017 8:23:12 PM com.hazelcast.nio.ConnectionManager INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 accepted socket connection from /127.0.0.1:33554
Feb 16, 2017 8:23:13 PM com.hazelcast.cluster.ClusterManager INFO: [127.0.0.1]:5731 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Feb 16, 2017 8:23:13 PM com.hazelcast.nio.SocketConnector INFO: [127.0.0.1]:5731 [DataONEBuildTest] Could not connect to: /127.0.0.1:5732. Reason: ConnectException[Connection refused]
Feb 16, 2017 8:23:13 PM com.hazelcast.cluster.ClusterManager INFO: [127.0.0.1]:5731 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Feb 16, 2017 8:23:18 PM com.hazelcast.cluster.ClusterManager INFO: [127.0.0.1]:5731 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Feb 16, 2017 8:23:18 PM com.hazelcast.cluster.ClusterManager INFO: [127.0.0.1]:5730 [DataONEBuildTest] Members [2] { Member [127.0.0.1]:5730 this Member [127.0.0.1]:5731 }
Feb 16, 2017 8:23:18 PM com.hazelcast.cluster.ClusterManager INFO: [127.0.0.1]:5731 [DataONEBuildTest] Members [2] { Member [127.0.0.1]:5730 Member [127.0.0.1]:5731 this }
Feb 16, 2017 8:23:19 PM com.hazelcast.cluster.ClusterManager INFO: [127.0.0.1]:5731 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Feb 16, 2017 8:23:20 PM com.hazelcast.impl.LifecycleServiceImpl INFO: [127.0.0.1]:5731 [DataONEBuildTest] Address[127.0.0.1]:5731 is STARTED
Feb 16, 2017 8:23:20 PM com.hazelcast.impl.AddressPicker INFO: Interfaces is disabled, trying to pick one address from TCP-IP config addresses: [127.0.0.1]
Feb 16, 2017 8:23:20 PM com.hazelcast.impl.AddressPicker INFO: Picked Address[127.0.0.1]:5732, using socket ServerSocket[addr=/0:0:0:0:0:0:0:0,localport=5732], bind any local is true
Feb 16, 2017 8:23:20 PM com.hazelcast.system INFO: [127.0.0.1]:5732 [DataONEBuildTest] Hazelcast Community Edition 2.4.1 (20121213) starting at Address[127.0.0.1]:5732
Feb 16, 2017 8:23:20 PM com.hazelcast.system INFO: [127.0.0.1]:5732 [DataONEBuildTest] Copyright (C) 2008-2012 Hazelcast.com
Feb 16, 2017 8:23:20 PM com.hazelcast.impl.LifecycleServiceImpl INFO: [127.0.0.1]:5732 [DataONEBuildTest] Address[127.0.0.1]:5732 is STARTING
Feb 16, 2017 8:23:20 PM com.hazelcast.impl.TcpIpJoiner INFO: [127.0.0.1]:5732 [DataONEBuildTest] Connecting to possible member: Address[127.0.0.1]:5730
Feb 16, 2017 8:23:20 PM com.hazelcast.impl.TcpIpJoiner INFO: [127.0.0.1]:5732 [DataONEBuildTest] Connecting to possible member: Address[127.0.0.1]:5731
Feb 16, 2017 8:23:20 PM com.hazelcast.nio.ConnectionManager INFO: [127.0.0.1]:5732 [DataONEBuildTest] 50155 accepted socket connection from /127.0.0.1:5730
Feb 16, 2017 8:23:20 PM com.hazelcast.nio.SocketAcceptor INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 is accepting socket connection from /127.0.0.1:50155
Feb 16, 2017 8:23:20 PM com.hazelcast.nio.SocketAcceptor INFO: [127.0.0.1]:5731 [DataONEBuildTest] 5731 is accepting socket connection from /127.0.0.1:57834
Feb 16, 2017 8:23:20 PM com.hazelcast.nio.ConnectionManager INFO: [127.0.0.1]:5731 [DataONEBuildTest] 5731 accepted socket connection from /127.0.0.1:57834
Feb 16, 2017 8:23:20 PM com.hazelcast.nio.ConnectionManager INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 accepted socket connection from /127.0.0.1:50155
Feb 16, 2017 8:23:20 PM com.hazelcast.nio.ConnectionManager INFO: [127.0.0.1]:5732 [DataONEBuildTest] 57834 accepted socket connection from /127.0.0.1:5731
Feb 16, 2017 8:23:21 PM com.hazelcast.cluster.ClusterManager INFO: [127.0.0.1]:5732 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Feb 16, 2017 8:23:21 PM com.hazelcast.cluster.ClusterManager INFO: [127.0.0.1]:5732 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5731
Feb 16, 2017 8:23:21 PM com.hazelcast.cluster.ClusterManager INFO: [127.0.0.1]:5732 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Feb 16, 2017 8:23:21 PM com.hazelcast.cluster.ClusterManager INFO: [127.0.0.1]:5732 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Feb 16, 2017 8:23:26 PM com.hazelcast.cluster.ClusterManager INFO: [127.0.0.1]:5732 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Feb 16, 2017 8:23:26 PM com.hazelcast.cluster.ClusterManager INFO: [127.0.0.1]:5730 [DataONEBuildTest] Members [3] { Member [127.0.0.1]:5730 this Member [127.0.0.1]:5731 Member [127.0.0.1]:5732 }
Feb 16, 2017 8:23:26 PM com.hazelcast.cluster.ClusterManager INFO: [127.0.0.1]:5731 [DataONEBuildTest] Members [3] { Member [127.0.0.1]:5730 Member [127.0.0.1]:5731 this Member [127.0.0.1]:5732 }
Feb 16, 2017 8:23:26 PM com.hazelcast.cluster.ClusterManager INFO: [127.0.0.1]:5732 [DataONEBuildTest] Members [3] { Member [127.0.0.1]:5730 Member [127.0.0.1]:5731 Member [127.0.0.1]:5732 this }
Feb 16, 2017 8:23:27 PM com.hazelcast.cluster.ClusterManager INFO: [127.0.0.1]:5732 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Feb 16, 2017 8:23:28 PM com.hazelcast.impl.LifecycleServiceImpl INFO: [127.0.0.1]:5732 [DataONEBuildTest] Address[127.0.0.1]:5732 is STARTED
Feb 16, 2017 8:23:28 PM com.hazelcast.config.ClasspathXmlConfig INFO: Configuring Hazelcast from 'org/dataone/configuration/hazelcastTestClientConf.xml'.
Feb 16, 2017 8:23:28 PM com.hazelcast.client.LifecycleServiceClientImpl INFO: HazelcastClient is STARTING
Feb 16, 2017 8:23:28 PM com.hazelcast.nio.SocketAcceptor INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 is accepting socket connection from /127.0.0.1:47371
Feb 16, 2017 8:23:28 PM com.hazelcast.nio.ConnectionManager INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 accepted socket connection from /127.0.0.1:47371
Feb 16, 2017 8:23:28 PM com.hazelcast.impl.ClientHandlerService INFO: [127.0.0.1]:5730 [DataONEBuildTest] received auth from Connection [/127.0.0.1:47371 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Feb 16, 2017 8:23:28 PM com.hazelcast.client.LifecycleServiceClientImpl INFO: HazelcastClient is CLIENT_CONNECTION_OPENING
Feb 16, 2017 8:23:28 PM com.hazelcast.client.LifecycleServiceClientImpl INFO: HazelcastClient is CLIENT_CONNECTION_OPENED
Feb 16, 2017 8:23:28 PM com.hazelcast.client.LifecycleServiceClientImpl INFO: HazelcastClient is STARTED
Feb 16, 2017 8:23:28 PM com.hazelcast.client.LifecycleServiceClientImpl INFO: HazelcastClient is STARTING
Feb 16, 2017 8:23:28 PM com.hazelcast.nio.SocketAcceptor INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 is accepting socket connection from /127.0.0.1:47372
Feb 16, 2017 8:23:28 PM com.hazelcast.nio.ConnectionManager INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 accepted socket connection from /127.0.0.1:47372
Feb 16, 2017 8:23:28 PM com.hazelcast.impl.ClientHandlerService INFO: [127.0.0.1]:5730 [DataONEBuildTest] received auth from Connection [/127.0.0.1:47372 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Feb 16, 2017 8:23:28 PM com.hazelcast.client.LifecycleServiceClientImpl INFO: HazelcastClient is CLIENT_CONNECTION_OPENING
Feb 16, 2017 8:23:28 PM com.hazelcast.client.LifecycleServiceClientImpl INFO: HazelcastClient is CLIENT_CONNECTION_OPENED
Feb 16, 2017 8:23:28 PM com.hazelcast.client.LifecycleServiceClientImpl INFO: HazelcastClient is STARTED
Feb 16, 2017 8:23:31 PM com.hazelcast.impl.PartitionManager INFO: [127.0.0.1]:5730 [DataONEBuildTest] Initializing cluster partition table first arrangement...
Feb 16, 2017 8:23:39 PM com.hazelcast.impl.ConcurrentMapManager WARNING: [127.0.0.1]:5730 [DataONEBuildTest] Caller -> RedoLog{name=c:__hz_Locks, redoType=REDO_TARGET_WRONG, operation=CONCURRENT_MAP_LOCK, target=Address[127.0.0.1]:5731 / connected=true, redoCount=16, migrating=null partition=Partition [187]{ 0:Address[127.0.0.1]:5731 1:Address[127.0.0.1]:5732 2:Address[127.0.0.1]:5730 } }
Feb 16, 2017 8:23:39 PM com.hazelcast.impl.ConcurrentMapManager WARNING: [127.0.0.1]:5730 [DataONEBuildTest] Caller -> RedoLog{name=c:hzSystemMetadata, redoType=REDO_TARGET_WRONG, operation=CONCURRENT_MAP_PUT, target=Address[127.0.0.1]:5731 / connected=true, redoCount=16, migrating=null partition=Partition [269]{ 0:Address[127.0.0.1]:5731 1:Address[127.0.0.1]:5732 2:Address[127.0.0.1]:5730 } }
Feb 16, 2017 8:23:39 PM com.hazelcast.impl.ConcurrentMapManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Handler -> RedoLog{name=c:__hz_Locks, redoType=REDO_TARGET_WRONG, operation=CONCURRENT_MAP_LOCK, caller=Address[127.0.0.1]:5730 / connected=true, redoCount=16, migrating=null partition=Partition [187]{ } }
Feb 16, 2017 8:23:39 PM com.hazelcast.impl.ConcurrentMapManager WARNING: [127.0.0.1]:5730 [DataONEBuildTest] Caller -> RedoLog{name=c:__hz_Locks, redoType=REDO_TARGET_WRONG, operation=CONCURRENT_MAP_LOCK, target=Address[127.0.0.1]:5731 / connected=true, redoCount=17, migrating=null partition=Partition [187]{ 0:Address[127.0.0.1]:5731 1:Address[127.0.0.1]:5732 2:Address[127.0.0.1]:5730 } }
Feb 16, 2017 8:23:39 PM com.hazelcast.impl.ConcurrentMapManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Handler -> RedoLog{name=c:hzSystemMetadata, redoType=REDO_TARGET_WRONG, operation=CONCURRENT_MAP_PUT, caller=Address[127.0.0.1]:5730 / connected=true, redoCount=16, migrating=null partition=Partition [269]{ } }
Feb 16, 2017 8:23:39 PM com.hazelcast.impl.ConcurrentMapManager WARNING: [127.0.0.1]:5730 [DataONEBuildTest] Caller -> RedoLog{name=c:hzSystemMetadata, redoType=REDO_TARGET_WRONG, operation=CONCURRENT_MAP_PUT, target=Address[127.0.0.1]:5731 / connected=true, redoCount=17, migrating=null partition=Partition [269]{ 0:Address[127.0.0.1]:5731 1:Address[127.0.0.1]:5732 2:Address[127.0.0.1]:5730 } }
Feb 16, 2017 8:23:40 PM com.hazelcast.impl.ConcurrentMapManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Handler -> RedoLog{name=c:__hz_Locks, redoType=REDO_TARGET_WRONG, operation=CONCURRENT_MAP_LOCK, caller=Address[127.0.0.1]:5730 / connected=true, redoCount=17, migrating=null partition=Partition [187]{ } }
Feb 16, 2017 8:23:40 PM com.hazelcast.impl.ConcurrentMapManager WARNING: [127.0.0.1]:5730 [DataONEBuildTest] Caller -> RedoLog{name=c:__hz_Locks, redoType=REDO_TARGET_WRONG, operation=CONCURRENT_MAP_LOCK, target=Address[127.0.0.1]:5731 / connected=true, redoCount=18, migrating=null partition=Partition [187]{ 0:Address[127.0.0.1]:5731 1:Address[127.0.0.1]:5732 2:Address[127.0.0.1]:5730 } }
Feb 16, 2017 8:23:40 PM com.hazelcast.impl.ConcurrentMapManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Handler -> RedoLog{name=c:hzSystemMetadata, redoType=REDO_TARGET_WRONG, operation=CONCURRENT_MAP_PUT, caller=Address[127.0.0.1]:5730 / connected=true, redoCount=17, migrating=null partition=Partition [269]{ } }
Feb 16, 2017 8:23:40 PM com.hazelcast.impl.ConcurrentMapManager WARNING: [127.0.0.1]:5730 [DataONEBuildTest] Caller -> RedoLog{name=c:hzSystemMetadata, redoType=REDO_TARGET_WRONG, operation=CONCURRENT_MAP_PUT, target=Address[127.0.0.1]:5731 / connected=true, redoCount=18, migrating=null partition=Partition [269]{ 0:Address[127.0.0.1]:5731 1:Address[127.0.0.1]:5732 2:Address[127.0.0.1]:5730 } }
Feb 16, 2017 8:23:40 PM com.hazelcast.impl.ConcurrentMapManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Handler -> RedoLog{name=c:__hz_Locks, redoType=REDO_TARGET_WRONG, operation=CONCURRENT_MAP_LOCK, caller=Address[127.0.0.1]:5730 / connected=true, redoCount=18, migrating=null partition=Partition [187]{ } }
Feb 16, 2017 8:23:40 PM com.hazelcast.impl.ConcurrentMapManager WARNING: [127.0.0.1]:5730 [DataONEBuildTest] Caller -> RedoLog{name=c:__hz_Locks, redoType=REDO_TARGET_WRONG, operation=CONCURRENT_MAP_LOCK, target=Address[127.0.0.1]:5731 / connected=true, redoCount=19, migrating=null partition=Partition [187]{ 0:Address[127.0.0.1]:5731 1:Address[127.0.0.1]:5732 2:Address[127.0.0.1]:5730 } }
Feb 16, 2017 8:23:40 PM com.hazelcast.impl.ConcurrentMapManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Handler -> RedoLog{name=c:hzSystemMetadata, redoType=REDO_TARGET_WRONG, operation=CONCURRENT_MAP_PUT, caller=Address[127.0.0.1]:5730 / connected=true, redoCount=18, migrating=null partition=Partition [269]{ } }
Feb 16, 2017 8:23:40 PM com.hazelcast.impl.ConcurrentMapManager WARNING: [127.0.0.1]:5730 [DataONEBuildTest] Caller -> RedoLog{name=c:hzSystemMetadata, redoType=REDO_TARGET_WRONG, operation=CONCURRENT_MAP_PUT, target=Address[127.0.0.1]:5731 / connected=true, redoCount=19, migrating=null partition=Partition [269]{ 0:Address[127.0.0.1]:5731 1:Address[127.0.0.1]:5732 2:Address[127.0.0.1]:5730 } }
Feb 16, 2017 8:23:41 PM com.hazelcast.impl.LifecycleServiceImpl INFO: [127.0.0.1]:5730 [DataONEBuildTest] Address[127.0.0.1]:5730 is SHUTTING_DOWN
Feb 16, 2017 8:23:41 PM com.hazelcast.cluster.ClusterManager INFO: [127.0.0.1]:5732 [DataONEBuildTest] Removing Address Address[127.0.0.1]:5730
Feb 16, 2017 8:23:41 PM com.hazelcast.cluster.ClusterManager INFO: [127.0.0.1]:5731 [DataONEBuildTest] Removing Address Address[127.0.0.1]:5730
Feb 16, 2017 8:23:41 PM com.hazelcast.nio.Connection INFO: [127.0.0.1]:5730 [DataONEBuildTest] Connection [Address[127.0.0.1]:5732] lost. Reason: java.io.EOFException[null]
Feb 16, 2017 8:23:41 PM com.hazelcast.nio.Connection INFO: [127.0.0.1]:5730 [DataONEBuildTest] Connection [Address[127.0.0.1]:5731] lost. Reason: java.io.EOFException[null]
Feb 16, 2017 8:23:41 PM com.hazelcast.nio.Connection INFO: [127.0.0.1]:5732 [DataONEBuildTest] Connection [Address[127.0.0.1]:5730] lost. Reason: Explicit close
Feb 16, 2017 8:23:41 PM com.hazelcast.nio.ReadHandler WARNING: [127.0.0.1]:5730 [DataONEBuildTest] hz.hzProcessInstance.IO.thread-2 Closing socket to endpoint Address[127.0.0.1]:5732, Cause:java.io.EOFException
Feb 16, 2017 8:23:41 PM com.hazelcast.nio.Connection INFO: [127.0.0.1]:5731 [DataONEBuildTest] Connection [Address[127.0.0.1]:5730] lost. Reason: Explicit close
Feb 16, 2017 8:23:41 PM com.hazelcast.nio.ReadHandler WARNING: [127.0.0.1]:5730 [DataONEBuildTest] hz.hzProcessInstance.IO.thread-1 Closing socket to endpoint Address[127.0.0.1]:5731, Cause:java.io.EOFException
Feb 16, 2017 8:23:41 PM com.hazelcast.cluster.ClusterManager INFO: [127.0.0.1]:5732 [DataONEBuildTest] Members [2] { Member [127.0.0.1]:5731 Member [127.0.0.1]:5732 this }
Feb 16, 2017 8:23:41 PM com.hazelcast.impl.PartitionManager INFO: [127.0.0.1]:5731 [DataONEBuildTest] Starting to send partition replica diffs...true
Feb 16, 2017 8:23:42 PM com.hazelcast.cluster.ClusterManager INFO: [127.0.0.1]:5731 [DataONEBuildTest] Members [2] { Member [127.0.0.1]:5731 this Member [127.0.0.1]:5732 }
Feb 16, 2017 8:23:42 PM com.hazelcast.cluster.ClusterManager INFO: [127.0.0.1]:5732 [DataONEBuildTest] Removing Address Address[127.0.0.1]:5730
Feb 16, 2017 8:23:42 PM com.hazelcast.nio.Connection INFO: [127.0.0.1]:5730 [DataONEBuildTest] Connection [Address[127.0.0.1]:47372] lost. Reason: Explicit close
Feb 16, 2017 8:23:42 PM com.hazelcast.client.ConnectionManager WARNING: Connection to Connection [0] [localhost/127.0.0.1:5730 -> 127.0.0.1:5730] is lost
Feb 16, 2017 8:23:42 PM com.hazelcast.client.LifecycleServiceClientImpl INFO: HazelcastClient is CLIENT_CONNECTION_LOST
Feb 16, 2017 8:23:42 PM com.hazelcast.nio.Connection INFO: [127.0.0.1]:5730 [DataONEBuildTest] Connection [Address[127.0.0.1]:47371] lost. Reason: Explicit close
Feb 16, 2017 8:23:42 PM com.hazelcast.client.ConnectionManager WARNING: Connection to Connection [0] [localhost/127.0.0.1:5730 -> 127.0.0.1:5730] is lost
Feb 16, 2017 8:23:42 PM com.hazelcast.client.LifecycleServiceClientImpl INFO: HazelcastClient is CLIENT_CONNECTION_LOST
Feb 16, 2017 8:23:42 PM com.hazelcast.nio.SocketAcceptor INFO: [127.0.0.1]:5732 [DataONEBuildTest] 5732 is accepting socket connection from /127.0.0.1:49987
Feb 16, 2017 8:23:42 PM com.hazelcast.nio.SocketAcceptor INFO: [127.0.0.1]:5732 [DataONEBuildTest] 5732 is accepting socket connection from /127.0.0.1:49988
Feb 16, 2017 8:23:42 PM com.hazelcast.nio.ConnectionManager INFO: [127.0.0.1]:5732 [DataONEBuildTest] 5732 accepted socket connection from /127.0.0.1:49987
Feb 16, 2017 8:23:42 PM com.hazelcast.nio.ConnectionManager INFO: [127.0.0.1]:5732 [DataONEBuildTest] 5732 accepted socket connection from /127.0.0.1:49988
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.ClientHandlerService INFO: [127.0.0.1]:5732 [DataONEBuildTest] received auth from Connection [/127.0.0.1:49988 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.ClientHandlerService INFO: [127.0.0.1]:5732 [DataONEBuildTest] received auth from Connection [/127.0.0.1:49987 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Feb 16, 2017 8:23:42 PM com.hazelcast.client.LifecycleServiceClientImpl INFO: HazelcastClient is CLIENT_CONNECTION_OPENING
Feb 16, 2017 8:23:42 PM com.hazelcast.client.LifecycleServiceClientImpl INFO: HazelcastClient is CLIENT_CONNECTION_OPENING
Feb 16, 2017 8:23:42 PM com.hazelcast.client.LifecycleServiceClientImpl INFO: HazelcastClient is CLIENT_CONNECTION_OPENED
Feb 16, 2017 8:23:42 PM com.hazelcast.client.LifecycleServiceClientImpl INFO: HazelcastClient is CLIENT_CONNECTION_OPENED
Feb 16, 2017 8:23:42 PM com.hazelcast.initializer INFO: [127.0.0.1]:5730 [DataONEBuildTest] Destroying node initializer.
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.Node INFO: [127.0.0.1]:5730 [DataONEBuildTest] Hazelcast Shutdown is completed in 876 ms.
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.LifecycleServiceImpl INFO: [127.0.0.1]:5730 [DataONEBuildTest] Address[127.0.0.1]:5730 is SHUTDOWN
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.LifecycleServiceImpl INFO: [127.0.0.1]:5732 [DataONEBuildTest] Address[127.0.0.1]:5732 is SHUTTING_DOWN
Feb 16, 2017 8:23:42 PM com.hazelcast.cluster.ClusterManager INFO: [127.0.0.1]:5731 [DataONEBuildTest] Removing Address Address[127.0.0.1]:5732
Feb 16, 2017 8:23:42 PM com.hazelcast.nio.Connection INFO: [127.0.0.1]:5731 [DataONEBuildTest] Connection [Address[127.0.0.1]:5732] lost. Reason: Explicit close
Feb 16, 2017 8:23:42 PM com.hazelcast.nio.Connection INFO: [127.0.0.1]:5732 [DataONEBuildTest] Connection [Address[127.0.0.1]:5731] lost. Reason: java.io.EOFException[null]
Feb 16, 2017 8:23:42 PM com.hazelcast.nio.ReadHandler WARNING: [127.0.0.1]:5732 [DataONEBuildTest] hz.hzProcessInstance2.IO.thread-2 Closing socket to endpoint Address[127.0.0.1]:5731, Cause:java.io.EOFException
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[17]. PartitionReplicaChangeEvent{partitionId=17, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[31]. PartitionReplicaChangeEvent{partitionId=31, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[33]. PartitionReplicaChangeEvent{partitionId=33, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[36]. PartitionReplicaChangeEvent{partitionId=36, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[38]. PartitionReplicaChangeEvent{partitionId=38, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[39]. PartitionReplicaChangeEvent{partitionId=39, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[42]. PartitionReplicaChangeEvent{partitionId=42, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[44]. PartitionReplicaChangeEvent{partitionId=44, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[51]. PartitionReplicaChangeEvent{partitionId=51, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[53]. PartitionReplicaChangeEvent{partitionId=53, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[59]. PartitionReplicaChangeEvent{partitionId=59, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[61]. PartitionReplicaChangeEvent{partitionId=61, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[65]. PartitionReplicaChangeEvent{partitionId=65, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[66]. PartitionReplicaChangeEvent{partitionId=66, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[67]. PartitionReplicaChangeEvent{partitionId=67, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[69]. PartitionReplicaChangeEvent{partitionId=69, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[70]. PartitionReplicaChangeEvent{partitionId=70, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[74]. PartitionReplicaChangeEvent{partitionId=74, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[75]. PartitionReplicaChangeEvent{partitionId=75, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[76]. PartitionReplicaChangeEvent{partitionId=76, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[77]. PartitionReplicaChangeEvent{partitionId=77, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[79]. PartitionReplicaChangeEvent{partitionId=79, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[82]. PartitionReplicaChangeEvent{partitionId=82, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[87]. PartitionReplicaChangeEvent{partitionId=87, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[88]. PartitionReplicaChangeEvent{partitionId=88, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[92]. PartitionReplicaChangeEvent{partitionId=92, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[97]. PartitionReplicaChangeEvent{partitionId=97, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[98]. PartitionReplicaChangeEvent{partitionId=98, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[102]. PartitionReplicaChangeEvent{partitionId=102, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[105]. PartitionReplicaChangeEvent{partitionId=105, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[106]. PartitionReplicaChangeEvent{partitionId=106, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[110]. PartitionReplicaChangeEvent{partitionId=110, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[113]. PartitionReplicaChangeEvent{partitionId=113, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[114]. PartitionReplicaChangeEvent{partitionId=114, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[117]. PartitionReplicaChangeEvent{partitionId=117, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[119]. PartitionReplicaChangeEvent{partitionId=119, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[122]. PartitionReplicaChangeEvent{partitionId=122, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[123]. PartitionReplicaChangeEvent{partitionId=123, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[131].
PartitionReplicaChangeEvent{partitionId=131, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[133]. PartitionReplicaChangeEvent{partitionId=133, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[134]. PartitionReplicaChangeEvent{partitionId=134, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[136]. PartitionReplicaChangeEvent{partitionId=136, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[137]. PartitionReplicaChangeEvent{partitionId=137, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[140]. PartitionReplicaChangeEvent{partitionId=140, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[142]. 
PartitionReplicaChangeEvent{partitionId=142, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[143]. PartitionReplicaChangeEvent{partitionId=143, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[144]. PartitionReplicaChangeEvent{partitionId=144, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[151]. PartitionReplicaChangeEvent{partitionId=151, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[154]. PartitionReplicaChangeEvent{partitionId=154, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[155]. PartitionReplicaChangeEvent{partitionId=155, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[156]. 
PartitionReplicaChangeEvent{partitionId=156, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[157]. PartitionReplicaChangeEvent{partitionId=157, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[158]. PartitionReplicaChangeEvent{partitionId=158, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[162]. PartitionReplicaChangeEvent{partitionId=162, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[163]. PartitionReplicaChangeEvent{partitionId=163, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[165]. PartitionReplicaChangeEvent{partitionId=165, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[170]. 
PartitionReplicaChangeEvent{partitionId=170, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[173]. PartitionReplicaChangeEvent{partitionId=173, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[174]. PartitionReplicaChangeEvent{partitionId=174, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[175]. PartitionReplicaChangeEvent{partitionId=175, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[176]. PartitionReplicaChangeEvent{partitionId=176, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[178]. PartitionReplicaChangeEvent{partitionId=178, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[182]. 
PartitionReplicaChangeEvent{partitionId=182, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[190]. PartitionReplicaChangeEvent{partitionId=190, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[196]. PartitionReplicaChangeEvent{partitionId=196, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[197]. PartitionReplicaChangeEvent{partitionId=197, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[204]. PartitionReplicaChangeEvent{partitionId=204, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[205]. PartitionReplicaChangeEvent{partitionId=205, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[206]. 
PartitionReplicaChangeEvent{partitionId=206, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[207]. PartitionReplicaChangeEvent{partitionId=207, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[213]. PartitionReplicaChangeEvent{partitionId=213, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[214]. PartitionReplicaChangeEvent{partitionId=214, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[221]. PartitionReplicaChangeEvent{partitionId=221, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[222]. PartitionReplicaChangeEvent{partitionId=222, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[225]. 
PartitionReplicaChangeEvent{partitionId=225, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[228]. PartitionReplicaChangeEvent{partitionId=228, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[238]. PartitionReplicaChangeEvent{partitionId=238, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[247]. PartitionReplicaChangeEvent{partitionId=247, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[248]. PartitionReplicaChangeEvent{partitionId=248, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[249]. PartitionReplicaChangeEvent{partitionId=249, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[253]. 
PartitionReplicaChangeEvent{partitionId=253, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[256]. PartitionReplicaChangeEvent{partitionId=256, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[257]. PartitionReplicaChangeEvent{partitionId=257, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[258]. PartitionReplicaChangeEvent{partitionId=258, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[263]. PartitionReplicaChangeEvent{partitionId=263, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[264]. PartitionReplicaChangeEvent{partitionId=264, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[266]. 
PartitionReplicaChangeEvent{partitionId=266, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[270]. PartitionReplicaChangeEvent{partitionId=270, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null} Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager INFO: [127.0.0.1]:5731 [DataONEBuildTest] Starting to send partition replica diffs...true Feb 16, 2017 8:23:42 PM com.hazelcast.cluster.ClusterManager INFO: [127.0.0.1]:5731 [DataONEBuildTest] Members [1] { Member [127.0.0.1]:5731 this } Feb 16, 2017 8:23:42 PM com.hazelcast.nio.Connection INFO: [127.0.0.1]:5732 [DataONEBuildTest] Connection [Address[127.0.0.1]:49988] lost. Reason: Explicit close Feb 16, 2017 8:23:42 PM com.hazelcast.client.ConnectionManager WARNING: Connection to Connection [1] [localhost/127.0.0.1:5732 -> 127.0.0.1:5732] is lost Feb 16, 2017 8:23:42 PM com.hazelcast.nio.Connection INFO: [127.0.0.1]:5732 [DataONEBuildTest] Connection [Address[127.0.0.1]:49987] lost. 
Reason: Explicit close
Feb 16, 2017 8:23:42 PM com.hazelcast.client.ConnectionManager
WARNING: Connection to Connection [1] [localhost/127.0.0.1:5732 -> 127.0.0.1:5732] is lost
Feb 16, 2017 8:23:42 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_LOST
Feb 16, 2017 8:23:42 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_LOST
Feb 16, 2017 8:23:42 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 5731 is accepting socket connection from /127.0.0.1:49090
Feb 16, 2017 8:23:42 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 5731 is accepting socket connection from /127.0.0.1:49091
Feb 16, 2017 8:23:42 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 5731 accepted socket connection from /127.0.0.1:49090
Feb 16, 2017 8:23:42 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 5731 accepted socket connection from /127.0.0.1:49091
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.ClientHandlerService
INFO: [127.0.0.1]:5731 [DataONEBuildTest] received auth from Connection [/127.0.0.1:49090 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.ClientHandlerService
INFO: [127.0.0.1]:5731 [DataONEBuildTest] received auth from Connection [/127.0.0.1:49091 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Feb 16, 2017 8:23:42 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENING
Feb 16, 2017 8:23:42 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENING
Feb 16, 2017 8:23:42 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENED
Feb 16, 2017 8:23:43 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENED
Feb 16, 2017 8:23:43 PM com.hazelcast.initializer
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Destroying node initializer.
Feb 16, 2017 8:23:43 PM com.hazelcast.impl.Node
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Hazelcast Shutdown is completed in 1612 ms.
Feb 16, 2017 8:23:43 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Address[127.0.0.1]:5732 is SHUTDOWN
Feb 16, 2017 8:23:43 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Address[127.0.0.1]:5731 is SHUTTING_DOWN
Feb 16, 2017 8:23:44 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Connection [Address[127.0.0.1]:49091] lost. Reason: Explicit close
Feb 16, 2017 8:23:44 PM com.hazelcast.client.ConnectionManager
WARNING: Connection to Connection [2] [localhost/127.0.0.1:5731 -> 127.0.0.1:5731] is lost
Feb 16, 2017 8:23:44 PM com.hazelcast.client.ConnectionManager
WARNING: Connection to Connection [2] [localhost/127.0.0.1:5731 -> 127.0.0.1:5731] is lost
Feb 16, 2017 8:23:44 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_LOST
Feb 16, 2017 8:23:44 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Connection [Address[127.0.0.1]:49090] lost. Reason: Explicit close
Feb 16, 2017 8:23:44 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_LOST
Feb 16, 2017 8:23:44 PM com.hazelcast.client.ConnectionManager
INFO: Unable to get alive cluster connection, try in 4,999 ms later, attempt 1 of 1.
Feb 16, 2017 8:23:44 PM com.hazelcast.client.ConnectionManager
INFO: Unable to get alive cluster connection, try in 4,998 ms later, attempt 1 of 1.
Feb 16, 2017 8:23:45 PM com.hazelcast.initializer
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Destroying node initializer.
Feb 16, 2017 8:23:45 PM com.hazelcast.impl.Node
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Hazelcast Shutdown is completed in 1072 ms.
Feb 16, 2017 8:23:45 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Address[127.0.0.1]:5731 is SHUTDOWN